Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Dori Gonzalez-Acevedo (00:00):
Yeah. Hi,
Steve.
Steve Thompson (00:02):
Hey, Dori, good
to see you.
Dori Gonzalez-Acevedo (00:03):
Good to see you. I am... I'm a little exhausted. Are you?
Steve Thompson (00:06):
Totally. Great conference, a lot. Oh, man, a lot. Yeah, a lot to do.
Dori Gonzalez-Acevedo (00:11):
Right. And so, what we're going to kind of recap first, I think, is the Lean Validation, Implementing CSA and Modernizing 21 CFR Part 11 Connects conference we just concluded, just moments ago, here in Philadelphia. It's nice to see you in person.
Steve Thompson (00:26):
Yeah. And great weather, that's a good thing. Beautiful. Yeah, right. Yes, the last few days.
Dori Gonzalez-Acevedo (00:30):
Let's see, what are some of the key highlights for you from the last few days?
Steve Thompson (00:36):
Well, I gave the presentation, and yeah, it was a lot. It was a hard presentation for me because it was covering, you know, Part 11, Annex 11, PIC/S guidance, standards. And so there was a lot of research involved. But as you know, I mean, I like doing presentations because I
(00:56):
learn, and I used ChatGPT to help me with my presentation. And I found out, and this is a funny story, you know, about this: I also had this other AI tool to generate something on CSA, and to show you that it doesn't really work, it spit out something like "Controlled Substances Act." And so it's a lesson learned. And we
(01:17):
could get into, like, the experience and knowledge and wisdom that we have. We can't believe everything AI says. But it was a great experience, and I learned a lot through it. And so, yeah, I was happy to do that presentation. And then you gave one, which was really good.
Dori Gonzalez-Acevedo (01:32):
Yeah, so I did something new and different. And I changed it up a couple of times before we even started, because what I really valued about this whole last few days was everyone came with their A game. All the presenters really kind of not only gave us the basics, but then went one step further in their thoughts, in their experiences, and how
(01:55):
they're working with organizations, and what are the things you need to challenge yourself on. And so, yeah, I mixed it up last minute, too, because I think this is so evolving, right?
Steve Thompson (02:06):
Yeah, it... even, I mean, the past month or whatever, I was thinking about this earlier, it's like drinking from a fire hydrant while sitting on a chair on the beach with a tidal wave coming. That's what's hitting us with CSA and technology and everything. And I'm, like, so motivated and amped
(02:27):
up with all this stuff and exhausted at the same time. But you know, in your presentation, what I really liked was the Poll Everywhere that you did. And one of the fascinating things, I can't recall the exact question, but it was something like, how will you really get leadership to effect change? And the message that came out was a regulatory
(02:49):
response, or something negative, a negative regulatory action, which is not a good thing. Yeah. Yeah,
Dori Gonzalez-Acevedo (02:55):
it is often the case, when I interview folks and go in to do strategic assessments, they don't feel like they would make any change until they got some regulatory finding. That's pretty common, the fear-based approach, right? You know, don't fix anything if it's not broken, right. And we see that. I know you and I
(03:15):
were chatting at lunch, too: we still see a lot of paper-based programs out there. Leadership not wanting to invest in new technology because, well, we've never had a finding, we've never had an issue, or, you know, those sorts of mindsets from 10, 15, 20 years ago, right? And if we don't get folks into a system
(03:36):
where we start to be able to use and analyze the data, what are we going to do? Those companies are going to be very, very far behind. Right?
Steve Thompson (03:44):
Yeah, I mean, if you're still paper-based, I mean, you really have got to move fast, because everything is moving so fast. Every minute that you're not digitized, you're falling behind really fast; you are speeding in reverse is essentially what's happening. And on waiting
(04:05):
for a regulatory action, as we were talking earlier, it's like a kid deciding that they're not going to do something right until they get in trouble, which is not really the way we were raised. So why do we take that approach as an industry?
Dori Gonzalez-Acevedo (04:18):
Yeah, I don't know. It is a good question. I think most of the folks that come to Connects, right, are line managers, some senior directors; we're educating at the grassroots level, and that's great. And folks want to do better, and they want to feel more added value in their day-to-day jobs. But we're not getting
(04:39):
necessarily the leadership buy-in to make significant advancements. They say they want to get to machine learning and AI and advanced technologies within their products or spaces, but it takes a lot of change in order to get there, and I'm not sure that the message is getting heard on both sides, right?
Steve Thompson (04:59):
So it's interesting, because even in the conference there were differences of opinion, which is great. I mean, we need to hear those. That's part of critical thinking, which is another whole topic we could talk about. But I heard that some said, oh no, it has to come from the top down; others said, no, it has to come from the bottom up. So what do you think?
Dori Gonzalez-Acevedo (05:16):
Well, I think it's both. Yeah. So one of the things that I really try to instill in training is a dialectical concept from philosophy, right? Having both of those things be true, and making meaning in the middle somewhere. Right, right. Because while senior leadership wants
(05:38):
stockholder investment and to continue to add value to the company, there's also a reinvestment of resources needed at the grassroots level in order to get there, right. And so while you want that other thing, you also have to do something down here. And that's hard for folks, to hold both of those things. Yes,
(05:59):
you should be, in some ways, quote unquote, fearful of regulatory findings, but it should not bar you from making advancements and innovation.
Steve Thompson (06:09):
Exactly right. And even, like, I agree with you, it's both approaches, right? So you need buy-in from the bottom up, and then, thinking about it, you can't buy anything if you don't have senior-level approval. So it takes... and I'm sure you could have a grassroots effort to effect change, or you can have senior leadership make a decision and it goes down. But the best way is if you, you know, do bottom-up and top-down at the same time, and
(06:33):
then you meet in the middle, right?
Dori Gonzalez-Acevedo (06:36):
In the middle. Yeah. I mean, I think, I know, you and I have been around for similar timeframes here. And we see... so we have folks still on paper, and then we have folks on a very, very advanced test automation framework, and yet still doing multiple levels of
(06:58):
approval, right, right, a very, very laborious process on top of a very robust test automation process. And part of that is because the quality teams haven't kept up with the technology, right? And how do we educate quality teams about the new technology, as well as educating the IT teams about the
(07:20):
changes in regulations and how they're coming, and what does that mean, and the adoption of new ways of doing things?
Steve Thompson (07:26):
Well, two interesting things were mentioned. One was just recently mentioned: you know, you may have somebody from the business that you've put down to sign, and they're signing off on an IQ that they really don't understand. I mean, where's the value being added? Or you could have some tech person that is signing off on something as the business process that they don't
(07:48):
necessarily understand. And I believe Mark had mentioned in his presentation, and I agree, I mean, you can have somebody that authors and then a final QA approval, so you can have two signatures, which is adequate. Everybody can read and participate in meetings and stuff. But the more signatures you have, which is another point, I think Stephen Cook brought it up, was that if you
(08:10):
have so many signatures, everybody assumes that everybody else is really doing the review, and that just sounds counterproductive.
Dori Gonzalez-Acevedo (08:17):
Which speaks to Brandi Stockton's presentation on critical thinking, cognitive biases, and a lot of that very instilled stuff that we've done over the years just being kind of rote now, right? So yes, there's, like, four or five approvers, and so no one reads anything. Does anyone read every requirement these days, or are they just blindly
(08:38):
signing? Right? Part of, I know, my consulting and how to do digital validation is getting away from the document-centric stuff, right? Because folks were just signing documents, they weren't reading the requirements, right? So the move to digital validation was really getting down to the entity level of an individual requirement, the individual tests. What does that mean? And
(09:00):
what excites me around agile SDLC, right, is writing user stories where you're encoding everything in one fell swoop, right, so that it's very transparent: what is it, what are you going to test, what's the acceptance criteria, pass/fail, done?
Steve Thompson (09:18):
Exactly right. And true agile, you know, I mean, we hear it mentioned a lot, and I know, I mean, full transparency, obviously I'm at a technology company and I'm excited about what's happening with our stuff and agile coming out. True agile, I mean, you know, the epics, the user stories, and then elaborating the functional detail and testing at that level, and the sprints and the burn-downs.
(09:42):
I mean, there's a lot of good tools, and we could learn from other industries and what they're doing and leverage the best practices, not live in our own bubble, dealing with the things we made up and the problems we created for ourselves.
Dori Gonzalez-Acevedo (09:55):
We made it up, which goes back to our age and the fact that we made it up. When Part 11 came out in 1997, we made up stuff to get through because we didn't know what to do, right. And so a lot of us have been saying the same thing, evolving over time: what is true risk-based? And somehow along the way, I think some of those messages have gotten diluted, right? Maybe not applied well, perhaps not wanting to have uncomfortable
(10:21):
conversations with each other. Yeah. Right. Really debating about what is high risk, low risk, medium risk, whatever risk, right? And just getting back to that kind of checklist sort of thing, which we talked about, too.
Steve Thompson (10:37):
That was a good conversation, right? Yeah, the checklists.
Dori Gonzalez-Acevedo (10:41):
What does that enable, right? And if you look, we talked a little about the FAA, right? Like, you look at the FAA, and they live and die by their checklists, because some of those checklists really do matter. But they actually go through the checklist.
Steve Thompson (10:56):
Yeah. And Joanna, or Joanne? Joanne Goldberg, yeah, with Medtronic, you know, she gave a really good presentation about the checklist. And the point that she brought up is, yeah, you check the box, but you don't have any context on what that yes meant. And so there's something more that's needed. You know, I mean, there's a time and place for pretty much everything, but still, you know, complete reliance upon a
(11:18):
checklist alone, you know, are you really getting value out of that? So, and then circling back, you know, on critical thinking, we really have to start thinking, and not just doing things by rote because it's burned in. And it's almost like some of our procedures are written on stone tablets handed down, and you can't change them. But a good attribute of a
(11:40):
quality system is continuous improvement.
Dori Gonzalez-Acevedo (11:44):
Well, you would think, right? But I know, and I'm sure you have your stories, but there are organizations that just say flat out it's going to take four months, six months for a change control to go through. Yeah, you know, and I don't even know how to respond to that anymore. Frankly, a change control should not take four to six months.
Steve Thompson (12:05):
Then you have a problem that's floating out there for four to six months. Right. Right.
Dori Gonzalez-Acevedo (12:09):
Like, you know, when you're looking at a quality management system, if that's the time that it's taking for a single point change to occur, what else is taking so long?
Steve Thompson (12:20):
Yeah, seems like
an opportunity for improvement.
Dori Gonzalez-Acevedo (12:24):
Interesting, right. So it was great. It was, you know... as always, Cisco is always available and helpful to all of us.
Steve Thompson (12:33):
That's excellent. We needed that, right. And it's so good to have that buy-in, because that is effecting change, which we need on a decades-old regulation. And when I was doing the presentation, we all know, you know, Part 11 is vague. And it's only a few pages, and then I compared it to
(12:54):
the GAMP and to the PIC/S and to the Annex 11. And I used some of their information, which said that Part 11 is "thou shalt" and Annex 11 is "how to," and so both of them really work well together. But back to a point that you just mentioned, you know, it is very vague, Part 11 in and of itself: you know, systems must be validated
(13:16):
to ensure... blah, you know, we know the rest of the requirement, but it's very short. And then we made up all the other stuff around it, because we had to, because it's like, there's the law, and now we've got to figure out what to do.
Dori Gonzalez-Acevedo (13:27):
So, and 25 years later, technology has changed dramatically, right? The applications that most of us are purchasing from software vendors, regardless of whether or not the software vendor is dedicated to the life science space: these are businesses with reputations out there, they're doing their due diligence to sell a product.
(13:50):
They might not follow the same lifecycle management or quality management system that we have as an industry, specifically. But they're making money there; that is why they're in business, right? And so, to put some trust in, and redefine some of those customer relationships with those vendors, to leverage that
(14:10):
stuff, right. I know you guys have done a great job of providing your customers with documentation that they can leverage. Other vendors do the same thing. It's changing that relationship, the vendor management. And Carlos did a wonderful talk around what that looks like now. It
(14:31):
means you need your procurement people, you need your legal people, you need your business people to really define what this stuff is and what's important, what's not important, right? It's much more complex, and that relationship building is not a single transaction anymore. It's a continuous conversation. I do a
(14:52):
lot of application managed services and represent, in that, like, the middle part there, right, because sponsors don't know what they need. You and I can help translate, yes, to the vendor.
Steve Thompson (15:03):
Yeah. You brought that point out: technology and, you know, the business don't always speak the same language, and you need a liaison. And you were saying you serve as a liaison, which is great; you know, you're multilingual.
Dori Gonzalez-Acevedo (15:16):
Yeah. Yeah.
Steve Thompson (15:18):
So, the other thing is that, you know, I know from the past you do Women in Validation, and you're now even expanding upon that. I have two of my colleagues that attended one of your sessions. And that, I think, is great, because, you know, it's good to empower all people in our industry, because we can all learn from each other. So how's that going? I am not going to your
(15:41):
meetings.
Dori Gonzalez-Acevedo (15:43):
Yeah. So we're still doing some Women Leading Validation, specifically within Connects, but we are expanding to Quality Connects, where we're going to, you know, incorporate the LGBTQ community, young emerging scholars, if you will, I don't know if that's the right term. But you know, there are pockets of underrepresented groups that we need to help
(16:05):
in order to get to the next level. And that helps us all get diversity of thought at the table. It helps. I know, like, Binney did her first presentation with Laurent, and, you know, that is part of my hope for a legacy, to help impart all my knowledge, as much as I can, to the next
(16:26):
generation, right, and let them have great ideas.
Steve Thompson (16:30):
You don't know this, but I had a sidebar conversation with somebody; I won't call her out by name. And she said how much she appreciated you. And I can't remember if you are her idol or her mentor, but she looks up to you. And so anyway, I didn't even tell you that; that's kind of a surprise. But it's good to know that you are making a
(16:51):
difference there. And Binney, with the presentation, it was really great. And I was telling you how important it is to get the word out and get up there. Because we learn, I learned a lot by doing the presentations and coming to these events. And yeah, it's just a great experience meeting people like you. Yeah,
Dori Gonzalez-Acevedo (17:09):
it is a thing. It gives us all an opportunity to expand our thoughts, coming with that curiosity mindset, right? When we do an event like this, it's very specific on topic; when we do Validation University, it's broader, and we have a lot of ground to cover in a short period of time, and we jump around from thing
(17:31):
to thing. But this is a very singular topic, and we went from beginning to end; we were able to also hear everyone and the context and the story in which it was told, and I thought it was really great to be able to have that thread throughout.
Steve Thompson (17:45):
And we met with the architects, and Mark from Fresenius. You know, I learned about how all of this basically started, and obviously having Cisco here from the FDA, it was really... I mean, we have the founders, the architects of CSA. And I know Mark, when he gave his presentation,
(18:05):
it was excellent, because we have the scripted, unscripted, ad hoc, exploratory, all these test types. And he gave examples of, you know, what those templates can look like and how we're overdoing things. But, and I don't know your thoughts, one thing that I noticed in some of the questions that were coming up is, well, okay, if it's high risk, then we use scripted.
(18:27):
If it's low risk, then we use exploratory. But it's not black and white. That's critical thinking, and it's gray. And we have to figure it out.
Dori Gonzalez-Acevedo (18:35):
People don't like the gray, Steve. It makes them uncomfortable, right? So one thing I try to instill is that we need to be okay being uncomfortable, sitting in that, right; we just need to be okay to say, okay, well, this is high, but we're going to do exploratory because it makes more sense to do exploratory in this regard. I think for folks, when we're asked to document our opinion, that is the
(19:01):
hard part. Right, right, because you're doing a risk assessment. And hopefully folks are doing the risk assessment with a variety of stakeholders, not a singular stakeholder doing the risk assessment; we want people to have diversity of thought at the risk assessment, but then to make a claim and say, this is what we think at this point in time, and therefore
(19:23):
we're going to do X, Y, and Z. And, oh, by the way, we have periodic reviews as part of our process. Exactly. And part of that process should then be evaluating, looking at that risk that we took, and then reassessing. That part of the process, for the most part, doesn't get done well, if at all.
Steve Thompson (19:42):
Right. And risk, really, risk is something that lives throughout the life of a system; risk is not something that you do at the very beginning, stamp it high, medium, or low, and forget about it. And that's another area where, in general, as an industry, we really need to understand, because risk today could be different from risk
(20:04):
tomorrow as things change.
Dori Gonzalez-Acevedo (20:06):
So, the concept of living entities and living parts of the lifecycle. You know, I try not to use the word "document" anymore, right? Because I think that sets in someone's mind an artifact that is fixed in time. Right, right. And when we talk about this, this is all lifecycle stuff, right? Whether you want to use
(20:28):
the validation lifecycle terminology, or you want to use system development lifecycle or whatever, it's a lifecycle. Exactly. That implies it is living.
Steve Thompson (20:37):
Exactly. Until it's retired.
Dori Gonzalez-Acevedo (20:40):
So, yeah, all of that being part of the process. And I think those skills, like we talked about, are not necessarily taught in school, right. So that was the other part: some folks in this industry have come from hard sciences, soft sciences, non-sciences, and this kind of,
(21:02):
you know, scientific mindset needs to get in there.
Steve Thompson (21:08):
And that goes to critical thinking and the biases and stuff that we bring. And we have to be open-minded and humble. And, I mean, those are the skills from critical thinking; again, back to Brandi's presentation, the things that we could learn in that respect. An open mind, just, you know, there are differences that we should at least consider; we may not end up following them, but we should at least think about
(21:30):
it. You know, thinking is important.
Dori Gonzalez-Acevedo (21:35):
And most of us do think; it's just, when push comes to shove, really defending the action that we take based on those thoughts.
Right,
Steve Thompson (21:45):
Right. You know, again, I want to circle back to Mark; he brought up a point, a question that was raised, not to pick on anybody, this is just the way it happens: you may do dry runs, and then you figure out how to make it work. And then, old school, you write the protocol based upon a successful series, like I think it was
(22:06):
like 10 dry runs as the hypothetical that was used. So you go through ten dry runs, you think you figured it out, you write the script. And now you run the script formally; now you've got 11 runs. And then afterward in the hallway I talked to Mark, and I'm like, yeah, and then you get a protocol error and you have to change the protocol. So that is a lot of
(22:26):
unnecessary time and effort, when you could do the exploratory, ad hoc, you know, just pound away at the system. And then if it is high risk, if it's truly high risk, and you've got good risk management, you can zero in on that and test that formally, scripted, with the right objective evidence,
Dori Gonzalez-Acevedo (22:43):
so that it's an actual true defect versus a scripting error. If I see another typographical error... But we're still doing that, a lot of folks, right?
Steve Thompson (22:54):
And it costs money to fix those little typos, right. Yeah, a lot of money, and people's time, to rework that,
Dori Gonzalez-Acevedo (23:00):
which, if we keep doing that cycle, doesn't get us to the more exciting things that you and I like to talk about, right? Test automation, machine learning, exactly, yeah, right. If we're getting bogged down in all of those little immaterial things that don't add up to actual value, we can't get to the exciting stuff.
Steve Thompson (23:18):
And it's interesting, because if our processes are ineffective and we're bogged down, new systems and technologies aren't being leveraged like they are in other industries. And we're really concerned about patient safety; what if we're holding things up, maybe holding up the approval of a new drug, a new biologic, a new medical
(23:42):
device? Maybe holding up that approval or delaying some releases of things could impact safety because of our own inefficiencies. So it's important for us to be efficient and get the right stuff out there to the people that need it at the right time.
Dori Gonzalez-Acevedo (23:58):
Right. We know quality engineering principles, shifting left, doing things faster, or failing forward faster, absolutely produce more effective everything, right, from bottom-line dollars to go-to-market strategies to getting the next new thing on the market, right,
(24:18):
which is what all of us arebeing employed to do
Steve Thompson (24:22):
Our job, in one way or another, yeah. And then, we were bouncing around, there's just so much. Yeah, and you know, it's exciting, but it's exhausting, but it's fun. But we had the lunch today and it was great, because we were talking about artificial intelligence, machine learning, ChatGPT, blockchain, all of this stuff that can be used, and
(24:46):
also the concerns about using it. And then the news articles, like, you know, the head of the FDA saying that we need to be nimble about it, we've got to be careful, Google announcing they're in the game now. There is fear and there's excitement at the same time. It was, I think, a great session at lunch. Yeah,
Dori Gonzalez-Acevedo (25:05):
It is. Because it's very exciting, right? There's the what-ifs, the how-tos; for a systems thinker like myself, right, it just makes my mind kind of go, what can we do here, and here, and here? Right? And there's so much to do, so much opportunity, right? Obviously, not every company can go through all those iterations,
(25:27):
and they need to be smart about where to deploy it. Where's the most added value? I do a lot of cost-benefit analysis, right, for companies, like, is this the time, right? There's tons of stuff out there. And where does it make the most sense to utilize it? Where do we get the most benefit from it? Where is
(25:48):
it going to be the most risk from a regulatory perspective? Or where can it really advance? How cutting edge does a company want to be? Right? I still do a talk around, you know, technology adoption. Alright, are you on the laggard side, or are you
(26:08):
right on the cutting edge? And I think each of these companies is doing their own walk around this, right, where to incorporate and where not to; to your point earlier around incorporating other industries into life sciences to get those lessons learned. I know a lot of companies are hiring from AWS or Azure or other industries, finance systems,
(26:32):
coming in as the IT guys, right?
Because they've done it?
Steve Thompson (26:37):
Well, right. We can learn from them, we can absolutely learn from them.
Dori Gonzalez-Acevedo (26:41):
Right. I mean, we are so reliant on Face ID, on my phone, right, to do everything, to log in to all of my passwords. But we don't do that in life sciences.
Steve Thompson (26:53):
Yeah, I mean, there's a fear of artificial intelligence and machine learning, but we use it every day. Yeah. All our devices, our credit card companies, everything, you know. I mean, it's serious, but people know my dog is named Alexa. It all started because of the devices; my dog thinks her name's Alexa. But
(27:13):
But anyway, I mean, we use it all the time. And the new generation, the new workforce, has grown up on technology; guess what, they don't take pens out and write on paper. And to attract talent and bring them into the organization, when we have archaic systems and rigid ways? Not only can we not
(27:34):
attract new talent to bring into the organization and keep our industry going, but we can't even bring folks from other industries, like the AWS folks or whatever, because they're like, why would I step backwards when I can move forward? It's dangerous for technology people to step backwards; they have to always be advancing. Yeah,
Dori Gonzalez-Acevedo (27:53):
That's a really good point. I think, for organizations, I was always thinking about it from a compliance risk perspective, but from the workforce perspective, you're spot on. Like, that is a big risk. I know folks want to retain talent, but if we're not doing that from a technology perspective, in some of the most basic ways, right, what's going to happen?
Steve Thompson (28:16):
Well, I mean, like I said, you know, Google announced AI for drug discovery. So you've got a big company that can attract talent. So guess what, now we are competing, in our industry, with a technology company that's moving into this space. Which is all good, you know, but we all have to be working together to advance.
(28:37):
Yeah.
Dori Gonzalez-Acevedo (28:40):
So tell me a little bit more, because I know you're super excited about AI, machine learning. What are some of your thoughts? What are your predictions?
Steve Thompson (28:48):
I predict... I mean, so, I've been in computers, that was my formal education, and I've known about artificial intelligence for a long time. I'll give you a simple example, right? For a long time, artificial intelligence was not capable of understanding this sentence. Now, if I tell you this sentence, you could write it out on a piece of paper, no problem; all of us can do
(29:11):
it. And the sentence is: "That's right, Mrs. Wright, I'll write you a letter." All right. For a long time, AI could not figure that out. It can now; I've tested it out, and if you do dictation or whatever, it will do it. So that gets into natural language processing, NLP. And then we've got the large language models, LLMs, which is what ChatGPT relies upon. So
(29:34):
these are trillions of records in a neural network that is so complex that it takes a lot. I mean, you could spend a billion dollars, which some of these big companies can do, on developing these large language models. That's why it's so fascinating. Now, the caution is, just like I said with the CSA, when it thought it was Controlled
(29:55):
Substances Act versus Computer Software Assurance. There's a lot of learning that needs to happen. And we talked about this at lunch. So we, who have been in industry for a long time, can bring knowledge and wisdom to the table. And by using the technology, we can teach the systems, because it's machine learning, and it's getting better and better. So my belief
(30:17):
is that we can really effect change and help shape, believe it or not, these large language models and the neural networks to give good information, because ChatGPT is also a great garbage collector. And so we have to know what's garbage and what is good, solid information we can take. So there's a lot of learning that has to happen. The other thing is, it's an open
(30:40):
system, and it's using the data. So companies have to be very careful with intellectual property; they have to be careful with personally identifiable information in clinical trials, personal health information. You don't want that stuff getting out there. So somehow, as an industry, can we figure out our own large language model? Can we have encryption?
(31:02):
Can we make it a closed system that we can share? And that's when we were talking about blockchain. So all of these technologies are coming together. It's exciting. But there is the fear that there could be misuse. And we were talking about this as well: a lot of regulatory action comes as a result of a catastrophe, like Therac-25. And I know Brandi had
(31:22):
insights on that. And then, you know, the thalidomide babies and such. So we create stuff, and it causes a public health concern; and misuse of algorithms, bias in data, that is the potential for a disaster, possibly a catastrophe. And we have to be aware of that and avoid it, or even a breach of
(31:43):
information. So we need to understand the concerns and proceed with caution. It's a yellow light. It's not a total green, it's not a red, it's a yellow.
Dori Gonzalez-Acevedo (31:53):
Yeah. And so, along those lines of where to apply it, have you seen some examples today?
Steve Thompson (32:02):
Yeah. So yeah, I'm messing around with it, right. And just for fun, you know, I encourage everybody to go on to OpenAI's ChatGPT and play with it. You saw in my slides I even generated images through DALL-E, and it's a lot of fun. I use it all the time as a research tool, as a
(32:26):
first rough draft. And so I use it to help jumpstart me on the presentations, and I'll continue to use it, right; it was instrumental in me figuring out PIC/S, Annex 11, Part 11, you know, GAMP 5, all of that, right? CSA it didn't get quite right, but it's new. So I use it. But if you go on there, for fun, I said, give me Part 11 requirements, and it did a great
(32:50):
job. I turned around and said, write a test script for one of the requirements, like logging in, and it did a phenomenal job, in my opinion. So it is able to create drafts of requirements and drafts of test scripts that jumpstart me, that I can use. Now, of course, I'm excited because, full transparency, I work for a technology company, and I see the potential of how
(33:12):
that can go into the technology and essentially automate. And when I say I think the face of validation will be completely different in five years, it is my belief. And, doing some test runs, very, very early-stage test runs, I believe there's the potential that we could move into validation that
(33:32):
takes minutes, not validation that takes months. And if we can do that and get into things like continuous process verification and stuff like that, if we could get to full automation, then we've solved our validation problem. Because guess what, if we can validate all the time, in minutes, then why not validate every change to the system? Why not
(33:56):
fire a job off at night to make sure your system is still in a validated state?
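For readers who want to try the kind of prompt-driven drafting Steve describes, here is a minimal sketch, assuming the OpenAI Python SDK (openai>=1.0), an OPENAI_API_KEY in the environment, and an illustrative model name and requirement; none of these details come from the conversation, and any draft the model returns would still need human review and approval before use in a regulated context.

    # Sketch: ask an LLM to draft a test script for a Part 11-style requirement.
    # Assumes the OpenAI Python SDK (openai>=1.0) and OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical requirement text used only for illustration.
    requirement = (
        "REQ-042: The system shall require a unique user ID and password to log in, "
        "and shall lock the account after five consecutive failed attempts."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name; substitute whatever your account offers
        messages=[
            {
                "role": "system",
                "content": (
                    "You draft software test scripts for systems subject to 21 CFR Part 11. "
                    "Return numbered steps, expected results, and clear acceptance criteria."
                ),
            },
            {
                "role": "user",
                "content": f"Draft a test script for this requirement:\n{requirement}",
            },
        ],
    )

    draft = response.choices[0].message.content
    print(draft)  # a first draft only; a human still reviews, edits, and approves it

The point is the workflow rather than any particular tool: the model produces a first draft in seconds, and the reviewer's knowledge and wisdom, as Steve puts it, decide what is garbage and what is usable.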
Dori Gonzalez-Acevedo (34:01):
And that is the ultimate, from a test automation perspective, right? And while test automation tools do well, right, you still need a human to build out that scenario, train that system; all those things still need training, just in the same way as ChatGPT, right. We still need a human to validate, if you will, right, yes, whether the content that's coming out is good
(34:25):
or not, just like you did. I think that the acceleration, you know, in the traditional shift-left kind of methodology and getting to test automation... you want to get to an ideal of 80% test automation, 20% manual
(34:46):
testing; that's kind of the ideal in a true software development kind of model, right. But if we can reduce that 20% of manual testing to hybrid, mostly automated, manual-assisted testing, right, and get to the true content, right? Get better requirements, get really, really clear acceptance
(35:09):
criteria, and shift that mental load, right, to added-value operations. I think a couple of things will happen. One, I think we'll have a workforce that is more enjoyable for folks, right? Who wants to be continuously, you know, making sure that they have every time and date stamp, right, applied,
(35:31):
right? Another pet peeve, around audit trail review, right? Like, you know, those sorts of things, these are automatable tasks that we need to stop doing, from an overall industry CSV perspective, right? So we can get to the more meaty stuff.
Steve Thompson (35:51):
Well, I mean, this could be controversial to the listeners, but I know that we talked about auditing audit trails, and, you know, obviously I like technology. But to me, it's like, technology is great at data and information, better than human beings. And if you take a human being to really read an audit trail, it's a lot of data. And if I'm doing that, I start thinking, you know, what's my
(36:13):
dog doing, you know what, my mind wanders, and that's human error. And so I can throw technology at it that will look at every bit and every byte; with artificial intelligence and machine learning, it can look at patterns, it could see patterns that I may not even see in the data. So it's capable of identifying things across the entire audit trail, all the time. So,
(36:35):
you know, I mean, again, it's like, how can we leverage technology for the non-value-added stuff that we're really doing, these mundane tasks? I would rather do something exciting and new and learn and think, you know, rather than just do a rote, mundane task over and over again. So I don't mean to offend anybody. But again, it's critical thinking: let's
(36:57):
stop and ask why, and see if there's really value. And yes, I mean, I understand the objective of it. But if we can leverage technology, and then do better the things that we as human beings, with knowledge and wisdom, can do, then let's leverage what we can do best and let technology do what it does best. Yeah, partner with technology. Yeah,
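As a concrete, hedged illustration of the audit trail idea Steve describes, the sketch below runs a simple anomaly detector over an exported audit trail so a reviewer reads only the flagged entries instead of every row. It assumes a hypothetical CSV export with timestamp and user columns and uses pandas with scikit-learn's IsolationForest; the file name, columns, features, and thresholds are all illustrative, and whatever gets flagged still needs human judgment.

    # Sketch: flag unusual activity in an audit trail export for focused human review.
    # Assumes a CSV with at least 'timestamp' and 'user' columns (hypothetical export format).
    import pandas as pd
    from sklearn.ensemble import IsolationForest

    audit = pd.read_csv("audit_trail_export.csv", parse_dates=["timestamp"])

    # Simple features: how many events each user generates per hour, and at what hour of day.
    per_hour = (
        audit.groupby(["user", pd.Grouper(key="timestamp", freq="1h")])
             .size()
             .rename("events_per_hour")
             .reset_index()
    )
    per_hour["hour_of_day"] = per_hour["timestamp"].dt.hour

    # Isolation Forest marks the rarest combinations as anomalies (label -1).
    model = IsolationForest(contamination=0.01, random_state=0)
    per_hour["flag"] = model.fit_predict(per_hour[["events_per_hour", "hour_of_day"]])

    flagged = per_hour[per_hour["flag"] == -1].sort_values("events_per_hour", ascending=False)
    print(flagged.head(20))  # the short list a human reviewer actually reads

This is the "partner with technology" split from the conversation: the machine scans every bit and byte for patterns, and the human applies knowledge and wisdom to the handful of entries worth a closer look.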
Dori Gonzalez-Acevedo (37:19):
It is. And so, I think you were mentioning during lunch the increase in data scientists at the FDA, weren't you?
Steve Thompson (37:27):
I didn't, but I
wouldn't be surprised. Yeah.
Dori Gonzalez-Acevedo (37:31):
I mean, I think, when push comes to shove, with the amount of algorithms that are coming in to be assessed, right, we really need to have experts in that kind of structure of data to be able to clinically validate, again, whether or not those are good algorithms.
Steve Thompson (37:52):
Yeah, definitely. Like, I mean, I do know from previous lives that the FDA is really good, I'm sure all the agencies are really good; they could find issues in data that we didn't see when we submitted it. And if they kick back your data, it's embarrassing; it's not a good thing if the data gets kicked back, right. So they
(38:14):
obviously know statistics. I mean, really, machine learning is statistics. You know, it's linear algebra. I mean, I'm really oversimplifying, but it's curve fitting, linear algebra, statistics, essentially, is what it is. And they're really, really good at it. And I know they have their center of excellence, and, you know, they have teams working on it. But this whole algorithm question, we talked about, like, in clinical
(38:35):
trials, the IRBs, and the concept of an algorithm review board, having, you know, an independent group of folks to really look at the algorithms, to make sure that we're using the right algorithms and we're using the right datasets without bias. Because, again, we could misuse algorithms, use data
(38:55):
that's biased, come out with decisions, and then create problems for ourselves. And then, as a result of that, we've, you know, affected our ability to use the technology, because we've just created a problem. Now we've got to scurry and figure out how to control it to make sure it doesn't happen again.
Dori Gonzalez-Acevedo (39:13):
Right, right. It's all very exciting. I do get very energized when I come to a Connects congress and we're able to have this dialogue. What I often come away with is how to carry that momentum, how to continue to inspire folks to do something different, to take the learnings that many of us have and add on and challenge and engage in
(39:36):
dialogue around all this stuff. And it's been pretty palpable, actually, the energy in the room this time.
Steve Thompson (39:43):
I have never... in the past month or two, I personally have never seen anything like it. I was on the elevator talking to some random person that was here training nurses at the University of Pennsylvania, and he brought up ChatGPT. Everybody's talking about it. Yeah. My friend's husband is using ChatGPT to write love
(40:05):
poems to his wife. That's awesome. Everybody's using it. I don't know, that's cheating.
Dori Gonzalez-Acevedo (40:19):
Make it your own afterwards. But I think that is part of it, right? It's experimenting and it being that accessible. Also, I think some of the technology historically had not been as accessible as this, right. So now this is really super accessible to everyday people, right? And so, what do you do with it? When I engage with
(40:43):
some senior leaders, they have so many ideas, right, of what to do with it, and use cases, and then you need to really figure out what is the right thing to go to market with. And that's hard, because, you know, how much money do you invest in a go-to-market strategy for a new drug line or whatever,
(41:06):
right? It takes a lot, historically in the billions, right, to bring a drug to market, where it can't cost that much anymore. We have to do better.
Steve Thompson (41:16):
I don't know what the numbers are now, but I think it used to be 10 years and a billion dollars. Yeah, but yeah, it's a lot of money. But I mean, we could leverage the technology appropriately, right, safely, you know, proceed with caution.
Dori Gonzalez-Acevedo (41:29):
But we did it during COVID, yes, did we not?
Steve Thompson (41:33):
Well, even at lunch... I mean, we're doing presentations next week, and I'll be talking about this again; now it's my new passion. And in doing the research, I came across slides of companies that are already using it in healthcare, and also in life sciences, and also things the FDA has already approved using AI/ML. And I was actually surprised at how much
(41:56):
there actually is. So we say our industry is behind, but it's happening. It's happening. And in my slides, in preparation, I put something like, will AI be used? And then I did the research, and, boy, is it already. Yeah.
Dori Gonzalez-Acevedo (42:11):
That again gets to the technology adoption curve, right? Yeah, you have those forefront thinkers that are well in advance of the curve. And life sciences in general, on the whole, have been toward the mid-to-laggard side. And so within some of the big ones, there might be a division that is doing emerging technologies,
(42:34):
right. They're always at the forefront of the mother company, right. And that's great. But how many new, you know, small emerging technology companies are out there today? A colleague told me, around cell and gene therapy, it's like 1,000 companies incorporated in
(42:55):
the last three years, or something like that, because those are such individual ideas, and in order to get to a therapy, right... but that's enormous.
Steve Thompson (43:05):
That's... and, you know what, I'm sure the same thing is going to happen: if a company decides not to, it's happening anyway, and you're going to soon realize that you're falling behind. And so, in order to stay current or to catch up, the bigger companies will start eating up the companies that have created these advanced technologies. And it's
(43:26):
like, now we need to buy that company to bring it in, yeah, just to keep up. So yeah, I feel you're going to do it one way or another.
Dori Gonzalez-Acevedo (43:33):
So you'd better get on board. Yeah. But it's true. I mean, we do see M&A strategies, like, galore, that are just popping up. And sometimes that is part of those smaller companies' strategies to begin with, right? And they target that. But it is happening, you're absolutely right. And so where do you want to be? Do you want to be in
(43:55):
advance? Or do you want to be falling behind? And that's an individual corporation-to-corporation conversation.
Steve Thompson (44:04):
Well, and there's another thing, on resistance, and one of the fears is, oh my gosh, this is going to take over my job, and I'm going to be out of work. But there are new jobs being created. There's such a demand right now. And there's this prompt engineer, this new position that has come up, where they're making $400,000 a year
(44:26):
as prompt engineers, to basically help train these generative AIs, which ChatGPT is, and there's more. I mean, there's Bing and, you know, others; ChatGPT is not the only kid on the block, right. And so, you know, there are new opportunities that are springing up; you can't keep up with the new opportunities. So do you
(44:47):
want to be stuck doing the same thing that you've been doing for so long? Or do you want to advance? It's interesting, it's fun, and I don't see less work. I mean, in my role, technology has been kind of my life. I mean, LIMS systems: they thought, oh my gosh, you automate the laboratory, we're going to be out of a job. Guess what, it creates... still to this
(45:08):
day, LIMS folks are in demand, right? It creates new opportunities.
Dori Gonzalez-Acevedo (45:11):
A great point, because all of those systems that we currently use and have in place need to incorporate some of this stuff in those systems, too, right. So while you and I have been in the business for 25 years, we could still probably have another 25 years if we really wanted to, because it's just going to evolve. Yeah. Right. And so whether or not you want to continue to advance those technologies, those specific use cases are still
(45:32):
required, right. And it's just a matter of how to do that differently.
Steve Thompson (45:40):
And if you at least stay abreast of the technology, you will be in more demand, yeah. If you're stuck in, you know, the old way, you won't be in as much demand. Yeah, yeah. I mean, you kind of see it,
Dori Gonzalez-Acevedo (45:54):
right. Yeah. So what's next? You go in next week, you do more talks?
Steve Thompson (45:59):
Yeah, we're doing San Diego and San Francisco. We're doing these... there's just so much happening. We're doing these one-day seminars. And it's not just us, you know; we bring our customers and people from industry in, get a group of people together, give some presentations, have a good time, share ideas, like we're doing.
(46:19):
And obviously, with Connects, we have things coming up, so stay tuned. Yeah, even us as a company, we're partnering up with Connects to, you know, bring some other things out. And definitely the Connects conferences, we always see that, and it's been great. And yeah, so, so yeah, what's next? I don't know, I'll wake up tomorrow and,
(46:42):
you know, 10,000 new things will be brand new. I'm exhausted.
Dori Gonzalez-Acevedo (46:46):
So I'll be in Indianapolis next month, with Brandi and the ISPE GAMP community, the Great Lakes chapter. So there'll be an educational day there.
Steve Thompson (46:57):
What are you doing? What are you talking about?
Dori Gonzalez-Acevedo (46:59):
I'm not speaking, but it's just an education day, right? Because I think this is the other thing: you know, GAMP came out with the second edition, and I think there's still an opportunity to educate on that. Some never knew the original GAMP, and so it's kind of introducing that for the first time. But it's also really doubling down on the changes that the GAMP community made, because I think those were
(47:23):
timely and important, right? They complement the CSA draft guidance, they complement everything else that's coming out. And we need to leverage that. And so...
Steve Thompson (47:35):
Exactly. You know, I mean, I know that we're probably wrapping things up here, but it was funny, because, talking about Part 11: Annex 11 calls out cloud computing, GAMP calls out cloud computing, PIC/S calls it out. You know why Part 11 doesn't call it out?
Dori Gonzalez-Acevedo (47:52):
Because it didn't exist. It didn't exist.
Steve Thompson (47:54):
Right, when they wrote it, it wasn't out.
Dori Gonzalez-Acevedo (47:55):
It didn't exist. Yeah, we could do a whole other topic on cloud computing, I'm sure. We'll see. Thank you so much for wrapping up today with me. And we're going to share this podcast on both of our podcasts. And I'll see you soon. Always a pleasure. Thanks a lot. All right. Bye.