
April 2, 2025 31 mins

Christian Espinosa, founder of Blue Goat Cyber, reveals the critical vulnerabilities in medical devices and how his company is working to secure the technology that keeps patients alive. After surviving a near-fatal health crisis that was diagnosed using a medical device, Christian dedicated his career to ensuring these life-saving technologies remain secure from increasingly sophisticated cyber threats.

• Average hospital bed connected to 14 medical devices, creating numerous attack vectors
• Medical device hacking could lead to patient harm or death through manipulation of pacemakers, surgical robots, or diagnostic systems
• Christian personally experienced the importance of secure medical devices after six blood clots nearly took his life
• Hospitals represent "hostile environments" from a cybersecurity perspective with poorly segmented networks
• AI-enabled medical devices introduce new vulnerabilities through potential data poisoning attacks
• Securing medical devices from the ground up during development is 90% more effective than adding security later
• FDA and regulatory bodies are only now catching up to security standards Blue Goat has implemented for years
• Medical device manufacturers often delay security considerations until just before submission, causing costly delays

Listen to Christian's MedDevice Cyber Podcast and visit bluegoatcyber.com for more information on securing medical technology and protecting patient safety.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Christian, thanks for joining. A lot of my listeners maybe went back, because I saw that after about six months or so they went back and watched and listened to the first episode, which just happens to be the first time that I ever talked to you. And so thanks for coming back on what is now Cybernomics. Back then it was Security Market Watch, but you were the

(00:24):
first guest on my podcast, so it's such an honor to talk to you again.

Speaker 2 (00:28):
Yeah, I remember that. I was in Idaho when we recorded, so I appreciate you having me on as the first guest. I feel honored for that. Glad to be here today.

Speaker 1 (00:36):
Good, good, good to see you. You know, before that, I'm thinking back, actually it was even before the Security Market Watch days. Back then it was Cyber Chomps, that's right. You know, now I'm remembering, that was such a crazy idea. I was like, I'm going to read a book a week in cybersecurity and

(00:57):
then interview the author. So I read your book. And remind everybody that's listening, what is the name of that book? And yeah, let's go from there.

Speaker 2 (01:08):
The name of the book is The Smartest Person in the Room, and it's really about developing emotional intelligence for highly rationally intelligent people. In cybersecurity, there are a lot of people with high IQs but almost no EQ skill, or emotional intelligence. So it's about raising the EQ to complement the IQ.

Speaker 1 (01:29):
Yeah, I read that whole book in one week, I remember. I felt so called out, because I feel like I'm one of those people, and my girlfriend always tells me, she's like, you know, you're a smart guy, but sometimes you don't have EQ, you can't read the room. And every time she says that, I'm like, maybe I should go back and read Christian's book one more time, because it's really relevant

(01:52):
today. So whenever you pitch this book to people, or you offer it to them, do they typically feel pretty offended, or are they kind of receptive to it?

Speaker 2 (02:04):
I haven't been pitching it per se. I mean, on social media I have a social media person that posts reviews and kind of soft-pitches it. But I'm very careful about saying, you know, you don't have any IQ... or, sorry, EQ skills, so you might need to read my book.

(02:26):
I position it as, if you've hit a glass ceiling in your career, and it's a tech career, then you might want to check out my book, because often that glass ceiling is because of a lack of awareness or people skills. And that's the way I typically position it.

Speaker 1 (02:43):
Huh, that's a really EQ way of putting it. You know, like, somebody like me, who may not have a ton of EQ, I might see that somebody needs that book and I'm gonna be like, dude, you have no EQ, you can't read the room.

Speaker 2 (02:56):
You should read my book? And that's probably not gonna get them to read it, if that's how you... yeah, exactly. If they're like, dude, why would I read your book

Speaker 1 (03:03):
if this is the way that you approach me? You obviously have no idea what you're talking about. But you do, Christian, so you know. Yeah, I'll go back and read that book again. Actually, I enjoyed it very much, and you had this idea that stuck, and it was the idea of Kaizen, which I found very cool.

(03:24):
Right, it was like the Japanese way of just kind of going with the flow and the idea of incremental progress. Right, not being too hard on yourself, not being too hard on your employees and the people around you, but having a lot of grace and forgiveness for yourself. So could you tell me a little bit about Kaizen?

(03:47):
And only because I know somebody's listening to this right now, and I know that typically, if they're a security person, they're probably super hard on themselves. And I think that that idea of Kaizen does give people a lot of grace and forgiveness, and ultimately leads to achieving great success.

(04:07):
So, I don't... obviously I'm not the authority on this, you are. So what is Kaizen?

Speaker 2 (04:15):
Kaizen, like you alluded to, is a Japanese word that means constant improvement, or CANI, constant and never-ending improvement. And there are a couple facets of Kaizen that I think are important. One of them is, a lot of people are afraid to do something new because they think they have to master it. And from my experience in life, if you adopt the philosophy of

(04:37):
Kaizen, it gives you the courage to take the first step and realize it's okay, if I have a misstep I can correct along the way. But now I know which direction is illuminated, at least the next couple steps, and then I'll take the next step, and it might be illuminated a little bit different way. So it gives you that courage to take the first step. And also, as you alluded to, from a forgiveness and grace
And also, as you alluded to,from a forgiveness and grace

(04:57):
perspective, in life it's easyto beat yourself up.
I know I used to beat myself upa lot.
I probably still do.
Sometimes I look in the mirror.
I'm like I should be doing this, this, this and this, and if I
just step back and look atKaizen and say, you know what
I'm making improvements, and ifI look at my life five years
from now, I'm sure I will havemassively improved.

(05:19):
But we tend to overestimate what we can do in a year and underestimate what we can do in three to five.

Speaker 1 (05:31):
Yeah, it's like boiling the frog, but in an optimistic way. Right, you're not boiling the frog to the frog's detriment. You're doing things, little things, you know, day after day, one step in front of the other, until one day you wake up and, hey, you've got a podcast, you've got a thousand more followers, you've got that position that you wanted.

(05:51):
And I've applied that concept of Kaizen after our conversation. And when I say I read your book, I really did, and I took it to heart, and I applied that concept of Kaizen to the point that, if you go on my TikTok, actually I only have like four videos on my TikTok, one of them is me talking about Kaizen, and a lot of people reached out and commented and liked it.

(06:14):
And when I say a lot of people, I mean, for me, a lot of people on TikTok is like 45 people, right, but it's more people than have ever reached out to me on TikTok. And I applied it over the years, and so thank you for that. That was a really good conversation, and I applied it to my life and I saw incremental improvement. And that was about, what, three years ago? Yeah, it was about three years ago. Wow. Yeah, and a lot has

(06:37):
changed since then. Speaking of change, you've got a brand new company, Blue Goat Cyber, and you are specializing in the medtech field, which is really interesting to me because medical devices are kind of, you know, they're kind of important. You know, they sort of keep

(06:58):
people alive, and that, to me, is a very important thing. And so, you know, with these cyborgs running around, people with electronic hearts and all these things, pacemakers. And my dad was recently diagnosed with a not-so-good disease, not

(07:18):
terminal, and now I'm thinking about medical devices more than ever, right? And if one of these things, a pacemaker, were to get breached or hacked, because everything's connected to the internet these days, lots of things can go wrong. And so why did you form Blue Goat Cyber?

(07:38):
Why did you go the medtech route, and the medical device security route, as opposed to pretty much anything else that you could be working on? I mean, you're a really smart person, you're one of the geniuses, I think, and I know a lot of people don't like to be called a genius, especially the geniuses, but I'm going to say it: you're a really smart guy and I think you're a genius.

(07:58):
You could have applied your mind to so many things, but why medtech?

Speaker 2 (08:04):
My wife used to be a cardiac nurse, so she works for my company now, and she understands the importance of these devices. My head of sales used to be a cardiac surgeon, so he understands all the language from a medical perspective and the importance of these devices, and he's constantly saying, I wish I had this sort of device when I was doing heart surgery.

(08:26):
So I haven't surrounded myself with just a bunch of cybersecurity geeks. I have people that know the industry, and it's a very challenging industry, because you've got the FDA, the MDR, all these requirements. You've got these very complex products that you have to test, every type of interface and wireless connection and entry point.

(08:46):
You have threat modeling. You have to do static application security testing. So we have the ability to do all that, but we also have the perspective of these devices. And, from a value proposition perspective, we offer a fixed fee. We've been doing this since 2014. I did it in my first company, so I know how much effort it takes

(09:07):
. So we do a fixed fee, guaranteed. A lot of these innovators are looking for investment from investors, and if I can give them a fixed fee, they know what to ask for when they add the cybersecurity. And we've done over 150 submissions so far. We haven't had any rejected by the FDA, or any of what are called deficiencies come back, and we guarantee our work.

(09:28):
We partner with the device manufacturer all the way through their clearance, and sometimes it takes the FDA three or more months to clear the device. But we are there with them to make sure the device is secure and it gets on the market, and then afterwards we can help them monitor the device, what's called post-market management.

(09:49):
I sold my last company in December of 2020. We did medtech cybersecurity in that company, but it wasn't our sole focus. And while I was working for the parent company, in February of 2022, I almost died from six blood clots. If it wasn't for a medical device, I wouldn't be here.

(10:10):
A portable Doppler ultrasound was able to quickly diagnose me, and after getting through that depression, you know, at first I was grateful to be alive, but then I felt caged. I was taking these blood thinners. I just decided to stop taking the blood thinners, reclaim my health, get all my blood work done, get another test on my leg. And that simple decision, it wasn't an easy

(10:39):
decision, but it was a simple one, gave me the courage to start a new business. And I thought, well, maybe the universe is telling me this is what I should focus on, since I wouldn't be here without a medical device. If that device was recalled because somebody hacked into it or gave a misdiagnosis, then, yeah, I wouldn't be here.

Speaker 1 (10:53):
Oh, okay, that's a good story. I think that's a really good reason. And that's not the universe just knocking on the door. That's the universe, I feel like, breaking the door down and saying, dude, you need to do this. You are specially and uniquely qualified to do this. So what would happen if a medical device were hacked? I mean, is that something that people really should be worried

(11:13):
about, or is it kind of like pie in the sky? You know, are we just going to freak people out, or, you know, are the risks real?

Speaker 2 (11:21):
The risks are super real. On average, there are 14 devices per hospital bed. A lot of those have vulnerabilities, and we've only really started, as of the end of 2023, raising the bar for cybersecurity. Before then, pretty much anything could get on the market. And we're migrating towards more and more surgical robots,

(11:44):
as an example, and those are going towards autonomous surgical robots. So imagine if somebody hacks into the surgical robot that's performing surgery on your spine. They could paralyze you. Or, like you talked about, Dick Cheney had his defibrillator removed because there's a legitimate threat: someone could wirelessly connect to it and shock him to death.
So yeah, and it's not just those systems, it's also diagnostic

(12:06):
systems. In the industry we call it IVD, in vitro diagnostics. It's a system that takes a sample of your tissue or blood, tells you what's wrong with you, and recommends a course of treatment. So if your blood has something like sepsis, which means your blood is toxic, and that device malfunctions or there's a delay or a misdiagnosis, then you could die within 24 hours as well.

(12:27):
So I think the risk is super real. And I know a lot of devices have been recalled, more than most people realize, and I think there have been some incidents that have been sort of covered up.

Speaker 1 (12:40):
Why are people not freaking out about this? I mean, this sounds like a big problem.

Speaker 2 (12:46):
Well, if you think about it, these devices are on a hospital or healthcare delivery organization's environment, which, from my perspective and my team's perspective, we consider a hostile environment, because just about every day you hear of a bigger hospital data breach. So these devices are on the hospital environment and they're

(13:06):
typically not segmented. Typically, you know, you can get to them from, like, the kiosk computer where a guest can check the internet, or you can go to the chapel and plug into the ethernet port and get to these medical devices. So it's a big deal. They're constantly being bombarded and attacked.

Speaker 1 (13:23):
Let's say one of these devices gets hacked at a hospital. Who's on the hook for that?

Speaker 2 (13:32):
Two parties. One is the medical device manufacturer themselves, the company that created the device, and the second is the hospital as well. And there's a big push now where a medical device in the US has to be cleared or approved by the FDA in order for

(13:52):
the device manufacturer to sell it. In addition to that, though, we've seen a shift where, because of all these data breaches at hospitals, they're asking for more scrutiny of the device than the FDA requires, because they don't want to accept the risk of putting it on their environment. Because, imagine, you have a patient come to your hospital, one of those 14 devices connected to their bed is

(14:12):
compromised, and something happens to that patient. Of course, it's not going to look good for the hospital or the device manufacturer.

Speaker 1 (14:19):
When these devices are hacked or breached or whatever, what are some signs that the hospital can look for, or that the patients themselves can look for?

Speaker 2 (14:32):
The signs are typically anomalous activity, inconsistent results. There are a lot of AI-enabled medical devices now, and if the model is skewed, or there are model evasion attacks or data poisoning attacks, you know, that's something else we have to look at. So it's really about understanding the risk to the

(14:53):
device and what some of those indicators are. And one of the requirements for medical device manufacturers is that the device manufacturer has to provide labeling of all the risks. So if I'm the hospital and I purchase a device, I know what the risks are, and I know, like, if there's a tamper-proof seal and it's broken, maybe I should consider the device compromised. I also get what's called the software bill of materials, so I

(15:16):
know all the third-party components that make up the device. So if something like Shellshock happens again, I can look at my device's bill of materials and say, oh, this is vulnerable to that, and maybe I should contact the manufacturer, take it off my environment, or have a tech come out from the manufacturer to update it. And that's the other challenge: these devices are on a hospital

(15:38):
network, but the hospital administrators don't update the devices. They typically have embedded operating systems. So now you have a device that you can't control, kind of like an IoT device to a degree, but you have to rely on the manufacturer to get a patch out to the device, and often the devices are air-gapped and there's a USB port, so they have to send somebody to the hospital.
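
The bill-of-materials lookup Christian describes can be sketched in a few lines. Everything below is a hypothetical illustration, not a real device SBOM or advisory feed: the device name, components, and affected versions are invented, and the dictionary only loosely follows the shape of a CycloneDX-style component list.

```python
# Hypothetical SBOM for one medical device: a flat list of third-party
# components and their versions, as the manufacturer would supply it.
sbom = {
    "device": "ExampleVitalsMonitor",  # made-up device name
    "components": [
        {"name": "bash", "version": "4.2"},
        {"name": "openssl", "version": "1.0.1f"},
        {"name": "busybox", "version": "1.33.1"},
    ],
}

# Hypothetical advisory feed: component name -> versions known to be affected
# (think of a Shellshock-style disclosure landing for "bash").
advisories = {
    "bash": {"4.2", "4.3"},
}

def affected_components(sbom, advisories):
    """Return SBOM components whose name/version pair appears in the feed."""
    return [
        c for c in sbom["components"]
        if c["version"] in advisories.get(c["name"], set())
    ]

for c in affected_components(sbom, advisories):
    print(f'{sbom["device"]}: {c["name"]} {c["version"]} is affected; '
          "contact the manufacturer or pull the device off the network")
```

This is why hospitals want the SBOM with the device: when a new disclosure lands, they can answer "are we exposed?" without waiting on the manufacturer.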

Speaker 1 (16:00):
Well, is there no way to do this over the internet?

Speaker 2 (16:02):
Well, there are, but, as you know, anytime you connect something to the internet, even though it's supposedly for secure updates, that opens up another avenue for an attacker to get into your device.

Speaker 1 (16:14):
So that's probably why they have to recall them. Either they have to go to the patient or to the hospital, or you can send the device back. But I'm imagining, if you've got a pacemaker, when they recall it, you can't just go, hey, all right, like Tony Stark, I'm going to take my heart out and just put it in a box and send it to you.

(16:34):
That probably isn't going to work, huh?

Speaker 2 (16:38):
Yeah, that's the dilemma, right? We've got pacemakers, neurostimulators, all kinds of implantables that often have wireless functionality, so the doctor can get diagnostic data off of it, or it can be updated. And, yeah, you have to take them out of the patient if there's a problem, or the patient has to accept the risk. I mean, in

(16:58):
the scenario you gave, if I have a defibrillator, I know it's vulnerable to somebody wirelessly connecting to it and shocking me to death. What do I do? Do I get it taken out, or just accept the risk, knowing that it may not happen but it is possible, right?

Speaker 1 (17:12):
How possible? If you were to put a number on it: 10% likely, 80% likely?

Speaker 2 (17:19):
I would say 99% possible. I won't say likely, because most of the attacks on the internet are non-directed attacks. It's cyber criminals just trying to find something vulnerable to install ransomware on, for instance, to make money. A pacemaker would have to be a deliberate, directed attack, because I would have to be in relative proximity to

(17:41):
the person, I would have to have the right tool, I would have to follow them around. So it is totally feasible, but somebody would have to be extremely motivated. And this is why, like, you know, with Dick Cheney, it's like a nation state that would attack him.

Speaker 1 (17:56):
How close to Dick Cheney would they have to be in order to pull off an attack like this? And here's the problem with cybersecurity: whenever you ask these questions, I think that there are hackers listening to the podcast, and I'm very careful to not give away any dangerous information, but I'm going to ask it anyways. How far away does somebody have to be from Dick Cheney in order

(18:17):
to shock Dick Cheney to death with his internal shock machine?

Speaker 2 (18:22):
Yeah, well, that's a good question. I guess it depends on your setup. A lot of these defibrillators and things use Bluetooth Low Energy, so you're supposed to be relatively close, because they don't have a massive battery life. However, I know people have connected to Bluetooth and sniffed Bluetooth signals from up to a mile away using what

(18:44):
they call a BlueSniper rifle, with basically a high-powered directional antenna. So I would think it's not a few feet. I would think you could pull it off from probably a couple hundred feet.

Speaker 1 (18:53):
Wow, a BlueSniper rifle. I want one. That sounds cool.
Okay, talking about AI. You mentioned AI a second ago, and you mentioned a term that I've heard a bunch of times in the past few weeks and, I'll be honest, I have no idea what it is: data poisoning. And I know it has something to do with AI.

(19:13):
What is data poisoning? That sounds bad.

Speaker 2 (19:19):
Yeah. AI is based on data in and data out. So if I poison the data going into the AI, I feed it bogus data to make it produce bogus results coming out of the system. And one of the challenges with AI is training the model, and the less the model is trained, the more susceptible it is to

(19:40):
data poisoning. If I'm training my AI model to detect cancer, and I feed it a million samples of tissue, let's say of breast tissue, that don't have cancer, and I feed it a couple hundred that do have cancer, that model is going to be skewed. So you have to be more careful about how you

(20:05):
train the model, and then the poisoning takes advantage of that model not being trained properly as well. Because I need to train the model to detect bogus information, garbage, coming into it also.
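
The training-set skew Christian describes can be illustrated with a toy model. This is a hypothetical sketch, not any real diagnostic system: the two-dimensional "tissue feature" data, the class labels, and the nearest-centroid classifier are all invented for illustration. It trains one clean model, then retrains after an attacker injects mislabeled "healthy" samples (one simple form of data poisoning) and compares accuracy.

```python
import random

random.seed(0)

def make_data(n, cx, cy, label):
    # n labeled points drawn from a Gaussian blob centered at (cx, cy)
    return [((random.gauss(cx, 1.0), random.gauss(cy, 1.0)), label) for _ in range(n)]

def train(data):
    # Nearest-centroid "model": one mean point per class label
    model = {}
    for label in {l for _, l in data}:
        pts = [xy for xy, l in data if l == label]
        model[label] = (sum(x for x, _ in pts) / len(pts),
                        sum(y for _, y in pts) / len(pts))
    return model

def predict(model, point):
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(model, key=lambda label: dist2(model[label], point))

def accuracy(model, data):
    return sum(predict(model, xy) == label for xy, label in data) / len(data)

train_set = make_data(200, 0.0, 0.0, "healthy") + make_data(200, 4.0, 4.0, "cancer")
test_set = make_data(100, 0.0, 0.0, "healthy") + make_data(100, 4.0, 4.0, "cancer")

clean_acc = accuracy(train(train_set), test_set)

# Poisoning: inject many fake "healthy" samples that actually look like the
# cancer class, dragging the healthy centroid toward it so that real cancer
# cases start getting classified as healthy.
poison = make_data(600, 4.0, 4.0, "healthy")
poisoned_acc = accuracy(train(train_set + poison), test_set)

print(f"clean model accuracy:    {clean_acc:.2f}")   # near-perfect separation
print(f"poisoned model accuracy: {poisoned_acc:.2f}")  # noticeably worse
```

The poisoned model's errors are one-sided, missed cancer cases, which is exactly the failure mode that matters in a diagnostic device.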

Speaker 1 (20:18):
So garbage in, garbage out. 100%. And who's doing this data poisoning? I've heard of state actors. Who else would be doing this, and why would they even want to do this?

Speaker 2 (20:40):
I think the state actors would do it. Then I have an advantage. Just like if I can take out industrial control systems in the United States or cause havoc, then, if I'm an adversary of the United States, I have an advantage. So I think that is a driving factor. The possibility is great on that, but the likelihood is not

(21:02):
too high. Someone's got to have the intent. I think also it just arbitrarily happens, because a lot of people don't understand AI, and they're feeding it the wrong information and it's making the wrong decisions. As an example, a lot of people use ChatGPT. One of the challenges with AI is it will make up the

(21:26):
answer. It will not tell you it doesn't know. So imagine a medical device: you feed it some data, it makes up an answer. So you're sort of poisoning the model, because we haven't trained the model enough to say I don't know, because humans have a hard time saying I don't know.

Speaker 1 (21:41):
Yeah, yeah, I have that problem. It's hard to say I don't know. I noticed that with ChatGPT, and sometimes I have to tell it, I'm like, do you really know? Are you pulling my leg? Be serious with me, be honest. And sometimes it'll be like, yeah, you know, I kind of made that up, and here's the truth. So yeah, if

(22:02):
you're listening to this, you can go right now and check out ChatGPT and try it for yourself. You'll be astonished that, you know, it will make some things up. But now I think it's becoming a little bit more honest. If I ask it something really straight up, like, what did my dad eat for breakfast, it'll say something like, I don't know your dad and what he ate for breakfast, but

(22:25):
the typical person eats blah, blah, blah, and it'll give you some answer, which I guess is better than nothing. But with data poisoning, I feel like it's not a case of something being better than nothing; you really want clean data in there. And now, the intersection of AI and medical devices.

(22:45):
How does that affect your business? How does that affect your clients, your customers? And what are some of the new vulnerabilities or the new risks that AI would pose specifically to medical devices?

Speaker 2 (23:03):
It's affecting quite a few devices. We see what's called AI-enabled software as a medical device, and typically we see a lot of these used for image enhancement. So you can take an MRI or an ultrasound, run it through the AI, and the AI will highlight problems with your vascular system, for

(23:24):
instance, and a doctor will make a diagnosis based on that. So the challenge is, if that model is compromised and a doctor's making a decision based on the results of the AI model, then he's making a misdiagnosis, or could miss something, or just completely make the wrong diagnosis. So it's important these models are trained properly.

(23:48):
And the other thing to consider is, not every country shares their data, their medical data. So if I'm trying to train an AI model to detect vascular disease, as an example, I'm kind of limited in the data I can feed the model. Because I can maybe get it from the United

(24:09):
States, but can I get it from Brazil? Can I get it from Mexico? Can I get it from Europe? Probably not. So now your model is already tainted to a degree. It has a bias, because it doesn't have an accurate sample of the population.

Speaker 1 (24:20):
So let's say there's somebody, a medical device manufacturer in Brazil, right, and they call you up. They say, hey, Christian, we're about to roll out a bunch of medical devices, and, you know, we're concerned about AI-based attacks. What do you do then? Do you fly out to Brazil and take a look at these devices?

(24:42):
Like, where in the supply chain do you fit in?

Speaker 2 (24:48):
We fit in when the medical device manufacturer, like the one in Brazil, is ideally developing the requirements for the product. Unfortunately, most device manufacturers kind of forget about cybersecurity. Somebody going through a checklist says, oh, there are some cybersecurity requirements here, and then they contact us like

(25:08):
two months before they're trying to get it approved by the regulatory authority, and we always find a lot of things wrong, and it delays their submission, frustrates investors, frustrates the innovator, and ends up delaying their time to market and costing them a lot of money. So we have to get involved, because I don't think anyone cares about cybersecurity unless there's a compliance driver,

(25:29):
which is the regulatory authority, either the FDA in the United States or, like, the European Medical Device Regulation in Europe. So we have to get involved before that device is cleared, and the medical device manufacturer is responsible for choosing a vendor like my company to help with that cybersecurity.

Speaker 1 (25:51):
So if Dick Cheney's medical device company had contacted you before... I don't know why we're picking on Dick Cheney. Everybody picks on Dick Cheney, I guess, because it's Dick Cheney. But would it be an accurate representation or characterization to say that if a company like yours, people like you, got in at sort of the ground level and built security

(26:15):
into that device, how much safer would he be, as opposed to, you know, these companies just sort of putting these things out to market?

Speaker 2 (26:33):
I would say upwards of 90% safer. I mean, there's no such thing as perfect security. But, you know, if we were to evaluate it through a penetration test, static application security testing, look at the third-party components that make up the device, then absolutely the device will be much, much more secure, especially if the manufacturer contacted us when they're still

(26:56):
developing the requirements. Because then we can actually include the cybersecurity requirements in the design, versus trying to bolt them on at the end, which is what most people do.

Speaker 1 (27:07):
You also mentioned earlier that the hospitals might be a little bit more strict than the FDA. Are you more strict than the FDA, or are you depending on the FDA to pretty much give you the guidelines that are necessary to protect these devices?

Speaker 2 (27:24):
Yeah, it's interesting. In my first company we did medtech cybersecurity, as I mentioned, and we did a lot of the things that the FDA is just now asking for. Because, from my perspective, if we sign off on one of these devices and somebody hacks into it, my paperwork is tied to that device. It does not look good for my company, especially if we kill

(27:44):
somebody. It won't look good for me either, and I certainly won't feel good about that. So I make sure we are very accurate, thorough, complete and diligent in all of our testing, because it's not just about compliance, it's about the real impact of these devices and the real risk, which is patient safety.

Speaker 1 (28:03):
What do you want your legacy to be?

Speaker 2 (28:03):
As an entrepreneur, I know how challenging it is to bring a product to market or build a company, and one of the challenges is a lot of these innovators don't understand the road in front of them, especially from a cybersecurity perspective.
So part of my legacy is to raise that awareness, so they're

(28:24):
not caught off guard. Because we've had companies come to us two months before submission to the FDA, we found 4,000 vulnerabilities, they have to fix like 2,000 of them, and eight months later they still haven't fixed them. So they're scrambling. And this is a product that took, on average, like three to five years to build, and if it's delayed

(28:45):
eight months, it's going to upset the investors; the innovators that came up with the product are going to be upset. Everyone's going to be frustrated. So, from a legacy perspective, I want to try to solve that awareness problem, so it costs less money to get the product on the market, and they get to market much faster, because there's a lot of outstanding inventions.

(29:06):
That, I think, will really help elevate humanity, if we can get to the market faster and avoid those delays.

Speaker 1 (29:12):
Christian Espinosa, Blue Goat Cyber. Reducing risk, and what else? You are increasing revenue for your clients. You are saving lives. You are raising awareness and, as a byproduct, you'll probably save Dick Cheney from being assassinated with his internal

(29:34):
shock machine, as I am now starting to call it. Is there anything else that you want our listeners to know? And if people want to find you, how can they find you?

Speaker 2 (29:48):
I think one of the things that's important to understand, because we have a lot of what I call traditional cybersecurity people come to me looking for jobs, and to me, traditional cybersecurity is way less risky. Typically we're concerned about information disclosure, or someone stealing your credit card information or your health records, which is very different than medtech cybersecurity,

(30:08):
because we're talking about somebody potentially killing somebody, or maiming them, or injuring them. So I mean, that's one thing I like to leave people with, just that the industry is very different. And then people can find my company at bluegoatcyber.com, or LinkedIn, YouTube. We have our own podcast, we have webinars monthly, and we're doing everything we can do to become the market leader this

(30:30):
year.

Speaker 1 (30:30):
That's the goal. What's the name of your podcast?

Speaker 2 (30:34):
The MedDevice Cyber Podcast.

Speaker 1 (30:36):
Thanks for stopping by, Christian. Always good to see you, and I'll see you on LinkedIn. I see your stuff all the time. I read your posts, I comment and like whenever I can, and I'll check out your podcast. Thanks again for coming on Cybernomics.

Speaker 2 (30:51):
Yeah, thank you for having me on, Josh. I appreciate it.

Speaker 1 (30:54):
All right.