Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 2 (00:02):
Sure. I'm Dave Saunders. I'm a former chief technology officer of Galen Robotics, and now I'm working with another surgical robotics company that's in stealth mode. I've got about 30 years' worth of internet technology experience.
(00:23):
That began from the very early days, before the internet was public. I developed some early internet protocols that unfortunately are no longer in use anymore (maybe it's fortunate), and I also worked on high-density access concentration in the
(00:43):
days when dial-up was sexy and cool. I was also a research manager at Lucent Bell Labs, and my group invented the first commercial Wi-Fi hotspot, which was the little spaceship-looking Apple AirPort, if you remember that thing.
(01:06):
We actually built that in my group at Lucent Bell Labs. So, you know, I've been around different hardware and software technologies for a long time, and I certainly worked with a lot of the early explorers into internet security issues.
Speaker 1 (01:27):
Amazing.
Speaker 2 (01:31):
And so, you know, it's certainly been part of my DNA for a really long time. Most recently, for about the past 10 years, I've been working on commercializing some different surgical robotic technologies.
I took one through a successful FDA clearance about two years
(01:52):
ago, and for the past 10 years I've been working in a close relationship with Johns Hopkins University on advancing surgical robotics, taking it into the next generation of computer vision and AI, utilizing those
(02:14):
things. And of course, all of that is affected by cybersecurity, which is a good thing. It's not an afterthought; it's something that you really do think about as an architecture from the get-go.
Speaker 1 (02:29):
Amazing. And from my understanding, Galen developed the concept of digital surgery as a service. Can you explain how compliance considerations went into developing a surgical robot?
Speaker 2 (02:45):
Yeah, you know, surgical robots are interesting because, of course, they appear to come in many flavors. But really, in essence, you've got a master system component, and then it's driving the controls for actuators and sensors and all of
(03:05):
that to give you whatever that robotic function is.
That creates a number of potential security issues, not the least of which is just your basic man-in-the-middle
(03:27):
risk.
The master of your surgical robot is always disconnected by millimeters or feet or miles from the slave components that it's intended to be operating and responding to. So I think that's probably one of your first and foremost issues: just making sure that the data that you send
(03:53):
and receive between those components is authentic and is not subject to any sort of injection risks.
And I think that's kind of your first and biggest issue, because obviously that's going to be the first thing that is
(04:16):
potentially going to put a patient at risk.
HIPAA compliance and patient privacy are also very important, but when you're dealing with a surgical robot, my first concern is to make sure that the patient is never put at physical risk of harm through the use of the system, and that
(04:38):
you know, everything is a risk-based model, and that risk, I believe, is far more important than HIPAA compliance. Those are important too, but my patient isn't going to literally live and die based on HIPAA compliance, right? So I think that's where your first model comes from.
(05:04):
Then your next areas are: how plugged into the rest of your hospital administration system are you? Are you plugged into Epic? Are you actually pulling a patient record, or are you aware
(05:30):
of who the patient is and associating your logs just for maintenance or follow-up reasons? Do we want to have any sort of association between the operation of the surgical robot and a patient record?
And if the answer there is yes, then now we start to have some HIPAA compliance issues, where I want to anonymize my data
(05:53):
and I want to make sure that any individual record is not going to expose patient privacy issues, and those are a challenge as well.
A lot of surgical robots today (and there are about 170 of them on the market) are not actually reading patient data directly,
(06:18):
and so they tend to be islands. So that kind of patient privacy is not necessarily an issue today, but look forward in terms of where the technology is going.
When I start to do things like, for example, head and neck
(06:39):
surgery, there is technology on deck for future applications where I might be taking the CT scan in, building a 3D model, and then having the robot actually look through the surgical microscope and try to see what the surgeon sees.
(07:02):
Now I have direct access to patient data, right? So I want to make sure that I'm not creating risky issues in terms of attaching patient records to navigation data. You know, what's the serial number of
(07:25):
their hearing implant? Now I do have the potential of really breaching a lot of HIPAA compliance issues and things like that in terms of patient privacy.
So, I would say, patient privacy issues are
(07:45):
probably not as much of a risk with most of the surgical robots on the market today. But certainly, as we try to bring in next-generation features involving computer vision
(08:09):
and machine learning, those issues do become more of an issue. Just for example, probably the hottest topic when it comes to surgical robotics is something called a digital twin, where I want to be able to take in patient model data, which is going to be associated with their records in Epic in some way, and I'm going to associate that with real-time telemetry data, which could be the visualization
(08:32):
system from my endoscope cameras, or navigation data that is automatically calibrated and registered to patient CT models.
So, where this is going in the future: yeah, we've got a really, really tight coupling between those patient records and the operation of the robot. So those things do become more and more of an issue over time,
(08:55):
which just means that the security teams are going to have even more of a mess on their hands. I mean, I know security teams are already overburdened just allowing hospitals to operate on the internet in general. But I think some of those issues are certainly going to come to the forefront over time.
(09:17):
Because, going back to what I was saying originally, you know, patient safety. When I've got a robot that's effectively armed with a sharp, pointy device, a scalpel, a cauterization tool, that thing could do a substantial amount
(09:39):
of harm to the patient in the event of data being manipulated: man-in-the-middle, telemetry, all of those sorts of things. So those become more and more of a risk the more I tie in features that become patient-specific and we start to allow the surgical robot to begin to have more autonomy.
(10:04):
I mean, today it's really just the tip of the iceberg. You look at your average pedicle screwdriver robot: it's not really doing that much in terms of autonomy. It's positioning a cannulated
(10:31):
instrument adapter, and the surgeon is still, you know, twisting the handle on the drill. They're still driving the screw, right? And so there still is a substantial amount of surgeon autonomy driving even the most automated of surgical robots out there. But it's not going to be that way forever, and it's not even going to be that way for the next five to ten years.
(10:53):
Speaker 1 (11:00):
What type of data, like EMR, EHR, or PHR data, do these robots collect, or what do they require to perform the surgeries or functions they're designed to do?
Speaker 2 (11:09):
Today it's very limited. So if we take the obvious 800-pound gorilla in the room, a big da Vinci-style system: this is a tele-operated robot. It's got, you know, four or five laparoscopic rods inserted into the abdomen of the patient, and there is a surgeon sitting
(11:31):
at a console, and they're remote-controlling those rods. The amount of patient data that's required for that procedure, in terms of what the robot needs to know, is almost nothing today, and that's because the robot is being driven by the surgeon.
(11:52):
It's basically just a remote. Keep in mind that the da Vinci was invented based on DARPA research that was originally designed to remotely manipulate radioactive materials and allow the operator to be safely away from the radioactive material, right?
(12:16):
So that's the concept there. It's not that the da Vinci is doing anything autonomously; it's simply extending the hands of the surgeon into the patient's abdomen, giving them excellent vision and really incredible dexterity. But it's soft-tissue surgery, so there's really very little that it needs to know today about the
(12:39):
patient.
Now, one example of a publicly demonstrated piece of technology that is not commercially available in any way is augmented reality for the visualization system of a da Vinci or a da Vinci-type robot, where I can get false colors to
(13:04):
show me that I'm now touching the actual patient organ, and if I push on it, I can actually get a color map showing me how hard I'm pushing on it. Incredible, incredible feedback. Now, does that require patient data? No, it doesn't. It's all done in real time, and so it's all very detached from
(13:32):
the patient record information.
Now, can that information be uploaded into the patient record for use? Absolutely. You know, I just recently had a colonoscopy, right? And in my EHR I've got a bunch of snapshots, great for eight-by-twelves, of my insides. And so
(13:55):
that's now part of my record. Well, at some point it makes complete sense that you would be doing the same thing with a surgical robot, where I'm going to be taking snapshots: this is the position of your pedicle screw; this is the position that we decided. This is where digital twins become very helpful.
(14:17):
This is the planned position of your knee implant, and this is the actual position based on bone data, based on the fact that the registered camera and navigation system can actually see your bone. It can see where the implant went, and we are only 1% off of
(14:39):
the intended original plan.
We're definitely moving into a situation where we will actually be placing that kind of data into the EHR, and it makes complete sense to do so. But in terms of the raw operation of the robot, it's
(15:00):
pretty limited in terms of how much patient record information the robot needs to operate.
Speaker 1 (15:07):
Right, yeah. So you would think that attackers, hackers, would have some moral standing not to attack a hospital or an emergency care facility, but unfortunately that happened not that long ago, and it happens pretty regularly. A hospital in Boston
(15:30):
Yeah, horrible, a horrible DDoS. Yep. Unfortunately, my dad was on a ventilator when the CrowdStrike event happened, so I'm curious to see how you guys are distancing yourselves from, you know, attacks that could hit hospitals or
(15:52):
institutions. You know, how do you assure that your device remains functional, right?
Speaker 2 (15:58):
Very, very important. So, first and foremost, as of right now, if I asked a hospital for a live internet connection for a surgical robot that I'm installing, during the procedure, I'm going to get a hell no from the hospital. There's no chance.
(16:18):
Now, I still need that for occasional updates. I want to update my SaaS data for maintenance prediction and things like that. So there is that issue of vulnerability that's not intraoperative, right? So that will be a risk in
(16:40):
the future. But today, yeah, I'm going to be connecting my system to the internet to do SaaS uploads and things like that. So that's when I'm most vulnerable right now.
So, like I was saying earlier, my first concern, and this is
(17:01):
where my paranoia comes in, just because of the hackers I've known, the black hats versus the white hats: man-in-the-middle is, as far as I'm concerned, the thing that actually keeps me up at night. And so, when I'm making that connection, I want to make sure that I'm not getting any patch data to the
(17:22):
operating system that did not come authentically from my company. And so I have a series of layered MD5 fingerprints and some other techniques to validate each of the subsystems in my control mechanism.
(17:42):
So our software is kind of bifurcated between two different computers that talk to each other inside of the system, and then there's another controller as well. So there are a few different computing systems within our robot alone that actually talk to each other, and so each of
(18:04):
them is authenticating its own software to make sure that, if there is a patch, it has to be heavily authenticated, and any signs that I don't have an authentic load automatically
(18:25):
shut me down.
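The fingerprint-validation idea described here can be sketched in a few lines. This is a hypothetical illustration, not Galen's actual mechanism, and since MD5 is considered cryptographically broken, the sketch uses a keyed SHA-256 HMAC instead; the key name and payload are made up:

```python
import hashlib
import hmac

# Hypothetical vendor key, purely for illustration. A real deployment would
# verify a digital signature against a vendor public key, not a shared secret.
VENDOR_KEY = b"example-vendor-key"

def signed_fingerprint(data: bytes) -> str:
    # A keyed SHA-256 digest: an attacker who swaps the patch bytes cannot
    # also forge the expected fingerprint without the key.
    return hmac.new(VENDOR_KEY, data, hashlib.sha256).hexdigest()

def verify_patch(patch: bytes, expected: str) -> bool:
    # Constant-time comparison avoids leaking digest prefixes via timing.
    return hmac.compare_digest(signed_fingerprint(patch), expected)

patch = b"controller firmware v2.1 payload"
manifest_digest = signed_fingerprint(patch)   # produced by the vendor build

assert verify_patch(patch, manifest_digest)            # authentic load
assert not verify_patch(b"tampered", manifest_digest)  # refuse and shut down
```

In the layered scheme he describes, each subsystem would run a check like this against its own software load and shut down on any mismatch.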
Now, that being said, I do want my robot used in research settings. So there are ways that the robot can be put into a not-for-human-use mode. You know, the screen is plastered with "I'm not operating
(18:45):
on a person," that sort of thing. And so students do have a documented ability to take over the robot controls, but that is system software that's being run on a research platform, and they don't have the ability to override this, to actually patch
(19:07):
the software on the robot itself.
So there are mechanisms that allow you to completely hijack the system. But if you do that, if, as you're saying, there was a bad-actor scenario, that override would be detected by, kind of, the display
(19:28):
computer, and it would immediately go into a you're-not-operating mode. There are all kinds of, you know, klaxons and red overlays that would indicate that we're in research mode only.
And if that ever happened in the OR, the surgeons are trained that, yeah, okay, there's a problem, and I would not use the system to operate if I'm
(19:52):
not getting a "we're in operating mode." So those elements are pretty well protected and preserved.
Now, all of that data, so even my SaaS data, my operating logs and things like that, those are going up encrypted; they're
(20:13):
going up signed.
So any data that's being moved back and forth between the system and my SaaS servers is also authenticated, deanonymized... anonymized, anonymized.
Speaker 1 (20:29):
I guess you don't deanonymize it.
Speaker 2 (20:31):
Right, it's anonymized, you know, and signed. So: encrypted, signed, anonymized. All of that goes through a default process when it's being uploaded anywhere outside of the hospital.
(20:52):
So if I'm taking any of that data outside of the hospital, I can't have any patient record information, ever. I don't think any of the agreements that I've done with hospitals would allow me, even for research purposes, to have the name of the patient, and I have no need for it anyway. So all of that is stripped if it goes up into the SaaS server,
(21:12):
and that is explained in detail in the operating agreement that I have with the hospital.
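The strip-then-sign upload path he describes might look roughly like this sketch. The field names and signing key are hypothetical, the transport-encryption layer (e.g. TLS) is omitted, and a real system would use managed keys and vetted crypto libraries:

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"example-device-key"  # hypothetical; real systems use managed keys
PHI_FIELDS = {"patient_name", "mrn", "date_of_birth"}  # hypothetical field names

def anonymize(record: dict) -> dict:
    # Strip direct identifiers entirely; keep only operational telemetry.
    return {k: v for k, v in record.items() if k not in PHI_FIELDS}

def sign(payload: dict) -> dict:
    # Canonicalize, then attach a keyed digest so the server can authenticate
    # that the upload really came from this device, unmodified.
    body = json.dumps(payload, sort_keys=True).encode()
    return {"body": payload,
            "signature": hmac.new(SIGNING_KEY, body, hashlib.sha256).hexdigest()}

log = {"patient_name": "Jane Doe", "mrn": "12345",
       "joint_torques": [0.12, 0.08], "duration_s": 5400}
upload = sign(anonymize(log))   # transport encryption would wrap this
assert "patient_name" not in upload["body"]
```

The point of the sketch is the ordering: identifiers are removed before anything is signed or leaves the hospital, so the SaaS side never sees them.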
Speaker 1 (21:22):
I also don't want to
get too much of your secret
sauce, but as AI is evolving,how do you see the future of
surgical robotics advancing?
Speaker 2 (21:34):
So yeah, there's a couple of different areas there. One: I think it gives me some really interesting opportunities for cybersecurity protection, right. One of those is just watching for ticks and blips in latency. You know, man-in-the-middle; I'm just going to keep saying
(21:55):
that. That's where my paranoia lies.
If I'm able to actually watch what my back-and-forth ping times are, I can see where my blips are, and so those are digital triage opportunities for me from a cybersecurity standpoint. Just being able to monitor the data flow and say: does this
(22:16):
look normal? Does this look like just a general back-and-forth buzz? I mean, my updates are, like, every five milliseconds and things like that.
And if there is a hiccup, you know, I should be able to have an AI system that's in tune enough with the other subsystems that it could say: oh, that was just a hiccup on the disk drive, or
(22:38):
whatever, right? And so the AI should be able to account for most hiccups in latency and be able to associate them with some system operation that actually is normal and okay. And so that means that now I only have maybe two or three anomalies, hopefully, that I want to go and look at, and if they actually are false positives, then I'm going to take those out.
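The latency-triage idea, surfacing only the blips that the recent baseline can't explain, reduces to simple anomaly detection. A toy sketch with a rolling mean/standard-deviation threshold; the window size, threshold, and sample trace are all illustrative:

```python
from statistics import mean, stdev

def latency_anomalies(samples_ms, window=20, threshold=4.0):
    """Flag round-trip samples that deviate sharply from the recent baseline.

    Most blips fall within the baseline noise and are ignored; only large
    deviations are surfaced for a human (or downstream system) to inspect.
    """
    flagged = []
    for i in range(window, len(samples_ms)):
        baseline = samples_ms[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(samples_ms[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Steady ~5 ms control-loop traffic with one suspicious 40 ms spike at index 25.
trace = [5.0, 5.1, 4.9, 5.0, 5.2] * 5 + [40.0] + [5.0] * 5
assert latency_anomalies(trace) == [25]
```

A real system would then try to correlate each flagged index with known subsystem activity (a disk flush, a scheduled task) before escalating it as a potential man-in-the-middle indicator.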
Next level up, from a clinical standpoint: this is, of course, where everybody cares about cybersecurity, but nobody's going to buy my product unless it has clinical relevance, right?
(23:20):
So the AI that is most interesting to the customers is where we're going to start doing things like augmented reality.
We're going to use things like AI to do patient registration, and what I mean by that is visualization systems where I'm
(23:41):
going to look at any patient imaging data that I might have in the EHR, and I'm going to use that to authenticate and make sure that I'm operating on the right patient. With bones, that's obviously a lot easier. So, for example, I'm about to open up a patient behind the ear for a mastoidectomy, a pretty significant drilling process.
(24:06):
If I've got a CT scan of the patient's skull, then with today's technology (you couldn't do this 10 or 15 years ago), just based on what I can see through the surgical microscope, which is a stereo microscope, I should be able to very quickly find anatomical landmarks and
(24:29):
match up what I can now see of the actual, real-life patient and match that up to the CT scan that's in the EHR.
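Landmark matching of this kind is classically a rigid least-squares registration (a Procrustes/Kabsch-style fit). A deliberately simplified 2D sketch, using complex numbers for points and synthetic landmarks standing in for the CT model and the microscope view; real pipelines work in 3D with far more robust matching:

```python
import cmath

def rigid_register_2d(src, dst):
    """Least-squares rotation+translation aligning two 2D landmark sets.

    Points are complex numbers; a rigid transform is z -> r*z + t with |r| = 1.
    """
    n = len(src)
    src_c = sum(src) / n
    dst_c = sum(dst) / n
    # The optimal rotation is the phase of the cross-correlation of the
    # centered point sets.
    corr = sum((d - dst_c) * (s - src_c).conjugate() for s, d in zip(src, dst))
    r = corr / abs(corr)
    t = dst_c - r * src_c
    return r, t

# Synthetic CT landmarks and the same landmarks as seen intraoperatively,
# rotated 30 degrees and shifted.
ct = [0 + 0j, 10 + 0j, 0 + 10j, 7 + 3j]
true_r = cmath.exp(1j * cmath.pi / 6)
seen = [true_r * z + (5 - 2j) for z in ct]

r, t = rigid_register_2d(ct, seen)
residual = max(abs(r * z + t - w) for z, w in zip(ct, seen))
assert residual < 1e-9
```

The safety check he goes on to describe falls out naturally: a large residual after the best-fit registration is exactly the "this doesn't match the CT, is this the right patient or the right side?" alarm.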
That's great, because not only is that potentially valuable for future navigation and augmented reality applications, but if I
(24:49):
can't get a match, I'm going to scream bloody murder to the surgeon and say: there's something wrong here. Are we operating on the wrong side of the head? Is this the correct patient?
So I think that gives us a really interesting opportunity to
(25:12):
really make sure that we don't have those one-in-a-thousand, or one-in-ten-thousand, or one-in-a-hundred-thousand events. That's a lotto number that nobody wants to win, right? So it really doesn't even matter what the error rate is for those surgical errors; nobody wants to be the one.
And if I can do such a simple thing as have a machine learning
(25:35):
algorithm just take any patient data that I have and correlate it to what is actually happening in real time and say: wait, wait, wait. That's not an appendix. Yeah.
Speaker 1 (25:48):
Something obvious.
Speaker 2 (25:50):
Are we in the right part of the body? Are we doing the right surgery? Oh, by the way, I have a copy of the patient's surgical order. You're about to remove the prostate of a patient that was in here for a colostomy, right?
The potential for machine learning to actually
(26:11):
double-check the procedure and just go: hey, I know what your camera's pointing at. Why are you about to sever this person's prostate when we're supposed to be working on their colon to remove diverticulitis, or something like that, right? That's a really interesting opportunity, and that, obviously, is going to
(26:33):
require me to either have real-time access to the EHR, or I'm going to want to be able to download it temporarily before I start the procedure.
So there are some really cool opportunities for just ensuring patient safety in the future when it comes to AI applications and surgical robotics. I think that's pretty cool all by itself.
(26:57):
There are also opportunities for advanced medical procedures, and the one that I've been studying the most recently is middle ear surgery, but these challenges apply all over the body. So when I'm going to do a middle ear procedure, I'm going
(27:19):
to drill about an inch and a half diagonally through the hardest bone in the human body, which is your skull, right behind the ear. This is the mastoid.
And not only is it 60 to 90 minutes of drilling with an 80,000 RPM Dremel tool. It's gruesome.
(27:40):
The punch line to this part of the procedure is that embedded in the bone are two facial nerves, and you cannot see them until you've exposed them. And if you nick one, your patient... they're not going to die, but they will have an irreparable
(28:01):
injury. Either they will lose control of one side of their face, or they will lose pretty much all sense of taste and smell for salty and savory items. So: no more chocolate, red wine, or meat.
(28:28):
And that sucks, right? And so, honestly, even though I'm holding that drill in a white-knuckle death grip, looking for weird little, not-so-standardized landmarks that will tell me that I'm really close, and I think this facial nerve is here,
(28:52):
it turns out that by taking the full head CT scan and giving it to a neural net,
the neural net is able to segment all of the anatomical structures in the middle ear, all of the bones, and it's even
(29:14):
able to pick out the little ghost trails of the two facial nerves that are embedded in the mastoid.
So what can I do now? Well, if I have that 80,000 RPM Dremel tool being held in the hand of a surgical robot, I open the patient's skin up behind
(29:34):
the ear, and I've now got enough bone exposed that I should be able to map a 3D conversion of the CT scan to the actual patient bone.
Boom: I now have effective navigation. Now I register the tip of the drill, which doesn't require old
(29:56):
school touching a bunch of fiducials; I can see the tip of the drill with a 4K surgical microscope. So with OpenCV and a neural net, I can now register the tip of that drill.
From this point, all I have to do is watch the kinematics of the robot relative to the patient bone.
(30:17):
So now I know where those embedded nerves are, even though the surgeon can't see them.
So this is super cool.
So now, even though the surgeon's hand may still be literally on the drill, the robot is holding the drill at the same time.
(30:39):
Now I allow the surgeon to do the drilling. They can go as fast or as slow as they want. But because I know where those facial nerves are, I can create virtual keep-out zones, virtual barriers, and I can now protect the patient's nerves through, you know, haptics. I can give you a little shudder on
(31:00):
the drill if you move too close to where I know that nerve happens to be, or I can just stop you entirely if you're about to hit the bone that would have exposed it. And so now I can very safely guide the surgeon to just do their drilling but completely keep them away from those
(31:22):
critical anatomical structures.
Super, super cool.
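At its core, a virtual keep-out zone like the one described is a distance check between the tracked drill tip and protected geometry, with graded responses (haptic warning, then hard stop). A toy sketch; the nerve points and millimeter thresholds here are invented for illustration, not clinical values:

```python
import math

# Hypothetical nerve path, approximated as a few 3D sample points (mm).
NERVE_POINTS = [(10.0, 5.0, 2.0), (11.0, 6.0, 3.5), (12.0, 7.5, 5.0)]
WARN_MM, STOP_MM = 3.0, 1.0  # illustrative thresholds only

def distance_to_nerve(tip):
    # Nearest distance from the tracked drill tip to the protected structure.
    return min(math.dist(tip, p) for p in NERVE_POINTS)

def guard(tip):
    """Return the control action for the current drill-tip position."""
    d = distance_to_nerve(tip)
    if d <= STOP_MM:
        return "STOP"      # hard barrier: freeze the drill
    if d <= WARN_MM:
        return "HAPTIC"    # shudder the handpiece as a warning
    return "OK"

assert guard((20.0, 20.0, 20.0)) == "OK"
assert guard((10.0, 5.0, 4.5)) == "HAPTIC"   # 2.5 mm from the nearest point
assert guard((10.5, 5.0, 2.0)) == "STOP"     # 0.5 mm away
```

A real implementation would run a check like this at the control-loop rate against the CT-derived nerve segmentation, transformed by the registration above, rather than against a handful of points.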
And that has the potential of increasing the speed of those surgeries. In fact, in research we've taken 60 to 90 minutes of drill time down to about 15 to 20 minutes, so we're potentially cutting an entire hour off of OR time.
(31:44):
The hospital might as well throw you a parade for that much savings, right? And they would still save money. And so that's a big deal right there.
(32:05):
But that also now means something for patients that might have had some bone abnormalities, or the nerves in a weird place, where I'm just having a hell of a time doing that drill-through. There are a lot of patients that are turned away from these kinds of so-called routine surgeries simply because the approach, or some abnormality of the patients themselves, just makes them a bad candidate for the surgery.
(32:34):
And because we're now also creating these guardrails for procedures that are ergonomically exhausting, a huge cognitive load. I mean, imagine doing, you know, this kind of no-mulligan drill-through. Not all surgeons can do that.
(32:54):
Not all surgeons are qualified to do neurotology. The burden is just significant.
And yet I will go out on a limb and say every single attending surgeon out there in the field of otolaryngology has at least been trained; they know functionally what to do
(33:21):
to do a middle ear surgery. Just, not all of them have those, you know, super crazy steady hands that the rock-star surgeons have, and those sorts of things.
The robot has the potential of moving those people over to the other side of the bell curve, allowing them to utilize all of that training and know-how and intelligence they have, but now giving them the magic hands through virtual barriers and things like that. And so that means more surgeons
(33:45):
are able to do these procedures.
We've already seen this just because of the mechanical advantages of operating a da Vinci or a similar tele-operated robot. We're already seeing those advantages right there: literally an order, if not two orders, of magnitude more abdominal surgeons are able to do
(34:06):
minimally invasive prostatectomies and minimally invasive hysterectomies because of the da Vinci. That's incredible. I mean, we're talking two-decimal-point shifts in terms of the number of surgeons that are able to provide that standard of care to
(34:28):
patients.
That's huge.
By providing computer vision, by providing HUD guidance through the visualization systems, through the technologies that they're already familiar with, where, basically, the AI is your
(34:51):
co-pilot, right? Maybe co-pilot is a bad word, just because Microsoft has kind of soured that for a lot of people, but still, that's what it is, right? You've got the AI sitting over your shoulder. The surgeon's still the one that goes to jail if something goes really, really wrong. The surgeon's still the one that gets sued.
(35:11):
So let them be in control, but use AI to give them better tools, or make the tools that they already use better, and I think that is the true advantage of AI today in terms of clinical applications.
But on the cybersecurity front, it's no different.
(35:32):
All of your IT people at the hospitals? They're overburdened; they've got way, way too much to do. If you can do digital triage on your security profiles and have them burp up, you know, "here's an abnormal log, tell me if this is a false positive or not,"
(35:56):
you're taking away the mundane crap that nobody ever wanted to do in the first place. That's what computers are good at; let them do that sort of thing.
And we see a parallel of that in radiology. In fact, a very close friend of mine recently found a tumor in
(36:27):
her chest. Initially, a human radiologist had said: I don't see anything wrong, but we just acquired an AI system that couples with our mammogram scanner. Would you mind if we ran it through that?
They don't have it set up as a standard of care yet. And they ran it through the AI, and it came back and said: look at this shadow here; this is something.
(36:49):
And so then they went back to her with a more refined instrument, did a close-up sonogram, and sure enough, they found the thing that the AI had found. So at some point this needs to become standard of care.
When you're standing at a mammogram machine, the very first thing it should do is run the scan through a simple triage sorter.
(37:14):
You know: 99% guaranteed you're totally fine; put on your shirt and go home. Or: I've got, like, 95% confidence; we'll put it on the stack for the radiologist to double-check, and here are a couple of little things that I think I found. Or, on the other side of the bell curve: do not let this person go
(37:37):
home.
And if you can have that done in milliseconds, or even seconds, integer seconds, you're providing a better standard of care for the patients, and you're relieving the burden on the human beings who right now are stuck with it. Okay, at least the scans are in the EHR system, so they're not
(37:58):
literal stacks of radiology scans, but radiologists still have just so much to go through, and most of it is fine. They're routine mammograms; there's no risk there.
Let AI sort those things out.
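The "sorting hat" he describes is, at minimum, a confidence-threshold router. A sketch with made-up thresholds and labels, purely to show the prioritization logic (not clinical guidance; the human review he insists on stays in every branch):

```python
def triage(confidence_normal: float) -> str:
    """Route a scan into a work pile from a model's 'normal' confidence.

    Thresholds are illustrative only. Note that no branch is 'no review':
    the sorter prioritizes the radiologist's queue, it doesn't replace them.
    """
    if confidence_normal >= 0.99:
        return "routine: radiologist sign-off queue"
    if confidence_normal >= 0.95:
        return "flagged: radiologist double-check with AI notes"
    return "urgent: immediate radiologist review"

assert triage(0.999) == "routine: radiologist sign-off queue"
assert triage(0.96) == "flagged: radiologist double-check with AI notes"
assert triage(0.40) == "urgent: immediate radiologist review"
```

The same shape applies to the cybersecurity case he draws the parallel with: route most logs to a low-priority pile, surface the genuine anomalies first.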
I still feel that a human being, at least for the foreseeable
(38:19):
future, should scan through those things manually as well, double-check them, and sign them off. But let's let the sorting hat do its job, put those things in the right piles, and prioritize where the work should be. And those are the kinds of things that I think you can do from a cybersecurity
(38:41):
standpoint, and the kinds of things that you can do from a clinical standpoint, that are going to give you the most bang for your buck and give you a real return of value for the hospitals, for the patients, for the payers, for the doctors. I mean, all of the stakeholders benefit when you apply AI in those areas, in that kind of paradigm, and that's how I see it.
Speaker 1 (39:01):
Amazing.
I know we're about 40 minutes in. I don't want to clip you at your knees, and I want to be respectful of your time. I just want to make sure you don't have a hard stop or something. (I'm good right now.) Okay, I guess I've got one more question. As a film guy, you know films like I, Robot, Terminator,
(39:23):
things like that. You think of Skynet, right? You have a lot of people that are less trusting of technology. Have you encountered resistance using the da Vinci? How do you communicate, how do you ease patients and let them know that this technology is safe, and how do you build
(39:46):
rapport with patients before going into surgery?
Speaker 2 (39:50):
Today it's actually not a problem. In fact, I have spoken to hospital CFOs who are buying surgical robots because they are losing business without them. It is the weirdest thing I've ever seen.
(40:11):
I mean, imagine going to a hospital, being told that you need surgery, and saying: well, what brand of scalpel do you use? And if you don't get the answer you like, you go: well, I'm going to go to the hospital down the street, because they use XYZ brand of scalpel. Right? That's effectively what is driving a lot of surgical robot utilization today.
(40:33):
Now, I still believe that there's extreme value there. I mean, I know some incredible surgeons who don't pick and choose when to use the da Vinci and when not to. They use it as their standard of care, and they are just magicians with the da Vinci. It's incredible to watch them do their work.
(40:53):
And so I believe in the more training and the more experience you have with surgical robotics. Let me give you a quick parallel example. If you ever have LASIK: LASIK is relatively new technology. It is a surgical robot. It is tracking your eye, and it's
(41:15):
shooting a little laser to giveyou scars.
Now, what existed before LASIK was RK, radial keratotomy, where a surgeon had a hand-drawn map of your eye and a little trajectory plan of where they were going to make incisions on your eye to do exactly what LASIK does.
Right, the scarring caused by those incisions, or by the laser
(41:37):
burns, reshapes your eye, and that's what causes your vision to change with LASIK. Well, that used to be done with a diamond-tipped scalpel held in the hand of an ophthalmological surgeon.
So let me give you a hypothetical question. Let's say you need to have your vision improved. Are you going to go with LASIK, which is a surgical robot, or
(41:59):
are you going to say, no, no, no, I need old school, give me a surgeon with a scalpel in their hand and let them poke me in the eye with that thing? I'll tell you, I know exactly what my decision is going to be.
Yeah, totally. Same, same.
I can't imagine in what parallel universe I would go, I don't know, I need to go old school, give me that diamond-tipped scalpel. And that is becoming
(42:24):
the market demand for things.
Now let me give you another example. I hope that you are never unfortunate enough to be told by a doctor, hey, I'm sorry, but we've got to take your prostate out. If that was ever to happen, you'd have three options. The first is old school open surgery. They open you up like a frog in biology class, man.
(42:47):
It is gruesome. You lose eight units of blood, and there is no open procedure for a prostatectomy that will protect the nerve that runs up through the middle of the prostate, which means that when you're done with that prostatectomy, a
(43:09):
certain function that you would probably like to have preserved will never happen again.
Okay, option number two is manual laparoscopic surgery. The manual laparoscopic prostatectomy was invented at Johns Hopkins University School of Medicine, and there are
(43:32):
certainly surgeons that can do that as a manual laparoscopic procedure, but the number of surgeons is in the dozens. We're on a planet with 8 billion people, and there are probably dozens of surgeons that will do a manual nerve-sparing laparoscopic prostatectomy. In the state of California alone, the last time I looked into this, there were probably 3,000 surgeons that could do the
(43:58):
laparoscopic, nerve-sparing prostatectomy with the da Vinci or that kind of teleoperative robot. So that means you've got two orders of magnitude more surgeons that can do the nerve-sparing technique, as long as they're doing it with the robot, compared to manual laparoscopic surgery.
(44:19):
So here's the rub. Sometimes you see these newspaper articles, and they say, oh, there was this longitudinal study, and it says that using the surgical robot for this procedure is only as good as a laparoscopic surgeon doing it manually. Okay, well, that's an interesting data point. But if there are two orders of magnitude more surgeons that can
(44:44):
do that procedure, well then that means I'm not waiting nine months to have mine. In fact, I had a friend pass away on the waiting list. This was over a decade ago. He was terrified of having the da Vinci used for his prostatectomy, and so he waited on a waiting list for a manual
(45:07):
prostatectomy and died on the waiting list.
Speaker 1 (45:09):
Wow, that's terrible.
Speaker 2 (45:10):
So that's how I see
the opportunity for surgical robotics: even if it is only as good as the best practitioners in the field that are out there today, if I can take surgeons that are fully trained, that know exactly what they're supposed to do, and they simply operate a robot as the extension of their hands and get consistent results day
(45:34):
in, day out, every single procedure of the day, what's my downside? And that's, I think, to me that is really important.
And what I think is really gratifying is that, from the discussions I've had with patients, with surgeons, with hospital administrators, with the payors, they seem to agree
(45:56):
that right now that's a very significant market driver. But it's also a huge benefit: if I know that at my hospital, simply by buying a two-million-dollar da Vinci, I can now take my entire urology department and give them all superpowers, that's a huge, huge win. Also, from a marketing standpoint, that means that I'm
(46:19):
not losing patients from Good Sam in San Jose to, you know, Stanford up the road. That's a valuable purchase; I can justify the purchase just by saying I'm not losing patients. Because at the end of the day, I mean, you know this, we have to have revenue to keep the hospital
(46:43):
open. And so losing patients just because I don't have the latest tool, that's a significant burden as well. That's a financial issue.
So, you know, right now, with the current state of the surgical robotic market, the acceptance of surgical robotics is pretty good. Where that goes if
(47:07):
we start allowing AI to perform any kind of autonomous function with a surgical robot, that perception might change. Right now, to my knowledge, across the 170 robots on the market, the FDA has not approved a single AI application that is
(47:30):
directly affecting the surgery. There are lots of opportunities. In fact, there are over 800 AI applications that have been approved by the FDA. Zero of them utilize LLMs of any kind, and of the 800, probably 80 to 90% are diagnostic, right, and so that's
(47:57):
certainly where the sweet spot is in terms of getting approval and applying those today. But there's going to come a point where, yeah, I'm going to be able to demonstrate clinical benefit, and I already have, I've described some of those to you already, where, by bringing in AI, I'm going to be able to demonstrate real clinical benefits working with a surgeon.
(48:25):
But yes, when I lift up the hood, when the veil is pulled back, the AI is, in fact, taking away a little bit of the thinking process. The AI is definitely relieving the surgeon from some decisions that they would otherwise have to make on their own, and you could argue that that is taking away surgeon autonomy. That's when the FDA is going to really start to
(48:47):
scrutinize those applications. Nobody's tried it yet. It's coming. They're in research, but they haven't been commercially approved.
When that starts to happen, the first one to do it, unless they're an idiot, is going to just have the PR
blitz of a lifetime, because whoever is the first one to get
(49:08):
approved has cut ice for everybody else with the FDA. So it's a big deal when that first approval comes. But with that massive fanfare, and screaming from the mountaintops that you were the first, certainly there are going to be some patients, and surgeons, that are going to be a little apprehensive and go,
(49:29):
I'm not sure, right? That's going to happen, but I don't know.
The market's a weird thing. I mean, look how many people seem to be willing to sit back and say, you know, hey, ChatGPT, would you please write my legal brief?
(49:49):
You know, the trust might not be warranted, but I think there's certainly evidence out there that people are willing to trust AIs, and that therefore puts the burden on people like us to make sure that whatever we're implementing, by golly, that sucker had better be handcuffed
(50:12):
to the wall, and make sure that it cannot go Skynet and it cannot go off in the wrong direction.
Because when it comes to surgical robotics, I mean, you know, what are some comparisons for your average neural net? At best I've got a six-year-old's brain at a
(50:35):
highly specialized function. Yes, it can process more data than any human being, but its decision capabilities are still incredibly narrow, and yet I'm putting a weapon in its hand and I'm asking it to do something, effectively. Even at 20% autonomous, 80% surgeon control, I'm still putting some trust into a robot holding a weapon, and that is the perspective of the folks that
(50:58):
I've spoken to at the FDA. That's how they're looking at it, and so whoever tries to go through that door first, yeah, they've got some work on their hands. But it's going to happen.
And it's going to happen not only because the potential clinical benefits coming in the future
(51:20):
are just too much to pass up. It's not just about, oh my gosh, I'm going to nerd out over putting AI into a surgical robot. It's not that. The potential clinical benefits are real. The potential time savings too: even though the surgeon is still there, operating side by side, co-pilot, whatever you want to call it, I'm already looking at potential clinical applications
(51:45):
where I could cut a literal hour off of OR time. That's huge. You can't pass up an opportunity like that, and so those are big deals, and we won't be able to hold that dam back forever.
So, as those applications get developed, yeah, I've got to make
(52:06):
damn sure that the AI is only doing what it's allowed to do and what it's supposed to do, because we can't predict every outcome. So it needs to at least have enough pseudocognition to know when it's about to be asked to do something it shouldn't be doing. But then, where does that happen?
(52:27):
I know we haven't talked about this, and we're about to click over an hour, and I don't want to take up too much of your time.
Speaker 1 (52:35):
My next appointment's at 10.
Speaker 2 (52:36):
So I've got an hour.
So, you know, where does that processing happen? Do I put in a fleet of NVIDIA, I don't know what's going to be the hot card three years from now,
(52:56):
the 5090, the 6090? Do I put that, with a small nuclear reactor, beside my surgical robot and do all of this based on edge computing? Which means I now have to have an architecture for downloading all of my latest learning data from the cloud, where I'm doing
(53:17):
aggregate processing. I'm aggregating all of the surgical robots, everything they've learned, everything they've encountered. I'm pulling that up into the cloud, I'm doing processing up there, I'm vetting it, and then I'm pushing it down to the edge, because the hospitals, I hope, won't allow an active internet connection anytime soon. So I can't do my processing in the cloud.
(53:39):
Do I want to, even if I could?
So I have to make those decisions. Like, how much is it really going to cost to have sufficient edge computing in every device? And do I want it to learn locally, or do I only want it to do what it's already
(54:03):
vetted to be allowed to do, and take edge cases, false positives, false negatives, whatever, push those up to the cloud, allow them to be processed and vetted up there by a team of computer and data scientists, and then, once they've vetted those new learning models, push those down to all of the computers? That needs to be figured out, right.
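The loop being described, edge cases flow up to the cloud for human vetting and only vetted models flow back down, can be sketched roughly like this. All of the class and field names here are hypothetical illustrations, not from any real system:

```python
# Minimal sketch of the edge/cloud loop described above: robots never learn
# live during a case; they only ever run models that were vetted centrally.
from dataclasses import dataclass, field


@dataclass
class CloudHub:
    """Aggregates edge cases from all robots; only vetted models ship back."""
    inbox: list = field(default_factory=list)  # raw edge cases awaiting review
    vetted_version: int = 1                    # latest approved model version

    def upload(self, case):
        # Robot -> cloud, during a controlled sync window, not live in the OR.
        self.inbox.append(case)

    def vet_and_release(self):
        # Stand-in for the team of computer and data scientists: fold the
        # reviewed cases into a new model and bump the approved version.
        if self.inbox:
            self.inbox.clear()
            self.vetted_version += 1
        return self.vetted_version


@dataclass
class SurgicalRobot:
    hub: CloudHub
    model_version: int = 1
    pending: list = field(default_factory=list)

    def observe(self, case, is_edge_case: bool):
        if is_edge_case:
            self.pending.append(case)  # queue it; never learn from it locally

    def sync(self):
        # Periodic, controlled connection: push edge cases up, pull only the
        # vetted model version down.
        for case in self.pending:
            self.hub.upload(case)
        self.pending.clear()
        self.model_version = self.hub.vetted_version


hub = CloudHub()
robot = SurgicalRobot(hub)
robot.observe({"scan": "unusual anatomy"}, is_edge_case=True)
robot.sync()            # edge case goes up; nothing new comes down yet
hub.vet_and_release()   # humans vet in the cloud, approve a new model
robot.sync()            # the robot now picks up vetted version 2
```

The design choice this illustrates is the one he names: the robot has no learning path of its own, so every behavior change passes through the human vetting gate.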
(54:26):
And then at some point we get into, you know, what's the holy grail that we're really trying to achieve here? Right, we want the autonomous surgical robot from the movie Prometheus. Right, I've got an alien in my abdomen, emergency C-section,
(54:47):
and it just knows what to do. And you need that at some point. You know, moon base alpha and Mars colony, if that ever happens. I'm a little skeptical of the Mars thing, but at some point we're going to be in space, right? And I don't know if you remember, there was a surgeon in the Antarctic, the only surgeon
(55:12):
in the Antarctic at the time, who had a burst appendix. Well, who's going to do that surgery? The answer: he operated, he took out his own appendix. Oh, my God. And you can find pictures of this. It's pretty wild.
So, you know, when you have these
(55:36):
orbital platforms, you've got moon base, we send a bunch of people to Europa, whatever, it's going to happen someday, right? So how many surgeons are going to be on board those spacecraft? How many of them are going to be on the remote colonies, and what happens if something happens to them? What happens if there's a crisis and you need more than
(55:57):
one surgeon, because you've got 10 people that all need life-saving care? We've got to have surgical robots just to take care of that.
We've got 8 billion people. I don't know if you've ever looked into what surgery is like for rural India or rural China, but it ain't pretty. We need telerobots in those locations just to allow the
(56:21):
surgeon that is in Shanghai to operate on somebody that's in, like, you know, a less developed area, Guangzhou or something like that, right? Same thing for India. We need that to help those surgeons. But at some point, we're over 1.4 billion people in India, right?
(56:43):
And at some point we need autonomous surgeons to do the cookie cutter surgeries, right? If I've got a burst appendix, at some point we really would benefit from having a surgical robot in the back room of Walgreens.
This scenario sounds scary, but really, if you had that happen...
(57:08):
You know, I'm here by Johns Hopkins. I know what the ERs look like here. I've talked to ER doctors. My wife had an injury a while back, and, you know, six hours of
(57:28):
sitting there with a compound fracture is not time that anybody wants to spend.
And if you've got triage robots that can do certain cookie cutter tasks, now, biology is not standardized. Human beings do not fit into easy patterns, unfortunately, but there are certain things you can do. You could probably build a robot that can set a bone.
(57:52):
And if it's a femur, you want to do that as fast as you can, or your patient will die. You can actually die from certain broken bones, right? And so, having triage robots that can be handling these kinds of cookie-cutter procedures, we need that. It's a matter of practicality. It's not an issue of whether or not there's patient acceptance
(58:13):
or doctor acceptance.
We're getting to the point where there are 8 billion people on the planet, and there simply aren't enough available practitioners in a local geographic area to handle not only just normal traffic flow, but mass shootings or a dirty bomb. I mean, I don't want to get into
(58:35):
dystopian kind of stuff, but, you know, shit happens, right? Volcanoes, earthquakes. It's not just bad people, there's just nature stuff that happens, and it creates this massive burden on the health care system that we can't deal with as a simple HR problem. I can't just be like, well, we'll go hire, we'll go recruit
(58:57):
some doctors. From where? The roads are gone, you know. So in those situations, you've got to be able to have some kind of technology that's going to be able to back up the practitioners and provide that
(59:18):
kind of cookie cutter care, and that's going to handle 80% of your use cases.
And if 80% of your use cases can be handled by a robot or an automated system of some sort, well then hopefully your actual human practitioners can handle the remaining
(59:40):
20% of your edge cases. That's what you've got to build. That's what the market needs today. This isn't a future dystopia where people don't get health care if we don't have this kind of technology. People aren't getting health care today, because the systems are already overburdened and already unable to provide the
(01:00:04):
kind of care that we need, affordably and when it's needed, to all of the patients that are requiring it. And so we're already in a healthcare rationing situation. That's not just a United States thing, that is a planet thing, and so we have a need for this technology. I mean, the hype around
(01:00:28):
AI is horrific as it is, but in terms of the need, I don't think you can oversell the need and potential for robotics and AI in healthcare. The need is there now. The technology might not be mature enough, but we've got to
(01:00:53):
get there. And there's no ifs, ands, or buts about it.
Speaker 1 (01:00:58):
That segues
perfectly to my next question, and this will be my final question. How do you decide to place these medical robots, surgical robots? You talked about them being a couple million dollars; that seems like a high barrier to entry. So part one of the question is, how many robots are in service
(01:01:19):
right now, and how do you decide strategically to place them? And then, I'm assuming you have limited production capabilities. You can't just be making millions of robots a year. So how do you balance that, and how do you juggle that challenge?
Speaker 2 (01:01:37):
It's a huge, huge
problem that, I think, if you solve, there's a Nobel Prize in it for you. The issue with surgical robotics in general, and this is a universal axiom, is that no one solution solves all problems, right?
(01:01:57):
The robot that can take out your appendix is not the robot that does LASIK for your eyes. Sure, sure. And just to throw out a couple of little buzzwords, gear ratios alone: if I built a da Vinci with the level of precision required for ophthalmological surgery, I am over-engineering the da Vinci
(01:02:18):
orders of magnitude beyond what is necessary to take out an appendix, or to take out a prostate, or, you know, do a hysterectomy or something like that.
So there isn't one robot to rule them all. And that creates a production problem, because, you know, there are 170 robots on the market today. Eight of them do pedicle screws just for spine fusion.
(01:02:41):
So even within a subsegment of surgery, there's a limited number of robots that will experience consolidation. So maybe we go from eight to four over the next 10 years, but there are still 170 other robots, and so that creates a real interesting challenge there.
(01:03:01):
And then the next one is, okay, you know, ROSA is like 1.7 million. It's primarily a neurosurgery and orthopedic robot. Da Vinci is incredible at abdominal surgery. It's two million and change when you add on the accessories, plus a 10% service model every year for break-fix and
(01:03:25):
maintenance. So it's like, oh my God, the numbers just add up. And then you have these little departments, like ophthalmology or laryngology, where I can still benefit from surgical robotics, but even if the da Vinci was good for doing laryngeal or vocal cord surgery, your voice and throat surgery
(01:03:51):
department cannot amortize a two million dollar robot over any period of time. It just doesn't fit the financial model. So you've got different robots that have to have different price points so that they can fit into different models. So that's a huge problem right there.
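The amortization math he's gesturing at is simple to sketch. The roughly $2M price and 10% annual service fee come from the conversation; the depreciation window and case volumes below are made-up illustrations:

```python
# Rough per-procedure cost of owning a surgical robot. Annualize the capital
# cost over a depreciation window, add the yearly service contract, and
# spread the total over the department's annual case volume.

def cost_per_procedure(capex, service_rate, years, procedures_per_year):
    """Annualized capital cost plus service, divided by annual case volume."""
    annual_cost = capex / years + capex * service_rate
    return annual_cost / procedures_per_year

CAPEX = 2_000_000   # ~$2M robot, per the discussion
SERVICE = 0.10      # 10% per year break-fix/maintenance, per the discussion
YEARS = 7           # assumed depreciation window (illustrative)

# Hypothetical volumes: a busy urology department vs a small laryngology one.
busy_urology = cost_per_procedure(CAPEX, SERVICE, YEARS, 800)
small_laryngology = cost_per_procedure(CAPEX, SERVICE, YEARS, 50)

print(f"urology:     ${busy_urology:,.0f} per case")
print(f"laryngology: ${small_laryngology:,.0f} per case")
```

Under these assumptions the low-volume department pays more than ten times as much per case for the same machine, which is the financial-model mismatch he describes.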
And because of that, right now, most of your surgical robots are
(01:04:12):
stuck in teaching hospitals, where the weird Dr. Strange procedures are being performed, but that's not where the highest volume of procedures is being performed. So if I'm, like, the Texas Spine Institute, I'm doing tons of back surgeries, right? These aren't
(01:04:34):
crazy use cases that are going to the teaching hospitals. I don't want to denigrate anybody's need for surgery by calling these run of the mill, but certainly they're not weird case studies in spine surgery. They're relatively routine, which is why a group like the
(01:04:56):
Texas Spine Institute can churn out hundreds and hundreds of spine surgeries a month, because they're bringing in patients that just have really the same kind of procedure over and over again, and that's great. But they've got high throughput requirements and they've got
(01:05:16):
really tight margins. They're not a teaching hospital, so they can't financially justify any of the spine robots on the market today. So those spine robots stay at the teaching hospitals, and the ASCs, the ambulatory surgery centers, have no technology.
And so what you really need is for the next generation of surgical
(01:05:37):
robots to have cost-of-acquisition models that aren't just CapEx. So we need to see realistic lease-to-own models. We need to see SaaS models where I can place the robot. And if my robot costs a half million dollars to build internally, placing that thing for free and making it up on per-usage
(01:06:01):
charges, that's a pretty burdensome thing for the company to do, right? So these are really challenging situations.
So, for the next generation of robots, we need to see the price points come down.
(01:06:21):
We need to see those robots adopt the next generation of navigation and registration using actual vision, automatically registering to your CT scans and bone models and things like that. I can't spend 20 to 30 minutes touching fiducials and then tapping confirmation on the touchscreen to register the surgical robot. That's 20 to 30 minutes of OR time
(01:06:42):
that I just can't afford in an ASC. So we need to see the next generation of technology. That's AI, that's computer vision, that's integrating with visual systems, digital twins, right? So that's the whole next generation of software-based technologies, and we need to couple that with the next
(01:07:04):
generation of low-cost surgical robotics. And fortunately, that is possible.
I mean, LASIK has blazed the trail, right? You used to need to take a mortgage out on your house to get LASIK done, and now it's like I can get it done at the Walmart Eye Center. I don't know if they really do that yet, but LASIK has become
(01:07:27):
incredibly accessible and affordable, and the technology has improved to the point where patients that were turned away as LASIK candidates 10 years ago are now, like, slam dunks. The software capabilities and motion tracking and all of those required technologies for LASIK have just gone through the roof.
(01:07:50):
So the technology has continued to improve, and the price point has come down. We need to see that throughout the rest of the industry, because the robots have to move out of the teaching hospitals. I haven't looked into this in years, but I believe that the da Vinci's market penetration is
(01:08:10):
like 10 to 15 percent.
If you had a company that was 20, 25 years old, and you still were boasting that you had a 10% to 15% market penetration, HBR would be doing a case study on what a failure you are. And yet the da Vinci's maker, Intuitive Surgical, is leading the field.
(01:08:32):
They're the most successful ones, but because of the expense of that robot, they're stuck in the teaching hospitals. Good Sam, I think, is buying them, so there are some secondary hospitals, but the da Vinci hasn't really been able to penetrate the more rural centers and things like that. I think they're getting there, but it's difficult, because the
(01:08:54):
price point is very, very hard to absorb at a lower-volume hospital, or a higher-volume, tighter-margin hospital, which is what an ASC effectively ends up being. So the economic challenges are substantial, and those need to be met while at the same time improving the technology.
(01:09:16):
So you need better technology at a cheaper price. It's not impossible. I mean, the iPhone, or whatever your favorite smartphone is, is a perfect example of that. When I was a kid, RadioShack literally sold a transistor radio where one of the big graphics on the box was that it
(01:09:36):
included nine transistors. Why, this radio has nine transistors in it! I think, literally, the Apple Watch is a billion transistors, right? So the technology of the transistor has become so ubiquitous that I don't even think about it. In fact, I'm looking at stuff on my desk and I'm like, transistor, transistor, transistor.
(01:09:57):
I've got transistors in just everything around me. My desk lamp, I know for a fact, has multiple transistors in it. And so it's funny that that kind of technology has become so ubiquitous that we don't even think about it. And yet, when that thing was invented, it changed the world. And AI needs to do the same thing.
(01:10:18):
Right now, we're stuck in the hype cycle of AI. AI needs to go back in the toolbox, where it belongs, where you don't know that it's there, because it's working. And I don't know if you know who Don Norman is. He had an emeritus position, I think, at Apple, but he was at HP for a long time, and he was one of these big
(01:10:42):
visionaries in terms of computing, and he always used to talk about the ubiquitousness of computing. And that's what we want to see, right? I've got a smart bike now, and that's obviously got a computer in it, trying to figure out the most efficient
(01:11:06):
energy to apply to my pedaling, to give me a boost. That's driven by a computer.
Do I care? Do I know? My microwave's got a computer in it. Do I care? Do I know? Computers are so ubiquitous that I don't even know that I'm interacting with one anymore. And if you think about the early days of computing, that is
(01:11:28):
so science fiction, to conceive that computers would be so ubiquitous that I wouldn't even realize whether I was dealing with a computer or not. That's what AI needs to do. We need to get AI out of this hype cycle, where we're putting "AI inside" in giant letters on the box, and we need to get it to be so reliable, so functional, that we don't
(01:11:52):
care that it's there anymore, that it's just doing its job.
Like, when's the last time somebody asked you, I don't know if you're a coder, but there are 20 different sorting algorithms. When's the last time somebody asked you which sorting algorithm you were using to alphabetize a list? Nobody cares. It shouldn't matter, and it doesn't.
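His sorting example is literal in most languages. In Python, for instance, the algorithm behind `sorted()` (Timsort, a merge/insertion hybrid) never surfaces; callers just alphabetize and move on:

```python
# The sorting algorithm is an invisible implementation detail: you state
# the ordering you want, not how to achieve it.
names = ["saunders", "Norman", "Ada"]
alphabetized = sorted(names, key=str.casefold)  # case-insensitive alphabetical
print(alphabetized)
```

That invisibility is exactly the end state he's arguing AI should reach.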
Speaker 1 (01:12:14):
Right.
Speaker 2 (01:12:19):
And so that's where
things need to go. And, you know, right now, if you want to get a company funded, if you want to call your company a unicorn, it's going to do you good to put AI on one of your slides. That's just the nature of the beast. But AI is going to become most effective when we forget that it's there, and to me that's Shangri-La, right? That's the
(01:12:43):
holy grail. That's where we've got to go. And that requires reliability. That requires an incredibly robust infrastructure. We've got to figure out the line between cloud computing and edge computing and how that makes the most sense. And it makes the most sense for medical robotics in a different
(01:13:03):
way than it makes sense for diagnostics or for managing your EHR. All of those use cases have a different balance. We've got to figure out what those balances are, and apply them correctly, and apply them transparently. And so that's kind of my final sermon.
Speaker 1 (01:13:24):
No, this conversation
has been very insightful. It's been a learning experience for me. I mean, some of the stuff that you dropped is epic, things I hadn't considered, and just learning about, you know, Galen, and how you do what you do. Your thought process has been amazing and insightful.
Speaker 2 (01:13:42):
Hey, thanks,
appreciate it.
Speaker 1 (01:13:43):
Yeah, yeah, thank you so much. I will definitely be in touch, I think, based on our discussion. Let me see if I can end the recording. All right.