Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
And I have lots of scientists on the show, and I like to keep you updated on some of the most interesting things going on in the world. A lot of times it's physics. Today's not physics; today is the intersection of robotics and medicine, and just a remarkable story. You see from time to
(00:24):
time people, especially military veterans, but lots of others as well, who have an injury that requires them to get a prosthetic: a prosthetic hand, a prosthetic foot, and so on. And of course they need to manipulate things.
Speaker 2 (00:38):
Let's just stick with hands. You need to pick something up, and you and.
Speaker 1 (00:41):
I, if you're not working with a prosthetic hand, might take it for granted. But when you're picking something up, the way you pick it up is very much determined by what you feel in your fingers. You know: do I need to squeeze it harder? Am I touching it at all? Am I moving it? All this stuff. What about for folks who have prosthetic hands? We're joined now by Giacomo Valle, who is head of the Neurobionics Lab
(01:05):
in the Department of Electrical Engineering and Life Bionics at Chalmers University of Technology in
Speaker 2 (01:12):
Gothenburg, Sweden. He was in Chicago for a while as well.
Speaker 1 (01:16):
And he's just working on some incredible research in this area.
Speaker 2 (01:19):
So thanks for being here. I really appreciate it.
Speaker 1 (01:22):
Joining us from nine hours away, or whatever the time zone difference is right now, through the miracle of Zoom. Tell us a little about this research and what you're accomplishing.
Speaker 3 (01:33):
Yeah, sure. So, first of all, thank you for the invitation. This research falls in the field of brain-computer interfaces: technologies that interact with the brain, with the nervous system, and normally connect external devices, robots, computers,
(01:57):
with the scope of improving the quality of life of people with disabilities.
Speaker 4 (02:05):
This is the goal.
Speaker 1 (02:07):
Uh.
Speaker 3 (02:07):
This is research from a consortium called the Cortical Bionics Research Group. It's a cross-institutional collaboration between American universities, so the University of Pittsburgh, the University of Chicago, Northwestern University, and also Chalmers here in Sweden. So it is now international, and the scope is to restore sensory and
(02:34):
motor functions in people with paralysis, in particular spinal cord injury. If you think about it, after a spinal cord injury you have the interruption of every signal coming from and to the brain.
Speaker 4 (02:54):
Caused by the lesion. So the body and the.
Speaker 3 (02:57):
Brain are somehow disconnected. The way we approach this problem, since the brain of these individuals is completely healthy, is to connect an extracorporeal, so a robotic
Speaker 4 (03:14):
Or prosthetic device, think of it at the side
Speaker 3 (03:18):
of a wheelchair, to the motor and sensory areas.
Speaker 4 (03:23):
Of the brain. So I can go into more detail on that, of course.
Speaker 1 (03:29):
So, all right, I think I probably slightly misdescribed the issue you're working on at the beginning. These folks don't necessarily have a prosthetic hand, right? It could be their original hand, but due to some kind of, you know, typically it would be a
Speaker 2 (03:46):
Break somewhere in the spinal cord.
Speaker 1 (03:47):
And depending on how high up in the spinal cord
it is, you know, that's where you get the level
of your disability.
Speaker 2 (03:53):
So what you're.
Speaker 1 (03:54):
Working on is making it so that somebody who has
that kind.
Speaker 2 (03:59):
Of, let's say, quadriplegia, and can't.
Speaker 1 (04:00):
Use their hands, can again feel something with their hands. Would it also get them to be able to move their hands, or is that a different thing?
Speaker 3 (04:12):
So, there are groups trying to restore the body's own functions, so the arm itself. But our approach uses robotics. In that case, what we do is replace the real arm with a robotic arm. So what we do is
(04:34):
connect another robotic arm, with a prosthetic hand at the end, and we record from the brain the intention of moving the arm. The subject is thinking about moving his or her own arm; we are able to record that, decode it, and command the actuation
(04:59):
to the robotic arm, so the robotic arm can be controlled directly with the thoughts and intentions of the user. The novelty here is that it's not only about the motor part, or something that maybe you have heard or seen from Elon Musk with the Neuralink.
Speaker 4 (05:20):
But here there is like a step more.
Speaker 3 (05:22):
So we have sensation: we are able to send back information from the robot to the brain of the user.
Speaker 4 (05:31):
In order to.
Speaker 3 (05:35):
Recreate sensations, so the subject is able to tell if the object grasped is hard or soft, is curved or flat, the texture on it. So we can actually communicate back with the brain and
Speaker 4 (05:50):
Send artificial sensory feedback.
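The closed loop the guest describes, sensors on the prosthetic hand driving stimulation back to the brain, can be sketched in rough terms. This is an editorial illustration only, not the consortium's actual encoding model: the function name, the parameter ranges, and the simple linear mapping are all assumptions.

```python
# Hypothetical sketch: mapping a prosthetic fingertip's pressure reading to a
# stimulation amplitude sent back to the brain interface. Names and ranges
# here are illustrative, not taken from the research described in the episode.

def pressure_to_stim(pressure_n, max_pressure_n=10.0, amp_range_ua=(10.0, 80.0)):
    """Scale a fingertip force in newtons to a stimulation amplitude in microamps."""
    # Clamp the sensor reading to its assumed working range.
    p = max(0.0, min(pressure_n, max_pressure_n))
    lo, hi = amp_range_ua
    # Linear encoding: harder contact -> stronger stimulation.
    return lo + (hi - lo) * (p / max_pressure_n)
```

In this toy version, no contact maps to the minimum amplitude and a full-scale press maps to the maximum, which is one plausible way hard versus soft contact could be distinguished by the user.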
Speaker 1 (05:53):
It's just absolutely amazing. Just so I can picture this clearly: the robotic arm that we're talking about, is this a stand-alone thing that's sitting on something, or is it some kind of exoskeleton that you put around the person's arm?
Speaker 2 (06:09):
It's separate, because it's a third one.
Speaker 1 (06:14):
Yes, okay, yes, And so they think and that arm
starts to move and do the things that their regular
arm would be doing if they didn't have a spinal
cord injury. And your advance on this is that the
prosthetic hand on that robotic arm gets some kind of
(06:34):
feeling, where you're trying to approximate the feeling you would get in your own fingers, in the palm of your hand, or wherever the sensors are.
Speaker 2 (06:41):
That's it. So, this is going to sound like.
Speaker 1 (06:46):
A stupid question, but I want you to get pretty
in the weeds with me here.
Speaker 4 (06:50):
What are the.
Speaker 2 (06:51):
Benefits of that for the person who's doing it.
Speaker 4 (06:55):
Yes, so you can imagine that. So the person with the.
Speaker 3 (06:59):
With tetraplegia, so completely paralyzed from the neck down.
Speaker 4 (07:06):
Some interaction.
Speaker 3 (07:10):
They lose completely the possibility to interact with the environment, with the objects around. So the goal of this is to give back something to interact with the external world: to type on a keyboard, to feed themselves, to shake
Speaker 4 (07:31):
Hands, if you meet someone.
Speaker 3 (07:34):
So this is something that is maybe, again, able to make these people independent again.
Speaker 4 (07:43):
This is the goal, thanks to this neural connection.
Speaker 3 (07:47):
And so, this implant of electrodes in the brain, and this robotic arm. The system is composed of two parts: one is the brain interface, where we implant these tiny electrical contacts in the brain, and the other is the prosthetic device.
Speaker 1 (08:10):
Yeah, we're talking with Giacomo Valle, who is head of the Neurobionics Lab at Chalmers University in Sweden. You know, I realized it'd be a little bit odd the first time you walk into a room to talk to somebody who has one of these injuries and you shake hands with a bionic arm that's next to him.
Speaker 2 (08:31):
It's a little bit odd, but it also I.
Speaker 1 (08:36):
Mean I actually kind of had an emotional reaction.
Speaker 2 (08:39):
I did when you.
Speaker 1 (08:41):
Described, like, the concept of somebody who has that injury being able to have the feeling again of shaking somebody's hand.
Speaker 4 (08:53):
Yeah.
Speaker 3 (08:54):
Yeah, this is specifically important, because our sense of touch is, of course, involved in how we manipulate objects, in our motor control. We are dexterous because we have the sense of touch; we are able to interact with objects in an efficient way thanks to it. But not only that. The sense
(09:16):
of touch is involved in many other aspects, related to the affective area, and to the sense of ownership and awareness of your body. So the fact that the subject is feeling sensation from this external object, like a prosthetic hand, is making the hand more
(09:36):
part of the body, more incorporated, more embodied, accepted by the user. This is making everything more realistic and natural. This is a thing that the users are also reporting.
Speaker 1 (09:50):
So I have an interesting listener question. I have smart
listeners and they send a lot of interesting questions. Could
this be used for people with brain injuries that don't
allow the correct signal to be sent from the brain,
for example, people with cerebral palsy?
Speaker 4 (10:09):
Yeah, it's a good question.
Speaker 3 (10:11):
There are different neurotechnologies, brain-computer interfaces, that are now becoming a possible solution for patients, and not only in our case. What we do is restore, or try to replace if you want, arm functions, sensory and motor. But the same type of neuroprosthesis
(10:34):
can be used to restore speech, for example for people that have aphasia due to stroke or ALS, or also vision, in the case
Speaker 4 (10:47):
Of blind people that have the
Speaker 3 (10:50):
degeneration of the optic nerves or the retina. They are implanted in the visual cortex, and instead of tactile sensations you can evoke visual sensations, in order to provide some visual feedback.
Speaker 4 (11:06):
And there are
Speaker 3 (11:08):
Companies, in the US particularly, that try to commercialize it, to bring these devices to patients as fast as possible. So it's becoming a possibility
Speaker 4 (11:21):
for the future.
Speaker 1 (11:24):
Okay, you probably don't know this, but I'm president of
the Bad Analogy Club, and so I'm going to try
one on you now.
Speaker 4 (11:32):
So here.
Speaker 5 (11:33):
Yeah. Imagine you had some electronic device, and it had a connector that you could use for input and output to this device.
Speaker 1 (11:46):
But the connector has one thousand pins and you don't know which pin is what, and you have to sort it out before you can make a device that would plug into that thing and be able to control it. And what I'm thinking about, moving to a more basic, more fundamental level of
(12:06):
the kind of research that you're doing: how the heck do you interface with a brain in a way where you know where to go to pick up my thought that I'm trying to close my left hand?
Speaker 4 (12:24):
Yeah, so this question is exactly the.
Speaker 3 (12:28):
Challenge behind this technology, for us neural engineers. The point is that we have billions of neurons in our brain, and the brain is subdivided into areas where hundreds of thousands of neurons are responsible for specific functions.
Speaker 4 (12:49):
In this case, imagine the.
Speaker 3 (12:53):
Area where you control your arm, and then there is
the area where you sense from your arm, or like
you interpret the sensation coming from your arm. So what
we do is to we take some electrical contacts, so
these are called microelectro microelector the arrays, So these tiny
(13:17):
channels that we can implant directly in that area.
Speaker 4 (13:20):
That unfortunately right now are up.
Speaker 3 (13:23):
To one hundred or something like that, so we are
under sample a lot the neurons. But so what we
do is trying to we record the activity of these
neurons when the subject is trying to do something. So
we have this pattern of electrical activation every time that
you think about moving in different ways.
Speaker 4 (13:43):
So our goal, our actual challenge, is.
Speaker 3 (13:48):
To decode this activity and link this activity with the
movement of the arm. So we need to understand what
is the code of your of the brain to actually
command that these activities. So and at the same time,
when we want to restore sensation, we need to send.
Speaker 4 (14:08):
The correct code that the brain.
Speaker 3 (14:11):
Is able to interpret as a curve, as a flat surface,
as a as a rough texture. So the understanding the
code of the brain is actually and all the neuroscience
neuroscientific research behind this is actually that trying to understand
how the brain is uh encoding this information Okay, So.
Speaker 1 (14:34):
If I'm understanding you correctly, you put a bunch of microelectrodes in a brain, you tell the person, let's say, try to close your left hand, and then you measure which neurons are firing when the subject is doing that, and then you know that's the right part of the brain.
Speaker 4 (14:51):
Uh.
Speaker 2 (14:52):
Thinking about the scale, thinking about.
Speaker 1 (14:54):
The scalability of this, how do you how do you
generalize that in the sense that if you you know,
you find the right spot in my brain, but in
your brain is not going to be in that spot.
I mean it'll be close, but close isn't good enough
in this work, So how do you generalize this?
Speaker 3 (15:11):
That's another very good question. First of all, one important thing is that the brain is not exactly the same in each participant. But our brain is structured in a way that the area responsible for movement and the area responsible for sensation are in a specific
(15:33):
location in the brain.
Speaker 4 (15:34):
So we know, thanks to many.
Speaker 3 (15:36):
Studies, with monkey research, non-human primate research, and also fMRI, so
Speaker 4 (15:45):
the scans that we have of
Speaker 3 (15:51):
brain scans from healthy individuals. We know that there is a specific region that is activated every time you think about moving your arm. The point, though, is that that area is not exactly the same, and these electrodes are very, very small, so if you miss the area even by a few millimeters, you can lose or
(16:16):
not have exactly the same connection. So, unfortunately, I have to say that for now this type of technology needs to be personalized.
Speaker 4 (16:25):
With the user. In the future, we hope that we can.
Speaker 3 (16:28):
Generalize that completely and have a device that can be calibrated. But if you think, with novel machine learning algorithms and
Speaker 4 (16:38):
Artificial intelligence that can.
Speaker 3 (16:40):
auto-recalibrate the system, this can help, you know, to have a more personalized technology for the user.
Speaker 2 (16:51):
All right, two more questions for you.
Speaker 1 (16:52):
One one is from a listener, and I'm going to
rephrase the question a little bit. But theoretically, could you
use let's say the Internet or something in between the
brain connections and the arm and have this kind of
functionality available while you're sitting in Sweden with an arm
(17:14):
sitting here in Denver.
Speaker 3 (17:18):
Yes, this is actually a good point, because we are trying to do this. We can do it with the robots that we have in place, but we can, for example, also do this in virtual reality.
Speaker 4 (17:32):
So another thing that we do is that we can.
Speaker 3 (17:35):
Connect our participant to virtual realities. I think when actually
there is a virtual arm and they can move and
feel from this virtual arm in virtual reality. So in
principle we can also do this. We need just to
understand the bandwidth of the communication but also remotely so
(17:56):
make the patient able to.
Speaker 4 (17:58):
Move an arm.
Speaker 3 (18:01):
Yeah, connect it to your internet and then control something far away.
Speaker 4 (18:06):
Right.
Speaker 2 (18:07):
That's fascinating.
Speaker 1 (18:07):
I mean, we know there's already remote robotic technology where, you know, a surgeon can do surgery with a machine that's somewhere else and control it. But this is the first I've ever heard about getting feedback, so the user can actually feel something, which could really be a game changer depending on the application.
Speaker 2 (18:28):
All Right, my last question for you.
Speaker 1 (18:30):
Giacomo, and this is probably the most important one. I really feel like I need to ask you this because you're Italian, even though this was a thing on my show yesterday rather than today.
Speaker 2 (18:41):
But since you're Italian.
Speaker 1 (18:42):
I got to ask, is it okay to put pineapple
on pizza?
Speaker 2 (18:50):
No?
Speaker 4 (18:52):
It is not. I'm afraid it is not.
Speaker 3 (18:55):
Even if it maybe tastes good, it's something that we cannot tolerate.
Speaker 4 (19:00):
Sorry for that.
Speaker 1 (19:04):
Giacomo Valle heads up the Neurobionics Lab in the Department of Electrical Engineering and Life Bionics at Chalmers University in Sweden.
Speaker 2 (19:13):
Just some fantastic.
Speaker 1 (19:14):
Research and a great conversation and I really appreciate your time.
Speaker 4 (19:20):
Yeah, thank you, thank you very much for the invitation.
Speaker 3 (19:23):
And next time, if you want, I can bring with me one of our participants, who can tell you directly his or her own experience, so you can ask the person who's actually using this.
Speaker 2 (19:35):
That would be fabulous.
Speaker 1 (19:37):
And maybe when you make your next advance, something where you think, all right, we've got it, we've done a new thing and we can add to the story, we'll do it that way.
Speaker 4 (19:48):
Absolutely, thank you very much.
Speaker 2 (19:50):
Great. Yeah.
Speaker 1 (19:54):
That was awesome. That was absolutely awesome. I wasn't planning on talking to him that long. This happens from time to time, you know: you plan to talk to someone for ten minutes and then it just gets so interesting.
Speaker 2 (20:03):
I hope you found that as interesting as I did. That was just remarkable.
Speaker 1 (20:08):
Imagine, by the way, imagine being smart enough to do
that work.
Speaker 4 (20:13):
I can't.
Speaker 2 (20:14):
I can't.
Speaker 1 (20:14):
I don't think of myself as a dummy, but compared to that guy... Imagine being smart enough to do that work.
We're gonna implant stuff in somebody's brain, and then we're
gonna not only use that to move a robotic arm
just by thinking about it, but we're gonna design the
arm in a way that it can feel things. And
then we're gonna implant stuff in a different part of
his brain so that the person can actually feel from
(20:35):
the hands of the robotic arm, almost as if he
was feeling with his own fingers.
Speaker 2 (20:39):
Are you kidding?