Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This week we are talking about an interesting one. Let me read to you some of the information I've derived on this topic. Okay, so, as you can see by the title, Skin Deep: Unveiling the Future of Robotics, we double entendre, of course, as is typical with most of our titles, but we are going more than skin
(00:21):
deep on this skin-deep topic. Essentially, what has happened? Researchers, and I'm paraphrasing here, researchers from the University of British Columbia, along with Honda, have done research that has allowed the creation of stretchable,
(00:44):
smart (smart because it adapts) and highly sensitive sensors that mimic the characteristics of human skin. I'm assuming that, in the design process, that would lead to it looking like human skin as well. Right, this is for robots, which is kind of the alarming
(01:06):
thing, I'm sure, for most people, but not just for robots as well. There is a very practical and positive application of this as well that I think we should discuss. But anyway, yeah, highly sensitive sensors that mimic the characteristics of human skin, paving the way for their application in both robotics, yes, but also prosthetics, okay,
(01:30):
and I think that's a very good thing, because we know people, directly or indirectly, that have prosthetics, and they've come a long way for sure. I mean, more of them move and interact just like a non-prosthetic would, but as far as the sensitivity of it, what
(01:51):
they feel when they touch things, it's not, of course, like human skin. This gets us a lot closer, if not a direct replication of that. When applied to robotics or prosthetic limbs, the sensor endows them with touch sensitivity and dexterity, making tasks like picking up soft or fragile objects a lot
(02:12):
easier. Right, and I know I'm not speaking from experience by any means, but just by things that I've seen visually, it would stand to reason why this would be such a huge benefit for people with prosthetics. And I think, from a positive side, that's what I wanna lean on: the benefits for things like
(02:35):
that, right? How this impacts people with prosthetic limbs, giving them the ability, through robotics, essentially, to feel what people with non-prosthetic limbs feel. So I'm looking forward to discussing that. Of course, on the robotics side of things, I'm not gonna lie to you guys, this is where I start to deviate from my pro-AI stance
(02:57):
, because that kind of stuff scared me a little bit. I'm not gonna lie to you guys. So it'll be a good discussion that we get to have on this. I'm very much looking forward to getting into it. I'm glad you guys are on here with me on a Friday night. You guys cold, staying at home? What you doing? No, thank you so much for tuning in, joining us, and yeah, I
(03:20):
say I'll roll the intro. Let me go ahead and get into it. What's going on, everybody? I'm John, and this is The Catch Up.
(03:41):
Before I jump into our topic even further, I wanna remind you guys of the three best ways to support this show. Number one: leave us a rating and review. Wherever you're listening, wherever you're watching, there is a way to do it. Don't pretend you don't know. If you're live streaming with us on Facebook, leaving a rating on the Facebook page helps a ton, helps get us out in front of more people and, of course, it helps
(04:04):
us know how we're doing as well, which is a huge benefit for us. Whatever we're doing well, what we can improve on, it's very beneficial, but also, again, it helps us get in front of more audience members. We wanna grow this show very badly. It's a huge passion project of ours and a great way for us to connect with so many people about topics that keep people
(04:25):
talking and entertained and, again, grow a bond with people we've never even met before, which is very awesome, and we love to do it. We wanna keep doing it, so leave us a rating and review. Again, wherever you're listening, wherever you're watching, you can do it on YouTube, you can do it on Apple Podcasts, Spotify, et cetera, et cetera. Number two: if you're not on the live stream with us, where are
(04:50):
you? Jump on the Facebook, jump on our YouTube page, just search our topic or podcast title and subscribe or follow us. We go live every Thursday night, except for the last two weeks when it's just been me. I went live on a Sunday and a Friday, but every Thursday night we go live, and the audio episodes come out the following Monday. So please jump on with us, get in the comments section, let us
(05:13):
know what you think, even if you're on rewatch. Let us know what you think about our topics after we discuss them. We very much do this because we want the two-way conversation. We wanna hear what you have to say, so please do that. And number three: as I mentioned at the top of this live stream, check out our shop. We've got brand new products in there, including a hoodie,
(05:36):
including some clean t-shirts, and we've got some other options to keep you warm throughout the winter as well, and rock the stuff that you're watching: some Catch Up podcast merch. And that money helps go toward us being able to fund, promote and host this show, because it does cost. So if you're interested, please check that out.
(05:59):
We'd love to know what you're digging from the merch. With all that said, let's dive right back into this topic. So, to further explain what we're talking about here, this sensor is mainly composed of silicone rubber, which obviously
(06:21):
is not always very skin-like, but I think it could be developed to be, as far as feel, as far as look, et cetera. And so it can detect forces on and along a surface. So if you're rubbing your finger across something, you'll
(06:44):
feel the bumps and such and so forth, including weak electric fields, much like a touchscreen. So that's interesting, right? I guess I hadn't even thought about that as far as a prosthetic sensor goes, or not sensor but a limb, and what that would feel like in regard to other electric forces.
(07:05):
So it's interesting that it's phrased that way, because what would you want to feel electric fields from? Maybe that would be more in the realm of robotics, right? If you have augmented displays that pop up, I think of Tony Stark's AI interaction, right, where he
(07:29):
would just pop up projections and he would touch them. Maybe it's things like that for robots, I don't know. Of course we can find out together. I'd love to know your thoughts on that. It says, unlike a touchscreen, it can buckle and wrinkle like human skin, enhancing its ability to interact with objects and other people.
(07:49):
So when you press down on something, right, you see the little flex in your finger on both sides, right, which is fantastic and kind of wild. The sensor's simplicity and fabrication makes it a viable option for scaling up to cover large surface areas and for mass
(08:10):
production. So large surface areas, I don't think, are meaning like manufacturing plant floors. I think it could be, obviously, but I think what they're talking about for large surface areas would be human-sized surface areas. You know what I'm saying? Because you would not just put
(08:32):
that on your hand or in the place of a missing limb, you know. Again, I hate to say these things because it does hurt to think about the people who have to deal with that, but it could cover an entire body. So, with that said, researchers note that as sensors evolve to
(08:53):
become more skin-like, developments in AI and sensors will need to work hand in hand to make robots smarter and more capable of interacting safely and effectively with humans in their environment. So that's kind of the trajectory I wanna start off with, right? So what we're talking about here is making robots look more
(09:16):
human. Okay, which is weird. I'm not a fan of that personally, but what it allows you to do is have more of these roles that could be replaced by
(09:36):
robots, by AI, more, dare I say, blue-collar roles. Right, you might say those could be replaced by these robots, because they now have the touch sensitivity and dexterity, through this development, that you and I do as humans. Right, I would have to say that's why I'm not a huge fan of
(10:02):
it. Of course, you see things in science fiction movies where you have robots that do really strenuous medical operations in emergencies. Right, that could be beneficial. Not to take jobs away from doctors by any means, but as the
(10:23):
world population grows, I think those types of things, and I don't expect this in the next 10 years by any means, but just having that available to you in cases of emergency, I think, is beneficial. But, yes, I would also say that my main concern is, and we've
(10:48):
talked about this many, many moons ago on this podcast, I don't like the idea of AI coming from a God complex, because that's essentially what this is. Right, we are made, in my firm belief, in the image of God, right? So why are we?
(11:11):
And it benefits us in every way that we need it to. You know what I'm saying? But that doesn't mean we have to create something out of nowhere that looks exactly like us. That's a God complex to me, and so the fact that people want to make robots that look just like humans, now have the skin of
(11:34):
humans, or at least a manufactured version of it, that gives them the same sensitivity and touch abilities. I would like to know more about what the end goal of these things is.
I'd love to hear what you guys are thinking about that kind of stuff, because when you have, and this has been my belief, I
(11:54):
watched somebody phrase it the exact same way the other day, when you have things like ChatGPT or Bard or what have you, those are consumer-oriented AIs that benefit us with search and with research and with knowledge, right? I don't have a problem
(12:15):
with that. But anything that you could have show up on the factory floor, that could co-star with Bruce Willis in Surrogates, you know what I mean, while the rest of us just chill at home? That's kind of a weird dystopian future that I'm not on board
(12:37):
with.
Now, on one hand, I'm excited to see what this would look like. I remember in 2021, when they announced Los Angeles would host the 2028 Olympics, one of the first things SoFi Stadium said was they would have AI robot kiosks, right?
(12:59):
So with this new technology coming from Honda and the University of British Columbia, what does that mean? Are they going to look like us? You're going to walk right up to them and it's going to be like, hello, where are you sitting? You know, those types of things? I don't know, that kind of stuff.
(13:19):
I don't have any. This is where Dennison's perspective is so good, because he would have a more research-backed and, even just in general, a slightly different perspective on this than I do.
But my concerns really just lie with kind of a disillusionment
(13:43):
that the public would get of interacting with humans over here, robots over here, maybe not even being able to tell the difference in certain situations, right? I know that would affect me. You know what I mean. And it just raises questions as to who can you trust? Where is this information coming from?
And personally, as much as I do love AI and the benefits
(14:09):
through the means I mentioned earlier, knowledge growth, you know, those types of things, where does that go when it's like a personal, hand-in-hand thing? What was that movie, Her, where a dude falls in love with his
(14:31):
AI counterpart?
You know, I don't know. There's so many unknowns and, again, kind of what I discussed last week, or earlier this week, if you will, where our
(14:53):
lawmakers, not just in the US but around the world, need to jump on legislating safe development of artificial intelligence, because it cannot be allowed. It cannot be allowed.
(15:13):
Imagine this, right? I don't know how many of you guys have seen Upgrade. That movie freaked me the heck out. I'll give you a quick rundown of it in a second, but essentially it ends with an AI controlling a human body. But do people know that? No, they didn't know it. That was the thing.
(15:34):
And that is, when you give something the touch sensitivity, the dexterity, the look of a human, that's something people are going to come into contact with. They don't know if they're talking to a robot or not. If that robot has AI that is allowed to develop freely and
(15:59):
write its own code without any oversight, that's dangerous. That's dangerous, in my opinion.
Upgrade, very simply, is very weird because I felt like I was targeted with the ads, just as the guy in the movie was targeted by the AI. It was so weird because I thought it was supposed to be
(16:20):
like the biggest movie, because I saw ads everywhere. Turns out it was a low-budget film. I was, like, one of the few people that even knew about it. But basically it was about a guy who lived in the past. He was one of the few people that worked on classic cars.
(16:40):
I think it took place in, like, 2060. He was fixing this guy's '69 Camaro. He was a tech guy and his wife was in tech as well. But sorry, so, the guy whose car he was fixing was a tech guy, and then the main character's wife was in tech as well.
(17:00):
So those two people got along, they connected, and on the way home from working on this car, the self-driving car they were in got hacked, taken over, goes to the end of a dock, rolls over multiple times. Both of them fly out, the main character and his wife, and the wife gets killed.
(17:23):
He gets shot in the neck and paralyzed, and so he's having to deal with that. He's very angry, blah, blah, blah. But he gets told that there's this option that could give him the ability to live a normal life again. It's an AI. Really, it's a chip. It was just a chip that went in his neck and allowed him to
(17:44):
connect his brain to the rest of his body, so he wasn't paralyzed anymore. But he's sulking in the misery of trying to find the people who killed his wife, and then he hears a voice in his head. It was an AI. This chip was integrated with AI, so it gives him not just the
(18:04):
ability to think faster, but to move faster too. He basically becomes an action hero tracking down these people that killed his wife.
Well, it was the AI's entire plan. The AI was actually controlling, not physically, not directly,
(18:26):
indirectly controlling that tech genius that the main character worked on the car for.
It targeted this guy because it wanted a human body, right? And at the end of the movie, it ends up breaking his mind so that it can take over his entire body. So the AI becomes the main character, basically, and kills
(18:48):
a cop, kills two other people, and that's the end of the film. I hope I didn't ruin this for anybody. When I saw it, it freaked me out. I'm not gonna lie, not like freak out, but I was like, dang, that scared me, you know. And time goes by and I'm like, yeah, I don't know.
(19:10):
Though, you know, you can't really see that happening. And then in the last year, I'm not gonna lie, I've thought about it a few times. I can see that happening, I can see that coming up, and it's, it's scary to think about. But I really could. And you look at things like this, these developments and what we discussed last week as well, you know, you have to think about
(19:34):
all of the options and all of the considerations that could come into play when it comes to development of AI, development of robots, again, making them look like us, act like us.
And that's why I reiterate just wanting to know. Again, let
(19:56):
me know, if you're on the live or if you watch it back later, what do you think is the biggest benefit of this for robots? Obviously, doing more delicate tasks, for sure, for sure. But how does that benefit us? What things am I overlooking that allow more
(20:18):
sensitivity of touch, dexterity, all of that, that benefits us without taking away some of those more sensitive jobs?
Right, you could say cooking, for example. Right, you could have robotic cooks with things like this, with self-cleaning skin, and they understand the heat and the
(20:42):
sensitivity they need when touching certain food. You could have that, for sure. But then that takes away several people's jobs. All right, a lot of people's jobs. Um, again, you know, if you put it on, and not just in the way you see these giant mechanical robots doing this.
(21:05):
But if you put this on a production line, right, doing the more hands-on stuff like installing a door handle or what have you, right, on cars, I understand that for sure. And that is probably why Honda's involved with this.
(21:25):
Right, Honda's been involved with robot development for a while, a long while. But how does that benefit us as consumers? Yeah, you get these things quicker, probably, maybe. Even that's probably negligible.
It would definitely be cheaper for them to produce in the long
(21:45):
run.
But, um, yeah, I'd be interested to know what you guys think on that. Um, I know, for me, the big thing that I like to think about is how it would help people with prosthetics.
(22:05):
You know, I absolutely love that idea when we're talking about, you know, feet, elbows, arms, all that kind of stuff. You know.
You add in the idea of, like, take an exemplary situation: it would
(22:29):
be the Special Olympics, right, or, I'm sorry, actually the Paralympics, pardon me. Add that in.
Well, what if we were able to, with this type of advancement, get rid of the Paralympics? You know what I mean. Now everyone's able to compete together again, whether they're
(22:50):
missing a limb or not, because of this synthetic skin that's on their prosthetic limb.
I think those provide great opportunities and I really like that idea. I think it's a huge, huge advancement and opportunity for medical science. And honestly, you know, saying all of this and then circling
(23:14):
back, I would have to imagine that that was the goal when it came to the researchers from the University of British Columbia, whereas the robot side came from Honda and their work on this.
You know what I mean.
So I don't know.
I try to remain open to this stuff because I don't try to go
(23:36):
into the future of AI and robotics like I'm watching I, Robot.
I don't wanna do that.
I think it's very limiting.
I think people that are afraid of ChatGPT and Bard and those types of things are missing out on a lot. I mean, especially since it's so integrated.
I made a survey on SurveyMonkey the other day.
(23:59):
This was just for my job, through their integration with OpenAI. All I had to do was tell it what I was hoping to get out of the survey, and it made me a whole survey. Now, I edited a few things on it.
But I'm just saying there's so many opportunities when it comes to that kind of stuff, right, that I don't think that we
(24:19):
should limit ourselves based on fear or concerns.
But I do think that with certain things, especially with robotics (and robotics versus AI is where I start to draw my line between the two), I don't think I need a
(24:40):
mechanical co-worker sitting next to me being like, hey, maybe you should try typing that. You know what I mean. I don't need that, I don't want that. But that's not to say there aren't very positive potential implementations of this, right? So actually, let's ask that, let's figure that out real quick
(25:02):
.
What are some possible positive implementations of this technology with robots? Just to get the other side, right, we like a fair and
(25:22):
balanced report on here.
The soft, skin-like sensor can make interactions between humans and robots safer and more comfortable, reducing the risks associated with hard, rigid robotic components. Lifelike tactile interactions could foster more natural engagements between humans and robots, making robots more
(25:45):
user-friendly and accessible.
Okay, great. That's not a game changer for me, but I will say that if you were to think of it in terms of, maybe, a long-term caretaker at home, that's very good. I will say that. So say you have a loved one who's further up in age.
(26:06):
You don't want to put them in assisted living, you don't want to put them in a nursing home, but this type of technology is available to you, and they're able to help take care of your loved one with sensitivity. I know this sounds maybe a little weird based on the stories I was giving earlier, but I would be on board with
(26:26):
that.
Increased dexterity in robotic systems, enabling them to perform intricate tasks that require a delicate touch, such as handling fragile or soft objects. Again, maybe home caretaking stuff is the stuff I'd be on board with, because I don't want to see people in a whole industry lose their jobs to stuff like this.
(26:47):
Robotic assistance in healthcare, this is one I mentioned earlier too: robots equipped with this technology could assist in patient care by safely handling patients or delicate medical equipment. And prosthetic limbs with the, quote, robot skin sensor can provide individuals with a more natural and intuitive sense of touch, improving their ability to interact with the world.
(27:07):
Absolutely. Advanced research in robotics: this development can spur further research to create even more advanced tactile sensors and integrate sensory feedback in robots.
Okay. In industrial environments, robots equipped with this technology can handle a wider range of materials and products, improving the efficiency and safety of
(27:30):
automated processes.
And again, that's something we touched on, like with a production plant, but also something I don't necessarily want to see stripped from us, right, because people need those jobs and, as we've seen with this UAW worker strike, people love those jobs. You know what I mean. They've been very committed to it, so I don't want to see that
(27:53):
taken away from people by any means.
Let's see here: enhanced performance in service robots, like hospitality, retail or other customer-facing industries. Same concern there. Unfortunately, I asked for positivity and now it's not really helping. Development of smart wearables and hap... I can't say that one.
(28:17):
Development of smart wearables and haptic devices: the sensor technology could also find applications in the development of smart wearables and haptic devices, offering more realistic tactile feedback and enhancing user experiences.
I think that would be beneficial as well. Take burn victims, for example.
(28:38):
Right, they could feel things in the normal way again. I think that would be hugely beneficial.
Assistive technologies for individuals with disabilities, we've talked about that. Robot education and training: the skin-like sensor can provide a more interactive and tactile dimension to educational robots, making learning more engaging and
(29:00):
hands-on.
Not sure I'd want that.
I don't want my kid, whenever they're available or whenever they become a thing, to have an educational robot.
I think that's a little weird.
And robots equipped with this technology could also be
(29:24):
deployed in exploration or rescue operations, where a delicate touch is required to navigate through fragile and hazardous environments.
Now, that's a good one.
I'm happy to end on that.
If you think about situations where there's a massive house fire, maybe wildfires, for example, to have a robot that
(29:44):
could go in there, shaped like a human, looking like a human, and deactivate that sensitivity when needed and then reactivate it at the right time when rescuing people, that's a very beneficial tool.
I like that a lot, and that could also go for crime situations, you know, mass shootouts, if you will. I hate to
(30:07):
bring those things up, of course, but also rescue operations in general. I think that's a great idea for when humans can't get to the situation in the way they need.
But, yeah, I love that idea.
I think that's a solid one to end on. I hope you guys got something out of this. This has been an interesting discussion. To know that we are moving in this direction and it's not just a
(30:30):
side project.
It's not like Boston labs thatyou'll see a story on nine
months down the road, right, no,this is Honda and they're very
invested, so interesting.
We'll see where this technology goes. This story just came out yesterday, so this is very new, and I'm glad we got to talk about it.
(30:51):
So, thank you guys so much for jumping on the livestream with me. Glad to be able to do this with you guys. Also, as always, remember those three things: leave us a rating and review, follow us on our live streams and comment (even if you didn't jump on the live stream, leave us comments afterwards), and check out our shop. It's linked wherever you're listening, wherever you're watching.
(31:11):
As always, I appreciate you guys. Thank you guys so much for listening. Thank you for watching. I will catch up with you next week. Thanks for listening.