
February 15, 2011 41 mins

Will tomorrow's school systems be dominated by infallible robotic instructors? Tune in as Julie and Robert explore the bounds of education, artificial intelligence and human-robot relationships in this podcast.




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Welcome to Stuff to Blow Your Mind from HowStuffWorks.com. Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Julie Douglas. Julie, I know sometimes it's difficult to find a babysitter at the last minute, but have you ever turned to the Roomba? I've thought

(00:26):
about it. I mean, obviously I thought about my cat first, and then the Roomba, because I thought, you know what, she could herd my daughter. I mean, she's a two-year-old toddler running around. The Roomba could just chase her around. Oh, it's not a big deal. Is the cat female or male? Well, still, I mean, male cats are, or at least used to be, batting, you know, little kittens around and keeping them in line, right? So, right, right. Yeah, between

(00:48):
the Roomba and my cat, Owen, I think it could work. Yeah, yeah, yeah. The cat disciplines, and the Roomba can just sort of map out rooms and occasionally bump into the child and let it know that it's loved. Right, exactly. Yeah, that interaction, it would be really enriching for my daughter, I think. Yeah, well, this is exactly the thing that we're going to talk about for twenty minutes

(01:08):
or so here, or longer, I don't know. And that's going to be: can we turn to machines to babysit children, to raise children, to, you know, can we just pretty much hand them off to robots at birth? Or will robots be doing those deliveries as well? Who knows. And is it a cool idea? Is this a frightening

(01:30):
prospect for the future? I don't know. I mean, sometimes I wonder if we're entering into the winter of our robot discontent, because, I mean, it's a little bit, yeah, you know, robots rearing a child could be a little bit scary. Yeah. And yet it's one of those things we were talking about yesterday, just about robots, like, what can a robot not do? You know?

(01:52):
It seems nothing. Did we get enough negatives into that for you? We've just been dreaming up ways that they can enter our lives and take over different responsibilities for ages. I mean, the term robot comes from, I believe, the Czech word for slave or something.

(02:12):
You know, it's very tied to the idea of: let's make something that will do things that I don't want to do. So child rearing, you know, we've sort of reached that point by default. But we've already handed off a lot of our labor to robots: the vacuuming, for instance, a lot of industrialized functions. We're making a lot of

(02:34):
headway when it comes to surgery. Robots have been able to demonstrate the ability to compose music, to create art, to play Jeopardy. Played Jeopardy? Is that going on right now? I think it's about to. I think it's around the corner. They can already beat me at Scrabble pretty easily, if my iPhone is any indication. So,

(02:57):
can they raise children? I mean, it's not so much of a stretch, right? I mean, think about right now: television, iPad, iPod. I mean, there's a ton of things that we use, some of us, and I'm guilty sometimes, the computer, to put our kids in front of and sort of babysit them for a couple of minutes, right? So it's not too much of a stretch of the

(03:18):
imagination that you would use a robot, and that it actually may be enriching, and it actually may be programmed to actually teach your children stuff. Right. Yeah, it's like Teddy Ruxpin, right? Teddy Ruxpin was like a robotic cuddly bear; you just put a tape cassette in its stomach and then it would blurt out all sorts of wonderful, entertaining stuff. Right.

(03:39):
It was super creepy at the same time, right? So yeah, which I guess is the whole point of this. So there's also the idea that once you introduce technology into the mainstream, we're not really going to go back, right? Like, we're not going to quit using our iPods or iPads to entertain our children from time to time. If we have a robot introduced as a babysitter, most

(04:01):
likely we're not going to back off of that anytime soon. And in fact, this is happening right now. Like, yes, well, in Japan. In Japan, where robots are everywhere, because they have a rich tradition of loving robots, they've been far more into sci-fi than we are as a nation. For ages, they have all these guys

(04:22):
who grew up with the dream of robots, and they all talk about it in interviews, that they grew up watching, like, robots in cartoons, and then they get into robotics and they're creating that vision sort of piece by piece, you know. Nobody's creating, you know, whammo, here's a robot human that does all the things that a human does, but, like, literally any tiny or even large, even some

(04:43):
more complicated tasks that you can think of. Like, there's this huge industrialized-food convention that happens in Tokyo every year, and there you can go and you can see robots making sushi, robots, you know, frying up, they are frying the food and stuff, and it is delicious, right? Like, I

(05:04):
mean, they interact with everybody. Yeah, yeah. That, of course, is the other huge thing in robotics: making robots that can interact with us and can interact with a human environment, socially intelligent machines. But in Japan, there are two particular examples of these robots that have come out that are very much

(05:26):
in line with the idea of a robot nursemaid or robot babysitter. One is RIBA. That's R-I-B-A, not to be confused with the country singer, right? And it doesn't have a big shock of red hair or anything. But it does look like a giant, very cute, very cartoony, very kawaii, I believe, is the term. Yeah,

(05:48):
cuteness factor. Yeah, very, very cute looking. It looks like a giant teddy bear, except, you know, it's not soft, it's kind of hard looking. But it can lift patients. In fact, it's designed to lift up to a hundred and thirty-four pounds of, like, old person from a chair, I guess. And this is a child.

(06:08):
This is the robot that's, like, four pounds, right? Four pounds. Without the giant cute face, it would look horribly menacing, right, coming at you to pick you up. Yeah. So I guess that's the point of putting the kawaii face all over it. Yeah. And, you know, it's that Hello Kitty thing, if anybody's having trouble, like, picturing this: just Hello Kitty everywhere. And

(06:29):
even in Japan, they recognize that kawaii can be creepy, because they have the whole kawaii noir thing going on. Where? No, seriously, it's like they have cute stuff, but it's also, like, really grim. I wish I could think... there's an artist by the name of Junko

(06:49):
or Junkio. I'll have to link to it on the blog post that goes with this. But she's really heavy into, like, creating these cute things, but they're all, like, doing horrible things as well. Or Gloomy Bear. Yeah, so. But RIBA is not gloomy. RIBA is very happy looking and has these big padded claws to lift people and things. Another example: there

(07:12):
is a major retailer in Japan called Aeon Company. Yeah, sort of like a Macy's. Yeah. And of course, you go shopping, you have kids with you, you want to, you know, put the kids somewhere while you shop. Like here, they do it at Ikea all the time, right? You have the Ikea playland. We drop the kids off and then either go shop or, like you've suggested, go to work for the day,

(07:34):
right, and come back. Yeah, yeah, yeah, that's my other choice for childcare. So the company Tmsuk came up with this robot that looks kind of like a large, like, yellow and white appliance with kind of a WALL-E head on top, you know, very cute. And it can interact with the children. Like, each child will have

(07:55):
like a badge on his or her shirt, and the robot can, like, scan the badge, you know, and it'll know to interact with this person with its limited vocabulary. So it would be like, hello, Robert. Yeah, hello, Robert, your parents are totally coming back, you know. Yeah. And then there are other awesome things. It has two eyes. One of the eyes can be used to

(08:16):
project advertisements at the child, and the other is a camera, so they can, like, record things and then play them back for the kid. Okay, so that's the super scary part, right? Not even that it's a robot interacting with your kids, but that it's beaming, like, McDonald's advertisements at your child and taking pictures of it at the same time. Yeah. Like, I'm thinking of the scene from The Dark Crystal where they strapped the little Gelfling in the chair

(08:37):
and then, like, the laser goes right into his eyes. Or sort of a Clockwork Orange kind of thing, with cute robots and little Japanese kids. Okay, so it's there, it's in existence, Japan is exploring it. And obviously, you know, the technology is going to be slower to creep into the US, for instance, into the West, but there's no denying it. It's possible,

(09:00):
and at some point we're probably going to be using this technology to interact with our children, if not babysit them, at least teach them, right? Right. But what else? I mean, we've talked about operating rooms. We've got the da Vinci robot, right, that surgeons are using. And yeah, in two thousand nine alone, seventy-three thousand

(09:21):
American men underwent robot-assisted prostate surgery. So, I mean, it's robot-assisted. It's not like, you know, go in, drop your pants, and a robot... I mean, it's fully robot-assisted. It's not. The drop-your-pants part just threw me. Okay. I think what's cool about that, too, not the drop-your-pants part, but the robotic assistance, is that it's got tremor control

(09:43):
on it, right? So, you know, it sort of recalibrates for any errors that the surgeon might make in terms of the tremor in their hands as they control it. Yeah, and generally with robotic surgery you're dealing with smaller incisions, a lot smaller-scale operations, because nobody's having to, like, get their hand in there. Yeah, it's 3D vision, so you're able to actually look

(10:05):
at the area that you're operating on in a fuller way. Correct. So that's a plus. That's great robotics. Yeah. And then you spoke of the RIBA. And then there's also Cody, the sponge-bath robot. Oh yes, Cody, yeah, who works with the elderly. And it can open doors and drawers and cabinets and

(10:28):
give you a sponge bath. I mean, like, just figuratively it can open doors, like it just opens doors for itself by administering sponge baths. Oh yeah, like the really important people, right? Then just weird things start happening. Yeah, next thing you know, it's no longer working in the hospital. It's got a secretarial position uptown. And why not? I mean, who doesn't need a

(10:49):
sponge bath at work sometimes? And then there's the baby seal, Paro. Have you seen this one? Oh yeah, I think I saw Noel Sharkey referring to this. Okay, yeah, right. And it's a little baby seal, right, and it interacts with you. So if you look at it, it starts cooing at you. And they're actually using it with Alzheimer's patients. And they think that it's

(11:12):
really helpful for the patient to bond with the animal, and it's therapeutic. So, I mean, there's that. Yeah. Well, I like the idea of them just having cats and dogs. I mean, that's been pretty successful in various nursing homes. Yeah, we know that if you pet an animal it's going to lower your heart rate, and probably you can tell your secrets to the cat or dog and, you know, interact with it. But I

(11:34):
guess the thing about the seal is that, maybe your cat runs away like mine does, right? The seal will actually interact with you and sort of encourage you to have that bond, as if it's listening to you or understanding you. Yeah, and it won't, say, go to sleep on somebody's face. Yeah, it won't pop a squat on your face, which is always good.

(11:54):
And then we actually talked about this before: Roxxxy the sexbot. Yeah, yeah, Roxxxy with three Xs, in case you didn't already get it. Or maybe more, I don't know, at this point they may have added more Xs. Yeah, the newer model. But she's, you know, she costs nine thousand dollars, she's anatomically correct, you can program her, she can interact with you. Again,

(12:18):
here's the technology in use right now. Yeah. And there's another actually interesting example: Tokyo University of Science professor Hiroshi Kobayashi. In two thousand nine, he was working on a classroom robot named Saya, and they

(12:40):
were experimenting with using a robot teacher at the front of the class. And this was not, you know, they didn't, like, roll in the teacher and let it loose to teach an entire year or anything. They just sort of tested it out, and, you know, it wasn't really capable of doing much beyond calling roll and shushing children and so on. Obviously, this would not work in

(13:01):
some of, like, America's worst schools, unless it actually had, like, missile launchers or something. But it's another example of people figuring it out, like: how can we take a machine, how can we program it to interact with children, to teach them and maintain their attention and make them behave well? And

(13:21):
I think that's where the worth maybe comes in when you're talking about robots and children. It's this idea that they can teach children, particularly, like, toddlers, because repetition is so important when you're learning, not just throughout your life, but at those early stages. So having a robot interact with a child, particularly

(13:43):
a child with autism, has actually been found to be pretty helpful. There's a two-foot-tall robot named Nao, and it can introduce itself, extend its hand for a shake, and announce that children like to play with it. So it's already putting that idea in the kid's brain. And it can take a bow, that's what you want every robot to do. It performs tai chi routines

(14:05):
with accompanying music. Yeah. But more importantly, it can be programmed to incrementally increase the complexity of its routines over time. So as the children progress through therapy, it actually helps them to go to different levels. And if you think of a child with autism, they actually need to log a lot of hours of therapy.

(14:25):
And this is where I think robots can actually be really helpful, because it's pretty intensive, right, this sort of therapy a child with autism needs. So you can program this robot to work with that child over and over, again and again, that repetition, that burrowing of that neural pathway to learn a certain task, and that can be helpful. I mean, not just for

(14:45):
a child with autism, but, you know, for a young two- or three-year-old who is doing something over and over again, robots may be the way to go. I mean, of course you still need the human element and human interaction, but that sort of technology, I think, is pretty intriguing.

(15:06):
This presentation is brought to you by Intel, sponsors of tomorrow. Now, one thing about this particular study: I found a quote where they mentioned that the children are not only connecting with the robot, but also with the tester who controls the robot, right? Yeah, and they're both sharing this novel experience. Yeah. So, I mean, that

(15:27):
kind of, I mean, I'm not poo-pooing the idea, but just playing devil's advocate here: you're still not talking about a robot, like, completely taking on this child. There's a human in the mix, which is kind of like, I mean, if that is a robot teaching a child, then is dad working on a remote control truck with his son? Is that a robot teaching

(15:49):
a child? That's a good question. I think, more specifically with the kids with autism, what they were thinking is that the children are responding well to the robots because they can predict the robot's behavior. So yeah, there's an adult human with them, but the sort of interaction that they're having with a robot, they, and some

(16:11):
of this is speculative, right, but they think that there's a comfort level, and that they can say: okay, well, humans, I don't always know what they're about to do, but this robot is so repetitive, and it's teaching me these things, and I'm really responding to it. That's the idea behind it. And, I mean, we can't even get into this without talking about our just ridiculous

(16:33):
ability to anthropomorphize anything and everything. Yeah. Like, if anybody out there watches Community, I think it's the first episode where Jeff Winger uses the example of, you name a pencil, like, this pencil's name is Carl or whatever, and then you snap it in half, and everybody's going to feel a little part of themselves die, because, I mean, that's all it takes. It's like two

(16:54):
dots and a line makes a face, and we can instantly associate with it, you know. It's just completely rampant everywhere. Well, yeah. So it's just, like, the idea that a person could feel like a machine, a robot, is real, and actually maybe even feels for them,

(17:14):
you know, that's just a no-brainer, because, I mean, we do that all the time with things like dogs and cats. You know, it's like, does my cat actually love me? Well, no, not really, but, I mean, there's a bond there, but it's not... I definitely anthropomorphize that. I make more out of it than it is. And

(17:34):
I sort of willingly partake of this worldview, this fiction that I create, in which my cat thinks more of me than, say, just a warm food provider. Oh yeah, I mean, I think we do it all the time and don't even know. Like, I've noticed that our IT mastermind will refer to my computer as her all the time. And I think it's really sweet. She's great, don't worry about her.

(17:56):
She's in good hands. Okay, yeah, I mean, you're right, we can't help it. And we have mentioned Sherry Turkle before. She's the psychologist at MIT who was there at the forefront of robotics development, and social robotics, right. And she has talked about how there's a sort of cautionary tale here, because we can't help but connect with

(18:18):
another thing and describe it as having a sort of humanness. And she saw that over and over again with their studies, particularly with children and even adults. With the most basic, primitive robot, someone might come away feeling like they had some sort of deep relationship with that robot. And she herself had developed a

(18:41):
sort of crush on a robot, Cog, there. Yeah, I think you mentioned it in a previous podcast. Yeah, I love hating robots. Yeah. And she cites, though, specifically this robot named Kismet that worked a lot with children, and she talks about how the kids would interact with Kismet. And one day Kismet malfunctioned. And it

(19:03):
killed the child. That's right. And this was a twelve-year-old child, and the child thought that Kismet was rejecting her. Now, there was another adult there sharing the experience. The child didn't actually die. No, the child did not die. In fact, she got so upset, and she just thought that she had done something to make the robot hate her, that she started stuffing

(19:26):
her face full of snacks and crying. So, I mean, she had, like, a food disorder after that. But to your point, though, about having the shared experience: right there, there's another adult there, and that adult is saying, no, no, no, it's, you know, Kismet is broken, it's not you. And yet she's the one who goes away feeling like she's done something to this robot. And this is

(19:47):
what Sherry Turkle says is the problem: that we're ascribing all this meaning to something that can't relate back to us, that doesn't have the sort of nuance or socialization, that sort of context, to actually interact with us in a meaningful way. Yeah, like, to a certain extent, the fiction can fall

(20:08):
apart, that fiction you create. I mean, at least with animals there's, I mean, there's definitely a bond there, and you're just kind of putting a whole bunch of layers over it. But yeah, with the robot it becomes a little more tricky. But, I mean, I also can't help but think of puppetry when we talk about any of this, you know. I mean, we're talking about, like, one person using a machine to interact

(20:30):
with a second person. And, I mean, the medium of puppetry is essentially one person using a puppet, a false little creature on the end of their hand or hanging from some strings, et cetera, to interact with a person or a group of people. There was some study we were talking about the other

(20:52):
day about children observing, like, a robot acting badly, or, I believe... oh yeah, actually it wasn't a robot, it was a puppet show. Yeah. They were trying to figure out whether or not eighteen-month-old children had the ability to sniff out right and wrong. So they watched, right, and they're like, that Punch is horrible. Yeah,

(21:13):
they really did. There was, like, maybe a rabbit that would come... there were three characters, and one was a rabbit, I think, and the rabbit was pulling all sorts of shenanigans and not acting nice. And so the child actually would react pretty strongly to that rabbit and not like it. I mean, it was obvious in every single case that the kid was like, no, that rabbit's up to no good. So yeah, I mean,

(21:35):
the kid has the capacity for that sort of, I guess you could even say, empathy, right? Yeah, yeah, some form of empathy. Anyway, I understand that a lot of this stuff is really still taking hold, in the early stages. But there's a lot more that they're discovering that babies and toddlers are able to detect and process, that we sometimes

(21:58):
think that we're just faking it until we make it, right? But they may be, on some sort of level, saying, well, that's just not good, you know, that darn rabbit. So, I mean, sometimes we use that stuff to explain away why my little kids are such rats, you know. So it's kind of like, if they really can empathize, then they're just all the more horrible; they really are that selfish. Well,

(22:20):
you know, kind of. I mean, there's the theory that kids are little monsters, right, are little creatures, and they have to learn to be human. And so you've maybe got to go through inflicting that pain, you know, as a child, or having it inflicted upon you, in order to understand that. Okay, so if you're following along in the car at home: children are monsters and we need soulless robots to transform them into humans. Yeah, yes.

(22:46):
And of course, the way that adults... I mean, you know, children are watching what adults do, and the way that we interact with robots also has a huge influence. There's a study from Andrew Meltzoff at the University of Washington in Seattle. They did a study about gaze following, and they found that infants, like,

(23:06):
you know, a robot turns its head and looks at something, and then you look to see, oh, what's that robot looking at? Like, you wouldn't necessarily do that with a security camera just sort of going back and forth. You would recognize that as something that... right, you wouldn't be like, oh, I wonder what that's looking at. But they found that infants would follow the gaze of robots if adults had

(23:27):
treated the robots as people. So, you know, if everybody's anthropomorphizing, and then you have adult humans treating the robots as if they're, if not humans, at least something on par with a human, and then, you know, children are growing up in that environment, then, yeah, it's completely feasible and believable that they

(23:48):
would see this robot as something. But then what happens when it shuts down, right? I mean, all of a sudden you're traumatized, right? So, you had mentioned Harry Harlow and some of his studies, and I read the material, and I have to say that I'm still a little bit... I'm reeling from it, but I think it's interesting in this context. Yeah, Harlow, for those of you

(24:10):
not familiar, this is back in the fifties, and he worked with rhesus monkeys. And, for instance, one of his experiments, which we're not going to go into, was called the pit of despair. So that should tell you these were very controversial experiments, because, like, with the pit of despair, he was just going to try and induce clinical depression in rhesus

(24:33):
monkeys by putting them in, like, this horrible little cage with weird walls. It's a chamber, and yeah, with absolute, all, sensory deprivation, right. Yeah. Technically it was not called the pit of despair in the experiment; they referred to it as a vertical chamber apparatus. But that's right. And the monkey was hung upside down. Anyway, so, you know, we're dealing maybe with

(24:54):
a manic depressive here, but he did do some interesting research. Yeah, the one involving love particularly. There are some cool videos of this, and I'll be sure to put these into the blog post that accompanies this. And you've probably seen clips of this before, where he had little baby monkeys, and then he had two fake mama monkeys. There's the metal monkey, which just looks like

(25:17):
this piece of scrap metal with a sort of monkey-looking head and, like, a single nipple coming out of its chest with the milk. Yes. And then there's a furry mommy, which is this equally fake-looking monkey thing, but it's covered in fur, so it's soft to the touch. And so he

(25:40):
was curious as to how, you know, how interaction with the mother affected the infant's well-being. And this sort of came out of this idea, too, that up until then people thought that the bonding between a mother and child was completely nutritive, right? Like, the child was bonding with the mother because it knew it could get milk. So the idea is that

(26:03):
you have this wire mesh creature with milk, and it'll bond with it. Yeah, but he found that the babies rarely stayed with the wire monkey longer than it took to get the food they needed, and then they would scuttle off to the soft cloth monkey model, especially if they were scared, and they would hang out with that one most of the time.

(26:23):
If they switched the nipple to the soft monkey, the baby wouldn't hang out with the metal monkey mom at all, right. It's like, I've got everything I need right here. Yeah, okay. And so we've got some obvious parallels with robots coming up, right? Which, I mean, just to throw back to some of the Japanese models we were looking at, you know, they know better than to make a, you know, a nursing or babysitting robot that looks like,

(26:45):
you know, a water heater. They're creating them cute. In the case of the RIBA, the one that looks like a bear that lifts patients and stuff, I mean, its hands are soft. So, yeah. But what happens when monkeys are raised by these things? He did some additional studies, and he found that babies

(27:09):
raised with real mothers but no playmates were often fearful or inappropriately aggressive. Baby monkeys without playmates or real mothers became socially incompetent and, when older, were often unsuccessful at mating. And he also found that young monkeys reared with live mothers and young peers easily learned to play and socialize with other young monkeys.

(27:31):
Meanwhile, if you grew up with the cloth monkey, the fake monkey, you were slower, but you seemed to catch up socially by about a year if you were around other monkey babies. But if you were deprived of any interaction with your species, then you were a violent mess, right. But then again, we have to look at the fact that, basically,

(27:52):
these were not even robots. These were just... the cloth monkey was just basically mimicking softness. But see, what I think is interesting about this is that if they had the interaction with their species, then they could have that nuanced communication. And again, this is one of the problems with robots, right, that they don't have the social context, and they can't, right, not yet,

(28:14):
they can't interact with us on a level that's meaningful. So, we have this little thing called mirror neurons, and those are activated when people observe an activity, and these neurons resonate as if they were mimicking the activity. So the brain learns about an activity by effectively copying what's going on, and the brain processes the observed actions as

(28:35):
if it's doing them itself, right. So if I'm a monkey and I'm watching my monkey mother do this sort of thing, in either a social context or maybe she's using a tool to do something, then I'm able to mimic that in my brain. But if I'm deprived of that, then I, as a monkey, am at a great disadvantage. So you look

(28:56):
at the obvious parallels here with robots. Yeah, and the mimicking thing is interesting, too, when you look at the future of socially intelligent machines, just as far as it concerns, say, more complicated industrial tasks. There's the idea that you would have a machine learning from a human: watching the human, observing the human doing

(29:16):
the task, and learning it. Or in a surgical situation, a human is using the robot, controlling the robot to conduct a surgery, and a sufficiently advanced machine, a sufficiently socially intelligent machine, would learn from those different techniques and maneuvers that the human is using and be able to replicate them. That's kind of the ultimate goal.

(29:37):
Well, not the ultimate goal, but one of the big goals. Yeah. And once we get there, you know, then we have a robot that can learn skills and then improve upon them with machine precision. Yeah. And it's interesting that MIT is actually looking at toddlers and the way that they learn and develop and applying that to robots as well, physically, not so much with emotions right now.

(30:00):
But, you know, we were talking about tracking gaze, or doing the same thing with robots and trying to figure out how to make that more nuanced for them. Yeah. And it comes down to, like, how complex the system is, too. Like, assembling some, you know, cogs on an assembly line, that's only going to be so complicated. And then surgery, you're dealing with, you know, a more complicated system, but

(30:22):
it's it's a system now how about child rearing, Like
how complex, how nuances that system? And then how much
how much effort has to go into it, how much
programming has to go into a robot capable of navigating
that system and dealing with these little monsters that behave
so erratically and and may have you know, other you know,
other things influencing their lives. Well in language is key, right,

(30:45):
I mean, if anything, that's what distinguishes us from other creatures,
is that we have the ability to communicate with each
other. Clothes? Close. Clothes are helpful, um, and
you know, some creatures do wear clothes. I'm not going
to go into specifics. That's for another podcast. Are you
talking about those TV shows with the monkeys in them?
Because I think it's hard to argue that that's consensual.

(31:07):
They dress squirrels up too. But language, that's
the problem, right? So AI developers, they've been able to
develop robots physically, but they can't quite figure out how
to do this mysterious thing, which is to imbue
them with language and the ability to communicate with us.
And again that's why they're looking at toddlers and babies

(31:30):
and seeing how their language centers develop. And in fact,
there's an AI specialist, Rao, who says, only by integrating
sensations from their own mechanical bodies will robots have a
shot at understanding what it means for a chair to
be soft and a person to be soft hearted. So
they have to have those experiences themselves. I think we've

(31:52):
touched on this before, I think maybe in
the pain podcast, about the idea of creating
robots that can feel pain as part of sensation. You
would need to create a robot that can feel something
like pain, something that makes a strenuous task feel
strenuous, you know. Yeah, that's right, to have the ability to empathize. Right, right. Yeah,

(32:14):
I mean, they're getting closer there. Have you
heard about the iCub? Okay, this is a robot
as tall as a three year old. It weighs fifty pounds,
it has a child's face, and it's a bear? No.
Only in your world. Cub, I'm picturing like a small
robotic bear. Yeah, it is a little misleading. It's got
five fingered hands, and it's actually under the tutelage

(32:38):
right now of computer scientists, one of them being
Giorgio Metta at the Italian Institute of Technology in Genoa.
So if you're not seeing the Geppetto similarities here,
there you go. But when not crawling, iCub can
sit up and grasp objects. It possesses the robotic equivalent
of sight, hearing, and touch, and has a sense of

(32:59):
balance, and flexible skin is now in the works for
it. Um, the idea is that eventually it can talk,
and they're getting there. Again, they're studying patterns
and ways that they can try to create the same
neural circuitry that we have in terms of mirror neurons.
And it's a super creepy robot. Well, that's another thing

(33:21):
we have to deal with, you know, the
whole uncanny valley, the idea that the closer you
come to making an artificial structure or creation
look like a human, the creepier, the more grotesque it
becomes. And I guess that's why making a cute robot look
like a cartoon bear is far better than trying
to make it look like a person,

(33:41):
because we can look at it and say,
there's something wrong with that. Right, it looks just like
me, except it's not me, and that's
where the alienation comes from. Right. So better off to
just make it look like, you know, Yogi Bear and
have it change diapers that way. But it's interesting,
especially after last week's podcast where we talked about love

(34:03):
and about what's going on in the
mind with love, and what love is. Because we're
talking about creating a machine that, no
matter how complicated the AI is
and the way it interacts, no
matter how socially intelligent it is, is essentially faking it.
It's faking being human. It's faking its compassion for

(34:26):
this child, it's faking its love for the
lonely dude who buys, um, you know, the future version
of Roxy. It's faking all of
these things that are human. And it's easy to
sort of get on a high horse
about that and be like, oh, this is
so disgusting, it's just gonna fake all these things that
matter to us. But when you look at the

(34:46):
neural activity behind the real organic realities
of love and compassion, I mean, it all
boils down to things that are kind of fake as well,
because it comes down to issues of like,
all right, well, this organism that we call mom, you know,
it just wants to carry on its genes.

(35:08):
It wants to eat food, and, you
know, there are a number of very robotic
tasks, and on top of that there's this neural
complexity that creates these things that we call love.
And so by looking at a
robot faking it, we have to look at how much of
it is fake in our lives in general. Well, that's

(35:28):
interesting to say, because I had read something about
how some programmers at Georgia Tech had given some robots
the ability to deceive, and I was horrified by that.
But you're right, I mean, that's a purely human thing, deception.
I guess I was just horrified at the fact that
you would have a robot that would have that ability,
and specifically, they programmed it for cases of warfare.

(35:53):
So if you're on the battlefield and you have the
power of deception, you could, you know, successfully hide from and
mislead the enemy if you are a robot, and also give
false information. But take it back down to your robo nanny.
You know, do you want your robo nanny to be deceptive? Well,
what if the kid asks if Santa Claus is real? Yes,

(36:14):
you don't want the robot to be like,
Santa Claus is a cultural construct that means nothing. You
know, that kid's crying, and then I don't know what
happens next. Yeah, but maybe, I mean, what if Robo
Nanny is doing that to you? I mean, Robo Nanny could then turn to you
and say, we had a great afternoon, we did enriching
activities, when really Robo Nanny was, you know,

(36:34):
watching the soaps. You'd still need a nanny cam to
watch them, and then the nanny cam itself is a robot.
You end up with all these levels. It becomes
a quagmire pretty quickly when you start talking about a
robot filled future. It's a little bit,
it's dim, right? There are aspects of it that seem
kind of dim. But I will say,

(36:58):
There are possibilities in terms of teaching children. Yeah, I mean,
lots of children already, I mean, not that this is
necessarily a pro, to a large extent
it's maybe a negative in some cases, but I mean,
children are already depending on computers more and more. They're
turning to video games and social networking for their
community. So, um, you know, we're just
talking about, in a way, an extension of that, right? You know,

(37:21):
I think that, um, we all try
to figure out what the next niche thing is,
and I really think that's robot camps. Yes, for anybody
who's got the equipment out there. You mean
like a camp where people go and build robots? Yeah, yeah,
the Moxie. Yeah, I mean, I'm thinking about, I mean,
my daughter, I think I need to enroll her in
some AI so she can control her own. Oh well, I mean,
well, they have things like FIRST and all. No, no, yeah,

(37:44):
but I mean, we're talking early childhood education here,
never mind music garden. Okay, you know, I think
we should all aspire to maybe have our
children be able to take apart, put together, and
manipulate their own robots by age four. I mean, robots
are going to be making all the music anyway, right?
So true. Yeah, just, you know, when the technological singularity comes,

(38:06):
they're prepared. Yeah. So, uh, you know, I guess
we'll put it out to you guys to let us
know what you think about the prospect of future generations
being raised by robots, babysat by robots. Um, if anybody
out there was raised by a robot, let us know. I
don't know, it's possible, some, you know, secluded family
situation where dad's a mad scientist and

(38:29):
mom is addicted to the home shopping network and isn't
all that available. Or some documents have recently become declassified
and you can speak about it. Yes, yeah, let
us know, send us those documents. And speaking of which, I
do have one listener mail to read here, and this
comes from Caitlin. And Caitlin writes: Hey, Robert, love the podcast.

(38:51):
As an English teacher, I often recommend it to my
adult students to train their ear to the American accent,
which is kind of funny. Um, but Caitlin is
writing to us from Madrid, Spain, by the way. She
continues: just writing because something occurred to me today as
I was listening to you talk about terraforming in the
beginning of The Werewolf Principle podcast, where we talk about

(39:13):
engineering humans for space. I'm sure you have read Kim
Stanley Robinson's excellent Red Mars, Green Mars, Blue Mars trilogy,
but there are so many times when I expect to
hear you bring it up and you don't. You have
read it, right? All of Robinson's work is excellent,
but the Mars trilogy is probably the best work of
fiction I've read of any genre. I really enjoy your
book recommendations, especially since I got a Kindle, and thanks

(39:35):
to you, I've decided to read some more Iain Banks,
although after reading The Wasp Factory I was a bit
turned off. But it was his first work, or one
of his early works, right? And I've been told later
works are different. Keep up the good work, both
of you. Uh, so first of all, I do have
to admit that I have not read the Mars trilogy. Yeah,
it's been on the list for a while, but

(39:57):
I have the habit of finding authors I like, and
then I compulsively read that author until I either
read everything they've written or I just get really sick
of them, which, you know, is good when I'm onto
a good author, but it means I'm not necessarily getting
to some of these other works that I should read.
And as for Banks's work, yeah, The Wasp Factory was his

(40:17):
first novel, and it is rather different. It's not
sci fi, it's a rather dark story about a
really, really messed up Scottish family that lives on
the beach, and it's more of a, sort of
a psychological horror kind of thing. But his
later books, they're the sci fi books
that I've been talking about. And so if one were

(40:39):
to pick up their first, uh, Banks sci
fi book, I would say The Player of Games is
probably a good starting point. So, Caitlin, thanks for writing
us. And for the rest of you, if you have
anything you want to share with us, you can find
us on Facebook and Twitter, both as Blow the Mind.
We regularly update that feed with cool links to

(41:01):
HowStuffWorks stuff as well as curios from
throughout the web, and you can also drop us a
line at blow the mind at how stuff works dot com.
For more on this and thousands of other topics, visit
how stuff works dot com. To learn more about the podcast,
click on the podcast icon in the upper right corner

(41:22):
of our homepage. The HowStuffWorks iPhone app has
arrived. Download it today on iTunes.
