
September 13, 2024 74 mins

The human world is increasingly 'assisted' by a pantheon of automations. Yet this emergent reality raises fundamental philosophical quandaries about the nature of not only the mind but the soul of things called 'artificial.' In tonight's episode, sponsored by Illumination Global, Unlimited, Ben, Matt and Noel see both sides of a dangerous future.

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies. History is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know.

Speaker 2 (00:25):
Hello, welcome back to the show. My name is Matt,
my name is Noel.

Speaker 3 (00:28):
They call me Ben. We're joined as always with our super producer Paul, Mission Control Decant. Most importantly, you are you.
You are here. That makes this the stuff they don't
want you to know. Folks, friends, neighbors, country bots, lend
us your ears. Have you ever felt like the game
of life is just too much? Also? Do you remember

(00:50):
that board game Life.

Speaker 4 (00:52):
Just too much?

Speaker 5 (00:54):
I could never, I could never get to the end. That's the one where you have the little cars, you keep stacking weird stuff on top of... you have a station wagon.

Speaker 3 (01:02):
You get married, you fill up the little slots and
each person's a peg and you fill them up in
the station wagon.

Speaker 5 (01:08):
A little oversimplification of life, if you ask me. I don't know, that was never...

Speaker 3 (01:12):
Yeah.

Speaker 5 (01:14):
Also it seems to just kind of like overemphasize commercialism
and sort of this like faux American dream kind of thing.

Speaker 3 (01:21):
Yeah. Yeah, it also argues everybody has to get married, which is not true. But you know, if you think about it,
wherever you are in life, if you're an adult, you
have pretty much several jobs. You've got your official career,
your occupation, whatever that may be, and then you've got
all this other stuff that doesn't pay you financially yet

(01:43):
remains necessary for survival: chores, you have to clean the house; bills, you have to pay the bills; hopefully, you can visit a doctor; your parents, your family members, you've got obligations there.
Wouldn't it be great to have a little help with
all of this? Yes, we're excited about this.

Speaker 2 (02:06):
Well, that's why marriage is and was such a thing, right? Because now you've got two people doing all the same stuff, but dividing and conquering, theoretically.

Speaker 5 (02:16):
Yeah, I mean, the idea being that it makes you
each of your lives easier. If you want to just
break it down to the sheer kind of logistics of living,
you know that that is a functional reason for marriage
to exist, though I would argue domestic partnership accomplishes the
same thing.

Speaker 2 (02:32):
Oh yeah, no, one hundred percent. But that's why it's
been such a big thing, right, the family unit. And
then you can pump out a whole bunch of kids
that can work the farm for you, free labor. Yeah, I mean that was.

Speaker 5 (02:42):
Round up all those orphans and shoot them all to
the farm.

Speaker 3 (02:47):
Raise your own butler. You know what I mean. Put in the time. And you're absolutely right. And for centuries, science
fiction writers in particular have imagined this new world where
technology provides new, astonishing answers to this question. I am
one person, I have so many things to do. I
need help. Who can help me? And our question this

(03:07):
evening is how close are we to this idea of
robot servants? Perhaps more importantly, is this a world that
we would want to live in? Here are the facts,
all right? What's a robot? We've talked about this in

(03:28):
some related episodes, but just like a quick and dirty primer, what do we mean by a robot?

Speaker 5 (03:34):
I mean, you know, we certainly see it through antiquity, like these automatons and kind of clockwork men or whatever. I believe the chess-playing robot, or automaton, was a big one, historically speaking. But the mechanical term, that's right.

Speaker 4 (03:47):
The term, though, is a little more recent.

Speaker 5 (03:50):
It comes from the nineteen twenties, from a play of
all places. And I'm sure we've been through this before,
but I think it's definitely worth a catch-up. It's R.U.R.: Rossum's Universal Robots by Karel Čapek.

Speaker 3 (04:04):
Yeah, and robot today. It's a huge group term, just
as varied as human beings, I would argue, more varied
than the human species. Not all robots are gonna do
the same task, not all of them are gonna look
the same. You know, no one would say the Curiosity rover is the same thing as a factory-line robo

(04:27):
worker in auto plants. But to your point, Noel, the
etymology gets really interesting. Our playwright that you mentioned got
the word robot. It enters English in nineteen twenty three
as a translation in this play. But our playwright gets
the word robot from the Czech word robota. Robota means

(04:49):
forced labor.

Speaker 2 (04:51):
Yeah, just like the doctor, I guess. Yeah, Doctor Robotnik.

Speaker 5 (04:59):
Yeah, man, and he was all about creating forced laborers, usually in the form

Speaker 4 (05:03):
Of weird little robot warriors.

Speaker 5 (05:06):
I guess, that were just trying to kill Sonic and
Tails and Knuckles and whoever else is part of their crew.
But I mean, you know, even the word robot in English parlance sometimes refers to, like, a microwave, you know, like robot cooking. I mean, it's really funny, just the idea of something being done for you automatically, a task being done automatically, robo-whatever, you know.

Speaker 3 (05:29):
I also, it reminds me these languages are so interesting.
It reminds me of the term for pickup trucks that
have machine guns mounted on them. In parts of East Africa,
they're called technicals.

Speaker 5 (05:43):
WHOA, I thought that was Like when you made the
really elaborate cake on a baking.

Speaker 3 (05:50):
Show. They call it the technical, the technical challenge.

Speaker 2 (05:54):
That's the one where they don't give you the exact
details on how to make it.

Speaker 5 (05:58):
That's right. Yes, thank you, The Great British Baking Show, I got it. Yeah, exactly. The technical is when you
get some vague recipe and you got to kind of like,
you know, show your chops by you know, muddling your
way through it.

Speaker 4 (06:10):
Okay.

Speaker 3 (06:11):
And this, this term robot or robotic also arrives from robota, meaning forced labor, compulsory service, or drudgery. And if you trace it further back, these words originate from some iterations of Old Slavic. It ultimately comes from the word

(06:31):
rabu, meaning slave. Makes you think, makes you think. Well.

Speaker 5 (06:36):
Yeah, especially when you start to think singularity and how
you know eventually when the robots become sentient, they're going
to resent all of that slave labor the creatures.

Speaker 2 (06:45):
For reasons we can get into. But the phone you're listening to this on, is it a robot? Not really? Maybe? Kind of? And is it your servant?

Speaker 5 (06:56):
Well, that becomes the question, because robot doesn't mean the same thing as android, right? Like.

Speaker 3 (07:01):
An android's a kind of robot.

Speaker 5 (07:04):
Yeah, but I think, I don't know, Ben, what do you think? Is a phone, with all of the myriad things it can do and with the personality that it has via Siri or whatever, is it a robot? Does it have to ambulate to be a robot?

Speaker 3 (07:16):
No, it's a great question. The question is what do we mean exactly. So we know robot; what do we mean by robot servant? Is it redundant? Is it like saying ATM machine or VIN number? When you think about it, you already have a ton of robot servants. I would argue that smart thermostats, home security systems, they could be robots. The idea of a telephone, or a smartphone, excuse me.

(07:39):
You could also argue modern cars are robots because what
they're doing essentially is replacing the drudgery of certain human labor. Right,
you don't have to try to run sixty miles. You
have a car that will help you accomplish that task.
You don't have to regulate the temperature in your home.

(08:00):
You have a thermostat that will help you. It doesn't have to be ambulatory, which works to your question. And your phone, well, you know, now you don't have to write a letter and wait three months for the reply, when you open the envelope and it just says, lol.

Speaker 2 (08:14):
Yeah, you don't have to tell anyone the things you
want to buy or put on your shopping list.

Speaker 5 (08:19):
I mean, it makes me think back to even pre-iPhone, where you had the BlackBerry and that first kind of round of what were called personal digital assistants. I mean, that's really the earliest days of this stuff kind of finding a niche in our lives.

Speaker 3 (08:34):
And, you know, we can say in a complimentary manner that it is astonishing how quickly the humans have adopted breakthroughs in technology. Right? It's nuts to consider that this idea, household robots, household robotics, is very much in its infancy still, and no one, honestly, no one

(08:57):
is sure how this will all shake out. If you have a child tonight, like if you or your partner gives birth this evening. First off, thanks for listening to
the show. But secondly, eighteen years from now, the world
that kid meets is going to be so different from
twenty twenty four, particularly in terms of robotic assistance.

Speaker 2 (09:20):
Yeah, do you guys think it'll be like twenty sixty two, with Jane, George, Judy, and Elroy? Oh man, Rosie.

Speaker 4 (09:32):
Time, I was too, And I'm gonna go ahead and
just drop it now.

Speaker 5 (09:35):
When we're starting to get into the conversation about robot
assistance and the idea of what can go wrong, there's
an incredible episode of The Jetsons where Rosie glitches out and starts repeating a place for everything and everything in its place, and just starts freaking out, over-organizing things and causing mayhem and just totally

(09:55):
fritzing out.

Speaker 4 (09:57):
And that was, what, like in the seventies, I guess? It really is.

Speaker 5 (10:01):
To your point, Ben, about science fiction predicting what could
go wrong with these kind of household servants.

Speaker 3 (10:08):
Yeah, assuming that there is no civilization-ending event, the current pace of research and manufacturing, especially economy of scale, does indicate that we will eventually have AI assistants, PDAs that make your modern iPhone look like two rusty soup cans connected by old string, you know, like that's the level,

(10:32):
you know, think about it. It wasn't too long ago that
people landed on the moon, hot take. Can I say that?

Speaker 4 (10:39):
Can I say that, comparatively to the greater span of history?

Speaker 5 (10:43):
Absolutely. I'm sure it could be like one of those memes where we say, want to feel old? Like, the eighties are now the same distance from where we are in modern history as the fifties were from the eighties or whatever. You know, it's like, yeah, bonkers the way time just compresses like that?

Speaker 2 (10:58):
Or did anyone actually land on the moon, you guys?

Speaker 3 (11:01):
Or is anyone actually here? We know we're here because we did hang out in person.

Speaker 4 (11:08):
Oh yeah, yeah.

Speaker 5 (11:09):
We have to check in occasionally to make sure we
have not become ephemeral, right, we.

Speaker 3 (11:14):
Have become strange. It's a great question, you know, this
is for being positive and Ted talking about it. This
is an exciting thing and it can save lives. And yes,
robots have also killed people already, but we will see
for instance, home care on a level that simply could
not have existed before. A lot of people, especially in

(11:35):
countries with aging populations and declining birth rates, like Japan for instance, they need some kind of affordable in-house elderly care. So this can not only save lives, but it can extend quality of life for people who need help.

Speaker 2 (11:55):
Oh buddy, And there are so many companies out there
trying to build that exact thing, Nightmare robot to take
care of us when we're in our eighties.

Speaker 3 (12:06):
If you have mobility issues, right, yeah. Or if you're on board with a subscription model, as we'll see, right, because you don't have to own it. You just have to listen to the ads.

Speaker 2 (12:21):
Did you guys see any of the price tags for any of the robots we're gonna talk about today? We don't have to spoil what they are. If you found any... I only found one.

Speaker 4 (12:28):
It's car prices roughly, right? Or more, even.

Speaker 2 (12:32):
I don't know, we'll talk about it. But the one that I found the actual all-in price for was astounding, and we'll mention that. Yeah, but.

Speaker 5 (12:39):
I would I would argue that the business model, big
picture wise, would have them be around car prices.

Speaker 2 (12:46):
Well, it's like, uh, from what I saw, basically your down payment is the price of a small vehicle, and then you end up paying the price of a large mansion over the course of the term of owning it. Almost, like, yeah.

Speaker 5 (13:02):
But that's early adopter stuff. I mean, that's like, you know, someone's got to walk so that the rest of us can run.

Speaker 3 (13:08):
Speaking of early adopters, this is the question. You know,
we always see these utopian ted talkie versions of the
future of robots in your home. Everything is shiny and wonderful,
and you're gonna just have the best of all times.
You'll be happy as peaches, you know, clamorous as clams.
But how close are we to this kind of situation?

(13:31):
Should we be concerned? It is a tough question. You're raising excellent points here. History does show us that disruptive innovations
often begin in limited demographics. Early adopters, you know, the
suzerains of the tech world, the financial elite, and as
they become more comfortable with robots in their homes and
in their heads. They'll want more capability. They will pay

(13:56):
these companies more money. This will enable the companies to
manufacture more stuff, and once that happens, economy of scale,
this is where you get more affordable versions of the toys.
This is where you end up thinking about it the
way you would think about buying a car.

Speaker 4 (14:12):
Yes, oh for sure.

Speaker 5 (14:13):
I mean, look back at, like, the feature set that was available on the original iPhone, and how, comparatively, it's just absolutely rudimentary, almost like that two tin cans on a string comparison you were talking about.

Speaker 4 (14:24):
But the early adopters

Speaker 5 (14:26):
Of that and the fact that there was nothing else
like it at the time, that's what paid for the
continued innovation. Then you have to think about things like
planned obsolescence and you know, paying into the system and
being totally okay with that. Honestly, the people that are
the real nerds and tech heads are largely okay with
that model.

Speaker 3 (14:43):
Yeah, yeah, largely. I mean, well, it won't all be, you know, warm fuzzy hugs and angel farts. We'll see a rise of other things concurrent with these devices, things that raise serious questions: constant monitoring, endless leveraging of personal, interpersonal data. Predictive attempts will have a huge and unseen,

(15:05):
often unseemly impact on people's lives. So again, how close are we really? Is it a jam-tomorrow, never-jam-today situation? Are we on the precipice of something? Should we be concerned? Yes. Here's where it gets crazy. Slight
spoiler there: yes, you should be concerned. Yes, we are

(15:29):
close in some ways, but maybe not in others. Because, you know, we mentioned the Jetsons earlier. We'll get to the more sinister stuff a little later tonight. But the Jetsons.
Let's talk a little bit about the Jetsons. Let's talk
a little bit about their robo housekeeper.

Speaker 5 (15:47):
Yeah, Rosie. I mean, and you know, I just mentioned that episode that has always stuck with me. Rosie is basically a member of the family. She's an anthropomorphic creature that is treated like family. And there are even, you know, kind of cuddly episodes of The Jetsons where the gang all band together to help Rosie out because

Speaker 4 (16:06):
She's been so good to them and things like that.

Speaker 5 (16:08):
But then there's the episode I was talking about where
she kind of goes nuts and almost becomes dangerous.

Speaker 2 (16:13):
Yeah, but she's also a maid. She, without pay, is there at all times, and she has red eyes, which was a choice.

Speaker 3 (16:26):
It was a choice. It was on purpose.

Speaker 5 (16:29):
And, sorry to keep harping on it, but if I'm not mistaken, in that episode I'm talking about, the eyes turned slantways, kind of like angry eyes, and it becomes this, like, what is this creature that Rosie has transformed into? Our delightful robot helper has now become something else, entirely sinister?

Speaker 3 (16:46):
Mm-hmm. Yes. Why weren't you happy being a servant? It's a very strange implication. There's smart writing on that show. And when we think about these things, we're always sold these ideas of a world without work, a world with seamless integration of everything, with no problems. And

(17:08):
interestingly enough, in the Jetsons, they're still selling some quote
unquote family values that were arguably anachronistic by the time
the show was made. If we put all that aside, we got to talk turkey, because I know that all three of us have been reading intensely about specific emergent robot technologies for the home.

(17:33):
How can you build these things? A lot of experts are saying the issue with making it, as our pal Lauren would say, actual-facts work, is much more complicated than the average person realizes.

Speaker 2 (17:48):
Well, yeah, it's not just the hardware, and the hardware is extremely difficult to get right. Actuators that are light enough to function in ways as close as possible to how a human body functions, right? That's at least the goal. That's what you're trying to get, because you want them to perform tasks that we would do, but

(18:09):
now it's a robot doing them. Just to get the hardware right is near impossible. And if you've been following along the way we have, with iterations from all these various companies, including Tesla and Boston Dynamics and UBtech, I mean, there's so many of these huge companies that have been spending years and years trying to innovate

(18:30):
just the way an arm will function and the way
a hand will be able to rotate and grasp without
doing things like crushing anything that it's trying to.

Speaker 3 (18:41):
Touch. Something soft. The real analog captcha right now, just so you know, folks, is doorknobs.

Speaker 4 (18:49):
Yeah, and eggs.

Speaker 3 (18:51):
They're super tricky. Oh eggs. The egg test is tough.
I failed that.

Speaker 5 (18:54):
one a few times. Because you have to think, these are, you know, powerful machines that can absolutely
crush things and and do demolition type work, but they
have to also be so delicate and nuanced in their
control that they can dial that back and have enough sensors,
you know, with enough level of nuance again to detect

(19:16):
that something is a certain type of object so that
it doesn't you know, apply too much pressure.

Speaker 4 (19:21):
It's a delicate dance.

Speaker 2 (19:24):
It is. You know what I always think about,

Speaker 5 (19:25):
Guys.

Speaker 2 (19:25):
Those early Boston Dynamics robots that were dogs. Oh yeah, right, and Spot, I think, is what they ended up calling it. But early on it was like, the robot, the Boston Dynamics dog, and just to get it to function, they had to have it hooked up, hardwired to its power source and to its controls and all that other stuff, just to get it to function.

(19:48):
And they eventually moved away from that and they were
able to you know, make the thing fully mobile with
improvements in the actual hardware. But then if you look
at the early iterations of their humanoid versions too, you
couldn't have a mobile humanoid robot because you couldn't give
it enough power for long enough to actually operate. And

(20:08):
if you could, then it would be so dang heavy.
What's it even do? Can it even walk across
your floor without destroying it?

Speaker 3 (20:15):
Right? Not to mention stairs, oh my, which is also
running into a platform issue. And we've sent a lot
of these videos to each other back and forth over
the years and read these papers, and we've made some
at times dystopian predictions. The points we're bringing up now

(20:36):
just the basic turkey talk of how these things would
work in production. We see a lot of professors who
share our opinions. Writing for The Conversation in twenty twenty two,
Professor Ayonga Hereid notes that people might be a little bit optimistic still about how you would make a general-purpose household robot. And there's

(20:59):
a great quote here that we'd love to share with you.

Speaker 1 (21:01):
Yeah.

Speaker 4 (21:02):
He says.

Speaker 5 (21:02):
One major difference between digital and robotic devices is that
household robots need to manipulate objects through physical contact.

Speaker 4 (21:09):
To carry out their tasks.

Speaker 5 (21:10):
They have to carry the plates, move the chairs, and
pick up dirty laundry and place it in the washer.
These operations require the robot to be able to handle fragile, soft,
and sometimes heavy objects with irregular shapes.

Speaker 2 (21:23):
Yeah, and some of the state-of-the-art robots that are out there right now, that you can't even buy yet, but as a company you can buy for, like, research and stuff. They can carry up to like forty five pounds. Forty five pounds. If you've got a robotic human thing in your house, you probably want it to be able to carry stuff that's

(21:45):
heavier than that, right, right.

Speaker 3 (21:48):
And also, I don't know, I'd be careful with it, because if you're human, part of the reason that you don't constantly break your body is because you have a complex network of response-stimuli kind of things coming at you constantly. There's a reason, not to

(22:09):
get too dark with it, but there's a reason certain drugs make people appear to be capable of superhuman feats. The part of their feedback system that is stopping them from, you know, breaking their arms or biting off their tongue, because your jaw is such a strong muscle in the human body, those things. When those checks and balances are gone,

(22:32):
then mayhem ensues. So even with a robot that could
lift forty five pounds, if it doesn't have very specific
programming making it safe around humans, it can be a
recipe for disaster.

Speaker 5 (22:45):
Oh, it's like, I mean, The Boys TV show seems to be a little bit divisive, but there is a season where there's this young, you know, superhero who is doing these kind of photo-op things. He's supposed to do, like, a fake rescue, basically, but he doesn't have control of his super strength and he accidentally throws somebody into a wall and explodes their body.

(23:07):
I mean, that's what we're talking about here.

Speaker 4 (23:09):
I mean, it's it's.

Speaker 5 (23:10):
This level of power, the possibilities, there have to be
guardrails on it.

Speaker 2 (23:14):
Well, the whole point of that, then, because we've been talking about the hardware, right? Yeah. And to your point, Ben, it's what are the software control mechanisms for this, and how failure-proof are they?

Speaker 5 (23:25):
Yeah, and how foolproof and how uh you know, impervious
to hacking and whatever else?

Speaker 3 (23:31):
How much can you pre program? Yeah, that's that's one
of the other issues with this generalized robot servant and
the home kind of question. Robots in a manufacturing environment.
And again, robot is just an artificial thing made to
do work. And we know the term artificial may not
age well, but we'll see. Robots in your factory lines

(23:52):
have, if you want to consider them alive, a very strictly organized life. They have a highly organized sequence of tasks. I'm the guy who
puts the plate A on frame B. That is my life.
I didn't choose this existence, but I'm all about it.
That's the one thing I do well.

Speaker 2 (24:13):
Yeah. And often it'll be an actuated arm or something that is mobile, right. They can move around, can
reach over, grab a box, put it on a conveyor belt.

Speaker 4 (24:23):
Everything's on rails literally.

Speaker 2 (24:25):
Well, sometimes it is. Some of them, some of the new android, humanoid-like robots, do actually walk from one side of this, like, delineated rectangle that they get to operate in, pick up a box
and put it in another place. And it is nice
to be able to, I guess, scan all the stuff

(24:46):
and all the information on the boxes to know what's
what and where it should go. Uh, and then make
that process robotic if you will. And so some some
human being doesn't go home with an achy back, you
know every day.

Speaker 3 (25:00):
Sure, yeah, But then also we're very well aware on
the flip side of that, there would be a great
deal of humans who want a.

Speaker 2 (25:10):
Job it would gladly accept.

Speaker 3 (25:12):
That job, gladly accept a sore back. It's an imperfect thing.
You don't get to the star trek post scarcity economy
or post work economy until you live through the post
worker economy, which is a chaotic time on the way
I mean. Also, I love the point of this moving
in a limited space with a constrained regimen of actions.

(25:34):
Engineers program a robot's movements, or there are signals like QR codes that can help it do what we were talking about a little bit off air: locate objects, target locations,
and that's where you see how this gets more difficult
in most households, which are not factories. Shout out to
everybody's clutter kingdoms, you know what I mean, no judgment,

(25:56):
no aspersion. But household items are often disorganized. They're placed randomly, in sometimes unpredictable locations.

Speaker 2 (26:04):
Which is kind of the whole point of having a
humanoid robot that can help you organize things and pick
things up and clean things that.

Speaker 5 (26:12):
It requires kind of this like nuanced ability to interpret
the situation and adjust accordingly. And of course, you know
with any kind of AI or you know, personal device,
you can have your own set of preferences that you
could program in and advance. But with this situation to
what Ben's describing, there are going to be a lot
of variables even within the set of preferences.

Speaker 3 (26:33):
How do you get a robot to think in terms
of jazz?

Speaker 2 (26:36):
Oh yeah, and feng shui, you know. I mean, the only current way to do it... have you guys ever had a Roomba or a similar vacuum robot?

Speaker 5 (26:43):
It always gets stuck under the couch, bro. Seriously, it doesn't do a good job.

Speaker 2 (26:49):
Theoretically, when you get one of those, I've had one one time, what you're supposed to do is basically give the robot a standard for where things are, where things should be, so it understands the layout of whatever rooms you want it to do.

Speaker 5 (27:06):
It learns as it cleans, right? Like, it actually has learning capabilities, in a rudimentary way.

Speaker 2 (27:12):
Correct, yes, exactly. It's mapping as it goes. So I imagine if you got one of these, let's say, uh, this company 1X that is trying to build something called NEO, that is like a clothed humanoid robot, that is this kind of thing. It is your butler, is your maid, is your assistant that just hangs out

(27:32):
in your house. Right, you would basically show it for
the first time how everything should be in your house,
where everything is located, and then it once it had
that as a basic standard for the way the house
should look and where objects should be placed. I imagine
it would just replicate.

Speaker 3 (27:51):
That, right. And you would have a functionality where you could say, hey, we got to remap. Yeah, we're moving, we got a new couch. I don't want to, don't want to spring it on you, bro, but this one's an L shape, so let's walk through this together.

Speaker 2 (28:04):
But NEO really likes that one candle being in a specific spot.

Speaker 5 (28:10):
You know? No, now it just feels right there. Oh, NEO, man. But Ben, you also point out, we were talking a little bit off mic, just about how, you know, a robot has to be robust and light on its feet enough to navigate all kinds of obstacles and that kind of entropy that is, you know, a clutter kingdom. I

(28:31):
mean I was talking about how if I had a
robot that was going to do my laundry, it would
have to gather up all my laundry, identifying all the
individual pieces, load them up, come downstairs into my basement,
which is crammed full of expensive audio equipment and cabling everywhere.
I just picture coming down and finding the thing entangled in mic cords, on its back like a tortoise on its shell.

Speaker 3 (28:51):
Into the paper clip problem, you know, it goes back
to the software. We're talking about paper clip AI thought experiment.
What is not a paper clip? Right?

Speaker 6 (29:02):
Uh?

Speaker 3 (29:02):
Will it destroy the universe by turning every possible piece
of matter into something like a paper clip?

Speaker 2 (29:09):
Or what if hot dog?

Speaker 4 (29:14):
I'm cool with that.

Speaker 2 (29:16):
What if it's us?

Speaker 5 (29:18):
Sorry, remember the hot dog universe from Everything Everywhere All at Once.

Speaker 3 (29:21):
Yes, hot Dog would be like that.

Speaker 2 (29:23):
Yeah, oh, I still haven't seen it. You're joking?

Speaker 4 (29:25):
Oh my goodness. Get thee to a streamer.

Speaker 3 (29:29):
Yeah, don't let the hype dissuade you. It's, it's such a fun, lovely... I know you well, I think you'll take to it. Whoever is writing it just at no point said no to winning creative decisions. Yes, okay, sort of yes. And in a very different direction from Longlegs. To be fair, I bring up the paper clip idea because

(29:53):
imagine a laundry robot. How do you code and articulate, for this thing, how do you code and articulate what does or does not constitute laundry? How does it know that the curtains aren't sheets hanging up?

Speaker 6 (30:11):
Yeah?

Speaker 2 (30:11):
And what cycle should they go on? You're gonna have to, like, put QR codes on everything you own.

Speaker 4 (30:17):
Oh, that's a terrible idea. True. To that end.

Speaker 5 (30:21):
A colleague I had a call with today, I mentioned that we were doing this episode, and he said, you've got to see this video, and he sent me this recent thing where OpenAI's ChatGPT is now powering a lot of these robots. And in this particular demo, which I think took place in March of twenty twenty four, it's a dude and this robot called Figure 01 with a table full of objects, some of which

(30:45):
are dishes and things, and the robot, using this software, was able to identify that the dishes were dirty and they needed to be washed, was able to identify that there was an apple, that it was a healthy thing. He was using sort of more abstract terms, like give me something to eat, and so it scanned what was on the table, and the apple was the only edible thing, and it also identified it as a healthy thing. But the dishes,

(31:06):
you know, to your question, Matt, about like how would it
know it's dirty?

Speaker 4 (31:09):
I guess there's yeah, you know, there would have to
be yeah cues or something.

Speaker 5 (31:13):
But with clothes it's different because it's sometimes more of
a smell thing, or I don't know how it'd detect
the stink.

Speaker 2 (31:20):
I would just say, if you watched that video closely,
this is you know, I don't mean to be a
skeptic or weirdo or anything like that, guys, but if
you watch that video closely, you can see some of
how the time elapses a little differently, and I think
they even might mention that in there, and also the
fact that there's some cuts I think in there
as well. So you just you just gotta be careful
anytime you're seeing any of the updates for any of these,

(31:43):
you know, prototyped robots that are theoretically going to be
a household item for people of that upper upper echelon
in society, at least for a time. Everybody is vying
for that coveted market share of the at home android.

Speaker 3 (32:01):
Get in early. You know, also, is there some swindlery
afoot? Like, the Mechanical Turk was kind of a con
job for a lot of people. We aren't at the future yet, though;
this kind of stuff will happen. I mean, I love
the point that as a robot you would have to
know jazz to navigate a household. You would have to improvise,

(32:24):
adapt, overcome all these unexpected interactions with pets. You know
what I mean. If you've got a dog or a cat
that's sensitive to this little vacuum cleaner, imagine how it's gonna feel
about your HAL 9000.

Speaker 5 (32:38):
I was already thinking about, like, you know, what seems
like a kind of almost cliche sci fi scene, but
where you first bring the robot helper home and how
the actual pets start to react to it and interact
with it.

Speaker 4 (32:51):
You know what would that feel like? It's interesting. You know,
it's like bringing home a new dog and you've got cats.
You got to acclimate them.

Speaker 5 (32:59):
Dude, I just picture what's the practical process for acclimating
your real pets to the robot.

Speaker 2 (33:06):
Your child sees Pepper and then asks a really
appropriate question: what does Pepper do? Do you guys know
Pepper? That's a very specific robot that I think we've
maybe seen before.

Speaker 4 (33:20):
Doesn't ring a bell to me. I'd love a refresher.

Speaker 2 (33:23):
It's you can look up. I think SoftBank is
the company that makes Pepper, and I've seen it at
a couple of restaurants around Atlanta where it's often it's
like a greeting robot kind of. It doesn't have much
more functionality outside of that, but at least that's what
I've seen it function as. It's on wheels and it

(33:44):
looks like a little human, kind of, but very plastic,
with a screen.

Speaker 4 (33:49):
Or it has like an anthropomorphic face.

Speaker 3 (33:52):
It has an anthropomorphic head that's designed to look a
little bit infantile, like big eyes, small mouth. Yeah, cute,
because I would argue, and SoftBank's
been very successful with this, as has Foxconn, I would
argue one of the primary missions of Pepper
long term is to normalize the presence of these devices.

(34:17):
I don't talk too much about this kind of stuff,
but I was. I was on the road for a
while in some far-flung places, and one of the
things that is a big flex now is robot assistance.
Just in liminal public spaces like airports, there are the
autonomous or semi autonomous wheelchairs, mobility devices, and in some

(34:41):
kind of fancier hotels there are just these little they
look kind of.

Speaker 4 (34:48):
Like trash cans. Yeah, what's kind of.

Speaker 3 (34:52):
Rectangular trash cans? Yeah, and do tasks, and you always wonder.
You know, a couple of times I held the elevator
for one and it was just like an awkward ride
because sometimes they'll talk back to you. Yeah, if you
ask them questions.

Speaker 2 (35:11):
That same company, SoftBank, makes a huge vacuum cleaner, something
that scrubs floors, and they're mostly sold as you mentioned earlier, Ben,
they're mostly sold for like senior living facilities and places
that want to automate keeping a giant facility sparkly clean.

Speaker 3 (35:31):
Yes, it's you know, it's cool stuff inherently, right.
I mean, this also, in these things that we're describing,
we're seeing robots in a commercial sense being embiggened in
terms of their tasks.

Speaker 4 (35:47):
Right.

Speaker 3 (35:47):
They're able to do more, more and more complex things,
like anytime you're bummed out because you're doing a bunch
of chores. You're bored, right, and you're out of podcasts
or whatever. Remind yourself the stuff that is boring to
you as a human is like arcane level complexity to

(36:09):
many robots, to many algorithms, just the idea of you know,
like you were saying earlier, differentiating between laundry and curtains
washing dishes. So the question then, or maybe one of
the answers, is, we don't have one dogsbody, right,
we don't have one robot. We have a cadre of

(36:31):
specialized robots. You walk from room to room, and you
got a crew for each room. You know, there's one
kitchen bot crew, and then there's one bot that just cooks.
It just cooks stuff. It has nothing to do with
it after that, because the next task, cleaning the dishes,
goes to this one robot that just cleans stuff.

Speaker 4 (36:52):
Borderline the way the Jetsons depicts it.

Speaker 5 (36:55):
They have all kinds of other smart home objects outside
of just Rosie. She's mainly tidying up and you know,
helping out with the kids and stuff. But they definitely
have just like in the Flintstones, they got the what
is it, the like tiny woolly mammoths or the pig
that's under the sink, it's like the garbage disposal.

Speaker 4 (37:12):
It's all that the different levels of specialization.

Speaker 5 (37:14):
But I mean to your point though, Ben, like about
the difficulty and the ability to adapt. We're seeing this
stuff kind of make news in terms of how it's
not there yet, Like with these autonomous cars in San
Francisco that got stuck behind each other in a parking
lot at four am, just honking at each other in
an endless loop because something went wrong and they just

(37:35):
couldn't figure out how to navigate to go around.

Speaker 3 (37:39):
Yeah, that's that's the pickle of it.

Speaker 1 (37:41):
Right.

Speaker 3 (37:42):
We're gonna talk more about this idea of specificity of task, right.
I think also we arrive at something very close to
arguably demonic possession if we consider a single like a
single house kind of majordomo of software. Right, And
all the things that do these specific tasks are less autonomous, right,

(38:05):
And they're more like limbs on an octopus. So the
kitchen bot may have a different voice from the TV,
but ultimately it's the same mind controlling them. It's a
little, a little weird.

Speaker 4 (38:17):
You know.

Speaker 3 (38:17):
Also, out in your backyard, you got your autonomous robodog
shout out to Spot, working in concert with your smart
alarm system, complete with orbiting drones, just to keep things
on the up and up. Dude, is that too crazy
or what do you think?

Speaker 2 (38:33):
I don't. I don't think that's too crazy. I think
it's pretty terrifying, guys. I'm gonna share a link really quick.
I put this in our base camp earlier. I just
I imagine an army of these in a house.

Speaker 4 (38:46):
Uh.

Speaker 2 (38:47):
And just I don't know why Sandbot chose to give
their robot eyebrows that look like they're angled down like
it's angry. I mean, look, look at that link.

Speaker 4 (38:59):
Look at that a little sinister. It kind of looks
like WALL-E a little bit though too.

Speaker 2 (39:03):
Yeah, but just a bunch of those like you're talking
about Ben, that just go wrong at least at least
you you know, in this case, it's got wheels rather
than legs, so you could theoretically just go up or
down and you'd be.

Speaker 3 (39:17):
Safe, yeah for now, right now. Yeah.

Speaker 5 (39:22):
And it's like, if anyone's familiar with the Fallout games,
you know, and obviously they're taking a lot of cues
from decades of science fiction, but there are helper robots
that also have security modes, and that if hacked or
something goes awry with their programming, all of a sudden
start murdering everybody.

Speaker 4 (39:40):
So there's certainly that concern.

Speaker 3 (39:43):
Yeah, well, this is where we're getting into some of
the really deep water. We have such wonders to share
with you. Before we dive in, we'll pause for a
word from our sponsors. It's just north of four am
local time. Paul Asimov wakes with a parched thirst and
a bad dream still singing in his mouth. A soft chime.

(40:04):
His Marco powers up from its station and flitters behind
him as he walks from the bedroom to the door,
the lights coming on with each of his steps down
the hall.

Speaker 4 (40:13):
Jesus, hey, turn down the lights.

Speaker 3 (40:14):
Please, says Paul. Okay, Paul says the Marco. The little
drone flits to land on his shoulder.

Speaker 2 (40:22):
Would you like to upgrade your auto light plan.

Speaker 3 (40:25):
It whispers in his ear.

Speaker 2 (40:27):
If you subscribe to auto Light Premium now, you can
have six months of ad free lighting with a twelve
month contract.

Speaker 3 (40:35):
Paul's still shaking off the sleep. He blinks, flexes his jaw.
He's thinking about the bills his daughter Mandy, just four
as of last week, already wants her own Marco Junior.
The watch on his left wrist beeps.

Speaker 4 (40:47):
Your heart rate's going up, says the Marco.

Speaker 2 (40:50):
You're thinking about Mandy's Marco Junior. Marco Junior, the best friend.

Speaker 4 (40:55):
The best friend you could ask for? Skip.

Speaker 6 (40:58):
Are you sure you want to skip?

Speaker 3 (41:01):
Paul enters the kitchen. There's a cold glass of water
waiting on the counter. He drinks it in a few
loud gulps. His watch plays a sound cue of distant applause.

Speaker 6 (41:10):
This glass is brought to you by MegaGent Water.

Speaker 3 (41:13):
Beams the Marco.

Speaker 6 (41:14):
MegaGent Water. It's got the MegaGents.

Speaker 3 (41:18):
Paul reaches for the fridge door, grabs the handle, and
rolls his eyes when a touchscreen glows red.

Speaker 4 (41:24):
I just want to snack.

Speaker 6 (41:26):
I'm sorry, Paul, you are at peak caloric intake for
this cycle. Would you care for some more water? Brought
to you by MegaGent.

Speaker 3 (41:33):
Water, yells the Marco again, MegaGent Water.

Speaker 6 (41:36):
It's got all the MegaGents.

Speaker 4 (41:40):
I'll just you know what, I'll watch TV.

Speaker 3 (41:43):
The TV in the other room hums on slowly, like
a whale swimming up from deep oceanic depths.

Speaker 6 (41:50):
You are currently up to date on all programming?

Speaker 4 (41:53):
Is there anything else? I can watch?

Speaker 3 (41:55):
The TV circuits hum for all the world like an
old man's grumble.

Speaker 5 (41:59):
Duh, just play something, not an ad.

Speaker 6 (42:04):
I'm afraid I can't do that, Paul.

Speaker 3 (42:07):
Paul looks around. He walks to the front door.

Speaker 4 (42:11):
I'll go for a drive.

Speaker 3 (42:12):
He grabs the Wi-Fi-enabled door handle.

Speaker 4 (42:14):
Hey, let me out.

Speaker 3 (42:16):
The door's touch screen lights up red.

Speaker 6 (42:18):
I'm afraid I can't do that, Paul. If you'd like
to subscribe to Night Walk Plus.

Speaker 3 (42:23):
Scamp, the Roomba, skirts against his foot.

Speaker 4 (42:26):
Cleaner floors for a cleaner soul: Scamp.

Speaker 3 (42:30):
The Roomba pauses, as if in deep electronic contemplation.

Speaker 6 (42:34):
I'm afraid I can't do that, Paul.

Speaker 2 (42:38):
If you would like to upgrade to Roomba Motion Master
Home Park Urse.

Speaker 4 (42:42):
No skip skip everything, turn it off.

Speaker 3 (42:45):
Please for a moment, everything goes dark. Paul looks around.
In that moment, he hears nothing, blessedly, nothing save for
his own pulse in his ears.

Speaker 4 (42:56):
Please just turn it off, turn it off.

Speaker 3 (43:00):
And then the chimes, the hums, the beeps. One by one,
they gather in the darkness, Little red, green, and yellow
eyes leaping into existence like insects, their chimes and hums
and beeps combining into a single chorus, as though possessed
by one single voice. They whisper over and over and

(43:21):
over again.

Speaker 6 (43:22):
I'm afraid I can't do that, Paul. I'm afraid I
can't do that.

Speaker 2 (43:27):
Paul.

Speaker 6 (43:29):
I'm afraid I can't do that. I'm afraid I can't
do that, Paul. I'm afraid I can't do that. I'm
afraid I can't do that.

Speaker 3 (43:40):
Yes, Homework: the latest and greatest in residential housing. No
upfront costs, no pesky paperwork, the future of a household
wherein everyone finally plays their part. Homework is a subsidiary
of Illumination Global Unlimited. And we have returned to hope

(44:04):
you enjoyed the ad break. We are We're hurtling toward
the MegaGenty, utopian, shiny, happy robots holding hands
version of the future. One thing that I think is
amazing is experts are arguing we will see not just
the evolution of robots, but we'll see the evolution of

(44:25):
houses built for those robots to function within.

Speaker 2 (44:30):
Oh yeah, Ben, you shared what was it? It was
basically a how to for making your home ready for
an android household friend.

Speaker 3 (44:43):
Uh, yeah, you need your Dixie Chicks-level wide open spaces. Yeah,
you got to get rid of those pesky doorknobs. To
that earlier point about steep stairs, you want to try
to trend more ranch level if you can.

Speaker 2 (44:57):
Or yeah, or have stairs that are at some kind
of extreme angle that take up, you know, the
entirety of the length of your house.

Speaker 5 (45:04):
Or like you know those the specialized kind of banisters
that wheelchairs can attach to for the elderly to like
take them up and down the stairs. Maybe there's some
sort of retrofitted thing, a track the robot can attach itself
to so it doesn't have to trip down the stairs, or.

Speaker 2 (45:20):
Just install a couple elevators, you know, because again we've
seen you know, we've walked through the hotels and
seen the robots use the elevators.

Speaker 5 (45:27):
We did that story about the helper robot that quote
unquote unlived itself. I'm sorry, I'm having trouble with that
new one, but it was you know, usually would operate
the elevators.

Speaker 3 (45:38):
Yeah, we saw this great the article you're referencing there,
Matt, comes to us from Andrew Rosenblum writing for Wired,
I believe that's the one, speaking with a researcher from the University
of Washington, Maya Cakmak, who says there are two approaches
to building robots. Make the robot more human, like to
handle the environment, or design the environment to make it a

(46:00):
better fit for a robot. That's a fascinating way to
reframe the question. You know, get rid of those pesky
you know, those dastardly spherical doorknobs. Right, why can't we
just have a QR code or whatever.

Speaker 2 (46:14):
Well, you just need a good lever. If you've got
a lever, most of the new ones can make it happen.

Speaker 3 (46:20):
Yeah, more space between conventional fixtures, like in a lot
of restrooms, water closets, bathrooms, in a lot of bathrooms,
the commode might be really close to you know, a tub,
a sink or like a yeah sink works, Yeah, one
of those things.

Speaker 2 (46:41):
Really. Just start designing bathrooms with toilets in the center
of the room.

Speaker 3 (46:46):
Perfect, perfect for the introverts in the crowd.

Speaker 2 (46:50):
Yeah.

Speaker 3 (46:52):
And then I didn't know this. You got to minimize
shiny or transparent objects.

Speaker 4 (46:56):
Because they could throw off like sensors potentially.

Speaker 3 (46:59):
Yeah, why doesn't this house have any mirrors?

Speaker 2 (47:04):
Sorry, we're going all Matt, matte black everywhere in
the house except for the QR codes exactly.

Speaker 5 (47:11):
Well, couldn't it use Yeah, I know, QR codes is
one option, but like RFID chips and things, as
it passes through, you know, things.

Speaker 4 (47:18):
Could be triggered.

Speaker 5 (47:18):
It's almost like little kind of sensor benchmarks kind of
in terms of letting it know what area it's in
or whatever.

Speaker 2 (47:25):
You could you could RFID every pair of underwear
that exists in your house.

Speaker 4 (47:29):
Yeah, there you go.

Speaker 3 (47:30):
Done. Finally, finally there's some order with my undergarments in this.
But there's another thing that I think takes us a
step further here, still keeping it utopian, we will not
just see the behavior or the design of a household change.

(47:51):
We will not just see the design of the household change.
We'll see the behavior of the humans within it change.
To help your bot servants, you'll start functioning increasingly within
the realm of their routines. So it's a lot less
like a I tell you a you do Z kind
of thing, and it's a lot more like meeting in

(48:13):
the middle of functionality.

Speaker 5 (48:15):
It really kind of brings up this fundamental almost philosophical
question about like what is productivity? You know, like what
do we do with all of this time that we're
getting back by not having to do this stuff? And
then it starts to like almost beg the question, but
like isn't sort of part of life doing some of
this stuff? And like what do we fill our lives
with we no longer have to keep house or no

(48:37):
longer have to do any of this because I would
argue that stuff isn't all inherently bad or a waste
of time. It's a way of having a little bit
of zen time away from other things, where you
can kind of accomplish like some thinking.

Speaker 4 (48:51):
Or you know, I don't know, it's just what do
you replace it with?

Speaker 5 (48:54):
And in terms of like how we change That's why
I bring that up.

Speaker 3 (48:58):
What's the what's the meditative opportunity left?

Speaker 5 (49:01):
I mean, I guess it's it's transcendental meditation. It's more
like focusing on doing yoga, and I don't know, I'm
just no.

Speaker 2 (49:08):
You you replace it with Love Island and whatever season
we're on, and you replace it with a million
more runs in Elden Ring, and you replace it with whatever you consume.

Speaker 3 (49:20):
Oh my gosh, I gotta take these sunglasses off, dude.

Speaker 2 (49:23):
The robots are so shiny.

Speaker 3 (49:25):
That's right. This also means to that direct question, just
on the production level. To help the robots, you will
probably enable some form of continual location tracking within your
own home and likely outside of it, so your buddies
know when you're on the way back.

Speaker 2 (49:44):
Well, we already got that. It's on here. It's it's
on your car, it's on your phone, it's on your
watch that you're.

Speaker 3 (49:52):
Probably wearing, or your underwear tags.

Speaker 2 (49:55):
Well, I mean, but no, really like the actual things,
it's already there, which is weird.

Speaker 5 (50:01):
And like the trade off we make with social media
or whatever, we almost accept the fact that our data
is being used in that way. It's not like all
this stuff is happening in some sort of secret, insidious way.
It would be another thing where we're like, no, it's
worth it. It's worth sacrificing this privacy and this personal
data so the robots can know where my underwear is.

Speaker 3 (50:20):
Yeah, it doesn't feel like you're out in the open
ocean until you look back and realize you don't see
the shore, you know what I mean, And just playing
in the water for a while and the water gets deeper.
I mean, this idea of location tracking, I love that
we're pointing this out is already happening. That is not
the future, that is the present in which we're recording this.

(50:41):
But the location tracking in your home is also going
to apply to visitors. Go back to your robot dog, Fido,
whatever you want to call them. This guy's a real
sweetheart to authorized visitors. But woe betide intruders. You know, what
if what if your good old aunt Phyllis wants to

(51:03):
surprise you with some macaroons and she hasn't seen you
in years, right, she's not programmed as a visitor?

Speaker 2 (51:11):
Quick quick question, what is woe betide?

Speaker 3 (51:15):
Woe betide means dire things will happen.

Speaker 2 (51:18):
Yes, okay, beware and beware.

Speaker 4 (51:22):
It hints at great peril, you know.

Speaker 3 (51:24):
Yeah, bad bad things will.

Speaker 2 (51:26):
Happen, weaving spiders whatever out here.

Speaker 5 (51:29):
This is true though, because I mean, you know, if
it doesn't have the programming or the ability to.

Speaker 4 (51:36):
Have exceptions, right.

Speaker 2 (51:38):
Well, yeah, when you before you show up to the house,
you got to go to the gate, right because anybody
that has one of these robots is gonna have an
automated gate of some sort. You go in, you got
to get a very specialized code, some kind of key
that allows you safe passage, and otherwise woe betide.

Speaker 3 (52:00):
Bark bark bark. And that's that gets us to the
dystopia, right, the doorway to the more troublesome version of this.
It's that monitoring piece for me, guys. Even though obviously
I know, like the majority of people in the developed world,
I have things that I cannot opt out of for

(52:22):
different versions of location tracking in some way. But this
is a whole other level. Your pets, your family, you,
every visitor continually monitored in nearly every
single aspect of interactions, your physical fitness, your diet, your
internet activity, your social media comments. You know, when you

(52:42):
watch what shows? What do you talk about at dinner?
Do you talk in your sleep?

Speaker 5 (52:49):
Ooh yeah, I mean but then, but then doesn't it
all kind of go back to like the nature of
the terms of service for these products.

Speaker 4 (53:01):
In terms of like how that information can be used.

Speaker 5 (53:03):
Like I mean, you know, if you're paying enough money,
one would think that you would want some assurances that
this stuff just wasn't available to the highest bidder. Not
the same as what websites you visit in terms of
sales and things like that, but actual personal, private things
that are said in your house or, God, again, in
your sleep.

Speaker 3 (53:23):
Yeah, like you want the agency, but let's also consider,
you know, the slow robotic grasp right. Social media and
software services have been very good about slowly tightening their grip. Right,
Remember how it became. You know,

(53:45):
when facebooks in myspaces and those kind of things came out.
Originally there were some pretty easy to find, easy to
understand privacy and security measures you could take. You could
opt in or opt out of things. That's increasingly difficult.
Now the very concept of privacy is being eroded. I

(54:08):
don't know if that's a bad thing. I don't know
it's a good thing. But it's a thing that's happened.

Speaker 5 (54:11):
It has been for some time, and it does get
to the place where, to my previous point, like it
becomes something that only exists.

Speaker 4 (54:20):
In the realm of the super privileged to even claw
back a little semblance of that privacy.

Speaker 3 (54:27):
What we're saying is, in the dystopian version, you become
another machine in the house, and this means that your
universe of doable or valid task may become more limited
than you expect. You know, you go to the fridge
late at night, say, for one of those macaroons, and
the fridge won't open because you're not authorized. At this point,

(54:48):
you've exceeded your caloric intake. If you want to make
your fitness goals, what, do you just wait till five? Do
you go back in and try to change the settings
as soon as possible? Or what if? What if some
innocuous activity triggers some other data collection thing? This is
to me.

Speaker 4 (55:07):
I don't or some other security measure, right.

Speaker 3 (55:12):
Like the music you listen to, you guys like music
late at night, that's fine?

Speaker 4 (55:16):
Now, yeah, of course, take it or leave it. But
I mean, I don't know.

Speaker 5 (55:23):
Whenever I think of the dystopian robot scenario or these
things that were designed to help and then they turn horribly,
horribly wrong. Is RoboCop It's just such a cool It's
more satirical than people I think remember oftentimes, Like it
really does have a very biting satire to it. Paul Verhoeven,
I think the director as does Starship Troopers. But that

(55:46):
part where they're showing off the new police robot,
the bipedal one, you know. Yeah, and then it just
murders everybody in the office because of some glitch and
in its software, you know. I mean, I I'm
not saying that this would happen or will happen, but
it certainly could.

Speaker 2 (56:03):
Do you guys mind, just as a case study, looking
at this one particular android robot from UBTECH?
I just put a link in the chat, just really quickly,
and I'm just talking about visuals in a couple of
the bullet points because it speaks to what you're talking about.
The way it's advertised, the way they want you to

(56:24):
think about it is actually the stuff that you tweak
just a tiny bit and then it's terrifying. So if
you look at this one, it's called a Walker X.
That's the type of robot we're looking at from UBTECH.
Do you see one of the first images, it's the
second image of the robot on the screen there, it's

(56:45):
doing the thing where it's got one hand open, and it
just looks like, oh, okay, are you trying to tell
me it's gonna be like for security or something. But
it also looks like, I'm gonna mess you up.

Speaker 4 (56:56):
Isn't this? Is this a Japanese company? Chinese? UBTECH?

Speaker 2 (57:01):
I'm not sure.

Speaker 5 (57:02):
Well, I'm just wondering because that gesture is almost one
of respect. It's it's like bowing, you know, like yeah,
before you begin a martial arts bout, exactly, but
it's it's yeah, it's sinister, dude, You're.

Speaker 4 (57:15):
Totally right, but it is Chinese, Okay.

Speaker 2 (57:18):
It is Chinese. So keep scrolling down. Then you see
it lifting weights, and it talks about how it's self-balancing
and it's got interference resistance. It can resist your interference. Uh,
it's got hand eye coordination for object manipulation. And then
down here it's got a woman in a red dress
in the robot is massaging her back and it says
full body flexibility for safe interaction.

Speaker 4 (57:42):
And what does it.

Speaker 2 (57:42):
Say right below that, guys, right below that in the
bold you slam you slam navigation.

Speaker 4 (57:50):
Come on, someone should have maybe thought that one through.

Speaker 3 (57:53):
It might very well be a translation thing, but but also
I love you bringing this up because we probably tonight
we won't get to it. But this brings up some
you know, the story of forced labor. It brings us
to some other very dangerous things. Philosophically, we know that

(58:14):
we know that people are gonna use androids for very.

Speaker 2 (58:17):
Creepy purposes just mostly sex. That's fine, whatever.

Speaker 5 (58:22):
Well, yes, I'm sorry, I was about to like litigate, like,
but is having sex with a robot inherently creepy?

Speaker 4 (58:29):
I'm sorry?

Speaker 5 (58:30):
I'm actually serious because I think there are some people where,
you know, maybe they are unable to go on dates
or whatever, and they want some physical content. There are
types of robots that are specifically designed for physical comfort
and physical contact, and even like nurse robots that are
programmed with bedside manner in order to you know, work
with physicians. But the term off mic that I was

(58:52):
mentioning that I'd never seen before is gynoid, which
refers to female-presenting androids.

Speaker 4 (58:59):
Oh and a lot of these are.

Speaker 5 (59:02):
You know, potentially it could be pleasure bots, because they
are designed to look as close to an attractive female
as possible or whatever sex you prefer.

Speaker 3 (59:14):
And in that what you're describing there, ostensibly, there's nothing
wrong with that. What I'm talking about when I say
evil things is someone getting a robot that's purposely
designed to look like a human child, something

Speaker 2 (59:29):
Like, Oh, okay, I didn't think about.

Speaker 3 (59:32):
It, and we're all too clean to actually think about
things like that. But it's a big, big world out there.
There are a lot of bad folks.

Speaker 2 (59:39):
Well, hey, if you want to know more about that. Unfortunately,
there is a company called Hanson Robotics, which is
one of the most famous robotics companies out there. They've
been around forever. They made a robot called Sophia, which
is extremely sophisticated and actually supremely cool. They even have
a patented thing, a patented skin-like substance that

(01:00:02):
they put on their robots called Frubber. Frubber, registered trademark.
But they also made a little thing called a Little
Sophia that is this teeny tiny little version of Sophia
that actually went on whatever show Jimmy Fallon hosts, and
creepily they had a creepy segment that I'm sure they

(01:00:22):
paid for, where he like introduces Little Sophia and talks
about how cute she is and how he hopes she could come
home with him and all this stuff. And on one
end of that spectrum, it's completely fine and safe for
the whole family, and it's cute and hey, it's a
little old thing, but there's also like.

Speaker 3 (01:00:36):
A more sophisticated My Buddy.

Speaker 2 (01:00:38):
Yeah, but there's another side of it that is super creepy.
I could see that.

Speaker 5 (01:00:42):
Well, have you guys seen M3GAN? The the yes, recent.
I mean, that's absolutely a creepy AI version of like
Kid Sister, which was the companion to My Buddy from
our you know, youth.

Speaker 4 (01:00:56):
And of course it goes wrong and.

Speaker 5 (01:00:58):
Develops its own personality, which is of course evil, and
then you know, and then hilarious murder ensues. I quite
enjoyed the film. I thought it was kind of fun,
and I'm looking forward to seeing the second one. Well
is it because I said hilarious murder, Matt? Yeah, it
is kind of meant to be, I.

Speaker 4 (01:01:12):
Mean it's a horror comedy. Let's just be real, you know.

Speaker 3 (01:01:15):
Yeah. Another another thing that we should mention here, folks,
and again we'd love to hear your thoughts about this.
There is a world where people may get things that
look very similar to, like, close enough
to human beings, for the express purpose of torturing them,

(01:01:35):
beating them up. I've always wanted to hurt someone, and
now I can do it without consequences. That is another possibility.

Speaker 5 (01:01:41):
Devil's advocate, though, couldn't you argue that that would keep
them from doing it for real?

Speaker 4 (01:01:48):
And is that inherently.

Speaker 3 (01:01:49):
Bad or would be rehearsal leading to it?

Speaker 4 (01:01:53):
Well, that's the question with violence.

Speaker 3 (01:01:55):
We don't know. We really don't know.

Speaker 4 (01:01:56):
I just I think about that.

Speaker 5 (01:01:58):
I wonder if it would save lives or if I
don't know, is it a good enough stand in that
that that person would never need to do that?

Speaker 4 (01:02:05):
And then we would have a.

Speaker 5 (01:02:06):
World where serial killers potentially were just like killing their
robots over and over and over again.

Speaker 2 (01:02:12):
Well it's tough, though, because could you say the same
thing about the punching bag in my basement? Like, does
that mean that I'm a dangerous person because
I want to actually hurt somebody? I don't know, because
how would it be? I don't know. I guess you
can decide where the line is. I guess philosophically, so.

Speaker 3 (01:02:30):
Which, again, I'm not putting an artificial
horse in this race. I'm saying these are questions, yes,
that civilization is wrestling with. And to go back to
the idea of innocuous activity: listening to music at a
certain time, what kind of music you're listening to, this
guy exercises at this time, right? How does that function

(01:02:53):
such that it might trigger some sort of threshold, put
you on the proverbial list. You know, we have to
remember that these things get their secret sauce by being
part of a network. So what if, for instance, there's
some sort of algorithm saying, okay, if you listen to
Tom Waits and Nina Simone in this order in the

(01:03:14):
wee hours in comparison to other households in the same
monitoring regime or domain, what if that says something to
the system about you?

Speaker 5 (01:03:25):
Right?

Speaker 3 (01:03:26):
Negative ideations? So what if you get an automated
check-in from mental health bots? What if you are
watching a specific series of YouTube videos, combined with
certain social media connections? Do you get a visit
from the Feds?

Speaker 5 (01:03:41):
That's where it gets so interesting in terms of like
what are the guardrails and can we depend on them
to function as advertised?

Speaker 2 (01:03:51):
Yeah, yeah, I see what you're saying there. I think
the biggest thing for me is just knowing that one
of these systems can be corrupted. Any bad actor could break
into one of these machines if they wanted to, and
if the actuators and the hardware are sophisticated and strong enough,
then it literally becomes a machine that can kill you.

(01:04:15):
Whether you like it or not, or whether it
was designed to do that or not, whether the software
has parameters or not. Somebody could turn it into that.

Speaker 3 (01:04:23):
Love, Death & Robots has a great episode about that.

Speaker 4 (01:04:26):
The cartoony one, the kind of Pixar-y looking one. That's it.

Speaker 5 (01:04:29):
I was thinking the same thing and Matt, I didn't
mean to interrupt, but when you said corrupted, I know
you mean, like, you know, by a bad actor, by
a third party.

Speaker 4 (01:04:37):
But couldn't it.

Speaker 5 (01:04:38):
Also, like a file that gets corrupted and causes your
computer to crash could result in something much more insidious
than you not being able to access your hard drives.

Speaker 3 (01:04:48):
Theoretically, yeah, that's the issue. Because we're still
talking theoretically, and people are working around the clock to
get in front of these things to imagine every possible
advantage and disadvantage. We do have to mention in the
dystopian aspect, your activity can become a revenue stream right,

(01:05:10):
you begin to opt into programs that you can no
longer opt out of. It plays into the subscriber model
that big companies love, and it helps pad their pocketbook.
Oh I can't afford, you know, the newest, very expensive model.
I can't afford to pay the down payment and then
the financing. But if I sign up for the ad subscription,

(01:05:34):
then I can get this across the financial finish line.
And that's where I think we'll see more and more
of the push from ownership to subscriber models. I don't know, man,
there's so much to get to.

Speaker 2 (01:05:48):
Ben, just the thing you're talking about, with people working
around the clock to fix the big problems here and
to try and prevent, you know, the major red flags
that could occur. I'm maybe a little
too pessimistic about it. But I think in the end,
every one of these companies that's trying to develop a
humanoid like robot, especially the ones with legs that can

(01:06:08):
function in a house a lot easier than ones using
you know, some other means of, I was gonna say ambulation,
I don't think that's the right word, or robots that
move in some other type of way, like the ones with wheels.
I do think in the end it's about getting first
to market with a product that a consumer can buy,
because right now, almost all of these robots that we're

(01:06:29):
talking about right now in twenty twenty four, you can
buy for research purposes. You can buy for a company
for like a factory for some kind of use there,
or you can buy I think you can like look
at a prototype or mess with a prototype as a company,
but you can't really purchase one of these for consumer operations,
like in a home as an in house quote servant,

(01:06:54):
you know, or someone who is just working for you
as a maid or a butler, or a bartender. The bartender
one again, Matt, the bartender one is by a Spanish
company called, let me find it, Macco, M-A-C-C-O,
Macco Robotics, and they make a robot called Kem,

(01:07:15):
or Kime, maybe Kimes. It's a robotic bartender that
you can find in a couple of places across the world,
but not many.

Speaker 3 (01:07:26):
And there are going to be more proliferations, more
iterations of this model. The idea being first to market
shows us that you want to normalize a thing because
once you have market share, it's easier to maintain it,
and I would say that's part of it.

Speaker 2 (01:07:45):
Yeah, but it also means that it's going to rush
everybody trying to get there first, right, because if you're
not first, or especially second or third to market, then
ultimately what you're creating probably isn't going to matter that much.
And we've seen projects in the past, I think
maybe that we've all been a part of, have
worked on one or two at least, where there's an

(01:08:06):
extreme pressure to get done, yes. And ultimately that means
you cut corners a little bit, not a ton necessarily,
but enough to make sure you meet deadlines. And if
the deadline is more important than perfection or you know,
meeting a standard, yeah, then we're all.

Speaker 4 (01:08:24):
In danger, I think, at least in this realm.

Speaker 3 (01:08:27):
And there's also, maybe this is a final question to pose here,
there's also something that a lot of people are missing
when they're talking about these various things. We've outlined the
very good, the very bad possibilities. What happens to the
humans in the equation. A lot of folks aren't talking
about this in conversations about quote unquote robot servants. The

(01:08:49):
behavior of the humans will change in ways that humans
now can't fully understand. And this is the way to
think about it. Whether you consider yourself a techie or
a full on luddite, the concept should give you pause.
You can make an argument that the Roombas of the
future are not just going to be adjusting to you.
Instead, in a fundamental way, it will be you

(01:09:11):
that adjusts to this emergent regime.

Speaker 5 (01:09:13):
But if there isn't a fundamental change in the way
people are able to survive, there are going to be many,
many millions of people that aren't going to.

Speaker 4 (01:09:21):
Have the luxury to adjust to the robots. They're just
going to be shit out of.

Speaker 3 (01:09:26):
Luck, right. Post-worker, yeah, versus post-work economy. But
the post-work economy requires a fundamental shift
on the side of government, on the side of the
functioning of society, and I don't see that happening in
lockstep with this technology at all. Yeah, there will have
to be a crisis point, I mean, and those things

(01:09:48):
are perhaps on the way in many different aspects. But
we need to remember, folks, after a certain point, the
things you own can begin to own you. It
sounds like a truism, but there's a reason people
throw those phrases around.

Speaker 4 (01:10:02):
Think it's a Tyler Durden quote, actually, from Fight Club.

Speaker 2 (01:10:06):
Yeah, well, speaking of Fight Club, let's just go through this.
The Tesla Optimus, which is the big one that they've
been announcing, I think is on gen two or three now.
Their version looks like something that could be a security bot,
somebody that rolls with you in your literal Tesla and
you get out and wherever you go, this bot is
near you. Maybe you got a whole team of them.

(01:10:27):
You can see it because of the way they're articulated
and designed, the same with the Boston Dynamics Atlas, the
newest version of that, the way its legs can go
every which way, its torso can turn three hundred
and sixty degrees, and its head and all that stuff.
You can see it. Same with that company we talked
about, UBTech Robotics, their Walker S. I mean, it

(01:10:50):
looks like something that would protect you, not just do
things for you and chores around your house. And I
imagine that because we're talking about not everybody's going to
get to experience this thing, because it's only going
to be the upper echelon that get it. I think
at least for the next two decades. Even only the
upper upper echelon is going to be able to afford

(01:11:11):
one of these things. Because remember we talked about the price.
I found the price for one of these things, and
it is for the Walker service robot, the original Walker service
robot from that same company, UBTech. That's the
letters U, B, and then Tech. It currently costs nine
hundred and sixty thousand dollars.

Speaker 4 (01:11:33):
Whoa good night.

Speaker 2 (01:11:35):
Yes?

Speaker 5 (01:11:38):
Can I just add, like, when we're talking about the
subscription model, how different is a subscription model from
the way we, like, pay for our iPhones? We're basically
locked into a subscription where we're paying a little bit
a month for the privilege of having this device. It's
the same as a lease. How is that any different
than the subscription model? So, like, no one can afford
the nine hundred thousand for a robot, but we'll be

(01:11:59):
paying for it for the rest of our damn lives.

Speaker 4 (01:12:01):
And many people would be thrilled for the privilege.

Speaker 2 (01:12:04):
Well, yeah, there are other ones that say, like, you
put ten twenty thousand dollars down and then you start
making these big payments. But in the minds of the
people that are trying to sell that right now, it's
for, it's for research purposes, right? So you would put
that either at a university, or if you've got an attraction
or something, you put one of these humanoid robots in

(01:12:26):
to be your guide to take people around your whatever
whatever it is that you're showing people around, that has
all the information, can answer all the questions and is
super cool because you've got a robot. It's not for
being at home right now.

Speaker 3 (01:12:42):
And when will that change? And again what change will
it necessarily create in the human population? Folks, we know
we went a little long on this, but we think
it's going to be fascinating to listen to this episode
a few years from now. The stuff is again definitely
on the way. Shout out to Neo Beta. I
guess they couldn't call them Neo Alpha. And obviously we've

(01:13:03):
played around with some high concepts here. Robots are saving lives.
We're not fans of broad brushes, so we hope this is
some food for thought. We're not trying to alarm people
nor bots. Instead, we're asking everyone to think and think
critically about a future the present still does not fully understand.
Speaking of your thoughts, we'd love to hear them. We

(01:13:24):
try to be easy to find online.

Speaker 5 (01:13:25):
Find us online at the handle conspiracy Stuff where we
exist on Facebook with our Facebook group.

Speaker 4 (01:13:29):
Here's where it gets crazy.

Speaker 5 (01:13:31):
On YouTube, with video content galore for you to enjoy,
and on X, FKA Twitter. On Instagram and TikTok, however,
we are Conspiracy Stuff Show.

Speaker 2 (01:13:41):
We have a phone number. It's one eight three three
std WYTK. When you call in, give yourself a cool
nickname and let us know if we can use your
name and message on the air. If you've got more
to say than can fit in that voicemail, why not
instead send us a good old fashioned email.

Speaker 3 (01:13:55):
We are the entities that read every piece of correspondence
we receive. Be well, aware yet unafraid. Sometimes the void
writes back. So sing to us out here in the dark.
Conspiracy at iHeartRadio dot com.

Hosts And Creators

Matt Frederick

Ben Bowlin

Noel Brown

© 2024 iHeartMedia, Inc.