Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The Mandy Connell Show is sponsored by Belle and Pollock
Accident and Injury Lawyers.
Speaker 2 (00:05):
No, it's Mandy Connell and Don.
Speaker 1 (00:11):
On KOA ninety four one FM.
Speaker 3 (00:15):
God Waity Kim The Nicety three by Connald.
Speaker 4 (00:23):
Keith Sad Thing.
Speaker 3 (00:26):
The Mandy Connell Show is sponsored by Belle and Pollock
Accident and Injury Lawyers.
Speaker 2 (00:31):
No. The Architect of the Future. The future is now.
It's futurist Thomas Frey.
Speaker 3 (00:38):
At futurist speaker dot com, if you'd like him to
come speak to your organization. I do want to give
a little payoff for those people who waited. David, who
was on hold with his two people to repopulate the earth:
I'm just going to tell you who they were, because
it was too funny not to. He said Chris Christie
and Simone Biles. I don't know the thinking behind that,
but it made me laugh out loud. So thank you
(00:59):
David for that. We'll talk about that a little bit later.
But Thomas is on today to talk about robots. And
you've been you've been deep in the robot rabbit hole
for a little while. What is uh, what have you
been seeing in the world of robots, Thomas.
Speaker 2 (01:15):
Well, they're going to start evolving very quickly here.
Speaker 1 (01:19):
I think we're going to start seeing robots with capabilities
we never imagined. But I recently wrote this this column
just titled will robots replace the children that we're not having?
Speaker 2 (01:33):
Replace the kids we're not having?
Speaker 1 (01:34):
And uh, it's interesting because there are benefits to having robots
around. I mean, ironically, as soon as we have robots
that can change a dirty diaper, it'll make it much
easier to actually have kids. So maybe robots will actually
increase the population rather than decrease it.
Speaker 3 (01:54):
Do you think there's going to be, because I've got
to tell you, as a mom, like, the thought of
handing over my little baby to a robot just
does not jibe with me right now. That just does
not... What kind of curve are we
looking at for robot acceptance?
Speaker 1 (02:10):
Now?
Speaker 3 (02:10):
You're talking about development, right? And I think some of
the stuff that is happening in robot development is fascinating, really
really fascinating. But when you're talking about a new mom,
especially a second-time mom, like, second kid, the mom will
be like, take the baby, it's fine. But first-time moms,
I mean, what is that curve going to
look like before we all just accept that we have
(02:31):
a robot maid that comes into our house, or
a robot taking care of our babies, or a robot
taking care of our older people?
Speaker 2 (02:42):
Yeah, we have to have robots that we're comfortable around.
Speaker 1 (02:47):
We have to have ones that have a soft touch,
that we can have interactive conversations with, so that they feel
much more human than anything that exists today.
Speaker 2 (03:01):
We still have a long.
Speaker 1 (03:01):
Ways to go up, and so there'll be many iterations
of evolution on the robot world before we get to
that point. But yeah, I think that having a robot
that can walk to your kids to school, that can
actually go out and play with the kids, I think
people are going to love that.
Speaker 2 (03:23):
And interestingly enough.
Speaker 1 (03:24):
You know people are worried about too much screen time
for the kids today, Well, they might be worried about
too much robot time with.
Speaker 2 (03:30):
The kids in the future.
Speaker 3 (03:31):
I would think that that would be concerning to some parents.
I think other parents would be grateful that they don't
have to deal with that anymore.
Speaker 4 (03:42):
But you know some parents who are very involved.
Speaker 3 (03:44):
I think my questions would more be along the lines
of if my child asks my robot nanny.
Speaker 4 (03:50):
We'll call it robot nanny. Okay.
Speaker 3 (03:53):
A question, a philosophical question. As little children do they
ask you these questions throughout their childhood, all of these
really deep, meaningful questions. How do I program the sort
of response that I would want my robot to give?
For instance, I lean conservative politically, liberty and freedom are
very important to me. How do I, as a parent
(04:14):
ensure that my robot will reinforce those values that I'm
trying to instill in my kid?
Speaker 1 (04:23):
Yeah, they're not going to have little switches that say
Republican or Democrat on them.
Speaker 4 (04:28):
Yeah, they are.
Speaker 1 (04:32):
This role... I think these robots will
evolve around the conversations that are currently being had
within the family.
Speaker 2 (04:41):
I don't think that.
Speaker 1 (04:45):
There's any hard-coded way of programming the robot, but I
think that they will be very adaptive and understand what
the conversations are that are taking place
Speaker 2 (04:55):
currently, and how to actually feed off of that.
Speaker 1 (05:00):
I see robots becoming much more human-like over the
coming years, to the point where
they actually are hybrid, actually flesh bots, where we
actually have human-like flesh that grows on the outside
of these things, and so the touch that they have
(05:23):
is going to be actually real human touch, and of
course there's going
Speaker 2 (05:28):
To be all kinds of ups and downs with a
lot of a lot of weirdness that goes along with this.
Speaker 3 (05:35):
But okay, you just took me right to Star Trek:
The Next Generation and Data. Okay, Data was a robot, but
he always looked like his skin had been grown in
a lab to me. So I mean, is that what
you're talking about, where they look humanoid but they actually
have a robot at their core?
Speaker 2 (05:57):
Right? Right? Yeah, we will see how how this all
takes place.
Speaker 1 (06:03):
And there's lots of people that have talked about downloading
their personality, their essence of who they are, into
the internet to live forever. But talking about downloading it
into a robot that actually has physical capabilities, so you
can live forever as a robot.
Speaker 2 (06:24):
That's that's a whole different ballgame.
Speaker 1 (06:26):
And so that opens up another wide area of discussion
that we're currently not prepared to have.
Speaker 3 (06:33):
Well, it would be my luck that only the irritating,
annoying people who text me relentlessly every day would find
a way to download their consciousness into a robot so
they would never die.
Speaker 4 (06:43):
That it would only be the annoying people, right that
would do that.
Speaker 3 (06:46):
It wouldn't be the people we actually wanted to do that.
It would just be people that are irritating and whatnot.
Speaker 4 (06:52):
So I don't know. I mean, I am on the
one end.
Speaker 3 (06:56):
I'm very excited about the future of what robots can do
and what they can accomplish and how
Speaker 4 (07:04):
Game changing they can be. And I guess.
Speaker 3 (07:08):
The evolution of not just the robots, but also how
we interact with the robots is going to be a
much slower process, so we'll have time to adapt, you know.
I mean, it's been kind of funny to me to
see how quickly people adapted to self-driving cars in
San Francisco, where they're willing to get into a Waymo
that has no driver and go ahead and let it
(07:31):
take them around San Francisco. I would have thought that
would have been a much longer curve, but it's not.
People seem willing to just like, let's do this. Is
that surprising at all?
Speaker 1 (07:41):
I guess.
Speaker 2 (07:45):
Yeah.
Speaker 1 (07:46):
I was reading some of the projections on humanoid robots
and how quickly these things will grow and how many
people are going to buy these things. And one projection
was that by twenty thirty-five we'd
have one and a half million robots out there.
Speaker 4 (08:03):
Oh wow.
Speaker 1 (08:04):
And another projection was that by twenty forty we
would have a billion of them out there.
And so somewhere in the middle, I think, is the
real number. And so how popular will these things be? Well,
it depends on all their capabilities and how easy
Speaker 2 (08:23):
They are to work with and get along with.
Speaker 1 (08:26):
And certainly, if they can take a lot of the
workload off of us, that will make them much
more valuable in our own minds. So I see, I mean,
if a robot can take care of our kids and
take care of the dogs and clean up the house
(08:46):
for us.
Speaker 2 (08:47):
If they can wash our dishes, will we still need dishwashers?
Speaker 1 (08:52):
If they can cook our food for us, will we
still be inclined to go out to restaurants,
or will we be more inclined to eat dinner at
home and engage and bring our friends over? And how
long will it be before a robot comedian is funnier
than human
Speaker 3 (09:11):
comedians, or talk show hosts? I mean, everybody
likes to think, well, my job's not replaceable, but realistically,
we're very quickly moving to an arena where AI...
There are radio stations right now that have fully AI DJs,
and it's not as well received, I think, as they
would have liked, because people are a little turned off
(09:34):
by that, I guess. But ultimately, I mean, are we
all replaceable? Are there any fields where
it would be safe to say I could not be
replaced? Because I don't see it.
Speaker 1 (09:48):
Yeah, it's actually hard to say. Some of the areas
that I don't know how we'd replace: a robot will
not start a business, will not know how to take
risks if there's no human involved. I don't think a
(10:10):
robot will be able to open a bank account. Robots
don't need clothing, they don't need food. There's lots of
needs that they don't have. So we still need humans
to generate our economy. But they can replace a lot
of the workload that we would normally have. So it's
(10:34):
hard to actually parse this out into yes, this will work,
and no, that won't work, because we're going to figure
out new ways to do things with robots that we
never imagined.
Speaker 4 (10:44):
Somebody just asked this question. I think it's kind of interesting.
Speaker 3 (10:48):
If a robot is a witness to a crime in
the house, how would this impact the accused and their rights?
I mean, I'm assuming that they have the capability of
at least a limited-time playback that they could access,
and there might be an actual videotape of what
(11:09):
just happened. I mean, that would have to be sorted
out, because I don't want my robot recording everything that
happens in my house.
Speaker 2 (11:17):
Could your robot get subpoenaed in court?
Speaker 4 (11:20):
That's kind of, I think, what they're asking. Like, what
do you know? Could the video be subpoenaed? Yes.
But could the robot testify? I don't know.
Speaker 3 (11:28):
I mean, I don't know if that would be... I
don't know, if I'm on a jury and a robot
sits there and says, this is what happened.
Speaker 4 (11:35):
Because they can't swear on a Bible. I mean, they could,
but does it really mean anything?
Speaker 1 (11:40):
You know what?
Speaker 2 (11:40):
I mean?
Speaker 4 (11:40):
What are the ethical standards that robots have?
Speaker 1 (11:46):
Yeah?
Speaker 2 (11:46):
What does a robot bible look like?
Speaker 1 (11:48):
Yeah?
Speaker 4 (11:49):
Exactly, exactly, go ahead.
Speaker 1 (11:54):
Yeah, I think that we're going to run into all kinds of
legal issues that we never anticipated, as
Speaker 2 (12:00):
well. Having robot intruders: a robot breaks into my house.
Oh gosh, whose fault is it?
Speaker 3 (12:11):
I mean, it would have to be the people that
programmed the robots to break into your house. I didn't
even think about that, Thomas. That's a whole new worry,
because a robot could kick in your door, come in,
overpower you physically, and you can't do anything
to fight back against a robot. And then what are
you going to identify? Yeah, a robot came in my house,
I'll give you a description. You know, they all look
the same, kind of thing. That's a little bit scary
(12:32):
as well. So I mean, as
soon as criminals get robots, what are criminals going to
do with robots?
Speaker 2 (12:40):
Right? Yeah?
Speaker 1 (12:44):
So, so I was thinking, if you
want to buy groceries, you'd probably want your robot to
go to the grocery store and buy them for you, right?
Pick them up and bring them home. But you don't
want to give the robot access to your main bank
account.
Speaker 2 (13:00):
So would you set up a separate bank account for
your robot? I would.
Speaker 3 (13:05):
I mean I would because I have had situations where
someone else was doing my shopping for me and I
set up a separate bank account just for them, and
that was a human. I'm not giving a human access
to my bank account, you know what I mean? I
mean my husband, yes, but not a general person. So
that would make sense, I mean yeah, I mean you
just give them their own little bank card.
Speaker 1 (13:28):
So then, if somebody decides to hijack a robot and steal
its money, what does that look like then?
Speaker 2 (13:39):
Who's going... there's all these what-if scenarios.
Speaker 3 (13:42):
I was gonna say, how do you hijack a robot?
Are you talking about hacking in, or standing there
holding, you know, a gun at a robot saying give
me all your money or I'm gonna, I'm gonna shoot you?
Speaker 1 (13:56):
Yeah, something like that. And should a robot be allowed
to defend itself? And to what extent should it be
able to disable the person that's trying to harm it?
Speaker 3 (14:11):
I'd say yes, because ultimately, if a robot is there
to serve and protect you, then they should be able
to defend themselves, but only in a defensive posture, meaning
they could block someone from hitting them. They could neutralize
someone by putting them in a hold, a robot hold,
(14:32):
so they can't move or do anything while they're simultaneously
calling the police on their little imaginary headset that I
just gave them inside their robot heads. I mean, yeah,
but the most challenging thing for me is that
I know that the military is working on making robot soldiers,
and how do you program a robot to kill certain
(14:55):
people but not others? And what's to prevent someone from
hacking into that set of robots and turning them on you.
That's this kind of stuff that I think about.
Speaker 2 (15:05):
Thomas, Right, there's a downside to all of this.
Speaker 1 (15:10):
But yeah, if your robot suddenly calls the
police to come in and intervene, it would actually be
a police bot that shows up
Speaker 2 (15:23):
That takes over.
Speaker 1 (15:25):
And so how does a police bot then take care
of this person that's causing problems? Does that police bot
have more leeway in dealing with criminals? Yeah, it
raises all kinds of interesting questions. Should they be allowed
(15:45):
to use tear gas?
Speaker 3 (15:49):
In theory, you would think that... like, let's talk about
RoboCop here for a second, not the movie, but an
actual RoboCop that we're building in my mind here. You
could make that robot pretty... Right now, when a police
officer engages with a criminal in a physical way, there's
a good chance
Speaker 4 (16:06):
That cop could also get hurt.
Speaker 3 (16:07):
But if you had a robot cop that could then
sort of disable that person by holding them down or
getting them still, because of their superior strength and
their unlikelihood of getting hurt, I think that
we'd all be like, yeah, if that robot grabbed you
and held you and then you hurt yourself because you're fighting,
that's your own fault, you know.
Speaker 1 (16:27):
What I mean.
Speaker 4 (16:28):
I mean, I think we'd be willing to give them
a little.
Speaker 3 (16:30):
Leeway because they can actually get closer to the criminals
than actual police officers can.
Speaker 1 (16:38):
Yeah, if we had a situation where somebody actually died
at the hands of a robot, what kind of feedback
would happen then?
Speaker 2 (16:48):
What kind of backlash would occur?
Speaker 1 (16:52):
Is this another George Floyd situation, where we would have
overpowering robots, et cetera, demanding
Speaker 2 (17:01):
Too much of our people.
Speaker 1 (17:04):
It raises these ethical questions we've just not had discussions
about yet.
Speaker 2 (17:09):
I mean, this is all virgin territory, and
Speaker 4 (17:12):
Isn't the ethics kind of the hardest part of this
whole thing?
Speaker 2 (17:18):
Well, it is. Are there regulations that should be
in place?
Speaker 1 (17:23):
I think so, but we don't know enough about these
things yet to know what the regulations should be.
Speaker 3 (17:29):
Well, a lot of my texters are giving you some
variation of this: has anyone ever seen the movie Terminator,
Mandy? I've seen I, Robot and Ex Machina. I'm
good on the robots. But ultimately, you know, we've been
scared of things in the past. We used to think
lightning was the gods being angry, right? So as
(17:50):
we, as our knowledge has evolved, as we've
traveled in space, we're less likely to believe that,
you know, the kind of aliens that are
going to come down and kill us like in Independence Day.
So I think as this stuff starts to roll out,
because it starts with a Roomba, it starts
Speaker 4 (18:06):
with a robotic lawn mower. One of my neighbors has one
of these.
Speaker 3 (18:10):
It doesn't start with fully humanoid robots living in our
houses full time with skin that was grown in a lab.
That's not how it gets started. So perhaps that will
make the adoption a little bit easier. But somebody did
ask this, Thomas, and I think this is an interesting question.
They said, what happens when the government hacks my robot
(18:31):
and can see inside my house? Could you, in theory,
tell your robot: if anyone tries to change your programming
or program you to spy on me, I need you
to let me know right away? Would that work, with
something that simple, in something that is constructed to work
within a framework, an ethical framework that has to be created?
Speaker 4 (18:54):
With something like that.
Speaker 3 (18:55):
works? Say, look, if anybody's trying to spy on me,
I need you to tell me.
Speaker 1 (19:01):
I think one of the core pieces of every one
of these robots is they need to be the guardian
of your privacy. They need to be your protector, that
they're watching out for you more than anybody else's interests.
Speaker 2 (19:18):
And so.
Speaker 1 (19:20):
If we don't have that, then I don't think
they're going to sell many of these robots.
But if the robot is actually working for you, if
it's doing everything you want it to do, then you can
rely on it. It's trustworthy, something that you can count
on to help you every day. Then it becomes super
(19:41):
valuable, and it
Speaker 3 (19:42):
Would become I would think I would envision this as
being robots becoming a part of the family, at least
in the United States of America. In some other cultures
where they still have servants and things of that nature,
perhaps they would be treated like a servant. But I
could see Americans bringing the the robots in and having
them become an extended part of their family.
Speaker 2 (20:03):
Don't you think? Yeah, these robots should be your buddy bot.
They should be your best friend, the friend you always
wish you had.
Speaker 1 (20:15):
Something that can actually finish your sentences for you and
knows what you're going to be saying next, that you
have all these good conversations with. I think that's the
kind of robot people want, and I think the people
that are designing robots are going to go down that
path and try to figure out: oh yeah, this is the
(20:35):
narrow spectrum of things that everybody wants.
Speaker 2 (20:38):
Let's try to include all of that into this robot.
Speaker 3 (20:41):
Thomas Frey, our futurist. You can find him at futurist
speaker dot com. Thomas, great conversation, and now I'll lie awake
at night waiting for the robot uprising. Thanks so much
for that.
Speaker 4 (20:55):
All right, all right, Thomas. Have a good day.