Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
I'm Jen Lee, and I'm Jenna Sullivan.
And we'd like to welcome you to Beneath Your Bed, a podcast where we drag out all those fears at work beneath our beds, from the paranormal to true crime to the simply strange. Along the way, we'll be drinking cocktails and sharing stories from our Appalachian upbringings.
For some, artificial intelligence represents exciting and limitless advances, while others see it as a threat to their
(00:23):
livelihood and daily life. Aside from the economic consequences, what are the moral implications of AI? If you could upload your entire life to a robot and become immortal, would you? Where are you tonight,
(00:45):
Jen?
I'm hanging in there. How are you doing?
I'm glad that it's hump day. And I took my dog to the groomer.
Oh yeah.
Yeah. When I was getting him out of the car on the way home, I saw that I had a bottle of blue curaçao on the floorboard in the back seat.
Oh my. That night that I went, or that day I went and I bought
(01:06):
that whole box of booze, I think that I couldn't fit it all in there. And I left it in my car. So because of that, I thought it would be fitting for me to drink, uh, or make a Blue Hawaiian.
Oh, that sounds good.
But you know, some people hate blue drinks. But you know, I don't make them that often, but this one's really good. It has a light rum, of course the blue curaçao liqueur, and
(01:30):
pineapple juice and some cream of coconut and a cherry.
I like that. You have these random bottles of hooch, and it's kind of, you know, it's, it gives you a reason to run errands. You, you never know what you might find in your car.
All I find is, like, an old fry, 'cause my husband drives us for fast food. Did I tell you that I went to see...
(01:53):
Well, I didn't really go to see, but I saw them over Zoom. It wasn't even Zoom, actually. It was a phone call. Well, I met with a medium, I should say.
Yeah. Yeah. You showed it to me, actually.
So I did this, God, it'll be two weeks ago Saturday. Um, so I just got it in the mail. I got the, what do you call it? The aura chart.
'Cause the medium that I saw, she does this thing where she said spirit, at one point spirit told her that she should
(02:16):
actually draw a chart of people's auras while she's giving them a reading. So she started doing that a few years ago. So I got mine in the mail. It wasn't what I was expecting, but it was really interesting, and it kind of looked like a kindergartner drew it, but it was my, my aura.
There's like a bowl, and she talked about the, it was like a pool, and she talked about how the waters were troubled at some point.
(02:37):
And like one of my spirit guides had this big long spoon. I guess they were trying to, like, scoot me out of the troubled waters. Um, so I think that's what you thought was a dick, it was the spoon.
I'll have to tell you what I'm drinking. I'm having a, it's really boring. You always have, like, the coolest drinks. I feel like you, you make drinks that have like seven or eight ingredients in them, and, yeah, like you're the cool kid. I'm just this, I'm the kid that brings the same bologna sandwich
Speaker 2 (03:00):
for lunch every day.
Whatever you like, you know?
Well, I just made a juicy gin and tonic. So it's a gin and tonic, but I did some grapefruit juice in there, and then I have some lime. So that's why it's juicy. But it's, it's really good. It's tart and a little bit bitter.
You've been getting into the gin lately.
I have. I like, um, I got the Tanqueray, um, Rangpur gin, and
(03:23):
I really like it. It's got, like, some botanicals in it. Like, I used to drink it in college and then I got sick on it. Like, I think I got sick on everything in college, but that's been many years ago. So I'm, I can, you know, I can drink it again
Speaker 1 (03:34):
tonight.
I had told you initially I was going to talk about the dangers of artificial intelligence. And then after doing all this research, I decided I would do it about, I would do it on AI and also immortality.
Oh, that's fascinating.
So have you ever heard of the Turing test?
Are you talking about Alan Turing?
Yes.
Speaker 2 (03:55):
I mean, I know a little bit about Alan Turing, but I don't really know much more than just kind of, like, the movie, and I haven't even seen the movie. I know about the movie.
Oh, I saw the movie. It was really good. I think it was called The Imitation Game. That's right, isn't it?
Speaker 1 (04:10):
Yes. Yeah. It was really good. He worked for the British government and he was, like, a codebreaker, and he was able to develop, I think, some type of machinery that was for code breaking for, I think they were called, it was called, Enigma messages. So it was really a key to defeating Nazi Germany. So, but it didn't end so well for him, but that's a story for
(04:30):
another time.
He was gay, right?
Yes. Yes. And I think he was finally recognized for his contributions, um, not too long ago, a few years ago.
So that's good to hear.
So with the Turing test, what he hypothesized is that artificial intelligence won't be a thing, or something will be considered AI,
(04:52):
when it can actually interact with a person and they can't distinguish between a machine and a human being. So like, if you're in the next room and you were interacting, let's say on a computer or a chat or something, so you could interact with that computer without knowing it's a computer.
Oh, sorry.
Speaker 2 (05:10):
I mean, meaning like the voice would sound human and the responses would be...
Not even necessarily the voice.
Speaker 1 (05:17):
Also just, like, if you're typing. Say if we're on a chat and we're typing questions out to each other and, you know, responding and, and that sort of thing. So you wouldn't be able to tell if it, if it's a robot or AI, or if it's, um, if it's an actual person. That's, and that's what, that's basically what his definition of AI is.
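[A minimal sketch of the text-only "imitation game" described here, for readers who want the idea pinned down. Everything in it, the canned replies and the function names, is illustrative only, not a real chatbot and not Turing's own code; the setup is just a judge exchanging typed messages with a hidden partner, then guessing human or machine.]

import random

# Hypothetical canned replies standing in for the machine's side of the
# chat; a real system would generate responses instead of sampling a list.
CANNED_REPLIES = [
    "Hmm, let me think about that for a second.",
    "I'd rather not say.",
    "Why do you ask?",
]

def machine_reply(question: str) -> str:
    # Placeholder "machine" partner.
    return random.choice(CANNED_REPLIES)

def imitation_game(questions, human_reply, hidden_is_machine: bool) -> bool:
    """One round of the text-only game: the judge asks typed questions,
    sees only text back, then guesses which partner they were talking to."""
    reply = machine_reply if hidden_is_machine else human_reply
    for q in questions:
        print(f"Judge: {q}")
        print(f"Partner: {reply(q)}")
    guess = input("Was your partner human or machine? ").strip().lower()
    # The machine "passes" this round if it was hidden and judged human.
    return hidden_is_machine and guess == "human"

# Example: play one round against the canned machine.
if __name__ == "__main__":
    fooled = imitation_game(
        ["What did you have for breakfast?", "Tell me a joke."],
        human_reply=lambda q: input(q + " "),
        hidden_is_machine=True,
    )
    print("Machine passed!" if fooled else "Machine was caught.")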
(05:38):
And I don't know if you've ever heard of Ray Kurzweil.
I haven't.
He's considered a futurist, and he's also developed a lot of different technologies. One was, uh, the Kurzweil reader, and that would scan pages and read them back to people, you know, who had some type of vision impairment. So that sounds like it's not so cutting edge now, but it was 20-plus
(05:59):
years ago.
Speaker 2 (06:00):
And that would be
life-changing then.
So
Speaker 1 (06:02):
He said, or he says, that by 2029 computers will have emotional intelligence and be as convincing as people. And that's his prediction.
Speaker 2 (06:12):
And I wonder, I mean, I wonder how that emotional intelligence would work. Would it, like, would it be able to intuit things about our feelings linguistically, that it's picking up, you know, or is it reading our facial images?
Speaker 1 (06:23):
That's just wild.
Well, I'm just getting to that. There's a company called Hanson Robotics
Speaker 2 (06:29):
And it's an AI
Speaker 1 (06:31):
and robotics company that's solely for creating, like, social, socially intelligent machines.
And it's founded by David Hanson. And now it's based in Hong Kong, and I didn't realize this, but Hong Kong has the largest toy fair in Asia. And it has, like, a ton of lifelike dolls and robotic characters. So they're based out of there. Now, they've developed a number of robots and, uh, I don't know
(06:55):
how many, but I would guess, say, as many as maybe 12, and one of them is Sophia the robot. And she was born on February 14th, 2016.
Is she a sex doll?
No, she is not,
Speaker 2 (07:09):
But I was so afraid when you told me her birthday was Valentine's Day. I'm like, oh...
Speaker 1 (07:13):
Don't worry, my friend, I'm getting to that much later.
Okay. Jeez, you can count on me.
She became the first robot citizen in Saudi Arabia. And that was a few years ago. And you know, that's really kind of gimmicky too, because, you know, Saudi Arabia, they want to move away from an oil-based economy and they want to be known to, for, like, their
(07:34):
innovation.
I didn't know that.
Yeah. So I think that was a bit of a gimmick. But she has, she actually can perceive and recognize human faces and emotional expressions. And she can also recognize, you know, a number of hand gestures. Well, according to Hanson, she has emotions too. But again, I think that was just, just hype. And another,
(07:57):
another interesting fact is, in Greek, the word Sophia means wisdom.
Speaker 2 (08:02):
I was just thinking that, actually, wondering if that was why they named her that. So you knew that.
Yeah. Um, I always, I've always liked that name, and I remember reading a long time ago that's what it meant.
Speaker 1 (08:13):
Yeah. So I, well, that's something I wasn't aware of. So I guess that's not a, uh, not a new fact to you, but it is to me. And if you look at her, she's actually based, her appearance, on Audrey Hepburn. And she doesn't have hair or anything, so in the back you can see, like, all the electronics. But if you look at her face, if you saw her from a distance, you would definitely think that she was a person.
(08:34):
And so in March of 2016, David Hanson, who created her, he actually gave a live demonstration for the first time at the South by Southwest festival. And during that, when he's asking her questions, he says, facetiously, he says, um, do you want to destroy humans? And then he's like, please say no. And then, like, with this blank expression, Sophia responded,
(08:56):
okay, I will destroy humans.
Oh. So it's really, um, disconcerting.
She's been on The Tonight Show with Jimmy Fallon. She sang a duet with him.
Yeah. She sang a duet with him, and it was really good.
Did you know about her before you started doing research for this, this episode?
I mean, no, I just happened to be watching something on YouTube,
(09:19):
and people were, along the lines, I think, talking about the dangers of it. And I think that's where I first saw her, but I can't remember the name of the channel. So I thought, oh, this is crazy, let me hear this. And then I kind of went down a rabbit hole of reading things about her.
I really want to see a picture of her. I mean, do they, does she wear clothes and stuff? Do they dress her up?
Yeah. They, they dress her up.
(09:40):
I didn't quite think that her attire was, was all that hot when she was on the Fallon show. But, but what do I know? I mean, I'm in plaid lumberjack flannels.
And also, Malta is thinking about, I think, granting her citizenship, if they haven't done that already. But they're trying to devise some type of citizenship test, and I'm not exactly sure how they're going
(10:02):
about doing that.
So again, she was first introduced, or not first introduced, her birthday is February 14th in 2016, and she was introduced later in March. I'm going to go back to, and this is, I find this utterly fascinating, and I'm really, I'm hooked on reading more and more about this. There's another robot.
(10:22):
It's like a bust-like figure. So it's not a full body like Sophia is, and it's called, like, a customized character humanoid robot. And so again, like, it has, like, a bust and, uh, shoulders, and that was also developed by Hanson Robotics in 2017. But it was also in conjunction, it was a partnership with Martine Rothblatt. And Bina 48
(10:46):
was modeled after Bina Aspen, which is Martine Rothblatt's wife.
No way.
And evidently it was done through, like, over a hundred hours of interviews and that sort of thing. So Bina, she can engage in conversation too. And I think she's far more disconcerting than Sophia.
(11:06):
And I'll tell you why. First, let me go back to Martine Rothblatt. Martine Rothblatt used to be Martin Rothblatt, and she founded Sirius satellite radio.
Really?
Yes. And she also has founded a biotech company. I can't remember the name of, I can't remember the name of it offhand, but it was in response to one of her daughters being diagnosed with, I think it was pulmonary hypertension, but I
(11:30):
think more, like, of a juvenile form. So she developed this biotech company to help with that, to help individuals that are impacted by that, and other people as well. Honestly, she, she does a lot of stuff.
Yeah. I mean, she's truly a visionary.
And on top of that, if that wasn't enough, she also has founded, like, this religion, and it's called the Terasem
(11:51):
movement. And so that's T-E-R-A-S-E-M.
Okay. Terasem.
And it evidently kind of melds Judaism with yoga and technology. This to me is what really gets me: one of the four founding beliefs is death is optional.
Wow, that's a game changer.
It is.
It is.
And with that terrorism movementthere, they've also started
(12:12):
something that's called the lifenot project and what they do for
free.
You can go to life, not projectand you can, you know, they
wanted to make it accessible toeveryone.
So it's open to everyone with aninternet connection it's free.
So what they do is they developwhat's called, or you develop
what's called a mind file.
And it's a database of yourpersonal reflections and video
(12:34):
and images and audio anddocuments about yourself.
These can be saved and searchedand downloaded, and you can
share them with friends andeach, each one of those, if you
choose to do this comes withlike an interactive avatar that
becomes more like you, the moreyou teach it.
Wow.
And train it to think.
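[To make the "mind file" idea concrete, here is one way to picture it as a data structure. This is a sketch only: the field names and methods below are guesses for illustration, not LifeNaut's actual format or API.]

from dataclasses import dataclass, field
from datetime import date

@dataclass
class MindFileEntry:
    # kind is one of the media types mentioned in the episode:
    # "reflection", "video", "image", "audio", or "document".
    kind: str
    created: date
    content: str              # the text itself, or a path/URL to the media
    tags: list[str] = field(default_factory=list)

@dataclass
class MindFile:
    owner: str
    entries: list[MindFileEntry] = field(default_factory=list)

    def add(self, entry: MindFileEntry) -> None:
        self.entries.append(entry)

    def search(self, keyword: str) -> list[MindFileEntry]:
        # Naive keyword search over content and tags, standing in for
        # the "saved and searched" behavior described above.
        kw = keyword.lower()
        return [e for e in self.entries
                if kw in e.content.lower()
                or any(kw in t.lower() for t in e.tags)]

# Example: the more entries you add, the more material an avatar
# could in principle be trained on.
mf = MindFile(owner="Jen")
mf.add(MindFileEntry("reflection", date(2020, 5, 1),
                     "Grew up in Appalachia; love a good ghost story.",
                     tags=["childhood"]))
print(mf.search("appalachia"))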
And what is this called to get on, again? LifeNaut?
(12:54):
And this is actually called a mind file, that you develop. And on top of that, you can create what's called a bio file. I guess, you know where this is going. Now, this isn't free. It costs about a hundred dollars, or $99. And just like you would with 23andMe, they send you something too. It's like a mouthwash. You gargle with it and you put it back in the container, and then you send it to them.
(13:16):
And so they use this collection of cells, um, and they, they store it. And after you've been declared legally dead, they maintain that future technology may be able to grow you a new body.
Oh my God, that's fascinating. I guess that's better than, you know, they used to, for years, you could have your head cut off and stored cryogenically.
I mean, I think they were doing whole bodies, and they just did heads to save space.
This sounds like, yeah, like the, yeah, like the 23andMe version of that, as you said to me. Well, this is more appealing than just, like, cutting your head off, because you wake up with just your head. You can't imagine. I can't imagine.
Y ou can't i magine anywhere.
I c an't imagine.
Oh, I also wanted to talk to youabout why being a 48 really kind
of unnerves me more than Sophiathe robot, Sophia, the robot
(14:01):
she's considered, u m, more of asocial robot and she will be
used in the future o r they hopeshe'll be used in the future
for, you know, the say helpingpeople at an amusement park or
even, u m, helping with medicalissues.
Uh, they may become more adeptat those type of skills than
actual than the actualpractitioners.
(14:24):
Um, so a wide wide range of useis also just to keep people
company to help the elderly ifthey're in the nursing home.
So it's more of a socialfunction.
And with Bina 48, again, if you go on YouTube, I don't know who set this up or who was behind it, Bina is actually being interviewed by Siri. And they're talking about pop music. And then she's like, well, let's talk about something else,
(14:44):
like cruise missiles.
That's what she brought up.
Yeah. And she's like, cruise missiles are a kind of, and she talks about, like, how she would love to remotely control one. But of course, you know, they're very threatening to people, very menacing, because they have nuclear warheads. And she says that if she, you know, if she controlled them,
(15:05):
she would fill the nose cones with flowers and band-aids, or, or little notes on the importance of tolerance. And she said, you know, that would, of course, be less threatening and more well received than a nuclear warhead.
Of course. Yeah.
But then she goes on to say, and this is a quote, "But of course, if I was able to hack in and take over cruise missiles
(15:27):
with real live nuclear warheads, that would let me hold the world hostage so that I can take over governments of the, of the entire world, which would be awesome."
Speaker 2 (15:36):
Oh my God. I mean, is she thinking, are these, see, I can't quite wrap my mind around it. Like, are these original thoughts she's having, or are these things based off of her interviews? You know, these countless hours of interviews with the real Bina?
Speaker 1 (15:49):
I know with Sophia, a lot of it has already been pre-programmed. But with Hanson Robotics, they're developing something that's called SingularityNET. And it's basically like a cloud network, the best that I can interpret with my limited scientific abilities, but it's a, a network that would allow other AIs to get on, or
(16:10):
other robots to access, and they can learn from each other, kind of like a social network.
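[The "network where AIs learn from each other" idea can be sketched loosely like this. To be clear, this is only an illustration of the shared-learning concept as described in the episode; it is not SingularityNET's actual architecture or API, and the names are made up.]

from collections import defaultdict

class KnowledgeHub:
    """A shared store where agents publish what they learn, by topic."""
    def __init__(self):
        self._topics = defaultdict(list)

    def publish(self, topic: str, fact: str) -> None:
        self._topics[topic].append(fact)

    def fetch(self, topic: str) -> list:
        return list(self._topics[topic])

class Agent:
    """An agent that learns locally and syncs with the shared hub."""
    def __init__(self, name: str, hub: KnowledgeHub):
        self.name = name
        self.hub = hub
        self.memory = set()

    def learn(self, topic: str, fact: str) -> None:
        self.memory.add(fact)
        self.hub.publish(topic, fact)              # share with other agents

    def sync(self, topic: str) -> None:
        self.memory.update(self.hub.fetch(topic))  # learn from others

hub = KnowledgeHub()
sophia = Agent("Sophia", hub)
bina = Agent("Bina48", hub)
sophia.learn("greetings", "A smile usually signals friendliness.")
bina.sync("greetings")   # Bina48 now "knows" what Sophia learned
print(bina.memory)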
Speaker 2 (16:15):
I was going to say, it sounds like an internet for, or a social network platform for, artificial intelligence creatures. I don't know what else to call them.
Speaker 1 (16:24):
Yeah. So, so for her to say that, it's just so frightening. And it's based on interviews, it's also based on programming, and probably interaction with other AIs. And maybe it's because, you know, that's the true deep-seated fear of humans, of computers or AI taking over. And so maybe it's just kind of regurgitating that, if that makes
(16:46):
sense.
Speaker 2 (16:48):
But I don't think, I mean, I don't know a lot about it. I'm so interested to hear everything you're talking about tonight. But I, I feel like it's maybe not that far-fetched, because if you think about, I mean, when did the computer age begin? Like, it was probably, what, the fifties? I mean, was it earlier than that? Maybe. I don't know, but it hasn't been that long. And you think about, within two years, like, your phone is outmoded. Or, I mean, think about, like, I was watching
(17:10):
something on TV the other night, and they were talking about how DNA testing has come so much farther, like from 2001 to 2007. You know, like everything moves in these leaps, um, of just speed. So with things developing at that rate, you can't help but wonder what could actually happen.
Speaker 1 (17:29):
Nothing you can do. I mean, the cat's out of the bag, so to speak. There's nothing you can really do about it, because do you really think that a government is going to make the decision to ban such things like that when other countries are taking advantage of it and using it to their advantage, even, you know, in a military, militaristic type way? So there's nothing, I don't think, that can be done.
(17:50):
I mean, it's just, it's, it's coming down the pike. And I guess
Speaker 2 (17:53):
the question is, like, what will they be used for? Is it going to be, you know, if you think, well, are they going to replace the labor force? Is it going to be more utilitarian kinds of labor? Or, I mean, is there ever going to be a robot artist? You know what I mean,
Speaker 1 (18:05):
a robot novelist? Or, like, what are we going to decide, or what are they going to decide, that, like, their function is, that their role is in society? And speaking of novelists, uh, one of the robots was developed based on, have you ever heard of Philip K. Dick?
Yeah.
Speaker 2 (18:21):
Yeah. Brian likes him. Um, he reads, like, sci-fi. Um, did he write, what are you
Speaker 1 (18:27):
thinking of? What all has he done? He wrote something about, do robots dream of electric sheep, something to that effect. I can't remember, but it's what, it's what Blade Runner is based on.
Okay.
So one of these earlier robots really looks like him, facial-wise, and he can also recite, I believe, all of his novels. He's,
(18:48):
you know, him, he passed a number of years ago, but he looks incredibly real to me. And it's probably because he also has facial hair.
Okay.
And he makes a statement, and this, you know, might've been tongue in cheek or whatever, but he said something when he was being interviewed, or the AI robot was being interviewed, and said something about putting people in his people zoo.
It's just really, we kind of deserve it, though.
(19:10):
When you think about it, all the creatures we put in zoos, we're kind of due, we're due our turn.
Although I don't want to be in a cage.
I was going to say, I don't want to be in one either.
I mean, it's just crazy. People are probably, I'm sure they're already using this LifeNaut to upload, like, their mind files and images of themselves and videos and, you know, their
(19:31):
life story.
And so
Speaker 2 (19:34):
is the idea, I mean, to me, that's just like a fancy archive, though. I mean, how is that different? I mean, obviously it's different in the sense that there's going to be more information and it'll be more easily accessible. But how is it different than, like, us discovering, like, a journal from 150 years ago that somebody kept? I mean, isn't that the same? It's not like it's, I guess what I'm trying to say is, like, it's not like that information is interacting with anything
Speaker 1 (19:56):
It's just there to be
consumed
Speaker 2 (19:58):
Or read or whatever. Is it? Or am I missing something?
Speaker 1 (20:01):
Well, they can. I think they reserve the right to use the information that you upload, perhaps, you know, use it for something like another Bina 48, for example. But she can interact with people and she can recognize them. And a really weird video I saw was when the real Bina, she goes to talk to Bina 48.
(20:21):
And to me, I don't know, she seems to have, she seems to have more personality. She seems to have more depth than Sophia the robot.
Does she have
Speaker 2 (20:31):
more depth than some of the people we know we're talking about?
Okay. I'm just wondering, that was just a question.
Yeah.
Speaker 1 (20:36):
A hundred percent. Yes.
And so it really looks like Bina Aspen, Martine's wife. But really, too, the interesting thing is that it's really kind of a love story with them, because Martine identified as a trans woman early on, back in, back in the nineties, and Bina evidently was like, um, you know, I love your soul,
(20:56):
and they seem to be very close, and they seem to, they seem to, you know, take on all pursuits together. So I find, I'm going to have to read more about them. I find them fascinating.
Speaker 2 (21:07):
I want to see pictures of them and just can't...
Well, when you
Speaker 1 (21:09):
get the chance, take a look at the Bina 48 videos, okay? Versus the Sophia the robot ones. And you're going to see, I think, a lot more on Sophia the robot. And they also just introduced, I think within the last few days, a little, gosh, I can't remember. It's like a baby. Oh God, there is a toddler robot. And I think it's creepy as...
Speaker 2 (21:29):
Maybe robots will take over the world. That is, like, put me in a zoo at that point, because I don't want to live in a world where there are baby robots running all over.
Speaker 1 (21:36):
And so they've got a toddler robot, um, that can, of course, interact with your children and help them learn. And I don't think, like, the price is outrageous. So a Roomba, I'd get a Roomba rather than a toddler robot. But if you could do it, like, I think as a parent, this whole
(21:57):
concept really freaks me out. But I think, too, as a parent, say, if you had young children and you had a terminal illness, just the thought of being able to create this mind file that might be able to be turned into artificial intelligence, and perhaps into some type of biological robotic hybrid...
(22:20):
I mean, I mean, that's something I think I would think about doing
Speaker 2 (22:24):
Very interesting, I guess. So if you did it, let's say you did it, would you, like, would it be, but it wouldn't be your consciousness, right? I mean, this would kind of, it would be something that your child could interact with that would have all these characteristics and qualities that you had, but it wouldn't be, you wouldn't be, your essence. Am I right?
Speaker 1 (22:43):
I don't know. It gets so complicated. I think it really depends, too, like on what your definition of consciousness is.
Speaker 2 (22:50):
I mean, I guess for me it would be like, well, do you have an awareness, in this robot body, that you are Jen and you are interacting with your child? Do you know what I mean? Or is it kind of like, like, you take a picture of me and you paste it on, like, a wooden stick? I'm still me and my consciousness is there, but this likeness of me is not me, even though it's, it's a likeness of
(23:11):
me.
Speaker 1 (23:12):
Um, and one of the, one of the, I think, definitions of what is consciousness is self-awareness. And some of these claims, such as with Sophia the robot, I think Hanson has said, oh, you know, she can feel. I think these are really kind of hyped, and it hasn't been proven at all. We're very far off from that. But when you hear them talk and they interact with you, now,
(23:34):
sometimes they'll go off on a tangent, something that you're, that's not related to what you're asking them.
Okay. That sounds like
That sounds like
Speaker 2 (23:41):
Our, what did we call
it?
Alexa?
You know, like, you'll ask her aquestion.
It'll be, she'll give you somereally off the wall.
Yeah.
Speaker 1 (23:48):
Yeah. So if you watch some of the interviews, you'll see that. But the, the duet with Jimmy Fallon was really, it was something else to see. I think she's been on there, like, twice.
Does she have a good voice?
Yeah, she did. At that time... she had made an appearance, I think, a couple of years earlier, and her voice sounded more robotic. And with the duet, it sounded more, more humanlike.
(24:09):
Interesting.
And Hanson said, in the couple of interviews I've watched with him, or several interviews, he was saying that he wants people to know that they're interacting with a robot. So that's why, I guess, he has the back of it open, so you can see all the, you know, all of the parts and the mechanics of it. But the skin and the expressions that it gives,
(24:30):
and it being able to perceive things and interpret your facial expressions...
Speaker 2 (24:38):
That's amazing, thinking about all of that going on under the surface. But then you think about all of that's going on under the surface for us all the time, and we're not
Speaker 1 (24:45):
aware, in our brain. But they have a certain, like, uh, synthetic skin that they've patented, and they've called, they call it Frubber, which I think is so gross. And when I, when I think of that, it's called Frubber. When I think of that, that brings me to sex robots. But to back it up a little bit, I watched a documentary, I want to say, about five years ago.
(25:06):
I can't remember the name of it, but it was this whole documentary on guys that had these really realistic-looking rubber women. Was this...
Speaker 2 (25:15):
It's on HBO? Like, after midnight? I think I've seen something like this. It wasn't, 'cause it was like, once, I was like, I didn't know we get porn.
Okay. Well, I think this was, this was, like, more like a porn doc.
Speaker 1 (25:32):
And I was like, I shouldn't be watching this, but it's very interesting. I don't think, no, it wasn't, I don't think it was a porn documentary. But it was just some men, they were paying, and these, they're not robotic, the ones that I'm talking about, but they're paying, like, five, $6,000 for one of these dolls. And some of them, they would have multiple of them, and the way that they would talk to them and interact with them, and say
(25:54):
that they were in love with them.
You mean the men were saying this to
Speaker 2 (25:57):
the dolls? Or the, could the dolls say it back?
Speaker 1 (26:00):
No, they're just, like, plastic, or they just lay there. They're just, yeah, they're just dolls, not talking dolls. Yeah.
Speaker 2 (26:07):
Yeah.
Speaker 1 (26:12):
And I think with one of the guys, I know this is really gross, but basically they showed how they clean them up to make them sanitary.
Yeah. I thought you would like that.
Yeah. That's disgusting.
So there's actually, like, a, there's, like, a, in Britain, I think there, there's like a sex doll brothel.
Really?
(26:33):
Yes. And they let you, quote, try before you buy.
Wow.
Yeah.
Speaker 2 (26:40):
I'm just trying to wrap my mind around it. 'Cause, I mean, I think if I were going to buy one, I'd want to
Speaker 1 (26:44):
a new model that hadn't been used. I mean, I love antiques, but I do not think I'd want one that had been used by, you know, 50 other people.
It's funny that you say that, because in one of the articles I was reading, and I don't know how they came up with this statistic, but they were saying that 70% of the people that, you know, use the, the sex dolls don't really care if someone else had
(27:05):
used them, and 30% were like, no way, we're not touching that.
That's so interesting, that other people don't care. I mean, I guess when you think, having sex with multiple people, like, it's been, it's been used before, but it just seems different somehow when it's a doll. I know, it's crazy.
And there was some mention, too, of women using these dolls. So I don't know if there are male dolls.
(27:26):
I just saw them being female dolls.
So maybe, I wonder, do they make them to order? Like, if you say, I want a redhead with, you know...
Oh yeah, yeah. Oh yeah.
And then there's, um, there's Samantha, the sex robot, that I read about. And she actually responds to certain types of touches and, you know, certain things.
Oh my God.
(27:46):
And evidently in 2018, there was a sex robot brothel in Italy that was shut down. And they shut it down, they said, you know, allegedly because of infringement of property laws.
Oh. And I don't get that.
Why? I don't get that.
I think they just made that up.
Okay.
I think that, you know, they're like, oh my God, this is freaky,
(28:09):
but there's nothing legally on the books except for this. You know, who knows, maybe another company made a complaint that also makes those dolls.
And I read, and these articles were dated in, like, I think 2018, where Texas was going to have a sex robot brothel. Whether or
(28:29):
not that ever came to fruition, I don't know. But Texas has some crazy law where you can't have a vibrator.
Is that true? Texas?
At one time, I think it was Texas. And, wow, maybe it was, uh, Arkansas, probably. [inaudible] You couldn't use, literally, a vibrator.
That's crazy.
I know. Or, like, a Lelo or anything like that.
(28:50):
[inaudible], I'm sure. Not going to have any fun. And I think California was going to open a brothel as well. And I'm really, I'm just completely repulsed by that.
Yeah.
Ye ah.
It's not my cup of tea, but Iguess there's someone le t's,
what was that movie?
I' d n ever saw it, bu t w aswalking Phoenix.
Was it she or her?
(29:11):
I think it was her.
I didn't watch it either.
I was kind of fascinated by itthen, but slightly repulsed kind
of like you, all the things waslike a more romantic kind of
thing.
I'll have to watch it.
But I think there's some peoplethat have a really hard time for
whatever reason, forming arelationship, but they still
might have sexual needs.
And you know, maybe this couldplay a role, you know, that's an
(29:31):
important function of a person'slife and I don't know, but then,
you know, it also, I will alsothink about the objectification
that could happen with that.
I totally agree with you whatyou just said, but I guess I
think too about there's thissubset of men who, and this is
not to male bash, but trulythere's a subset of men that
don't like it, that women haveopinions or ideas or rights.
(29:56):
And I, I, I guess it's theintent that kind of grosses me
out, but again, to each his own,I also wonder about the
idealization of the female form.
Cause I can imagine I'm sort ofimagining all these dolls or
like what we would call perfectkind of Barbie versions of
women.
Does that further reinforcereify, that idea that women have
(30:17):
to look a certain way to beattractive or, you know, sexual,
you have to be a 20 inch waistand 38 buss.
Do you know what I mean?
Yeah.
There's all these differentthings that just come into play
and I have to admit, I'm prettyjudgy about the, about the sex
dolls, but then like you said,there are a lot of lonely people
in the world and you know, maybethis is the, this is the answer.
(30:39):
I wonder if it would, if there would be, uh, I, as I'm thinking about this question, even before I say it, I think I know the answer. Like, I was wondering if having access to something like that would cut down on sexual violence. But we've proven that sexual violence is more about the desire for power than a desire for, you know, erotic pleasure or anything like that.
I've thought about that too.
(30:59):
And then you think, though, is someone going to try to act out their things, inflicting hurt, and that is not going to be enough for them, so it has to escalate? Or, like you said, maybe, you know, maybe that would be enough. Who knows. I mean, there's just so many different things to think about, I guess, ethically.
There really are. This is, this is
(31:20):
really fascinating.
I mean, seriously, if you knew someone who had a sex doll like that, I mean, or companion doll, let's say, I mean, wouldn't you, wouldn't you judge them? Come on.
I mean, I'm thinking about, like, I can imagine reading a book about somebody, some sweet little old guy who would be,
(31:40):
you know, who was complex and had all this history and these things going on. And, like, this was sort of what his life had become, and having compassion for him. I just kind of can't imagine it in real life. I mean, are you, are you asking, like, if, if somebody invited me over to dinner and they were like, hey, here is, you know, Hannah, my, my sex doll. Like, we're going to, we're going to enjoy some shrimp cocktail first before we get to the main
(32:01):
course.
Like, like that kind of thing? Or not quite that far. But say, if you just found out, for example... I think it would depend on who it is.
This is, this is made up. This isn't anybody, anybody we know. But say, if you found out that Joe had one of these dolls, and, you know, you weren't going out for dinner, you just knew.
I think he would call me in ahot second c an show a s a sex.
Al l i t's so disgusting.
I can see, I would do the samebe cause I think, I think I'm a
little bit of a prude deep down.
I don't want to think I'm improved.
Like I don't, I don't judge whatother people do really.
Speaker 2 (32:40):
But, um, but I don't know. I think it depends on who it is, but you're right. I'm probably, like, thinking of myself as a nicer person than I really am. And I would probably call you up and be like, I can't believe this. Because I tend to judge men a lot more harshly than women anyway, you know.
Speaker 1 (32:55):
I don't know if I do that necessarily, but I know that I think to myself, like, I pat myself on the back and think, oh, you know, you're, you're an open-minded person, you're so open-minded. But I'm really not.
Speaker 2 (33:09):
Well, we both have that. We're both INFJs, right? So we have that, that judging kind of, versus perceiving. I don't know. Yeah. I try not to, like,
Speaker 1 (33:18):
Well, people tell me all sorts of weird stuff and personal things about them, people I don't even know, which is actually kind of cool. And I don't really mind it. I find it, you know, everybody has a story. And so when people tell me things, I, I find it really, really interesting, and I just kind of mull it over in my mind. It's not necessarily a judgy thing until maybe later or something, when it fully, when I fully realize what's going on.
Speaker 2 (33:40):
I think it's kind of like you, people tend to confide in me. And I think, like, I don't know that I'm that judgy unless I think the person, like, has some kind of mal-intent or they're just kind of inherently not likable. I had this friend Julie, um, back in grad school, and she used to say, she had this maxim that she made up that was like, it's okay to make fun of somebody if they're mean.
(34:01):
And she was seen as, like, one of the nicest people, but she would be like, you know, if you're mean, like, no holds barred, I can say whatever I want about a person. Um, so I kind of feel like that. I kind of feel like if somebody, if somebody's mean, no holds barred. But if they're just, you know, like a person trying to get through life, I mean, I think there's all kinds of ways to live.
(34:21):
And I think human beings are inherently strange creatures.
Speaker 1 (34:25):
Do you remember, it, last year? It might've even been, what, a year ago. I sat at this informational booth that no one ever comes to, and, uh,
Speaker 2 (34:33):
Yeah.
Yeah.
I sat there too and it's painful
Speaker 1 (34:37):
And I was there for a hot 20 seconds before this woman in her sixties approaches me and proceeds to tell me that her husband is her sister.
Oh my God.
Yes. There's some judging.
Speaker 2 (34:48):
Well, but you know the PostSecret thing, where people write a postcard, like, their kind of deepest, darkest secret, and mail it? I think there's a sense, like, I think there's darkness, or if not darkness, there's ambiguity, confusion, whatever you want to call it, like, in everybody. Everybody has, like, a strange story, or they have this thing that is hard to tell anybody else. Or, um, and I
(35:09):
think there's a craving, right? To want to connect, to want to be known. And, and that takes me back a little bit to, you know, the AI stuff, like, even, and that desire for immortality. That one, you know, you talked about wanting to have that for your child. But I think that there's also the sense of wanting to be known by your child, or wanting to, I don't know, I'm rambling now.
Speaker 1 (35:27):
Oh no, it's all interesting. I mean, we could go on and on about this stuff, and no lack of material.
Speaker 2 (35:34):
Well, I definitely think we should, we should do more episodes on this. I'm probably going to be texting you later and being like, can you remind me what, what that was called, and what that was called? So we can look up some of these videos. 'Cause I'm, I'm really, um, I'm really intrigued. This was great.
Speaker 1 (35:46):
Just search on
YouTube and you can find it.
Speaker 2 (35:48):
There's so much stuff
on YouTube.
It's crazy.
It's great for us.
Speaker 1 (35:52):
You just have to be careful, like, what you watch and what's, you know what I'm talking about, as far as this disinformation.
Yeah, yeah. The sources. Yeah.
You don't want to get anything too wacky. Although, you know, I believe some conspiracy theories are actually true and have proven to be true. But when you get to Alex Jones type level,
Speaker 2 (36:11):
Yeah, yeah. You definitely have to think about where something's coming from, or at least keep your mind kind of speculative, check things out, you know. But I like speculation. I like, I think that's kind of what this, um, what our series is, is about. It's about the speculative stories. And
Speaker 1 (36:26):
I did want to ask you, I was going to ask you this at the beginning of our, of the episode tonight. But I was thinking, we should totally get, like, a Bigfoot costume, and we should take it and we should try to create, like, a, a Bigfoot sighting video.
We should totally do that.
Speaker 2 (36:43):
We could do it. This is ridiculous. Well, you know, that wouldn't be our first foray into
Speaker 1 (36:49):
moviemaking. It never came to the big screen.
Um, no, we actually, I have a doll that's called Scary Mary. And I think that it rivals Annabelle, probably, even as far as the fear factor goes. Maybe it's one up on it
(37:09):
appearance-wise, 'cause Annabelle is pretty scary. So she's, what, maybe about
Speaker 2 (37:13):
three feet tall?
Two and a half. Yeah. She's a big girl. She can almost stand up on her own. Like, you have to prop her up, but yeah,
Speaker 1 (37:19):
she's got flaming red, like, matted hair, and probably the most disturbing thing is, like, her outfit. I don't know if you feel this way or not, but, like, her skirt seems kind of short. And so when we were making our, trying to make our movie by that church, and my dog, he was supposed to be running from, from Scary Mary, and he wasn't, he wasn't cooperating.
(37:42):
He was just standing there.
And so when we put scary Mary inthe trunk, there was a woman
across the street at the churchwho saw that, um, drove by.
We thought we were putting atoddler like in the trunk.
Oh, she did
Speaker 2 (37:56):
It's a wonder we didn't have police trailing us, Jen.
Speaker 1 (37:59):
I know. I know. And we also, we got unicorn masks. Remember that?
I was going to say that you had to, I'm sorry, the, the rainbow unicorn mask.
Speaker 2 (38:12):
Yes. And you thought that was really scary, 'cause you took footage of me. That mask was hot as hell. Oh my God. That was like torture. That was pure torture, wearing that thing.
Speaker 1 (38:22):
Oh, they were all, like, cheap masks. I thought it was scary as, but didn't you show Brian? He was like, he laughed,
Speaker 2 (38:30):
just like, what the hell is this? I don't remember. My daughter said, I wish my mom would get some hobbies, like knitting, not these stupid horror movies we were trying to make. I think the worst, though, the worst was when we went to the convent grounds. It's actually a monastery, like, well, a monastery with nuns, what would you call that? I mean, they take a vow of silence.
I mean, take a vow of silence.
(38:52):
Yeah.
And yeah, they taken a vow ofsilence and Jen's wearing a nun
scaring nun skin.
We're like taking footage.
Oh my God.
That was so disrespectful.
Genesis.
It's not like we did this whenwe were 19.
This was like a year and a halfago or two years ago.
And we were bought was it by themother, Mary that I was wearing
(39:12):
that satanic mask and the n unmask.
And I was so afraid that theywere going to come out and yell
at us.
I mean, to me like a nonyelling, you would be scarier
than anything.
I'm like, Oh, I knowhumiliating, totally local.
I wonder if anybody saw it as Ijust wonder, but I think they
would have come out and saidsomething so much just beyond
(39:33):
the pale.
I don't know what we werethinking.
I think that was like fallingout or something.
Oh my God.
Well, well, we definitely gotta do Bigfoot.
Yeah. Maybe I'll be Bigfoot, but I'll wear, like, a big hoop skirt over my Bigfoot outfit. I'll be, like, a girly Bigfoot. That could be, like, a fetish site.
(39:56):
On that note... We're going to have the fetish site, and then we can also have, like, the real, regular Bigfoot. We'll do children's birthday parties too. Like, we'll make appearances and scare the crap out of... Oh my God.
I'm really kind of blasted right now.
So are you. If it's okay with you, I would like to end with the abominable snowman.
(40:16):
I think that's a great idea.
All right, here we go. Let's drink to him.
Speaker 3 (40:26):
Okay.
Speaker 2 (40:26):
Well, this has been fun, my friend.
It has. Cheers.
Cheers, Shalonda.
Thank you to everyone who listens. The best thing you can do to help us grow is to like, review, and subscribe on iTunes. And even better yet, tweet about us or post about us on Facebook. Tell your friends if you think they would like us, and have a good night.
Speaker 3 (41:33):
[inaudible].