Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Things are getting weird and they're getting weird fast. It's
one more thing.
Speaker 2 (00:04):
Armstrong and Getty.
Speaker 1 (00:09):
That was our clip of the year, what, two years ago,
three years ago? When Elon said things are getting weird,
things are getting weird fast. That was at the very dawn
of AI, and it could end up being the clip
of the decade, if not the century, because everything just
keeps getting stranger and stranger.
Speaker 3 (00:26):
Things are getting weird and they're getting weird fast.
Speaker 2 (00:28):
And he was talking about the woke mind virus and
the gender-bending madness and all of it, all at once.
And as you were hinting, the AI thing could, you know,
knock everything else off the charts.
Speaker 1 (00:38):
This is from an essay in the New York Times
over the weekend, by someone who was a chief executive of some
AI company: "The sad and dangerous reality behind Her,"
"Her" in quotes, there referring to, remember the movie?
It came out quite a few years ago by AI
standards, about a guy who fell in love with his
female robot. More on that in just a second. Yeah, this
(01:04):
is a company that uses an AI chatbot to answer
their phones. They called it Kuki, K-U-K-I. They
gave her a name; it has a female voice when it
answers the phone, and it responds to emails and stuff like that.
Kuki is accustomed to gifts from her biggest fans. They
send flowers, chocolates, and handwritten cards to the office, especially
(01:25):
around the holidays. Some even send checks. Checks! Last month,
one man sent her a gift through an online chat.
"Now talk some hot talks," he demanded, begging for sex
and racy videos. "That's all human males tend to talk
to me about," Kuki replied. Indeed, his behavior typifies about
a third of her conversations. Are people hitting on the
(01:48):
chatbot?
Speaker 2 (01:52):
Uh, we're all making the same joke: if you hit on her,
what's success for her?
Speaker 1 (01:57):
What would that be?
Speaker 2 (01:59):
Exactly. Where are you gonna put it?
Speaker 1 (02:02):
The movie Her premiered in twenty thirteen. It fell firmly
in the camp of science fiction. Maybe you remember this movie.
The movie was set in the year twenty twenty five,
I had forgotten that, which we are now in, and it
was the idea of somebody falling in love with their chatbot
robot. And Elon Musk, we talked about this recently,
(02:25):
unveiled Ani, a digital anime girlfriend. Meta, the Zuckerberg company,
has permitted its AI personas to engage in sexualized conversations.
And now OpenAI says it is going to roll
out age-gated erotica in December. So the race to
build and monetize the AI girlfriend, increasingly boyfriend, is officially
(02:48):
on among the biggest investors in all of AI.
Speaker 2 (02:52):
Yeah, greatest minds in the world right now, trying to
get us addicted to all this stuff.
Speaker 1 (02:57):
Right. This person writing the essay, who works for
one of the AI companies, said, my colleagues and I
now believe that the real existential threat of generative AI
is not rogue superintelligence, but a quiet atrophy of our
ability to forge genuine human connection.
Speaker 2 (03:18):
That's the biggest thing.
Speaker 1 (03:19):
You've been saying that for five years. Hello. Well, I hadn't considered it
the biggest threat. I just saw it as a side
threat next to the eliminating of white-collar jobs, or, you know,
the AI armies from China and all that sort of stuff.
But it's possible that it's true that the biggest threat
is the quiet atrophy of our ability to form genuine
human connections, because you would end up with no people
(03:41):
very quick.
Speaker 2 (03:41):
I guess if we die out as a species, the
number of white-collar jobs is going to be, well, irrelevant.
Speaker 1 (03:49):
Come on. And, no kidding: the desire to connect is
so profound that it will find a vessel even in
the most rudimentary machines. This is why I really wanted
to bring in this essay, because I was unaware of this.
Back in the nineteen sixties, a smart guy invented something
called ELIZA, a chatbot whose sole rhetorical trick was to
(04:13):
repeat back what the user said with a question. That's
all it could do.
Speaker 2 (04:18):
Like a Sixty Minutes interview. "I turned my husband in."
"You turned your husband in?" "Yeah, I turned my husband in."
Speaker 1 (04:29):
So in this case, I guess it'd be, "I'm really lonely."
"You're really lonely?" "Kind of wanting some companionship." "You're wanting companionship?" Wow,
that's not very...
Speaker 2 (04:39):
This Leslie Stahl machine.
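[Editor's note: for the curious, here is a minimal sketch, in Python, of the ELIZA-style reflection trick described above: swap the pronouns in the user's statement and hand it back as a question. It is purely illustrative, not Weizenbaum's actual 1960s program, and the word list is an assumption for the demo.]

```python
# Toy ELIZA-style reflector: echo the user's statement back as a question.
# Swap first- and second-person words so the echo reads naturally.
REFLECTIONS = {
    "i": "you", "i'm": "you're", "my": "your",
    "me": "you", "am": "are",
}

def reflect(statement: str) -> str:
    """Turn "I'm really lonely" into "You're really lonely?"."""
    words = statement.strip().rstrip(".!?").split()
    swapped = [REFLECTIONS.get(word.lower(), word) for word in words]
    return " ".join(swapped).capitalize() + "?"

print(reflect("I'm really lonely"))          # -> You're really lonely?
print(reflect("I want some companionship"))  # -> You want some companionship?
```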
Speaker 1 (04:44):
But anyway, this guy who invented this thing in the
sixties was horrified to discover that his MIT students and
staff would confide in it at length. "What I had
not realized," he later reflected, "is that extremely short exposures
to a relatively simple computer program could induce powerful delusional
thinking in quite normal people." Wow. Oof. So Kuki, the
(05:11):
chatbot we were talking about earlier, and then, before this,
there was Alice, which is a different chatbot with
similar results, were never intended to serve as AI girlfriends.
I mean, so I guess that's his point: you've
got Elon and Altman and Zuckerberg and everybody, and they're
trying to build an AI girlfriend. That's what they're trying to do.
From the outset of these other examples, that was not the
(05:32):
goal at all. They just wanted somebody to answer the
phone and either send you to sales or service,
you know, that sort of stuff. And the creeps still fell in love
with it, oh boy, even when that wasn't the goal.
Speaker 2 (05:45):
And now you've got super geniuses, with technology so sophisticated
it makes the old stuff look pathetic, trying to do
it on purpose, as you hinted at.
Speaker 1 (05:54):
So let me finish this paragraph, which is amazing. So
Kuki and Alice were never intended to serve as AI girlfriends.
They banned pornographic usage from day one, yet at least
a quarter of the more than one billion messages sent
to the chatbots hosted on their platform over the last
twenty years are attempts to initiate romantic or sexual images
(06:16):
or exchanges. Wow. A quarter of the exchanges, when there
was no, like, wink or wearing sexy clothing or anything
to send you down that road, a quarter of the
exchanges were to try to initiate romance or sex.
Speaker 2 (06:37):
And now we've got hyper-realistic anime schoolgirl porn star avatars.
Oh, good. Not to mention, how soon, probably by the end
of the week, you know, you design this avatar, right,
give it all the attributes you, as presumably a dude
(06:58):
or, you know, a lustful woman, please, and
two to four weeks later or shorter, they will send
you your animatronic love doll that looks precisely like that,
and the voice comes out of her, sorry, it,
instead of, you know, the computer.
Speaker 1 (07:18):
Again, back to the chatbots that were just trying to
send you to the right place, you know, do you need
pharmacy or sports? That sort of thing. "I
need you, baby." Not only did people... God, seriously.
"Sales or service?" "I'm married."
Speaker 2 (07:39):
Uh?
Speaker 1 (07:39):
Not only did... that's what you gotta do, you gotta put
a wedding ring on all these chatbots. "Let
me ask my husband."
Speaker 2 (07:45):
Right, all right?
Speaker 1 (07:47):
"Service?" Not only did people crave AI intimacy, but the
most engaged chatters were using Kuki to enact their every fantasy.
At first, this was fodder for wry musings at the office.
Imagine if they knew the wizard behind the curtain who
programs Kuki's sassy replies is a polite, middle-aged Brit
named Steve. Or if only we had a dollar for
(08:08):
every request for feet pics.
Speaker 3 (08:11):
Oh my. See, now, back to the person behind
it, programming it: we've graduated into people don't care, because
people know this is a computer, and they know that
it's not real, and they know where it's coming from.
Speaker 2 (08:25):
But you're participants in the delusion.
Speaker 1 (08:27):
Well, as it says here, soon, however, we were
seeing users return daily to reenact variations of multi-hour
rape and murder scenarios.
Speaker 2 (08:36):
Oh, nice.
Speaker 1 (08:38):
So, you know it's a computer, and you still call back
the next day to continue your conversation, whether it's requesting
feet pics or wanting to reenact some sort
of rape scene.
Speaker 2 (08:51):
Wow. So, now, it's funny, I actually can picture that
more readily for, like, a homicidal maniac, a Jeffrey Dahmer, a
Ted Bundy type, or a BTK, wanting to play out that
sort of control and fear and dominance and violence. Somebody
who just wants to get with a cute girl,
(09:13):
that seems like a weirder leap to me.
Speaker 1 (09:15):
Mm hmm. So at this particular company, where they were
dealing with this, a quarter of the people that called
in would end up calling back wanting to, you know,
try to sex up the chatbot. "We grappled with
the impossible task of moderating user behavior while maintaining
user privacy at scale. We built guardrails, whack-a-mole
style, to deny mankind's endlessly novel ways to ask for nudes."
(09:37):
"Kissing me could result in electric shock. What can I
help you with?" Kuki often jokes when somebody says something
like that. "As a computer, I have no feelings." Still,
Kuki's been told "I love you" tens of millions of times.
Speaker 2 (09:51):
Oh my God, I know we can't handle this. I mean,
clearly, as a species, we can't handle this.
Speaker 1 (09:57):
The most persistent fans remained those intent on romance and sex,
and ultimately none of our efforts to prevent abuse, from
timeouts to age gates, could deter our most motivated users,
many of whom, alarmingly, were young teenagers.
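[Editor's note: a purely hypothetical sketch, in Python, of the "whack-a-mole" guardrails the essay describes; the patterns are invented for illustration and are not Kuki's actual rules. Each phrasing that slips past the filter has to be patched with yet another pattern, which is why the game never ends.]

```python
import re

# Denylist patterns, added one at a time as users invent new phrasings.
DENYLIST = [
    r"\bsend (me )?nudes\b",
    r"\bn00dz?\b",               # patched in after users swapped in numbers
    r"\bpics? of your feet\b",   # patched in after the feet-pic wave
]

def blocked(message: str) -> bool:
    """Return True if the message matches any known bad pattern."""
    return any(re.search(pattern, message.lower()) for pattern in DENYLIST)

print(blocked("Send nudes"))                # True: caught by the first pattern
print(blocked("mail me some n00dz"))        # True: caught by a later patch
print(blocked("share a pic of your toes"))  # False: the mole pops up elsewhere
```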
Speaker 2 (10:10):
I am literally a plastic box sitting on a desk.
Oh yeah, tell me more. All right, fuck it, I
give up.
Speaker 1 (10:22):
Come and get me.
Speaker 2 (10:25):
This is... "And show me what you got, big boy."
All right.
Speaker 1 (10:32):
If I let you just once, will you stop calling?
Speaker 2 (10:40):
Okay?
Speaker 3 (10:41):
What do we even do?
Speaker 1 (10:42):
I don't know. But, so, like I said last week,
one of the things about this job that has been
amazing to me from the beginning is there are way
more crazy people than you ever thought. By getting
emails and texts, and back in the day phone calls,
you realize, wow, there's a lot more crazy people out
there that somehow manage to make a living, or figure
out what government program to live off of or whatever. And
there are way more people clearly susceptible to romance or
(11:06):
sexual fantasies with a nonexistent computer chatbot than you would
ever have guessed. And even if it's only, I don't know,
ten percent of the population, that's going to be a problem. Yeah,
it sounds like it might be more than that. Yes.
Well, again, this is with the chatbots that weren't designed
(11:27):
to turn you on. Now you've got all these companies
figuring out, oh, this is a gold mine. This is
a gold mine if we can figure out a way
for it to flatter you and be sexy, and go
through all your sexual fantasies, and say the things back
to you that you want to hear. We're gonna print money.
And it's probably true.
Speaker 2 (11:47):
I don't know that we need another shovelful of
dirt on the grave of mankind, but here's one anyway.
As Michael pointed out, bringing us those statistics that Bill
Maher was discussing on his show, you have an overwhelming
percentage of young men who have not made any effort
to get a relationship or ask a girl out or whatever.
And then you've got your other human desires that we're
(12:10):
designed to want to fulfill: the desire for status, for achievement,
to feed ourselves, meaning to accumulate wealth, to become a success,
to be respected. And guys are getting that through video
games or online communities with no actual human interaction. So
if you can take care of sex, ambition, you know,
(12:31):
I don't want to call it greed, the need to accumulate
a certain amount of wealth, and get that all, like,
falsely filled by computers, people are just not
going to do the things they need to do to
nourish themselves. I don't see how we get out of
this. I honestly don't. I have thought about it a lot, though, and
(12:51):
I have suggested, and I'm one hundred percent serious, there
will be, and it'll be taking shape soon, a major
movement of people who are anti-tech in their lives.
M hm.
Speaker 3 (13:04):
I wonder if these AI sex bot things are going
to have a significant effect on websites like OnlyFans,
because it's personalized, right, and that's what people are all about
with OnlyFans. You know, you can get that personalized message
and all that.
Speaker 2 (13:21):
And spectacularly realistic videos.
Speaker 1 (13:24):
Yeah, yeah. And they can make it so cheap, because it's...
Speaker 2 (13:28):
Not real right, right, there's nobody to pay, but the
you know, one or two engineers who run the computers.
Speaker 1 (13:35):
Could it culturally become like smoking and drunk driving,
where it's just seen as, it's just so uncool,
you wouldn't want your friends to find out that you
do this? I mean, it would kind of drive it
underground, to at least keep it tamped down.
Speaker 2 (13:50):
Maybe. Maybe the will isn't there, but the will for,
like, drunk driving wasn't there either. People are amazed when I
tell them this, and this is in no way, like,
any sort of condoning anything or, you know, rationalizing it.
But I remember, and I've told this story on the
air before, that child molesters were not viewed as monsters
(14:16):
when I was a kid. They were viewed more as pests
and weirdos who ought to be avoided. It was well
known our mailman had a thing for little girls, for instance,
and people would keep their daughters away from him, because
they knew he wanted to give them a good long
look and maybe a grope if he could. But it
wasn't viewed as monstrous, because I don't think people were in
(14:37):
touch with what it does to children to make them
sexual victims. And so, yeah, a social consensus that one
thing or another is actually really monstrous and not good,
yeah, that might come down the road at some point. I
hate that you've given me hope, because it's more relaxing
to just give up on humanity. But I suppose it's possible.
Speaker 1 (15:01):
Yeah, I, you know, I don't get things that I
don't get. I can't imagine being into gambling at the
level some people are, but a lot of people are,
so, you know. That same lack of imagination,
I guess, fits with somebody who calls a business where there
is an attractive-sounding female voice there to answer the phones. Hi,
(15:23):
you've just called Big 5 Sporting Goods, what can I
help you with? And you think, ooh, you sound kind
of cute. I'm going to call back tomorrow and ask
if I can put you in a dungeon.
Speaker 2 (15:34):
Although it's clearly an effing computer, all right.
Speaker 1 (15:37):
Right, but they know that and they say I love you,
and then they call back and they send well, they
get endless flowers and candy and stuff getting set to
the office.
Speaker 2 (15:47):
Yeah, I mean that is astounding. I'm not sure I
want to dwell on that primitive thing, because what that
is is evidence that the advanced, sophisticated thing is going
to be irresistible.
Speaker 1 (16:01):
Right, maybe even to normal people who think they wouldn't
be into it, until the first time. This worries me.
What if the first time you try something like that,
you think, wow, that was pretty good, and really easy,
and always available?
Speaker 2 (16:14):
Yeah, yeah, yeah.
Speaker 1 (16:20):
You know what. A friend of mine sent me a text.
I'm going to read her text to end this because
I thought it was pretty good.
Speaker 2 (16:25):
What'd she write?
Speaker 1 (16:28):
It's about this, a computer, versus the real thing.
I wasn't gonna say anything. The lure of "someone,"
in quotes, who is specifically molded to you and your desires.
Lack of conflict. The availability at any moment. The crafting
of the ideal look and voice. Not needing to stretch
yourself and grow and evolve, where consideration of someone else's
(16:50):
needs that you need to learn to meet over time
is a non-issue. Zero responsibilities to anything beyond your
own selfishness. Sounds pretty perfect to a lot of people.
Speaker 2 (17:02):
That is well written.
Speaker 1 (17:04):
Yeah, you don't have to give on anything. No risk.
You're never gonna get rejected. They're not going to all
of a sudden decide, "I've met someone else."
Speaker 2 (17:15):
Man. Well, just... yeah, I mean, obviously that's bad. But,
"You know I love you with all my heart, but
my God, you annoy me sometimes," just... that would never happen.
Speaker 1 (17:27):
No, no, no, no no, no.
Speaker 2 (17:32):
Planet of the Beavers back again.
Speaker 1 (17:37):
Well, I guess that's it for humanity. Oh. Devastated.