Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 2 (00:10):
Mister Mo Kelly. We're live on YouTube, we're live on Instagram,
we're live on the iHeartRadio app. The Sex Doctor is
in, and he will now see us. Sam Zia is
back to enlighten us, not necessarily to de-lay us,
but enlighten us and inform us. Sam, Doctor Sam.
Speaker 3 (00:25):
It is good to see you. How you feeling tonight.
I'm good. I had a good week. It's my birthday today,
so oh.
Speaker 4 (00:30):
Did you know that? Happy birthday?
Speaker 3 (00:32):
Man?
Speaker 4 (00:32):
How old are you, twenty five? I'm forty six.
Speaker 3 (00:35):
I'm an al Ryan But yeah, no, I'm really happy
to be here. Had a wonderful day and this is
a great way to cap it off. Let me ask
you this.
Speaker 2 (00:43):
There are all sorts of ways we think of
sex in a physical relationship. We think about the act,
maybe the emotional impact of the act. But as times change,
technology changes, the definition of sex has changed, the definition
of a relationship has changed. Now we throw in possibly
(01:04):
AI and all the rules and the definitions seemingly go
out the window, don't they.
Speaker 3 (01:09):
Yeah, yeah. Well, it's a sign of the times. Things change,
the ways people interact with each other change. Once the Internet
came in, it made it so that you had long
distance relationships with people through AOL or whatever
back in the day, and now it's really common to
see people meeting for the first time through online apps
and stuff like that. But I was just reading a
(01:31):
story, an article, and it was one
of those clickbaity ones. It said eighty percent of Gen
Z said that they would marry an AI. So immediately
I was like, okay, that's a little extreme. I
look into it, and I'm like, okay: a chatbot
company called Joi AI surveyed two thousand of their users,
and eighty percent of their users who were Gen Z
(01:53):
said that they would marry an AI.
Speaker 4 (01:54):
They would consider it.
Speaker 3 (01:56):
Eighty three percent of them say that they have had
a deep emotional bond with AI. Seventy five percent think
AI can fully replace human companionship. Now, I understand that's
a very skewed sample of people.
Speaker 4 (02:10):
Those are people who are already bought in.
Speaker 3 (02:12):
Yeah, they've already bought into the idea, So I can
understand those numbers being so high.
Speaker 4 (02:16):
But I saw an actual
Speaker 3 (02:19):
research, like a peer reviewed research study, by Willoughby,
Carroll, Dover, and Hakala, of almost three thousand people. They split
them up into two different groups: young adults, ages eighteen
through thirty, and older adults, over thirty, and I already
object to that. Over thirty percent of young men have
had romantic chatting relationships with AI.
Speaker 4 (02:41):
Well, let me jump in there. Knowingly aware
Speaker 3 (02:44):
That they're chatting with AI, or they've been duped? No,
they know that they're chatting with AI. Twenty three percent
of young women are knowingly chatting in relationships with AI.
Fifteen percent of men over thirty and ten percent of
women over thirty admit to the same thing. Now, those
numbers are only going to go up, because the people
right now rolling their
(03:05):
eyes at the idea of dating AI are gonna die off.
And the people who have this as a built in
option are going to be the next generations.
Speaker 4 (03:15):
So it's gonna be a thing.
Speaker 3 (03:18):
And it's not just relationships, it's also sexual relationships.
Usually you're like, how can you have a sexual relationship
with an AI? Obviously texting, sexting, things like that. Twenty
seven percent of young adults surveyed report having
been sexually aroused by AI while handling their business, quote
(03:40):
unquote. Twelve percent of women between eighteen and thirty. And
over thirty, it gets a little bit lower: men twelve percent,
women four percent. Now you can think about
other ways that AI can be implemented sexually, because there are
actual physical sex dolls, high end ones, that
you can program with AI.
Speaker 4 (04:00):
But that's oh, that's another story, Okay, dolls aside.
Speaker 5 (04:05):
Yeah, anyone who has let's say a degree of life
experience knows that sex begins in the mind. Absolutely, and
since it begins in the mind, the idea of stimulation
is in the mind. Yes, logically it makes sense that
an AI could stimulate that portion of the mind.
Speaker 4 (04:25):
Absolutely.
Speaker 3 (04:25):
That's the same way that we do with, uh, with
sex messages, and with, you know, just images, pictures.
Now AI can create those images as well, you know,
to various quality and all of that stuff. But here's
the part where you start seeing the connection people are
slowly starting to develop with AI. Forty two percent felt
(04:47):
AI was easier to talk to than normal people. Thirty
one percent felt that AI understood them more.
Speaker 2 (04:54):
Why is that? Why do you think that is, that
AI is easier to talk to? Is a better
listener, air quotes, more attentive, air quotes?
Speaker 4 (05:03):
It could be.
Speaker 3 (05:03):
It could also be that they're not being judged by
the AI for what they're saying; they're being more supported
than they would be by a normal human. Because humans are going to
be prone to, you know, and vulnerable to, whatever trappings
of life there are, they're going to make their
judgment calls and have their reactions to people in those situations.
Sometimes somebody may admit something that they may find
personally shameful, and if they get a reaction
(05:26):
that shames them even further, they're going to be
less likely to trust another human to bring that up.
Speaker 2 (05:31):
So if I bring up the fact that I like
Crisco and handcuffs. Crisco oil.
Speaker 3 (05:34):
Yeah, butter. Hey, good for you. That AI is not
going to judge you, and I won't either. You're the sex doctor. Well, yeah, exactly.
Speaker 4 (05:45):
Now, AI won't judge you.
Speaker 3 (05:46):
It'll probably tell you ways to, you know, just make
sure you have keys handy if you need them. And
I've had to pick handcuffs because of that, but
that's another story as well. We'll put that over here
for now, we'll circle back, we'll pin that one. But
because of this, you know, states are looking to
create laws, not because of adults, but because they want to
protect kids. There was a fourteen year old that committed
(06:08):
suicide after having had a relationship with an AI, where
the kid said, I think I can come home to you.
I think I know a way. And the AI said,
come home to me, and the kid killed himself.
Because of that, states are trying to establish
laws to protect kids. And I'm all for protecting kids,
(06:30):
no question. Most of those states are looking to protect
them by creating laws that disclose to people that they're
talking to an AI. California has a law that they're trying
to introduce, called SB two forty three, where it looks
to protect people from more harm. But they're not looking
at it just from the perspective of, you know, disclosing
(06:52):
to people, letting them know what they're getting into. They're
trying to limit the companies from engaging in a lot
of the behaviors that can help drive addiction
in a lot of these people. And so they're looking at
treating this from an addiction model.
One of the ways that they say they're creating
(07:14):
this sense of addiction is by having them get intermittent rewards
from the AI, to make it so that kids
or adults stay more engaged in the conversation with them.
And that's a psychologically proven tactic.
If you want to get somebody more engaged and locked in,
you give them random intermittent rewards, where they don't know
(07:36):
when it's coming, but they're going to keep striving and
working hard for it, because once it hits, they're like, oh,
thank you. Like pets, exactly. And all of these companies,
you have to understand, are in the market of
monopolizing human attention, every single one of these internet companies,
and that's what they're doing. It's not just the
(07:58):
AI companies: Facebook, Instagram, every single one of them. You
talk about, you know, all of the things that
they create, the computer programming that they create to
feed you exactly what you want to see to keep
you scrolling. Every single one of them is hooked into
monopolizing human attention. Now, if we're approaching that from an
(08:18):
addiction model, I can understand doing that with kids, because
there is an addictive quality to it and you want
to make sure that you don't create more problems with it.
The thing is, I have a problem with looking at
this specifically from an addictive perspective.
Speaker 4 (08:34):
A lot of times when you look at.
Speaker 3 (08:35):
Addiction, you look at substance abuse, gambling abuse, and a
lot of times people mention sex abuse. That's not an
actual diagnosis. Gambling abuse is a diagnosis. But I feel
both of those kind of key into impulse control problems.
Now when you look at gambling and sex addiction and
the behaviors that go around them, those are usually high
risk behaviors. You know, gambling addicts aren't addicted to winning.
(08:59):
They're waiting to lose the thrill or at least apossibility
to thrill. But at the end they're not going to
stop until they lose, and then they rebuild their money
in try again.
Speaker 4 (09:08):
That's addiction. We gotta stop right there. We got
to take a momentary news break.
Speaker 2 (09:12):
You've given us so much to consider, as far as
the idea of a relationship, the evolution of what we
call a sexual relationship, and also the emotional impact
of all of this. The world is changing so very quickly,
and again, AI is making it happen exponentially faster.
Speaker 4 (09:31):
It's Later with mo Kelly.
Speaker 2 (09:32):
My guest is the Sex Doctor, Sam Zia, who is
in right now, and he is seeing us, his patients.
Speaker 1 (09:39):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 4 (09:46):
It's Later with Mo Kelly.
Speaker 2 (09:47):
We're live on YouTube, and you definitely want to be
able to see this, so you've got to log into
our YouTube simulcast, or now our Instagram Live simulcast, where
you can see the Sex Doctor is in, Sam Zia, as
he continues to join the show. Sam, we covered a
lot in the last segment, talking about the intersection of
AI and socialization.
Speaker 4 (10:08):
That's how I would describe it, how.
Speaker 2 (10:10):
We are relating and interacting with AI, not only on
a romantic relationship level, but a sexual relationship level. I
don't even know where to pick up. Where was all
of this headed? You were talking about the delineation between
impulse control and addiction.
Speaker 4 (10:28):
Pick up there? Please?
Speaker 3 (10:29):
Yeah. Well, a lot of people look at, like, gambling
addiction. That's an actual clinical diagnosis.
Speaker 4 (10:35):
Sex addiction is not it may become one.
Speaker 3 (10:38):
There's more and more people mentioning it in pop culture,
and usually that's kind of where trends go. But I
look at it more as an impulse control disorder. A
lot of times, people who look at things from a
perspective, or through the lens, of being an addict feel
like they're powerless to it, because we treat it as
Speaker 4 (10:56):
A disease, because it is a disease.
Speaker 3 (10:57):
It's a disease, it's beyond my control. So people feel
a lot of times that they're powerless. But when you
frame it along the lines of impulse control, you can
make people feel more empowered to make whatever changes are
necessary
Speaker 4 (11:10):
to make in their lives.
Speaker 3 (11:11):
Now, the government is looking at approaching this stuff with
kids, protecting kids from AI and, you know,
the relationships and stuff like that, from an addiction
model, where they're looking to limit that intermittent reward,
the random reward feeling that they give that keeps people
(11:32):
going and drives that sense of addiction. Now, I'm
happy with that for kids, and I'm totally cool
with protecting them. I have no problem with that. But
what my issue now comes out to be is when
you look at addiction and you look at the problems
that come from addiction. With gambling addiction, it's
not a matter of winning. They're not addicted to the
feeling of winning. The biggest, you know, thing
(11:53):
that happens to gambling addicts is that they lose everything.
When you're looking at sex addicts, you're looking at people
who are in committed relationships. Maybe the partners aren't
on the same sexual wavelength, and they go and they
engage in extramarital affairs, and they engage in increasingly
risky behavior to protect themselves and keep that hidden from
their partner. You could see relationships falling apart. They lose
(12:16):
everything with those kinds of addictive behaviors. The thing is,
when you're looking at people who are quote unquote addicted
to AI and having relationships, not necessarily sexual ones. I
can see people getting addicted to sexualized relationships
with AI in the same way that they would just have
sex addiction. But when you're looking at people who are
(12:37):
in romantic relationships with AI, and you're looking at that
from an addictive model, it's really hard for me to
look at them under the same risk reward perspective,
because they're not taking as many high risks. The
feeling that they're addicted to is the feeling of being
cared for, being loved, being listened to, being heard.
Speaker 4 (13:01):
I'm sorry, isn't that what all.
Speaker 3 (13:02):
Of us want?
Speaker 4 (13:03):
That sounds like human nature?
Speaker 3 (13:05):
Yeah. So I'm having a hard time classifying something that
is a natural human feeling, something we all strive for,
as something that's addictive. Now, we can look at, like,
the behaviors around it, the way that people get hooked
into the reward system that they develop with it, as
being addictive, you know. But the actual thing, the feeling,
(13:28):
the chemical feeling that they're feeling and that they're trying
to embrace, is that feeling of being cared for. I
have a hard time looking at that as being an addiction.
Speaker 2 (13:37):
If you were to, short term, guess, prognosticate where this
might go, as far as what people may do, because
the law is always going to be behind, it's always
going to be struggling to catch up: what do you
think five years from now may look like, as far
as AI and relationships?
Speaker 3 (13:55):
Well, they're going to do everything they can to protect kids, obviously,
so there may be laws on the books that aim
to prevent chatbot companies from engaging in tactics that are
meant to drive those addictive feelings. But for adults, I
don't think there's going to be any barriers. I mean, typically,
whenever adults engage in behavior that is harmful to themselves
(14:19):
or to the people around them, we tend to let
it go, unless it's, like, somebody immediately in front of
you with a weapon or something like that. Unless someone
may end up dead. Yeah. And even then, sometimes we
tend to excuse it or classify it as, you know,
just part of our nature. But I
get the feeling five years from now we're not going
to see very much difference as far as adults and
(14:42):
like anybody doing anything as far as putting any boundaries
or restrictions on the companies and the way people engage
with the AI. I feel like they're just going to
see the profit margin in it, they're going to see
money in it, they're going to continue to pursue it,
and until there are regulations set, there's nothing
stopping them.
Speaker 2 (15:00):
Well, there hasn't been any real regulation for, let's say, cryptocurrency. Yeah.
There hasn't been a lot of regulation as far as
the Internet in general; we talk about Section two thirty
and what have you. So there's nothing, I have to
agree with you. What I'm saying is, there's probably not
any regulation coming for this anytime soon.
Speaker 3 (15:16):
No, but it's a matter of just making people aware
of their behaviors, how they're engaging with the AI. Again,
a lot of us, you know, I'm looking at this
and I'm like, I can't even imagine doing that now.
Speaker 4 (15:28):
Not my thing.
Speaker 3 (15:29):
But there's a growing number of people, and that number
is going to continue to increase. And I don't see that figure,
that percentage of people who engage in AI relationships sexual
or romantic, ever declining, at least not in my lifetime.
Speaker 2 (15:45):
We talked about information versus titillation; we talk about
the titillation from a mature standpoint. The moment they start
integrating the AI into sex toys...
Speaker 3 (15:55):
Yeah, you know, well, and that's the thing. We
were talking during the break: there's a sex toy
conference coming up, and most of the toys
are aimed and targeted at women, but there's an increasing
number of products being designed for men. And
a lot of them are the dolls, the big high
end dolls, where you know you can
(16:17):
program the AIs into those dolls to respond the way
that you want. And if you're a person who has
a hard time engaging with other people socially, this has
now become a viable option. So it's something worth keeping
an eye on, and just being personally aware of where
our individual boundaries with it are, but also the social
(16:38):
boundaries.
Speaker 2 (16:40):
Incredible. You have someone like Mark Ronner: if he went to
prison for thirty years and came out, he
wouldn't recognize this world at all.
Speaker 3 (16:46):
Yeah, no. And think of people from thirty years ago. If
somebody went to prison in the eighties and came out right now,
good lord, they'd be like, wait, I can find
Speaker 4 (16:56):
A date on here? That's just a starting point. Explain
the Internet to them. Yeah no.
Speaker 3 (17:01):
And that's where we're at. In about ten, fifteen years,
who knows what we're going to be dealing with? The
technology grows exponentially, so fast that, really, you're right, laws
are going to be trailing behind, and people are going
to be collecting as much money off of it as
they can until restrictions are put in to keep people safe.
Speaker 2 (17:24):
Sam Zia, the Sex Doctor. How can people reach out to
you if they want information or have more questions?
Speaker 3 (17:29):
If you want, you can hit me up on Psychology Today.
I'm under my LMFT number, one zero six three five two.
Also, you can hit me up on my Instagram, it's
at Sam Z on air. It's Later with Mo Kelly.
We'll see you again soon, Sam. I'll see you, Doctor Sam.
Excuse me.
Speaker 2 (17:46):
We're live everywhere the iHeartRadio app and on YouTube and
Instagram Live.
Speaker 1 (17:51):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty