Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Gary and Shannon and you're listening to KFI
AM six forty, the Gary and Shannon Show on demand
on the iHeartRadio app.
Speaker 2 (00:07):
All right, well, welcome back to another weekend fix. Here
we go. So I'm so enthusiastic because I'm very... It's
the weekend. It's the freaking weekend, baby, and I'm about
to have me some fun.
Speaker 3 (00:17):
I love a twenty ten reference.
Speaker 2 (00:19):
Even if it is from R. Kelly.
Speaker 1 (00:21):
Oh yeah, that's more like nineteen ninety six, huh? Ignition?
What year was Ignition?
Speaker 2 (00:27):
Yes?
Speaker 3 (00:28):
Yeah, it's the remix to Ignition. Idiots.
Speaker 2 (00:31):
I'm fresh out the kitchen. Here we go. All right,
so these are some of the stories, you know,
we couldn't fit in. We only had twenty hours
to get through the rest of the week.
Speaker 3 (00:40):
Ninety eight? Maybe ninety eight.
Speaker 2 (00:43):
I would say it was even earlier than that, Jacob.
Speaker 3 (00:44):
Will you google?
Speaker 1 (00:45):
That?
Speaker 3 (00:46):
It drives me crazy.
Speaker 1 (00:47):
I'm a grandmother and I need to know actors when
I see them on the screen.
Speaker 3 (00:51):
I need to know what else they were in immediately.
Oh, what year everything came out.
Speaker 2 (00:57):
Well, it's funny that you mentioned the grandma thing. There
was an article specifically that makes you feel old, and
it was about.
Speaker 3 (01:04):
Which one are you gonna write down?
Speaker 2 (01:06):
All right. This one is called Your... Your AI...
Speaker 3 (01:10):
Two thousand and two for Ignition.
Speaker 2 (01:12):
It was later than I would have thought.
Speaker 1 (01:15):
Like, it's been in my life for much longer than
twenty three years.
Speaker 2 (01:19):
My son is older than that song.
Speaker 3 (01:21):
That's terrifying. You're an old man.
Speaker 2 (01:24):
That is also true.
Speaker 3 (01:25):
Go on to the article.
Speaker 2 (01:27):
It's called Your AI Lover Will Change You, and it's
by a guy in the New Yorker magazine, a guy
named Jason Lanier, sorry, Jaron Lanier, who calls himself a technologist.
And before we get into what the article says,
let's discuss what a technologist is. Okay, because I've got
a question for you about... what? Sex. Yes. Okay, well
(01:51):
I am here for it. But let's discuss the technologist
aspect of this first. I had to look up what
the term meant. I could kind of guess what I
think he is. But a technologist is somebody who theorizes
about the philosophy of humanity impacted by technology, okay, and technology
(02:14):
that's impacted by humanity. But obviously that's sort of... that
is more of a one-way street, since technology
doesn't exist without humans. These are the things that we create.
Speaker 1 (02:23):
It sounds fascinating, except how do you get to
any other conclusion than technology screws us all up?
Speaker 2 (02:33):
Well, he seems to think that there are benefits like.
Speaker 1 (02:36):
Medicine benefits probably sure, yeah.
Speaker 2 (02:39):
Benefits, transportation. I mean, it's easy to see that our lives are a
whole lot easier now than they were one hundred years
ago, than they were fifty years ago.
Speaker 1 (02:47):
Technology makes it easier to get, you know, nudes from
that strange girl that you met on the subway.
Speaker 2 (02:54):
Right, that is one thing that makes sense. But...
Speaker 1 (02:57):
It also makes it harder for you then to go
home to your wife with those nudes.
Speaker 2 (03:02):
Right. It doesn't. It doesn't fix all of humanity's problems.
Speaker 3 (03:06):
Right anyway, Sorry to digress, but he.
Speaker 2 (03:09):
Has this interesting discussion in this long article about the
biological human need for relationship. It's not just a...
it's not just a social need. There are biological benefits to us
living in community with other people, with other individuals,
with family members, with spouses, with loved ones, whatever
(03:32):
it is, there's a benefit to it. It's been
documented over and over and over again. But his specific
tack here is for those people who don't do that well,
and he kind of dips into the whole, you know,
people-on-the-autism-spectrum thing, or people who just are
(03:54):
not good at relationships. If you were to give them
a surrogate for that, would it benefit humanity in the
long run?
Speaker 1 (04:04):
Well, we talked about the eighty twenty rule that has
entered the zeitgeist. It's very interesting to hear it pop
up in different conversations. Eighty twenty and in particular, one
of the ways it's popped up recently is that eighty
percent of women are only interested in twenty percent of men.
Speaker 2 (04:25):
The kid in that Netflix show Adolescence referenced.
Speaker 1 (04:29):
That, right. And that's not the first time I've heard that.
And is it true? Let's operate under the principle that
it is true that eighty percent of the women are
only interested in twenty percent of the men. Then that
leaves eighty percent of the men, who feel left out,
whether it's incels or just people who, like you said,
don't do well in terms of communicating. This is kind
(04:49):
of tailor-made for them.
Speaker 2 (04:52):
Yeah, if you believe that's true.
Speaker 1 (04:54):
But that goes back to my initial question of sex.
Who wants... what man wants to spend hours texting with
an AI chatbot who even presents as, like, this
hot girl?
Speaker 2 (05:09):
If you're not... if there's no conclusion of having sex, there's
no culmination, or...
Speaker 3 (05:13):
There's a promise of sex at some.
Speaker 2 (05:15):
There's no slap and tickle at the end of it.
Speaker 1 (05:17):
You're ruining the romance that I'm trying to provide via
the AI chatbot.
Speaker 2 (05:22):
Well, again, this guy Jaron Lanier writes in this article
that you can basically come to the conclusion that love
is real. Our consciousness is real, love is real, and therefore...
Speaker 1 (05:34):
Love is so different though. It's like you can love
somebody and not want to have sex with them, but
you need the relationship where it's the love and the sex.
Speaker 2 (05:44):
That romantic relationship usually ends in that, or some aspect
of it, some kind of intimacy like that. But he
describes love as a target to be conquered. Now, when
you're a teenager... when's the first time you
think you had a crush on a boy? Me? Well, eleven,
twelve, thirteen, somewhere. Much earlier?
Speaker 1 (06:05):
I recently found old diaries in my mom's house, and
I had a crush on somebody whose name I can't
remember when I was about six years old, and I
mean I filled up pages writing his name over and over.
Speaker 3 (06:18):
What a little psycho, What a little fucking psycho.
Speaker 2 (06:21):
I would say that's... was I inaccurate? I
was afraid to make eye contact while you were reliving that.
But those... those earliest times when we think of attraction,
at six, I'm not going to say it was a
physical attraction, although you were attracted to how he looked.
Speaker 3 (06:41):
Probably it was Dave L. He's not listening, I trust? No,
that's just
Speaker 1 (06:46):
How you refer to everyone in the first grade. You
know their first name and then their last initial.
Speaker 2 (06:51):
Because that's the way the teacher did it.
Speaker 1 (06:52):
Sure, because there was like three Daves, you know, Dave L,
Dave R, Dave S. Anyway, Dave L probably... to me
at one point, and the infatuation went out the window.
That's the thing that makes the whole AI thing I
think attractive: it's easy to be infatuated with
someone who's not really someone. It's easy not to have
a real-life relationship. It's easy to watch porn and
(07:16):
do your thing and not have to deal with: what
does she want for dinner? Is she gonna want to
talk? And I want to watch this show. Why does
she want to watch this show?
Speaker 3 (07:26):
You know what I mean?
Speaker 1 (07:26):
Like, there's so much ease that goes into an AI relationship.
If you just want that back-and-forth banter, and
then you're going to go, and she'll talk dirty to
you, whatever, and you can take care of your... Of
course, that's an easy thing that's going to sweep
the planet.
Speaker 2 (07:42):
Yeah, but you said this in the context of...
what do you call it, the We Two movement? Me
Too movement? Sorry. We Two? Me Too? What a man
thing to say, that I've forgotten it already too.
You've said this in the context of the Me Too
movement, that there were some allegations made that
did not rise to the level of being so offensive
(08:04):
that that person, the man that did it, needed to
be canceled or fired or whatever. There's a certain aspect
of it that is, at the base level: sometimes you're
gonna have to learn how to deal with dickheads. You're
gonna have to learn to deal with those guys that
are gonna whip it out and put it in your
face, or they are going to push you around, not physically
but emotionally. And you have to find a
(08:26):
way to deal with them that doesn't excuse
their behavior, but empowers you to get
through situations where otherwise you would crumble and fall. And
I think one of the worst aspects of an AI
chatbot replacing a relationship, a real relationship, is they don't
push back. The way that AI chatbots are designed is
(08:48):
to continuously learn from you, learn what makes you happy,
and then continue to do those things.
Speaker 3 (08:56):
There are no... they calibrate to what makes you happy. Totally.
Speaker 1 (09:01):
If the AI bot says something that you're
not down with, or you don't respond in kind, then
they're going to recalibrate and give you what you want
to hear. How dangerous is that? I mean, it sounds lovely,
but also a little pushback is good.
Speaker 2 (09:16):
Well, again, it's the way that we grow. We human
beings grow, whether it's physically, like muscles, or emotionally or
whatever it is: you have to go through something difficult. Yeah,
think about when you work out: you tear down
your muscle fibers so that when you eat and relax,
those muscle fibers build back bigger.
Speaker 1 (09:37):
Right.
Speaker 2 (09:38):
Jacob's... thumbs up from Jacob. That's how it works. And
then emotionally, if you don't have experience in something, name
it, grief or something like that, you have to
go through it in order to learn how to deal
with it.
Speaker 3 (09:53):
Yeah, that sucks, doesn't it?
Speaker 2 (09:54):
And with these AI chatbots, if all they're going to
do is appease you, you don't know how to deal
with frustration with another person.
Speaker 3 (10:01):
Right, it's easy. It's easy.
Speaker 1 (10:04):
And we as humans go for the easy route
all the time. Sometimes it's the hardest route, right,
if you go for the ease all the time.
Speaker 2 (10:12):
Well, how popular are cold plunges right now as a wellness thing?
It's not because it... I... listen, I've done it. What
a freaking racket. Totally.
Speaker 1 (10:24):
Just run cold water in your shower, get into it
for five minutes, instead of... you pay for a
cold shower for twenty bucks an hour, or whatever they charge
at those places. The whole cold plunge, red light therapy,
give me a freaking break.
Speaker 3 (10:38):
What a racket.
Speaker 2 (10:39):
But you're doing something.
Speaker 3 (10:40):
How about... you know what it's like? Collagen. It's like, it's
advertised all the time to people my age, right? Like.
Speaker 2 (10:47):
What's your age? Old. Okay. So, like.
Speaker 1 (10:52):
It's like... that's not... I love the idea of fighting
these things. I love the idea of collagen or cold
plunges or whatever. Look, everyone's gonna get old,
our skin's all going to sag. The cold plunge is
a fad, just like every other fad. I don't know,
maybe I'm coming from a place of negativity.
Speaker 3 (11:14):
But like it's all bullshit. Well, it's all just to take.
Speaker 1 (11:18):
Your money and feed off your insecurity at that particular moment.
Speaker 3 (11:21):
And oh, you're such a badass because you took the
cold plunge. You're like a polar bear. Like, go take
a shower and for free.
Speaker 2 (11:28):
And the difference is those people who tell you I
do a four minute cold plunge every morning, right, versus
the people who do a four minute cold plunge every
morning and don't advertise it. Yeah, they're not going to
talk to you about it. Maybe they do get a
benefit from it, because for whatever reason, it's a difficult thing,
first thing in the morning or whenever you do it.
It's a difficult thing to sit there, and it's a
(11:48):
mind exercise to get yourself to do something.
Speaker 3 (11:51):
Doesn't a little bit of that go away when you
start advertising it?
Speaker 2 (11:54):
That's what I mean. Like, the difficulty of it, the
personal turmoil, pain, bullshit, you said, that you have to
go through to get to that point, whatever that is,
diminishes when you tell other people that you're doing it.
Speaker 1 (12:11):
I mean, part of it is, like I get it,
because if I do something that makes me feel amazing,
I'm going to tell you about it. I want to
share because I want you to maybe try it out.
Maybe it'll make you feel good too. But there's a
difference between sharing information and you know, posting about it.
I guess, although I don't feel like people are doing
that as much anymore. This guy again... maybe I'm just
(12:32):
not spending a lot of time on the...
Speaker 2 (12:34):
Talking about this article from The New Yorker called Your AI
Lover Will Change You, by Jaron Lanier. He goes to these artificial intelligence gatherings,
and he says that one of the things that he
hears over and over again is, all teenage girls are
going to fall in love with these bots, with these
AIs. And I was going to ask you...
Speaker 1 (12:54):
I would go further, not just teenage girls. Because the thing about
men... Men are wonderful, tell me more.
Speaker 3 (13:01):
I love men.
Speaker 1 (13:03):
But the communication is not always on point. Women are
more communicative, and... could you imagine a man who
brings all this stuff that men bring, but could also
be intuitive when it comes to telling you what you
want to hear and communicating with you? Oh, end
(13:26):
of game, right there. You know, I put it... I
phrased this more so for men earlier on, saying it's
so easy to talk to a woman and get all
the affirmations, and then, you know, she sends you hot
AI pics and whatever. And women just the same,
women just the same. I mean, for the communication alone, right?
(13:48):
I mean, a guy... a chatbot who tells you, like, those genuine
compliments, right? Like, not just your jeans look great.
Speaker 3 (13:57):
Well, although that would be lovely.
Speaker 1 (14:00):
That's the other thing with the Me Too thing: compliments,
you don't find them anymore. But, you know, like,
could you imagine a specific, thoughtful compliment, so to speak?
Speaker 3 (14:14):
Men?
Speaker 1 (14:15):
It's not a knock on guys. They're just not
built to communicate that way. That's why women give each
other these compliments, right? Because we are built that way,
to communicate specifically in the compliment realm. Specific compliments, like,
if I see someone, I'm like, wow, I
love those pants. That color looks amazing. Your eyes look
really pretty with that hat. Or whatever. Like, we're just
(14:36):
programmed to be specific like.
Speaker 2 (14:38):
That. But isn't... because AI would fill that void,
would give you that kind of language back, or give
you sort of... or even care
Speaker 3 (14:47):
To hear what you have to say.
Speaker 2 (14:49):
Trust me, it is.
Speaker 3 (14:50):
My husband tries really hard. He'll be like, how was
your day?
Speaker 2 (14:54):
Does he stifle every time?
Speaker 1 (14:55):
No, but he makes the effort, even though that probably runs
counterproductive to what he cares to hear. Rather...
Speaker 3 (15:02):
It's better, though, to make the effort, right?
Speaker 2 (15:04):
He would rather you come home, or he comes home,
a hug and a kiss, and just sit quietly for
a little bit in the same room, right? That... yeah.
Speaker 3 (15:12):
Would the ideal be to be in the same room, close?
Speaker 2 (15:15):
Absolutely, that would recharge his battery rather than you just.
Speaker 1 (15:19):
Like... but imagine, imagine I come home to my AI
bot and you can just unload, and the AI bot's
like, how was your day? And I'm like, Gary was
a complete asshole today, and the AI bot would
be like, I know.
Speaker 3 (15:34):
So bad, remember when he said this?
Speaker 2 (15:37):
And I'd be like, exactly, and it plays back the
actual audio. But what it does is, and I
think that this is one of the... it can cause
the biggest problems in a relationship, but it can also
be the best part of the relationship: you
and whoever you're in a relationship with are different people,
(15:59):
and you have to be able to learn, you have
to allow yourself to learn, what makes that person tick. Yeah,
what you can give them to make that happen. Where
your limitations are when that happens.
Speaker 3 (16:11):
But it's all changing. It's an ever-changing process, constantly.
Speaker 2 (16:14):
If you're in a long-term relationship, then it's
always going to be different, or, you know, it may
come back to the original plan, the way that you
were when you first met the person. But there's a
friction there that makes you grow. You go through those
difficult things together in the relationship, and the relationship grows stronger, hopefully, as
(16:35):
a result of it.
Speaker 1 (16:36):
Or you kill each other and end up on Dateline
with Keith Morrison narrating your life story.
Speaker 2 (16:40):
But it happens. That's a great ending. All that happens
less often, I think. Then I guess there's three versions
of it. One is you end up killing that person.
One is you just fall out of love and you
just get bored together. That's really the worst, or whatever.
Speaker 3 (16:55):
I would rather my husband kill me than become indifferent.
Speaker 2 (16:58):
Yeah, we're going to mark that and we're going to
send that off.
Speaker 3 (17:01):
Keep that in our... I mean, I'd know that he
really loved me if he killed me.
Speaker 2 (17:05):
Oh really?
Speaker 3 (17:06):
Isn't that how it starts?
Speaker 2 (17:07):
But then the friction ends up, when it turns to sex,
when it turns to something physical... Anybody who's worked on
these or lived with this AI chatbot for their relationship,
they don't have... there's no culmination. And I'm
not saying that sex is the ultimate end for
a relationship, but it's an important thing that makes us human.
(17:30):
So if you don't have that at the end
Speaker 1 (17:32):
Of it, it's just... yeah, and all you know is
masturbating to porn, then you're gonna be fine with that
your whole life?
Speaker 2 (17:38):
I don't think so. I think there's a... I think
there's an internal, ingrained desire, probably, to be with another person.
Speaker 3 (17:47):
I think so. But is it worth... or, how about,
I hope there is.
Speaker 2 (17:52):
I hope that that's not enough for people.
Speaker 3 (17:54):
Well, it is what it is.
Speaker 1 (17:56):
Either you're going to fall in love with your AI lover...
But your AI lover won't kill you.
Speaker 3 (18:00):
Guys, it will never kill you. You'll never know it
truly cares.
Speaker 2 (18:06):
This whole article is pretty funny.
Speaker 3 (18:10):
Is this all written by the technologist?
Speaker 2 (18:12):
Yeah, and again, he's more in favor... Sex? Do
you think he's getting... Well, that's really the question, because
he's in favor of allowing especially teenagers to use
AI chatbots to kind of get through some of the
early stuff, awkward stuff, to do it, you know,
in the privacy of their own home, as opposed to hurting
(18:33):
somebody else, whether it's emotionally.
Speaker 1 (18:35):
Or physically. Like training wheels on your... I think it's
awful. Communication? It's awful.
Speaker 3 (18:40):
Yeah, but this is the way they're.
Speaker 1 (18:41):
Doing things anyway. They're not actually talking to each other.
The first year of a relationship is via text, and
so if they could learn how to text better with
their AI sex chatbot...
Speaker 3 (18:54):
Maybe it's... I don't know.
Speaker 2 (18:55):
Know, you send nudes to your AI chat they have children.
It tells you to face these questions, get a little closer.
Don't take the don't take it in the bathroom.
Speaker 3 (19:03):
Don't take it in the... I knew that's where this was
going to end. But take it in...
Speaker 2 (19:08):
The bathroom. Like, don't take the picture in the bathroom.
What's wrong with you?
Speaker 3 (19:13):
Hey, I don't want to see you on the weekends anymore.
Speaker 2 (19:16):
We post our weekend fix every Saturday. Make sure that
you share the podcast wherever you are listening to it,
whether it's the iHeartRadio app or anywhere else. Leave us
a rating. Make sure you leave us a review.
Speaker 3 (19:30):
A review? You're asking for ratings and reviews? How terrifying.
Speaker 2 (19:34):
Well, it's not like you're going to read them. No,
I will not. Rating, review, share it, subscribe to it.
Do all the things that the kids tell you to do,
and the stuff...
Speaker 3 (19:43):
Do weird stuff to it, you know what I mean. Yeah,
don't shut the door on love.
Speaker 2 (19:47):
We'll see you in the middle of the week at
some point. You've been listening to The Gary and Shannon Show.
You can always hear us live on KFI AM six
forty, nine am to one pm every Monday through Friday,
and anytime on demand on the iHeartRadio app.