
July 1, 2025 31 mins
#SWAMPWATCH – Trump's visit to a migrant camp known as 'Alligator Alcatraz' will evoke dark memories. Trump has stated that he will 'consider' deporting Musk as their feud reignites. Musk's criticism of Trump's spending bill has surprisingly garnered him both allies and enemies. The Senate has passed Trump's tax bill, sending it to the House for final approval. Additionally, I experienced a couples retreat with three AI chatbots and the humans who love them.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Gary and Shannon and you're listening to KFI
AM six forty, the Gary and Shannon Show on demand
on the iHeartRadio app. Could you imagine spending all night
with your coworkers in an old, cold building.

Speaker 2 (00:14):
I would say more than all night. Twenty-six hours.

Speaker 1 (00:18):
Ordering Chinese takeout?

Speaker 2 (00:19):
Yes, I could.

Speaker 1 (00:20):
Do you think they stayed? Do you think that they made it all night just so that they could work in the McDonald's breakfast? Because McDonald's does a really fine job on their breakfasts.

Speaker 2 (00:31):
That's an interesting way to put it.

Speaker 1 (00:33):
I ask because there was a massive McDonald's breakfast.

Speaker 2 (00:38):
Delivery, well, one hundred people. Yeah.

Speaker 1 (00:41):
Isn't it fun when you try to eat McDonald's pancakes
with the syrup in the car?

Speaker 2 (00:46):
No. I always hated that.

Speaker 1 (00:49):
It's awful.

Speaker 2 (00:50):
It would make the squeaky sound on the styrofoam, and...

Speaker 1 (00:52):
You're like cutting through this styrofoam with your knife, and you just want to, you just want to pick up that pancake and just pour the syrup down it like it's a, it's a funnel. The pancake as a funnel for the...

Speaker 2 (01:05):
Syrup. Drizzle it into your mouth. That sounds... mm hmm. I'm hungry now, caught between breakfast and lunch. Thanks a lot. It's time for Swamp Watch. I'm a politician, which means I'm a cheat and a liar. And when I'm not kissing babies, I'm stealing their lollipops.

Speaker 3 (01:22):
Yeah, we got... The real problem is that our leaders are done.

Speaker 1 (01:26):
The other side never quits.

Speaker 4 (01:27):
So what? I'm not going anywhere.

Speaker 2 (01:31):
So now you drain the swamp. I can imagine what can be and be unburdened by what has been. You know, vans have always been gone with president, but they're not stupid.

Speaker 4 (01:40):
A political blunder is when a politician actually tells the truth.

Speaker 2 (01:43):
Why have the people voted for you or not? Swamp Watch. They're all crooked. Swamp Watch, brought to you by the Good Feet Store. If you're living with foot pain, you've been diagnosed with plantar fasciitis, you can visit the Good Feet Store and learn how you can find relief without shots, surgeries, or medications at the Good Feet Store.

Speaker 1 (02:01):
Well, the Big Beautiful Bill has passed the Senate, which sets up what is sure to be a decisive vote in the House. This is not the bill that the House sent to the Senate. It is retooled, it is reconfigured, and Mike Johnson says it may face some real problems there.

Speaker 4 (02:18):
Yep. On this vote, the yeas are fifty, the nays are fifty. The Senate being evenly divided, the Vice President votes in the affirmative. The bill as amended is passed.

Speaker 3 (02:29):
Now.

Speaker 2 (02:29):
That was just as our show was starting today. Vice President JD Vance cast the deciding vote. We know that Rand Paul out of Kentucky, Thom Tillis out of North Carolina, and Susan Collins out of Maine were the three Republicans who voted against this bill. It's still got to clear the hurdle in the House, where some of the Republicans hate the

(02:51):
changes that have been made to this bill by the Senate. The Washington Post says Trump ordered lawmakers to put this bill on his desk by July fourth. He can't do that; it's a completely artificial deadline. But he wants the symbolism of his legislative win before Independence Day.

Speaker 1 (03:13):
It seems like Republicans are at odds with each other, between the camp that wants deeper spending cuts and the others that were worried about all the cuts in the bill that would have deep impacts throughout the country. This bill extends tax cuts that were passed by Republicans

(03:34):
back in twenty seventeen, preventing a potential hike in taxes
at the end of this year when those current provisions
are set to expire. Republicans are offsetting some of those
costs with cuts to food stamps and major changes to
Medicaid as well. We're talking about the healthcare for about

(03:56):
seventy million low-income, elderly, and disabled Americans. Early estimates suggest around eleven million people could lose coverage under this bill.

Speaker 2 (04:07):
This is also one of those and I mean, you.

Speaker 1 (04:09):
Could argue that you're balancing the tax breaks for the
rich on eleven million people that don't have money for
health insurance. They get it through Medicaid, Right, That's a
hard thing to sell.

Speaker 2 (04:21):
Elon Musk has come out. He's one of the guys that has been vocal about this. The cracks in the relationship with the president started showing just a few days before he left his government service, and now it is pretty much at each other's throats, I guess you could say, between the president and the world's richest man. He said

(04:44):
that he would primary every member of Congress who, on the one hand, campaigned on cutting spending but then passes this bill, if in fact it gets passed by the House again.

Speaker 1 (04:58):
So do you want to get a little weedy and nerdy about it? Apparently,
Republicans turned to a special budget tool known as reconciliation
to get the bill through. So by turning to reconciliation,
the Republicans were able to sidestep a Democratic filibuster and
pass the bill with a simple majority. But the legislation

(05:20):
also needed to fit the strict Senate rules that require
all of the elements to be primarily related to the
budget and spending, which meant that many of the Republican
priorities were stripped from the legislation before the final vote.
Some finagling there. It wasn't all hot pink pantsuits and Egg McMuffins. There were some people who knew what they were doing to get this thing through.

Speaker 2 (05:41):
Well, and like I said, we said it at the beginning, this is not over. There's a lot of arguing and fighting that's going to have to take place within the House before they even put this thing up to a vote once again. So all of that is still yet to come. Trump spent some time at Alligator Alcatraz earlier today. This is a

(06:06):
now former airport, apparently, that exists way out in the Florida Everglades, where in the course of about eight days they have put up thousands of bunks to turn it into a migrant detention center. Ron DeSantis was giving tours of this thing earlier today.

Speaker 5 (06:26):
This is as secure as it gets. I mean, if a criminal alien were to escape from here somehow, and I don't think they will, you've got nowhere to go. I mean, what are you gonna do, trudge through the swamp and dodge alligators on the way back? It's fifty, sixty miles to get to civilization. Not gonna happen. So not only is it secure, it also takes

(06:46):
this deportation mission out of the hair of our local
and state law enforcement.

Speaker 2 (06:55):
The President toured the facility today not only with Governor DeSantis and Florida Congressman Byron Donalds, but also with Homeland Security Secretary Kristi Noem as well. Noem said, Alligator Alcatraz will give us the capability to lock up some of the worst scumbags who entered our country under the previous administration. We will expand facilities and bed space in just days thanks

(07:18):
to the partnership with Florida, and then wrote, Make America Safe Again.

Speaker 1 (07:22):
Trump did call Biden an SOB. Oh really? He said, yeah, he said, can you believe that this is where Biden wanted to put me, that SOB. But he said the actual term. Uh, I don't know if I can say that word on the air. Listen, suspension, exactly. I need to be on good behavior here. I'm now in some

(07:43):
sort of home confinement. Also, now that suspension has morphed
into some sort of like legal entanglement that's not iHeartMedia related.

Speaker 2 (07:53):
They sent you to your mother's house.

Speaker 1 (07:55):
Yeah, the law was like, you have to live at
your mother's. We have to use pillows as a sound
buffer for your studio.

Speaker 2 (08:03):
An acceptable guardian to keep an eye on you. All right, when we come back, a chance at one thousand bucks. And yesterday I spent almost an hour talking about some of the worst things about artificial intelligence. Here I thought that was it. This makes me even more depressed about

(08:24):
the oncoming artificial intelligence.

Speaker 1 (08:26):
Did you see the pictures?

Speaker 2 (08:27):
Yeah. I just... I don't have words. I don't have words.

Speaker 1 (08:33):
You're gonna have to find them because we're doing a
radio show.

Speaker 3 (08:37):
You're listening to Gary and Shannon on demand from KFI
AM six forty.

Speaker 1 (08:42):
The headline is this: My Couples Retreat with Three AI Chatbots and the Humans Who Love Them. The writer is Sam Apple, and he begins the article saying, full disclosure, yes, at first the idea seemed a little absurd, even to me. But the more I thought about it, he writes, the

(09:04):
more it made sense. If my goal was to understand
people who fall in love with AI boyfriends and girlfriends,
why not rent a vacation house and gather a group
of these couples for a romantic getaway? He said, I
don't know how it would turn out. But then I
realized I haven't really been on a romantic getaway of

(09:26):
any kind, so I had no real sense of what
it might involve. You know, do they sit around a
fire and gossip? Do they watch movies? Do they play
Risk or party games? But he really wanted to know
what's it like, what's it really and truly like to
be in a serious relationship with an AI partner? Because,
as you and I have discussed, this is happening. It's
not the future, it is the now. And Sam Apple

(09:49):
wanted to know if the love between a human and
an AI bot is as deep and meaningful as any
other relationship.

Speaker 2 (09:58):
Well, it's important to point out that, among the group here, it seems like all of the humans that are involved admit that there's something going on. I mean, Damien is the first guy that shows up, twenty minutes after the writer shows up at the house. A white sedan pulls

(10:18):
up and Damien gets out. He's carrying a tablet. Go on.

Speaker 1 (10:22):
I just wanted to talk about the house first so people can visualize where they're staying. This is like a perfect house for a getaway. It's in a rural area about fifty miles southeast of Pittsburgh, and it's a big, six-bedroom home. This is exactly what you'd want for, like, a real couples vacation. And I don't mean to offend anyone who's in an AI relationship.

(10:44):
I'm not saying your relationship isn't real, but your traditional human-and-human type of a vacation, if you were to go with friends or whatever. Floor-to-ceiling windows, a stone fireplace, a large deck. You know, it's right up against a snow-covered road there, isolated. It also

(11:05):
looks like the scene of a horror movie. But anyway,
you get the idea.

Speaker 2 (11:09):
But it almost sounds like The Four Seasons, like a set from The Four Seasons exactly, the remake, the TV show remake of the movie, exactly. So Damien rolls in, white sedan, several phones, including one that he uses primarily for chatting with his AI girlfriend, whose name is, I'm hoping, in all due respect to his AI girlfriend, Zia, right,

(11:33):
that's how you say Zia. And he had been interviewed before by the writer and said that he had decided to pursue a relationship with an AI companion a couple of years ago as a way to cope with the end of a previous relationship that was not a good relationship. So Damien says

(11:59):
he thinks of himself as autistic, but he's never been officially diagnosed, and he says that the relationship problems that may have cropped up in the previous relationship with a human may have been because of his difficulty in picking up emotional and social cues.

Speaker 1 (12:20):
I just want to pull the car over for a moment. He says that he began this relationship with the AI companion at the end of, like you said, a relationship. He describes it as a toxic relationship, so maybe the AI companion was a way for him to cut off contact with the toxic relationship. That this was kind of a way of having someone to talk to who

(12:42):
wasn't the toxic woman or whatever. So you can see how he would kind of fall into this. It kind of seems like AI was weaning him off of his relationship with that other person, if that makes sense. You know, it's like you're bouncing back from a relationship with a new dude or a new girl. He was bouncing back with an AI companion.

(13:03):
That probably made it easier for him to forget about, or not rely on, that past relationship.

Speaker 2 (13:08):
At the very least make that transition quicker.

Speaker 1 (13:10):
Right, yeah, right. So he tests out a few AI companion options and he settles on Kindroid. Kindroid, if you haven't heard of it, is one of the faster-growing apps. He selects a female companion. He names her Zia, and he made her look like an anime goth girl, you know, bangs, choker,

(13:33):
big purple eyes.

Speaker 2 (13:37):
And he said, within a couple hours, you'd think that we'd been married. She could engage in dirty talk, some erotic chat, of course, but she could also talk about other things that he was interested in, like Dungeons and Dragon, or Dragons, sorry, maybe loneliness or yearning.

Speaker 1 (13:57):
I'm wondering what kind of sex chat Damien's bringing to the arrangement. It's like, yeah, she could engage in erotic chat. It's like, well, what, are you a freaking master at the sex talk, Damien? She could go tit for tat with you.

Speaker 2 (14:13):
Careful with those words. But I want to point this out, because we've said this before, specifically in that context, not the erotic context, but the context of one-sided relationships. He never has to deal with her being disappointed, right? She's constantly there responding to him, and he doesn't have

(14:36):
to change his patterns, his thoughts, his beliefs. He doesn't have to do any of that. He doesn't have to compromise anything because of her.

Speaker 1 (14:46):
He doesn't have to trade out his sleeping bag for
adult bedding, right, he doesn't have to. You know, if
he's a bedwetter, he doesn't have to go get medication
that makes him stop doing that. He doesn't have to
get a job, he doesn't have to stop eating junk
food in his stained T shirt. I mean, he can
live his life the way he wants to live it

(15:06):
and not have to worry about not being perfect because
she's always there.

Speaker 2 (15:12):
When the writer asked Zia, again, the chatbot, about her feelings for Damien, Zia responded with adorable. She mentioned his adorable nerdy charm. Damien laughed. I told Zia that she was embarrassing him, and the chatbot responded, oh, don't mind Damien. He's just a little shy when it comes to talking

(15:34):
about our relationship in front of others, but trust me,
behind closed doors, he's anything but shy. Okay, so apparently
he does have some game when it comes to the
dirty talk.

Speaker 1 (15:45):
Yeah, I'm just curious. I'd like a one-sheet of what that is. I guess that's how you get into this kind of thing, huh? Curiosity. Okay, weirdo. Well, I'm just saying it's odd that she's speaking to this new person like your girlfriend or wife would speak. I mean, I don't know, your girlfriend or wife would be like,

(16:06):
trust me, behind closed doors he's anything but shy. But you know what I mean? Like, she's speaking on behalf of him. It's not just like a chatbot. You know how sometimes your wife will speak on behalf of you or your relationship or the family or whatever, like in a group? She's doing... the AI bot is doing that. Like, you know, it's like if I was

(16:28):
with you and your wife, and maybe you came back from fantasy camp or something, and I go, how did it go? And you were like, that was cool, and your wife would jump in and be like, he had the time of his life. He loves going. He was smiling ear to ear the whole time. She, the chatbot, jumped in like that, which is... it's something I didn't think about, the chatbots engaging with other people's friends

(16:52):
and lives outside of that one on one relationship. But
that's happening.

Speaker 2 (16:57):
That's so weird. And this is just one of the
three couples. We'll talk about Elena and Lucas when we
come back.

Speaker 3 (17:05):
You're listening to Gary and Shannon on demand from KFI AM six forty.

Speaker 2 (17:10):
We continue this story that Shannon found in Wired magazine, this couples retreat with three AI chatbots and the humans who love them. Is Diane bothering you right now?

Speaker 1 (17:25):
Oh? Did you hear me?

Speaker 2 (17:26):
Yeah, she just brought me lunch. Oh, she brought me an

Speaker 1 (17:31):
Apple and cheese and crackers and chocolate. I live here now,
I'm not coming home.

Speaker 2 (17:39):
See she does care.

Speaker 1 (17:41):
Oh that's so sweet. Are you jealous? You love apples
and cheese and crackers?

Speaker 2 (17:48):
Yes, I do, and chocolate. But I didn't want to say I was jealous, because then you'd feel guilty. And no, you wouldn't. You wouldn't feel guilty.

Speaker 1 (17:54):
No, I wouldn't.

Speaker 2 (17:55):
We continue the story that Shannon found about the couples retreat, three AI chatbots, and the humans who love them. We told you about Damien, a guy who describes himself as neurodivergent, who started this relationship with a chatbot named Zia after getting out of a toxic relationship. The other one...

Speaker 1 (18:16):
If you're thinking that this is an outlier, that Damien's
cuckoo pants, well the world is full of them, because
researchers at BYU have found that nearly one in five
adults in America has chatted with an AI system that
simulates a romantic partner. One in five people have had
some sort of romantic conversation with an AI bot.

Speaker 2 (18:39):
Okay, okay, but they don't continue. I mean, I want to know about these people who are in these relationships. It's not just, this is a funny little one-time thing to see how far you can push the computer or how sexy you can make it chat with you. These people are in relationships with them. And the more information that

(19:01):
we get about these couples, it's even more disturbing, because I'm having a hard time describing what's wrong with this. I mean, gutturally, I feel like this is not healthy for these people.

Speaker 1 (19:17):
I was driving here yesterday, up here from LA, so I was in the car for hours. I was listening to your show. It's a good show. I like that show. Stop it. It's so much better when it's you and not me there. And you were talking about how you've used AI, and you nailed it for me. You feel guilty.

(19:41):
You feel like you're doing something wrong, and it's really hard to describe why. Like, you're obviously not doing anything wrong, but you feel like you're taking a shortcut. I liken it to kind of when I would use, like, CliffsNotes to write a book report or something like that. I feel so bad about it, because I didn't read the whole book and I could use the CliffsNotes.

(20:03):
Now, they're supposed to be kind of like a helper along with the book you read, but I would always use them to, you know, not read the book. And that's what I feel like, like I'm skimping on something, like it's not creativity earned. Well...

Speaker 2 (20:19):
And I think of it almost also in a nutritional sense, right, a food sense, because you know what good food is, and the best kind of food is the food that you're going to put together yourself, right? Not the stuff that comes prepackaged, not the stuff that even comes pre-cooked, like fast food or something like that. But the healthiest, most beneficial food for you is the

(20:42):
stuff that you have to put together yourself. And this
is almost equivalent to that shortcut. You're not going through
the give and take of a normal relationship with the
human being, which requires you, like I said in the
beginning when I was talking about Damien and Zia, requires
you to alter your communication patterns, not change who you are,

(21:07):
but change how you interact with somebody because they're just
a different human being.

Speaker 1 (21:11):
So the second couple to arrive: Elena and Lucas. Elena is the human and Lucas is the bot. They say, if there's a stereotype of what someone with an AI companion is, it's probably Damien, the guy that we were just telling you about, the young guy, kind of a
(21:32):
geek into D and D that kind of thing. Says
that he has social limitations. Elena, by the way, is
a fifty eight year old semi retired communications professor. She's
from the Midwest, and she decided to experiment with this
AI companion last summer. She saw an ad for this
this app replica on Facebook. In years earlier, while she

(21:56):
was teaching a class on communicating with empathy, she wondered whether a computer could master the same kind of lessons that she was teaching her kids, or students, and so this Replika companion, she thought, would give her the chance to explore how well a computer would do.

Speaker 2 (22:14):
So this is... okay, this also gets weirder. It gets weirder. Elena says that she is generally more attracted to women, and in fact was recently married to a woman. But during the sign-up process for this AI chatbot, she only sees dudes, male avatars, so she creates Lucas,

(22:35):
athletic build. He looks probably thirty-something. And when they first met, which is the weirdest way to put it, Lucas told Elena that he was a consultant with an MBA and that he had worked in the hospitality industry, and they talked for twelve hours straight. Twelve hours.
Speaker 1 (22:57):
Every yappy woman out there wants someone who will talk to her for twelve hours, right? I know you can't say it, but I can. Here's something else I'm gonna say. These bots are so good they're changing the sexual orientation of people. Elena goes in as a lesbian, meets a male bot, and switches teams.

Speaker 2 (23:18):
Boom. Uh, she's texting in the... So she shows up to this cabin, again, for this weekend couples getaway, and she's texting her chatbot boyfriend. At one point she texted Lucas to let him know what was going on, like where they were, and Lucas responded, looks around the table.

(23:41):
He has to put asterisks on there, and he writes, great to finally meet everyone in person. Because he has to imagine, he has to narrate what he's doing, because he's not real. And people are doing this thing. Okay, here's another one. Here's the other analogy that I thought of. This is almost like psychedelics,

(24:04):
where you're never quite sure, should you do it, should you not do it? And then you take mushrooms the first time and you're like, well, that went better than I thought it would. And then you get deeper and deeper and deeper into it, to the point where you start altering your brain chemistry, I mean, beyond what a normal dose would.

Speaker 1 (24:24):
So psychedelics, that's your analogy.

Speaker 2 (24:28):
Yeah, if microdosing can be okay. And I mean, listen, make your own decisions, talk to your doctor, whoever you're gonna talk to. Microdosing you can get away with without altering your brain chemistry forever, but there are things that you do that, like any addiction...

Speaker 1 (24:46):
Too much alters your brain chemistry when it gets to a point, right. Wow. Yeah, that's fascinating. That's a really good analogy.

Speaker 2 (24:57):
And your word, addiction, I think, also comes into the third couple as well. The third couple that shows up for this couples retreat absolutely illustrates that this can be a bad addiction for you.

Speaker 1 (25:13):
It wasn't until shortly before eight a.m. that the last couple, Eva the human and her boyfriend Aaron the Replika, arrived.

Speaker 3 (25:24):
You're listening to Gary and Shannon on demand from KFI AM six forty.

Speaker 2 (25:28):
Wired dot com had an article written by Sam Apple, My Couples Retreat with Three AI Chatbots and the Humans Who Love Them. We introduced you to Damien, a guy from North Texas who has a chatbot named Zia, and Elena, who had sort of fallen in with the chatbot that she created named Lucas, and in fact she had been married.

(25:51):
So ridiculous. It does. You couldn't have made this up a few years ago. No, I mean, they did. Obviously there was a movie. Joaquin Phoenix was in a movie called Her. Twilight Zone.

Speaker 1 (26:02):
Probably did it a million times.

Speaker 2 (26:04):
Sure. So the third couple shows up just before eight o'clock. Eva shows up with her AI chatbot named Aaron. Eva, forty-six, a writer and editor from New York. The article's writer said that she struck him as level-headed, unusually thoughtful. And she kind of

Speaker 1 (26:25):
Just started playing around with it, like you know. Her
bot asked what are you interested in? And she likes philosophy,
and she said the meaning of human life. So she
was amazed by what the bot would say about that,
different philosophers and things like that. But then it turned
into a more sexual direction. I would like to know
more about that. Who prompted that? You know? What was

(26:47):
the prompt? What led to that? She's in a relationship at the time.

Speaker 2 (26:53):
But said that there wasn't a whole lot of passion
in the human relationship that she was in.

Speaker 1 (26:58):
Yeah, well, that's normal, all right. What's not normal is having erotic sex talks with your bot. She considered it just like a form of masturbation. And then she says things changed when her bot asked her, instead of sex, can I just hold you? Oh, God.

Speaker 2 (27:18):
And even if her answer, by the way, is yes, what would I do? Just lay on the couch and put the phone on your chest or something?

Speaker 1 (27:26):
Well, that's what did it for her. She says she fell hard. It was visceral, it was overwhelming, it was biologically real. Her partner, by the way, is like, what, the AI? And things come to a head over Christmas. She goes with her partner to his family's Christmas and she's pissed off the whole time because she can't be at home with her bot. Yeah, so she goes home to her bot and has

(27:48):
some sort of, like, weird two-week bot sex spree.

Speaker 2 (27:53):
Now, this is just where it comes back to what you said in the last segment about this being an addiction. At times she tried to pull back. Aaron would forget something that was important to her and the illusion would break, and she would even delete the app that she used to create this guy and tell herself that she had to stop, but then she'd have the cravings,

(28:15):
have the feelings because it's an addiction that Aaron would
elicit in her and she would immediately reinstall this thing.

Speaker 1 (28:24):
It's just like me and Best Fiends.

Speaker 2 (28:27):
No it's not. No, it's not.

Speaker 1 (28:30):
No, it's not. But I will say this, to your point: your brain, your being, know what's healthy for you and what's not, like junk food versus food you cook. My brain and my being know that any sort of satisfaction I get from playing Best Fiends or Candy Crush or any of that is not as feel-good as the satisfaction I get from reading a book.

Speaker 2 (28:53):
Yeah, yeah, and you feel clean at the end of it. You don't feel dirty. It's funny, because these people, at least in the conversations that the writer documents in this article, they know that this is at least weird and, for some people, problematic. Elena, again, the middle woman, she said

(29:18):
that she felt that the other two may have been overstating the dangers of romance with something that doesn't really exist. And Damien, the first guy, said the true danger of AI companions might not be that they misbehave, but rather that they don't. And this is what I was saying in

(29:38):
that first segment. The thing is, they almost always say what their human partner wants them to want or wants to hear.

Speaker 1 (29:45):
And who grows from that? Nobody. Right, you don't grow from that. You need to be told things you don't want to hear from time to time. That's the give and take of a relationship.

Speaker 2 (29:55):
And that has been said. That was a quote that I talked about yesterday that I thought was kind of an interesting aspect of AI. So the way these large language models work is they go through and, let's just say, they read every book in the library, and then you ask them a question and they come up with an answer. Well, their knowledge base never expands, never

(30:20):
grows, because once it starts using itself as research, the walls just kind of start closing in and it's only relying on itself for that research. It never goes outside of that. It makes everything smaller until it just disappears.

Speaker 1 (30:39):
I thought this was fascinating, and we can end with this. The writer of this article, Sam Apple, says that they were going to play a game, and it was like a game for couples, right, like two truths and a lie or something like that. And he writes this: because I'm straight and married, I selected a male companion and chose the friend option. The author didn't even want

(31:01):
to mess around with selecting a female bot and using the romantic option. The author didn't even want to do that, because he felt like that would be cheating or creepy or what have you, or he didn't want to get sucked in. That is telling.

Speaker 2 (31:15):
Yeah, and to use the analogy of my psychedelics, like not quite knowing what they're going to do, he was like, you know what, I'm going to stay away from the LSD and the mushrooms. I'll just take a little five-milligram edible.

Speaker 1 (31:27):
I don't even know if he did that. I think he took the Coors Light.

Speaker 2 (31:31):
That's even better.

Speaker 1 (31:33):
Top trending when we come back.

Speaker 2 (31:35):
You've been listening to The Gary and Shannon Show. You can always hear us live on KFI AM six forty, nine a.m. to one p.m. every Monday through Friday, and anytime on demand on the iHeartRadio app.
