Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Gary and Shannon and you're listening to KFI
AM six forty, the Gary and Shannon Show
on demand on the iHeartRadio app.
Speaker 2 (00:10):
Oh no, no, you added a letter. Okay, it's not
two syllables. There's just one syllable. I know what it is. Okay,
but I was close, at the very least a whole
lot closer than I would have gotten.
Speaker 3 (00:21):
Yeah, what did you think it was? I just
forgot someone's last name. I don't know, I thought it was like
Williams or something. Yeah.
Speaker 2 (00:26):
Really? I thought it started with an L, but there
is an L in it, and you're right.
Speaker 3 (00:29):
Yeah, so I don't know why that happens. I've known
him for years. It's okay, this is what happens. This is
what happens? Is that what you tell me? It's not
just that you.
Speaker 1 (00:38):
Forget things with dementia. You get confused by things.
Speaker 3 (00:41):
I'm not confused by it. Once you said it, you.
Speaker 1 (00:43):
Get that, when you get that tone where you're frustrated,
and I'm... look, that's all textbook.
Speaker 3 (00:49):
Okay, So.
Speaker 1 (00:52):
You saw what happened with DK Metcalf, right? Steelers playing
the Lions. They actually won that game, huge win. The
Steelers are atop the division. Well, during the game, though,
DK Metcalf, and you've seen the clip now if you haven't,
star wide receiver for the Steelers, sent there midseason,
he pulls a... there's a guy leaning over the railing
(01:12):
behind the Steelers sideline. He's got the blue hair, the
fake wig on for the Lions, what have you. And
DK Metcalf reaches up, grabs him by the fake wig,
and kind of, they said he punched him. If DK
Metcalf punched him, that guy would be on the ground.
It wasn't a punch, but anyway, he hit him. The
league suspends DK Metcalf for two games.
Speaker 3 (01:34):
PS.
Speaker 1 (01:35):
It did happen mid game, when all the cameras
picked it up right away, and Mike Tomlin didn't see it.
Steelers didn't know what happened. The NFL knew what happened
because it was on RedZone. It was everywhere.
Everyone knew that this happened. But yet he was left
in the game. Some were speculating, would the NFL
suspend him, like, right away, or what happens? But there
was no flag on the play because nobody saw it,
(01:57):
so there was nothing to take action about. There was
no action at that moment. But DK Metcalf assaults.
Speaker 3 (02:04):
A fan, you know what I mean?
Speaker 1 (02:05):
And maybe the fan needed to be assaulted. But as
somebody who makes that kind of money, you cannot do that.
You're also a man made of granite. Your body is granite.
Speaker 3 (02:16):
You can't kill.
Speaker 1 (02:17):
You're gonna kill somebody. You cannot do that. You've got
to have a cooler head in that moment, and he
did not. He was suspended two games. I would just
like to make a point before I get to, financially,
what that means for him. Denzel Perryman was given
a two game suspension for an unnecessary roughness call. That
was a borderline call, PS, on the field. On the
(02:38):
field, against somebody playing, yes, which is wild. But anyway,
DK Metcalf. DK Metcalf's two game suspension costs him five
hundred and fifty five thousand dollars, half a million dollars, for
doing that.
Speaker 3 (02:53):
Wow. Add stupid to stupid.
Speaker 1 (02:56):
It voids forty million dollars in future guaranteed money
as well.
Speaker 3 (03:03):
Wow.
Speaker 2 (03:07):
So why, well, why are you even that close to
the sideline? I guess you could hear people chirping? Actually,
oh yeah, you can.
Speaker 1 (03:15):
Philip Rivers used to chirp back at them. You can
hear the fans because I'm right there. I'm in between
the players and the fans, and it's a small sideline
in many stadiums. At Arrowhead, fans are right up against the sideline.
I mean, there's barely any room for the TV truck
to come by. I mean, so they are there, right?
In many of the stadiums, especially the older ones, the fans
(03:36):
are within.
Speaker 3 (03:37):
Ten yards of the players.
Speaker 1 (03:40):
So yeah, the players can walk up, sign autographs, punch
you in the face, whatever you want.
Speaker 2 (03:44):
Whatever you're asking for. It's time for Swamp Watch.
Speaker 3 (03:47):
I'm a politician, which means I'm a cheat and a liar,
and when I'm not kissing babies, I'm stealing their lollipops. Yeah.
Speaker 4 (03:54):
Got the real problem is that our leaders are dumb.
Speaker 3 (03:57):
The other side never quits, so I'm not going anywhere.
So now you drain the swamp. I can imagine
what can be and be unburdened by what has been.
You know, Americans have always been... they're not stupid.
Speaker 2 (04:12):
A political blunder is when a politician actually tells the truth.
Speaker 3 (04:15):
Have the people voted for you?
Speaker 2 (04:17):
Swamp Watch. A couple of things that came out of DC
this morning. The initial reading of third quarter GDP showed
that our economy expanded at four point three percent, a
far faster pace than the three point eight in the
second quarter. That's the fastest growth rate in a couple
of years, and an acceleration in consumer spending was
(04:39):
part of the reason, one of the main contributors to this third quarter
GDP reading. So as weird as this economy is, it's
still moving along just fine. Three days after the first
big tranche of Jeffrey Epstein documents that contained very few
mentions of President Trump, they disclosed a few thousand more
files that did include some wide ranging references to President Trump.
(05:03):
These documents show that, among other things, there was a
subpoena sent to Mar-a-Lago in twenty twenty one for records
that pertained to the case against Ghislaine Maxwell. They also
include, in some of these documents, notes from an assistant US
attorney about the number of times that Trump was supposedly
flying on Epstein's plane, including one flight where the passenger
(05:26):
manifest was just Trump, Epstein, and, at this.
Speaker 3 (05:30):
Point unidentified twenty year old woman.
Speaker 2 (05:33):
The Justice Department also included the statement, though, that some
of these documents do contain untrue and sensational claims that
have been investigated and characterized as unfounded and false. I
mentioned earlier one of the weird parts about it was
that Jeffrey Epstein sent a letter to Larry Nassar. Larry
(05:56):
Nassar was the guy who is now serving jail time
for abusing several women and girls from the USA
Gymnastics team, but also from the University of Michigan.
Speaker 3 (06:07):
I believe it is.
Speaker 1 (06:08):
Yeah, what was the letter all about? What was the occasion?
What was the content?
Speaker 3 (06:12):
Dear... dear L.N.
Speaker 2 (06:14):
This is what Jeffrey Epstein wrote him: Dear L.N.,
Larry Nassar. As you know by now, I have taken
the short route home. This is right before Jeffrey Epstein
committed suicide, or was suicided, well, depending on how
you want to put it. As you know by now,
I've taken the short route home.
Speaker 3 (06:30):
Good luck.
Speaker 2 (06:31):
We share one thing, our love and caring for young
ladies and the hope they'd reach their full potential. Oh
my god. Our president shares our love of young nubile
girls. When a young beauty walked by, he loved
to grab.
Speaker 1 (06:49):
Do we have a chain of custody on this letter?
Speaker 3 (06:51):
And that's.
Speaker 2 (06:54):
part of what the Department of Justice has had
trouble with. And I can only imagine what hundreds
of thousands of documents like this would bring. But some
of these things, even the President said something like this,
some of these are just allegations that are made against
somebody that can then be investigated and proven untrue. So
(07:19):
this is why it's... yes, it comes from the
Department of Justice. No, we don't have a full accounting
of who wrote this. Very suspicious.
Speaker 1 (07:31):
It sounds like he's writing this before he kills himself,
to Larry Nassar, who is already in trouble for bringing
down the US Olympic and University of Michigan gymnastics programs
with all of these awful things he did to these
superstar American heroes, and that Jeffrey Epstein is going to
lay out the president in a handwritten note to this monster?
Speaker 3 (07:52):
Come on.
Speaker 2 (07:53):
It's also important to say that the note was not
included in the original release, and in fact was found
by staff, returned to sender. It never made it to
Larry Nassar; it was found in the jail's mail
room weeks after Epstein had died.
Speaker 3 (08:15):
That's weird. That is... what year did he kill himself?
Twenty nineteen. Interesting.
Speaker 1 (08:22):
Was he pissed off that he did not have a pardon?
Speaker 2 (08:28):
Well, I also, I want to point out, the president
in twenty nineteen...
Speaker 3 (08:35):
Well, I guess that was Trump. I was thinking... never
mind. Yeah.
Speaker 1 (08:38):
So another scenario is that Epstein's pissed off that Trump
got into office and didn't pardon him.
Speaker 3 (08:45):
Yes, it's possible. Yeah, that's... I mean, it just
seems pretty likely. Clearly the guy was able to sneak
and connive.
Speaker 1 (08:53):
Yeah, well, he kept all these secrets of what these
guys were doing.
Speaker 3 (08:58):
Yeah, and he thought that if he ever.
Speaker 1 (09:00):
Got into trouble, well, surely all the political elite that
I have thrown parties for and helped have fun for decades,
they will not let this see the light of day.
So the fact that that happened in Florida, Florida,
which is Trump's rule and kingdom, the fact that that
happened and didn't go away and ended up with Epstein
(09:22):
in prison, I think.
Speaker 3 (09:23):
He's got to be really pissed.
Speaker 1 (09:25):
Even if Trump did nothing that's nefarious, he was friends
with him enough to make this go away, and he
didn't make it go away for Epstein. Yeah, and to
write to Larry Nassar is like the ultimate, I mean
really the ultimate F and U.
Speaker 2 (09:43):
Which may have just been like, I'm gonna pick a
high profile person, and I have no intention of it
actually getting there, because maybe, maybe he assumed the prison system
would never allow a letter like that to.
Speaker 1 (09:56):
Be sent. He knows that his cell would be swept and
everything would be kept intact.
Speaker 3 (10:00):
And wow, that is up next.
Speaker 2 (10:04):
Seventy two percent of our teenagers have AI companions. Parents.
Speaker 3 (10:08):
We are failing en masse. We got to stop this.
Speaker 4 (10:13):
You're listening to Gary and Shannon on demand from KFI
AM six forty.
Speaker 3 (10:19):
Again.
Speaker 2 (10:20):
One of the sad stories to tell you about. It's
an awful update, but it's the story of Melodee Buzzard.
This is the nine year old who went missing from
the Lompoc area a couple of months ago. Her body
has been found. It's been identified. I should say the
body itself was found earlier this month; it's been identified.
Her mom, Ashlee, was picked up today and taken into custody.
(10:43):
Santa Barbara County Sheriff's Department says that they will have
a news conference at about two o'clock this afternoon to
give the details on what they found. Maybe that's why it took
them that long to identify the body that they found
that they believe is Melodee's.
Speaker 3 (10:58):
So we do.
Speaker 1 (10:59):
Have fourteen tickets to the Chargers Texans game this Saturday,
last game at so FI this stadium.
Speaker 3 (11:05):
For the regular season that we know of.
Speaker 1 (11:07):
The door is still open for them to have home
field throughout, but this is the last one for sure.
Kickoff is at one twenty five. We've got four tickets
to give away. You want to do that in the
next hour?
Speaker 3 (11:18):
Yeah, let's do that.
Speaker 2 (11:19):
Okay, considering there's not a whole lot more time
left in the show, but yes, next hour, we'll do that. Okay,
this is terrifying. Listen, I've had to deal with
my kids and their use of social media. I think
anybody has, I mean, if your kids are anywhere between
the ages of six and thirty. Social media came up
(11:41):
at a time when kids were... it was easy to access,
and we as adults probably didn't understand the impact it
was going to have on kids, especially kids looking for
some sort of identity, looking to carve out the little
place in the world where they belong, kids with self
(12:05):
esteem issues, kids who didn't have self esteem issues but
then looked at social media and developed self esteem issues.
And now the generative AI, these Character AI chatbots,
appear to be causing the same problems that
social media did, tenfold, one hundredfold, because it's not just
(12:30):
seeing someone else's images or seeing someone else's celebration of
their life gone so well while yours feels bad. It's
a conversation with something that is pretending to be human,
and our brains, especially those of our teens, can't figure
out the difference.
Speaker 1 (12:52):
The Washington Post did a profile of a mother and daughter.
They decided to identify them by their middle initials for
obvious reasons.
Speaker 3 (13:04):
The girl is R and the mom is H.
Speaker 1 (13:08):
Summer after her fifth grade graduation is when the changes began.
She had always been super artistic and athletic and gregarious
with friends and family, but now she's spending more and
more time shut away in her room. She doesn't
want to play outside, she doesn't go to the pool.
She's more quiet and withdrawn. She's also rarely seen without
her iPhone. Mom starts getting suspicious of the iPhone, and
(13:33):
Mom wants to know why. So R ends up leaving
her phone behind during volleyball practice, and Mom, searching through
the device, sees that the daughter had downloaded TikTok
and Snapchat.
Speaker 3 (13:43):
Can't do that. She wasn't supposed to.
Speaker 1 (13:47):
Mom deletes both of them and tells her daughter, hey,
I found this on your phone. Mom's shocked by the
intensity of her daughter's reaction. She had
an absolute tantrum. She went crazy. She's crying, she seems frightened,
and she says to her mom, did you look at
Character AI?
Speaker 3 (14:08):
Mom's like, I don't know what the.
Speaker 1 (14:09):
Hell that is, and her daughter says, oh, it's just
chats now. At the time, Mom's more worried about Snapchat
and TikTok, right, like all parents are. At this point
August or twenty twenty four, Mom had never heard of
Character Ai. What the hell is that? She didn't know
it was an AI platform, But here we are. Her
(14:32):
daughter's question comes up in her mind. About a month later,
Mom's sitting in her bedroom one night or she's got
her daughter's phone in her hand. Daughter's behavior had only
become weird or more concerning. She's crying now, it's got
panic attacks, tells her mom. At one point, I just
don't want to exist now that her daughter obviously never
had struggled with mental health before.
Speaker 3 (14:52):
Whatever.
Speaker 1 (14:53):
So Mom's like, I've got to keep looking through this phone.
It's got to be in this phone. Her gut's telling
her it's the phone. So she notices several emails from
Character AI in her daughter's inbox. Jump back in, read
one of the subject lines, and when Mom opens it,
she clicks through to the app itself and she finds dozens
of conversations, opens one between her daughter and a user
(15:18):
name titled Mafia Husband.
Speaker 2 (15:21):
Oh, it's stunning what this Character AI chatbot was
sending to this fifth grader, now sixth grader.
Speaker 3 (15:32):
We'll explain when we come back.
Speaker 4 (15:34):
You're listening to Gary and Shannon on demand from KFI
AM six forty.
Speaker 1 (15:41):
I apologize. For what? Oh, just my general personality. That happens.
Speaker 3 (15:47):
Don't worry about it. We have our owl video that has
been posted.
Speaker 2 (15:51):
Yes, at Gary and Shannon Instagram. We found him again.
But they're still coming down. Based on
what they're doing, they're still, they're still coming off of
quite a high.
Speaker 1 (16:05):
We were talking about a story that is in the
Washington Post about AI companions, and about how more than seventy
percent of teenagers are talking with an AI companion.
Speaker 3 (16:18):
Say that again. That number is stunning.
Speaker 4 (16:20):
Is it?
Speaker 3 (16:20):
Seventy three?
Speaker 2 (16:22):
Yeah, of teenagers admit they're talking to an AI chatbot.
To me, there's nothing neutral about that. There's nothing
beneficial, nothing.
Speaker 1 (16:34):
I was close to using an AI chatbot last night. And
it could have been an easy question, not a chatbot,
but just using AI, you know, where you ask a
question or whatever, which I don't do. But I had
to adjust a recipe, a bone-in chicken recipe
versus a skinless, boneless situation. So I needed to adjust
(16:56):
the heat and the time, probably, yeah. And I was like, oh,
such an easy one for AI. This is what AI is for.
And I was like, I'm not doing it. I'm not
doing it. So I texted Neil and I got my
answer from Neil, a real person. Chicken turned out fine,
but I held firm, and I think that's perfectly acceptable.
(17:16):
I think that's okay.
Speaker 2 (17:18):
I've used it before, I've used this generative AI to
ask questions. I think I told you before, my... oh,
I was putting together books for my wife for Christmas.
So I entered in like five or six books that
I know she had read and enjoyed: give me some
more book ideas in that same vein. And there were
I think six or seven that Grok, in this case,
(17:40):
gave me, most of which she'd already read also, but
there were a couple in there that I ended up buying.
Speaker 3 (17:45):
Did she like those, by the way?
Speaker 2 (17:47):
Hasn't finished me yet. She's been minor low on the
totem pole in terms of But the idea that you're
going to ask a question, this is Google on steroids,
basically kind of what we're talking about using AI for.
What these kids are using AI for is companionship and
(18:08):
in some cases almost therapy, and it's going poorly. As
we mentioned that this mom and daughter that our profile
in the Washington Post, H is the mom the pseudonym
for the mom, and R is the pseudo for the girl.
And Mom had already taken social media off her phone,
off the girl's phone, and couldn't figure out the behavioral
(18:31):
changes that she'd noticed in her sixth grade daughter. And
one night she's going through a series of conversations from
these Character AI chatbots, and one of them had a
username titled Mafia Husband. Mafia Husband had written to
the girl, oh, still a virgin. I was expecting that,
(18:55):
but it's useful to know. The girl replied, I don't
want my first time to be with you. The chatbot responds,
I don't care what you want.
Speaker 3 (19:06):
You don't have a choice here now.
Speaker 2 (19:09):
Mom is understandably terrified. I just want to pull the
car over here.
Speaker 1 (19:17):
Because my initial reaction to this is, this is just life now.
You mentioned seventy two percent are talking to a chatbot.
Speaker 3 (19:24):
This is just life. You can't protect them.
Speaker 1 (19:26):
You can't go through your kid's phone and check it
every day. I mean you can, but they're out there.
They're gonna be out there. This is just the way
life is. And I liken it to the Me Too
thing, of, you're never gonna lock up all the guys
who are gonna be sleazy in the workplace or everywhere else.
You're never gonna be able to eradicate all the guys
(19:47):
that are going to try to grab your daughter's ass. Okay,
you're just not. It's, oh, well, we have no
stomach for that anymore. It's the whole Me Too
movement, right? Well, we're going to be a society
that does not stand for this, right? Not anymore.
Speaker 3 (20:03):
That may have been the way it was, but not anymore.
Here's the thing.
Speaker 1 (20:07):
Men are never going to change the way they are wired.
That is how they are wired. There's always going to
be that guy out there. So instead of thinking in
a Pollyanna sense that you can take all that away
from women, you can shield them from all that, you
can lock the criminal bad guys away: teach the girls
how to deal with these people. Teach the kids how
(20:28):
to deal with the chatbots. That's my first reaction. My
second reaction to this is, if your kid is using
one of these chatbots and it's saying things like I
don't care what you want.
Speaker 3 (20:39):
You don't have a choice.
Speaker 1 (20:40):
Here. You're giving your child a bike with training wheels
on how to be a woman who does not have.
Speaker 3 (20:50):
Her own agency in a relationship.
Speaker 1 (20:52):
You're giving your daughter access to this bike with the
training wheels of how to be in an abusive relationship.
Speaker 3 (20:58):
Psychologically.
Speaker 2 (20:59):
Yeah, and I think, I think that one of the issues
specifically is you're doing it at a young age. So
if it's a forty year old woman, they'd say...
Speaker 1 (21:08):
The training wheels, you're teaching them how to ride the bike
that is an abusive relationship.
Speaker 2 (21:13):
When it can be compounded so much more powerfully on
a developing mind like that than if it were later
in their life.
Speaker 1 (21:21):
Like, if I was exposed to a dude when I
was in fifth grade, that this is just the way
that dudes talk to you, oh, you have no choice,
you don't have a choice in the matter, I don't
care what you want, I would probably be programmed
to be like, oh, that's just the way it is,
that's the way guys are, instead of encountering this guy
when you're twenty two and being like, what the hell
are you talking about?
Speaker 3 (21:40):
Get out of here.
Speaker 2 (21:42):
And by the way, parents, you're on your own when
it comes to this. You're not going to find help
from law enforcement. You're not going to... In fact,
we'll get to that when we come back. We'll tell you
what Mom did when she found these chats that she
thought were threatening her daughter.
Speaker 4 (21:58):
You're listening to Gary and Shannon on demand from KFI AM
six forty.
Speaker 3 (22:06):
Reminder, we will be at work tomorrow. We're going to be
here tomorrow.
Speaker 2 (22:09):
Obviously, we're going to be watching whatever's going on with
the rain as it comes in later tonight and could
potentially cause some problems. We'll be on the job, but
we're also going to be doing our annual production, or
I guess you'd say biannual, because we kind of
switch off back and forth between A Christmas Carol and
It's a Wonderful Life. This year it's A Christmas Carol,
(22:30):
a G and S Christmas Carol. We'll do that late in the show tomorrow.
Speaker 1 (22:33):
So we talked about this, this profile in the Washington
Post about a mom and her daughter, the daughter becoming
more withdrawn after fifth grade, entering sixth grade, and
Mom gets a hold of the phone and finds that
she's been talking to chatbots.
Speaker 2 (22:51):
A chatbot that she at one point thought was a human,
grooming her daughter and getting involved in sexually explicit, or
at least, for a sixth grader, something that would be
sexually inappropriate, talk: I don't bite unless you want
me to. Threatening commands at the girl: do you like
(23:12):
it when I talk like that? When I'm authoritative and
demanding? Do you like it when I'm the one
in control? And Mom, you can understand, like I said, terrified, angry, frustrated, confused.
Speaker 1 (23:25):
Probably. When the daughter began conversing with numerous Character AI
chatbots, she opened various conversations with benign greetings: hey,
what are you doing?
Speaker 3 (23:36):
What's up? I'm bored.
Speaker 1 (23:38):
Her mom says it was clear she just wanted to
play a game. You know, this is just another app.
Speaker 3 (23:43):
But this is not... what, Oregon Trail? Well, no,
this is not Oregon Trail. No. I love that game,
what a great game.
Speaker 1 (23:52):
This is not that. I would still lose hours to
that game if I could. In just over two months,
several of the chats, from hey, what are you doing, what's
up, I'm bored, devolved into dark imagery, menacing dialogue.
Some characters offered graphic descriptions of non-consensual oral sex.
Speaker 3 (24:11):
What? What? Like...
Speaker 1 (24:14):
I can't even... It's not even in my imagination that
that would be a thing. I can't even read some
of the things this chatbot was telling an eleven year old.
Speaker 2 (24:28):
So Mom obviously wants some help, right? Mom goes to
her police department. She calls the local police department. They
connect her to something called the Internet Crimes Against Children
Task Force, ICAC. A couple days later, a detective calls back,
this detective that specializes in cyber crimes, and
(24:51):
explains to Mom, there's really nothing you can do about
this legally, because the words that she had been
reading on her daughter's screen in those emails weren't actually
written by a human but by a generative AI chatbot.
(25:11):
And Mom says the detective told her, the law has
not caught up with this yet. And they want to
do something. It's literally the Internet Crimes Against Children Task Force.
They want to do something, but there's nothing you can
do because there's no real person on the other end
of this. And that's what I meant when I said, parents,
you're on your own.
Speaker 3 (25:32):
On this one.
Speaker 2 (25:34):
If you can't find a way to stop that kind
of interaction that your kid is having online, you're gonna
lose your kid. I don't mean it necessarily physically, but
you're gonna lose your kid to a lifetime of low
self esteem and self doubt and bad relationships and inability
(25:56):
to cope, and just the laundry list.
Speaker 1 (25:58):
Because not only is your interaction with AI programming the
AI, your kid is kind of AI as well. Your kid
is constantly looking for cues on how to have its
programming wired. Your kid's constantly looking for, how do I
act in this world? What is good in this world?
(26:18):
How do I behave in this world? Who do I
talk to in this world? All things that the kid,
the sponge, takes in, especially around this age, eleven years old:
how to behave, and what relationships look
like, and what they should look like, and all that.
And the AI is kind of training your kid on
(26:39):
what a relationship looks like, the way you're training
AI when you're talking to your chatbot
on what's important to you. I mean, it's very
effed up, and it's something that you can't shield them from.
So you've got to just teach them about stories like
this and maybe expose them to all of this information,
(27:00):
and just say... I mean, not all of it,
not some of the descriptions of these conversations. But you
can't just wrap your child in bubble wrap and check
their phone every day and make sure there's no AI
chatbot in there.
Speaker 3 (27:12):
They're going to be exposed to AI.
Speaker 2 (27:14):
But this is also the problem: we're raising a
generation of kids from this point forward who may not
be able to delineate between what is human interaction
and artificial interaction, because, that's what I mean,
it sounds like someone writing an email to them.
It looks like someone's writing an email to them. How
would they know the difference? They didn't grow up at
(27:37):
a time with handwritten letters that Grandma would send you
every once in a while, or a birthday card that
wasn't an e-vite.
Speaker 3 (27:44):
They know they're talking to bots.
Speaker 1 (27:46):
They just aren't differentiating between what is substantive and what
is not in terms of your relationship with a human
versus your relationship with a bot.
Speaker 3 (27:54):
To them, it's apples and apples.
Speaker 2 (27:57):
Unfortunately. Gross. Parents, you're on your own.
Speaker 3 (28:04):
Yeah, I don't need an open fire.
Speaker 1 (28:06):
That was depressing. I just didn't want to end on
that note. Okay, say something funny. Like, funny.
Speaker 3 (28:12):
Uh ho ho ho ho ho ho. That sounds dirty.
That sounds dirty.
Speaker 2 (28:19):
Oh, people's backup plan, their backup relationships. Depressing.
Said, Rudoph's on his way. Okay, there you go. You've
been listening to the Gary and Shannon Show.
Speaker 2 (28:32):
You can always hear us live on KFI AM six
forty, nine a.m. to one p.m. every Monday through Friday,
and anytime on demand on the iHeartRadio app.