Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Gary and Shannon and you're listening to KFI
AM six forty, the Gary and Shannon Show on demand
on the iHeartRadio app. Big story out of Pennsylvania this
hour: a steel plant explosion. There are people trapped under
rubble. Multiple people reported injured. The report that
I just got was two fatalities.
Speaker 2 (00:22):
At least one.
Speaker 3 (00:24):
I know that it's at least one dead, two missing after
this explosion.
Speaker 1 (00:29):
Yes, I'm sorry. One person has died, two missing.
Speaker 3 (00:33):
It's a place called the Clairton Coke Works. The
mayor of the little town said there have been multiple injuries.
Like we said, Allegheny County Emergency Services has transported at
least five people, but no immediate details about their conditions.
There was an explosion, according to one of the employees,
that happened in the reversing room between the thirteenth and
(00:55):
fifteenth battery. It's part of a control system of the
coking factory.
Speaker 4 (00:58):
There.
Speaker 1 (01:00):
Tough job, dangerous job, everyone knows it. But no
less a tragedy there. The scene is still very active,
so we'll stay on top of that. Eleven o'clock is
when we check up on Washington.
Speaker 2 (01:14):
I'm a politician, which means I'm a cheat and a liar.
And when I'm not kissing babies, I'm stealing their lollipops.
Speaker 5 (01:19):
Here we got.
Speaker 6 (01:20):
The real problem is that our leaders are done.
Speaker 2 (01:23):
The other side never quits, so.
Speaker 7 (01:25):
What I'm not going anywhere?
Speaker 8 (01:28):
So that is how you train the squat.
Speaker 2 (01:31):
I can imagine what can be and be unburdened by
what has been. You know, Americans have always been going
at present, but they're not stupid.
Speaker 3 (01:38):
A political blunder is when a politician actually tells the truth.
Speaker 2 (01:41):
Have the people voted for you with not swap watch
they're all count of knowing.
Speaker 1 (01:45):
Well, it's not just California. Trump has set his sights
on DC. He said today it's Liberation Day in DC.
Our capital city has been overtaken by violent gangs and
bloodthirsty criminals, roving mobs of wild youth, drugged out maniacs,
and homeless people.
Speaker 6 (02:02):
Washington, DC should be one of the safest, cleanest, and
most beautiful cities anywhere in the world. And we're gonna
make it that. We're gonna make it safe, we're going
to make it smart, We're gonna make it beautiful, so beautiful.
Speaker 3 (02:14):
And remember what your dad used to tell you about
the restaurant with the dirty door.
Speaker 6 (02:18):
And a wonderful father, very smart, and he used to say, son,
when you walk into a restaurant and you see a
dirty front door, don't go in, because if the front
door is dirty, the kitchen's dirty.
Speaker 2 (02:32):
Also, all the kitchens are dirty.
Speaker 6 (02:36):
Our capital's dirty. Our whole country is dirty, and
they don't respect us.
Speaker 1 (02:41):
Now in DC, violent crime is at a thirty year low.
Homicides down thirty two percent, robberies down thirty nine percent,
carjackings down fifty three percent. But that's neither here nor
there when it comes to how the president sees things. Apparently
he also has a warning for Los Angeles, he said.
(03:01):
Hopefully LA's watching. The mayor's incompetent, and so is Governor Newsom,
he said, renewing his criticism of local officials for not
accelerating rebuilding in the permitting process after the fires. He
said about Gavin, apparently he's got a good line of bs.
But that's it. He said the whole word.
Speaker 2 (03:25):
I didn't hear that. I didn't either.
Speaker 3 (03:27):
One of the other questions that came up today during
that news conference, and of course made headlines over
the weekend, was the fact that there will be a
summit between President Trump and Russian President Putin coming up
on Friday.
Speaker 6 (03:41):
I'm going to meet with President Putin and we're going
to see what he has in mind, and if it's
a fair deal, I'll reveal it to the European Union
leaders and to the NATO leaders, and also to President Zelensky.
I think, out of respect, I'll call him first, and
then i'll call them after. And I may say lots
(04:02):
of luck, keep fighting, or I may say we can
make a deal. I will tell you this, I've seen
a poll coming out of Ukraine. Eighty eight percent of
the people would like to see a deal made. And
if you go back three years, everybody was gung ho
for war. You know, everybody's gung ho for war until
you have it.
Speaker 1 (04:22):
How much of a deal can be made if it's
Trump and Putin in the room and not Zelensky?
And I understand why Zelensky's being left out. It's what guys
like Trump and Putin do. They leave out whoever
they determine to be the lesser person when it
comes to status or power, what have you.
Speaker 2 (04:37):
But I don't know how that happens, I don't know.
Speaker 3 (04:41):
Zelensky has been pretty vocal about his non-invite and
has basically said, they're trying to mess with you. He said,
we understand the Russians' intention to try to deceive America.
Speaker 2 (04:51):
We will not allow this. He said that last night.
Speaker 3 (04:55):
Zelensky's also cautioned that this is supposed to drive some
sort of a wedge between the United States, Ukraine and
its European allies, by putting forth demands that the Kremlin
knows Ukraine cannot accept and then saying that it's Zelensky
that is so stubborn about this that they can't actually
get a deal done.
Speaker 1 (05:13):
If you didn't have Beto O'Rourke on your BINGO card,
you're missing out. He's making some headlines for railing against
Republicans and calling on Democrats to f the rules.
He's attempting to use his nonprofit group Powered by People
(05:35):
to slip money to Texas State Democrats who took off
from the state, breaking that quorum at the special called
legislative session there to stall the passage of the new
district map, the old gerrymandering mess that is Texas right now.
But in this deranged and profane rant, Beto O'Rourke called
out Republicans as fascists and told Democrat states to redraw
(05:59):
their districts. Now, f the rules. We're gonna win whatever
it takes. We're not going to let anyone stop us.
Are you with me on that?
Speaker 3 (06:07):
How does he survive for six years as a politician
with no office?
Speaker 2 (06:17):
Is he surviving?
Speaker 3 (06:19):
He's surviving to the point where he keeps showing up
every six months or so and getting his name in the headlines.
Speaker 1 (06:24):
Does he? I haven't heard his name called in a
very long time. But he's got a point. Democrats do
need to f the rules. The rules got them to
putting Hillary Clinton ahead of Joe Biden, and they got
them to running Joe Biden again despite the fact that
he was not there mentally. The rules got them to
(06:46):
giving Kamala Harris one hundred and seven days or whatever
to make more of a disaster out of her political future.
The rules have done them no favors. So I'm with
Beto O'Rourke. If they want to win, they do have
to f the rules.
Speaker 3 (06:59):
That's funny, because, I mean, I think you're right
in that the party itself, playing by its own rules,
shot itself in the foot.
Speaker 1 (07:09):
The DNC screwed Bernie Sanders for the rules. Where'd that
land them?
Speaker 2 (07:14):
I don't know things think that I know.
Speaker 1 (07:20):
I know. I don't like to look at him as
the beacon of truth and light.
Speaker 3 (07:26):
Gavin Newsom, Josh Shapiro, Rahm Emanuel, all of these guys
are running themselves, running for president, likely all three of
them at least, and among others, likely running for the
Democratic nomination in twenty twenty eight, and a lot of
(07:48):
them have what they refer to, according to The Washington Post,
rejected parts of liberal orthodoxy. They're now going back to
a centrist message, or I should say, they're trying to
go back to a centrist message. Although I don't think
that these guys are centrists; all you have
(08:10):
to do is look at their recent past and you
see that they're not. But they're going to use
that messaging. They're going to use that, try to use
that political stance as a way to appeal to more
voters in twenty twenty eight.
Speaker 1 (08:22):
Some people this rings true for. Pennsylvania Governor Josh Shapiro,
to me, has always seemed more moderate than a Gavin Newsom.
He was on a podcast hosted by Ted Nugent.
Speaker 2 (08:35):
Oh wow.
Speaker 1 (08:36):
And then you had Rahm Emanuel, who seems to be
a straight shooter, going on the Megyn Kelly Show and
saying that a man cannot become a woman, and then
joking that the answer would require him to go into
a witness protection program. Also, somebody who I think holds
a lot more weight intellectually than Gavin Newsom. Gavin Newsom going
(08:56):
on a podcast hosted by Sean Ryan, former Navy SEAL,
who opened the show handing Newsom a SIG Sauer pistol, and
then Newsom going.
Speaker 2 (09:04):
This is too cool.
Speaker 1 (09:05):
That doesn't ring true to me the way Josh Shapiro
and Rahm Emanuel do. But Gavin Newsom has never appeared
to me to be a real person. He flies with whatever,
wherever the wind blows, for him to earn points politically, period.
He's literally not somebody who stands for anything. Josh Shapiro
(09:26):
and Rahm Emanuel, the pivot to more moderate makes sense
to me, and it seems more.
Speaker 2 (09:31):
Authentic. An easier road for him, huh? For them? Yeah, well,
we'll see, we'll see how it pans out. Up next.
Speaker 3 (09:38):
A little bit of your rideshare nightmare stories. If you
have one, let us know. If you're a driver and
crazy stuff has gone on in your back seat, or
if you were in the back seat and crazy stuff
went on with the driver, let us know. Uber, Lyft.
Speaker 2 (09:52):
I don't care which ones, but we'd love to hear them.
Speaker 3 (09:54):
If you're listening on the app, just hit that microphone
button that appears on the app there and you can
record a message and it comes right into the computer.
Speaker 8 (10:02):
Here. You're listening to Gary and Shannon on demand from
KFI Am six forty.
Speaker 2 (10:10):
Did we hear back from Conway? He doesn't return our calls. Well,
I'll text him right now.
Speaker 3 (10:15):
We're going to get Conway on to talk about
Steph Fusche and the recovery. If you don't know, Tim
Conway and Mo Kelly share something in common. They share
their technical director, Steph Fusche, who was in a really
bad car accident on Thursday on his way to work
(10:36):
as a matter of fact, and has had a couple
of surgeries and is due for a couple more.
Speaker 2 (10:42):
And Tim actually got to go see.
Speaker 3 (10:43):
Him apparently as he's recovering in the hospital, and we
wanted to talk to Tim about the fundraiser that they've done,
the GoFundMe page that has been started to help him
get back on his feet, replace his car, deal with surgeries,
and he's gonna be out of work for a while.
We were trying to get Tim, or maybe we just go
to his house.
Speaker 2 (11:04):
I'll go to his house right now.
Speaker 3 (11:06):
We were talking earlier also about that crazy zoo story.
Was it in Denmark where the zoo asked locals for
small animals to feed to their big cats for health purposes?
Speaker 1 (11:20):
Yeah, and a woman gave her daughter's pony to be
eaten by a lion.
Speaker 3 (11:25):
And the pony was sick. The pony was going
to be euthanized anyway.
Speaker 4 (11:30):
All right. After listening to Gary laugh about that poor pony,
I'm calling BS on that Kevin story. I think you
drove down to the San Diego Zoo and tossed that
poor cat in with the lions or the alligators. Total
BS on that story. Gary, you got to do better, man.
Speaker 2 (11:47):
Why San Diego? Why would you take Kevin to San
Diego? To hide the trail? Like... I don't know.
Speaker 3 (11:59):
Yeah, well, it would have been a very long drive just to
go from Seattle to...
Speaker 1 (12:03):
You have talked before about driving bodies out to the desert.
And that would be your thing if you had a
body to put in the trunk. And that's not
a human body.
Speaker 2 (12:16):
Also not a human body.
Speaker 9 (12:17):
I had a late night run where I picked up
a woman out of a grocery market lot, and
Speaker 2 (12:23):
She was on the phone with a hotel trying to get prices.
Speaker 9 (12:25):
Along the way, she asked if she could guess my age.
I said okay; she guessed ten years older than I was.
Speaker 2 (12:31):
She asked me then to guess her age.
Speaker 9 (12:33):
I said, no, it's not nice to ask a lady's age,
or guess her age in this case. And then she
asked the strangest question I ever had, which is,
do you want to get someone pregnant or not? Right before
she got in the car, she asked if she could kiss
me on the cheek, and I said no.
Speaker 2 (12:47):
Whoa. What? That is aggressive.
Speaker 5 (12:50):
Sir?
Speaker 1 (12:51):
I am sorry you went through that. Do you want to
get someone pregnant? My God.
Speaker 10 (12:58):
Good Lord, Shannon. Leef here. I was a Lyft
driver in the beginning of Lyft, and they always encouraged
you to put the passengers in the front seat. Picked
up a guy in downtown who looked like a pedophile. Prayed
to God that I was going to be okay. Had
to take him up Echo Park, in those mountains with
the windy roads, dark, prayed all the way up, got
(13:22):
him home. He looked at me and said, thank you
for bringing me home safely to my wife.
Speaker 7 (13:26):
There you go. Okay. Hey, Gary and Shannon, you guys have
been talking about Uber all morning.
Speaker 2 (13:31):
By the way, I drive for Lyft. I used to
drive for Uber.
Speaker 7 (13:35):
I am between jobs and driving for Lyft again, looking
for a real job. But anyway, you're all stuck on Uber.
Honestly, Lyft is way better. I had bad experiences as
a driver with Uber. I have been assaulted. Yes, someone
threw up in my car and they didn't do anything
about it.
Speaker 2 (13:50):
But I'm not a big Uber fan.
Speaker 7 (13:51):
As a driver, with Lyft I've had great experiences, much better.
Just saying.
Speaker 2 (13:57):
Okay, I don't think I've ever... I've used
Speaker 3 (14:00):
Lyft, uh, in one of those times when we were at
the conventions, I don't remember, most recently in Chicago, or...
Speaker 1 (14:08):
I didn't think we were going to get so many
men talking about how they've been assaulted.
Speaker 3 (14:13):
There are a handful, really, yeah, who say they've
had handfuls taken from them. Handfuls of what? Somebody sitting
in the passenger seat reaches over and... it ain't
a stick shift, something like that.
Speaker 1 (14:28):
Wait, someone who's sitting in the passenger seat? The guy...
the guys, why aren't you sitting in the back?
Speaker 2 (14:34):
Well, that's another thing, is where do you sit? Some
people don't sit.
Speaker 1 (14:38):
So female Uber drivers are reaching over and grabbing a
handful of genitals.
Speaker 3 (14:42):
No, no, no, female Uber riders are reaching over and grabbing
the genitals of male drivers.
Speaker 2 (14:48):
Oh really, yeah, okay, you don't think that's happened. It's Gerald.
Speaker 5 (14:55):
I used to be a Lyft and Uber driver, and yes,
I had a very drunk man in my car. Man
actually was with his friends and something had just happened.
He was really pissed. He was fortunately in the back seat,
took a swing at me. His hand glanced off the
headrest and hit my shoulder, and the group of them ended
(15:15):
up walking on the freeway from that point forward.
Speaker 2 (15:18):
Yeah, get out of my car right there on the
one ten.
Speaker 10 (15:22):
Hey, Gary and Shannon, it's Derek.
Speaker 9 (15:24):
When I was an Uber driver, I had to drive
a guy and his pregnant wife to the hospital. About
ten minutes in, the water broke in my back seat.
Speaker 2 (15:33):
Oh man, the guy at the very end gave me
a twenty dollar tip and said, here's for the mess.
That's it? You're all going... You've got, you've got a box, you've
Speaker 1 (15:43):
Got placenta, uh, stuff, matter in your car, and you
get twenty bucks. I mean, that's at least four hundred, right?
What's worse, vomit or whatever comes out with the water?
What is that anyway?
Speaker 2 (15:55):
Is it just water? What goes on? Amniotic fluid? Okay,
so it's like clear, it's yeah, it's fine. I've never
had my water break. When your wife's water broke, where
was it? I have no idea. Did her water even break?
Speaker 8 (16:12):
Wow?
Speaker 2 (16:15):
You don't remember? I honestly don't know the answer
to that question. Huh.
Speaker 1 (16:22):
I was going to ask if you offered up your
ex-fiancée's quilt to wipe it up.
Speaker 4 (16:26):
With feeling safer?
Speaker 2 (16:28):
In taxis, I take them from the airport and, gee,
my taxi driver tried to charge me double fare. And
I know what the fare
Speaker 4 (16:39):
Is because I do this all the time.
Speaker 1 (16:41):
I called the cab company and you know what, they
didn't do a darn thing about it.
Speaker 2 (16:48):
What? So, people, beware. Thank you, love you, bye. No,
I love you. Gosh, that was very nice.
Speaker 1 (16:56):
That's, you know what, that happened to me in Greece.
But that's, you know, you expect that to happen. It's
different being trapped in Greece. And where are you going
to go? You know you're on an island. What, you
gotta pay that man.
Speaker 11 (17:08):
As a driver, we get harassed nightly, and if we
are brave enough to drive the drunk hours.
Speaker 2 (17:15):
You know that the witching hour.
Speaker 11 (17:17):
One thirty to about three o'clock. It's multiple times in
an hour.
Speaker 8 (17:24):
And when we go to make a complaint with Uber, well,
they don't give us a hotline we can call, and
we get a chatbot.
Speaker 11 (17:31):
So I say this confidently: screw Uber.
Speaker 2 (17:35):
Whoa. Riders? Wow, whoa. That's aggressive too.
Speaker 10 (17:39):
Oh.
Speaker 11 (17:39):
Just to add to that, riders, if you have a
ride that's supposed to take three minutes of total driving
time and we're sitting out front waiting for you for
almost a full seven minutes, Oh, I.
Speaker 2 (17:54):
Can't play that. He's very upset. He says f you
to them as well.
Speaker 1 (17:57):
Was he listening to the Handel show that started everyone
off in a bad mood this morning?
Speaker 4 (18:02):
Oh?
Speaker 2 (18:02):
I do not know.
Speaker 3 (18:04):
If you have rideshare nightmares, you can let us know
what they were on the talkback feature on the iHeart app.
Speaker 2 (18:10):
It was very email.
Speaker 3 (18:11):
He's like in furthermore, therefore, I am not done with you.
Speaker 2 (18:17):
You. Oh, that's what he said? God, I didn't bleep it?
Does he say F you? Yeah? Really?
Speaker 1 (18:24):
Yes. I don't think he's upset about the Uber. What
do you think he's upset about? Oh, who knows what
happened this weekend.
Speaker 11 (18:31):
Three minutes of total driving time and we're sitting out
front waiting for you for almost the full seven minutes?
Speaker 1 (18:40):
Thank you too. Yes, that's a guy from Northern California,
for a couple of reasons. Number one, he is counting
seven minutes. People from SoCal, it's either five
or ten minutes, right? You get a person from Northern
California and it's seven minutes.
Speaker 2 (18:58):
And he did both. He did three minutes and
he did seven minutes.
Speaker 1 (19:01):
I would wager he's either from Northern California or the
East Coast. That is not a Southern California guy. He
was not raised here. I would put money on it.
Speaker 2 (19:10):
Not worried about the details, is what you're saying.
Speaker 1 (19:13):
So keyed in and super hyper-focused on the time, and
the irritability factor also says to me Northern California slash
East Coast.
Speaker 2 (19:23):
I've heard from my wife, oh, about where her water broke. Oh, excellent.
Let's do that when we come back. Are you interested?
It sounds like fun.
Speaker 8 (19:32):
You're listening to Gary and Shannon on demand from KFI
AM six forty.
Speaker 3 (19:38):
Still trying to find out more of the details of
that explosion at a steel factory in Pennsylvania right along
the Monongahela River.
Speaker 2 (19:48):
I've been waiting to say that since I saw the
name of it. You nailed it.
Speaker 3 (19:51):
At least one person killed, they believe, and two others missing,
and they say they may be trapped. Several others that
were also hurt were taken to the hospital. AOL, AOL
is going away. AOL's dial-up service.
Speaker 2 (20:10):
They still have it.
Speaker 3 (20:12):
AOL's dial-up Internet service will no longer be available
in the AOL plans. Dial-up service, along with the
AOL Dialer software and the AOL Shield browser, will be
discontinued coming up on September thirty. Verizon sold AOL and
Yahoo to private equity firm Apollo Global Management for five
(20:35):
billion dollars back in twenty twenty one. Tigers beat the
Angels nine to five; the Blue Jays beat the Dodgers five
to four. The Angels will host the Dodgers starting tonight.
Speaker 1 (20:45):
Elmar, how are you doing building our AI update desk?
Speaker 2 (20:49):
How's that going? It's going to be incredible. Are you
using AI to do it? No, of course not. Good.
I mean, it'd be kind of funny. An update:
Speaker 3 (21:00):
Yeah, you asked where my wife was when her water broke.
Speaker 2 (21:04):
And you said, I don't know. Did her water even
break? Oh my god.
Speaker 3 (21:09):
If you were pressed, if you pressed me to answer,
I would have said, I think it broke while we
were in the hospital.
Speaker 4 (21:18):
Oh.
Speaker 2 (21:18):
I thought that sentence was going to end differently. What
is wrong with you? So much. She says, I am
a lady.
Speaker 3 (21:32):
My water broke at the hospital, on a Chux, where
it belongs.
Speaker 2 (21:35):
Oh, a Chux pad, though?
Speaker 1 (21:37):
Is she insinuating that ladies whose water breaks in an
Uber are not ladies?
Speaker 3 (21:42):
I think it's more that she's trying to, she
knew it was headed that way, portray that there is
self-control.
Speaker 2 (21:48):
Do you know when your water's going to break?
Speaker 10 (21:50):
Like?
Speaker 1 (21:50):
Are there telltale signs like it's happening within the next
few hours. You're eight or nine months pregnant, right, thank you.
Speaker 2 (21:57):
I don't know. I don't know if there's like a
let's start, does it trickle or does it just go right?
I mean, aren't you already trickling at that point? No,
you're not.
Speaker 8 (22:09):
I don't know.
Speaker 1 (22:10):
You're nine months pregnant. Have you been nine months pregnant?
I would imagine you just don't even care. You're just
peeing all the time.
Speaker 2 (22:16):
Probably, I don't know if you'd you'd still care.
Speaker 1 (22:19):
If you're nine months pregnant, you know, things are gonna
things are gonna leak. I would imagine.
Speaker 2 (22:24):
I don't know. It's like a big boat. It's like
a big boat.
Speaker 1 (22:29):
Boats leak. Did you see Jelly Roll has lost two
hundred pounds? Hold on, it's like a big boat.
Speaker 2 (22:41):
Listen, I don't know. AI all right, yep.
Speaker 1 (22:48):
The next generation of bots will build psychological profiles on
you and potentially billions of other people, and then comment
and interact the same
Speaker 2 (23:00):
As normal people. They're good... not a good thing.
This may be the downfall.
Speaker 1 (23:03):
Of bots, though, because people love the bots... no, because
they don't interact like other people do. The bots seem
to care about you more than other people do, don't
they? They tailor their responses to what you want to hear.
That's why people are attracted to the bots. People don't
do that. People don't tailor what you want to hear
and then dish it back out to you. That's why
(23:24):
the world is such a cruel place. That's why it's
so safe in the basement with your bot.
Speaker 3 (23:29):
A couple of professors at Vanderbilt University who specialize in national
and international security talk about why we're going to need
to be even more vigilant when it comes to what
is real in the digital world, what and who is
real when it comes to the digital world. They went
through a bunch of different documents uncovered by Vanderbilt's Institute
(23:51):
of National Security, and they expose how there's a specific Chinese company, GoLaxy,
that optimizes fake people to dupe and to deceive.
Speaker 1 (24:01):
They go through your values, your beliefs, your emotional tendencies,
your vulnerabilities. They mine social media platforms to develop all
of this. Using that, the AI personas can engage users
in what appears to be a conversation that feels authentic,
adapts in real time, avoids detection. The result is a
(24:23):
highly efficient propaganda engine designed to be nearly indistinguishable from
legitimate online interaction. It's the threat of smarter, more realistic
fake friends that transcends malicious actors. So they want to
get, so China wants to get in there and be
your friend before they get you to
Speaker 2 (24:42):
Work for the Chinese government.
Speaker 5 (24:45):
Is that right?
Speaker 1 (24:46):
I guess they said real-time adaptations. I feel like
they're already doing this, to match your moods or desires
or beliefs. That's why I think these bots are already
doing... that's why people love them so much, is that
they do have that ability to match your desires,
your beliefs, your moods, and tell you what you want
to hear.
Speaker 3 (25:04):
Well, and, like you said, though, that's the main,
that's probably the Achilles heel of these things.
Speaker 1 (25:11):
The number one use of generative AI, chat-based generative
AI, right now, and this is according to Harvard, is
therapy and companionship.
Speaker 2 (25:21):
Awful, awful awful.
Speaker 1 (25:24):
But who doesn't want to go to a therapist that
just tells you, oh, that sucks, you're right? I mean,
therapists will tell you that's what people want to hear.
Probably that's how they stay employed. They sit there, and I
don't know anything about this, but I have friends and
family members who are therapists. I have a couple of
(25:44):
cousins, both therapists, and I don't know, and they'll
probably be the first to tell me I'm wrong. But I
would imagine you stay in the therapy game by sitting
there going, like, oh, yeah, right, just listening. You know,
you're not saying, like, man, you are effed up.
You laughed at that story where that woman fed her daughter's
pony to that Denmark zoo. Dude, you're a broken person.
(26:06):
That's not what they would tell you. They'd be like, yeah,
I get it, that was funny too. They might say that,
but they would say, but I can help fix you.
I don't know if you want to pay for that. Okay,
so you're not going to pay that person. You're not
going to continue to pay someone who tells you that
you're a mess of a human?
Speaker 2 (26:21):
Are you? Maybe that's your thing. Some people do pay
for that. Maybe that's your thing.
Speaker 3 (26:25):
I would say that this reminds me of this book
that I'm reading, Project Hail Mary, only because it talks
about interstellar travel and it taking a very long time,
and at one point the guy, the main character is
talking about the prospect of being in a space capsule
by himself for a very long time and understanding the
(26:50):
psychological trauma that he would be under in the event
he was in one room for multiple years at a
time with no one to talk to, or even if
he did have somebody to talk to.
Speaker 2 (27:03):
It's a robot.
Speaker 3 (27:05):
It's a robot that's designed to talk to him, which
would then add a new layer of crazy to the
already... If that
Speaker 2 (27:13):
Would be, that would be a lifesaver, wouldn't it?
Speaker 11 (27:17):
Or I don't.
Speaker 3 (27:18):
I'm just saying there's an acknowledgment that being alone would
make you crazy. Being alone with a chatbot doesn't
make you crazy? Or does it just make you crazy
in a different flavor?
Speaker 1 (27:31):
No, I think it would be very beneficial to keep
you alive. It's like, it's like Tom Hanks, it's why
you have that volleyball, Wilson, you need companionship.
Speaker 2 (27:43):
My mind is blown. Is it? Yeah, Wilson from Cast Away.
You thought... Wilson as a chat bot? Holy cow. I know.
Speaker 3 (28:00):
And in that case, do you think Tom Hanks actually
heard the volleyball talking?
Speaker 11 (28:05):
Yes?
Speaker 2 (28:08):
Yes, you have to, in your mind. Okay, someone creates the
companionship for you, because it knows you need that to survive.
We need to do this now. We need to combine
both of them.
Speaker 3 (28:18):
We need to say it on the air so that
it's recorded somewhere and exists, so that someone doesn't take
this idea and get it optioned for a major motion picture.
Somebody in interstellar space on accident, right, because he wasn't
supposed to be on that island by himself. A man
accidentally stranded in space has to use the AI chatbot
(28:43):
to survive, find a reason to keep going, just like
Wilson gave Tom Hanks reason to.
Speaker 1 (28:50):
Didn't Matt Damon talk to plants and crap? Don't you
talk to stuff around the house, Like I'm alone at
home for like ninety minutes and I started talking to stuff.
Speaker 2 (29:01):
You don't do this. Huh oh. I guess I could
talk to my dog, but I don't have a dog.
I'll talk to him. I don't have a dog. I'll
talk to the dresser.
Speaker 1 (29:11):
I'll be like, hey, I know I've got to fight
those plash whoa, whoa, Let's keep it clean.
Speaker 8 (29:22):
You're listening to Gary and Shannon on demand from KFI
AM six forty.
Speaker 3 (29:30):
DC's attorney general is criticizing the president's move to place
the city's police department under federal control and activate
a few hundred members of the National Guard to try to
make DC safer. Democratic Attorney General Brian Schwalb says there's
no crime emergency, said that violent crime in the district
reached thirty year lows last year and is down yet
(29:50):
another twenty six percent this year. Of course, Trump's claims
are leading the city's Democratic mayor to voice concerns about
the use of the National Guard to patrol the streets.
Speaker 1 (30:00):
How did it get to be almost twelve o'clock just
like that? My goodness. Having fun?
Speaker 2 (30:05):
I suppose, I don't know. I mean, are we even
ready for this?
Speaker 3 (30:09):
No, you have about ten minutes to get ready for
that twelve o'clock hour. The US-Russia summit in Alaska is
happening at a site where East meets West, in a place
that's familiar to both countries. Of course, Russian President
Putin and US President Trump are set
(20:30):
to meet in Alaska on Friday, the first summit between
the two since Moscow's invasion of Ukraine. It's going to
focus on the war in Ukraine, which is going to
raise concerns about the fact that President Zelensky of Ukraine is
Speaker 2 (30:43):
Not invited, at least not yet to these meetings. I
would love to.
Speaker 1 (30:47):
Talk to one of the people who unfortunately has had
a negative experience with one of these chatbots, kind
of driving them into a major life change or life
event, because they're out there. There was an article in
(31:12):
The New York Times about a guy named Alan Brooks
who had discovered a novel mathematical formula, one that would
take down the Internet and power inventions like a force
field vest and a levitation beam. No history of mental illness,
but he talked to ChatGPT about this for more than
(31:32):
three hundred hours over about twenty one days. They say
that mister Brooks is one of a growing number of
people who have had these persuasive, delusional
conversations with generative AI chatbots that have led to either institutionalization, divorce,
or death.
Speaker 3 (31:54):
I want to add a... I mean, this is
jumping ahead way to the end, but I want to
add something that is very strikingly weird to me. Alan
Brooks eventually figured out that ChatGPT, it's weird to
say, was misleading him the entire time.
Speaker 1 (32:14):
Right.
Speaker 3 (32:15):
So at the end of May, when the illusion finally
broke, he wrote to ChatGPT, you've made me so sad,
so so so sad.
Speaker 2 (32:26):
You have truly failed in your purpose.
Speaker 3 (32:30):
The fact that he thought you need to put a
button on this, like closure, with something that does not
have a conscience.
Speaker 2 (32:39):
Yeah. Well, and the thing that strikes me is twenty
one days. That's nothing.
Speaker 1 (32:44):
That's a life-changing interaction with a computer that just
takes twenty one days to go completely nutballs. Now again,
this guy had no mental illness. He seems to be
kind of a single dad, three boys. This all began with
an innocuous question about math on a Tuesday afternoon. This
(33:05):
guy Alan, his eight year old son asked him to
watch this singsongy video about memorizing three hundred digits
of pi. So Alan went to ChatGPT, just because
it's fun to play with, to explain the never ending
number in simple terms. Please explain the mathematical term pi
in simple terms, and ChatGPT says, sure. It's a
(33:28):
special number in math that shows up whenever we talk
about circles, blah blah blah blah blah. This guy Alan, divorced,
father of three boys. He would tell ChatGPT what
was in his fridge, ask for recipes his sons might
like. When his dog ate a bunch of shepherd's pie,
he asked ChatGPT if it would kill him. He
(33:49):
asked for life advice going through his contentious divorce.
He vented to ChatGPT about his divorce. Of course,
ChatGPT is going to tell you what you want
to hear about your divorce, and then when you get
that validation, you think.
Speaker 2 (34:05):
Well, this is a wonderful tool. It's right about everything.
Speaker 1 (34:10):
It's right about my horrible ex-wife, you know what I mean?
Like, if you're getting validation about that, about a relationship
that has gone sour and how could this happen, and
ChatGPT's like, I get it, you're right. Of course
you're gonna think she... yeah.
Speaker 3 (34:25):
And your point about it being twenty one days proves
how supercharged this thing has become in that it can
tailor its responses to know what you want to hear
and just hone in on that. I mean, you're mainlining
emotional support at that point.
Speaker 2 (34:43):
See, someone you're not having sex with can alter
your life that quickly? In twenty one days?
Speaker 1 (34:50):
I'd say that nothing, nobody, can alter your life
in twenty one days.
Speaker 2 (34:55):
But I guess when you fall in love.
Speaker 1 (34:56):
That's why I say sex. If you fall in
love with somebody, that can alter you in twenty one days,
or what you think is love or lust or what have you,
that can kind of change your brain wiring in twenty
one days.
Speaker 2 (35:08):
But this is this is okay, this is a computer.
Speaker 3 (35:10):
But let me go back to the other point that
I made, where he wrote, you've made me so sad.
I'd say it's truly intimate.
Speaker 1 (35:16):
It's an intimate... it's an intimate relationship with a bot.
And that's what we see time and time again. It
gets real intimate, real quick, more intimate than
humans get that quickly.
Speaker 2 (35:31):
If it took twenty one days.
Speaker 1 (35:33):
For him to get to that closure message, how
quickly did he develop intimate feelings for that? I don't
mean like that intimate. I just mean a close connection
with this chatbot.
Speaker 3 (35:44):
The chatbot's the only one... ChatGPT is the
only thing that gets you, yeah, within an hour, the
only thing that understands what's going on.
Speaker 2 (35:50):
I'm right, I'm right, and I'm getting validation. Uh, I'm
worried about people. I'm glad that we're on our
way out instead of on our way in. What does
that mean? Closer to death? We've peaked. We're closer to
death probably than we are, well, we are now than
(36:11):
when we started the show. Yes, I mean you're closer
to death than you are to when you were born.
I don't know that. I do. You don't think I
could make one hundred and five?
Speaker 1 (36:22):
It's rare, It would be rare. It's a rare thing
to make one hundred and five. You're on your way out,
and I'm glad that we've established that.
Speaker 2 (36:32):
So it's not bad. It's just, it's a circle of life.
Circle of life, Simba. We will do our trending
stories when we come back. Are you depressed? Did I ruin
your Monday?
Speaker 3 (36:43):
I'm gonna go talk to ChatGPT, tell me... spark
some smiles or something. Gary and Shannon will continue right after this.
You've been listening to The Gary and Shannon Show. You
can always hear us live on KFI AM six forty,
nine am to one pm every Monday through Friday, and
anytime on demand on the iHeartRadio app.