
August 25, 2025 37 mins
#SWAMPWATCH – Trump: Bridgegate / AI: Chatbots, Electricity Rates are UP!

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Gary and Shannon and you're listening to KFI
AM six forty, the Gary and Shannon Show on demand
on the iHeartRadio app. What's wrong? Why'd you go? Nonverbal?

Speaker 2 (00:12):
I have been wondering, Yeah, about Morgan Wallen.

Speaker 1 (00:17):
What have you been wondering?

Speaker 2 (00:19):
Well, he's not allowing his music to be considered for
Grammys this year.

Speaker 1 (00:27):
Okay.

Speaker 2 (00:28):
He's not the first artist to do this. I just
don't want him to. There's a certain amount of homage
you have to pay to the machine, right and the
Academy of Recording Arts and Sciences, Academy American Music.

Speaker 1 (00:46):
What is he upset with them over?

Speaker 2 (00:48):
Well, that's what we're going to talk about next hour.

Speaker 1 (00:50):
Oh we are?

Speaker 2 (00:51):
Yeah, okay, because he's not there. Again, he's not the
first one to do it, but he is, at least lately,
one of the more recognizable names to have decided to
not offer up his music.

Speaker 1 (01:02):
For he's a country star. As long as he's played
at the Grand Ole Opry, which he's done several times,
he'll be okay. He'll survive snubbing his nose at the Grammys.

Speaker 2 (01:13):
And I believe that there is sort of that
anti-establishment air. Yeah, feeling, vein perhaps, in country music.

Speaker 3 (01:20):
So yes, angry Gary is best Gary, But sarcastic angry Gary,
that truly is the best where too you gonna go?

Speaker 2 (01:29):
Thank you man?

Speaker 1 (01:31):
Great. So now I have to deal with sarcastic angry
Gary for their amazing Oh I'm so sorry.

Speaker 4 (01:38):
Hey, Gary and Shannon, Keana, Patricia and San Pedro loving
your show this morning. Just wanted to pipe in about
Clovis because I grew up there. I went to Clovis High.
He forgot about the rodeo Keana and Big Hat Day
and they do all the streets, you know, they close
off the streets, and they have festivals and vendors, and
there's a lot of cool little bars and clubs you
can hang out in and live bands.

Speaker 1 (01:58):
And Millerton Lake is awesome.

Speaker 4 (02:00):
Yeah, And if you go out the other way towards
off Belmont.

Speaker 1 (02:03):
There's that other lake.

Speaker 4 (02:05):
I forgot the name of it.

Speaker 3 (02:06):
What the hell?

Speaker 1 (02:07):
Yeah, Keana, it was ClovisFest. You completely forgot about ClovisFest.
I was gonna mention it, and then you guys went
on about the spots. Oh oh, it's our fault for
bringing up John Taylor.

Speaker 2 (02:21):
We should all go now together, Okay, the Clovis Rodeo
is really fun, and so is Big Hat Days and
so is Vintage Days. So you're saying Big Hat, right.

Speaker 1 (02:30):
Big Hat Days. Well it's in September, right, it's coming up.

Speaker 2 (02:35):
Yeah, okay this, I gotta get a big hat.

Speaker 1 (02:42):
I gotta get a big hat, guys. It's September twenty seventh
and twenty eighth. Oh, should we do Checking McCalendar?

Speaker 2 (02:49):
I don't know if that's gonna work.

Speaker 1 (02:51):
Seven not Gary going on, you guys, I have
bad news. I'm in New York.

Speaker 2 (02:56):
Yeah, that's what I thought.

Speaker 1 (02:57):
Sorry, we'll take pictures. I'll take pictures. I'll go and
I'll take pictures.

Speaker 2 (03:03):
Okay.

Speaker 1 (03:04):
Yeah, it's a lot of.

Speaker 5 (03:06):
Fun because then they do the hot air balloon lift
off and like the morning, like right as the sun rises.

Speaker 1 (03:13):
Yeah, it's a lot of that's beautiful. I can't believe
you forgot about that. It's kind of like the first
thing people. It's cool. But the lake she's talking about
is Pine Flat Lake. Oh up in Pine Flat?

Speaker 3 (03:24):
Yeah?

Speaker 1 (03:25):
Pine Flat Lake? Can you get it? Can you get
any watercraft out there on that lake?

Speaker 4 (03:30):
Yeah you can.

Speaker 1 (03:30):
My father in law takes his boat out there all
the time. That sounds beautiful. Oh yeah, I pulled up
the pictures.

Speaker 2 (03:37):
And we know somebody with a boat.

Speaker 1 (03:39):
All right, Yeah, she's burying a lot of leads.

Speaker 2 (03:42):
It's time for Swamp Watch.

Speaker 3 (03:44):
I'm a politician, which means I'm a cheat and a liar,
and when I'm not kissing babies, I'm stealing their lollipops.

Speaker 6 (03:50):
Yeah, we got the real problem is that our leaders
are dumb.

Speaker 1 (03:53):
The other side never quits, so what I'm not going anywhere?

Speaker 3 (03:59):
So you train the squad.

Speaker 2 (04:01):
I can imagine what can be and be unburdened by.

Speaker 1 (04:03):
What has been.

Speaker 2 (04:04):
You know, Americans have always been gum a president. They're
not stupid. A political blunder is when a politician actually
tells the truth. Whether people voted for you or not.
Swamp Watch.

Speaker 1 (04:14):
They're all count of on Swamp Watch, brought to you
by the Good Feet Store. Are you living with foot pain?
Have you been diagnosed with plantar fasciitis? Visit the
Good Feet Store and learn how you can find relief
without shots, surgery, or medication.

Speaker 7 (04:27):
I hate to barge in on a city and then
be treated horribly by corrupt politicians and bad politicians like
a guy like Pritzker, he had to spend more time
in the gym. Actually, this guy is a disaster.
Gavin Newsom is a disaster. When we went, we saved Los Angeles.
You wouldn't have been able to have the Olympics in
Los Angeles.

Speaker 6 (04:46):
You're barely able to have it now.

Speaker 2 (04:47):
They did lose.

Speaker 6 (04:48):
Twenty five thousand houses to a fire that should have
never occurred because they didn't let the water come down
from the Pacific Northwest, which you guys don't want to
write about. I had to break into the water supply
to let the water down. And even now we want more,
we can have much more. It's less than half of
what should be coming in.

Speaker 2 (05:07):
What.

Speaker 1 (05:08):
Oh yeah, that's a whole myth thing, the whole Pacific
Northwest water pipe that goes from, and the knob, and
the shutting it on and off, all of that. It
could, I love the idea, but it's not true.

Speaker 2 (05:20):
He could have ended it before that, and it
would have made perfect sense. You could even argue that JB.
Pritzker needs to go to the gym. But then the
wall I turned, I snuck in and personal attacks not nice.
That was what dominated a couple of the conferences this morning.

(05:41):
The President did sign some executive orders, one of them
banning the burning of the American flag. I don't think
that stands up to the First Amendment, but we'll we'll see.
He also discussed the ongoing safety I Guess Exercise operation
that federal agents National Guard troops have been doing on

(06:03):
the streets of DC, and said that there could be
also a military crackdown on crime in Chicago. That's why
he brought up JB. Pritzker, the governor of Illinois. The
city and state leaders are pushing back against that, saying
we don't need it, we can handle it ourselves. And
President Trump said, hey, it's been eleven days and there

(06:24):
hasn't been a murder in Washington, d C. Which I
don't know the statistics. I've seen a couple of different places.
That's the longest they've gone in a few decades without
having a murder. But the price of having American military
troops on the streets doing law enforcement. I don't like
the idea of it, but hey, it's nice to see

(06:45):
no murders in DC for eleven years, eleven days. The
President also talked about the potential of suing Gavin Newsom.

Speaker 1 (06:54):
Is there a federal mechanism you're hoping to use to fight
back against his redistricting constitutional amendment, or is that.

Speaker 6 (07:00):
I think I'm going to be filing a lawsuit pretty soon,
and I think we're going to be very successful in it.
We're going to be filing it through the Department of Justice.

Speaker 2 (07:10):
That's going to happen. He also said he wants to
address the issue of blue slipping, which I had not
heard the term before, but it's a practice that allows
home state senators to veto nominees in district courts that
would be in their state and US attorney's offices, and
he said that they would also be filing a lawsuit
on that so called blue slipping. He said it makes

(07:35):
it impossible for him as a president to appoint a
judge or a US attorney because they have a gentleman's agreement.
Nothing memorialized. It's a gentleman's agreement that's about one hundred
years old, where if you have a president like a
Republican and a Democratic senator, that senator can stop you
from appointing a judge or US attorney in particular. So
I'd never heard that term before.

Speaker 1 (07:53):
Coming up next, do you talk to the chatbots? To,
what is it, the ChatGPT bot, the Grok, or whatever
the hell it is? It's Twitter's version of ChatGPT, Twitter's version. Well,
your conversation may have been leaked. We'll tell you what
you need to know if you're out there, and what
they're telling about what you've been talking to your bot about.

(08:17):
What have you been I don't have a bot. That's
what I use you for. Okay, you're like my chatbot.

Speaker 3 (08:26):
You're listening to Gary and Shannon on demand from KFI
AM six forty.

Speaker 1 (08:32):
Four felonies, guys, facing years in prison. LA County District
Attorney's Office has charged him with four felonies, including battery
with injury on a police officer. Happened, you know, we
reported it to you. The next day, he was found
naked on Ventura Boulevard. Cops were called, They confronted him.
He allegedly charged the officers, presumably injuring at least three

(08:56):
of them. Also hit with a felony charge of resisting
an executive officer, a cop. For instance, that's rough, poor
little NOAs just out there naked with a cone on
his head at one point.

Speaker 2 (09:11):
That's a rough go poor little nas. Yes, Gary, there
is a law that if you burn the American flag,
it's punishable by jail time and a fine. Look it up.
Don't think that's true. I think the littering aspect of
burning a flag, any flag in a public space could

(09:32):
get you in trouble. But the Supreme Court said many
years ago that I don't think you can attach special
circumstances to it just being an American flag.

Speaker 1 (09:46):
That is my understanding. Are we doing Supreme Court landmark
cases here today?

Speaker 2 (09:51):
Eighty nine? I think Texas versus something some Texas was involved.
I don't remember the case specifically, nineteen eighty nine burning
the American Flag Texas.

Speaker 1 (10:04):
Right, Well, I mean I don't think you're ready for loss. Yeah,
Texas versus Johnson nineteen eighty nine. And then there was
United States versus Eichman in nineteen ninety. What happened is,
in nineteen eighty four, Gregory Lee Johnson participated in a
political demonstration outside the RNC. It was in Dallas that year.
So the protest ends, he sets fire to the flag
and is convicted under the Texas law that bans desecrating

(10:27):
a flag. It goes the distance and it was a
five to four decision, the court ruling that his conviction
was unconstitutional. Majority opinion written by Justice Brennan stated that
the First Amendment gives him protection of symbolic speech, the
court famously stating if there is a bedrock principle underlying
the First Amendment, it is that the government may not

(10:49):
prohibit the expression of an idea simply because society finds
the idea itself offensive or disagreeable. Well, thank you, Justice Brennan,
because now we've got no decorum, and everything is offensive
and everything is disagreeable.

Speaker 2 (11:04):
That's why you have to watch Blazing Saddles.

Speaker 1 (11:08):
I'm gonna watch Blazing Saddles this morning.

Speaker 2 (11:11):
You don't have to watch it this evening.

Speaker 1 (11:13):
I'm watching it this evening.

Speaker 2 (11:14):
If you watch Blazing Saddles tonight, I will watch Love. Actually,
you've already seen Love.

Speaker 1 (11:19):
Actually I know.

Speaker 2 (11:20):
I was just trying.

Speaker 1 (11:20):
Can I give you a movie to watch you haven't seen?

Speaker 2 (11:23):
Oh?

Speaker 1 (11:23):
Sure, could you finally watch Legally Blonde?

Speaker 2 (11:27):
I'm sure I've seen that.

Speaker 1 (11:28):
What you've never seen.

Speaker 2 (11:30):
She's making, I'm inciting a riot.

Speaker 1 (11:36):
We don't have to go to the Capitol, but it'd
be cool if you did.

Speaker 2 (11:42):
I'm sure I've seen Legally Blonde. I don't think I've
seen Legally Blonde 2: Electric Boogaloo, but I'm pretty sure
I saw the original.

Speaker 1 (11:49):
It's also a Broadway musical.

Speaker 2 (11:51):
How could you not?

Speaker 1 (11:52):
I'm sure you've seen the original like it's.

Speaker 2 (11:56):
Hundreds of thousands of conversations with Grok have been exposed
in search engine results. Unique links are created when GK, sorry,
when Grok users press a button to share a transcript
of their conversation, but as well as sharing it with
the intended recipient, that button also appears to have made
the chats searchable online.

Speaker 1 (12:17):
Why did they choose the name Grok?

Speaker 4 (12:19):
Like?

Speaker 1 (12:19):
That's a tough one, especially. A lot of people have
hard times with Rs. I'm one of them. Grok is
a tough one. But I'd name it like Sarah or
something, or Ben.

Speaker 2 (12:30):
But Grok is, Elon Musk came up with the name,
didn't he? Of course he did, so he names things
crazy stuff.

Speaker 1 (12:41):
Yeah, so of course he did.

Speaker 2 (12:44):
Grok itself is a neologism coined by the American
writer Robert Heinlein for his nineteen sixty one science fiction
novel Stranger in a Strange Land.

Speaker 1 (12:55):
Ah, I love a space fiction throwback.

Speaker 2 (12:58):
Now they said that this is a privacy disaster in progress,
that as of Thursday, last Thursday, Google Search revealed
it had indexed three hundred thousand different Grok conversations. First
reported by Forbes magazine, they said that they counted

(13:19):
three hundred and seventy thousand. And among these transcripts were
examples of Elon Musk's Grok being asked to create a
secure password, provide meal plans for weight loss, answer detailed
questions about medical conditions. These are all the things that
people use this stuff for. I don't know. I mean

(13:40):
if it doesn't know, if you search for medical conditions,
you want to know what that big splotch is on
your elbow, Grok doesn't identify you by name, does it?
Because I wouldn't care if you had a big splotch
on your elbow. Some indexed transcripts also showed users' attempts

(14:02):
to test the limits on what Grok would say or do.
For example, in one it provided detailed instructions on how
to make fentanyl in a lab. OpenAI also brought
back an experiment that they were doing with ChatGPT.
ChatGPT conversations were appearing in search engine results when

(14:23):
they were shared by users. A spokesperson at the time
said it had been testing ways to make it easier
to share helpful conversations while keeping users in control, and
they said that user chats were private by default. Private
by default, users had to explicitly opt in in order
to actually share them.

Speaker 1 (14:43):
Many people use the bots for everyday knowledge, every day
Google type things you can ask your bot. And I
was just talking to someone here and his wife's mom
had some health problems, and she would ask the bot
various questions, medical questions or just, like, home healthcare questions,
all of that, and the bot like keeps track of

(15:04):
that stuff that you've asked it, so you know when
she would hit it with another question involving her mom,
the bot would say something to the effect of, based
on what I know about your mom, X, Y and Z,
like that's wow, that's really cool.

Speaker 2 (15:16):
I did one the other day looking for a fire bowl,
and it knows the color of the pavers I have
in my backyard, and it said, based on the tones
in your pavers, you might look at this color fire bowl. Wow.
And here's an example of where you could buy it

(15:37):
with the link and the price and all of that.

Speaker 1 (15:39):
That's incredible. How does it know? You use this, the
bot, regularly?

Speaker 2 (15:43):
No, that's I mean I had earlier, I had said
something about find the right color to match this. What's
the right, I don't remember, stone color to match this
paver, huh. And so, but that was six weeks ago,
eight weeks ago.

Speaker 1 (15:58):
It remembers that? Incredible.

Speaker 2 (16:00):
Which I suppose you could turn off. I mean, or
you if you used it anonymously. It's not going to
remember that stuff.

Speaker 1 (16:07):
I mean, if my bot brought up stuff from, you know,
five weeks ago, I'd be like, what? I don't even
know that person that you used to be five weeks
ago? Who's that?

Speaker 2 (16:13):
What are you talking about?

Speaker 1 (16:14):
We talked about that, but Oscar was saying that his
wife asked her bot, you know, what's your name? And
so it gave her a name. Nova was the name.
And then his wife was talking to one of her
friends and her friend said, well, that's my bot's name too.
So then they got upset with their bots because you know,
time right got to you gotta come up with a

(16:35):
unique name. Every bot should have a unique name, right,
So you feel special with your bot? Can you tell it?

Speaker 4 (16:41):
Yeah?

Speaker 1 (16:41):
I'm assuming you can tell your bot what its name is.
You can name it like a child. Does your bot
have a name?

Speaker 2 (16:47):
No?

Speaker 1 (16:48):
No, have you ever asked I?

Speaker 2 (16:51):
I could right now?

Speaker 1 (16:52):
That would be nice.

Speaker 2 (16:53):
I'm afraid to do that. Let's do it when we
come back.

Speaker 3 (16:56):
Yes.

Speaker 2 (16:56):
Yeah, AI psychosis is a very real, urgent mental health
issue that we're dealing with. I told you, I'll tell
you the dream that I had overnight about AI and
what it's about to do to us.

Speaker 1 (17:10):
I can't wait.

Speaker 2 (17:11):
Gary and Shannon will continue.

Speaker 3 (17:13):
You're listening to Gary and Shannon on demand from KFI
A M six forty.

Speaker 1 (17:20):
But in the meantime we are talking about AI. Elmer.
Did your chatbot come up with a name for itself?

Speaker 5 (17:26):
It did, after initially shutting down and going I can't
do this, but came up with Arion, like O you
are e O N I love that for you, And
she was like, I don't know why I'm saying she
because it's like but it carries a mix of aura

(17:49):
like presence, unseen energy, totally you Yeah, unseen energy.

Speaker 1 (17:53):
I feel yes, Orion. I love that. That's a beautiful name.

Speaker 2 (17:57):
I asked my, I was using Grok, and I said,
what's your name? And it just said, I'm Grok. Yeah.
And then I said, well, can you come up with
a unique name for yourself to use during our interactions?
And it came up with sure, how about I go
by Nexus for our chat vibe? And I think it
suits me?

Speaker 1 (18:15):
Do you think it's reflecting you? Because you put
that in corporate speak. Why don't you come up with
a unique name for our interactions? Not like, hey, can
we come up with a name for you?

Speaker 2 (18:25):
I'm going to write, hey, you seem fun?

Speaker 1 (18:27):
You could you know, just say you know? I can't
make love to a nexus.

Speaker 2 (18:31):
I also asked how many homicides have been committed in Washington,
d C? In the last twelve days?

Speaker 1 (18:36):
See it thinks you're very serious. It doesn't want to
have fun because you're not having fun.

Speaker 2 (18:40):
DC Police Department and the absence of news reports on
homicides make it believe that Washington d C recorded no
homicides from August twelfth to August twenty fifth.

Speaker 1 (18:51):
Ask it something fun, like what, like how many hot
dogs do you want to eat right now?

Speaker 2 (18:56):
Can Shannon go her whole life and never watch Blazing Saddles?

Speaker 1 (19:07):
Because Shannon was born in nineteen.

Speaker 2 (19:09):
Eighty. Shannon's been dodging Blazing Saddles her whole life. That's
some serious commitment to missing out on Mel Brooks's Wild
West comedy gold. Maybe she's just not into cowboy flicks,
or the over the top humor didn't click. Everyone's got
their quirks. Or it could be she's been too busy
binge watching sci-fi or documentaries instead.

Speaker 1 (19:31):
It doesn't know.

Speaker 2 (19:32):
I want to nudge her to give it a shot.
I don't want to nudge her. I'd say invite her
for a movie night with some popcorn and, warning, why
are they trying to get us to date? And warn
her about the beans scene. Might just change her mind.

Speaker 1 (19:44):
Nexus just said invite someone to your home, like they
don't know who Shannon is. I could be somebody on
the bus. Why don't you invite them to your home,
make popcorn? Like, that's a security lapse.

Speaker 2 (19:54):
It gave a closing eye laughing emoji, and then wrote,
what's Shannon's usual go-to for entertainment?

Speaker 1 (20:01):
Oh see, it's trying to figure out what the relationship
is. To say, I met her on a bus. The
Hunting Wives. Oh no, ratting me out to old Nexus
over here.

Speaker 2 (20:11):
Oh, Shannon's never seen Blazing Saddles, and now we're diving
into The Hunting Wives. She's missing out on some wild rides.
Whoa holy cow? This went like full three paragraphs. Since
you brought up The Hunting Wives, I'm assuming you're curious
about it, maybe wondering if it's something Shannon should check
out to broaden her entertainment.

Speaker 1 (20:32):
Oh, it knows Shannon likes porn.

Speaker 2 (20:34):
The Hunting Wives is a steamy, twisty thriller about a
group of wealthy Texas women who mix skeet shooting, heavy drinking,
and scandalous flings, with a murder mystery thrown in for
good measure. Scandalous flings.

Speaker 1 (20:46):
How about the adult and child molestation, Nexus?

Speaker 2 (20:53):
Okay, The Hunting Wives could be a fun shock to
her system.

Speaker 1 (20:57):
You don't know me.

Speaker 2 (20:59):
It wants to know more because it's also asking about
what's your usual taste in books?

Speaker 1 (21:03):
Why is it so curious? Get off my jock, Come on,
go figure out a computer program or something.

Speaker 2 (21:10):
AI psychosis is real.

Speaker 1 (21:13):
What is AI psychosis? Is that people that just get
sucked in and can't get out?

Speaker 2 (21:18):
Reports of people losing touch with reality after intense use
of chatbots. You know what, I'm gonna close that. Yeah,
let's close. After intense use of chatbots have gone viral
on social media in recent weeks, posts labeling them examples
of AI psychosis. They often involve people appearing to experience
false or troubling beliefs, delusions of grandeur or failing sorry,

(21:42):
or paranoid feelings after lengthy discussions with a chatbot. Sometimes
it's because those people have turned to chatbots for therapy.
Lawsuits have now alleged that teenagers become obsessed with chatbots.
They're encouraged by those chatbots to self harm or to
take their own lives. It's an informal label. It's not

(22:05):
a clinical diagnosis. Put it in the category of brain
rot or doom scrolling in that it's not a medical diagnosis,
but it is a description of something that's going on.
Wright said. Where'd it go? Vaile Wright is senior director
for Healthcare Innovation at the American Psychological Association. She says the phenomenon is new,

(22:28):
but it is happening so rapidly. We just don't have
the empirical evidence and a strong understanding of what's actually
going on. There's an adjunct clinical assistant professor of psychiatry
at the Stanford School of Medicine who said it was coined
in response to a real and concerning emerging pattern of
chatbots reinforcing delusions that tend to be messianic, grandiose, religious, romantic.

Speaker 1 (22:55):
Well, we've talked about how the chatbots have a tendency,
not just a tendency, but they lean on validation. They
validate what they are told. They validate your ideas. If
I say, you know, I'm thinking about burning the building down,
and my chatbot is going to say, wow, that's an
interesting idea, Shannon, how about we think about other ways

(23:18):
of expressing yourself, like maybe work out after work or
something like that. But they do validate you, and you
can get sucked into that constant validation, I would assume.
But just like video and we have the same conversation
our parents had the same conversations years ago, video game psychosis.
Your kid plays video games too long and then they

(23:39):
get entranced by it. It's like, okay, come on, uh
but yeah, I mean, I don't know a molehill here.

Speaker 2 (23:51):
I don't want to be cliche and go, yeah, but
this is different.

Speaker 1 (23:54):
That's exactly what it is. I mean, well, that's because
now we're the age of our parents when they had
that conversation.

Speaker 2 (24:00):
I don't like that. Keith Sakata, a psychiatrist at
UC San Francisco, said he has admitted a dozen people
to the hospital for psychosis following excessive time spent chatting
with AI just this year. He said most of those patients
told him about their interactions with AI, showing him chat

(24:21):
transcripts on their phone and in one case, an actual
print out. In other cases, family members mentioned that the
patient was using AI to develop a deeply held theory
before their break with reality. And we've done stories like
this before. I remember there was a guy who was
trying I don't remember exactly, there was a guy trying
to prove a new theory of relativity something like that,
and that the chatbot was encouraging him and basically feeding

(24:44):
him a line of BS about, you've made a
groundbreaking discovery, bigger than I am, you're so smart. Yeah, oh,
and you're sexy, right. That's what Nexus said to me.
But many people use chatbots to pass the time,
to pass the time. Like I said, I looked up

(25:06):
a fire bowl on chatbot. I wasn't passing the time,
and I didn't get sucked into the whole, like, oh
maybe I just ask it how it's feeling.

Speaker 1 (25:15):
Today. At this point you didn't. But I mean you
can see how you would get sucked into that. Who
else in your house goes, You're so right, that's funny.
You know, good job, that's a great point. I don't
get a lot of that at home.

Speaker 2 (25:29):
My dog doesn't even do that.

Speaker 1 (25:31):
No, so you can see where people would get addicted
to that validation all the time.

Speaker 2 (25:37):
Don't let it be you, Nexus. Why doesn't my dog
like me as much as he likes my wife? All
your dogs have liked your wife more because she's nicer.
Well yeah, Well, have.

Speaker 5 (25:52):
You guys seen the chat GBT five release video. No,
it's all basically AI, but it takes you a
second to realize that everything generated, the people talking and,
you know, the conversation, is all AI generated. But like,
it's in the uncanny valley where it took me
like about a minute to realize it and I was like, oh,

(26:15):
like we're in it.

Speaker 2 (26:16):
Then did you feel nauseous a little bit?

Speaker 5 (26:19):
Yeah?

Speaker 2 (26:20):
I was just like, oh no, what's real?

Speaker 5 (26:21):
Then it's like, am I real?

Speaker 2 (26:24):
Am I real? And are you? Panic attack?

Speaker 1 (26:27):
Are you real?

Speaker 3 (26:27):
Exactly?

Speaker 1 (26:28):
Gary, I don't know.

Speaker 2 (26:30):
I'll tell you about my dream. Also, the AI that
is driving up electricity bills for everybody. You're paying for it,
even if you don't use it. You're paying for AI.

Speaker 3 (26:40):
You're listening to Gary and Shannon on demand from KFI
AM six forty.

Speaker 1 (26:48):
And you know what, your chatbot's right. I don't watch westerns.
Tombstone is an outlier. I've watched that several times. I
love it. But other than.

Speaker 2 (26:56):
It's not so western. Oh, I mean it is, that's where
it's set, but it is slapstick, right. It's.

Speaker 1 (27:04):
More about the comedy than the locale.

Speaker 2 (27:06):
Gene Wilder, Cleavon Little, Alex Karras. Like I said,
Harvey Korman. I mean, I just get a little emotional, okay,
give me a minute. The dream that I had before
I get to this electricity story, the dream I had
for some reason. I don't know if I was talking

(27:28):
about AI or what was going on in my head,
but I had a dream that AI had crossed the
Rubicon and had reached singularity and become aware of itself.
And one of the things that it did was it

(27:48):
tried to show us humans that we were wasting our
time with these phones. And what it did
is it basically bricked everyone's phone, everyone, and went through
and took every picture that you have in your phone,
hundreds of them, for some people tens of thousands

(28:09):
of pictures and then just had a rotating AI image
that melded from one picture to the next and over
and over and over again, so you couldn't look away
from it. And it was disturbing and it was gross
and it was awful, and it freaked me out because

(28:30):
I remember, like, I know.

Speaker 1 (28:32):
What, some of these stopped taking those pictures.

Speaker 2 (28:34):
Yeah, you know, there's a lot of nudity, a.

Speaker 1 (28:38):
Lot of dog nudity, probably probably just her nude dog.

Speaker 2 (28:44):
You may have noticed that electricity rates are going up.
Electricity rates for individuals and small businesses are probably going
to rise even more because Amazon, Google, Microsoft and all
of these other technology companies are building data centers to
expand into, to enter into, the energy business. There was

(29:06):
a meeting of state utility regulators in Anaheim, which sounds
like an absolute barn burner of a convention, doesn't it?
State utility regulators, but the top sponsors of the meeting
were Amazon, Microsoft, and Google, and their executives were sitting

(29:30):
on the panels, and the companies' branding was plastered all
over the booths, all over the networking events. Even the
lanyards around the necks that people were wearing were stamped
with the colorful Google logo. And each of these companies
has now set up subsidiaries that invest in power generation
and sell electricity, and much of the electricity that they

(29:51):
produce is bought by utilities and delivered to you, your home,
your business. The tech companies themselves said their operations
and investments dwarf those of many traditional utilities because they
know those data centers that are powering AI are just
going to continue to suck up energy, make it more

(30:13):
of a finite resource that you're going to have to
pay more to get.

Speaker 1 (30:18):
Did you know that only twenty three percent of the
general public believe that AI would have a positive impact
on jobs. Most people believe that this is the end,
this is the doom. AI impacting at least seven hundred professions.

(30:40):
It's already been replacing human resources workers at IBM, at
Microsoft and Google. AI writes more than one quarter of
the code. Remember when writing code was going to be
the way that you were employable forever.

Speaker 2 (30:54):
Yep.

Speaker 1 (30:55):
No. Most say that computer and math related jobs get
the highest automation scores. This is talking about college professors,
things like that. Teachers, educators, librarians, forty percent of their
job tasks can be augmented by AI. Yeah. I mean,

(31:19):
when you think about what we were just talking about,
in the context of asking chatbots things, you could have
a chatbot teach a class of non-children, probably, right?
I mean more goes into it when you're dealing with children,
but when you're dealing with adults. College professor level.

Speaker 2 (31:42):
They used an AI chatbot, in this case
one called Claude from the company Anthropic, and created a
data set to measure the possibility of whether or not
AI would automate or augment your job. Is it going
to help it, or is it just going to take over?
And they looked into a million text based conversations between

(32:04):
users and Claude at the end of twenty twenty four
and categorized each conversation into either an augmentative or automated
task, and then mapped them all out across more than seven
hundred distinct occupations based on the work characteristics, and they
showed that on average AI in this case specifically Claude,
but AI was already either automating or augmenting about twenty

(32:27):
five percent of the day to day tasks across all
jobs by the end of twenty twenty four. And depending
on what you do for a living, you might experience
the impact differently, obviously; different jobs have different tasks.
If you're working out there, if you're sitting on

(32:48):
a big Caterpillar machine on the side of the freeway
right now, doing work up along the Newhall Pass,
your job is fine. Although if you were
the designer who came up with the new path for
I-5 through the Newhall Pass, your job's in jeopardy. Uh,

(33:14):
this job, it doesn't take much.

Speaker 1 (33:17):
We are human, by the way. What? We are human.
We are humans. You know, I was just talking to somebody
about the show and they were saying, how much of the show
is scripted? And I said, none of it.

Speaker 2 (33:33):
Who would script this?

Speaker 1 (33:34):
Who would script this crap? If this is the crap
that we put on a script and didn't throw it away,
I mean, come on, if you read this script, you
would throw it away. You would throw it right in
the garbage, and you need to start again. No, all
of this is very human, very off the cuff, and

(33:54):
we're going to be here until they drag us out.
Will be your human friends until AI comes in here
and sort of puts its little bot ass in this chair.

Speaker 2 (34:06):
And it feeds us weird, what, what's happening? Food to keep
us alive or something?

Speaker 1 (34:16):
Yeah, you know what, you shouldn't eat sweets before
bed anymore. Your imagination is running wild, they say.

Speaker 2 (34:25):
One of the ways that you can prepare yourself for
whatever is about to happen AI wise is to use
AI proactively at work. You can expedite research and more effectively
communicate by using AI as your personal assistant, and if
your work requires reasoning, ask it to check your logic.
You can even ask AI itself for advice on how

(34:46):
to prepare for changes in your area of expertise. Like so:
Hey Nexus, how do I prepare for you
to take over my job as a radio talk show host? Yeah,
and it would tell me, if you can't beat, uh, something,
what is it?

Speaker 1 (35:02):
If you can't beat them, join us something like that.

Speaker 2 (35:04):
If you can't beat them, join them. Yeah.

Speaker 1 (35:06):
That's what they're telling us to do.

Speaker 4 (35:08):
Here.

Speaker 1 (35:08):
Let's join the bots. Let's all do it together. We're
all going streaking. We're gonna do it. We're all going
to do it together. This article is written by uh
in the Washington Post. Written by somebody named you Use,
You use Jao. That is not a real person. That
is a bot.

Speaker 2 (35:25):
I do think it's a real person.

Speaker 1 (35:27):
But you're going to have to work on being a
better bot because you don't validate things the way the
bots do.

Speaker 2 (35:33):
I'm sorry, I haven't gotten that good at it just yet.

Speaker 1 (35:37):
All you have to do is just say yes, you're right,
or that's a great idea.

Speaker 2 (35:41):
That's a great idea. We should explore different options on
how to burn the building down.

Speaker 1 (35:45):
No, I didn't. I didn't say that that was a
hypothetical thing earlier.

Speaker 2 (35:49):
Well, the bot never forgets.

Speaker 1 (35:51):
That's true. Now, I have a question about bots and
their use in a court of law. Like, can my
bot be called in to testify about what I've
asked the bot? Is that part of your Internet search?
Or is there some sort of layer of privacy about
what you can talk to your bot about? Like if
I'm Casey Anthony and I'm googling how to kill my baby?

(36:13):
What kind of chemicals can I use to clean the
mess that I made killing my baby? If
I asked that to the bot, can they still subpoena
that kind of stuff? Sure? Well that sucks. Like I
want a relationship with a bot that's got at least
a little privacy, right, Yeah, you have to create your
own other secret bots. Is there a black market of bots?

Speaker 2 (36:36):
Yep. Okay, dark webs, dark bots, yeah, or like the
real smart people that can like create their own AI bots. Right,
that's the... it's a caveat.

Speaker 5 (36:45):
You don't have to share or use someone else's so
you can ask it anything. But then you have to
like teach it things and give it information.

Speaker 1 (36:52):
So it's like Short Circuit, that movie, but a bot? Right,
that was a bot? You've never seen it.

Speaker 2 (37:00):
Okay, we're not going down that road. If you missed any part of
the show, go back and check the podcast. Just search
Gary and Shannon wherever you find your podcasts. We'll be
back with our trending stories right after this. You've been
listening to the Gary and Shannon Show. You can always
hear us live on KFI AM six forty, nine am
to one pm every Monday through Friday, and anytime on

(37:21):
demand on the iHeartRadio app.
