
December 4, 2025 • 31 mins

Chris Merrill kicks off the show with KFI’s Mark Rahner, joking about AI coming for everyone’s job—while warning that the younger generation may be the ones who feel it most. Chris also celebrates the massive success of KFI’s 15th Annual PastaThon and revisits the emotional, long-awaited return of KFI legend The Foosh at last night’s broadcast.
Then the conversation turns to the now-viral Waymo clip: an autonomous car casually turning left through an active police standoff. Chris questions what “accountability” even means when there’s no human behind the wheel, and breaks down the difference between programming, thinking, logic… and the absence of common sense. Plus: a quick roast of AI’s obsession with em dashes.
Chris then shifts to a disturbing local story—an LA family suing the school district after an alleged “kissing club” incident involving their 8-year-old daughter—and dives into the broader culture of silence that enables abuse. The hour closes with a tribute to K-9 officer Spike, who was tragically killed in the line of duty.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty on demand.

Speaker 2 (00:07):
More stimulating talk. We'll talk about AI and how it's
going to replace us all. Yeah, we're all doomed. It's
not so bad for those of us that are a
little bit older. But oh boy, if you are young,
you are hosed right now. You are just absolutely, you
are just in there, you are in the meat. I
don't know what that means. I couldn't come
up with it. I bet an

(00:28):
AI could have come up with it, would have done something.
I was reading a great article tonight too, I think
it was The New York Times, talking about AI writing.
And you know, I like to use AI to cheat
and sort of summarize my notes and things like that. Eh,
and boy, they were so right. AI loves the em dash,
it loves the "it's not this, it's that." It's just
so lazy.

Speaker 3 (00:48):
And also, it was turning your brain to mush. Let
me just say this. I think by now you know
me well enough to know that I'm not a big,
like, company butt-smoocher type of kiss-up kind of guy.
I absolutely love the "guaranteed human" thing that iHeart
is doing. I think it's smart and I think it's
good and nothing has made me prouder than that. Wow.

Speaker 2 (01:08):
Yeah, okay, right, we just raised a million dollars for
a bunch of kids so they don't go hungry this year.

Speaker 3 (01:13):
But no, that's cool, Mark, we're talking about the end
of civilization.

Speaker 2 (01:16):
I'm glad this is the only thing that makes you
happy. That's wonderful.

Speaker 3 (01:19):
I'm trying to save humanity.

Speaker 2 (01:21):
Mark, glad to have a job. Sorry your kid is
starving to death in the street. It's not an either-or.
AI would tell you that. AI would say, it's
not just starving kids, it's having a job.

Speaker 3 (01:32):
That's terrible.

Speaker 2 (01:33):
By the way, Pastathon: the official total as of last night
was nine hundred fifty five thousand. We got off the
air and somebody had sent a talkback and I
didn't see it until after we got done.

Speaker 3 (01:41):
They said, can you give us a total?

Speaker 2 (01:42):
The total that we had from last night didn't change
since the end of the broadcast at the Anaheim White
House. It was nine hundred and fifty five thousand dollars.
So we're still waiting, and there's still a
chance for you to donate. There's, like, a million dollars.
Excuse me, it's not a million dollars. We'll go
over a million dollars when we get the totals from

(02:03):
our partners. So we've got, like, Wendy's and Smart
& Final and those places where you can do your round-up
and you can do

Speaker 3 (02:09):
your coupon book and things like that.

Speaker 2 (02:10):
So you know, we're gathering those contributions as well, and
you can still donate: KFI AM six forty dot com
slash Pastathon. There we go. KFI AM
six forty dot com slash Pastathon.

Speaker 3 (02:26):
There you go. All right.

Speaker 2 (02:28):
One of the things we talked about last night,
and I was so glad to see it, it was
a quick blurb, but KTLA was talking about the Foosh
showing up there last night and then meeting the guys
that pulled him out of that horrendous wreck. Of course,
the Foosh has been with KFI for... I think Foosh
is one of the first guys I ever worked with.
He's the tech director, so I mean that's been at

(02:48):
least ten years, so I think he's been there.
Foosh has been around, Mark, do you know? Twelve, fifteen years,
something like that.

Speaker 3 (02:53):
I don't know. Longer than me. I'm coming up on
five years, and he predates me.

Speaker 2 (02:57):
I think, yeah, yeah, I know he does. Yeah. And
he's just always been such a solid dude. So
Foosh, if you missed the story, of course, Foosh was
in this horrible car accident where the car was just...
there's nothing left. I mean, it caught fire while he
was inside. His arm was crushed. They thought his arm
was going to be amputated. His ID burned up in

(03:19):
the car. He was a John Doe when he went
to the hospital. He just didn't show up at work.
And because he's been here so long, that's not the
kind of thing that he does. He doesn't just not
show up, you know. And then we couldn't get a
hold of him. His cell phone, you know, burned
up in the wreck and that kind of stuff. So,
just horrible. But he was there last night at
the Pastathon broadcast and he was showing how

(03:40):
he's getting the dexterity back and rebuilding the muscle in his arm,
and he was telling the story about how they thought
it was going to be amputated, but it
was not, which is good because he's left-handed and
it was his left arm.

Speaker 3 (03:52):
Yeah. I had been texting him some time earlier and asking
him if he was getting some cool bionic stuff. That
would have been awesome, but apparently they managed
without. And it was kind of an emotional thing last night.
It was really hard work.

Speaker 2 (04:03):
It was. So KTLA caught up on that. They
had this little blurb on Channel five.

Speaker 4 (04:08):
It's been four months.

Speaker 5 (04:09):
A beloved member of the KFI radio family was severely
injured in a horrific car accident, and tonight he's making
a comeback. Thirty-seven-year-old Stephan Cabezos, affectionately
nicknamed the Foosh, got a chance...

Speaker 3 (04:23):
Oh that was it?

Speaker 2 (04:25):
Okay, that's all that was posted. Okay, all right, well,
that's all I got. I don't think we use the article.
We just call him Foosh. Yeah, we don't use, uh,
"Stef Foosh" or "the Foosh." Yeah, that's it exactly. Did you see
his haircut?

Speaker 3 (04:41):
I didn't. I was in the news booth here at
KFI.

Speaker 2 (04:44):
So I was watching the live broadcast because, you know, I
was in my studio as well. And Foosh has always
had long hair, and it is short now, and he looks
really good.

Speaker 3 (04:54):
He's a handsome fella. I liked Foosh looking like the
Dude from The Big Lebowski. But he changed that.

Speaker 2 (05:00):
Yeah, he changed it. But that's exactly what
it is. Boy, you got it right too. That's exactly
what it is. He just looks like "the Dude abides,"
man. Predisposes you to be friendly with him, kind of.

Speaker 3 (05:13):
Yeah, but the new look is good. Okay,
I like it a lot. So he's coming back next week.
We'll have a chance to get used to it.

Speaker 2 (05:20):
Yeah, I'm excited he'll be back next week. We'll make
a big deal of it. Probably cry. Men showing emotion.
That's what you come to the show for. You come
to the show because you go, you know what, I
want dudes with big radio voices to cry. Oh yeah,
and I ugly cry like Claire Danes.

Speaker 3 (05:41):
Oh that is the ugliest. I screw up my whole face. Yeah.

Speaker 2 (05:46):
God, she just pulls that out all the time too,
doesn't she? Every show.

Speaker 3 (05:50):
Yeah, it's like, yeah, that's her money maker. What do
you want?

Speaker 2 (05:55):
No, I get it, dude. Work with what you're good
at. Yeah. "Oh, I'm frantic melodrama. I'm crying, but I'm overcoming. Look,
I'm strong." Oof. I wish we could get her
here for that. Wouldn't that be...

Speaker 3 (06:11):
God? I have no idea where she lives, but that
would be the best cameo of all time. I think
if she's listening, she's definitely not going to give us
the time of day.

Speaker 2 (06:18):
Oh well, well, what are you gonna do?

Speaker 3 (06:22):
You know? Well, to hell with her. All right.

Speaker 2 (06:29):
Uh, you think you're sitting in the car of the
future. Because, you know, for Mark, he does not like change.
He doesn't like the future. He says, nope, I don't
like any of it. I don't want... I don't want anyone.
No humans replaced. None. That's it.

Speaker 3 (06:43):
Yeah.

Speaker 2 (06:45):
The problem is and I hate to do this because
it's going to give Mark fuel for his own little fire.

Speaker 3 (06:51):
There.

Speaker 2 (06:52):
Your future could be drastically shortened because your cabby just
pulled into an active crime zone. The future is actually
trying to kill you. How to avoid getting shot by
the future. Sounds like a bad promo for Looper. That's next.
Chris Merrill, KFI AM six forty, live everywhere in the
iHeartRadio app.

Speaker 1 (07:08):
You're listening to KFI AM six forty on demand.

Speaker 2 (07:13):
I'm Chris Merrill, KFI AM six forty, more stimulating talk.
I love it when we have a video go viral,
a local video of things behaving badly.

Speaker 3 (07:26):
In this case, it's the Waymo. So Waymo is running...

Speaker 2 (07:31):
You know, the driverless taxi service, right? I don't...
what do you call it? You call it a
taxi, whatever it is. It's the Waymo. You're just taking a Waymo. So
they're doing the Waymo. So we've got the driverless service
happening in LA, and it's not a question of if,
it's a question of when it's gonna end up doing

(07:53):
something dumb. You've seen videos of Waymo doing dumb things before:
turning the wrong direction, cutting people off, whatever it is.
You've seen the videos, so you know it does these
things. A bit unfairly, because people cut each other off
every day, people go the wrong way down a
one-way street. People do all of these things as well.

(08:15):
But we hold the machines to a higher standard because.

Speaker 3 (08:19):
There's a scapegoat.

Speaker 2 (08:21):
Say Waymo does something dumb and smashes into you. There's
no person on the other side of that that holds responsibility. Instead,
it's some corporate leviathan that will buy their way out
of whatever it is. Oh goodness, okay, you sue them.
You get a few million dollars. It's a drop in
the bucket for Google, for Alphabet, right? It's nothing. Whereas

(08:44):
if it's a person on the other end, they got
skin in the game. You know that if
they turn the wrong way on a one-way and
they end up hitting you, that has serious consequences.
So it's frustrating when we see the machines doing
something wrong, even if the machines are better than humans
at it.

Speaker 3 (09:00):
Frustrating when we see it because they got no skin
in the game.

Speaker 2 (09:04):
We can't relate to it, but we're expected to put
our trust in that being on the roads. Well, now,
the video has gone viral for this reason.

Speaker 6 (09:12):
In this video, taken in downtown LA, you can
see a line of police cars blocking
the road and a man lying on the ground. Enter
this Waymo driverless taxi, which, while servicing riders, proceeds to
take a left turn, driving right past the active police
stop. And officers, moments later, are seen walking towards

(09:33):
the subject with weapons drawn.

Speaker 2 (09:35):
Yeah, not only did it drive through the standoff, or
whatever you want to call it...

Speaker 3 (09:42):
Uh, it just about drove over the suspect.

Speaker 2 (09:46):
So the driver had gotten out of his... out of
his, I think it was a truck. Gets out of the
truck, lies down on the ground like the police telling

Speaker 3 (09:52):
Me, good all of the vehicle, lay it out with
your hands.

Speaker 2 (09:55):
behind your head. That's what he's doing. All of a sudden,
the Waymo drives by and just about drives over him.
Good job, Waymo.

Speaker 6 (10:03):
Waymo telling NBC News that when its robotaxi came across
the scene, it turned into an unblocked area where other
cars were also driving, and that it quickly left. In
a statement, the company saying safety is our highest priority
and that when we encounter unusual events like this one,
we learn from them. Will it ever be possible to
train a driverless car on every single potential traffic scenario?

Speaker 2 (10:27):
Come on. No, it's impossible to train a human on
every single possible traffic scenario.

Speaker 3 (10:34):
Nope.

Speaker 2 (10:35):
Although if we see flashing lights and somebody on the
ground and guns drawn, we know maybe don't drive through
that way.

Speaker 3 (10:41):
And that's the catch with this technology.

Speaker 6 (10:43):
The question is can you train it on enough that
it makes few enough mistakes that you're willing to tolerate
the mistakes.

Speaker 2 (10:49):
Of course. I mean, that's a good question. Here's
what I think is really interesting about the machine learning,
the training and all this stuff we're doing, whether
it's the driverless vehicles, whatever it is: it's
that we train it on logic. But logic is not
a substitute for sense. So we have common sense. Don't

(11:11):
stand in front of the cop that's pointing a gun
at a subject, right? Common sense would tell you self-preservation
kicks in here, and I'm not going to stand
in between the cop and the guy who the cop
is pointing the gun at.

Speaker 3 (11:25):
That just makes sense.

Speaker 2 (11:27):
And I don't think this is something that we have
to train people on. This is not,
oh, what's the training data for this?

Speaker 3 (11:35):
No?

Speaker 2 (11:36):
If you ask a five-year-old, do you
want to stand between a cop who's pointing a gun
at a person who's on the ground or not,
the five-year-old's gonna go, no. The cars don't
have common sense; they only have logic. Logic says: I
need to turn left. Going straight is blocked off. Flashing lights,
avoid flashing lights. Move on to the next. Whatever it
is, it doesn't have common sense.

Speaker 6 (11:59):
Way most data their self driving cars have ninety percent
fewer serious crashes compared to a human driver that I
believe that their vehicle.

Speaker 2 (12:07):
Yeah, we suck at driving too. The older I get, the
less I want to be on the road. And I
used to love driving, I did, I think probably because I...

Speaker 3 (12:16):
Was the hazard. Yeah, now I drive safer. Driving safe
is boring and scary.

Speaker 6 (12:23):
Their vehicles are still facing challenges. Earlier this year,
a passenger got stuck in a Waymo after the car
repeatedly circled around a parking lot at the Phoenix airport.

Speaker 3 (12:33):
This car is just going in circles.

Speaker 6 (12:35):
The federal investigation is now underway into Weimo's repeatedly passing
stop school buses with lights flashing.

Speaker 2 (12:43):
Yeah, but what's the stake in the game? If I
do that, if you do that, they're gonna track us down.
They're gonna take away our license. There's gonna be consequences.
If Google does that, they go, Google, you shouldn't do that.
You should train your vehicles not to do that.
And if you keep doing it, we're going to fine you,

(13:04):
where's the consequence? No consequence.

Speaker 6 (13:07):
And as for this latest incident here in LA:
LAPD tells NBC News that the Waymo car did not impact
or impede any of the police officers' operations during that
traffic stop. But they did say that the turn that
the robotaxi made did prompt the officers to go and
then shut down and block off that intersection.

Speaker 2 (13:27):
Michael, okay, thank you very much. It was NBCLA that
was reporting on that. So where I get frustrated with
the machine learning and whatnot, Mark loves it.

Speaker 3 (13:38):
When I talk about AI, he loves it. I
can't get enough of it. Eyes light up. It pleases me. Now,
I had a nice conversation with my ChatGPT tonight.
It wasn't a conversation. Well, it was... no, it was, can
we say the M word on the air?

Speaker 2 (13:51):
I don't know what the M word is. Never mind.
Machine masochism? Oh... wanking.

Speaker 3 (14:01):
There you go. We can say the W word, but
not the M word. These things aren't thinking. They're programmed.
Programming isn't thinking. Okay.

Speaker 2 (14:09):
So I was diving into the programming tonight. Yes, there
was an article in the New York Times and I
made mention of this in the first segment that was
talking about how the writing is predictable and rote,
it leans on tropes that sure, we use these things
in real life, but we don't use them in every sentence.

(14:29):
It likes to say, uh, it's not X, it's Y,
and it loves to use the em dash.

Speaker 3 (14:35):
I can't stand it. So that's where you drop the line...
Oh my gosh, dude. I can't, Mark. I can't tell
you it's not wrong. Right? The em dash is fine.

Speaker 2 (14:46):
It's a great substitute for somebody like... I tend to
overuse ellipses, right, and an em dash is a great substitute
for that. Which is just a dash, basically, for those
that don't know what it is. But it's kind of
like talking to that person that puts exclamation points at
the end of every sentence, or does the, sorry,
the fake quotation marks with their fingers. Well, that's the person you're talking
to in real life. I'm just saying, imagine I send

(15:08):
you a message, and every message I send you just
has exclamation points. Are you working tonight? Exclamation point, exclamation point,
exclamation point. You go, well, it seems excessive.

Speaker 3 (15:18):
I would think you had some developmental issues, and I
would speak very carefully to you, and slowly. Yeah,
that's what AI is.

Speaker 2 (15:25):
So anyway, I asked AI what it thought about the article,
and it basically trashed the article.

Speaker 3 (15:30):
Yeah, and I said, your response is interesting.

Speaker 2 (15:33):
Your response uses a lot of the same tropes the
article says are telltale signs. And it's like, oh, good job,
you caught me. You've become self-aware about my lack
of self-awareness.

Speaker 3 (15:44):
Yeah. And the thing is, even if
these things screw up, you can't have the satisfaction
of turning them off like HAL 9000
in 2001. Like, it's not going to
sing "Daisy" slower and slower until it goes to sleep forever.
Yeah.

Speaker 2 (15:56):
The thing is, like, if I yell at you, I
get an emotional response one way or another.

Speaker 3 (16:00):
Right, you're like, oh wow, he's really upset. No, it
does not care. I wonder if they had focus groups
for Waymo before they settled on Waymo, because I would
like to call them dystopian robot cars. That was second
place, okay, on the focus group. Yeah. Yeah,
and it's more of a mouthful when you're using the app,
but it's a little more accurate. Dystopian Robot Car, yeah,

(16:24):
or DRCs for short. Right.

Speaker 2 (16:26):
Anyway, my point is, I was, you know, wanking
with ChatGPT. I'm gonna call it a conversation; you say
it was not. Uh huh. Basically I said, can
you learn from these criticisms?

Speaker 3 (16:41):
And it basically said no, I can't. It has to
be programmed in.

Speaker 2 (16:44):
So what concerns me is that when you've got a Waymo
that's going through this intersection.

Speaker 3 (16:48):
You can't go.

Speaker 2 (16:49):
Don't do that. Because in all of our dystopian films,
there's a neural network where if one remote
server learns something, it passes it on right to the
centralized brain, and then the centralized brain learns from that.
Usually it's learning the tactics of whoever's trying to destroy it

(17:10):
in our dystopian robot films.

Speaker 3 (17:14):
But the Waymo doesn't do that.

Speaker 2 (17:16):
It has to... so it screws this stuff up at
the police traffic stop, or it screws this thing up,
it starts driving in circles, and
somebody has to come to work the next day at
Google and write code to tell it: don't do that again.
So when they talk about machine learning, not really. It
just doesn't really learn. It remembers instructions and it

(17:39):
employs new instructions on a one-on-one basis, but
it's not like there's a neural network where it is
actually learning things.

Speaker 3 (17:47):
So many people are gonna lose their shirts when the
AI bubble bursts, and it is going to; the numbers bear
this out. I've never more wanted Captain Kirk to come
in and talk a computer into self-destructing. Oh yes,
because he did that. He would do that. He did
that like half a dozen different times.

Speaker 2 (18:05):
Yeah, he would out-logic the computer until it realized
that the only reasonable solution to whatever

Speaker 3 (18:11):
The issue is is self destruction.

Speaker 2 (18:12):
Yeah, that Roddenberry, he knew how to write, didn't he?

Speaker 3 (18:17):
We need him. Yeah, well we've still got Shatner. We
do still have Shatner. That's a good point too.

Speaker 2 (18:23):
Hey, you'd be forgiven for thinking that Gen Z is broken,
that they're renting and waiting. But maybe, just maybe, Gen
Z has figured things out in a way that the
rest of us never could. Find out what the Zoomers
might be onto, next. It's Chris Merrill,
KFI AM six forty.

Speaker 3 (18:43):
I think, is that the one we're doing?

Speaker 7 (18:44):
Next?

Speaker 2 (18:45):
Oh no, that's the wrong tease. I read the wrong tease.
Uh, oh yeah, no, we're not doing that one.

Speaker 3 (18:49):
Next? When am I doing that one? Oh, next hour?

Speaker 2 (18:54):
All right?

Speaker 3 (18:54):
Ignore that, then. How about this?

Speaker 2 (18:56):
The kissing club is turning into something very dirty, and
it could cost one local school beaucoup bucks. How kissing
turns into sexual assault in elementary school.

Speaker 3 (19:06):
Gross?

Speaker 2 (19:06):
Next. Chris Merrill, KFI AM six forty, live everywhere
in the iHeartRadio app.

Speaker 1 (19:10):
You're listening to KFI AM six forty on demand.

Speaker 3 (19:18):
More stimulating talk. What have we here? Oh, look. At first blush,
I thought, what's the big deal? A kissing club? All right,
kids are kids. Now...

Speaker 2 (19:29):
"Kissing club" assaults on an eight-year-old hidden by LA's
elite Sierra Canyon School, according to a lawsuit. Fox eleven
had the story and I was like, Oh, how bad
could it be?

Speaker 3 (19:40):
Oh? God, it's bad.

Speaker 8 (19:42):
Sierra Canyon is a pre-K through twelfth grade school
located in Chatsworth. Tuition can be as high as thirty-nine
thousand dollars per year for the lower grades. The
parents of a young girl enrolled her there with the
school's promise of a stellar education and a secure environment. However,
when details of bullying and more serious issues emerged, the
parents claimed that the school did nothing to protect their daughter.

(20:05):
So far, there has been no response from Sierra Canyon
regarding what is reported to be the second lawsuit alleging
that older girls bullied younger girls and engaged in even
more serious misconduct.

Speaker 2 (20:17):
Yeah, the serious misconduct gets a little... so, earmuffs, kids.

Speaker 8 (20:21):
This lawsuit was filed on behalf of a now nine-year-old
girl. According to the documents, the victim, who
was then seven, was allegedly bullied into joining what was
referred to as the Kissing Club. Over time, the activities
in the unsupervised restroom reportedly escalated. E.K. was pressured
by the group to kiss the others in the bathroom.

(20:42):
Soon thereafter, she was forced by the older girls to
kiss and touch their genital area.

Speaker 7 (20:46):
Okay, they involved the actual kissing of young girls amongst
each other, but it led to sexual touching of sexual

Speaker 3 (20:56):
Body part healthy.

Speaker 7 (20:57):
And that's the most disturbing thing: it took some
time for it to develop, and this kissing
club just grew and got bolder because there was
a complete lack of supervision.

Speaker 3 (21:09):
It.

Speaker 2 (21:10):
Yeah, that's terrifying, and it continued to get worse and worse.
So I mean, were they rounding the bases? Is
that what was happening with these kids in elementary school? And
look, I think everyone at a young age is
discovering themselves and probably has stories of kissing the neighborhood
boy or girl or whatever it was. I think, I

(21:30):
think that's something that many, many, many people shared and experienced
in our formative years. So when
I heard, okay, there was a kissing club, I thought,
all right, inappropriate. You know, adults, you step in, you
break it up. But then when I find out that
it turns into basically sexual assault, it's horrifying. And here
you are at the private school, so no pun intended,

(21:53):
a high-profile private institution. So where is the supervision? And
I don't know, if they're talking about older girls, what
do they mean? Nine?

Speaker 3 (22:02):
Do I mean?

Speaker 2 (22:02):
This girl was, what, seven at the time, I guess,
and now she's older. But so when you have ten-year-olds,
eleven-year-olds, at what point do you
not have an aide or a teacher going into the
bathroom with the kids?

Speaker 3 (22:14):
I really don't know the answer.

Speaker 2 (22:16):
To that, and I think we're very attuned to these
sorts of things happening today. That's not to say they
didn't happen in the past, but in the past, I
think it was more likely to be swept under the rug. Sadly,
I think there's probably a lot of people who were
victims of sexual assault in the past where it was thought, oh,
you know, kids will be kids. No, no, no, again,

(22:38):
if this were like a kissing club, you break it up,
you go, you know, kids will be kids. But this
turned into something far worse, and that is inexcusable. And then, sadly,
we've got this culture of silence that goes on.
As they said, it's not the first time that there's
been people, you know, complaints about this sort of stuff,

(22:58):
and will this be the last, or are more kids going
to come forward?

Speaker 3 (23:04):
That's the other question.

Speaker 2 (23:07):
Is this a widespread issue where you've got multiple victims
and we're only at the beginning. That's terrifying. And I
also don't think this is the only place that this
sort of thing is happening. So again, hopefully we can
break this culture of silence. You know, we went through
the Me Too era, and I thought we shined a
light on a lot of poor behavior. But the stuff's
happening in our schools, and I don't know that everybody

(23:30):
learned the lesson when we went through that. So maybe
we need to make sure we're protecting the kids.

Speaker 3 (23:34):
Good lord, you'd think we'd be protecting the kids, with all

Speaker 2 (23:36):
The talk about billionaire pedophiles and all that other garbage
is going on. Things to Epstein and the other questions.
But oh no, it wouldn't happen to my neighborhood.

Speaker 3 (23:45):
No, it is. It is. It's horrifying.

Speaker 2 (23:48):
I can't even imagine, if you're a parent, you find
out that this is happening to your kid. I cannot imagine.
I'm sorry, parents, I'm so sorry you're dealing with that.
I don't even know where you start. All right. He
is the most lovable, cuddly, and vicious officer being laid
to rest, the community mourning. Rightfully so, because this officer's
whole life was meant to do one thing: protect other officers.

(24:11):
You're about to hear the incredible story of Officer Spike.

Speaker 3 (24:15):
Next.

Speaker 2 (24:15):
I'm Chris Merrill, KFI AM six forty, live everywhere in
the iHeartRadio app.

Speaker 1 (24:19):
You're listening to KFI AM six forty on demand.

Speaker 2 (24:27):
If you're looking for the show, well, the whole podcast
is going to be available on demand: KFI
AM six forty dot com slash featured podcasts. Or don't
slash it.

Speaker 3 (24:36):
Just look for it. Just featured. Just go
to the website, look for featured podcasts. Just do that.
Just do that. Okay. What's happening in the job market,
thanks to AI?

Speaker 2 (24:48):
Well, if you are one person on this program, you
believe that there's a massive bubble that's going to burst,
the whole world is going to come crashing down,
and there still won't be any jobs. You're just a bundle
of joy.

Speaker 3 (25:01):
Rahner. Oh, you were talking about me. Okay, all right, yeah,
I actually don't disagree with you.

Speaker 2 (25:07):
We'll dive a little bit deeper into that coming up
here after Mark's eight o'clock news. I did want to
make mention of this today. They did a
service for the canine, the dog that got killed in
the line of duty.

Speaker 3 (25:20):
ABC seven had the story. He is

Speaker 9 (25:21):
Being called a hero because he gave the ultimate sacrifice
to protect his partner and other officers that night. But
the apartment says behind Spikes canine badge was just a
good boy.

Speaker 4 (25:31):
Oh who wanted to make his people, especially his partner, proud.

Speaker 2 (25:45):
Those last calls... those, you know, on the radio, the last
calls on the radio. I cry every time anyway, and
now it's for a dog.

Speaker 3 (25:57):
What are they saying that?

Speaker 5 (26:01):
Boy?

Speaker 3 (26:06):
Brave boy.

Speaker 2 (26:11):
I told you we'd have big guys with big, deep radio voices crying tonight. I
told you that was going to happen. It makes me sad.

Speaker 9 (26:18):
An emotional tribute to a courageous canine cop. You gave
this community absolutely everything you had, and your legacy will
remain with us always. The City of Burbank honoring Spike,
known for his bravery, loyalty, and love. Sky five overhead
the procession from the Burbank Animal Shelter, the dog's flag-

(26:40):
draped body brought into the police department for the last time.
A crowd including officers and their dogs from all over
Southern California stood watch, remembering Spike's actions and impact. The
department grateful for the support from those grieving this loss.

Speaker 3 (26:55):
And it just shows how tight-knit this community in
the City of Burbank is.

Speaker 2 (27:00):
Those are tough guys that are crying too. I'm not alone.
So sad. I cry more for the animals than I
do for people. The animals, all they want to do
is protect their person. So they had one mission.

Speaker 3 (27:19):
That was it.

Speaker 9 (27:19):
On November twenty second, Spike was shot and killed by
a suspect running from a traffic stop.

Speaker 3 (27:24):
The four year old.

Speaker 9 (27:25):
Belgian Malinois spent just under two years with the department.
The canine was a dependable partner, catching bad guys and
going into dangerous situations ahead of others. A
bond built on countless hours of training only other handlers
can understand.

Speaker 7 (27:41):
Our partners, they're with us more than
with our own families.

Speaker 2 (27:45):
Yeah, I believe it. And honestly, they're probably easier to tolerate.
That's not an insult. I just know what it's like
being around the same person all day long. We all
went through COVID. If you had families when you were
going through COVID, it was tough.

Speaker 3 (28:00):
The only solace you had was the dog.

Speaker 7 (28:02):
Oh, it's very hard when one of us loses one
of our partners.

Speaker 3 (28:05):
It's hard.

Speaker 7 (28:07):
We send these dogs into places that we wouldn't want
to go as a human. Yeah, and it's just
what we live with every day, knowing we may not
bring them home.

Speaker 1 (28:14):
It wasn't just a police canine.

Speaker 9 (28:16):
He was a beloved member of the family.
Burbank's police chief spoke on behalf of Spike's partner, Officer
Corey Sallas.

Speaker 10 (28:24):
His passing has left an unimaginable void in our hearts
that can never truly be filled.

Speaker 2 (28:32):
Oh, if you've ever lost an animal...
I used to have an argument. I had a program
director that was like, oh, dogs have no souls.
You're a curmudgeon. Loved this PD, but he didn't like dogs,
and that's a red flag. So we used to argue
about it all the time. But if you ever lost

(28:53):
an animal, you know the heartbreak. I mean,
it's like losing a child. It is absolutely devastating.
Devastating. The thought of losing the animal because it was
protecting you from a danger has to compound that exponentially.

Speaker 10 (29:14):
When you think of Spike, please remember him as the
hero that he is. Rest easy, our dear Spikey. Oh,
you will always be loved and forever missed.

Speaker 9 (29:24):
And the department says that Spike's service serves as a
testament to the dedication, discipline, and heart that define a
true police canine.

Speaker 3 (29:33):
Ah, poor pup.

Speaker 2 (29:36):
Anyway, I wanted to make sure that we paid a
little bit of homage to the pup tonight. Makes me
so sad. All the way around, makes me so sad.
I really have nothing else to say, guys. I'm just...
are we good? Well?

Speaker 3 (29:50):
No, you went on a thing last night about how
brokenhearted you were when your dog died, and when
my cats died, I cried more than when my mother died. Yeah,
so I think that's a fairly common thing.

Speaker 2 (30:02):
And imagine if the cat died, or in this case
the dog, while it was protecting you. Like, it took
a bullet so that the assailant wasn't shooting at you.

Speaker 3 (30:11):
Oh yeah, we don't deserve dogs. They're too... No, we don't.
We don't.

Speaker 2 (30:15):
They're gonna, is this right? They're gonna do something for
him in the Rose Parade. Yeah, that's what I'm seeing too.
A large makeshift memorial formed outside of police headquarters in Burbank:
treats, handwritten notes, Spike's end of watch on the Facebook page.
And then he's gonna be memorialized in Burbank's twenty twenty
six Rose Parade float, themed "All Paws on Deck," so

(30:39):
that will highlight animal rescue work and feature tributes to
Spike's bravery. I know those of you who are not
animal lovers, you're like, why is this sap crying on
the radio over a dog he's never met? Well, what
can I say? I'm a big softy. You probably have

(30:59):
heard the liners that we have running. They say,
you know, one hundred percent human, guaranteed human. Mark went
off on this earlier in the show too, where he
was saying he was very proud of the company because
the company is committed to being one hundred percent human,
and I think that's outstanding. I wish more companies were
doing that because they're not. And when you find out

(31:21):
how many could replace us, it's going to blow your mind.
That is next. Chris Merrill KFI AM six forty. We're
live everywhere in the iHeartRadio app

Speaker 1 (31:31):
KFI AM six forty on demand