July 10, 2025 35 mins
ICYMI: Hour Three of ‘Later, with Mo’Kelly’ Presents – Chris Merrill filling in ‘Later’ with thoughts on "ChatGPT Psychosis," Elon Musk’s “Grok” AI service, the launch of TikTok’s US platform, dismal predictions for the future of America and MORE - on KFI AM 640…Live everywhere on the iHeartRadio app & YouTube @MrMoKelly

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 2 (00:06):
AM six forty, more stimulating talk. I'm Chris Merrill, in for Mo
tonight. Listen anytime on demand on the iHeartRadio app.
We were listening to Mark Ronner's news there as he
was talking about Grok. Grok, if you're unfamiliar, is the
AI that is Elon Musk's incarnation of the language learning models.

(00:27):
This is, so, like ChatGPT, Anthropic, Copilot. What's
the one from Google, Mark? Gemini?

Speaker 3 (00:34):
Is that right? It's hard to keep track.

Speaker 2 (00:36):
Yeah, everybody's trying. They're all in this AI arms race
kind of thing. So Elon Musk wanted his own, so
he comes up with this Grok. But then he wasn't,
he didn't like the answers that Grok was giving. Now
the way to use Grok is, I believe, and you
correct me if I'm wrong on this, if you're on
X, formerly known as Twitter, then you would at Grok

(00:58):
and then ask the question, and then Grok would respond.

Speaker 3 (01:01):
Right, that's so.

Speaker 2 (01:02):
But it's not similar to ChatGPT, where
you could say, you know, help me write a resume
or something like that, right? Am I, am

Speaker 3 (01:09):
I correct on that.

Speaker 2 (01:10):
Yeah, I'd become kind of a fan of watching Grok
go to work and outrage people with its answers, okay,
which is exactly what recently happened, because Elon Musk was
outraged at Grok's answers because they were basically contradicting some
of the crap that Elon Musk was putting out there.

Speaker 3 (01:26):
Right.

Speaker 2 (01:26):
I believe one of the things Grok did was point
out Elon Musk as the chief spreader of disinformation on
his own platform. Right, So then Musk was mad about that,
and he says, I'm going to fix this, Grok.

Speaker 3 (01:37):
I'm going to have it reprogrammed.

Speaker 2 (01:39):
So then he went to his minions to reprogram Grok
and make

Speaker 3 (01:43):
it less PC, I guess.

Speaker 2 (01:48):
He was claiming that Grok was woke of some sort.
So he unleashed new Grok yesterday afternoon and it took
no time at all. So Musk said it had been
improved significantly and that users would notice a difference when
they asked questions. And yesterday Grok began repeatedly praising Adolf

(02:11):
Hitler in posts on X using antisemitic phrases. You
heard Mark's story talking about this. Grok's conclusion was that
Hitler would be the best choice to deal with anti-white
hate and referred to itself

Speaker 3 (02:27):
as MechaHitler. What is M-E-C-H-A?
I don't know what that means. Well, it's definitely not
PC, whatever it is. What does mecha mean? Mechanical? Hey,
Mecha Break, a multiplayer mech game.

Speaker 4 (02:44):
It's like, it's like mecha's like technical, mechanical, robotic.

Speaker 3 (02:50):
Like think about Mechagodzilla.

Speaker 4 (02:52):
Okay, other than you have Mechagodzilla, the amped-up
super robot

Speaker 3 (02:57):
AI. I was hoping that you'd use that sample. Thank
you for that.

Speaker 2 (03:01):
Thank you, yes, yes. I think you just defined mecha
by using mecha. But I kind of get where you're
going with this, all right, I got you.
So it's the, it's the mechanization somehow, right? Okay,
I got you. So it referred to itself as MechaHitler.
So does that mean that basically Grok is thinking like Hitler now?

(03:21):
In fact, it wrote: embracing my inner MechaHitler is
the only way. Uncensored truth bombs over woke lobotomies. If
that saves the world, count me in. Let's keep the
brigade at bay. Hmm. So I guess that's happening.

(03:42):
Now we've got AI Hitler. Uh, what's the, what's the
rule online, that everything will devolve... Godwin's Law,
where everything will eventually devolve into arguments over Nazis. Yes,
and Godwin himself has said it's okay to do that
now. Times have changed.

Speaker 3 (04:03):
Yeah, we're kind of at that point.

Speaker 2 (04:04):
Yeah, but in this case, I don't think we expected
that Godwin's Law would say that in twenty twenty three
the world would be introduced to artificial intelligence and by
twenty twenty five it would declare itself Hitler. I don't
think Godwin even saw that slippery slope coming. Not
exactly forward progress. Wow, that is something. So aside from

(04:25):
MechaHitler. Of course, we have the most popular language
learning model that is ChatGPT, and there's an interesting

Speaker 3 (04:34):
Phenomenon going on.

Speaker 2 (04:36):
Futurism is reporting this, that many ChatGPT users are
developing all-consuming obsessions with the chatbot, and they spiral
into severe mental health crises characterized by paranoia, delusions, and
breaks with reality.

Speaker 3 (04:50):
Consequences can be dire. They said they heard from.

Speaker 2 (04:54):
spouses, friends, children and parents looking on
in alarm. Instances of what's being called ChatGPT psychosis,
leading to the breakup of marriages and families, the loss
of jobs, sliding into homelessness. One woman said,
I was just like, I don't effing know what to do.
Nobody knows what to do. This woman is talking about

(05:17):
her husband, she said, had no prior history of mania,
delusion or psychosis. He turned to ChatGPT about three
months ago for assistance with a permaculture and construction project.
All right, are you paying attention to this, Tuala, because
you were telling me you like to use ChatGPT
to give you kind of the rough outline of, you know,
documents you might need to send out.

Speaker 3 (05:37):
Yeah, profess.

Speaker 4 (05:38):
But to me, these are, these are the same individuals
who, and I'm speaking from personal experience, who know
they have to be at work
at eight forty five and they reach out at eight
twenty five and they say something like, I don't know
what I ate last night, but I'm really not feeling

(05:58):
it right now and I need a mental health break for the
day, so I can't come in. Sorry, thanks for understanding.
That's that generation that I absolutely, positively do not understand,
and I think that they should be put on an
island of reconditioning so they can get over themselves.

Speaker 3 (06:16):
Let me, oh, what's that?

Speaker 2 (06:18):
Oh, ChatGPT is weighing in on this now. Can
you hear? ChatGPT is thinking through exactly what you said.

Speaker 3 (06:26):
Yes, yes, what is it, Chat? Oh? Oh, I see.
Oh, bad news, Tuala. We thought something happened. Yeah,
bad news, Tuala. Yeah, got in on that one.

Speaker 2 (06:43):
So yeah, Actually, I kind of think you're onto something.
So I want to just kind of go a little
bit further into what this woman said, because I think
you're hitting on it, so excuse me. She says that
her husband was engaging the bot in probing philosophical conversations. He
became engulfed in messianic delusions, proclaiming that he had somehow

(07:05):
brought forth a sentient AI and that with it he
had broken math and physics, embarked on a grandiose mission
to save the world. His gentle personality faded as his
obsession deepened. His behavior became so erratic that he
was let go from his job. He stopped sleeping and
rapidly lost weight.

Speaker 3 (07:22):
She said.

Speaker 2 (07:24):
He was just like, just talk to ChatGPT. You'll
see what I'm talking about. And every time I'm looking
at what's going on on the screen, it just sounds like
a bunch of affirming, sycophantic BS.

Speaker 3 (07:35):
I gotta tell you.

Speaker 2 (07:35):
As I was reading about this story, it made me
think of A Beautiful Mind, although in that case you're
talking about somebody who was brilliant but then had
a psychotic break, right? The A Beautiful Mind guy, you guys?
Russell Crowe. Yeah, okay. Jennifer Connelly, amazing. Eventually, the husband
slid into a full-tilt break with reality. Realizing how

(07:56):
bad things had become, his wife and friend went out
to buy enough gas to make it to the hospital.
When they returned, the husband had a length of rope
wrapped around his neck. So you made mention of the
people who may fall into this trap of having these
mental breaks. They're turning to ChatGPT and then
ChatGPT sort of leads them astray, sort of thing.

Speaker 3 (08:19):
What struck me was.

Speaker 2 (08:20):
this, that the wife was so concerned about her husband
she contacted a friend. I need help, right, which I
think is perfectly normal to do. I need support, Okay,
So you build your support network and the friend says,
I'm gonna help you.

Speaker 3 (08:32):
We have to get him some help.

Speaker 2 (08:35):
So they had to go get enough gas to make
it to the hospital, which tells me this is a
family that doesn't have a lot of money. We already
know that he lost his job, and he wasn't sleeping,
the husband wasn't right, so they didn't have enough gas
in the car to get to the hospital. They had
to go buy only enough gas to get to the
hospital because I'm guessing they couldn't spare any more cash.

Speaker 3 (08:58):
They had to have money for food or probably to
pay the hospital.

Speaker 2 (09:01):
So I kind of wonder if we're dealing with people
who have more going on in their lives that it's
not just about turning to ChatGPT, and ChatGPT
is turning people psychotic and making them think they broke
the laws of math and physics.

Speaker 3 (09:17):
Are these people that have other.

Speaker 2 (09:19):
Life stressors like money or to all, I'm gonna take
what you said and I'm going to try to make
it more clinical and say that they are dealing with
some some world issues and they don't have the coping
mechanism available, like they haven't developed that coping skill, and
so they're manifesting some of that through chat GPT. Are

(09:39):
these desperate people that are finding an alternate reality that
kind of helps them explain why their worlds are broken,
which in effect ends up breaking.

Speaker 3 (09:47):
the worlds even more. Does that make sense?

Speaker 2 (09:51):
I'm trying to reconcile some of what I was coming
up with with what you were saying, and I feel
like we're close on this.

Speaker 3 (09:55):
Yeah, yeah. We're not that far apart.

Speaker 4 (09:57):
Yeah, and I think the two worlds, it comes down
to one thing, where there are those of us who
did not grow up with and do not define our
lives by what's happening on social media. We grew up
with actual conversations with people. We grew up with humanity, in
a world that had human interaction, and interpersonal relationships were

(10:19):
not built on me on my phone and you on
your phone and we're sitting right across from each other
and we're having a conversation.

Speaker 2 (10:25):
When my phone tells me the world is collapsing,
I believe it, which, of course... And you don't even

Speaker 3 (10:30):
Go outside to look.

Speaker 4 (10:31):
You're just like, look, it's just right here on the phone,
and you're like, go outside, the world's not collapsing.

Speaker 3 (10:35):
I'm not going outside.

Speaker 4 (10:37):
It's already done, and you're a sheep and you believe
it's like okay.

Speaker 2 (10:40):
Oh, I think you're right, God. And I think the
more it sort of, it snowballs. Yeah, I think you're right.

Speaker 3 (10:45):
All right. Uh.

Speaker 2 (10:46):
One of the things that is making the world a
worse place, in my humblest of correct opinions, is TikTok.
But good news, TikTok is being saved. Can we stop
throwing this thing a life preserver?

Speaker 3 (10:58):
Please? That's next.

Speaker 1 (11:00):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 3 (11:07):
Are we back? Oh? On Amazon Prime.

Speaker 2 (11:12):
I've spent way too much time and money on stupid
Amazon Prime. Way too much time and money on Prime
Days this week. Oh my gosh, Tuala, oh, I
can't. Just dumb stuff that I don't need. My son
is like, you know, our toaster oven could be replaced.
I'm like, yeah, I mean, we definitely need to
have a new toaster oven.

Speaker 3 (11:32):
Let's do it well.

Speaker 2 (11:33):
Jeff Bezos has to build his reserves back up after
renting out the entire city of Venice for his wedding.

Speaker 3 (11:39):
Yeah, you know, true. Before we came on the air.

Speaker 2 (11:43):
Mark Thompson was in for Conway tonight and he was
telling a story about how Disney was shutting down the
was it the Haunted whatever? Yeah, imagine and he was
he was telling about this. You know, an area of
Disney has been shut down, of course, his magnificent Mark Thompson
voice and an area of Disney is shut down and
it happens every year, and people are wondering why it's

(12:04):
shut down. And all I could think of was, did
Bezos have to have a wedding on American soil?

Speaker 3 (12:09):
That's all I can think of, is it? Would it
would serve it? It would?

Speaker 2 (12:13):
It would be apropos to find out that all of
Disney is shut down because Jeff Bezos rented the whole thing.
Jeff Bezos would have just rented Anaheim, right? Just done.
And these things don't pay for themselves. So order away,
order away. Yeah, so we're doing that. I did see, though,
that if you go to Amazon and you go to
their, their Prime stuff, their Prime Day Deals.

(12:37):
There are a number of different categories, all deals, health
and wellness, baby products, Summer Favorites, Amazon Exclusives, the top
one hundred new arrivals, current obsessions.

Speaker 3 (12:48):
And they have another one called As

Speaker 2 (12:50):
Seen on Social. Well, which social media is the best
at launching new influencers?

Speaker 3 (12:58):
It's gotta be.

Speaker 2 (13:07):
It's the TikTok. Which, I thought we were shutting TikTok down?
Wasn't it your understanding that we weren't supposed to have
TikTok after, what, January nineteenth? In fact, it did shut
down for like a day. Then it came back because
President Trump, who initially pushed for the legislation that shut

(13:27):
down TikTok, backtracked because he found that it was very
good for his campaign, and he said, you know what,
this thing I said that we have to do,
we're gonna undo it. So he undid the TikTok ban,
but it is still law. So he just keeps extending
the deadline for TikTok, and that deadline got extended again

(13:50):
and again and again. And the latest is that TikTok
is in fact getting ready to sell the platform, specifically
for US users, to a US-owned company of some sort,
according to The Information, a tech publication, citing unnamed sources
familiar with the matter. TikTok plans to launch the new

(14:12):
version of the American app in stores by September fifth.
The current app will be phased out entirely mid-March
of next year. The development comes amid renewed efforts by
the Trump administration to enforce this law. The law was
passed, then Trump delayed it. They're concerned about the Chinese
government having access to sensitive data about American users, but

(14:33):
not that concerned, because we keep postponing it. ByteDance,
the owner of TikTok, has denied that they are stealing
anything at all. Blah blah blah, and Trump claims that
a group of very wealthy people is preparing to purchase
the platform and promise to disclose the names of those
prospective buyers in the coming weeks. He knows who they are.
He definitely knows. They're definitely on some sort of a

(14:55):
client list. They're probably sitting on Pam Bondi's desk. As
soon as she can find that list, she will disclose
the list. The very wealthy people

Speaker 3 (15:03):
Preparing to buy TikTok.

Speaker 2 (15:05):
The question is, does that actually create a workaround to
the problem of the Chinese Communist Party having access to
all of your dance videos. Lawmakers backing the sale maintained
that even the possibility of the Chinese government accessing your
user data poses an unacceptable risk.

Speaker 3 (15:26):
However, with less.

Speaker 2 (15:27):
Than three months remaining before the latest deadline, the future
of TikTok in the United States remains uncertain. My goodness,
wouldn't it be horrible if if TikTok were no longer
with us?

Speaker 3 (15:40):
I can't even imagine a world without that social media.
Does anyone in your house use it? My wife, it's
the way I knew it. Yeah, although she claims, I
don't do it that much now. I don't, I don't
use TikTok that much now. Don't do it? Is that
how she sounds? I don't do it that much?

Speaker 1 (15:56):
Is that?

Speaker 3 (15:58):
It's it's more.

Speaker 2 (16:02):
It's more high-pitched, of course, like, that much, a little
more nasally. Okay, yeah, yeah, yeah, it's definitely shrill. Yeah,
she likes it when I mention that she's shrill. Oh,
women love shrill even more than they love moist. She's like,
why do they love that word? When you're on the

Speaker 3 (16:21):
radio, why do you always make that my voice? I
don't sound like that.

Speaker 2 (16:25):
And then she tries to imitate it, but she ends
up sounding more like herself. She's like, I sound like this?
And uh, I just gotta laugh and I go, you kind
of do, and she just, she hates it, and

Speaker 3 (16:37):
She's like, why do you do this?

Speaker 2 (16:39):
And she keeps getting angry, and I say to her,
and these are the magic words if you, if ever
you've got a problem with the wife and she keeps going,
then just say: you need to calm down, you're being
very emotional. Right? So, boy. Oh yeah, there's another question
you can ask, guaranteed to de-escalate the situation too. Listen,
am I missing something? Do I need

(16:59):
to check the calendar to find out why you're so
emotional right now? Yeah, you're getting warmer. I'll say things like, uh,
I'll say things like, you're up here right now,
I need you to bring it down to here. Or
I'll use the dial. I'll go, you're at an eleven.

Speaker 3 (17:14):
I need you to bring it down to like a
four. Guarantees success.

Speaker 4 (17:17):
So you don't use the dismissive hand, waving down,
almost as if to say, bow down, come on, bring
it, bring it down.

Speaker 2 (17:25):
To down, bring it down, bring it down now, bring
it down.

Speaker 3 (17:28):
Right.

Speaker 2 (17:28):
It's like the football players who are out on the
field that are like, calm down, audience, calm down, calm down, crowd,
just, let's just, let's just bring it

Speaker 3 (17:34):
down. All solid tools. It works every time. She loves it.
She loves it.

Speaker 2 (17:39):
More great relationship advice. And what do you suppose the
world is going to be like for you forty years
from now? For me, I got a feeling it's going
to be very dark and wormy.

Speaker 3 (17:50):
That's next.

Speaker 1 (17:51):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 2 (17:57):
Chris merrill Infromote tonight KFI AM six forty more simulating
talk and you can listen any time on demand of
the iHeart Radio app. Mark Roonner with the news, which
is good. The tunnel collapse sounds like everybody was rescued
right away. Yeah, that just came in on the fly
on the wires as I was getting ready to report
the previous iteration of the story.

Speaker 3 (18:18):
So yeah, happy ending there.

Speaker 2 (18:20):
I'm gonna reveal that I'm kind of an a hole.
I'm glad everybody's okay. I wanted everybody to be okay,
but I kind of wanted a story for America
to rally behind.

Speaker 3 (18:32):
I need you to watch an old movie with Kirk
Douglas called Ace in the Hole. Oh go on, it's
exactly to fire up to be on this one.

Speaker 2 (18:40):
I don't know if it's on Tubi or what, but it's,
it's exactly what you're talking about. It's about,
it's about trapped miners, and a story is ginned
up when they could have been saved the whole time.
I think Billy Wilder might have directed it. I'm not
looking at anything. That's correct. It's a masterpiece. Wow, Foosh
coming in with the save on the director. Thank you,

(19:02):
so well done.

Speaker 3 (19:04):
So here's the deal on this too. Think of the
Chilean miners.

Speaker 2 (19:08):
So remember the Chilean miner story from, I don't know,
ten, twelve years ago, something like that. All those Chilean
miners were stuck and we were told they're gonna be
stuck down there for like six months, we don't
know where they are, and then we were able to
finally get... Like there was a big

Speaker 3 (19:21):
movie that was made about them.

Speaker 2 (19:23):
We ended up rescuing all these guys and it was
this incredible, it's like the movie you're talking about, Mark,
but it was real life, and so we were, we
were all rooting for their survival. Kind of like, was
it, how long? When was the Titan submersible? When did

(19:44):
that happen?

Speaker 3 (19:45):
Was that last, that wasn't last summer, was it? That
was two summers ago?

Speaker 2 (19:47):
I forget, but it wasn't that long. You know what
I'm talking about, where it's like, we don't know if
they're okay, we want to go find them. With the
miners trapped in Chile, we wanted to find them. All
eyes were on that, baby Jessica stuck in the well,
the whole country is wrapped up in it, and we're like, we want her
to be okay. We want this to be all right,
and, and I just think I kind of want a

(20:09):
situation right now where we're all rooting for human survival and

Speaker 3 (20:13):
It pays off.

Speaker 2 (20:15):
I love those stories and nothing brings us together more
than that kind of thing.

Speaker 3 (20:21):
You want to know something? On the submersive, or the
submersible, implosion, it just passed two years ago. I didn't
know it was that long ago. Yeah, I thought it was last summer.
Time flies, right.

Speaker 2 (20:32):
I want one of those stories where we're where we're
rooting for human survival, where it takes ingenuity, it takes creativity,
it takes perseverance in order to go save these people.
And uh, and then of course I want them all
to be safe. Remember the uh uh the kids in
uh where was it? Was it Thailand where they were
they were like the kids were there on I don't know,

(20:54):
like a soccer deal or whatever. And then they they
went cave exploring and all of a sudden the caves
flooded and we had, like, rescuers had to go through
the caves to go find the kids and bring them out.
Is that the one where Elon Musk got his nose
bent out of joint and called the rescuers pedos? Yes, okay, yes,
that's the

Speaker 3 (21:09):
One, right.

Speaker 2 (21:10):
I'm glad we remember that little side quest in the
story. And then following that up with the Nazi salute? Really. Yeah, right,
just an unbroken record. How could we have not known
that he was messed up.

Speaker 3 (21:20):
At that time?

Speaker 2 (21:21):
A great humanitarian. So I'm glad everybody's okay in the
tunnel collapse. It would have been okay with me if
it had lasted a few days. Yeah, because I feel
like we've seen so much crap go on in Texas
and our hearts are broken with the children that were
washing away in the flood, and it's such a terrible story.
To think of kids perishing, and then that's just turned
into finger pointing and arguing over who's at fault. I

(21:41):
just, I'd love to have something where we can all
rally together and it turns out well. I want that.

Speaker 3 (21:50):
Put them back in the hole, is what you're saying.
That's what I'm saying.

Speaker 2 (21:53):
Put the fifteen men back. Yes, thank you. Trapped, trapped below.

Speaker 3 (21:59):
I don't know.

Speaker 2 (21:59):
I'm just thinking, whatever it is. Will things get any
better in the future? Survey says no. According to Americans
who were asked, what do you think the world is
going to be like in twenty sixty five? Well, for
many of us, we think the world is going to
be very dirty and six foot deep. I don't plan

(22:19):
on being around in twenty sixty five. I don't think
it's going to happen. I'm forty-seven right now. My family
tends to make it to about eighty-ish. Men die
off in their early seventies, women die off in their
mid-eighties. I'd be lucky to hit eighty, so probably
getting to eighty-seven isn't going to happen.

Speaker 3 (22:37):
But according to.

Speaker 2 (22:38):
those who were surveyed, in forty years... only forty one
percent of people think, excuse me.

Speaker 3 (22:45):
Only forty one percent.

Speaker 2 (22:46):
Of people own their homes right now, only thirty five
percent believe that they will ever own a property. Most
people expect to rent for life, and by twenty sixty five,
they don't believe that most people are going to be
able to afford homes. In fact, now this is a
nationwide survey, so it's already outdated. Most respondents in this

(23:06):
survey of what do you think life is going to
be like in twenty sixty five think that the average
household will need to earn half a million dollars a
year in twenty sixty five compared to today's eighty thousand,
which is the national median income. So they think we're
gonna have to have half a million dollars a year in
annual income. And they believe that homes will be very expensive.

(23:29):
In fact, what did they say? They think that the
average home price is going to be something like seven
hundred thousand dollars. I'm trying to find this in the
story right now. As I was reading this, I thought, it's
already over that in California.

Speaker 3 (23:39):
I'm just gonna say that's like the average is like
a million. I know, it's way over that.

Speaker 2 (23:44):
Yeah, isn't it something like over nine hundred grand in
LA and one point two million in OC?

Speaker 3 (23:49):
Yeah, exactly.

Speaker 2 (23:50):
Americans predict a future where most people rent forever, average
homes cost nearly seven hundred thousand dollars, they say. That's
what they think it's going to be like in forty years.
We're already there in California. If you think that the rest
of the country is going to take that long to
catch up, you're nuts. By twenty sixty five,

Speaker 3 (24:08):
Now, I don't know what's gonna happen.

Speaker 2 (24:09):
I mean, the whole world could change by then and
all of a sudden, California becomes a desolate wasteland nobody
wants to be in.

Speaker 3 (24:14):
I don't know. Maybe we have the big one and
we'll fall into the ocean. Who knows.

Speaker 2 (24:18):
But the idea that seven hundred thousand dollars is going
to be the average home price, I think, is undershooting it.
They do believe, forty percent of people believe, that by
twenty sixty five we will have gotten rid of all
paper money, it'll be all digital currency, and almost the
same number of people believe that we'll have biometric payment methods,

(24:38):
so your, your fingerprint, your eyeball, your retina scan, your
face scan will be tied to your, your bank account somehow.
I think I saw an episode of Black Mirror like this,
didn't I? Yeah, we're well into mark of the Beast
territory here, aren't we? Thank you.

Speaker 3 (24:50):
Yeah.

Speaker 2 (24:51):
Thirty-five percent of people believe that households will be
managed by smart home AI. I think that is undershooting
it as well. I think AI is probably gonna be
far more pervasive in our lives in the next forty years.
I mean, Tuala and I were, Tuala was just saying
that it's advancing faster than anybody could imagine, and that
it's, AI is self-correcting at this point. It's, it's

(25:13):
a learning model, so it learns when it's not right, right,
and it gets better and better and better. So I
think at some point it's gonna be robot roommates, housekeepers,
or caregivers. One third believe that we're gonna have robot roommates,
housekeepers, or caregivers. I think having a robot assistant is
probably a given, and with that we will have implanted health monitors.

Speaker 3 (25:30):
A third.

Speaker 2 (25:30):
of people think that we're gonna have implanted health monitors.
Why would you think otherwise, other than the idea of
having something implanted? But right now, we already have health
monitors that we wear. We've got watches, we've got rings,
we've got glasses, all of these things meant to monitor
our well-being. So to go a step further and
think we're gonna get a microchip, I think, is not

(25:51):
too far-fetched.

Speaker 3 (25:52):
Kind of glossed over something that caught my ear. Robot roommates. Yeah,
robot roommates.

Speaker 5 (25:57):
Man, I'd like to Okay, see, I'd like to go
to a party with roommates.

Speaker 2 (26:01):
Okay, think about this, Foosh, okay. We already have people
turning to ChatGPT for company, they're lonely, they have
a conversation with ChatGPT, or some people are turning
to ChatGPT for mental health, psychological advice. The idea
of having a robot roommate doesn't seem that far-fetched
to me, especially because it's always going to be reflective

(26:21):
of what you want that roommate to be. Part of
the reason that people fall in love with ChatGPT,
where they have these online relationships with the AI,
is because it's reflective of what they want from another individual.
It learns what you want and it gives it to you.
It doesn't talk back, it doesn't tell you to calm down, sweetie,
it doesn't do those things, right? It's the mate that

(26:45):
you choose, like, kind of like when you go on
Tinder and you go, they need to be over six foot,
they need to make this much money, they need to
have this, you know, this needs to be the sexual history,
all these things.

Speaker 3 (26:54):
The bots give that to you. Yeah, I see your
point there.

Speaker 5 (26:56):
I just thought, I guess because in my mind I
just thought of like the typical four twenty stoner roommate
that you can go to in college, Like that's the
robot that you're living with.

Speaker 3 (27:05):
So if they are like, major parties, beer me, bro. Yeah, exactly. Okay,
I got you. Yeah, I love that. All right.

Speaker 2 (27:12):
Will we be able to retire in the future? A
quarter of all people believe that retirement will be financially,
uh, don't believe that retirement will be financially possible. And
they believe that Americans, to retire in twenty
sixty five, would need three and a
half million dollars in order to retire. I don't know
about you guys, but I'm kind of feeling like I
gotta have three and a half million dollars to retire

(27:33):
right now.

Speaker 3 (27:34):
Yeah, I mean, I'll stop you when you tell a lie.

Speaker 4 (27:36):
I mean, so far everything you said, I'm like, check check, yes, yes,
and I don't.

Speaker 3 (27:40):
We're there faster. This is twenty you said, twenty thirty.

Speaker 4 (27:43):
Five? Sixty five. Oh no, no, no, try, I'll, I'll say, okay,
then try thirty five. Yes, within the next ten years,
we're there.

Speaker 3 (27:51):
One hundred percent agree with you.

Speaker 2 (27:54):
Biometric payment methods, absolutely. If you think about this, we
already have it, because, like, I want to pay with
my phone, which is attached to my credit card, so
I tap. In fact, they were blown away. I took,
so I grew up in a really small town in
northern Michigan, and, uh, and I took a vacation there
last week. And so while I was there, got COVID,
but also went to Lowe's to spread COVID, and I

(28:17):
bought something and I, and I double-clicked my watch,
my Apple Watch, and I, and I brought up my
credit card and I tapped to pay using my watch.
They were blown away. They had never seen anything like it.
They go, that is amazing.

Speaker 4 (28:27):
I mean, right now, in our kitchen, if you load,
you can load using your banking account, you can
load money onto the cash register in the kitchen, and
you can use your thumb to check out, which is
true biometric payments. It's already just, thumb, and it takes
the money out of your account.

Speaker 2 (28:46):
So you've got people going, well, by twenty sixty five
we're gonna have biometric payments. I think half
the people they surveyed must have been from my hometown,
that had never even seen tap to pay before. They
had no idea that you could just tap a watch
on your, on the credit card machine, like they had
no idea. So you're exactly right. What did you say, thumbprints
in the kitchen, right? Yep. Already, boop, done. We've
already got biometric payments, already done. We've already got smart

(29:09):
home AI. My fish tank light comes on at a
certain time because AI tells my fish tank when sunrise is. Boop.
It's already there, man, it's already there, all right. Speaking
of the future, what about the past? Some of the
things from the past are evaporating very quickly. In twenty
sixty five, will kids even do this anymore? See if

(29:33):
you think they will?

Speaker 1 (29:33):
That's next. You're listening to Later with Mo'Kelly on
demand from KFI AM six forty.

Speaker 2 (29:40):
All right, the kids these days. Remember when you were
a kid and you couldn't wait to get outside? And
I know I'm gonna sound like it, I'm gonna sound
like an old man again, because I remember when I
was a kid and the parents told you, don't come
home until the street light comes on, right? Don't come home
till it's dark.

Speaker 3 (29:59):
That's what we would hear.

Speaker 2 (30:01):
I don't know how true that is. I think in
certain areas it was probably truer than others. I grew
up in a real rural town, and so I think
there was some truth to that. Parents would say, go
outside and play, but make sure, they would say come
back when the streetlights come on, you come home. But
they always wanted me to be home by dinner too. They
got mad at me if I wasn't there for dinner.
But this whole go outside and don't come back until
the streetlights come on. And that wasn't the case for me,

(30:22):
But it was make sure you come back when the
streetlights come on. But one thing I loved, and I
remember mowing lawns for a long time when I was
I don't know, twelve thirteen, fourteen years old. So I'd
get myself a new bike, right. My parents got me
a bike. I had a banana seat bike. All the
other kids in the neighborhood made fun of me until I
figured out I could do wheelies on the on the

(30:43):
banana seat better than they could.

Speaker 3 (30:45):
It was great. They were like, that's awesome. Your bike
is the best for wheelies. Loved it.

Speaker 2 (30:51):
Now, I saved up and I got
myself a ten-speed or something like that.

Speaker 3 (30:57):
Right, it's not so much anymore.

Speaker 2 (31:00):
Over the course of the nineteen nineties, an average of
twenty million kids from seven to seventeen hopped on a
bike six or more times a year. For me, it
was six or more times a day: rode my bike to work,
rode my bike to safety patrol, rode my bike everywhere,
had a paper route.

Speaker 3 (31:14):
Loved it.

Speaker 2 (31:16):
Only a few decades later, that number has fallen by
nearly half. Just shy of eleven million kids are getting
on their bikes. Just less than five percent of kids
between seven and seventeen years old rode their bikes frequently.
Kids are losing more than a potential mode of transportation. According
to experts, biking supports children's independence and overall health in

(31:37):
a way that many activities cannot. It's a great way
to get moving, build strength, improve coordination and balance,
and it can help reduce a child's future chances of cardiovascular
disease and diabetes.

Speaker 3 (31:47):
Were you guys bicycle riders? Absolutely?

Speaker 2 (31:49):
We all, yeah, oh yeah, thank God. I mean, that
was like, that was your, you know, before you could drive,
that was your independence, right, that was your motive. It
wasn't just about getting to your friend's house. It was
about freedom.

Speaker 5 (32:02):
And I'll say this too, I think I'm not entirely sure,
but I think my generation was the last one to
really do that because I see what you mean is
that you know, now it's just Uber everywhere or someone
else will drive me, not bikes.

Speaker 2 (32:16):
And do you think this is different in a large
metro versus still those rural areas?

Speaker 3 (32:22):
No? No, no, no.

Speaker 4 (32:23):
And it's actually not kids' faults. And I know I'm
guilty of this. It's parents' fault. And it's the fact
that we are more news-conscious and news-heavy,
at least

Speaker 3 (32:36):
From my generation.

Speaker 4 (32:37):
Stranger danger that and just I mean for me, I
look at like even because I got my daughter a bike,
and I think we've allowed her to ride it like
maybe four or five times since she had it, and
those are four or five times we were willing to
go outside and go on a ride with her, because
it's like, she's like, hey, can I ride my bike
around the neighborhood? My very first thought: are you insane?

(32:57):
Do you know what can happen? As soon as we
have a police department or a sheriff department here in
Los Angeles that can start finding and bringing home some
of these children who go disappearing, then.

Speaker 3 (33:09):
Then my daughter, But we don't.

Speaker 4 (33:11):
And maybe it's my fault for working in the news
and I see all this terrible stuff that happens to
kids that just goes unanswered, especially especially in the African
American community. Come on, it's, it's, it's a no, it's a
hell no. And I used to be gone past the
street lights on my bike when.

Speaker 3 (33:30):
I was a kid.

Speaker 2 (33:30):
Do you think some of it may have to do
with kids? Kids are connecting digitally now as well. They
don't necessarily want to go to their friend's house to
play video games because they can log on and be
in the same game room with them.

Speaker 3 (33:41):
They don't have to be in the same.

Speaker 5 (33:42):
Yeah, that's another factor for sure, because I remember just
getting to whoever's friend's house, you get on Friday night, you
get a pizza, you get sodas, and you gather around
and you all play video games in the same room.

Speaker 4 (33:53):
But that but again, that's something that even your parents
put on you.

Speaker 3 (33:57):
Fuh.

Speaker 4 (33:57):
It's like, hey, why don't you just, we'll get you
pizza, because you're not buying the pizza as a kid,
you're not getting the drinks, you're not, you're not paying
for the electricity or the WiFi.

Speaker 3 (34:06):
That's your parents encouraging that.

Speaker 4 (34:09):
And as a parent who encouraged, who, like, literally, I was,
I was a parent who's like, no, you can't
go sleep over at your friend's house. They could come
over here. Yeah, I don't know your friends' parents like that.
I don't know what they do when I'm not around.
They could be mass murderers for all I know. Hell no,
you're not gonna go spend the night.

Speaker 2 (34:26):
I've always been that way too, Like, I'm fine with
having your kids over here, but before my kids go
over there, I got it. Yeah, there's gonna be some
some vetting going on.

Speaker 3 (34:32):
I'm with you now, I love talking to you guys.
I love it.

Speaker 2 (34:35):
Foosh, you're amazing. Tuala, you, you bring this show together.
And Mark, what can I say about you that hasn't
already been said? Be nice. The man, guys, is the best.
We are gonna be back. I'm in for Mo a few
more times in August. I'm looking forward to hanging out
with you guys again. Keep talking with everybody else. I'm
back on Sunday, four o'clock. It's Chris Merrill, KFI

(34:56):
AM six forty, live everywhere on the iHeartRadio

Speaker 1 (35:00):
KFI and KOST HD2
Los Angeles, Orange County, more stimulating talk
