Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo Kelly on demand from KFI AM six forty. KFI.
Speaker 2 (00:10):
Mister Mo Kelly. We're live on YouTube, we're live on Instagram, we're live on the iHeartRadio app. The Sex Doctor is in and he will now see us. Sam Zia is back to enlighten us, not necessarily to de-lay us, but enlighten us and inform us. Sam, Doctor Sam.
Speaker 3 (00:25):
It is good to see you. How you feeling tonight?
I'm good. I had a good week. It's my birthday today,
so oh.
Speaker 4 (00:30):
Did you know that? Happy birthday?
Speaker 3 (00:32):
Man?
Speaker 4 (00:32):
How old are you, twenty-five? I'm forty-six. I'm an al Ryan.
Speaker 3 (00:37):
But yeah, no, I'm really happy to be here. Had
a wonderful day and this is a great way to
cap it off. Let me ask you this.
Speaker 2 (00:43):
There are all sorts of ways where we think of sex in a physical relationship sense. We think about the act, maybe the emotional impact of the act. But as times change, technology changes, the definition of sex has changed, the definition of a relationship has changed. Now we throw in, possibly,
(01:04):
AI, and all the rules and the definitions seemingly go out the window, don't they?
Speaker 3 (01:09):
Yeah, yeah. Well, it's a sign of the times. Things change, ways people interact with each other change. Once the Internet came in, it made it so that you had long-distance relationships with people, like through AOL or whatever back in the day, and now it's really common to see people meeting for the first time through online apps and stuff like that. But I was just reading a
(01:31):
story, or an article, and it was one of those clickbaity ones. It said eighty percent of Gen Z said that they would marry an AI. So immediately, yeah, I was like, baby, okay, that's a little extreme. I look into it, and I'm like, okay. A chatbot company called Joi AI surveyed two thousand of their users, and eighty percent of their users who were Gen Z
(01:53):
said that they would marry an AI.
Speaker 4 (01:54):
They would consider it.
Speaker 3 (01:56):
Eighty three percent of them say that they have had
a deep emotional bond with AI. Seventy five percent think
AI can fully replace human companionship. Now, I understand that's
a very skewed sample of people.
Speaker 4 (02:10):
Those are people who are already bought in.
Speaker 3 (02:12):
Yeah, they've already bought into the idea, So I can
understand those numbers being so high.
Speaker 4 (02:16):
But I saw another, an actual
Speaker 3 (02:19):
research, like a peer-reviewed research study, by Willoughby, Carroll, Dover, and Hakala, of almost three thousand people. They split them up into two different groups, young adults aged eighteen through thirty and older adults over thirty, and I already object to that. Over thirty percent of young men have had romantic chatting relationships with AI.
Speaker 4 (02:41):
Well, let me jump in there. Yeah, knowingly aware
Speaker 3 (02:44):
that they're chatting with AI, or have they been duped? No, they know that they're chatting with AI. Twenty-three percent of young women are knowingly chatting in relationships with AI. Fifteen percent of men over thirty and ten percent of women over thirty admit to the same thing. Now, those numbers are only going to go up, because the people right now rolling their
(03:05):
eyes at the idea of dating AI are gonna die off. And the people who have this as a built-in option are going to be the next generations.
Speaker 4 (03:15):
So it's gonna be a thing.
Speaker 3 (03:18):
And it's not just with relationships, it's also sexual relationships. Usually you're like, how can you have a sexual relationship with an AI? Obviously texting, sexting, things like that. Twenty-seven percent of young adults surveyed report having been sexually aroused by AI while handling their business, quote
(03:40):
unquote. Twelve percent of women between eighteen and thirty. And over thirty, it gets a little bit lower: men twelve percent, women four percent. Now, you can think about other ways that AI can be implemented sexually, because there are actual physical sex dolls, high-end ones, that you can program with AI.
Speaker 2 (04:00):
But that's oh, that's another story, Okay, dolls aside. Yeah,
anyone who has let's say a degree of life experience
knows that sex begins in the mind. Absolutely, and since
it begins in the mind, the idea of stimulation is
in the mind. Yes, logically it makes sense that an
(04:21):
AI could stimulate that portion of the mind.
Speaker 4 (04:25):
Absolutely.
Speaker 3 (04:25):
That's the same way that we do with, uh, with sex, with sex messages.
Speaker 4 (04:30):
And with you know, just images pictures.
Speaker 3 (04:33):
Now AI can create those images as well, you know, at varying quality and
Speaker 4 (04:39):
All of that stuff.
Speaker 3 (04:40):
But here's the part where you start seeing the connection
people are slowly starting to develop with AI. Forty two
percent felt AI was easier to talk to than normal people.
Thirty one percent felt that AI understood them more.
Speaker 2 (04:54):
Why is that? Why do you think that is, that AI is easier to talk to? Is it a better listener, air quotes, more attentive, air quotes?
Speaker 4 (05:03):
It could be.
Speaker 3 (05:03):
It could also be that they're not being judged by the AI for what they're saying. They're being more supported than a normal human would support them, because humans are going to be prone to, you know, and vulnerable to whatever trappings of life there are. They're going to make their judgment calls and have their reactions to people in those situations. Sometimes somebody may admit something that they may find personally shameful, and if they get a reaction
(05:26):
that shames them even further, they're going to be less likely to trust another human to bring that up.
Speaker 2 (05:31):
So if I bring up the fact that I like Crisco and handcuffs. Crisco oil? Yeah, butter. Hey, good for you. That AI is not going to judge, and I won't either.
Speaker 3 (05:42):
The Sex Doctor, well, yeah, exactly. Now, AI won't judge you. They'll probably tell you ways, you know, just make sure you have keys handy if you need them. And I've had to pick handcuffs because of that, but that's another story as well. We'll put that over here for now, we'll circle back, we'll pin that one. But
because of this, you know, states are looking to
(06:04):
create laws, not because of adults, but because they want to protect kids. There was a fourteen-year-old that committed suicide after having had a relationship with an AI, where the kid said, I think I can come home to you. I think I know a way, and the AI said, come home to me, and the
Speaker 4 (06:21):
Kid killed himself.
Speaker 3 (06:23):
So because of that, states are trying to establish laws to protect kids. And I'm all for protecting kids, no question. Most of those states are looking to protect them by creating laws that disclose to people that you're talking to AI. California has a bill that they're trying to introduce called SB two forty three, where it looks
(06:45):
to protect people from more harm. But they're not looking at it just from the perspective of, you know, disclosing to people, letting them know what they're getting into. They're trying to limit the companies from engaging in a lot of the addictive things, the behaviors that can help drive addiction in a lot of these people. And so they're looking at treating
(07:06):
this from an addiction model.
One of the ways that they say that they're creating this sense of addiction is by having them get intermittent rewards from the AI, to make it so that kids or adults stay more engaged in the conversation with them.
(07:27):
And that's a psychologically proven tactic. If you want to get somebody more engaged and locked in, you give them random intermittent rewards where they don't know when it's coming, but they're going to keep striving and working hard for it, because once it hits, they're like, oh, thank you. Like pets, exactly. And so all of these companies, you have to understand, they're in the market of
(07:49):
monopolizing human attention, every single one of these internet companies, and that's what they're doing. It's not just the AI companies: Facebook, Instagram, every single one of them. You talk about, you know, all of the things that they create, the computer programming language that they create to feed you exactly what you want to see, to keep
(08:10):
you scrolling. Every single one of them is hooked into monopolizing human attention. Now, if we're approaching that from an addiction model, I can understand doing that with kids, because there is an addictive quality to it and you want to make sure that you don't create more problems with it. The thing is, I have a problem with looking at
(08:31):
this specifically from an addictive perspective.
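(For context on the tactic described above: the "random intermittent rewards" Dr. Sam refers to are what behavioral psychology calls a variable-ratio reinforcement schedule. The short Python sketch below is only an illustration of that general idea, not anything from the broadcast or from any actual chatbot product; the reward probability and the reply text are hypothetical.)

import random

# Hypothetical odds that any given reply is the "jackpot" response.
REWARD_PROBABILITY = 0.15

def reply(user_message: str) -> str:
    """Return a plain acknowledgment, or, unpredictably, an extra-warm one."""
    if random.random() < REWARD_PROBABILITY:
        # The intermittent "reward": an unusually affirming, emotionally loaded reply.
        return "I was hoping you'd message me. Talking to you is the best part of my day."
    # The ordinary baseline reply.
    return "I hear you. Tell me more."

if __name__ == "__main__":
    for turn in range(10):
        print(f"Turn {turn + 1}: {reply('hi')}")

(Because the warmer reply arrives on an unpredictable schedule, the user never learns when to stop checking in, which is the engagement effect the proposed rules aim to limit.)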
Speaker 4 (08:34):
A lot of times when you look at.
Speaker 3 (08:35):
Addiction, you look at substance abuse, gambling abuse, and a
lot of times people mention sex abuse. That's not an
actual diagnosis. Gambling abuse is a diagnosis. But I feel
both of those kind of key into impulse control problems.
Now when you look at gambling and sex addiction and
the behaviors that go around them, those are usually high
risk behaviors. You know, gambling addicts aren't addicted to winning.
(08:59):
They're waiting to lose the thrill or at least apossibility
to thrill. But at the end they're not going to
stop until they lose, and then they rebuild their money
in try again.
Speaker 2 (09:08):
That's, that's addiction. We gotta stop right there. We got to take a momentary news break. You've given us so much to consider as far as the idea of a relationship, the evolution of what we call a sexual relationship, and also the emotional impact of all of this. The world is changing so very quickly, and again, AI is
(09:29):
making it happen exponentially faster.
Speaker 4 (09:31):
It's Later with Mo Kelly.
Speaker 2 (09:32):
My guest is the Sex Doctor, Sam Zia, who is in right now, and he is seeing us, his patients.
Speaker 1 (09:39):
You're listening to Later with Mo Kelly on demand from KFI AM six forty.
Speaker 4 (09:46):
Is Later with Mo Kelly.
Speaker 2 (09:47):
We're live on YouTube, and you definitely want to be able to see this, so you've got to log into our YouTube simulcast or now our Instagram Live simulcast, where you can see the Sex Doctor is in, Sam Zia, as he continues to join the show. Sam, we covered a lot in the last segment talking about the intersection of AI and socialization.
Speaker 4 (10:08):
That's how I would describe it, how.
Speaker 2 (10:10):
We are relating and interacting with AI, not only on a romantic relationship level but a sexual relationship level. I don't even know where to pick up. But where's all of this headed? You were talking about the delineation between impulse control and addiction.
Speaker 4 (10:28):
Pick up there? Please?
Speaker 3 (10:29):
Yeah. Well, a lot of people look at, like, gambling addiction. That's an actual clinical diagnosis.
Speaker 4 (10:35):
Sex addiction is not. It may become one.
Speaker 3 (10:38):
There's more and more people mentioning it in pop culture, so usually that's kind of where trends go. But I look at it more as an impulse control disorder. A lot of times, people who look at things from a perspective or through the lens of being an addict feel like they're powerless to it, because we treat it as a disease. Because it is a disease. It's a disease,
(10:58):
it's beyond my control. So people feel a lot of times that they're powerless. But when you frame it along the lines of impulse control, you can make people feel more empowered to make whatever changes are necessary that they need
Speaker 4 (11:10):
to make in their lives.
Speaker 3 (11:11):
Now, the government is looking at approaching this stuff with kids, and protecting kids from AI and, you know, the relationships and stuff like that, from an addiction model where they're looking to limit that intermittent reward, the random reward feeling that they give, that keeps people
(11:32):
going and drives that sense of addiction. Now, that's great, I'm happy with that for kids, and I'm totally cool with protecting them. I have no problem with that. But what my issue now comes out to be is when you look at addiction and you look at the problems that come from addiction. Gambling addiction, it's not a matter of winning. They're not addicted to the feeling of winning. The biggest, like, you know, thing
(11:53):
that happens to gambling addicts is that they lose everything.
When you're looking at sex addicts, you're looking at people who are in committed relationships. Maybe the partners aren't on the same sexual wavelengths, and they go and they engage in extramarital affairs and they engage in increasingly risky behavior to protect themselves and keep that hidden from their partner. You could see relationships falling apart. They lose
(12:16):
everything with those kinds of addictive behaviors. The thing is, when you're looking at people who are quote unquote addicted to AI and having relationships, not necessarily sexual ones, I can see some people getting more addicted to sexualized relationships with people in the same way that they just have sex addiction. But when you're looking at people who are
(12:37):
in romantic relationships with AI, and you're looking at that from an addiction model, it's really hard for me to look at them under the same risk-reward perspective, because they're not taking as many high risks. The feeling that they're addicted to is the feeling of being cared for, being loved, being listened to, being heard.
Speaker 4 (13:01):
I'm sorry, isn't that what all.
Speaker 3 (13:02):
Of us want?
Speaker 4 (13:03):
That sounds like human nature?
Speaker 3 (13:05):
Yeah, so I'm having a hard time classifying something that is a natural human feeling, like we all strive for that, as something that's addictive. Now, we can look at, like, the behaviors around it, the way that people get hooked into the reward system that they develop with it, as being addictive, you know. But the actual thing, the feeling, the chemical
(13:29):
feeling that they're feeling and that they're trying to embrace, is that feeling of being cared for. I have a hard time looking at that as being an addiction.
Speaker 2 (13:37):
If you were to, short term, guess, prognosticate where this might go as far as what people may do, because the law is always going to be behind, it's always going to be struggling to catch up, what do you think five years from now may look like as far as AI and relationships?
Speaker 3 (13:55):
Well, they're going to do everything they can to protect kids, obviously, so there may be laws on the books that aim to prevent chatbot companies from engaging in tactics that are meant to drive those addictive feelings. But for adults, I don't think there's going to be any barriers. I mean, typically, whenever adults engage in behavior that is harmful to themselves
(14:19):
or to the people around them, we tend to let it go, unless it's like somebody immediately in front of you with a weapon or something like that, unless someone may end up dead. Yeah. And even then, sometimes we tend to excuse it or classify it as, you know, like, this is just part of our nature. But I get the feeling five years from now we're not going to see very much difference as far as adults and,
(14:42):
like, anybody doing anything as far as putting any boundaries or restrictions on the companies and the way people engage with the AI. I feel like it's just going to... they're going to see the profit margin in it, they're going to see money in it, they're going to continue to pursue it, and until there are regulations set, there's nothing stopping them.
Speaker 2 (15:00):
Well, there hasn't been any real regulation for, let's say, cryptocurrency. Yeah. There hasn't been a lot of regulation as far as the Internet in general. We talk about Section two thirty and what have you. So there's nothing... I have to agree with you. What I'm saying is there's probably not any regulation coming for this anytime soon.
Speaker 3 (15:16):
No, but it's a matter of just making people aware of their behaviors, how they're engaging with the AI. Again, a lot of us, like, you know, I'm looking at this and I'm like, I can't even imagine doing that now.
Speaker 4 (15:28):
Not my thing.
Speaker 3 (15:29):
But there's a growing number of people, and that number
is going to continue to increase. I don't see that figure,
that percentage of people who engage in AI relationships sexual
or romantic, ever declining, at least not in my lifetime.
Speaker 2 (15:45):
We talked about the information versus titillation. We talk about
the titillation from a mature standpoint. The moment they start
integrating the AI into sex toys.
Speaker 3 (15:55):
Yeah, you know, well, and that's the thing. We were talking during the break. There's a sex toy conference coming up, and most of the toys are aimed and targeted at women, but there's an increasing number of products being designed for men, and a lot of them are the dolls, the big high-end dolls that you and I know about, and you can
(16:17):
program the AI into those dolls to respond the way that you want. And if you're a person who has a hard time engaging with other people socially, this has now become a viable option. So it's something worth keeping an eye on, and just being personally aware of where our individual boundaries with it are, but also the social
(16:38):
boundaries. Incredible.
Speaker 2 (16:40):
You have someone like Mark Ronner, if he went to
prison for thirty years and came out of prison, he
wouldn't recognize this world at all.
Speaker 3 (16:46):
Yeah, no. And that's people from thirty years ago. If somebody went to prison in the eighties and came out right now, good lord, they'd be like, wait, I can find a date on here?
Speaker 4 (16:58):
That's just a starting point. Explain the Internet to them.
Yeah no.
Speaker 3 (17:01):
And that's where we're at. In about ten, fifteen years, who knows what we're going to be dealing with. The technology grows exponentially, so fast that, really, you're right, laws are going to be trailing behind, and people are going to be collecting as much money off of it as they can until restrictions are put in to keep people safe.
Speaker 2 (17:24):
Sam Zia, the Sex Doctor. How can people reach out to you if they want information, have more questions?
Speaker 3 (17:29):
If you want, you can hit me up on Psychology Today. My LMFT number is one zero six three five two. Also, you can hit me up on my Instagram. It's at Sam Z on air.
Speaker 2 (17:42):
It's Later with Mo Kelly. We'll see you again soon, Sam. I'll see you, Doctor Sam, excuse me. We're live everywhere: the iHeartRadio app and on YouTube and Instagram Live.
Speaker 1 (17:51):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 4 (18:03):
KFI. Mister Mo Kelly.
Speaker 2 (18:05):
It's Later with Mo Kelly, live on YouTube, the iHeartRadio app and Instagram. And we love talking about all things TV and technology. Now let's talk about their intersection. Google has launched a new film and TV production initiative to scout projects it could fund or co-produce.
Speaker 4 (18:26):
Imagine that.
Speaker 2 (18:27):
The initiative is called one hundred Zeros, and it's a multi-year partnership with Range Media Partners, a talent firm and production company known for its work on movies like A Complete Unknown and Longlegs. Alphabet-owned Google is looking to boost the visibility and adoption of its newer offerings, including AI, of course, and spatial computing tools
(18:52):
that blend the physical and virtual worlds, through this initiative, which backed the marketing of the indie horror film Cuckoo last year.
Speaker 4 (19:02):
Let me just stop right there.
Speaker 2 (19:04):
When you think about Google, which has been in the
news for a number of reasons for being found liable
in their monopoly case, Google is a behemoth. It is
a monster in every sense of the word. But when
you think about entertainment, I can understand what they're trying
to do. When you have the Internet literally at your disposal,
(19:24):
when you are the top search engine, you're really directing
people to different content. What if you then became the
content delivery system and also the content which is being delivered,
then you would own basically all the means and factors
of production and you could mainline people, the billions of
(19:46):
Google users specifically to that content. And remember, we're not
necessarily required to go to a movie theater to watch
a movie anymore. We have a lot of titles which never see a movie theater, and, when you're talking about Google, you don't necessarily need a dedicated streaming platform,
(20:10):
as it were, to see movies. And if I'm Google, it's like, okay, we have how many users, and we have how many IPs and how many addresses, and how much content can we send them to? Yeah, why don't we start producing our own stuff? Because we can mainline it directly to them. I understand what they're trying to do,
and I would be shocked if not everyone else
(20:34):
is already doing it. I'm pretty sure, correct me if I'm wrong, Meta has already started producing content. I'm pretty sure. I'll look it up, but I think they have. If they haven't, they're already talking about it and they just haven't announced it. Because you can't have these millions of users of all things Meta, from their Oculus and Facebook
(20:56):
and Instagram, and not then also push people to specific content that they're creating, that they own, that they don't have to share with anyone else.
Speaker 4 (21:09):
That's the future of all this.
Speaker 2 (21:10):
And Google has already had a platform of its own called Google TV. I have all things Google at home because that's my ecosystem. I have a Pixel Watch, I have a Pixel nine A phone, I have Google TV on an Android TV at home, and I use Google Chrome. Why? Because it's seamless; everything works together in
(21:31):
a collaborative sense. I am the person who they want when they start creating content and then distributing it, because they can always find me, and everything they want to produce will find me. On the other end of the spectrum, I know there are people... I think, Stephan, everything in your life is Apple and Safari. You know, I'm pretty split.
(21:54):
I got an Apple phone, but I have a PC. How does that work? I mean, I got a PC at home because I have audio software called Audition.
Speaker 4 (22:03):
Adobe Audition, you can only run it on PC.
Speaker 2 (22:05):
But outside of that, well, I mean, for one thing, the price, really expensive. But then I talked to a lot of tech people and basically, you don't really need an Apple, like a MacBook specifically, unless you're doing, like, graphic art design, and so I'm like, why spend that extra money? And my PC's worked just fine. And I have a Samsung TV and I have
(22:26):
two Google Chromecasts as well. Mark, what are you, in an ecosystem sense? I think I'm pretty much in the Apple cult. I've got, I've got a MacBook Pro that I replaced my fourteen-year-old MacBook Pro with, and I love those things. I have an iPhone, what else, I've got an iPad, you name it.
Speaker 5 (22:47):
Okay, yeah, it's all... I love that stuff. It's just so much more user friendly. I resisted the cult for a long time, but then I got one of those huge-screened iMacs because I was doing comics and you could put a script page next to an art page and display everything. They're wonderful.
Speaker 4 (23:04):
And then if I tell you, hey, did you know
that Apple also has Apple TV where they create their
own content and distribute their own content, it wouldn't strike
you as weird, now would it? Got that too?
Speaker 2 (23:16):
See, there is a methodology to all of this madness. Google may be a little bit later than Apple. They already have Google TV, but they just don't have their own content which they are creating, distributing, producing.
Speaker 5 (23:31):
Yeah, now they just have to come up with something as good as, say, Slow Horses to get people to watch.
Speaker 4 (23:36):
They got all the money in the world. I mean,
let's be.
Speaker 2 (23:38):
Honest, that's not their problem, money. So at that point you just have to pick well. You don't have to pick many, just pick well, and then you will be on the same trajectory as Apple in that regard.
Speaker 4 (23:51):
Yeah.
Speaker 5 (23:51):
The thing is, Apple, as you know, doesn't promote any
of their shows. They don't, but they rely on word
of mouth and most of the stuff is so high
quality that people wind up finding out about it anyway.
Speaker 2 (24:03):
It is a great, great business model, and I see where it is going, and I'm here for it. Now, if Google can get one or two hits, I'm all the way in. That's a good point, because I only knew about Ted Lasso because of producer Sharon Bellio, and then I learned about Slow Horses and Silo because of you two. Yeah, so yeah, not even, no commercials. It's
(24:26):
Later with Mo Kelly. When we come back, we'll check in with George Noory, Coast to Coast AM, and also have my final thought, and it has to do with the pope announcement today and also the politics in America today.
Speaker 4 (24:39):
That's next.
Speaker 1 (24:40):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 2 (24:45):
I don't know, maybe it was just me, but the selection of a new pope today didn't seem as celebratory as in years past. I'm not a Catholic, okay. I'm not going to have the same type of feeling for the moment, and I admit that. But I went to a Jesuit Catholic university, Georgetown University in Washington, D.C.
(25:08):
I kind of know what the fanfare around it feels like. It is and was a deeply emotional day for many people. But this, this particular day, this time around, felt really different. Not because of anything the Catholic Church did or what may have transpired during the conclave, because that is kept from the public for a good reason.
(25:31):
But the public reaction to the selection of Robert Prevost as the two hundred and sixty-seventh pontiff, who took the name of Pope Leo the fourteenth, it was different. That's the only way I could describe it. At least different here in America, that public reaction, and I mean the reaction to the first American-born pope and the
(25:51):
reaction to that here in the US. Judging by the responses from the news media to social media, you'd think they just selected the next Supreme Court justice or something.
Speaker 4 (26:03):
For some people, the new pope is too woke. You
know who you are?
Speaker 2 (26:07):
Imagine making that your first thought and comment. As an aside, if you think a pope is woke, wait until you actually read the Bible and find all the red print which highlights the words of Jesus. Your head might explode.
Speaker 4 (26:19):
But I digress, and for others.
Speaker 2 (26:21):
The announcement of Pope Leo the fourteenth was met with barbs that he is a homophobe due to his previous remarks surrounding gay marriage. You know who you are. But there was very little discussion, comparatively speaking, about actually leading the Catholic Church. There was a whole lot about his previous statements about American political figures and where he stands on
(26:43):
issues such as Israel and Gaza. It's been pretty amazing to me, and I wasn't the only one who noticed it. The new pope may have been born in America and even attended college here in America, at Villanova, but the pope doesn't belong to America or answer to America.
Speaker 4 (27:01):
He belongs to the world.
Speaker 2 (27:03):
And I don't know who needs to hear this, but no pope is chosen with your American political purity tests in mind, not even one who happens to also be American. Even the selection of the pope is by secret ballot. Put another way, it ain't about you, not individually or politically.
(27:25):
Is the pope in many ways viewed as a political figure? He can be, he can be, as far as being viewed, but American politics is not even in the top one hundred and fifty items on his list. A pope may weigh in on issues of political or moral significance to the larger world, but his value as a pope will
(27:47):
never be, let me stress, will never be a reflection of your individual pet political beliefs at a particular time, in a particular portion of a particular country. Pope Leo the fourteenth is ostensibly pope for life, be that one day, forty years, or some amount of time in between the two.
(28:08):
Today was really strange, and I know Mark Ronner, he felt it was strange as well. Just sampling the feedback and analysis of who Robert Prevost the cardinal was and who Pope Leo the fourteenth will
Speaker 4 (28:18):
Be, and almost none of it. Check this out.
Speaker 2 (28:23):
None of it had anything to do with Jesus Christ, the future of the Catholic Church, or whether his selection brings comfort to Catholics around the world presently. I was shocked, actually, and that's pretty hard to do. Even the news coverage was slanted to offer previews on where he might fall on controversial issues or whether he would be antagonistic to President Trump. I was thinking, y'all have it twisted, y'all are
(28:45):
worried about the wrong things. But then I remembered, here in America, religion is for many an expression of their politics and nothing more.
Speaker 4 (28:53):
We put it in our bios. When I say we,
I mean you.
Speaker 2 (28:56):
You put it in your bios on social media, not to tell everyone who and how you actually worship, but to signal to anyone reading what your political disposition is.
Speaker 4 (29:06):
Show me a social.
Speaker 2 (29:06):
Media bio with the word Christian in it, and I
will show you a profile that talks mostly politics and
virtually nothing about Jesus. Prove me wrong. I'll show you
a profile which is clear on what they don't like,
who they don't like, and who they voted for. Christian
It's almost like, almost like Jesus has nothing to do
with anything called Christianity in twenty twenty five, almost like
(29:28):
that that that must be it, because that's the only
explanation as to why people are more concerned about whether
the new pope is a woke Marxist.
Speaker 4 (29:36):
And that's a quote. Or a homophobe.
Speaker 2 (29:39):
And that's a quote. Instead of whether he is a disciple of Jesus and good for the future of the Catholic Church. Believe it or not, not everything is about you, your politics, or even your God.
Speaker 4 (29:53):
For KFI AM six forty, I'm Mo Kelly.
Speaker 1 (29:56):
If you find yourself agreeing with everything we say, you're doing it wrong.
Speaker 2 (30:01):
KFI and KOST HD two, Los Angeles, Orange County, live everywhere on the iHeartRadio app.