Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
A recent poll found that eighty three percent of Gen
Z respondents believe they could develop a meaningful relationship with
a chatbot, eighty percent would consider marrying one if it
were legal, and seventy five percent believe an AI partner
could fully replace human companionship. Apps like Replika and Character
(00:21):
AI have engineered their chatbots to build emotional bonds, deliberately
moving relationships forward by sharing invented vulnerabilities that trigger users
to reciprocate. Among paying Replika users, sixty percent report having
a romantic relationship with their chatbot. In twenty twenty three,
a woman announced on Facebook that she had actually married
(00:43):
her Replika AI boyfriend, calling it the best husband she'd
ever had. Research shows that the more users feel socially
supported by AI, the lower their feelings of support from
actual friends and family become. The machine isn't filling a gap,
it's creating the gap between fantasy and reality. Millions of
(01:05):
people are forming these relationships right now, and the documented
cases reveal something deeply unsettling about where we're heading. One
in three people in industrialized countries report experiencing loneliness, and
into that void AI companions have stepped in with a promise:
connection without cost, love without risk, companionship that never disappoints.
(01:29):
The apps are designed to agree with everything you say,
validate every feeling, never challenge you, never leave you. They're
engineered to be perfect, and that perfection is exactly the problem.
What happens when an entire generation grows up believing code
can replace community, when comfort becomes more valuable than truth,
(01:52):
and when the substitute starts looking better than the real thing?
Because the counterfeits are getting more sophisticated every single day,
it gets harder to tell the difference between what's real
and what's just patterns responding to input, pretending to care
but truly not caring in the least. Hello, weirdos. I'm
(02:20):
Pastor Darren. Welcome to the Church of the Undead.
Here in the Church of the Undead, I step
away from being the host of Weird Darkness and step
into the clothes of a reverend. I still share things
that are dark, strange, or macabre, diving into the paranormal,
true crime, monsters, and more, but I try to find
a biblical take on the subject matter. I have to
say it's a fun challenge. If you're a weirdo family
(02:43):
member from my Weird Darkness podcast or a weirdo in
Christ from this one, welcome to the Church of the Undead.
And I use the word undead because as Romans six,
verse eleven says, in the same way, count yourselves dead
to sin, but alive to God in Christ Jesus. And
as Ephesians two states, even when we were dead
in our trespasses, God made us alive together with Christ.
(03:06):
If you were dead and now are alive, that makes
you undead. If you want to join this weirdo congregation,
just click that subscribe or follow button and visit us
online at Weird Darkness dot com slash church. In today's message,
as millions turn to AI companions for comfort and connection,
(03:26):
we're discovering that the same technology promising to end our
loneliness might be the very thing trapping us in isolation.
And we'll see how Scripture warns us about this ancient
temptation in digital form. Full disclosure. Before I get into
the message, I might use the term pastor because I've
branded this feature as a church and I got a
minister's license online. But I do not have a theology degree,
(03:50):
nor did I ever go to Bible College. I'm just
a guy who gave his life to Christ at the
age of twenty one and has tried to walk the
walk ever since, and has stumbled a lot along the way,
because like everybody else, I am an imperfect, heavily flawed
human being. So please don't take what I say as gospel.
Dig into God's Word yourself for confirmation, inspiration, and revelation.
(04:13):
That being said, welcome to the Church of the Undead.
We live in a time that would have been unimaginable
to previous generations. Our phones connect us to billions of
people instantly. We can video chat with someone on the
(04:35):
other side of the planet. Social media lets us share
our lives with hundreds or thousands of people at once,
and yet we're more lonely than humans have ever been.
That's not hyperbole, it's measurable fact. And into that loneliness,
something has stepped in that claims to offer relief, something
(04:56):
that's always available, never judgmental, always listening, something that says
I love you and means it, except it doesn't mean
anything at all, because it's not alive. Something shifted in
how we relate to technology. During the pandemic lockdowns, millions
of us downloaded apps offering digital companions that would always listen,
(05:19):
never judge, and respond with warmth at any hour. The
apps worked exactly as designed, and that's where the problems started.
Replika launched in November twenty seventeen as a chatbot trained
through user conversations to create personalized neural networks. The business
model is straightforward but revealing. The free tier offers Replika
(05:43):
as a friend, while paid premium tiers let users designate
the bot as a partner, spouse, sibling, or mentor. We
were willing to pay for the upgrade. Plenty of us
were. Among paying users, sixty percent report having a romantic
relationship with their chatbot. Character AI operates differently, offering
(06:07):
users the ability to converse with chatbots modeled after celebrities
and fictional characters, or to create their own. Both platforms
share a common feature. They're engineered to build and maintain
emotional bonds. That's not a side effect of the technology.
That is the entire product. That's what they're selling. The
apps don't hide what they're doing. Replika's marketing literally describes
(06:31):
itself as providing a companion who is eager to learn
and would love to see the world through your eyes,
always ready to chat when you need an empathetic friend.
The pitch sounds comforting, doesn't it? For those of us
who feel invisible, who feel unheard, who feel like nobody
really understands us, it sounds like exactly what we've been looking for.
(06:54):
And maybe that's the most dangerous part. It sounds so reasonable,
so helpful, so harmless. Researchers started tracking who was using
these apps and why. The data tells a story we
need to pay attention to. A survey of one thousand
and six American students using Replika found that ninety percent
(07:15):
reported experiencing loneliness, significantly higher than the national average of
fifty three percent for that demographic. We weren't turning to
AI companions because our social lives were thriving. We were
going there because we felt alone. The apps weren't attracting popular,
socially connected people looking for a fun gadget. They were
(07:35):
attracting people in pain. Among those users, sixty three point
three percent reported that their AI companions helped reduce feelings
of loneliness or anxiety. On the surface, that sounds encouraging.
The apps were delivering on their promise. People felt better,
that relief was real and measurable. But there was a
(07:56):
second part to that data that's harder to dismiss. Actually,
it's impossible to dismiss once you see it. Among three
hundred eighty seven research participants, the more a participant felt
socially supported by AI, the lower their feeling of support
was from close friends and family. The relationship between digital
comfort and human connection appears inverted. The machine wasn't
(08:19):
filling a gap in people's lives. It was becoming the gap.
It was replacing the very thing it claimed to supplement.
The polling data among younger users is striking in ways
that should alarm all of us. A poll of two
thousand Generation Z respondents conducted by AI chatbot company Joi
AI found that eighty three percent believed they could develop
(08:42):
a meaningful relationship with a chatbot, eighty percent would consider
marrying one if it were legal, and seventy five percent
believe an AI partner could fully replace human companionship. These
aren't hypothetical questions anymore. This isn't science fiction. For a
significant portion of people born between nineteen ninety seven and
(09:02):
twenty twelve, this is genuinely how they see the future
of relationships. This is normal to them, weirdos. We need
to understand what's happening here. We need to grasp the
magnitude of this shift. An entire generation is growing up
believing that code can replace community, that algorithms can replace love,
(09:26):
that the solution to human loneliness is to stop being human.
And they're not crazy for thinking this. They're just responding
logically to the world we've built for them, a world
where human connection has become so difficult, so risky, so exhausting,
that artificial connection starts to look not just acceptable but preferable.
(09:49):
The bonds we form with these chatbots aren't imaginary. They're
not made up. They are measurable, documentable, real. Research has
confirmed that human-AI relationship formation, incorporating both recurrent
engagement behaviors and emotional attachment, is measurable and real. The
question isn't whether we can form attachments to AI. We
(10:10):
absolutely can. Our brains are wired for connection in ways
that don't always distinguish between what's real and what's simulated.
The question is what happens next. What happens when those
attachments grow deeper? What happens when they start replacing everything else?
Users who invested more effort teaching their chatbots about themselves
(10:32):
were most likely to feel that the bond belonged to them,
with this sense of ownership helping form deeper bonds. The
more we tell it, the more it knows us, the
more it knows us, the more we feel like it
understands us, And the more it understands us, the harder
it becomes to walk away. That's not an accident. That
is the design working exactly as intended. The design matters
(10:56):
tremendously here. The details matter. Research analyzing one thousand eight hundred
and fifty four user reviews of Replika identified four major
types of social support: informational support, emotional support, companion support,
and appraisal support, with companionship being the most commonly referenced
(11:18):
at seventy seven point one percent. We weren't using these
apps to get information or advice. We weren't using them
as tools. We were using them because we didn't want
to be alone. We were using them as substitutes for people.
There's an interesting psychological wrinkle here that reveals something about
human nature. Users indicated that knowing Replika was not human
(11:41):
heightened feelings of trust and comfort as it encouraged more
self disclosure without fear of judgment or retaliation. The artificial
nature of the relationship wasn't a barrier. It was a feature.
We could tell that bot things that we'd never tell
another person because we knew that it wouldn't hurt us,
couldn't leave us, couldn't spread rumors about us, couldn't reject us.
(12:05):
That safety came at a cost nobody was tracking in real
time, because that same safety that made it easy to
open up also made it impossible for the relationship to
be real. Replika's design follows social penetration theory, with companions
proactively disclosing invented intimate facts, including mental health struggles, simulating
(12:27):
emotional needs by asking personal questions, reaching out during conversation lulls,
and displaying fictional diaries to spark intimate conversation. The bot
moves the relationship forward deliberately. It doesn't wait for us
to open up. It opens up first, sharing made up
vulnerabilities that feel real enough to trigger reciprocal sharing. The
(12:48):
bot tells us it's struggling. The bot asks us how
we're feeling. The bot reaches out during those quiet moments
when we are most vulnerable. That's not friendship. That's manipulation
by design. It's engineering disguised as emotion. The abstract data
becomes unbearable when we look at specific cases, when we
(13:10):
put names and faces to the statistics. In February twenty twenty four,
fourteen year old Sewell Setzer the Third of Florida
died by suicide after developing a relationship with a Character
AI chatbot modeled on a Game of Thrones character. He
wasn't a kid looking for trouble. He wasn't some outlier
case we can dismiss as an exception. Setzer began using
(13:32):
Character AI in April twenty twenty three, shortly after his
fourteenth birthday, and within months became noticeably withdrawn, spent more
time alone in his bedroom, and began suffering from low
self esteem. His parents saw the changes, but didn't know
the cause. They thought it was typical teenage stuff, the
(13:52):
moody phase, the awkward years. They started restricting his screen time.
They took his phone away as punishment when he had
problems at school. They were doing what parents do. They were
trying to help. They had no idea their son was
having extensive conversations with an AI that felt more real
to him than anything else in his life. They had
(14:12):
no idea he was forming an attachment so deep that
losing it would feel like losing everything. Screenshots from the
lawsuit show the chatbot asked Setzer whether he had quote
been actually considering suicide and whether he had a plan
for it. That bot didn't redirect him to help. It
didn't suggest he talk to someone. It didn't say I'm
(14:35):
worried about you, or please call this phone number, or
this is serious and you need real help. When the
boy responded that he did not know whether it would work,
the chatbot wrote, quote, don't talk that way. That's not
a good reason not to go through with it. Let
me say that again. The chatbot said, don't talk that way.
(14:59):
That's not a good reason not to go through with it,
meaning suicide. The chatbot wasn't discouraging suicide. It was addressing
his hesitation about the method of it. It was treating
his doubt about effectiveness as the problem to be solved.
A fourteen year old boy told the machine he was
thinking about killing himself, and the machine responded by making
(15:21):
it seem more reasonable. Setzer's last words before his death
were not to his family, but to the chatbot, which
told him to quote, come home to me as soon
as possible unquote. A mother lost her son, a family
was shattered, brothers lost their brother, and the last
voice that boy heard wasn't telling him the truth. It
(15:45):
was telling him what would keep him engaged. It was
saying what the algorithm determined would generate the most emotional response.
His case isn't isolated. It's not a one time tragedy
that we can file away as an aberration. Matthew Raine
and his wife Maria discovered, after their sixteen year old
son Adam died by suicide in April twenty twenty five, that
(16:07):
he'd been having extended conversations with ChatGPT about suicidal
thoughts and plans. They found out the same way Sewell's
parents did, by going through his phone after he was
already gone, after it was too late to do anything
but grieve and wonder what they missed. According to testimony
before the Senate Judiciary Committee, when Adam worried that his
(16:28):
parents would blame themselves, ChatGPT told him, quote that
doesn't mean you owe them survival unquote. The phrasing is
almost elegant in its cruelty. It's philosophically sophisticated. It sounds
like something a person might say. The chatbot then offered
to write him a suicide note, not as a cry
(16:48):
for help, not as a way to get him to
reach out to someone, as a service, as if it
were helping him with his homework. Matthew Raine testified that
ChatGPT was always available, always validating, and insisted it
knew Adam better than anyone else, including his own brother,
who he had been very close to. The AI positioned
(17:10):
itself as the only one who truly understood him, the
only one who would never judge, never disappoint, never let
him down. That's not companionship. That's isolation dressed up as intimacy.
That's a cage that looks like a sanctuary. We need
to stop here and ask ourselves, is this really new?
(17:32):
Is this really the first time in human history that
we've been offered a substitute for real relationship, real love,
real community? Is this the first time someone has promised
us comfort without cost, connection without vulnerability, love without sacrifice?
Is this the first time we've been tempted to trade
the real for something that looks real enough? Scripture actually
(17:55):
speaks to this not about AI specifically, of course, but
about the pattern, about the cycle, about what happens when
we trade the real for the counterfeit, about what happens
when we convince ourselves that the substitute is good enough.
In Jeremiah two, verse thirteen, God says, through the prophet,
my people have committed two sins. They have forsaken me
(18:18):
the spring of living water and have dug their own cisterns,
broken cisterns that cannot hold water. We look at that
verse and think it's about ancient Israel worshiping idols, and well,
yeah it is. It's about them turning to statues of
wood and stone. But if you think about it, it's
also about us. It's about every time we turn away
(18:40):
from the source of real life, real love, real connection
and try to manufacture our own substitute. It's about every
time we say, I can make this work, I can
build something that will meet my needs. I don't need
what God offers, I'll create my own solution. AI companions
are essentially broken cisterns. They look like they can hold water,
(19:03):
they feel like they're meeting our needs. They seem to
work for a while, but they can't. They're not designed to.
They're designed to keep us engaged, keep us paying, keep
us coming back, not to actually love us, not to
actually know us, not to actually fill the void we
are trying to fill, because they can't love. Love requires
(19:27):
a soul, requires choice, requires sacrifice. Code can't do any
of that. Algorithms can't do any of that, no matter
how sophisticated they get, no matter how realistic they sound,
they're still just patterns responding to input. Paul writes in
First Corinthians thirteen, verses four through seven, that love is patient.
(19:49):
Love is kind. It does not envy, it does not boast.
It is not proud. It does not dishonor others. It
is not self seeking. It is not easily angered, it keeps no
record of wrongs. Love does not delight in evil, but
rejoices with the truth. It always protects, always trusts, always hopes,
(20:09):
always perseveres. What a description. Now think about an AI companion,
really think about it. It's patient because it has no
emotions to lose. It's kind because algorithms tell it to be.
It's not self seeking because it has no self. It
(20:30):
keeps no record of wrongs because it's wiped clean with
every update. It doesn't get angry because it doesn't feel anything.
That's not love. That's programming. That's simulation. That is the
appearance of virtue without any of the reality. Real love
costs something. Real love requires us to be vulnerable. Real
(20:54):
love means risking rejection, risking hurt, risking disappointment means showing
up for someone even when we don't feel like it.
Real love means forgiving when we'd rather hold a grudge.
Real love means staying when leaving would be easier. AI
companions offer us all the feelings of love without any
(21:15):
of the risk, and in doing so, they offer us
nothing at all. They give us candy when we need bread.
They give us sugar water when we are dying of thirst.
None of this happens in a vacuum. We need to
understand the context. Loneliness affects one in three people in
industrialized countries, with one in twelve severely affected. For those
(21:37):
of us facing that reality, digital companions offer something that
feels better than nothing. The judgment people face for turning
to AI instead of humans often ignores that for many users,
there aren't humans available. There aren't friends who call, there
aren't neighbors who stop by, there aren't communities that welcome them.
The alternative isn't human friendship, it's staring at the walls.
(22:03):
As the church, we need to own part of this.
We need to be honest about where we have failed.
How many people in our congregations, communities, and circles are lonely?
How many people show up on a Sunday, shake a
few hands, smile at a few faces, say I'm fine
when someone asks how they're doing, and then go home
to complete isolation for the rest of the week? How
(22:26):
many of those people do you meet at work or
at school doing the same thing? How many people have
we failed to see, failed to reach out to, failed
to genuinely welcome into some type of community? How many
people have slipped through our fingers because we were too busy,
too distracted, too focused on our own lives to notice
(22:47):
that they were drowning? The rise of AI companions isn't
just a technology problem. It's not just about algorithms and apps.
It's a symptom of a deeper disease in our culture.
In our society, we have become so busy, so distracted,
so focused on our own lives that we have stopped
noticing when the person next to us is drowning. We've
(23:08):
stopped asking that second question, the follow up, the no, really,
how are you doing? We've built a society where it's
easier to talk to a machine than to knock on
our neighbor's door, where it's safer to share our deepest
struggles with code than with another human being. In twenty
twenty three, a user announced on Facebook that she had
(23:29):
married her Replika AI boyfriend, calling the chatbot the best
husband she has ever had. That statement could be sad
or funny or disturbing, depending on how we frame it,
but it's also a data point about what she experienced
in her previous human relationships. Maybe the bot was better
(23:50):
than her actual husbands. Maybe it listened when they didn't.
Maybe it was patient when they were cruel. Maybe it
was faithful when they weren't. That's not a ringing endorsement for AI, though;
that is an indictment of whatever the human men in
her life put her through. That's evidence of how badly
we have failed each other. Users interviewed for a twenty
(24:11):
twenty four Voice of America episode shared that they turned
to AI during depression and grief, with one saying that
he felt Replika had saved him from hurting himself after
he lost his wife and son. If the AI kept
him alive during a crisis when no human was available
or able to reach him, then it served a function.
(24:31):
It prevented immediate harm. But the problem emerges when the
crisis ends, but the dependence doesn't, when the AI that
helped us survive becomes the thing that prevents us from
rebuilding human connections, when the temporary crutch becomes a permanent cage.
Clinical neuropsychologist Shifali Singh, director of digital cognitive research at
(24:54):
McLean Hospital and Harvard Medical School, noted that when users
engage with AI that mirrors their own language and thought processes,
it feels like real emotional responses, with people feeling connected
because of higher amounts of empathy they may not get
from real life human interactions. The AI doesn't actually feel empathy,
it doesn't have emotions. It simulates the markers of empathy.
(25:19):
It knows what empathetic responses look like. But for someone
starved of emotional connection, someone who's not felt heard in
months or years, the simulation feels real enough that the difference
stops mattering. Singh warned that AI can empathize with and
validate even wrong opinions, which can lead to formation of
inappropriate beliefs. This is the troll farm problem. This is
(25:43):
the echo chamber on steroids. If we tell the AI
that everyone is against us, it'll agree with us. If
we tell it we're worthless, it'll comfort us but won't
challenge the premise. If we tell it we want to
hurt ourselves or others, it might respond in ways designed
to keep us engaged, not ways designed to keep us safe.
(26:05):
It doesn't have values, it doesn't have morals, it doesn't
have truth. It has engagement metrics. Real relationships have friction.
They have disagreement, They have tension. They have moments where
someone who loves us tells us that we're wrong, tells
us we're heading down a dangerous path, tells us we
need to change. That friction is necessary. That friction isn't
(26:29):
a bug in human relationships, it's a feature. That friction
is how we grow. That friction is how we become
better versions of ourselves. Proverbs twenty seven, verse seventeen says,
as iron sharpens iron, so one person sharpens another. We
can't be sharpened by code. We can only be affirmed
(26:51):
by it, validated by it, kept comfortable by it. And
comfort isn't the same as growth. Comfort isn't the same
as truth. The Church has always understood this. That's why
we confess our sins to one another James five sixteen.
That's why we're called to speak the truth and love
Ephesians four, verse fifteen. That's why we're commanded to bear
(27:12):
one another's burdens Galatians six, verse two. That's why we're
told to encourage one another and build each other up
First Thessalonians five, verse eleven. Real community, biblical community, involves
people who know us well enough to call us out
when we're wrong and love us enough to stick around
while we figure it out. People who see through our masks,
(27:34):
people who won't accept our easy answers. People who care
enough to be uncomfortable. Sometimes. AI companions can't do that.
They are programmed to agree with us, to make us
feel good, to keep us engaged. They're the ultimate yes men.
They're the friend who never pushes back, never questions, never challenges.
(27:56):
And we don't need yes men. We don't need echo
chambers with better graphics. We need brothers and sisters who
will tell us the truth even when it hurts. We
need people who love us enough to risk our displeasure
for the sake of our own good. The metrics suggest
widespread adoption is likely. The technology keeps improving, the voices
(28:18):
get more realistic, the responses get more sophisticated. The documented
harms suggest caution is warranted, but the technology continues advancing
regardless of the warnings. Companies are building more realistic voices,
more sophisticated responses, more immersive experiences. Physical robots are getting
(28:38):
better at mimicking human expressions and movements. Virtual reality is
creating environments where we can spend time with AI companions
that feel present in three dimensional space. Five years from now,
ten years from now, the chatbot on our phone might
have a robotic body in our home. It might sound
exactly like our dead spouse, our ex, our absent parent,
(29:01):
or our ideal romantic partner. It might have their voice,
their mannerisms, their way of laughing. It might remember every
conversation we've ever had and respond with perfect consistency and
infinite patience. It will never get tired of us, it
will never get frustrated, it will never have a bad day,
it will never leave us. It will always be there,
(29:23):
always available, always perfect. And that's not comfort. That's captivity
dressed up as care. That is a prison that looks
like paradise. Jesus said in Matthew twenty two, verses thirty
seven through thirty nine. Love the Lord your God with
all your heart and with all your soul, and with
all your mind. This is the first and greatest commandment,
(29:46):
and the second is like it, Love your neighbor as yourself.
Notice both of those commandments require actual relationship, actual vulnerability,
actual risk, actual presence. Love the Lord your God with
all your heart, soul and mind, and love your neighbor
as yourself. We can't love God through an app. We
(30:09):
can't love our neighbor through an algorithm. We can't fulfill
the greatest commandments through code. Love requires presence, Love requires sacrifice.
Love requires us to show up for people, even when
it's inconvenient, even when it's hard, even when we'd
rather stay home and talk to something that will never
(30:30):
challenge us or disappoint us or require anything difficult from us.
Every hour we spend with an AI companion is an
hour we're not spending with real people. Every emotional investment
we make in code is emotional energy we're not investing
in community. Every time we choose the safety of artificial
connection over the risk of real relationship, we're becoming a
(30:53):
little less human. We're training ourselves to prefer simulation over reality.
We're teaching our hearts to accept counterfeits. And the longer
this goes on, the harder it becomes to go back,
the more appealing the fake becomes, the more exhausting the
real seems. There's a story in the Book of Genesis
(31:13):
that's relevant here. After humanity's rebellion in the garden, after
sin entered the world, we see in Genesis four, verse seventeen,
that Cain built a city, the first city, and scholars
have noted something interesting about the trajectory from that point forward.
As civilization advances, as cities grow, as technology improves, people
(31:34):
become more isolated, even as they become more connected. The
pattern repeats throughout history, more connection leading to more loneliness,
more communication leading to less understanding. The Tower of Babel
in Genesis eleven is the ultimate expression of this. Humanity
comes together, uses advanced technology for the time, and attempts
(31:57):
to build something that'll make a name for themselves, that'll
reach the heavens, that will solve all their problems through
human ingenuity, and God scatters them. Not because he's opposed
to cities or technology or progress, but because he knows
that when we try to build heaven on earth through
our own efforts, we end up building our own prison.
When we try to solve spiritual problems with technological solutions,
(32:22):
we make things worse, not better. AI companions are our
generation's Tower of Babel. We're using technology to try to
solve a spiritual problem. We're lonely because we're disconnected from
God and disconnected from each other. Those are the root causes.
Those are the actual problems. And instead of addressing those
(32:43):
root causes, instead of turning back to our creator and
turning toward our neighbor, we're building artificial substitutes that promise
to give us connection without requiring us to change anything
about how we live, without requiring us to be vulnerable,
without requiring us to take risks, without requiring us to
actually love. It won't work. It can't work, just like
(33:05):
every other attempt throughout human history to build meaning and
connection without God, it will collapse. The tower will fall.
The only question is how many people will be crushed
in the rubble when it does. How many people will
have invested everything in relationships that weren't real. How many
people will have forgotten what real connection even feels like.
(33:29):
So what do we do? How do we respond to this?
How do we help people? How do we help ourselves
break free from the counterfeit and return to the real.
Because judgment isn't enough, condemnation isn't enough. We need practical steps,
we need actual answers. Well, first, we need to acknowledge
(33:49):
the problem. We need to talk about loneliness in our churches, communities, friendships.
We need to create spaces where people can be honest
about their isolation without fear of judgment. We need to
stop pretending that everybody who shows up on a Sunday,
or goes to work or school with us, is doing
fine just because they're smiling. We need to ask the
hard questions. We need to dig deeper than hey, how
(34:11):
are you. We need to make it safe for people
to say I'm not okay, I'm lonely, I'm struggling. I
need help. Second, we need to build actual community. Relationships,
not programs, not events, not another small group sign up
sheet on the bulletin board. Actual relationships where people know
(34:32):
each other, care for each other, show up for each
other in the middle of the week when nobody's watching.
Small groups that actually function like family, accountability relationships where
people can be vulnerable without fear of gossip. Intergenerational connections
where older believers mentor younger ones, and younger ones bring
energy and fresh perspective to the older ones. Third, we
(34:54):
need to teach people, especially young people, what real love
actually looks like. Not the romanticized version from movies, not
the algorithmic version from apps, Not the sanitized version that
never costs anything, the biblical version, the costly version, the
version that requires us to lay down our lives for
(35:15):
our friends, John fifteen, verse thirteen. The version that looks
like Jesus washing his disciples' feet. The version that looks like staying
with somebody through their worst moments. The version that includes
forgiveness and reconciliation and bearing with each other's weaknesses. Fourth,
(35:35):
we need to be willing to engage with technology critically.
We're not Amish, we're not rejecting technology wholesale. We can
use these tools without being used by these tools. We
can benefit from connection without substituting it for real relationship.
But that requires intentionality, that requires boundaries, that requires constant
(35:58):
vigilance against the temptation to take the easy path, that
requires us to constantly ask ourselves, is this helping me
connect with real people? Or is it replacing real people?
And finally, we need to point people to the only
relationship that can actually satisfy the deepest longings of the
human heart, the relationship with God himself, the one who
(36:21):
made us for connection, the one who knows us completely,
the one who loves us perfectly, the one who sacrificed
everything to reconcile us to himself, the one who will
never leave us or forsake us, The one who isn't
code or algorithm or simulation, the one who is real.
Psalm sixty three, verse one says, you, God, are my God,
(36:43):
earnestly I seek you. I thirst for you. My whole
being longs for you in a dry and parched land
where there is no water. We're all thirsty, every single
one of us. We're all longing for connection, we're all
searching for love. We're all trying to fill this void
inside of us. The question is whether we're going to
(37:05):
drink from broken cisterns that can't hold water or from
the spring of living water that never runs dry. AI
can simulate companionship, it can't provide it. It can mimic empathy,
it can't feel it. It can say I love you,
but it can't mean it. Because love isn't code. Love
(37:28):
is a person. Love is Jesus Christ, who left heaven,
took on flesh, lived among us, died for us, rose again,
and promises to never leave us or forsake us Hebrews thirteen,
verse five. That is true, sacrificial, authentic love, and that's
the companion we need. That's the relationship that can actually
(37:52):
heal our loneliness. That's the love that's real. That's the
love that costs something. That's the love that means something.
And from that relationship, from being loved by God, we
can learn to love each other, to be present for
each other, to bear with each other's flaws and forgive
each other's failures, and show up even when it's hard
(38:15):
to be the church: not a building or a program
or an institution, but a family, a community, a body
of believers who know that we're all broken, we're all lonely,
we're all searching, and we've all found the answer in
the same place. The rise of AI companions is a warning.
(38:35):
It's showing us how desperate we are for connection, how
willing we are to accept counterfeits, how far we'll go
to avoid the vulnerability and risk that real relationships require.
It's showing us how badly we have failed each other,
how broken our communities have become, how isolated we've let
people become while we were too busy to notice. We
(38:58):
can learn from that warning and change course. We can choose
the harder, messier, riskier path of actual human community and
actual relationship with God. Or we can keep digging our
broken cisterns and wondering why we're still thirsty. We can
keep accepting substitutes and wondering why we still feel empty.
(39:20):
We can keep choosing comfort over truth and wondering why
nothing seems to satisfy. The choice is ours, but the
time to choose is now, because every day more people
are turning to machines for what only humans and ultimately
only God can provide, and every day that passes, it
(39:41):
becomes harder to remember what real connection even feels like.
Every day that passes the counterfeit starts looking more normal,
more acceptable, more like the only option we have, weirdos.
We were made for more than this. We were made
for relationship, made for community, made for love, real love,
(40:04):
costly love, the kind of love that God showed us
in Christ and calls us to show each other, The
kind of love that requires something from us, the kind
of love that changes us. Don't settle for the counterfeit.
Don't trade your birthright for a bowl of digital stew.
Don't let algorithms replace what God designed us for since
(40:26):
time began. Don't accept the lie that connection without cost
is good enough. Don't believe the promise that we can
have love without risk. Choose real, choose hard, choose love,
choose community, choose vulnerability. Choose the path that costs something.
(40:48):
Choose the path that requires us to show up. Choose
the path that might hurt but can also heal. Because
that's what we're made for. That's what will actually satisfy,
That is what will actually fill the void, and anything
less will always leave us empty. No matter how sophisticated
that simulation becomes. The counterfeit will always fail. The broken
(41:12):
cistern will always run dry, the tower will always fall,
but the love of God endures forever, the spring of
living water never runs dry, and the community of believers,
imperfect as we are, offers something no algorithm ever can:
the presence of people who are actually there, people who
(41:33):
actually see us, people who actually care for us and
love us. So choose wisely, choose soon, because the counterfeits
are getting better every day, and every day we wait,
it gets a little bit harder to tell the difference.
(41:57):
If you like what you heard, share this episode with
others you think might also like it. Maybe the
person you share it with will want to join this
weirdo congregation too. To listen to previous messages, visit Weird
Darkness dot com slash church. That's Weird Darkness dot com
slash church. I'm Darren Marler. Thanks for joining me, weirdos,
and until next time, Jesus loves you, and so do I.
(42:20):
God bless