
May 22, 2024 38 mins


Imagine a future where artificial intelligence is aimed at our deepest connections and desires. That is the unsettling possibility Dustin Freeman and I wrestle with in this episode on AI and its impact on intimacy and sexuality. As technology seeps further into our lives, we confront the jarring prospect that AI's role in intimacy could produce a seismic shift in how we perceive human interaction, sidelining the enriching complexities of real relationships for the convenience of simulated companionship.

As I sit down with Dustin, the pastor of a growing church and a father of five boys, we peel back the layers of why artificial relationships pale in comparison to the messy but beautiful tapestry of genuine human bonds. The fear looms that we're inching toward a world where emotional growth with real people is merely optional. It's a poignant reminder that while AI can attempt to mirror the nuances of human intimacy, it lacks the transformative power of the shared experiences that shape our character and resilience.

Finally, our discussion turns to the evolution of pornography aided by AI, an area of concern where synthetic experiences could eclipse the pursuit of authentic intimacy and sexual connection with a real person. This scenario isn't mere speculation; it's a genuine possibility that demands a response. We advocate approaching technology very cautiously wherever it touches our relationships. While AI might serve some limited purposes in the relational realm, such as an occasional tool for receiving feedback about ourselves, we believe it is dangerous to treat AI as a source of intimacy in itself, or even as a tool for improving our relationships.

Our heartfelt message is one of empowerment, encouraging men to cultivate relational fortitude in the face of seductive digital advances and championing a future where meaningful human touch remains irreplaceable.

Music credit: Music from #Uppbeat (free for Creators!):
https://uppbeat.io/t/ben-johnson/some-kind-of-feelin
License code: QVDADXXPNNVQMPKV

Support the show

If you're enjoying this podcast, Reclaimed and Unashamed, please consider leaving a review and making a donation to help us deliver more life-changing content!

Donate here with PayPal - help me release consistent, well-researched, and zero-cost-to-consumer content by making a monthly recurring donation.

Subscribe on YouTube
Follow us on Instagram
Join Our Private Facebook Group for Men


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Welcome to the Reclaimed and Unashamed podcast. We are helping men to rewire their brains and overcome the shame that often surrounds unwanted pornography use. I'm your host, Colton Thomas, and we are back with episode number 21 with guest Dustin Freeman. We've had him on our podcast before. Dustin is a personal friend and a mentor, and I think this

(00:22):
conversation is really important. It's about AI, how it relates to relationships and intimacy, as well as pornography. Before we get started, I'd just like to say it's good to be back, guys. For those of you who have been tuning into the podcast since the beginning, you know that we've taken a few months off and it's been a little while since I've released an episode, so I guess we can call this season two, maybe, but anyways, I've got

(00:45):
several episodes recorded and ready to air, and so for the next several weeks you can expect consistent episodes released, probably bi-weekly, maybe sometimes weekly. But it is good to be back with you, guys. I'm really excited to share with you these conversations I've been having. I think they're so important. I think they're going to help you move forward if you're

(01:05):
looking to quit your habit with pornography, and that's really our goal here, right? I want to help you get results with powerful knowledge and information and also actionable takeaways from the podcast. So it's my prayer that this podcast does just that for you. And I know that Dustin has some really insightful things to say on the subject of AI, so now I'm going to turn it over to him.

(01:27):
Enjoy it, guys. Dustin, we're really excited to have you back on the show.

Speaker 2 (01:34):
I'm glad you're back, Colton.
Thanks for asking.

Speaker 1 (01:36):
I really enjoyed our last episode. We talked about the epidemic of loneliness in men. I feel like that really struck a nerve with a lot of guys that listened. I heard really great feedback about that episode, and so today we are doing something that, naturally, I think is related to that conversation, and it's a danger, I think, for increasing loneliness to a much higher degree, and that is AI.

(02:00):
You and I have had some conversations about AI. I've heard it come up at this church. I know you guys are thinking about it and talking about it here at St Andrews. And so it's such a vast subject. It's such a hot topic. You can go and look up tons of interesting articles on it. Today, we were hoping to focus on AI and relationships and AI

(02:22):
and pornography and the implications there. So we're thinking about the future and the dangers in the future, but also right here in the present and the now. There's a lot already happening, and so I'm sure we could easily talk about this forever, but I'd love to hear you elaborate more on your thoughts regarding AI.

Speaker 2 (02:42):
Well, it's funny. I'm really bad at knowing how long ago things happened, but when ChatGPT really dropped, like most people, I was immediately caught up with wonderment. It was like a kid in a candy store, like, wow, this is so amazing. It's like the future's almost here. That's really cool. And also, at the same time, horrified by all of

(03:07):
the possible ways that my brain ran off thinking about, you know, what kinds of new problems something like this can create. You know, I don't know about you and others viewing, but I've watched different documentaries and things. What is the one on Netflix about the rise of the internet

(03:28):
and phone addiction?

Speaker 1 (03:29):
Yeah, The Social Dilemma.

Speaker 2 (03:31):
The Social Dilemma. Great little documentary, and one of the things that the people in there talk about is how, starting out, they really felt like, in building the internet, they were going to save the world. It was this powerful democratizing force that was going to give everyone a voice and bring us all together. Like, the amount of pure idealism that was in that. And, you know, I'm 43.

(03:52):
So I was kind of coming of age with the internet, and I felt that, you know. And then you see all these consumer forces that ultimately end up directing how it's used and where it advances. And, you know, a lot of the people in that documentary find themselves in a place where they wouldn't let their own kids use

(04:13):
the services they, you know, built, because of the kind of addictive and fundamentally dehumanizing tendencies that no one great big evil person was in the background creating, but that market forces kind of created as an almost inevitability. And I feel like all those same kinds of things are just in the

(04:36):
air as AI is on the rise. I'm not someone who's necessarily for or against technology. I just recognize that, as long as human beings have brokenness inside of us, any tool that extends our capabilities is going to mean that there is greater good and greater evil done, you know, greater harm. That's just sort of the nature of things.

(04:58):
And so the pornography connection is an important one, and it made me think differently about the possibilities of AI. Not just that there would be AI porn, but, you know, when you think about pornography and how it functions in people's lives, I think in many ways it's so ubiquitous because porn

(05:19):
provides a really cheap way. And by cheap I don't just mean inexpensive financially; I mean it just doesn't cost you anything through porn to have immediate gratification with some of your most basic intimacy needs. It's just really easy, and that's hard to say no to, because those needs are real. And then to just be able to push a button

(05:41):
and have some sense of them being addressed is really attractive, and that's a reward on multiple levels.

Speaker 1 (05:47):
There's an intimacy and relationship reward. It's artificial, but you kind of feel that reward. But there's also a dopamine reward, all kinds of chemicals released. We know from scientists' podcasts I listen to, like Andrew Huberman's, he's constantly talking about how dopamine is something that's meant to offer us rewards toward long-term goals that

(06:07):
we're gradually working towards. So the feeling we get of satisfaction and some of those feel-good chemicals are supposed to be in conjunction with building something long-term that's tangible, of value, right? But pornography doesn't offer that. It's instantly disposable; it's giving you the feeling without all of the realities that accompany it, right?

Speaker 2 (06:26):
Right. Well, it just strikes me, and it struck me immediately, that in the same way that pornography meets a lot of our most basic intimacy needs cheaply, AI would create the possibility of having much more complex, advanced, profound intimacy needs met in very cheap ways.

(06:47):
And so, I mean, again, already there are a number of services out there inviting you to have a relationship with an AI partner in some form.

Speaker 1 (06:58):
Have you heard of Love Plus?

Speaker 2 (07:00):
No, I really haven't looked at a lot of the different services, I just know they exist.

Speaker 1 (07:03):
Yeah, so I was just researching this to prepare for our podcast. It's in Japan; it's very popular in Japan, and I think it's actually out for the Nintendo Switch, I believe. So, a mainstream console. And among men in Japan it's so popular it's not uncommon to see a man going on a date with his virtual

(07:24):
girlfriend at, like, a dinner. Or you go out to a restaurant and there's a man sitting there with his Switch, having this date. And, I mean, that might sound bizarre to us in our culture right now, but I think we're going to see that become more and more commonplace here, and that's something that was developed actually before.

(07:45):
Now AI is getting so much more powerful. So this Love Plus has actually been around for a while, and it's been popular in Japan for a while, but it's going to get increasingly more sophisticated, to the point where I think it's going to have a market here in the United States, and we're probably going to see that kind of thing more often, especially as the dating landscape is already janked up.

Speaker 2 (08:07):
That's right, and it's been shaped by technology in ways that I think most people who are participating in it, at least what I hear from folks, would say have not been whole or healing or constructive or life-giving.

Speaker 1 (08:19):
The stories where people meet and get married are kind of the exceptions, not the norm.

Speaker 2 (08:24):
Yeah, and there's a lot to say about that. But to your point, right now, when you hear about, or maybe if you saw, someone going on a date with, you know, an AI in some form, that would strike us as weird. It would seem deviant. We wouldn't feel good about it. I think the trajectory we're on is that it won't be a

(08:48):
mainstream thing for now. People will feel like it's weird, but eventually there'll be people who are against it and other people who are saying, look, this isn't hurting anyone, it's making

(09:08):
people happy. In fact, and I've even seen this, those who are creating some of these things would say: look, we're offering a cure to loneliness. There's a loneliness epidemic; here's medicine for it.

Speaker 1 (09:20):
Like it'll be a positive for mental health.

Speaker 2 (09:22):
Right.
But the critique that I would want to bring to that is really what I'm thinking about. It's not just, hey, this is weird, deviant behavior, so stop it for its own sake. It's more what I see, and I've got to step back a little bit here to get into this. I don't know if you're familiar with the idea of the male gaze. It's an idea that has floated around in gender studies and

(09:45):
stuff, but basically it's the notion that, for a long time, women and men have learned how to think about women by watching TV and movies and reading magazines, but the people who were making those shows and movies and whatever were looking from a male perspective. And so, I mean, maybe the most obvious way of explaining

(10:07):
something like this would be, like, a woman in a beer commercial or something, where, for a man or a woman watching something like that, you're basically being trained in a very subtle way to see this person not as a person but as an object. Right, it's objectifying. But part of the craziness of it is, again, that it's not just training men to see women as objects, it's also training women to see themselves that way.

(10:29):
But I would say that there are other gazes. We're trained by media and different kinds of things to see ourselves in the world in particular ways, and one of the most powerful gazes out there, I think, is the consumer gaze. So if you just exist in the world, you're constantly being asked what your preferences are. You know, there are these big, powerful entities that are

(10:56):
trying to understand what you want so they can sell it to you, or they're trying to shape what you want so they can sell it to you. It's a little bit of both.

Speaker 1 (11:00):
Right.

Speaker 2 (11:00):
And you learn to relate to the world as a person for whom the truest thing about you is that you have preferences that need to be met, you know? So your whole way of being in the world is consumer-shaped, and we bring that to our relationships with people, so that we expect the person across from us to relate to us the way a business does, as

(11:21):
a customer. You know, what do you want, how do I meet your needs? And so it's extremely self-oriented, and I think dating apps and a lot of things leverage that and are taking us further in that direction. But the possibility of having an AI intelligence

(11:41):
that I can make look the way I want it to, that I can make sound the way I want it to, and whose whole reason for existing and whole purpose is to relate to me in the way I want it to relate to me, right? Think about how that shapes me as a human being. And what's nuts is, again, when people talk about AI and its dangers, you know,

(12:04):
a lot of people run to these Terminator scenarios, and it's like, well, that's pretty sci-fi; who knows, maybe we end up there. But AI doesn't even have to be self-aware for there to be a lot of complicated challenges that we're going to face. And one of them, you alluded to this: we already are competing with technology for face time,

(12:24):
for real interaction with other people. The better technology gets at being the place I want to be and at being a conduit for my communication with other people, the better I get at that, but I'm going to tend, just by number of hours spent, to get worse at this. And it's not just that I'm becoming a little bit worse at

(12:46):
social engagement; it's that everybody else is too. And so there's this negative feedback loop where we're all getting a little bit worse at relating to each other, while the machines are getting better and better at relating to us. And when that's going on, it just makes sense that we would become more and more invested in

(13:07):
relationships with technology and virtual people rather than real people. Because the truth is, real relationships are always hard; they're always complicated. I mean marriage, but I also mean friendship, I mean family. Any kind of human relationship is hard work all the time, but it's also the place of

(13:27):
greatest value in our lives. You know, being able to work through problems with someone else, and learning to forgive and to earn the trust and respect of another human being: that's the most meaningful, beautiful stuff in life. And if I don't have to do that work with someone else, I'm never going to grow.

(13:47):
That's the kind of thing that's going to make me grow as a human being: showing up for the real relationships with the people who I'm growing in love for and with, you know.

Speaker 1 (13:57):
Right.

Speaker 2 (13:58):
But if I have an entity whose whole purpose is to know what I want and to give it to me, I'll never have to do any of that. And as AI gets better and better at providing that, it knows when I go, where I go, it's always available to me. You know, it knows all my preferences, it can do all kinds

(14:19):
of things for me. And, of course, as we're having more and more of our romantic lives happening on our phones anyway, the difference between a real relationship and a relationship with an artificial intelligence will get smaller and smaller. So again, I'm just saying, you know, maybe we get to a future where there are love bots and that's a whole thing, and who knows how near or far away something like that is.

(14:42):
But I just think, with the technology that currently exists, this is already a challenge, and it's just going to get bigger and bigger, such that, again, you get to a place where, if we're buying into this wholesale, if more and more people are having relationships even with friends that are AI instead of real people, right, I mean, this sounds really over the top, but it just starts to sort of unwind the social fabric of our world.

Speaker 1 (15:11):
Yeah.

Speaker 2 (15:11):
So that's not even an explicitly Christian argument. Obviously I'm a pastor, and obviously there are significant spiritual realities to the things I'm talking about. But just in social terms, if I were a politician, or some kind of public philosopher, I would be really worried about this anyway, not
(15:33):
just because I would say God doesn't think you should love a machine, but because I think it will be fundamentally destructive to individual lives, giving them the appearance of something without the reality of it, and then also to our civilization as a whole. Yeah, it'll create people who are stunted and unable to negotiate and work with each other.

Speaker 1 (15:55):
Yeah, and we're already feeling the effects of that. You're right. And The Social Dilemma, that documentary about social media: we're all already used to the idea of algorithms training us, because we see it through social media, and social media has been driving more and more sophisticated algorithms. But everyone's familiar with, you know, an exponential power

(16:15):
multiplier. When's the last time you looked at, like, two to the power of ten? It gets crazy really fast. I really see AI as a multiplier for the kinds of issues we're already facing in our society from the algorithms that social media have created. So you can imagine: if social media is the two, then AI

(16:37):
comes along, very sophisticated, and it will make it to the power of ten, and it will reinforce all this. I mean, it'll make it, like you said, blur the line so much between what's lifelike and what we need and what's real, man. It's scary to think about. And The Social Dilemma wasn't a Christian film either, you know.

(16:57):
So I think people realize it's manifestly obvious these things are kind of tearing us apart.

Speaker 2 (17:04):
Yeah, we're all addicted, and we don't know. There doesn't appear to be a place to stop the train, right, and it's hard to conceive, honestly, of there being any way to get off the train. The way our civilization works in a consumer culture, stuff that sells is going to keep selling, and that's going to drive innovation. And so, honestly, I mean, to bring it

(17:28):
back around to a Christian thing, I will say I think one of the main places, an opportunity, for the church to be the church as it's meant to be in the days and years ahead will be as a place where people are willing, I hope, to do the hard work of having

(17:48):
relationships with other people, to get into the messiness of that in spite of the fact that it's hard, but to believe intrinsically that it's worth it, because of the value in other humans. I like what you're saying about the multiplier thing. I think a thing that helps give some flesh to that, or

(18:08):
to make it feel real, is if you think backwards from where we already are. Yeah, like, okay, you know, it's not that long ago that the power of pornography was limited: you had to go to a store and face a person and buy something, you know, buy a magazine, or drive who knows how long to the specialty video shop. I've got to think of rural areas, yeah, you know, where I grew up, where that's not

(18:30):
just something you see; like, 50 miles over there's just one place with a big triple-X sign up, right. It was only a very motivated person who wasn't embarrassed, a very specific person, who would source those things, right? Now it's ubiquitously with us, yes, but that's only a drop compared

(18:51):
to the reality of the power these things will have.

Speaker 1 (18:54):
Yeah. So, you know, another example is Snapchat. It's kind of an early adopter of a customizable AI. I was on Snapchat the other day, and for a fee you can get Snapchat Advanced or Snapchat Plus, and it allows you to type in some sentences to customize an AI friend in your app. So I think that's going to become something we see

(19:16):
everywhere, right? More and more. And, like you said, I mean, I've been seeing all kinds of research and statistics lately showing how, around 2007, all kinds of negative markers in terms of mental health and social skills declined, and that's when the iPhone came out, right? I think we're going to look back and see the same thing around AI releasing and becoming more and more common.

(19:37):
Again, it's so eerie because we're already seeing the damaging effects of technology, social media, an iPhone in every pocket, and I just can't imagine it getting a whole lot worse. But it could, oh, absolutely, and I think that's why this conversation is so important. I mean, it's so important to have these conversations and for

(19:58):
people to be thinking ahead, because if you're already allowing algorithms to dictate a lot of your time from day to day and take away from your relationships, and you're not guarded against that, it's going to get worse. It's going to happen even more so. So I think this is an important conversation.

Speaker 2 (20:16):
Yeah, I think that's great. So the implicit question in all the stuff we're saying, hopefully, to your point, is that it's not just, hey, everything's going to be terrible, be afraid. You know, that wouldn't be very helpful to say. But it's more to say that part of the power of technology is to create new defaults. You know, I think about even small changes that have

(20:39):
been made to different apps, you know, from when you had to choose to go to the next page, to infinite scroll. Like, that's a huge deal in keeping me on my phone, you know, and many of them are changing the default, so there's a sort of passive movement. To recognize, as you say, that these things are happening creates the possibility of stepping back from them and

(21:02):
being intentional. I'm not, you know, someone who's thinking of going full Amish, where I'm going to completely step away from all modern technology. Of course the Amish have that reputation; the truth is, they do adopt technology, they just do it very slowly. Very, very slowly. But I

(21:36):
think there's some wisdom in that. The delivery of new technology to us isn't something we should just accept in a neutral way; it's something we should be more cautious about, more careful about and more thoughtful about.

Speaker 1 (21:56):
And we can't trust these big tech companies to do that for us. We've got to learn how to. We can't trust them to protect us or our minds, because economics is what's going to drive their decisions at the end of the day, even if they mean well, even if whoever the CEO is makes a speech and means well.

Speaker 2 (22:13):
We've got to learn how to guard ourselves against this stuff. And we can trust corporations; we can a hundred percent trust them to always do whatever is necessary to make a profit, and that is, frankly, to mine our attention and our interests, anything that's going to be addictive. And often it's lowest-common-denominator kind of stuff.

(22:34):
It's going to be fear, or lust, or shame, or one of those kinds of buttons that are the easiest way to move a human being. So those are going to keep getting pushed harder and more effectively and more often.

Speaker 1 (22:48):
So yeah, I'm already using ChatGPT on a near-daily basis to give me bullet points of organized thoughts that I'll take and write out for just different kinds of things, you know, whether it's my work with Reclaimed or preparing for this podcast episode. It's kind of ironic, like, using AI: hey, give me some points that we could talk about in a podcast episode warning about the dangers of pornography and AI.

(23:10):
So I do want to slant this here at the end towards pornography specifically. I want to talk about the potential for AI there, just like how we're talking about how AI is going to make certain things seem almost irresistible and we're going to have to train ourselves to guard against them. We think about pornography and how irresistible that already is

(23:31):
to so many men, and we think about adding an AI component where you can almost, in real time... So, just to expand, guys that are listening, expand your imagination. Once AI and video capabilities get so advanced, and we're already seeing a lot of clips on the Internet showing how much you can do, and it's scary. Like, I've seen some

(23:51):
videos where it's like, hey, turn this barn into a castle and show me walking up to it in a video, and it's just, like, perfect, right?
So with pornography, it's not a question of if, it's when it's going to get to that point. And so again, we're talking about: if pornography is capitalizing on your sexual fantasies and relationship

(24:13):
fantasies, its ability to lure you in is kind of dependent on how well it can do that, right? And so you add AI on as that multiplier, and it's suddenly going to give you the ability to customize your fantasies in real time.
And I think here's one of the scariest things to me: even

(24:33):
putting in, asking AI to depict real people, right? Celebrities, actresses. So, Taylor Swift, you can go and find a news article about this: she bought up a ton of domain names around her name because she knew that eventually AI deepfake pornography would get so sophisticated that she at least wanted to make it a little

(24:55):
harder for people to find that kind of thing. Wow. So, I mean, that just kind of gives you an idea of how serious that's going to be. So the fact that you could ask AI to, against people's will, portray them in a video, a pornographic video, is so frightening. Yeah, it is. Like, what do we even do about that?

(25:17):
What is there to say to that?

Speaker 2 (25:19):
I mean, well, I think, just on the personal moral level, so many of these things are slippery slopes. And I think, in general, where sexual morality is concerned, the best defense is to build a fence. You know, you're much less likely to get... I mean, I think the use of pornography is already

(25:43):
shame-inducing for most people, for many people, especially Christians who do. But if you found yourself, you know, deep-faking your own porn of somebody you have a crush on, like, a person you know... I mean, the moral implications of it are really terrifying, because, again, we take
because, I mean again, we takeseriously that viewing something

(26:05):
like that shapes the way yousee a real person, um, it
doesn't just objectify them, butI think it breaks down a
certain amount of um, I have tothink that it would increase the
possibility of of of realbehaviors being acted upon that
are dangerous, violent.

Speaker 1 (26:21):
Because it's no longer, like... You're no longer watching pornography filmed by someone who lives in Hollywood. Right? You're watching pornography that AI has allowed you to make using someone in your real circle of relationships or something, you know. And you think about the penalties for, for example, putting a camera up and watching someone you know. I mean, there are severe penalties for that. But, like, is the law

(26:45):
going to be quick enough to catch up, to protect people? Right?

Speaker 2 (26:49):
And so, I mean, I guess I'm just saying, on multiple levels, there's a lot of potential for harm to the other person.

Speaker 1 (26:57):
Right.

Speaker 2 (26:57):
For the person being portrayed, or for the person who's been using that kind of pornography and acting out, it seems much greater. But even then, what I was originally going to say is, if a person gets to the place where they've done that, the amount of shame and self-loathing seems like it would be so much greater, because it's no longer an impersonal kind of wrong that's been done in some sense against yourself

(27:19):
and some hypothetical stranger.

Speaker 1 (27:21):
It's much deeper than that. But the way you avoid it is to stay away in the first place, you know, not to start to go down that rabbit hole. And so, as these new tools come about, just keeping a wide berth seems right to me. Yeah, you know, we've had podcast episodes where we talked about there being a lot of healthy and positive

(27:45):
motivations and reasons to remove pornography from your life, right? Or even social media and algorithms, for the most part, to a certain degree, not beyond what's practical or what you can handle. But you've got to start guarding against this stuff.
Yeah, this episode, I think, is helpful because it's more about the dangers. We're trying to help men find a motivation by kind of striking

(28:07):
some fear of the realities of what's to come. I think that's fear of what could happen to you if you don't start with spiritual disciplines or other disciplines in your life to protect against pornography; you could wind up in a place like that.

Speaker 2 (28:23):
Well, so yeah, to connect the dots as clearly as
possible: these things are nascent, they're just beginning
to take form today.
They'll be significantly more powerful tomorrow.
So whatever you've got going on, if you're looking for
motivation to step back from pornography, or from an
addiction to pornography, today, well, that's where

(28:43):
all this is going.
And if you think it's hard today, it's only going to get
harder.
So there's no time like the present to seek healing and to
begin to build habits and practices that will protect you
from these things. Because, as addictive as it is, as
we've been saying again and again, it's hard to imagine
that it's not going to get exponentially more

(29:05):
powerful at getting and holding on to your life.

Speaker 1 (29:09):
Yeah. Well, guys, if that doesn't strike enough fear in
you to start doing something about this... And one last
point I want to make: I think one thing that enables men
to sometimes spend years looking at porn, while remaining
in this place where it's kind of secretive and they're
still living their life, their lives somewhat normal from
the outside while they're watching

(29:30):
porn all this time, playing the porn game. I think the
consequences of watching it will come on more quickly and
more apparently if AI has more powerful dopamine-surging
capabilities. So maybe that'll be a good thing.

(29:51):
Maybe some guys will see, hey, this is too much, and that
line will be clear: I cannot cross this line, I need to
stop. And I hope that by listening to this podcast, that's
what happens for more men.

Speaker 2 (30:03):
My fear is that, as pornography becomes more able to
provide the appearance of a complete relationship
experience, it will take us away from a place where it's
sort of like, well, this is aberrant behavior, in the
sense that it makes it harder for you to have a healthy
relationship. If

(30:24):
you're looking at porn all the time, you're not going to
be able to look at your partner without, you know,
comparing, and that just kind of destroys intimacy.
But if porn can take you all the way to a partner that
talks to you and shows up and meets your higher intimacy
needs, maybe it becomes, well, there's no concern about
this: this is where I'm going to stay, this is where I'm

(30:45):
going to live, this is my sexuality.
I don't know what that will be called, but it will have a
name. There will be some kind of coinage for it: I'm a
person who chooses to have that kind of relationship. And
again, apart from a conversation about gender and sexual
expression preference, those kinds of things,

(31:08):
just simply understand that the consumer power of a tool
like that will be such that it'd just be easier for
anybody to go that way. And when it's getting harder and
harder to form real relationships with people, when we're
lonelier and lonelier, that's going to seem like a more
and more appealing alternative. And it just

(31:28):
cannot give you the same things that a real relationship
can.

Speaker 1 (31:31):
Yeah, AI intimacy. That's one of the terms I hear thrown
around a lot, a term that's getting coined. Guys, if you
hear that, it should scare you: the fact that it's going
to become a more and more common phrase, and become more
acceptable.

Speaker 2 (31:49):
Something you said really kind of shook another thought
loose for me, and it's that you're talking about how maybe
the advancement of these things will, by increasing their
potency, shake some people awake or something.
But maybe one of the ways that could be true, from a
different angle, is that we've already been shaped for a
long

(32:11):
time, in a lot of ways, like I was saying, to objectify
each other.
You know, it's as if I was looking for a mate simply to
find a collection of beautiful body parts or something,
just to have access to those things, like that was the
point of the relationship, rather than a real human
partner, another eternal soul to learn how to love and
trust and forgive and do all those kinds of things

(32:33):
with, things you can only do with another human.
Maybe the emergence of something like intimacy with AI
helps underscore the ways that we've already often been
treating people we have real relationships with as if we
wish they were just AI.
You know, it's like, I'm in a relationship with a woman,
but my fantasy is basically that she would treat me the
way that the

(32:56):
AI treats me already: that she would just do whatever I
want her to do, cater to my preferences.
You know what I mean? Simply cater to my needs. How many
people think that's what they want?
And of course we all want a partner who's kind and who
listens, those kinds of things, but the deeper truth is
that we all need a real, whole human being to relate to,
who

(33:20):
can and will challenge us when we need to be challenged,
who will ask us hard questions, who won't buy our crap,
you know what I mean.
That's the kind of person who will help us actually grow
and mature, and whose respect is worth winning and having.
And so maybe, in some ways, an upside

(33:41):
in the brokenness of AI intimacy is that we can learn to
see how we've already sought that from other people in a
broken way.

Speaker 1 (33:51):
Yeah, I think a great way to end this episode is this: I
would challenge guys to find ways to use AI to improve
your relationships.
Not by going into this deep-end rabbit hole of actually
getting your intimacy from it, but if there is ever an
opportunity to get feedback, then maybe see what you can
do there.
I think you don't necessarily have to go Amish on

(34:13):
it, although maybe one point of this podcast is that the
Amish had it right all along. But there are some ways it
could help improve your life, I think, so seek out those
ways.
Guard yourself against a lot of the things that we talked
about today, and you know, that's what Reclaimed is here
for.
That's the mission of Reclaimed, and we're not

(34:35):
going anywhere.
We're going to have a lot more content, more
conversations, episodes looking into the future, looking
at AI. And our mission is just to see as many men as
possible learn how to be self-aware, learn how to find and
connect with the purpose in their lives, so that their
lives would be better than porn: lives that are
pornography-free, and also not

(34:57):
needing AI intimacy when it comes, so it's not as
tempting; having the relational skills developed to be
able to withstand those kinds of temptations.
And so, anyway, everybody, thanks so much for listening.
Dustin, thanks again for having another conversation.
Yeah, really enjoyed it, man. Until next time.
Yes, sir.

(35:19):
All right, man.
This episode was a little bit of a wake-up call for me. I
don't know about you guys, but it reminded me to guard my
heart and guard my eyes against things now, because what's
to come, I think, will be even more tempting and powerful
and alluring. I'd also like to start talking about action
steps

(35:41):
at the end of episodes, because guys who know me, and who
have done my 10-week program and worked with me, know that
I'm really serious about taking action. Thinking back on
this conversation, I think one actionable step that really
makes sense is to lean into the relationships in your
life, even

(36:02):
the difficult ones, especially if that's your marriage,
but really just any relationships, whether it's in your
family or whoever it might be.
Lean into those and learn to repair them, because when AI
comes along providing this artificial companionship, it's
going to be really tempting to choose that over
relationships

(36:23):
that aren't serving us well but are still important.
You know, people we want to be close to, but it might take
a little work, and with AI relationships, we're not going
to have to work at all.
Instead, AI is going to be working to try to please us,
and so I think we really need to learn the skill of
pursuing and repairing relationships.
I think that's the actionable step for today, because if
you

(36:46):
wait to do it, it's just going to get harder. And we
already know that if you're hurting in the relationship
area and you're feeling lonely, pornography is going to
prey on that as well.
But now we have AI and pornography, and sometimes those
things are going to come together as a package, and
they're going to make a formidable predator on our

(37:07):
emotions and on our loneliness, and they're going to try
to leverage our selfishness against us and against other
people if we allow them to do so.
But we don't have to allow them to do so.
We have agency and we have freedom in Christ. Sigmund
Freud believed that we were slaves to our instinctive
sexual desires and impulses, and here on this podcast
we're calling

(37:30):
that bogus.
We are grown, independent men with a choice, and my prayer
is that you believe that and strive for the right choice,
and you wouldn't be here if you weren't.
You can do it. You can do it, guys. So I hope that's
helpful.
Now that we're back and breathing some new life into the
podcast, I'd really appreciate it, guys: if you haven't
left a review for the podcast,

(37:52):
now's the time. Please go and do that. But that's all I
have for now.
Be looking for another episode to drop in a couple of
weeks, and you guys take care.
And please, if you want to step up and get more involved
with Reclaimed, with what I'm doing and what our community
is doing, I invite you to check out our community and our
app at

(38:12):
communityreclaimedrecoverycom, where you can join.
You'll have to answer some questions, because I make sure
the community is safe. Those questions are there to
safeguard it, to make sure everybody goes through an
approval process and that everybody there is there because
they legitimately want to grow together with other men who
are overcoming pornography.

(38:35):
All right, guys, till next time. Take care, and God bless.