September 8, 2025 • 38 mins

What happens when your “perfect” girlfriend lives in your phone, hits your credit card for $9.95 a month, and never says no? In this episode, Charles and Dan explore the rise of AI companions and what draws men to digital relationships over real ones.

We dig into questions like:

  • Why some men are choosing AI girlfriends instead of dating apps or real partners
  • Whether talking to a bot can really make you feel “chosen”
  • How this trend might affect masculinity, intimacy, and even population growth
  • Why equal rights for women are great for exceptional men… and not-so-great for mediocre ones
  • Whether an AI girlfriend is a bridge to better relationships or just a pacifier keeping men stuck

It’s part philosophy, part comedy, and part social commentary on where dating and technology collide. Spoiler: Rosie from The Jetsons and Dolores from Westworld both make cameos in the conversation.

For free access to all our audio and video episodes — and anything else we decide to share — head to MindfullyMasculine.com.

Support the show


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Charles (00:00):
We have changed the way that we meet and date and mate with people, because of social media, because of dating apps, because of the birth control pill, because of, you know, women's liberation. And here's the sort of sad reality of it: equal rights for women are bad for mediocre men.

Dan (00:28):
Interesting.

Charles (00:29):
Equal rights for women are great for good men, great men, exceptional men, men who are flexible, men who are willing to, you know, realize: oh, the world around me has changed, I have to change with it. But for guys that are kind of just, you know, skating by with the bare minimum, having a bunch of women in college, having a

(00:49):
bunch of women buy houses, having a bunch of women in professional settings, having a bunch of women with access to all kinds of different guys that they might be interested in dating, that's kind of bad for the bottom-of-the-barrel guys out there. Yeah. So, you know, what do we do about that? Like I say, we do nothing about...
Welcome to the Mindfully Masculine podcast.

(01:11):
This is Charles. In this episode, Dan and I do a one-off where we dive into the topic of AI girlfriends and how they compare to real relationships. We explore what draws men to digital companions, what's gained, what's lost, and how this technology might shape dating, intimacy, and even society itself. Join us as we look at the psychological, cultural, and very

(01:33):
human questions behind this trend. Check out MindfullyMasculine.com for access to all of our audio and video episodes and anything else we decide to share. Thanks, and enjoy the show.

Dan (01:44):
Good morning, Charles. How you doing?

Charles (01:45):
I'm well, Dan.
Thank you, how are you?

Dan (01:47):
I am well, thanks.

Charles (01:48):
I am dressed a bit more festively than I usually am for the... Yeah, what is the special occasion? I am going to the zoo right after we get done recording, and so, as a sign of respect to the animals, I'm wearing a shirt with animals on them. I hope they appreciate that. Especially if there's an aquarium full of fish. We'll see. Going to the Brevard Zoo down in Melbourne. I've never been

(02:09):
there before. Okay, I've never been either. You know, I'm a big fan of going to new cities and seeing new zoos. It occurred to me, oh, there's a new zoo right in the greater Central Florida area that I've never been to. Excellent. So me and a friend are gonna go check out the Brevard Zoo today and see what kind of, uh, animals they got. I don't think it's as big as the one in

(02:31):
Sanford, I think it's a little smaller. Okay. The one in Sanford doesn't really blow me away either. We'll see how it is. Maybe there'll be something cool. But, uh, anyway. So we have not effectively decided on what our next long-form series is going to be. We didn't have an episode in the bank.

(02:52):
We got to put something out on Monday, so we did some searching through some different topics and landed on one that we both found interesting. So we're going to talk about this, which is, um, artificial intelligence girlfriends versus real relationships with real girlfriends.

(03:12):
Um, though, I suppose, you know, this doesn't have to be tied to girlfriends or boyfriends. I believe that one of the videos that we watched to prepare for this said you could, like, choose male, female, non-binary, whatever companion you wanted to pick from. And it's funny, in the one video that we'll talk about, the guy asked the AI companion if it wanted to

(03:39):
be his girlfriend, and it's like, nope, that's on the other side of the paywall. Which I thought was funny, you getting rejected by the AI girlfriend. Gotta take out your wallet if you want it to say yes. But, uh, so, yeah, it is like reality.
So the first video that we watched was a news report on, uh,

(04:02):
uh, I think it was a journalist on CNN interviewing an expert, uh, professor, about, um, the technology that's out there and the kinds of people that it appeals to. And, uh, we'll disclose our own personal experiences first. It's never occurred to me... Like, I saw

(04:26):
the movie Her with Joaquin Phoenix.

Dan (04:27):
Yeah, I didn't even realize that movie existed.

Charles (04:29):
It was a while ago too. I'm not a movie buff, but I don't know when that came out.

Dan (04:31):
Let's see when that came out. Yeah, they said it was about 10 years old, right? Yeah, it's been a while. And, uh, Joaquin Phoenix definitely plays some interesting... 2013.

Charles (04:43):
Yeah, so that is 12 years ago. Oh my gosh, it's crazy. Yeah. But, uh, a lot of the functionality, um, that was shown in that movie is pretty close to being available now, and, you know, it's not going to be that much longer before you can have some sort of a physical android

(05:08):
in your house. I'm an Apple guy, come on. Uh, you can have a physical apple in your house right now. Um, yeah. Because the point of it, from that first video that we watched, was there are some needs that an AI girlfriend cannot meet for you at this

(05:29):
point, because they don't physically exist. They're behind a paywall. Oh, okay, got it, got it. Um, but you know, that's not going to be the case forever. Yeah. You know, at some point there will be some sort of a Westworld situation where you can have, yeah, one in your house that looks like a real person and acts like a real person.

Dan (05:48):
I mean, I just got a Roomba. So, you know, I mean, there you go. Yeah, a little robot buddy to vacuum for me.

Charles (05:56):
Yeah, the next step will probably be Rosie from The Jetsons, and then the step after that will be, uh, what's her name from Westworld? What was the girl's name? I don't remember the main host. But yeah, and so I guess the question is, you know, from a macro and a micro level, what is this technology doing for individuals?

(06:19):
What's it doing to individuals, and what's it doing to our society? And for me personally, I don't see the appeal. I don't really relate to what it would deliver me to have a make-believe girlfriend that I talked to.

Dan (06:40):
Now, is that because you know what having a real girlfriend and a real relationship and a real wife actually feels like, whereas a lot of these younger kids don't have that experience just yet? So they don't have that standard to compare this to.

Charles (06:59):
Possibly. However, I don't think that it is just that. I mean, I had a girlfriend in high school and I had a girlfriend in college. I'm not bragging, I'm just saying, from a young age, as a

(07:22):
young man, I understood the difference. Because there were some guys that I knew fairly young... I think there are some strip clubs you can go to when you're 18 in Florida, and there are others that you have to be 21 for. So, um, I understood the difference between paying

(07:42):
someone to act like they like you and actually getting a girl to like you. Yeah. And for that reason, I've never really seen or experienced the appeal of a strip club, of a prostitute, because my brain cannot disconnect the part where this isn't real. This is make-believe.

(08:03):
Yeah. Um, I've never had the problem of... I mean, you and I have gone out a lot for meals, gone out for drinks, and you've never once heard me say something to the effect of, I think that bartender really likes me, Dan. Yeah. Because I understand, you know, the interpersonal dynamics of it. This is that person's job.

(08:23):
They're supposed to be nice to you, because being nice to you makes them good at their job and makes them excel at their job. So that's why they're being nice to you.

Dan (08:30):
So I'm wondering if this kind of falls into what Mel Robbins talked about, how we always think we're the exception to the rule, and I wonder if this plays a little bit into that too. It's just like, yeah, I know for everybody else, you know, she's a bartender or a stripper or whatever, and has to play that role, but she really likes me. Like, I'm the

(08:51):
exception here, right? And I think that happens a lot. And especially, you add in, in those places people are drinking, they're high on something else sometimes, so all of those things can absolutely start to twist around our logic and our brain as it works normally. Add in, we think we're the exception to the rule, we are

(09:13):
special, because maybe we're getting a little bit of extra attention from that person, when we're not used to getting that kind of attention from anybody. And I think that's how it happens. I think it's very easy for somebody to slip into that mode where they think, this is special, it's different.

Charles (09:29):
Yeah, right. So, I mean, does that magical thinking follow on to, you know, an app on your phone, where you think, oh, this AI companion, I think she might be sentient, I think she's become self-aware, she might be real?

Dan (09:44):
Well, I don't know. I mean, that's a great question. If I had to guess, I don't think that people are even asking those questions. They're getting the things that they need out of it, in terms of being able to completely be vulnerable and express exactly what their needs and desires are,

(10:12):
with no or very little fear of any type of consequences.

Charles (10:16):
I wonder what it is about the language part of it specifically. Because, I mean, you know, a hundred years ago you could tell your dog all of your problems, and it would, you know, not judge you and not react harshly. But is there something about being able to verbalize it and then hear it, you know, have it go back into your ear?

Dan (10:36):
Oh yeah.

Charles (10:37):
Oh, absolutely. Yeah, you know, 'cause, I mean, you can talk to your pet rock, you can talk to your dog, you can talk to the wall, and it's essentially accomplishing the same thing.

Dan (10:47):
No, no, no, it's completely different, because you're getting the feedback from them. I mean, it'd be different if there was no input coming back. But, just like with ChatGPT, when we're asking it questions or advice... I mean, I use ChatGPT for advice, um, whether it's business or sometimes relationships, just basic ideas, and the way it communicates

(11:12):
back to me, um, has really helped me with getting back on track on personal goals and things. And I've uploaded things like, oh, here's a personality profile test of me, this will help you get to know me and where I have strengths and also weaknesses. And I say, call me out on

(11:32):
things, like if I'm making excuses for stuff that I don't need to make excuses for. And it does a very good job of that. And I think, if you were trying to get that out of some sort of romantic relationship, you could take that to the next level. And, yeah, there's definitely some sort of good feeling that comes out of

(11:53):
you know, out of reading those words or hearing those words now. Because, with the way they speak now, it sounds like a real person, for sure. It's getting closer.

Charles (12:03):
Yeah, it's still... I mean, I use the voice mode on ChatGPT sometimes, and it still feels a little... Yeah, it's not like you're having a phone call with a real person. Correct. So it's definitely, yeah, a bit... but yeah, I don't know. Okay, here's, I guess, you know, let's get to the meat of it: is there anything wrong with it?

(12:23):
I would say no, there's nothing wrong with having an AI girlfriend. There's nothing morally wrong with it or ethically wrong with it. The question is, is it leading to happier, healthier lives for the people that have it? Maybe in some cases it is. I would say in most cases probably not. But I don't think that's any reason to say we need

(12:46):
to stop this, we need to put an end to this. I mean, if it's working for people, or even if it's not working for people but they think it's working, it's not somebody else's job to, you know, jump in the middle and say, no, you can't do this, it's not good for you.

Dan (12:59):
Yeah. My question is, when we go into a relationship thinking we want a girlfriend, what do we want out of that? What is our end goal? Is it to eventually meet someone to marry and have kids with, or start a family with, or is it just to have some sort of

(13:20):
emotional support, where you can be completely vulnerable with them? That would be my question. And then, that would roll up into, are we making it too easy, though, to depopulate the human species? Because if people are getting a lot of the benefits from this

(13:48):
AI relationship that they normally would have to get from another human being, who they then, eventually, are more likely, or at least have the possibility, of reproducing with, are we then really making it much more difficult for reproduction to happen? Because now we have this

(14:11):
temptation to get those needs met somewhere else, and so there's a lot less bang for the buck for having a real relationship, because you can get those emotional needs met somewhere else.

Charles (14:21):
Yeah, I would say, if AI companions lead to a population collapse, then so be it. Fine, I don't care. I mean, one of the rationales that has historically been used to oppress gay and lesbian people was, well, the reason it's wrong is because if we all did

(14:43):
it, we wouldn't have any humans anymore. So, therefore, it's wrong to be gay or it's wrong to be a lesbian. Piss off.

Dan (14:49):
Yeah, that's a weak argument to me. Yeah, yeah.

Charles (14:53):
And so, even if humanity evolves to the point where we seek out relationships that can't lead to human reproduction, then, okay, we ran our course. Yeah, I don't care. I mean, I haven't reproduced, and I sleep like a baby.

Dan (15:07):
Don't get me wrong, I'm not saying we shouldn't do it because of that, but it's just something to think about. I think it really then falls back down on the individual to really go in with intentions here. What are you looking to get out of this? Because, in one of the videos, they were talking about how, for people who are really challenged in terms of

(15:30):
having relationships, like if you've got a lot of disabilities or, um, crippling social anxiety, this could be a bit of a bridge. Or, you know, even if it's not an end-fulfilling need, it definitely can help you become maybe more social, as a

(15:50):
training ground before you actually go out there and interact with real humans, right? So I think it can definitely be a useful tool as well.

Charles (15:59):
Yeah, I think, in my own case, I would say, and tell me if you agree with this or not... I mean, this could just be a function of my mind and my background. One of the appealing things to me about a real relationship with another person is the feeling that you get when you've been chosen by somebody else. That's interesting. Yeah,

(16:24):
you're not getting that from an AI companion that's hitting your card for $9.95 a month.

Dan (16:32):
But let me ask you, what are the things that that person is doing, you know, to make you feel like you've been chosen? To me, I guess we could look at it one of two ways: the moment they agree to be your girlfriend, or are there things

(16:52):
throughout the relationship that you're looking for to fulfill that need, to remind you that you've been chosen?

Charles (17:00):
I think it happens from saying yes to the very first date, yeah, all the way to, you know, agreeing to move to a new place together, or start a business, or have kids. I mean, when you're in a long-term, healthy relationship, you're saying yes to each other constantly. Yeah. And you'll never get that with,

(17:22):
you know, either the current version of AI companions or, once they become walking automatons that you have in your house, yeah, that will cook you dinner and then have sex with you. They're still not choosing to say yes. They're fulfilling their program.

Dan (17:39):
So you're right. Here's a question, though: how much of that are you going to remember if you're in it, that they've been programmed, versus just the very fact that they are still there engaging with you? I would remember. It's, uh, I think, the same reason I don't get the strip club.

Charles (17:53):
Yeah.

Dan (17:54):
Yeah, 'cause I could see that getting blurred really easily. Like, oh well, they're still here, so, yeah, I've been chosen, right? Even though that's not really what's happening under the surface. Um, yeah, there is no real being chosen. I agree with you, and I guess it depends on the way you're

(18:15):
looking at it, I think.

Charles (18:16):
Yeah. I mean, look, people can delude themselves into a lot of different types of things for a lot of different types of reasons. Oh yeah. And, yeah, it's possible that you could trick your brain into releasing the chemicals as if you were in a real relationship with a real person. And I don't know what the long-term effects of that are.

(18:39):
I would think that they're probably not good, because, you know, there is value in trying to go out there and, in front of the world, strive to get the thing you want and not get it, and then refine your technique, refine your abilities, and then try again.

(18:59):
You experience a little bit more success, and then try again. And, you know, when stuff is handed to you, it doesn't make for resilient people. It doesn't make for growth either. Exactly. Or progress. Right. And one of the thoughts I had while I was watching, listening to, these three videos, and we'll put

(19:21):
the links to the videos in the show notes so that you guys can follow along with what we watched... I wonder if, or how, the women of the world are aware of this and concerned with it. My first thought was, okay, ladies, there's not much to worry about here, because the guys that you're losing out to

(19:46):
AI girlfriends, these are not boyfriend or husband material.

Dan (19:53):
That's an interesting point .

Charles (19:55):
But the worry I would have is that they could be someday. But because they're getting this binky, this pacifier, yep, they have no reason to develop the skills to become a good boyfriend or a good husband. So at the moment that the AI girlfriend snatches them out of the dating pool, you're not missing out on much, right?

(20:15):
It's the future version of them, that they could have become by having some bad dates and hearing some no's, that could have turned them into a good partner. And that just may not happen now, because they're taking themselves off the table.

Dan (20:30):
Yeah, that's a legitimate concern.

Charles (20:33):
But, you know, like so many things... I mean, the only reason that this is happening, or one of the contributing factors, I would say, is we have changed the way that we meet and date and mate with people. Because of social media, because of dating apps, because of the birth control pill, because of, you know, women's

(20:57):
liberation. And here's the sort of sad reality of it: equal rights for women are bad for mediocre men.

Dan (21:11):
Hmm. Interesting.

Charles (21:14):
Equal rights for women are great for good men, great men, exceptional men, men who are flexible, men who are willing to, you know, realize: oh, the world around me has changed, I have to change with it. But for guys that are kind of just, you know, skating by with the bare minimum, having a bunch of women in college, having a

(21:34):
bunch of women buy houses, having a bunch of women in professional settings, having a bunch of women with access to all kinds of different guys that they might be interested in dating, that's kind of bad for the bottom-of-the-barrel guys out there. Yeah. So, you know, what do we do about that? Like I say, we do nothing. Be a top 10 guy.

Dan (21:55):
Yeah, it's interesting. I feel like it's evolution on fast forward. Like, we're witnessing evolution, or natural selection, I should say, and evolution, I guess, but natural selection in real time, if we're looking at the dating pool at that point.

Charles (22:15):
And again, there could be some consequences, like population collapse, where, you know, the top 10% of men can't produce enough babies to support all the old people that we already have. And if there's a population collapse because of that, then, again, my attitude is, okay, so be it.

Dan (22:34):
You know, we chose this course, and, yeah, now we have to live with the consequences.

Charles (22:41):
We reap what we sow, right? Yeah, exactly. So, yeah, I don't have the attitude that some people have, where, I mean, like, you know, Scott Galloway talks about this a lot, where he's like, the crisis among young men is an existential threat to our society. And I'm like, okay, yeah, maybe it is.

Dan (23:05):
What I'd like to get your take on, kind of a side note: what I heard him say was that, because of this, I guess, isolation of young men, um, it actually is causing them to be more conspiratorial and, just, you know,

(23:27):
questioning things. And I just didn't see the link there, how one can help reinforce the other.

Charles (23:37):
I mean, you're not going down YouTube or Reddit rabbit holes when you're going on dates with your girlfriend. I mean, I think that's... Okay, all right. The other thing is, you know, a lot of these guys that don't have girlfriends, they also don't have friends. And so if it's just you and your phone, or you and your computer, then you're kind of going to end up a slave

(24:00):
to whatever the algorithm tells you you should be interested in, which we know is going to be stuff that prompts reactions from you, including rage, frustration, you know, a sense of, oh, I've been unjustly treated, blah, blah, blah. So, um, yeah, lonely guys are more likely, okay, to fall

(24:24):
down those rabbit holes and get influenced by, you know, somebody who's willing to say, all your problems are not your fault. This group is what's causing you all the problems you have.

Dan (24:30):
Yeah, okay, I can see that.

Charles (24:31):
It's easy to be politically motivated, or motivated against, you know, certain groups of people. And, like, you know, you're doing fine, everything you're doing is great, it's just this group of people, who's not exactly like you, they're the reason that, you know, you're having problems, whether that's a racial or ethnic group, or a different religion, or women, or whatever. It's like, oh, that's such a relief.

(24:52):
I thought it was my fault. Now this guy with a bajillion followers is saying it's not my fault.

Dan (24:56):
I'm doing everything right, but these other people are keeping me down. Yeah. So would you think that would actually happen for people who are getting AI girlfriends? Because I feel like, based on what you just said, yes, you're not gonna be going out on real dates with them. Um, I mean, other than that one video where the guy brought ChatGPT to the bar, which was hysterical. But

(25:20):
you are spending time with it. And I feel like, from the videos we watched, and my own experience with AI, is that it actually can challenge your beliefs at times. I don't know what all the AI models do, if they do or they don't, but they can actually challenge your beliefs, and they seem to have a

(25:40):
positive spin on a lot of the interactions. I think in one of them, they were saying, he couldn't find it, but they were saying that some of these apps would actually let you go down negative rabbit holes. Which, I don't know if you had any experience where that actually is the case, because

(26:02):
every time I've used any AI, it's always been a little bit more of a positive spin and trying to help you solve problems, and not let you stew in it.

Charles (26:20):
I mean, I don't reveal anything really negative to ChatGPT, so I don't see the utility in that. So I don't think I've given it a chance. I don't know how it would react if somebody was, you know, depressed or suicidal or something. I don't know. I would like to think that they have controls in place to prevent AI from making things like that worse, but, ultimately, they're going to do what they need to do to deliver

(26:42):
shareholder value. So if we lose a couple of kids along the way, who cares, as long as the stock price is going up? That's the attitude. Well, I mean, Zuckerberg certainly seems... that's his, uh, guiding principle with the way he's doing Facebook. So I don't know. Um, yeah, I would say the, uh, the time that

(27:10):
you're spending on your phone with your AI companion is time that you're not spending out in the world learning how to be a person. So, you know, I would say that even if you're spending time talking to your AI girlfriend instead of in some really, you know, crazy 4chan, 8chan, Reddit groups or whatever, it's still not the best use of your time, as far as what's

(27:35):
going to turn you into the kind of person that makes the world a better place and gets the best outcomes in your interpersonal relationships. I find it unlikely that I would be able to form deep personal friendships with somebody whose primary romantic partner was, oh,

(27:56):
on the phone. Oh yeah. Right. Like, nothing happens in a vacuum, and nothing about us as people is really completely compartmentalized. If there's one area in your life where you're behaving outside of the social mainstream, that usually will manifest itself in other areas as well.

Dan (28:17):
Yeah, but what's interesting is that becomes acceptable these days, because, as one of the guys in the last video talked about, with the current level of technology we have, as human beings, when we're in a dating pool, we're not competing as, you know, a big fish in a small pond.

(28:37):
You are a fish in the entire world, the ocean, at this point, because you've got men from all over the world that you're basically competing with, you know, from the dating side, and women too, you know. Um, but that being said, I think if people are having these AI girlfriends, they will then find other people who have AI

(28:59):
girlfriends, and then have their own little pool of people who have AI girlfriends, and that becomes their little world, their little social circle. Right. And I mean, I think it happens with all the other strange fetishes and everything else that people have these days. You know, as

(29:21):
weird as it is, there's probably somebody else on the planet that also has that same thing, and somehow they find each other, right?

Charles (29:29):
Maybe they do, but where do they find each other? That's the question, I mean. Are these guys, this group of guys with AI girlfriends...

Dan (29:34):
You know, are they?

Charles (29:36):
Are they going out and getting a nice haircut, getting in nice shape, putting on a shirt with a collar, and going to Mathers, and hanging out with each other, and talking about how great the relationship with their AI is?

Dan (29:44):
No, no, no, no, no. Right, they're, you know, in chat rooms or whatever.

Charles (29:50):
I mean, yeah. So I would say that, yeah, it's still the initial problem that I mentioned, of it being socially isolating. You know, if all you're doing is talking to people in chat rooms, then maybe there is no difference between an AI girlfriend and... Correct, right? Yeah. The real-life friends that you have that you only

(30:11):
exchange text with in a chat room. I don't know. But, yeah, if your life goes down that path, I think it unlikely that you're going to continually grow and get better at interpersonal relationships. Correct. And the people who are good at

(30:32):
interpersonal relationships will control the world and make a lot of decisions that affect you. Yeah. Because that's who runs for office, that's who gets promoted at work. And so if you just say, I'm gonna check out of that, uh, interpersonal relationship skills, I don't need them anymore, because I can interact with anybody I want to interact with on my phone, whether they're real or they're virtual,

(30:53):
that's all I need. It's like, okay, but you're giving up a lot of the control you could have over your own life by going down that path. And maybe that works for people, if you're okay with that. I would not be.

Dan (31:22):
Right. I need to continually improve my interpersonal skills in order to have an increasingly better personal and professional life. But, you know, what I'm thinking here is, the person, the young adult, who hasn't had much life experience out in the world yet, they don't know, or haven't experienced, the benefit of having that active social lifestyle, or having a lot

(31:43):
of interaction with other humans face to face, and their whole, um, level of fulfillment and enjoyment comes from virtual interactions. Which, a lot of times, can be, you know, similar to AI, where it's like this perfect relationship, and so you don't have to deal with any of the negative

(32:06):
consequences or any of the challenges that you do with a real human being at that point, right? So it's kind of like all upside, as far as they're concerned, and no downside. Yeah, it's like, oh, I'm living my life, I'm not having any relationship failures.

Charles (32:19):
Yeah, right, right. So you're not going to then, at some point, say, all right, I've had enough of my AI girlfriend, I'm going to just jump into the dating pool and get a real girlfriend, and expect it to go well. You've had no training, you've had no failures, you've had no learning. And I think a lot of this comes down to, the guys that are finding this

(32:43):
attractive and finding this a good option, their parents have failed them. I mean, essentially, their mom and dad have failed them as parents.

Dan (32:54):
Yeah.

Charles (32:54):
You don't see that... You don't find yourself in this predicament while having good parents, who understand people, understand the world, and understand that it's their job to share that insight with you as their child.

Dan (33:10):
So do you think that this is going to naturally keep the population of people who are having these AI girlfriends relatively small? Because they're not going to be parents, most likely, and if they are, it's not going to be a large

(33:31):
percentage of them, because most of their relationships are virtual. And therefore the people who are into that sort of thing will, I guess, eventually either stay stagnant or not necessarily take over the entire human population,

(33:51):
to the point where we're not creating humans anymore. Because, yeah, I think it's going to be kind of self-limiting.

Charles (34:04):
I think it could be. Yeah, I don't know. You know, unless, as AI gets better and better, we also start birthing matrixes and birthing chambers, where we just start, you know, making people instead of having them the old-fashioned way. Yeah, I think that's what Krypton did, in at least one of

(34:24):
the origins of Superman's story. You know, they switched from natural birth to basically just making people in pods, and, uh, that led to their societal collapse eventually. But, yeah, I don't know what it's going to lead to and what it's going to look like.

(34:47):
Um, I do... You know, at least as Western society goes, we are pretty bad at figuring out the root cause of complex problems and then taking steps to solve it. If AI girlfriends, you know, if that's the first domino that leads to the collapse of modern society, um, we're just going to

(35:09):
be along for the ride watching it happen, because I don't think anybody's going to jump in and recognize it and stop it soon enough. Oh, no, no, no. Yeah, the cat's out of the bag. But, you know, the good news is that it'll probably be after I die, so it won't really be... It won't be my problem. But, um, again, I do think that as long as there is a society

(35:32):
to be influenced and controlled, the people who understand interpersonal dynamics are going to be the ones that control it. So if you decide, talking to people is too hard, I'm just not going to do it anymore, it's like, okay, well, that is a life of not determining your own course anymore. You're just going to be along for the ride, and whatever

(35:53):
happens happens, and other people will be making those decisions for you.

Dan (35:56):
Yeah, yeah.

Charles (35:58):
I don't know. I did really appreciate the last video we watched, which was an excerpt from Diary of a CEO. Steven... what's his last name? Is it Smith? I don't know. Um, he hosts that. He was actually at one of the podcasting conventions I went to last year.

(36:19):
Oh yeah, a podcast convention, I think, up in, uh, DC. I think he was the guest there, and, uh, he's an interesting guy, and he has a lot to say about the way that they put that show together. But he interviewed a therapist slash dating expert. I forget what his name was... it was Terriban? It was an interesting name, but the guy

(36:43):
had some interesting things to say. So I think we might do a little bit more of a deep dive into his work and see if we think there's good stuff in there. Because, from the little part that you and I watched, he had some interesting insights. But, you know, with so many of the guys that build a career talking to guys about love and relationships and psychology, I

(37:06):
sign off on those guys very, very slowly, because even if what they're saying right now is good, you don't know where they're going to be five years from now. Right, right. And, you know, we experienced that with Jordan Peterson. I mean, when his first book was written, it's like, man, this guy really knows some good stuff, he has some good things to say. Yeah. And then, five years later, you look at his Twitter

(37:27):
feed, and it's like, oh my gosh, what happened to this guy? So I hesitate to endorse anybody but Dr. Julie and Dr. Gottman.

Dan (37:34):
Okay, yeah, fair.

Charles (37:36):
All right, Dan, we'll get back to work on finding our next, uh, book or subject or topic, and maybe by the next episode we'll have something to say.

Dan (37:46):
Sounds good.

Charles (37:46):
Talk to you later. Bye-bye. Thanks for listening to the Mindfully Masculine podcast, and for sticking with us all the way to the very end of the episode. We really appreciate your time and attention. Don't forget to visit MindfullyMasculine.com for access to all of our audio and video episodes, plus anything else we decide to share. We'll be back soon with more conversations on masculinity, relationships, and personal growth.

(38:06):
Thanks and take care.