
June 16, 2025 41 mins


The digital landscape has created an entirely new frontier for sexual violence, one that leaves lasting trauma without physical contact. In this eye-opening conversation with Dr. Kristen Zaleski, Chief Clinical Officer for the Mental Health Collective and recognized expert on sexual violence, we explore the disturbing world of technology-facilitated sexual violence and AI-generated pornography.

What began as Polaroid pictures passed around high schools has evolved into sophisticated algorithms capable of creating fake but convincing sexual imagery of anyone—celebrities, teachers, ex-partners—with devastating consequences. Dr. Zaleski, also co-founder of the Neurodivergent Collective, shares shocking insights from her research: 90% of tech-facilitated sexual violence targets women, making it uniquely gendered compared to other forms of sexual assault. Perhaps most surprisingly, adults over 55 represent the second most victimized demographic after teenagers.

Our conversation extends beyond the problems to explore solutions—from Meta's facial recognition technology that helps remove reported content to the critical importance of education. Dr. Zaleski emphasizes that parent-child conversations about consent and healthy relationships don't require lengthy, uncomfortable "big talks"—just consistent, straightforward information.

Listen to understand how this growing epidemic affects everyone from teens to seniors, and what we can all do to protect ourselves and our loved ones in an increasingly digital world.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hello folks, thanks for joining us on Head Inside Mental Health, featuring conversations about mental health and substance use treatment, with experts from across the country sharing their thoughts and insights on the world of behavioral health care. Broadcasting on WPVM 103.7, the voice of Asheville independent commercial-free radio, I'm Todd Weatherly, your host, therapeutic consultant, behavioral health expert, my

(00:21):
distinguished guest and friend and co-conspirator in the endeavor to make the world a better place, Dr. Kristen Zaleski joins me on the show today. Dr. Zaleski is the chief clinical officer for the Mental Health Collective, a residential treatment program in Southern California, providing attachment-focused, evidence-based treatment for adults living with mental

(00:42):
illness and trauma disorders, as well as the chief clinical officer and co-founder of the Neurodivergent Collective, providing evidence-based therapeutic care to diagnose, support and assist neurodivergent adults, and maybe we'll talk about that a little bit too. A psychotherapist and researcher by trade, Dr. Zaleski is a recognized expert on sexual violence in American society.

(01:03):
Her current research focuses on technology-facilitated sexual violence, with a particular emphasis on AI-generated non-consensual pornography. That's interesting stuff. What a world we live in. She is the author of Understanding and Treating Military Sexual Trauma, now in its second edition, the first social work text dedicated to the subject.

(01:23):
Her second book, Women's Journey to Empowerment in the 21st Century, published by Oxford University Press in 2019, examines global human rights abuses through a transnational feminist lens. Her peer-reviewed research explores sexual violence treatment frameworks and qualitative analysis of unique trauma experiences, including child marriage and online sexual

(01:50):
violence. An active figure in the Los Angeles clinical community as a consultant and trainer, she's nationally recognized for her contributions to military and online sexual violence research. She serves as the chair of the Military Sexual Violence Division at the Center for Law and Military Policy, is an active board member of Meta's Global Safety Data Interest Group on online sexual violence and sits on the

(02:10):
advisory board of ASHAO, a global advocacy organization combating female genital mutilation and cutting. Additionally, she is the founding director of Forensic Mental Health and the USC Keck Human Rights Clinic, where she conducts psychological evaluations for asylum seekers who've experienced human rights abuses. Dr. Z, welcome.

Speaker 2 (02:30):
Wow, you did the whole thing.
That's a mouthful.
Thanks for the warm introduction.

Speaker 1 (02:34):
No, well, I mean, I know that you're like this cool
lady who's fun at parties.

Speaker 2 (02:43):
Or the boring one at parties.

Speaker 1 (02:45):
You know, you know, I like nerds unite, and we just find ourselves kind of like over in a circle somewhere talking about whatever interests us the most. But you know, there's so much to unpack, and not only what you research and what your specialty is in, but also the work you do with Mental Health Collective, a program we

(03:05):
collaborate with on a regular basis. You and I have worked together, which I completely enjoy. I don't think I actually know what is AI porn. Tell me what that is. First of all, I have no clue, and I'm grateful not to have a

(03:26):
clue, but you're going to give it to me.

Speaker 2 (03:28):
Yeah.
So to introduce that, I need to introduce sort of the bigger picture, which is non-consensual intimate images, NCII. Another term in the literature is technology-facilitated sexual violence. So you know, it's a play on what maybe you and I knew in our

(03:53):
high school days of a Polaroid picture that got passed around of, you know, some naked imagery, non-consensually, right. Or those released videos of Pamela Anderson and Tommy Lee,

(04:15):
for example, right.
These were all non-consensual examples of sexual and intimate imagery, but with technology it rapid-fires across the world,
right?

Speaker 1 (04:31):
So I've got kids in high school, so you know there've been a few expulsions as a result of somebody sharing something that was inappropriate to share. It was not only inappropriate, but it was also shared without the person's consent.

Speaker 2 (04:47):
That's right.
So with middle school and high school kids, that's where a lot of the advocacy focus is right now on this topic generally, because, you know, part of exploring your sexuality, as the Polaroid picture example exemplifies, is,

(05:08):
you know, some kids choose to share a sexy photo with someone they're dating to sort of tiptoe into, you know, how to negotiate sexual boundaries, and when that gets taken advantage of, we then consider that child pornography, and schools are

(05:28):
taken by surprise when they get this report that this is happening, because it's a whole new universe of how do we stop this. You know, do we go into every student's phone and tell them to erase this text message? Is it in a group chat?

Speaker 1 (05:47):
You know, it's a terrible epidemic, and parents and law enforcement are having to play catch-up really quickly to get on top of this. I guess it gets complicated because that's technically child porn, right? Yeah. Say two 16-year-olds are doing whatever they do. They take a picture of either themselves or one another, and

(06:10):
then now that's an image that's recorded. That's technically child porn, if it's, if it's nudity. Yeah, then they share it. So now they're sharing child porn. Those are pretty serious offenses that are hard to get out from underneath if you get them as an adult. I... wow. That's just, that opens up a whole world of, like, legal

(06:31):
problems and management and, like, discipline problems and everything else for schools. Like, so what's, what are we doing about it?

Speaker 2 (06:40):
Like, what's the, what's the way forward? Well, the laws are trying to get a hold of it, and, you know, I'm not an expert on sort of all 50 states' laws, but I can tell you that it's hard in most states to convict somebody who is purposefully distributing and creating, you know, this, this

(07:00):
type of abuse. If the person, let's say, you know, a partner, took the intimate image, by some state laws they own that image and they can share it with whoever they want, including the internet, right. So there's ownership pieces to this.
And then we have had, as advocates, I think we're doing a

(07:21):
better job in 2025. But in 2017, it was a brand new topic, and people weren't understanding how victims of this crime could be traumatized without ever being touched. You know, people who have had this happen to them are absolutely showing symptoms that you would expect a sexual

(07:44):
assault, a face-to-face sexual assault victim, to show. And it took a lot of convincing, and it still does in some cases, for people to understand how having a naked image of yourself, you know, in your whole high school, would negatively impact you and give you stress-related, trauma-related symptoms.

(08:04):
So, you know, there's different sort of divides here and different police departments. You know, I live in Los Angeles. I still will make a report with a client and I'll come across a responding officer who has never taken this case and doesn't know the law and doesn't understand how to do it. And that's in a pretty big city where, you know, this happens

(08:25):
pretty frequently.

Speaker 1 (08:27):
I can't imagine. You know, a small town, it doesn't do anything, right? Is that, like, it's like the rape kit cases? You know, they just sat there and nothing was ever done.

Speaker 2 (08:38):
Yeah, and the rape kit is a good example. Right, and for the listeners who don't know what a rape kit is, when someone's sexually assaulted, there's sort of a standard of care that you can go into a hospital and every hospital across the nation does certain things. You know, prophylactic medication to prevent pregnancy or sexually transmitted diseases, DNA collection through the use

(08:59):
of swabs, and, you know, things like that, medical care generally for any injuries. We are developing a rape kit for technology-facilitated sexual violence as well. Those do exist, but a lot of law enforcement agencies don't know about them, so it has to be an advocate that usually tells

(09:19):
the victim, hey, you can do steps, you know, A, B, C, to help sort of create evidence and maintain it. But it's hard.

Speaker 1 (09:31):
Well, I imagine it's not just hard to kind of, like, navigate the law. It's also, I bet, it's hard, correct me if I'm wrong, but it sounds like it's probably hard for law enforcement officers to wrap their heads around it.

Speaker 2 (09:47):
Yeah, that's right, because, you know, the issue is consent, always, when we're talking about sexual violence, and so in most of these cases a consensual image was created, right, but the non-consensual part is the wide distribution to the high school group text, or Pornhub, you
(10:11):
know.
So that becomes the non-consensual piece, and officers have a hard time understanding that. Also, when you get to the internet, we're no longer talking about local police departments, we're talking about the FBI, you know, and that system's pretty overwhelmed and overloaded. So, unless it's a pretty popular case, not much is going

(10:32):
to really happen.
So, you know, again, getting better. And if you watch policy discussions, Melania Trump, in February, I believe, came out against revenge porn, which is what this is called, in AI-generated images, and is asking

(10:55):
our national leaders to create laws to protect victims from
this crime.

Speaker 1 (11:02):
Well, victims are DEI, so we got to get rid of victims now too, isn't that right?

Speaker 2 (11:07):
So AI is an interesting discussion. So artificial intelligence can create images of kids, you know, engaging in sexual acts. The question is, is that harmful? The answer from advocates is, if you have somebody who is searching for and creating and distributing sexual images of

(11:33):
children, that that is harmful as a whole to children. Oftentimes AI-generated pornography of children is based on some real photos, so the faces might actually be a child that exists in the world, and then all of a sudden they're

(11:54):
engaging in this disgusting act.
So states like the state I'm in, in California, it's sort of a gray area. We have to report the consumption of child pornography. The question is, do we have to report the consumption of AI-generated child pornography? As the state law exists in California right now, we have to

(12:16):
report it if someone's creating it and distributing, disseminating it, but the consumption of it so far does not appear to be in the law. But it's complicated, and so all of us get stuck when we meet a person, usually a man, who is consuming AI-generated child

(12:40):
pornography, and how we begin to help them, and when we have to do that mandatory report as clinicians. Adult AI-generated porn usually is done with a sextortion or revenge porn angle. It's rare that it's a randomly generated image. For the most part, it's usually partners or predators.

Speaker 1 (13:03):
Taking someone crafting an image.

Speaker 2 (13:07):
Exactly.
They'll take a female celebrity's face, put it on a pornographic movie, and then it looks like someone we know and see in the movies is having actual sex, right, and that's obviously harmful to the person whose image it is. And in the cases that I can tell you, with clients that I've worked with, I always talk about a school teacher that I worked
(13:30):
with, where she lost her job. It was a small town. She was a well-known school teacher. Her ex-boyfriend put her image on these pornographic videos, but because the town isn't educated on AI-generated porn, they assumed she had created a pornographic movie, and so she immediately got reported. Because she works with kids, she lost her job.

(13:53):
She had to leave her hometown. Her whole life was destroyed because of this revenge porn, AI-generated video. Wow, yeah.

Speaker 1 (14:05):
What's wrong with people? You know what I mean.

Speaker 2 (14:08):
I mean, I think that's an interesting piece in the literature so far too. You know, in sexual violence in the military we see men and women equally being victims, right? If you look at the statistics, it's a bit misleading, because women have higher statistics of being sexually assaulted in the military than men, and that just has to do with the number of

(14:33):
the N.
But if you look at, like, the raw numbers, there's equal there. With children who are sexually assaulted, with women or girls it's one in four. With males it's one in eight. So still pretty comparable. But when we're looking at technology and sexual violence,

(14:55):
we're looking at nine out of 10 cases where women are the victims, and I find that very fascinating. If we talk about gender-based violence, technology and sexual violence are very much gender-based, at women specifically.

Speaker 1 (15:10):
What fascinates you about that?
What is that?

Speaker 2 (15:13):
That... I mean, this might sound silly, but in two decades of doing sexual assault research, I've always been dispelling the stereotypes that it's, you know, just women. You know, I've been trying to show that this is an epidemic that is non-discriminatory of gender, and in this case it's not that way.

(15:33):
It is absolutely a form of gender-based violence. So I just, I guess, my 20 years of trying to dispel that myth, I can't right now, and it feels harmful and hurtful. Right, it's like, this is, this is... I mean, all forms of rape, of course, are violent and personal, but you know, this is so premeditated.

(15:56):
Right? To create a video, to harness an image, you know, there's so much emotion, hurt, length of time, you know, to do these things and to distribute this material. And, you know, a lot of the survivors I work with, they will work with, like, internet companies like Reddit or Pornhub, and try to get the images down, and the perpetrators

(16:19):
are watching and they'll repost it. So it's not just the single-incident trauma, it is ongoing, and as a treating therapist I'm really stuck because, you know, with sexual assault there's a start and end point, right. Domestic violence, it's a bit longer, you know. They might be in that relationship for years, but

(16:41):
usually by the time they get to me there's been an end point. With technology and sexual violence there's rarely an end point. You know, I am now working with mothers who are having to tell their teenage sons, hey, if you ever consume porn online, I just want to warn you, I'm one of the most-watched videos on Pornhub, and it was not

(17:01):
consensual.
Those conversations are crazy.

Speaker 1 (17:06):
Well, yeah, let's say, actors, don't Google mommy, right? Okay, I think what you're talking about, to a certain extent, in trying to dispel the myth, is about how we want to have an answer and categorize something.

(17:27):
Oh, this is something that men do, something that women suffer from; this is something that women do and men suffer from; and never the twain shall meet. We want these simple answers, and none of them are simple. They all exist on a spectrum. It's not... Men suffer from sexual violence. They absolutely do. And for children, depending on the environment, it's what's

(17:51):
available. Boys in the Catholic Church, they're more affected than females. And so you can start picking apart the numbers and nuance it by category, nuance it by what is happening for a person in a school system or in a church system or something else, and you can find the numbers.

(18:12):
And then you look at this kind of, you know, this large scale, and even transnational if you will, and you start to piece these kinds of things out. Do you think that the AI-generated porn piece, being more male... being, you know, the co-founder of the Neurodiverse Collective, you know that, by and large, from a brain scan standpoint, men
(18:36):
have a tendency to be a little more visual by nature. We suffer from tech addiction a little more than females, numbers-wise, and so there's also the disconnect piece. I can sit here and perseverate on something that's in front of me. It's not a real human being, it's a screen, and I can do...

(18:57):
Do you think that there are linkages to this kind of being on a spectrum and having this stuff kind of show up and be OCD, like it's obsessive? It keeps showing up, the person keeps reposting it and creating more and everything else. Do you see linkages in one diagnosis to this trauma area?

Speaker 2 (19:17):
You know, I think that's a better question for someone who studies perpetration more than I do, but I can tell you that in the autism community I'm dealing with way more victims than I am perpetrators.
It's, you know, we have found that sexual violence is much

(19:39):
more common than we anticipated, and seen more readily than you know. We treat trauma at the Mental Health Collective. We're known for that. So we have a lot of survivors at the Mental Health Collective, but the Neurodivergent Collective, it's, you know, just as much or more.
I do understand where you're going, though.

(20:24):
I think that, you know, that could maybe be the case 15 years ago, but now they know how to grab an app.

Speaker 1 (20:27):
Follow the directions, can do it. Pick some pics and say, hey, create this, and they'll get a product.

Speaker 2 (20:34):
That's right.
So, you know, that, and I think there's a responsibility on us, or really on tech creators, on how they can monitor that. You know, with my consulting with Meta, one of our first meetings, it was pre-pandemic, which feels like 100 years ago,

(20:54):
I don't remember if it was 2017, or '18, or '19, but one of the... they were asking, you know, when these things get posted on their platforms, what are the advocates' ideas for taking them off? And because Meta has that, um, face recognition software, right, where you can update, you know, I don't, I don't know if you're

(21:15):
on Facebook, but you know, you can upload an image of yourself and it'll identify everybody in the picture, right, based on who's been tagged before. And so they can now use that technology when someone reports a sexual image and try to find it and take it off their platform that way, but a lot of other platforms won't take

(21:35):
responsibility for it. And so, you know, that, I think, is something we need to have a bigger conversation about. It should not be this easy to create harmful pornographic material, and with AI, it can easily detect if it's porn, right? We can put those filters on these platforms.

(21:56):
So let's do that to create a sense of, you know, not a porn platform.

Speaker 1 (22:02):
You can't post it here, um, and then probably on platforms that are porn platforms, like Pornhub, some measure, some additional measure of validating source. You know, I really don't know anything about that end of the world, honestly, but you know it's harmful and it gets out

(22:23):
there.
I mean, you've got major industries advertising on Pornhub. Now, you had some guy run for office and he put some of his advertisement on Pornhub. That's how prominent it is.

Speaker 2 (22:35):
It's prominent, and Pornhub has been in a lot of lawsuits. Uh, because they have, you know, in some... I don't want to get entangled in these lawsuits, but, you know, it is believed that Pornhub is aware that some of the videos that they put up for free, you know, to get people to pay for their platform, those videos have been reported as sexual assaults

(22:58):
and non-consensual.
But Pornhub, you know, to my understanding, has not been motivated to take it down.

Speaker 1 (23:05):
Hands off.
It generates revenue, right, right, right.
So what you're telling me is,the problem is revenue
generation.

Speaker 2 (23:13):
Capitalism, at the end of the day.

Speaker 1 (23:16):
Capitalism is the problem.
I knew it.

Speaker 2 (23:18):
Capitalism is the problem, yeah, yeah. And we were talking earlier about men and women, and, you know, how we sort of separate this in a sexual violence lens, and I think, you know, a lot of tech companies are owned by men, right? A lot of people who code and create the algorithms are male.
(23:40):
Tech is still very much a male-dominated industry, and I'm not sure that the awareness and the empathy of the effects of this are there, like a female-dominated industry might be more conducive to lean into, you know.

Speaker 1 (23:58):
Yeah, so how do we titrate that in? What are your thoughts about the solution for something like that?

Speaker 2 (24:10):
Well, we haven't talked about senior citizens, and we don't have to, but I do think it's important to note older adults above the age of, I think it's 55, are the second most common victims of revenge porn. Really? Yes, no kidding. And so, you know, I think we talk about it from a child perspective, and we should, because they're developing minds

(24:31):
and their safety in the worldreally matters.
But I think as a society, weneed to be talking about this
generally.
You know, in the same way thatyou hear about, you know, STD
epidemics and nursing homes,Right, and that's doing
education about sort of how tohow to have safer intimacy.
You know, I think we need tohave that discussion too.

(24:54):
I'm a big proponent of tech education at a school level, you know, as part of sex ed, for example. But I also, you know, you're not required to take a class as a parent, but I do think we need to push more parent education and, vis-a-vis, hopefully, grandparent education, and what

(25:15):
this is and how to keep yourself safe.

Speaker 1 (25:21):
Well, pushing sex ed is always... It's this big topic. I mean, I know why it is. It seems silly that it gets restricted in the way that it does, because it's really important that kids know. You know, one of the things, and I think this is, if we're talking about porn addiction and we're talking about these images, right, something I told my, my kids, my two, my two boys

(25:41):
is, I said, look, you're gonna run across it, it's gonna happen, it's out there, it's all over the place, it's free, it's accessible, there's no stoppers, not really. You press a button, says I'm 18, doesn't check. The thing that you need to realize is that the more you

(26:01):
train your brain and your eye, if you will, quote-unquote, to utilize those images for stimulation, for where you get your arousal, the more that image becomes the thing that arouses you and the less the real thing one day

(26:23):
stimulates you, and you can create a problem for yourself. You can create a cognitive issue for yourself if you spend too much time with this material. My advice to you is stay away from it, and, you know, even if you do run into it, impose limits for yourself, because one

(26:47):
day you'll have a partner, and that partner will look nothing like anything that's online.

Speaker 2 (26:52):
That's right, and won't want to have sex in those
ways that you're seeing.

Speaker 1 (26:55):
In those ways, either. You know, all these, these positions are designed for camera. That's right. For the little amount of porn that I've seen in my life, I can promise you that's not what you're going to probably do in the pet trip.

Speaker 2 (27:10):
That's right.

Speaker 1 (27:12):
I'm just guessing, I don't want to know, but let's separate these two things. Yeah, yeah, and in order to have, you know, healthy intimacy and a healthy sex life. And, you know, intimacy is not just about sex, it's about touch and it's about togetherness and it's about all these other features, none of which exist in videos online, right?

(27:33):
Almost, almost none. There's very little out there that shows things that include intimacy, include relationship and all these other things. It's like, so please be careful and don't cause a problem for yourself.

Speaker 2 (27:47):
That will cause you also, and has caused for many, performance issues in the bedroom. I was going to say erectile dysfunction is causally linked to the amount of pornography that you consume. That's right.

Speaker 1 (28:02):
That's absolutely right. So when they heard that, they were like, whoa, wait, you know. And it, like, that's just, that didn't take me a minute, like, that's just a dose of education.
Speaker 2 (28:14):
Well, Todd, I want to compliment you, because I think having sons is such a responsibility, I mean, being a parent is a responsibility generally, but, you know, it reminds me of my good friend's interaction with her son, where he was a teenager and in his first relationship, and

(28:34):
she checks his text messages from time to time, and she came across this conversation of, hey, I want to send you a sexy photo. And he was like, are you sure? I don't want to pressure you to do that. And she had, you know, called me and was like, I'm so proud of my son, because of all the ways that I've been trying to tell

(28:57):
him about this.
He, like, it was like the girlfriend was pushing for, let me send this to you already, you know, and him trying to talk her out of it. And, you know, having those conversations with boys is so important. You know, and we know, from, like, a sexual violence standpoint, that perpetrator intervention is so key, and especially college

(29:20):
age sexual assault at bars and fraternity parties and things.
But you need to know about it, right, you need to have
conversations about it.
And when it comes to your own personal sexual health, like, how amazing, Todd, that you were able to have conversations with your sons about being intimate and sharing intimacy. I mean, God, how many parents actually sit down with their

(29:43):
kids and separate what is sex and what is intimacy? Those are two important topics that I'm not sure get talked about at the dinner table. And where do they get talked about? At parties, you know, with other people who are just learning too, and I bet intimacy and connection and love is not part of those younger-person conversations, you know.

(30:06):
So anyway, compliments to you for doing that.

Speaker 1 (30:09):
Well, I, you know, I say this, it's like, I appreciate being complimented. Everybody likes compliments, right? Um, but to me it's an awful low bar. Yeah, to me, like, having good conversations so that your kids are educated about things and they're prepared to be in the world in a healthy and sustainable and, you know,

(30:31):
adaptive kind of way.
Right, just makes sense to me.
Yeah, now, I'm trained, I've worked with adolescents, I've worked with large populations of people with various levels of mental health and lack of health conditions. Um, and I'm, I'm well educated and I've got all the benefits that that might entail. Uh, and, but also, a lot of what I know is not what I got

(30:56):
from school. It's what I got from being drawn towards things that caused me to address issues of my own, or feel healthy, or have better relationships, or explore being a human being on this planet, being a responsible male, which is, which is just, you

(31:17):
know, for all the males out there, it's just as much about being vulnerable as it is about being tough. It's like this mythology that's out there for males. I know why it's there. I wish I knew how to solve it for guys, because I think they're, they're desperately afraid and they don't know how to behave. And they've got these, this dichotomy of conflicting kinds

(31:41):
of roles they're supposed to embrace: tough, but not sensitive, and you can't cry, and, you know, you've got all these messages, and none of that's sustainable and none of it's real, right. And vulnerability is just as important to a healthy human being as anything else. And so, you know, where's the place where all that stuff can live?

(32:01):
You know, I've got a group of friends that we talk about how to help males, but there still seems to be a large crowd of guys out there that are just following this party line, the red-pill societies and those, you know, and even people who are educated. You know, I give Jordan Peterson a hard time because, I'm like, you know, not all his points are necessarily bad.

(32:24):
He makes some good points, but he's just a jerk about it and he's aggro about it, and he's, he's got this very male-centric approach, and I think he just doesn't do people any service. He's not, he's not really making the world a better place.

Speaker 2 (32:41):
So having those role models, I think, is so important, and I don't know that parents... I mean, I'm curious your thoughts here as a parent, coach and educator. But, you know, I'm not sure parents recognize how much of a role model they are and how much, you know, their own conversations with their kids about their marriage or their

(33:01):
divorce or their dating life has an impact on how kids create their own internal world of intimacy and relationships. You know, and it's, it's, yeah, it's hard to help parents recognize, because, as a, you know, I'm raising a teen now, and I know she loves me, but, man, she does her best to make me

(33:24):
feel insignificant sometimes, you know, and only because I'm in the field that I'm in, it's like, I know I'm not... You know, I know all of these interactions matter, but if I didn't have the education I would throw my hands up too, you know. And so I'm curious, how do you work with parents in that way, and tell them... what, what are the obstacles to them recognizing their own influence?

Speaker 1 (33:45):
A lot of times, parents are, of course, dealing with crisis, as you know, when we, when they're working with us, but, you know, a lot of times it's all about the, the fear of making your child uncomfortable or engaging in conversations that are sticky or setting limits or any of these other things.
And if you're, if you're afraid to set healthy limits and

(34:07):
boundaries with your child, the chances are good you're also afraid to have those kinds of close, touchy conversations with them. Um, and I, one of the solutions I think is actually important to name, uh, in my experience at least, is that, um, there's always the big talk, right? You know, you've got to have the big sex talk, you've got to have

(34:27):
the big, you know, whatever talk it is that you have to have with your child, the responsibility talk, the don't-drive-your-car-too-fast talk, whatever it is.
And, you know, I'm a big fan of information. Like, look, you're going to be in plenty of places where you're going to decide what you do. I won't be there to tell you or to manage it or to control it.

(34:50):
Yeah, whether it's drugs or it's sex or it's how fast you drive your car and all these other things, let me just give you some information. Yeah, if you do this, the more chance you do this, the higher risk you are of getting in a car accident. How fast do you drive when you hang out with friends? If somebody hands you drugs?

(35:11):
You don't know what's in it, you don't know where it came from, and there are all kinds of substances out there. They can literally kill you. The buyer-beware goes for everyone. If you're out there doing these things, if you're out there engaging in intimate relationship, you need to... it needs to be consensual. People don't think you need to tell your children that, that sex needs to be consensual.

(35:31):
I'm like, but you do, but you do, because it's a foreign concept to them. So they don't, they don't know anything. And giving them anything... and they don't have to be big conversations. Like, let me give you a chunk, I want to give you a chunk of information right now. You do with it what you will, but this is what you need to

(35:53):
know about what's going to happen and what happens to people and what are the consequences of these behaviors.
And that takes me.
It takes me two minutes.

Speaker 2 (36:01):
Yeah.

Speaker 1 (36:02):
Good luck, son.
Enjoy the prom.
You know what I mean.

Speaker 2 (36:05):
Yeah, that's politically relevant right now too. You know, there's conversations in school boards and, um, you know, school districts generally, around taking consent out of sex education, you know, and that is a scary proposition. Yeah, you know, it's, um, working with sexual assault victims as

(36:29):
long as I have, I can tell you that, um, there have been many instances when I hear, um, the experience from the victim, where, and no, I'm not questioning her experience that it was a non-consensual act, but when I hear sort of the events on the other side, of the man's

(36:52):
side, I'm not sure he even knew to ask for consent, look for consent, recognize that, you know, when you're this intoxicated, you know, when someone freezes, they need to be responsive to you. You know, things like that.

Speaker 1 (37:08):
Clueless to any of that.

Speaker 2 (37:09):
Just clueless, yeah, just overwhelmed by their own sense of sensory experience and maybe what they've seen on Pornhub, and not paying at all attention to the person that they're with.

Speaker 1 (37:20):
Which baffles me.

Speaker 2 (37:21):
Right, right, but it's like that consent conversation of, like, are you okay with this? It's a simple question, you know, and the fact that we would remove that from education and schools is beyond me.

Speaker 1 (37:35):
Yeah, well, there's a lot of things that are beyond me these days, but, yeah, my favorite video about consent is the British one about tea. Yes, it's like, if you ask the person if they wanted tea and they said no, you wouldn't still give them tea. If they were asleep, you wouldn't even ask them if they wanted tea

(37:56):
. You know, it's like, it's perfect. It's absolutely perfect.

Speaker 2 (38:01):
Oh, Larry David does a great one on Curb Your Enthusiasm, his character. Did you see that? Is it okay if I unbutton your blouse? Is it okay if I put my right hand on your left breast?

Speaker 1 (38:14):
It's a very exaggerated consent conversation, but a great parody on, you know, this conversation generally. Well, I think that we opened up a series of remaining rabbit holes that you and I could go start running down for a little while, but, um, I have, like, you opened a world up to me that, uh

(38:36):
, I think I need to be more aware of, just so I can educate people that are dealing with it and everything else. But I certainly look for solutions, and, sounds like, you know, I'm grateful to be with you as one of those people that's out to make the world a better place. Um, good to be your co-conspirator.

Speaker 2 (38:53):
Thank you for nerding out with me on this very important conversation, in my mind, and, um, we're not having it enough, so thank you for bringing it to this platform and talking about it.

Speaker 1 (39:05):
Well, Dr. K, we call you Dr. K because you're Dr. Kristen.

Speaker 2 (39:11):
Dunn Master Weatherly.

Speaker 1 (39:13):
That's right.
That's right.
It's been great to have you on the show. I really appreciate you being here, and I'm sure we're going to do it again.

Speaker 2 (39:21):
Thanks again, it's been fun.

Speaker 1 (39:23):
It's been Head Inside Mental Health with Todd Weatherly. Hope to see you next time. Thank you.

(40:03):
(Outro music: "I feel so lonely and lost in here. I need to find my way home.")