
July 14, 2025 27 mins

The digital world has become a hunting ground where algorithms serve as silent recruiters, pulling vulnerable young men toward misogynistic ideologies at alarming speeds. This eye-opening conversation with experts Laura Frombach and Joy Farrow reveals the disturbing reality of how technology amplifies hatred against women and girls.

When researchers created a fake social media account for a 16-year-old boy, the algorithm began serving misogynistic content within just 23 minutes. This isn't coincidence – it's systematic grooming at an algorithmic level. While human predators might take months to isolate and indoctrinate victims, today's AI-powered platforms accomplish the same goal with frightening efficiency through data-backed feedback loops that constantly refine their effectiveness.

The most troubling aspect is how this online radicalization translates directly into real-world violence. We examine three chilling case studies where digital hate found deadly physical expression, including the notorious Isla Vista killings by Elliot Rodger and the high school shooting in Parkland, Florida. These weren't isolated incidents caused by individual pathology alone – they represent the culmination of algorithmic radicalization pathways that validate and amplify harmful ideologies.

For parents, educators and concerned citizens, addressing this crisis requires immediate action. Our experts provide practical strategies for engaging with youth about their online activities, teaching comprehensive media literacy, and effectively intervening when someone shows signs of radicalization. Most importantly, they emphasize the need to "call out the content but call in the viewer" – recognizing that many drawn to toxic content are primarily seeking connection rather than hatred itself.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
In this second part of the three-part conversation with Laura Frombach and Joy Farrow, we examine how misogyny has weaponized technology, illustrating how online platforms have become training grounds for digital predators, transforming what used to be fringe misogyny into mainstream male supremacy through algorithmic amplification, monetized content and influencer ecosystems.

(00:21):
This virtual content results in real-world, offline violence against women and girls. I'm Maria McMullin and this is Genesis the Podcast. Laura and Joy, welcome back to the show.

Speaker 2 (00:37):
Thank you for having us.

Speaker 1 (00:38):
Thank you so much.
We're glad to be here.
In our last conversation we discussed the weaponization of kindness, using metaphors specific to technology to illustrate how we, as women, are socially conditioned to ignore our own intuition and submit to the demands and even the abuse of men. Let's recap that concept as we move further into its consequences in talking about the weaponization of technology

(01:01):
through misogyny.
So, Laura and Joy, give us a quick recap of the concept of the weaponization of kindness.

Speaker 3 (01:07):
So the last time we talked about kindness, but not the kindness that gets nurtured; it's the kindness that gets exploited. Here's what we know: our instincts are hardwired, just like any other mammal's. We are built to sense danger; it's built into our bodies by nature. Somewhere along the way, for women especially, society installs a faulty app, and it's

(01:30):
called Be Nice No Matter What, and it teaches us to override our natural instincts so that we stay quiet, stay small and stay agreeable, and that's what predators look for.

Speaker 2 (01:43):
Got it.
They don't look for weakness, they look for kindness, the kind that's been programmed to ignore red flags. But the good news: that bad software can be overwritten, and your instincts were never broken, they were just muted. So today we're going to talk about how technology, like

(02:04):
predators, has learned to weaponize that same social programming. It's time to take back that control.

Speaker 1 (02:12):
Yeah, and in a lot of ways it is possible to take some of that control back. Now help us understand where the idea that misogyny has weaponized technology comes from.

Speaker 3 (02:21):
So, where did the concept that misogyny has weaponized technology come from? We'll use the Dublin study as an example. It found that when the researchers created a fake account for a 16-year-old boy, within 23 minutes that account was fed

(02:43):
misogynistic and anti-LGBTQ content. And so this isn't just about what to watch next; it's grooming by algorithm, and here's why. Think of it like this: if a human predator slowly gains a teen's trust and isolates

(03:04):
them and feeds them toxic content over time, we'd call it what it is, right? Grooming. Now swap that human predator for an algorithm: same strategy, same harm. It's just faster, slicker and with a data-backed feedback loop.
And so the Dublin study showed that those fake accounts didn't

(03:25):
just stumble into content. The accounts were fed that information by the AI algorithm. The systems are designed that way. They're designed to keep people watching, and controversial, extremist content does exactly that. And now AI makes the algorithm worse, because it's self-tuning.

(03:46):
So the algorithm watches you as you watch the videos. If you linger too long on a video that's pushing toxic content, the algorithm says: oh, you like that? Here's 10 more. And unlike a human, of course, it doesn't sleep, it doesn't stop and it certainly doesn't question the ethics of

(04:08):
what it's feeding you. So what we're seeing is a high-speed conveyor belt pulling boys, especially those poor guys who are lonely, angry or just want to date, towards content that dehumanizes women, mocks LGBTQ people and normalizes violence. We say this is actually radicalization on an industrial
scale, dressed up as entertainment.
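
To make the self-tuning loop Laura describes a little more concrete, here is a minimal, hypothetical Python sketch. It is not any platform's actual recommendation code; the topic names, dwell times and weighting rule are illustrative assumptions, meant only to show how ranking purely on watch time can let slightly "stickier" content crowd out everything else.

```python
import random
from collections import defaultdict

# Hypothetical sketch of the engagement feedback loop described above.
# Topics and dwell times are illustrative assumptions, not real platform data.
catalog = ["cooking", "fitness", "dating advice", "toxic/misogynistic"]
dwell_seconds = {"cooking": 30, "fitness": 30, "dating advice": 30, "toxic/misogynistic": 40}

watch_time = defaultdict(lambda: 1.0)  # small seed so every topic can surface at first

def recommend():
    # The only ranking signal is accumulated watch time; the loop has no notion
    # of what the content says, only of what has kept this viewer watching.
    weights = [watch_time[t] for t in catalog]
    return random.choices(catalog, weights=weights, k=1)[0]

for _ in range(5000):                   # simulate 5,000 autoplay picks for one viewer
    topic = recommend()
    watch_time[topic] += dwell_seconds[topic]  # lingering feeds straight back into the ranking

total = sum(watch_time[t] for t in catalog)
print({t: f"{100 * watch_time[t] / total:.0f}%" for t in catalog})
# The topic the viewer lingers on slightly longer tends to end up dominating the feed.
```

In this toy setup, a 10-second difference in how long one kind of video holds attention is enough, over many iterations, for that topic to take over most of the recommendations, which is the "conveyor belt" dynamic described above.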

Speaker 1 (04:33):
Yeah, I want to back up a minute from that part of it, because someone designed and decided that this algorithm needed to be in place to entrap young boys and young men so they'd be socially conditioned toward misogynistic thoughts. So this is kind of the architecture that's designed by

(04:54):
patriarchal society.
That is correct.

Speaker 3 (04:56):
That said, I will say the algorithm does not care what content it serves up. Whether it's cooking, sewing, fashion, swimwear or misogyny, the algorithm does not care what content it serves up, and so that's

(05:17):
its sole goal in life: to feed you more of what you've already watched. As for the idea that someone has designed this, it's actually just not personal. The algorithm designed by the corporations just wants you to watch more content because, quite honestly, and we'll talk about this further, it's what feeds them ads, and, of course,

(05:41):
the ads are monetized, and the creator and the platform get more money from it.

Speaker 1 (05:46):
Well, who's designing the algorithm that selects which young boys should get misogynistic content, then?

Speaker 3 (05:50):
It depends on what they're watching.
So let's say they start off with a question on YouTube: how can I get a date, how can I meet women? And what happens, in the way that creators really game the algorithm, is they will say, this is how you get a date, and then,

(06:11):
within that content, they start weaving in that misogynistic content. So the algorithm itself is very neutral. It's only got one goal, and one goal only: to feed you content. But it's the creators who game that algorithm and say: oh, you want to date, you want to learn how to meet women?

(06:32):
Okay, I'll tell you this. You want to self-improve? I'll tell you how to make yourself better, I'll tell you how to work out and what to eat. But in the meanwhile they are weaving in content that says: oh, a woman's trying to set up boundaries? Well, a guy like you can override those boundaries, because you know what? If a woman is really into somebody, she's not going to

(06:54):
have any boundaries. So I believe, we believe, that the content creators are just as much to blame, because they've gamed the algorithm.

Speaker 1 (07:03):
Yeah, I've no doubt that that's true as well. It's a fascinating topic and, as simple as it sounds, I think there are a lot of complexities in how these algorithms are designed and in what all the content is that can be pushed. And I know at times on social media I feel like I'm being tested by different businesses or tech companies or algorithms

(07:23):
saying: well, what do you think about this? I know you look at all of this, and I have certain things that I'm always looking at, but what do you think about this? And you can ignore it or just keep scrolling. Joy, what do you have to say?

Speaker 2 (07:34):
I definitely agree with that, and I find myself when I'm online going: whoa, why am I being pitched all this? And I just think, over the last five years at least, it's just become a bombardment of all this horrible, toxic content.

(07:54):
You almost can't get away from it wherever you are. No matter what you do, what you say, it still tends to pop up in your feed.

Speaker 1 (08:02):
Yeah, it is very hard to get away from it, and there's not a lot of legislation around it. I mean, they only just passed that recent legislation, the Take It Down Act. The Take It Down Act was inspired by deepfakes being created, especially nude images of women, and this has happened to lots of celebrities and it happens to lots of women,

(08:22):
you know, and young girls. I'm going to look it up so I can read exactly what it says: the Take It Down Act is a federal law that addresses the issue of non-consensual intimate images, including deepfakes, that are shared online, and it actually criminalizes the publication of these images and requires platforms to remove them upon the request of the victim. The act also includes provisions for law enforcement

(08:45):
to access this information for investigative purposes, and there's a lot more to it, right? But it passed the House and it was sent to the president. And while you're looking at that...

Speaker 3 (08:56):
I would love to see Congress do anything because, quite truthfully, here we are in 2025 and nothing has been done in Congress about making online spaces safer even since the turn of the century, so 25 years. Ever since they last legislated, I think in 1998, because of

(09:18):
lobbyists, because of the influence of the corporations, social media corporations and whatnot, all of the legislation has been overcome time and time and time again. No matter who testifies in front of Congress, no matter how many whistleblowers have come forward, they have yet to act.

Speaker 2 (09:37):
And how many people have sued and have not gotten their content taken down? And the longer something like that stays up there ruining their reputation, they're spending money on lawyers, and it just keeps going around and around.

Speaker 1 (09:56):
Yeah, it's a very murky place, right, the internet and posting stuff on social media, and there is not a lot of safety for any of us on social media or the internet, but there certainly is even less for children. And so we wanted to talk about an example from the Netflix series Adolescence.

Speaker 2 (10:16):
Right.
I'm going to say that, from my time in law enforcement, I've learned that being home doesn't always mean being safe. A closed door and Wi-Fi create a dangerous illusion where parents think their kids are safe because they're not out. But the threat, hate, doesn't need a front door, so it finds

(10:39):
them online. And that Netflix series, it's real: a boy alone in his room, radicalized by hate, like Laura mentioned, fueled by messages of power and control, killed a classmate. The hate made him kill his classmate. This isn't rare. We need to disrupt the silence behind these screens.

(11:03):
The screens hide warning signs. Digital safety isn't passive, it's active. So we need to check in, stay curious, ask them questions, because we need real conversations, we need real awareness, not because you don't trust your kid, but because the internet is full of voices trying to own them, and nobody
owns your children.

Speaker 1 (11:31):
I want to add a couple of thoughts to that as well, because having conversations, especially with teenagers, can be challenging, and it's hard to ensure that you're getting really honest answers. I'm not trying to imply that teenagers are intentionally dishonest; what I'm trying to say is that you might ask them a question specifically: are you on this platform, do you

(11:51):
play Roblox, or you know, one of these other types of games, which are equally as scary as social media platforms. And they may tell you: yeah, I play that, but I only play with my friends; or: no, I don't use that service, but sometimes I get Snapchats, you know, sometimes people send them to me. And so there are vulnerabilities in their responses, and parents are

(12:13):
not always educated enough to know what these responses mean. And that's, again, not a criticism of a parent so much as that we don't have the information we need and we don't have legislation that could even protect us or our children should we find out that something is going terribly wrong. Now, lately, however, certain video games and other platforms

(12:36):
have been exposed as having some serious vulnerabilities and very dangerous environments for children, and Instagram did make a move to require age limits for accounts. Now I think younger kids can't have an Instagram account, from what I recall hearing last. But those are baby steps. They're not really strengthening the safety net

(12:58):
around technology in any way.

Speaker 2 (13:00):
It's still the conversation that sometimes I think parents just don't want to have. They may feel embarrassed to even talk about it, because they think they're going to hear: oh, I know about it, they talk about that at school. And they feel that if their kid went on that website and got caught, you know, they don't know what to do

(13:21):
and the kid is going to get angry. So, you know, these are probably just ways to talk to them so they're not embarrassed and they can say: yep, I was on that website and I saw it; you know, they talked about it at school, I just wanted to go on it. But let them know what is happening out there and how you can be lured in by people catfishing you on there.

Speaker 1 (13:42):
Yeah, and I think this ties right back to the episode that was the first part of this series, when we were talking about kindness, and I think here's a really dangerous scenario. Right, you have a young girl who's on social media, she plays video games with her friends, she has her own account, her own phone, and she's been taught to be polite at all costs, and so

(14:03):
she's not listening to her instincts. And then someone reaches out who's a stranger and starts texting you: they saw your profile picture and you're really cute, why don't you send me another picture? And she starts to be drawn into this, and then she doesn't want to say no, because now there's a bond, right? They've gotten you to feel safe and comfortable with them, and now,

(14:25):
if you say no, you know what's going to happen: they're going to be mad, they're going to avoid you, and then you don't have this digital relationship any longer. And this is one of the traps.

Speaker 2 (14:43):
Right, and I think sometimes, if parents are cornered when they're getting pushback like that, if they think maybe a relationship is starting, you know, it might even be good to look up a recent case, and there are so many of them that have happened, and say: look, this girl was the same age as you. All she did was get online to play this game. And here's this person that pretended to be 14 years old, 15

(15:05):
years old, and they're really 30 or 40 years old or older. So maybe if they see that, kind of like a movie, until they believe it, they need to see that these cases are real and they do happen and children do get kidnapped.

Speaker 1 (15:22):
Yeah, I mean, that's one of the worst outcomes of
those types of cases.
Laura, what were you going to add to that?

Speaker 3 (15:28):
So often we think about young women, but I think it's
important to think about young boys as well and how naive they are online, because they fall into the same trap. And what we're starting to hear more about now is the suicides of young men who are caught in sexting traps, where somebody

(15:49):
pretends to be a young woman, pretends to send them, you know, a picture, and then tells the boy to take a picture of himself, and everything that's involved with that, and send it back. And then, once they get going in this loop, they start hitting them up for money. And we know families whose young boys have been caught in this trap, and they say

(16:10):
if you don't send us the money, then we're going to send these pictures to everybody on your social media. They just had to take the hit, because what's more important: your child's life, or some pictures that people are going to forget about in the next news cycle or the next 24 hours? But you know, because so much shame is involved, so many

(16:33):
young people are being taken in by this, and you know, it really breaks our hearts.

Speaker 1 (16:37):
For sure, and that has been a popular news story lately; it's been in the media about those sexting cases and some of the really terrible consequences and outcomes from them. I suggest people look into some of these topics themselves, especially if you have children of your own, to understand the landscape of what's going on with our young people and

(16:58):
technology.
Let's turn our attention to a little bit of a different concept. Joy, can you give us some insight into how tech-driven misogyny is showing up in actual cases, whether it's domestic violence, school threats or other gender-based crimes? In other words, we experience this harassment or misogynistic content in technology, and when we're, you know, kind of in that
(17:21):
secluded space alone with ourWi-Fi, with the bedroom door
closed, what happens in reallife then, when we are out with
people?
What are some examples of whatcan happen to change our
attitudes towards other people,especially men, towards women?

Speaker 2 (17:36):
There have been plenty of cases that I have seen over the years in my career in law enforcement. A lot of the cases I studied are rooted in control and entitlement, and these people have a deep resentment towards women. So that's misogyny in action, and you know, it's not just about

(17:59):
hateful words, it's about their behavior. So, like the mass shooters and domestic abusers: they have clear histories of hating women and they justify this abuse by blaming the victim. And it shows up in schools and homes, especially online, and these extremist groups actively recruit young men into
misogynistic thinking. Now I have three case studies that illustrate this behavior, and it's really mind-blowing that at such a young age this has happened and they've gotten sucked into this. The first one is a notorious one: he was 22 years old and

(18:44):
he inspired others. Elliot Rodger, Isla Vista, California, in 2014. He killed six and injured 14 in his misogyny-fueled rampage near UC Santa Barbara. He called it his day of retribution against women who rejected him and the men he hated, and his YouTube videos

(19:08):
and manifesto showed years of growing resentment and violent fantasies, starting at age 17. The second one, Nikolas Cruz, from Parkland, Florida, the county I worked in; as a matter of fact, it was just months after I retired, and my sheriff's office responded to

(19:28):
this at the high school in Parkland. Cruz was 19 years old. He opened fire at the high school with an AR-15, killing 17 and injuring another 17. And before the attack he posted violent messages and was active in the online extremist communities.

(19:49):
He was another isolated, angry guy. And the third one is very recent, from FSU in Florida: Phoenix Ikner, 20 years old. Ikner targeted fellow college students, spreading hate online and on campus, and in his posts showed deep hostility towards women and

(20:20):
minorities.
These aren't isolated incidents.
They're part of a digital echo chamber. This is tech-driven misogyny in action.

Speaker 1 (20:30):
Those are very interesting examples, horrifying of course, but they are exactly what you just said. This is how tech-driven misogyny plays out in our day-to-day lives, and we see it all the time in the media. Let's talk about what can be done. There are four bills before Congress to regulate algorithmic

(20:52):
exposure and content targeting.
Talk about what those are.

Speaker 3 (20:55):
First of all, the Kids Online Safety Act.
Now that was passed last year, I believe, by the Senate; it was held up by the House, and now it's back again with some revisions. So if this does pass, as I mentioned before, this will be the first bill passed by our federal government

(21:16):
regulating online content for children. The second one is called the Children and Teens Online Privacy Protection Act, and that's COPPA 2.0. It expands protections for users ages 13 to 16, bans targeted advertising and mandates verified parental

(21:36):
consent, and it also introduces an eraser button for teens and parents to delete personal data. The third one aims to tighten obligations for platforms around child sexual abuse material, or CSAM, and it requires public reporting and mandatory reporting of suspected

(21:59):
exploitation, and it introduces penalties plus a new exception to Section 230 (that's the law Congress passed back in 1996, the last time it legislated here, I might add) to allow suits when providers facilitate child sexual abuse material. You would think that this would be a no-brainer: stop the

(22:19):
exploitation, especially the exploitation of children, but as we talked about before, it seems to go round and round. And then the fourth one is the EARN IT Act, and that proposes stripping Section 230 protections when platforms fail to prevent child sexual abuse material.

(22:41):
So I'm going to summarize Section 230 real quick. It basically says that the platforms are not responsible for the content that users provide. So if Time Magazine or the New York Times publishes content, they can be held liable for what they publish. Section 230 of federal law says social media

(23:02):
platforms do not have that responsibility. In fact, anyone can post anything they want, and the only responsibility of the platform is to make the platform available and, as we previously talked about, to push content that users want to see more of. So we're hopeful that these regulations pass in Congress.

(23:25):
But we also encourage your listeners to not just rely on legislators, because the news cycle, the cycle of predation on the internet, is way too fast. It takes legislators years, and many of them have no idea about technology, by the way. It takes years for them to even consider something,

(23:47):
and in the meanwhile, things can go viral in just a few minutes, as we know.

Speaker 1 (23:51):
Yeah, and technology has changed so rapidly and I
mean, I think even if there was legislation a couple of years ago, it would have to be revised at this point because there's just so much happening. I appreciate you bringing all of that to our attention. What else can parents, schools or even people listening to this podcast do to help stop the spread of this online
radicalization?

Speaker 2 (24:10):
Well, I'm going to say for parents, it's not just
about the screen time, it's about the screen content. Like we said, you can ask your kids what they are seeing, but, you know, not in an aggressive, gotcha tone, but with some curiosity. You know, who do they follow, what makes them feel seen or
powerful online?

(24:31):
You know, that opens the door to talk about manipulation without triggering shame. And for schools, media literacy definitely needs to be taught like a life skill, because it is one. Students should learn how algorithms work and how online

(24:51):
content can be engineered to manipulate them. And for the rest of us, friends, neighbors, podcast listeners: we need to call out the content but call in the viewers. So when we see someone getting pulled into that toxic content, don't write them off. You can try, you know, validating their feelings or

(25:12):
insecurity while drawing a hard line against hate.

Speaker 1 (25:16):
Yeah, those are good points.
Thanks for sharing that.

Speaker 3 (25:19):
The only thing I can add is that we say call out the
content but call in the person.
One of the things we talk about so often is involuntary celibates, the acronym is incels, and incels, like we talked about before, are often young men who are lonely, who are angry

(25:39):
and who are pulled into this content because they are angry and lonely and they don't really have anyone else to turn to, so they think that these influencers are their friends. So when we become aware of that, if we can be empathetic and really call those people in and let them know that they aren't alone and that that content is harmful but that there are

(26:00):
better ways, I think that we can all benefit.

Speaker 1 (26:02):
Good advice.
So thank you both for talking with me again, and I will see you next time.
Thank you.

Speaker 3 (26:08):
It's been wonderful.

Speaker 1 (26:08):
Thanks so much, Maria.
Genesis Women's Shelter and Support exists to give women in abusive situations a way out. We are committed to our mission of providing safety, shelter and support for women and children who have experienced domestic violence, and to raising awareness regarding its cause, prevalence and impact. Join us in creating a societal shift in how people think about

(26:29):
domestic violence.
You can learn more at GenesisShelter.org and when you follow us on social media: on Facebook and Instagram at Genesis Women's Shelter, and on X at Genesis Shelter. The Genesis Helpline is available 24 hours a day, seven days a week, by call or text at 214-946-HELP (214-946-4357).