
October 1, 2025 · 27 mins

Examining how social media algorithms, online communities, and anonymity can influence vulnerable young people, sometimes with deadly results. Through expert insights and the tragic story of Bianca Devins, we uncover how loneliness and online echo chambers can fuel real-world violence. Even an innocuous search online can spiral into extremist spaces, with Incel ideology just a few clicks away—a stark look at the dark side of digital life.

Check us out online:

www.instagram.com/kt_studios

www.tiktok.com/@officialktstudios

www.kt-studios.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Speaker 1 (00:00):
From the dark corners of the web, an emerging mindset.

Speaker 2 (00:03):
I am a loser. If I was a woman, I know I wouldn't date me either.

Speaker 1 (00:06):
A hidden world of resentment, cynicism, anger against women at
a deadly tipping point.

Speaker 3 (00:13):
Incels will be added to the terrorism guide, police say.
A driver intentionally drove into a crowd, killing ten people.

Speaker 1 (00:21):
Tomorrow is the day of retribution, the day in which
I will have my revenge.

Speaker 4 (00:27):
He's very angry, expressing a lot of hatred towards women and towards men who get all the women.

Speaker 3 (00:34):
I just told my husband I know she's dead.

Speaker 1 (00:39):
This is Incels, a production of KT Studios and iHeartRadio, Season One, Episode Three: Just Ten Steps. I'm Courtney Armstrong, a producer at KT Studios, with Stephanie Lydecker, Gabriel Castillo, Connor Powell, and Carolyn Miller. In the first two episodes,

(01:01):
we identified what incels are, spoke with experts about ideologies, language, and where they commonly congregate online. We also heard from two remarkable mothers who shared in the aftermath of losing their daughters, nineteen-year-old Veronika Weiss and seventeen-year-old Bianca Devins. Both girls died senselessly at the hands

(01:22):
of incel ideology taken to its most violent extreme. In the case of Bianca, what her killer did post mortem continues online to this day.

Speaker 5 (01:33):
Seventeen-year-old Bianca Devins of Utica was found dead
in a wooded area at the end of the street.

Speaker 6 (01:39):
Her killer posted the images of her body online and
they soon went viral.

Speaker 2 (01:43):
On Discord, it's fairly common for people to post gore
and disturbing images just to sort of get a rise
out of people.

Speaker 1 (01:52):
Kim Devins, Bianca's mother, spoke about how toxic and dangerous the conversations are in incel forums. That's where we spent a lot of time throughout our production, both to get a lay of the land and to gather firsthand experiences. Here's investigative journalist and producer Connor Powell talking about his experiences on incel forums.

Speaker 4 (02:15):
There's sort of two different groups. There are those that are very public about their struggles, self-describing as incels on places like Facebook and Reddit. They tend to be a little bit more sort of calm, a little less angry, and that's in part, according to them, because both Facebook and Reddit have some rules, have some moderation restrictions, and they do police the language.

(02:41):
They are using, often, their own names, so they're a little bit easier to talk to. But on the incel discussion boards, incels.co or incels.is, those names are all anonymous. They don't use their real names. In fact, there was a whole discussion thread about how you can best protect yourself, about protecting your identity, so that you can essentially post as angrily and as obscenely as

(03:04):
you want without having any repercussions. So you know, they are very aware that what they're posting could get them into trouble. And I think you see a lot more abusive language, a lot more sort of obscene language on these incel websites than you would see on, say, an incel chat group on Facebook or even Reddit.

Speaker 7 (03:25):
Online anonymity means no repercussions for being nasty.

Speaker 2 (03:29):
Add all this together and you get what psychologists call.

Speaker 7 (03:32):
The online disinhibition effect. Effectively, it means that people will do things online that they wouldn't do in the real world.

Speaker 1 (03:42):
I asked why members of incel forums would have such a strong focus on security and anonymity.

Speaker 4 (03:49):
They don't want to lose their jobs. They don't want to have other groups that are combating incel behavior or incel speech, you know, find out their names and put their information out on social media, to have the police called on them, to have other groups attack them online. So you know, they really do make a real effort to protect their identity, because they just

(04:10):
don't want to be called out. They know generally what they're saying is inappropriate, that generally what they're saying is not acceptable in sort of the real world. One person talked about how, for them, the real world is online, and the real world, in terms of interacting with other people, is sort of the fake world. They have to be somebody they are not in the real world,

(04:32):
and that's because they feel most comfortable when they can use this abusive language and say things that most people would find pretty obscene.

Speaker 1 (04:41):
Spending time on whatever you're interested in is not a new phenomenon. Finding people you have things in common with or generally agree with, and forming groups, is not a new phenomenon either. However, the ways people are fed information and drawn together online, specifically on social media, have not just been massively amplified; they've literally gotten out of human control.

Speaker 8 (05:07):
Americans average more than two hours a day scrolling content
fed to us through algorithms based on our specific interests
and past online activity. But a growing chorus of critics
warns that algorithms can lead us to disturbing places, into
making bad choices.

Speaker 7 (05:22):
These algorithms, they are the digital equivalent of AR-15s.

Speaker 1 (05:29):
Social media algorithms seem to hand-deliver whatever you didn't know you needed, right to your phone or computer, and instantly: how to craft the perfect cocktail out of pear juice, master a cartwheel, or spruce up your backyard. These are innocuous examples. Here's Stephanie and Connor Powell discussing the potentially nefarious side of algorithms.

Speaker 4 (05:51):
At its most basic level, the algorithm is a computer or social media site deciding what is put in front of you in your feed. It's a computational set of rules; whether it's one of the large companies like Facebook or Google, they've decided that they are going to push content to you. It's the computer saying you're going

(06:14):
to watch this. And the more you scroll, the more you look at things, the more it learns about your behavior and your attitudes and your interests. Now they're looking at things like: are you a young man? The algorithms are getting so sophisticated, they're starting to do predictive interests and predictive likes, and for young men and for young girls, that can get really dangerous, because the algorithms start making general,

(06:39):
overarching decisions about what you should be interested in, and they can push you in those directions. Because again, what are they trying to do? They're trying to engage. What's the easiest way to engage? To inflame. If you're fired up, if you're angry, if you are riddled with anxiety, you keep scrolling, you keep looking through, creating a feedback loop.

Speaker 2 (06:58):
And what we're seeing in real time with incels, which I thought was really staggering: there is no question now that there are certain groups and people who are specifically targeting young demographics who have a very limited social network, who maybe feel isolated in the world, who maybe feel like the loner, who may have suicidal ideation or a tilt

(07:21):
toward violence because they're feeling completely disenfranchised. That is the target.
So it's not just this happy accident where two things
are meeting in this dark place.

Speaker 1 (07:34):
We spoke with Boysen Hodgson, the communications and marketing director for the ManKind Project USA. He shared some disturbing but very important information with us.

Speaker 6 (07:46):
It is very easy to see. Folks my age, folks who are older parents, we have young adult children, and a lot of us don't even know what our children are seeing. And when researchers go and look at content and just kind of let the algorithms take over for them, what is inevitably going to happen with

(08:08):
incel ideology, with pornography, with any kind of violent or antisocial messaging, is that the content will get increasingly violent and increasingly more negative if you just let the algorithm control what you're seeing. So a nine-year-old boy can very easily go from a very innocuous YouTube

(08:31):
video to hardcore incel ideology in ten steps. You know, that's a made-up number, but it's very easy to go from very simple stuff into very harmful stuff if you just follow the algorithm.
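Those "ten steps" amount to a short walk on a recommendation graph. The toy simulation below makes the point concrete; the extremeness scale, the bias value, and the boundaries are all invented assumptions, since the real graph is unknown.

```python
import random

BIAS = 0.7  # assumed probability that autoplay picks the more extreme neighbor

def autoplay_walk(start=0, steps=10):
    # Content "levels" run from 0 (innocuous) to 9 (extreme), a made-up scale.
    level = start
    for _ in range(steps):
        step = 1 if random.random() < BIAS else -1
        level = min(9, max(0, level + step))
    return level

runs = [autoplay_walk() for _ in range(10_000)]
print(sum(r >= 8 for r in runs) / len(runs))  # share of walks ending near the extreme
```

Even with this crude model, a noticeable share of ten-step walks that start at "innocuous" end near the top of the scale, which is the point being made: the distance is short if each hop leans the wrong way.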

Speaker 1 (08:49):
We asked how this continuous, violent, and antisocial content affects young, developing minds.

Speaker 6 (08:57):
There is an incredibly dangerous developmental experiment that we are running on young people in our culture. Jonathan Haidt talks about this in his writing quite a lot. There's a book called Dopamine Nation that really highlights this idea that we are rewiring our children's brains in ways that we

(09:18):
don't even understand, and we won't actually know what the
outcome is for another ten years or so. The addictive
process is an incredibly powerful process that we are kind
of just letting run our society.

Speaker 1 (09:38):
Let's stop here for a break. We'll be back in a moment. Here again, Connor Powell and Stephanie Lydecker. They're

(10:05):
talking about how social media is specifically designed to keep
us all on edge or enraged. It's a way of
locking us into doom-scrolling loops.

Speaker 4 (10:16):
So the single most important metric that social media companies use for their sites, and Mark Zuckerberg has talked about this, is engagement. What is engagement? Engagement is how long you are on their website. And how do you keep people engaged? You build a sense of anxiety, you fire

(10:37):
them up. They are designed to take you to bad places.

Speaker 3 (10:41):
Have you ever felt like the ads that you see
on your social media feed are specifically targeted to you.

Speaker 1 (10:46):
It's almost like the app is listening to your conversation.

Speaker 4 (10:49):
What if your phone already knows what you'll buy?

Speaker 5 (10:52):
Who you'll date, even what you'll fear before you do.

Speaker 4 (10:56):
The single largest entertainment company in the world is not Comcast, Paramount, Disney, the BBC, or any of these other companies. It is YouTube. It has more content, and it creates more revenue and more profit than any other. And YouTube has something that no other company does, because YouTube is owned by Alphabet, the Google company. Right? If you send an email, that gets

(11:19):
pumped into what is fed to you on YouTube, because it will read what you are doing in email or what you are searching for. "I want baseball cleats for my son." Guess what's going to pop up on YouTube? Advertisements, or people talking about baseball equipment, or people talking about sports, or maybe they broaden it out. If you put in something that's maybe darker: oh,

(11:41):
gasoline canister. Who knows what's going to pop up on YouTube when you go to YouTube? Right? Maybe it's for your vehicle, or maybe it's for something else. We just don't know, because we don't know what's in the algorithm. It can take you to some very dark places.
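As Connor says, the actual algorithm is a trade secret, but the kind of cross-product signal sharing he's describing can be illustrated with a deliberately hypothetical sketch. Every feature, weight, and function name below is invented.

```python
# Hypothetical only: blend signals from different products into one
# interest profile used to rank videos. This reflects no real platform.
from collections import Counter

def interest_profile(search_terms, email_keywords):
    profile = Counter()
    for term in search_terms:
        profile[term] += 2.0   # invented weight: searches count double
    for word in email_keywords:
        profile[word] += 1.0
    return profile

def rank_videos(videos, profile):
    # Score each video by how strongly its tags match the profile.
    return sorted(videos, key=lambda v: -sum(profile[t] for t in v["tags"]))

profile = interest_profile(["baseball cleats"], ["baseball", "practice"])
videos = [{"title": "Cleat review", "tags": ["baseball cleats", "baseball"]},
          {"title": "Gasoline canister uses", "tags": ["gasoline canister"]}]
print([v["title"] for v in rank_videos(videos, profile)])
```

The unsettling part of the real systems is exactly what this sketch cannot show: which signals are actually collected, how they are weighted, and where an ambiguous query like "gasoline canister" ends up steering the feed.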

Speaker 2 (11:56):
What you just described is exactly the point and the
scariest thing I've ever heard.

Speaker 4 (12:02):
Do you remember when we first got on social media, just twenty years ago? Right? It went from being a connection to your family and friends to essentially a nonstop advertisement of other people's products.

Speaker 2 (12:16):
You bring up a good point about advertising, and I guess that's the key question that we want to unpack: where is the money? Who is paying for what? The oldest adage in the land: follow the money.

Speaker 4 (12:28):
I mean, the largest companies in the world right now are all tech companies that are dealing with AI or providing algorithms for content. When Google purchased YouTube back in two thousand and six, about twenty years ago now, they purchased it for one point six five billion dollars, and it is now worth roughly five hundred billion dollars. That's an incredible return for

(12:50):
the Google Alphabet company, and it's the largest media company in the world right now. Their reach is global, and they don't pay for any content. All they do is put ads on user-generated content. Just to put that in perspective, there are first-world countries like Austria, Singapore, and Norway which have GDPs, gross domestic product, which is

(13:12):
sort of "how big is the economy," that are roughly five hundred billion dollars. So YouTube as a standalone company has roughly the same value as some of these other first-world countries' economies. That's how big and powerful YouTube as a standalone company is.
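As a quick back-of-the-envelope check on that return, taking the quoted figures at face value ($1.65 billion in 2006, roughly $500 billion today; both are the conversation's numbers, not audited ones):

```python
# Rough growth math on the figures quoted above.
purchase = 1.65e9        # 2006 purchase price, USD
value_now = 500e9        # rough present-day valuation, USD
years = 2025 - 2006

multiple = value_now / purchase
cagr = multiple ** (1 / years) - 1
print(f"{multiple:.0f}x return, ~{cagr:.0%} compound annual growth")
# -> roughly 303x, or about 35% per year, compounded for 19 years
```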

Speaker 1 (13:32):
Stephanie and Connor turn their attention from the nuts and bolts of algorithms, technology so powerful it's collectively worth trillions of dollars worldwide, to how their impacts can lead to real-world violence.

Speaker 2 (13:47):
If we're watching things on our social media that are
intentionally being targeted to our small perspective, that's also pretty dangerous.
So if you're somebody who's unemployed, down on their luck, living in the basement of mom and dad's house, kind of feeling a little bit lonesome and depressed, not really assimilating into the real world: here you are. You're

(14:07):
on your computer, you're lacking community, and now suddenly community
is finding you. Now suddenly said community is dark and
potentially dangerous, and they're filling your brain with more dark,
scary information. So while you may feel like you are
a part of a community, the community is full-on
targeting you because they know that you might be vulnerable

(14:29):
and lonely and susceptible to them. And at the end
of the day, it could lead to violence, as we're seeing in real time.

Speaker 4 (14:37):
In the hour before the attack, a four-page manifesto
allegedly written by Crusius appeared online.

Speaker 7 (14:43):
Social media posts attributed to him contained other hate speech, and a final entry just before the shooting perhaps was an indicator of the deadly violence about to unfold. The suspect repeatedly posted photos of guns, and Bowers' profile page read, quote, screw your optics, I'm going in.

Speaker 2 (15:01):
Even when we started this podcast, this was not that prevalent, in terms of the language, incels, things like that. Now it's on the ticker of every show. We are seeing, because of the radicalization in these dark corners of the web that are not regulated, that we're all subjected to it and don't even realize it. And we're adults. What

(15:23):
about those who are kind of finding their way on their own?

Speaker 1 (15:28):
We had an in-depth interview with a young man who self-identifies as an incel. He goes by Mister East. We'll hear more of his personal story in a later episode. For now, in an effort to hear all perspectives, we asked Mister East how he feels algorithms are impacting his world. For me, it was eye-opening.

Speaker 5 (15:49):
I mean, algorithms, I don't think they're doing it intentionally. I think it's more like they're doing it because they see people view a lot of this content, and they push the content because, you know, they want to make money, right? And the more people view them, the more ad revenue they get, the more money they make. And what happens is that for a lot of these people, basically

(16:12):
it's a spiral. People might start viewing "why don't girls like me?" or "what am I doing wrong on Tinder?" at first, and maybe afterwards they start falling into the red pill sphere, maybe not the black pill right away. It all depends on how the algorithm plays out, by chance. I don't believe the Internet makes people into incels. I think life experiences do,

(16:36):
but the Internet certainly allows more of these black pill types of people to get together and discuss their thoughts and ideas.

Speaker 1 (16:45):
We asked Mister East how he personally got involved in online incel groups, as well as how and why he thinks others found these communities.

Speaker 5 (16:57):
Yeah, I do think that the algorithm plays a large role in forming these groups. I first joined the community because I wanted to basically talk to people. I was never too good at forming relationships, you know, making friends. So when I got invited into the community and started talking

(17:17):
and getting along with a lot of the people, I came to believe that a lot of these people, they aren't really bad people. They are just lonely, maybe socially awkward as well, which is why the Internet, I guess, in a way, is so easy for them to, you know, get together and form these communities, because it is much easier over the Internet to communicate and form groups than

(17:39):
it is in real life. A lot of them, including me, we do bond over our shared sense of failure, in a way. You know, it's sometimes good to know, hey, you're not the only failure in this world; you're not alone in this. And I feel like that really resonates with a lot of people, because one of

(18:00):
the things I do believe occurs when people are failing is that, depending on how many people around them share the same problems, they might feel more alone, because they might think: why am I the only one who's struggling with this? Why is nobody else? Is there just something intrinsically wrong with me? I think having a community that can tell

(18:21):
you, you know, we're just like you, we share the same problem, then we can have a laugh about it. We can say, hey, it's over. You know, it never began.

Speaker 1 (18:30):
We asked Mister East about how online groups compare to in-person physical connections.

Speaker 5 (18:37):
I would say that no online group in general can really substitute for in-person relationships in real life. And despite having a community that can relate to you, I do think that it doesn't really solve your loneliness. And this is where a lot of people get sucked into the part of

(18:58):
the community that is more misogynist, more hateful, and more angry. Because when you don't have any in-real-life relationships and you instead spend all your time within these echo chambers that, you know, are just born from people's pain and anger and frustration, it does cause many people to become more and more radicalized. And

(19:22):
basically, online isn't a replacement for real-life loneliness. You will still feel lonely even if you are a part of these communities. Despite what society tells you, you know, the nice guy doesn't win. The nice guy never wins.

Speaker 3 (19:35):
Algorithms feed us what they think we want
to see. It gets to the point where people often
don't see opposing ideas that may challenge their own views
unless it's through the lens of attacking or mocking the
other side. And the cycle keeps going.
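That cycle is easy to see in miniature. In the toy model below (all numbers invented), a feed that always shows the topic with the highest estimated interest stops surfacing everything else almost immediately:

```python
# Toy filter bubble: always recommend the argmax topic and nudge its
# estimated interest upward on every view. All values are invented.
interests = {"news": 0.34, "sports": 0.33, "opposing_views": 0.33}

for _ in range(20):
    top = max(interests, key=interests.get)
    interests[top] += 0.05   # viewing reinforces the estimate
    # nothing ever boosts the other topics, so they never resurface

print(interests)  # one topic runs away; "opposing_views" stays frozen
```

A real system has exploration mechanisms this sketch lacks, but the basic dynamic, where what you see is a function of what you already saw, is the cycle being described.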

Speaker 1 (19:57):
Let's stop here for another break. We'll be back in
a moment. Stephanie and Connor continue their conversation, focusing on

(20:24):
how isolation can lead to radicalization and violent tendencies.

Speaker 2 (20:30):
These dark places are literally seeking out young men, hoping to radicalize them. And by the way, what we're seeing in real time: it's the lone shooter. It's the lone wolf who is being radicalized by themselves in the basement. You can see how that can turn dark quickly.

Speaker 4 (20:47):
One of the things I find so interesting about some of the lone-wolf shooters that we've had in the last couple of years is that one of the first things we always hear about them from authorities is they went on Discord to post that they shot somebody, or they went on social media to explain their manifesto before they shot somebody. Social media is a part of the

(21:08):
thought process for violent acts in this country. It's not just having access to the weapon; it's also the ability to tell the world why I'm doing this on Facebook. Remember, we've had shooters who have livestreamed these things. We've had shooters who have recorded, pre-shooting, why they're doing this. In the case of Tyler Robinson with Charlie Kirk,

(21:29):
he went on Discord, apparently, to brag about doing it, to tell his friends that he did this shooting. We didn't have that in these violent acts twenty years ago.

Speaker 8 (21:38):
The terrorist invited people to view the Discord server that housed his manifesto and other posts in the hour before the shooting, so other members of the public had an idea of what he was up to. Ezekiel Kelly pleaded guilty to a deadly citywide shooting spree.

Speaker 7 (21:52):
He used Facebook to livestream the shooting. Armed with a rifle, he shot and killed four colleagues and wounded nine other people, according to police, while livestreaming the attack.

Speaker 2 (22:05):
We look at Elliot Rodger and the terrible, terrible, terrible crime that he committed; that was unusual, right? And Bianca Devins, for example, the most horrific example: this poor girl was murdered, and her murderer posted about it and posted after it and put it on social media for all to see. Remember,

(22:27):
we are seeing things that we should not. Nobody should witness a man being murdered, as we all just recently did in the Charlie Kirk assassination. But now we're seeing it again and again and again. We're seeing it with Tyler Robinson, Luigi Mangione. These are single people with a gun who are now going to act on behalf of many, whether they've been instructed to or not. That

(22:50):
is so scary because again we're just scratching the surface.
Now we're seeing repeat after repeat after repeat. I mean,
they're not even trying to not get caught. It's almost
like there is something popular about being a part of
this act of violence, and somewhere that's being celebrated.

Speaker 1 (23:09):
While outward violence, these murders and assassinations, which either allegedly or in some cases assuredly stem from online radicalization, is on the uptick, thankfully, statistically, it is rare. Self-harm, however, is becoming more and more prevalent, a worldwide epidemic spreading because of social media. Here again, investigative journalist Connor

(23:34):
Powell and Stephanie Lydecker.

Speaker 4 (23:38):
The amount of self-harm: there are lots of scientific studies that have been done at this point about the self-harm social media sites do to young adults. This is known, and the reality is we've known this for a long time anecdotally, but now it's sort of scientifically proven. There are no rules about what can be pushed

(23:59):
in front of kids now.

Speaker 4 (24:00):
Now, there are some discussions about whether these AI models are encouraging people to commit suicide, or whether or not they should be offering help when it comes to suicide. But there are multiple lawsuits right now from family members whose children turned to AI models and asked them about suicide, and the AI model gave them a blueprint for how to do it.

Speaker 2 (24:20):
You bring up such a big thing, because post-COVID, this is really a conversation about loneliness at its core. Incels, this conversation, even at its core, we really are talking about a lonely person who is struggling, who is now being targeted by outside sources, not for help, but targeted because they have a propensity for violence

(24:42):
if given the right ingredients.

Speaker 4 (24:46):
Algorithms determine the articles in your Facebook feed or your
Google results.

Speaker 5 (24:50):
The exact inner workings of these computer programs are trade secrets.

Speaker 4 (24:55):
There really is no oversight of these complex algorithms, because it's all intellectual property for these companies, and you're relying on companies who have a profit motive to make a decision about what is best for young people, for people of all ages. They don't have any real responsibility, or reason to take responsibility, because their only goal is

(25:17):
to make money. Even if their motto is "don't be evil," their job, their fiduciary responsibility to their shareholders, is to make money, and the easiest way to make money is to pump that algorithm full of anger and extremism. But as a country, legislatively, regulation-wise, we're not changing. This is not going away. This is the problem that

(25:41):
confronts us at this moment, and it's going to be
here for a while because we're not changing any of
the policies around protecting young kids from these sites.

Speaker 2 (25:49):
By the way, what you just described is so incredibly scary, because if you're watching television or listening to the radio, there are regulations. There are certain obvious things: you're not seeing pornography, you're not hearing certain words or swear words; there are guardrails. However, it appears that now, suddenly, we're all on the web doom scrolling,

(26:10):
and that doom scrolling is somehow a serotonin hit that makes us feel something for a minute. And little do we know, just a few clicks later, we're down the darkest rabbit hole, one that only reinforces itself, and now I can't get out of it. And I think that's sort of the scary part: the lack of regulation.

Speaker 4 (26:31):
There's none of that on social media. You don't have any sense, other than the content that's in front of you, of the background of the content being pushed to you: who's pushing it, who's paying for it, why it's being delivered to you. It just is, and you're just expected to consume it, and that can take you to some really dark places. The problem is that it's also coupled with this engagement desire by these companies to
(26:55):
push you content that will inflame you, that will fire you up, that ultimately creates anxiety. And it's very possible that that's the future: you're going to see a more extreme version of the content that's being pushed to you, and a more politically motivated version of the content that's being pushed to you.

Speaker 1 (27:15):
For more information on the case and relevant photos, follow us on Instagram at kt_studios. Incels is produced by Stephanie Lydecker, Gabriel Castillo, and me, Courtney Armstrong.
Additional producing by Connor Powell and Carolyn Miller, editing by Jeff Tooi, and music by Vanacore Music. Incels is a

(27:39):
production of KT Studios and iHeartRadio. For more podcasts like this, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.