
December 5, 2019 58 mins

NOTE: This episode contains discussions of racist ideologies and hate-based violence. Listener discretion is advised.

 

Sam (not his real name) was just 13 when he discovered a meme-focused online forum during a lonely time at school. His new friends there were more than happy to share their disturbing views about race and women with him, and Sam, looking for acceptance, was more than happy to listen. Except the more he listened, the more he started to believe what they had to say. 

 

On this episode of Next Question, Katie sits down with Sam and his mom to talk about what it was like to get sucked into the far right online, then gets an expert’s take on how the internet is contributing to the rise of white nationalism—and the white nationalist violence we saw play out in places like Pittsburgh and El Paso. She also speaks to Angela King, a former neo-Nazi who works with extremists hoping to leave the far right behind, about how to help vulnerable young people navigate a world in which hate is always just a few clicks away.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Next Question with Katie Couric is a production of iHeartRadio and Katie Couric Media. Hi, everyone, welcome to Next Question. I'm Katie Couric. About two and a half years ago, a white nationalist by the name of James Fields Jr. plowed his car into a crowd protesting the Unite the Right rally in Charlottesville, Virginia. At this rally, close

(00:26):
to thirty people were injured. Thirty-two-year-old Heather Heyer was killed. What did you think of what you saw today here in Charlottesville? I've never seen so much hatred in the eyes of my fellow human beings in my life. We are in very deep trouble. This country is in deep trouble. I was there that

(00:48):
day, and since then at least seventy-three murders can be tied to the radical right-wing extremist movement James Fields Jr. embraced. That includes, of course, the hate-fueled shootings in Pittsburgh, Poway, California, and El Paso, Texas. In
many of these cases, the perpetrators adopted the disturbing views

(01:10):
of a broad, Internet-savvy movement made up of different misogynistic and white nationalist groups in online forums and on social media. The suspect in today's mass shooting at the Tree of Life Synagogue in Pittsburgh had an extensive anti-Semitic, anti-Jewish digital footprint. Law enforcement officials in Texas say they believe he wrote a roughly six-hundred-word essay

(01:32):
on the Internet and posted it an hour or so before the shooting that says the attack was motivated by anti-immigrant hatred. Today, that's where hate spreads, not through top-secret word-of-mouth meetings, but out in the open, on places like 4chan and Reddit, and even YouTube and Facebook. People who feel lost or left behind, or

(01:55):
maybe just looking to rebel are radicalized almost exclusively online,
often without ever meeting anyone face to face. The technology
itself is a big part of the problem. The algorithms
on sites like YouTube encourage users to stick around by
serving up ever more extreme examples of whatever topic they're

(02:16):
engaged with. To be clear, not everyone who spends time
in these virtual spaces will go on to commit horrific acts,
but the marching orders for many of these groups are
usually pretty explicit. True believers are encouraged to bring people
with mainstream views, so-called normies, into the fold, and to ultimately take real-world action against the groups

(02:38):
they believe are threatening their way of life. That means
people of color, women, LGBTQ individuals, and members of other marginalized communities. Which brings me to my next question: how can we stop the spread of radical extremism online and keep people from becoming radicalized in the first place?

(03:00):
My first guest knows from experience just how easy it
can be to get sucked into this world. Three years ago,
when he was just thirteen years old, he turned to
the Internet as a way to escape a lonely stressful
time at school. The community he found there gave him
some dangerous ideas about who was actually to blame for
his problems. In a jaw-dropping anonymous account for Washingtonian magazine,

(03:24):
his mom opened up about forums where her son, who
she calls Sam, spent countless hours and the toll his
newfound beliefs took on the entire family. Sam, I knew
you were having a pretty tough time emotionally when you
were thirteen years old. What was going on back then?

(03:45):
I had a kind of traumatic experience in eighth grade,
and uh, my parents didn't really like how the school
handled it, so they ended up pulling me out of
the school, um, which just led to a lot of
like confusion on my part, and it just made me upset.
You know, I had to find new friends. I had
to you know, kind of remake my mark on my
new school. What happened exactly that made you feel this way?

(04:08):
My friend and I were joking around and some people
took it out of context and went to the school,
and the school handled it really poorly, said you were
sexually harassing. Yeah. They said I was sexually harassing these
two girls, which wasn't the case, and the girls didn't
even think it was the case. Um, and the school
kind of they just had their sexual harassment training, so
they were really on edge about that kind of stuff,

(04:29):
and they handled it pretty poorly. They said that I
was bullying the girls. And then eventually it just got
to a point where I said, okay, like it's time
to go. And so you changed schools and you were
feeling pretty isolated, lonely, I guess a little angry. Yeah,
I was upset with the school. I was upset that

(04:50):
I had to leave. I was, you know, angry that I wasn't in as much contact with my old friends as I'd like to be. So you turned to the internet. What was the process that opened this door to the people you got involved with, at least online?

(05:10):
One of my friends at my new school, um, decided to show me Reddit, which I had never known about, so he was showing me some subreddits, like there's one called r/dankmemes that he was showing me, which was just a subreddit for, like, internet jokes and memes. Um, harmless enough? Yeah, harmless enough, seemingly. The memes on there,
I didn't really understand a lot of them. A lot

(05:31):
of them were talking about right wing politics or um,
you know, race issues or gender issues, you know, about
women lying about rape, or about just in general, like
like gender politics, about gay people, and you know, at first,
I had come from a friend group that was pretty liberal,
so I didn't really understand the memes or get them.
But I started to look them up. And the more

(05:54):
and more that I started to see the memes and
I started to relate to them, the more I wanted
to be part of that community since I didn't have
friends to fall back on. So did you get involved in these chat rooms? I guess that's a very antiquated way of describing it. Well, um, I decided to apply for a moderator position on r/dankmemes, and a moderator is just someone who controls content that goes on

(06:16):
the subreddit, uh, deletes posts, you know, flags them. So I started getting in with the moderator community, and a lot of the moderators that were there, there's like a group chat, basically, something called Discord where you can talk to them about everything, talk to them about posts, and a lot of them were, you know, members of the alt-right or at least believed in the philosophy. So, you know,

(06:37):
these people were basically my friends for a little bit, or at least they convinced me that they were friends. So what kinds of things would they say, and what kinds of things would you say? On our group chat they would say, like, these liberal snowflakes can't handle the fact that science says there are only two genders, and just stuff like that. Or they'd say, like, you know, gay people, you know, aren't like genetically okay,

(07:01):
like it's not okay to be gay, it's not right with the Bible. They'd say, you know, immigrants are fighting to get Donald Trump out of office, and that, you know, immigrants are killing people and immigrants are raping people. They would say that black people commit crimes at a higher rate than white people do. They'd bring up black crime statistics, which are obviously skewed and, you know, biased,

(07:22):
but uh, you know, you want to agree with them,
because if someone's your friend, or at least if someone
says that they're your friend, you want to be like
them and you want to have that friend. You want
to keep that friend. You don't want to be isolated.
What about women? How did misogyny play into their overall conversation?

(07:43):
A lot of the stuff they would say about women
was about how women would lie about rape and about
how women would lie about sexual harassment, which spoke to me on a personal level because of the thing that
happened at my old school in eighth grade. So that
was kind of your I don't want to say sweet spot,
but their way into you. Yeah, definitely, it was the
topic that I had, you know, had to change my

(08:06):
whole life because of, and because of that, I was kind of like a clay mold. I could easily be manipulated. And did you ever challenge, initially, some of the things that were being said there? Did you ever say, wait a second, that's not true, or that's racist, or anything like that? I would never audibly fight back because

(08:28):
I didn't want to lose my position as a moderator
and I didn't want to alienate myself from the rest
of them. So if they said something I disagreed with, I just wouldn't say anything back. You had been raised in a progressive Jewish family. Yeah. So you must have thought, gee, these views do not square with the values that

(08:48):
have been taught by my parents. I mean, how did
you kind of wrap your head around that? For me,
I think a lot of it was just resentment toward my parents, resentment that they made me leave my old school and made me leave my friends, and you know,
I couldn't see that they were, you know, right in
the long run. You know, it was just I'm gonna
be different than you guys. I'm gonna be the black

(09:11):
sheep of the family, you know, that kind of thing. So you were sort of proud that you were being a
nonconformist in a way. Yeah, it became like a part
of my identity, that I was different than the rest
of my family. Did your parents challenge you when you
started sharing these newfound beliefs with them? Definitely, Yeah, there
was a lot of um. I'd say it became pretty

(09:31):
tense in our household because I would keep learning new
stuff, and then something would come on the news, it would be Trump saying there were fine people on both sides of the Charlottesville rally, and when he said that, I would say, you know, he's right, you know, not everyone was bad, not everyone was violent, and, you know, I was basically defending neo-Nazis, or
I would say Jews basically run the world, which my

(09:52):
parents were horrified by because I was Jewish and I
was propagating anti-Semitism as a teenager and as a Jew. But to hear this blatant anti-Semitism as someone who's
Jewish himself, was there a disconnect there? I mean at
first there was, but then you bought into it, bought

(10:12):
into it somehow. Yeah. I became so receptive to that kind of talk that I would basically believe anything that they told me. And that was the case with the anti-Semitism. Who were these people? Did they reveal, say, what they did for a living? A lot of the people were minimum-wage workers, people who worked in restaurants, people who
worked in you know, hotels, and some people made money
off of Reddit. The only time people would talk about

(10:35):
their actual jobs was saying, like, oh, work, it was so hard today. You know, I think they're upset because they don't have a great job, they don't have a great life in general, and they need to blame their downfalls on other people. So these were a lot of aggrieved people. Yeah, definitely. I'd say no one was truly happy within the alt-right community. I think these

(10:58):
people are disgruntled, upset, you know, men who really feel like they have it the worst. They got the raw end of the deal. Yeah, they pulled the short straw. And then,
on top of their preexisting anger, all the new things
that they're reading on the internet about immigration, about ethnic people.

(11:21):
You know, all of that translates into real anger and
they internalize it and you know, their wires get crossed
and they're kind of like a powder keg. Yeah, yeah, they're already primed, and the alt-right hatred is what lights the fuse. That's scary. It's terrifying. How much time would you spend online?
It depended on the day. I mean I would normally

(11:41):
get home from school and then spend about three or
four hours online. So you really did get sort of
sucked into a vortex. That's a lot of time. Yeah, yeah,
I mean, it adds up. You know, after a couple of weeks, you've spent days online on those subreddits already,
and that amount of time is already detrimental. How did

(12:02):
what you heard, Sam, impact your relationships or how
you acted with friends or at school? It alienated me
amongst friend groups because again, you know, my school was
fairly liberal, you know, my new school at least, and
then all my friends shared those viewpoints. So me saying,
you know, like immigration is out of control, that would

(12:25):
alienate me amongst my friends, amongst my teachers even. So it would just drive me deeper and deeper and deeper into isolation, which in turn led to more Internet time because I wanted to talk to people. Because you were lonely? Yeah, I mean, above everything else, lonely. Did you feel you were targeted in some ways, that once

(12:45):
you showed up in this place, people were enticing you to kind of get deeper and deeper into it? Do you feel like you were groomed? Yeah, a little bit, honestly. I mean, the people there were really, really, really insistent on their strict set of beliefs, which is immigration is bad,
women lie about rape. Uh, you know, black people commit

(13:09):
more crimes, black people are violent, that kind of stuff there.
They were so insistent on those beliefs that it felt
like I was being brainwashed. And I liked it because
it meant that people, you know, I was able to talk.
I was able to tell them my beliefs and tell
them my take on these issues. I would say like, yeah,
you know, women do lie about rape because this happened
to me, and I'm proof that this is the truth.

(13:31):
And then they would, you know, they would eat that shit up completely. They would love it, you know, just to hear that. And so they felt like they had a receptive audience in you. Definitely. They thought that I
was an adult for a while, which I loved. I
loved feeling like an adult and feeling like my voice mattered.
Was there a turning point? Did you say, these people

(13:52):
are assholes? Yeah, I mean it was gradual. Pardon my French, Sam. Oh,
don't worry about it. You've heard it before, I'm sure,
especially on the internet. It was gradual, like I got
in gradually and kind of got out gradually as well.
In September two thousand seventeen, a little over a month
after Charlottesville, Sam convinced his mom to take him to

(14:15):
the so-called Mother of All Rallies in D.C., a gathering of right-wing groups ranging from ardent Trump supporters to far-right fascists. You'd think that finally meeting his heroes offline would only reinforce Sam's views, but ironically it was those face-to-face interactions that made him

(14:36):
have second thoughts. So we went, and I was, like, talking to some of the alt-right people, and a lot of them were insane, like they were crazy people. Like one guy had an Iron Cross on his lapel and then had, like, an anarchist cookbook and was
talking about how he was an anarchist but he supported Trump,

(14:57):
which doesn't make sense. It's ridiculous. But he kept
trying to justify it with his own you know, with
his own reasoning that he made up. And it made
no sense. That was the first time I finally saw
that the people of the alt right had no idea
what the hell they were talking about. And then from
there on that kind of planted the seed in my
brain that grew, you know, to think like, hey, maybe

(15:18):
I should, you know, stop doing this. Maybe I should
focus more on school. Maybe I shouldn't try to isolate
myself among my friends, you know, maybe I should try
to make a name for myself. So I eventually got
out just through time. You just stopped going there? I
just gradually stopped going on the internet, you know. I
went on less and less and less every day until

(15:38):
I was not on the hateful subreddits at all anymore. I wanted to get Sam's mom's take on what it was like watching her son go from a carefree kid to an angry member of the growing far right, so I asked her to join us for the second part of our conversation. When you first realized that

(16:00):
Sam was doing this, what was your reaction? I mean,
how did you process it? It was a really scary time.
Sam is someone who we had raised from day one
to be a very empathetic kid, and everyone had always
commented on how he seemed like an old soul. Even his kindergarten teacher couldn't believe how well he related

(16:21):
to other people and saw other perspectives. He was the
kid that every other kid went to in class when
they were hurt or when they needed a friend. He
was that person. And so to watch this transformation, I mean,
it was as if he had a total personality transplant.
How did you realize that this was going on? We have dinner together every night as a family. And do

(16:43):
you just have one child? We have a daughter as well, who's younger than Sam, and that actually is part of the story, because we had dinner every night as a family, and we always talked through the day, you know, little things and big things. And
I noticed that Sam started to talk about really odd issues,
such as there is no wage gap between men and women.

(17:07):
So he was thirteen years old. We're at the dinner table talking about our day, and he's talking about how the wage gap is a fallacy, his words. And our daughter, who's younger but is already a very ardent feminist, would argue with him about that. And we'd had such a happy, peaceful family, and the two of them were the best of friends, and so they

(17:29):
would fight about it, and we were just left scratching
our heads thinking where did this issue of the wage
gap fallacy even come from? It was never anything we
had talked about. So did you ask him, where are you getting these ideas? Well, yeah, and he said, well,
I read about it online and he wanted to talk
about it, and he felt like he was educating us.

(17:50):
He was really excited to be discovering all this new information,
and I think because we had always been the people
to tell him things, he was thrilled now to have
new information that we didn't have and to
be able to tell it to us. So, I mean,
it was the wage gap, it was stuff about the
Second Amendment. It was a ton of stuff about you know,
women lying about rape, and it was very jarring because

(18:15):
it was nothing we had ever been led to expect that our kid would, um, parrot. Do you think that the people online saw an opening because of Sam's personal experience at his other school? Well, what I
read about when I started finally looking into it and

(18:35):
seeing what a problem we really had is that, you know,
the people in these white supremacist communities prey on depressed, vulnerable,
isolated kids, and they knew what to look for um,
and they saw that in him. And we saw he
was lonely. He didn't say he was lonely, but we
saw he had no friends. We just figured it's going

(18:57):
to take time, you know. And when he complained about
the kids at his new school, we said, it's going
to take time, like over and over, because that's what
you say as a parent. It's going to be okay.
But that's not what the people online were telling him.
What were they saying? They were reacting in a much
more immediate way, saying, the people in your new school
don't have your back for whatever reason. That was what

(19:17):
they were implying. And they were also implying, we are
the ones who have your back, and we're your real friends.
And that resonated with him. And they knew that three or four hours a day he was online. It was actually, I heard him say that too, it was more than that. It was more than that, because he came home, you were online when you came home, and

(19:38):
then you went back online after dinner and everything. That's what I remember, because it was... Do you have an exact time frame? I don't have an exact time frame, but
I do remember that when you became a moderator, you
had a quota that you had to fill, and that
was one of the things that actually added to your
stress at that time. Because as a moderator, you

(20:01):
have to spend a certain amount of time online. It's not about time. It's more like you have a certain number of posts you have to approve a day. Um, so I would have to approve or remove like a hundred posts a day. So there are moderator actions,
and an action is removing or approving a post. You
sound like you're an amazing mom. So could you not

(20:22):
have said, I'm sorry, you can't be online like this,
you can't be talking to these people. I mean, I
didn't understand at the time what was going on. I
mean, when he told me he was a moderator, he was so excited and proud, and things
had been so rocky for such a long time that

(20:45):
seeing him happy again made me happy. And I didn't
realize what this subreddit was about. I hadn't gone on it myself. Um, I thought it was just about funny memes. In retrospect, there was so much that I did that was wrong. When did you discover the kind of hateful, horrible things that were being discussed? So

(21:09):
it started with the whole wage gap as a fallacy business,
and then it moved on to the Second Amendment. And
then when he started talking about women lying about rape,
obviously that was incredibly alarming. When he started parroting Jews
run all the financial networks in the world, that really
helped us see how far gone he was. I mean,

(21:31):
it's one thing to set out on his own, like, political journey. And we did for a while think, well, maybe he just has a conservative mindset,
like that's not a crime. He's not like us. But
that's okay. You know, there's merit to both sides. Maybe
he's sort of like an Alex P. Keaton conservative. That

(21:53):
was like my frame of reference, the Alex P. Keaton
of our family. Um. But when he started, you know, parroting anti-Semitic statements, I realized it was
way more than that. And so that was when we
started taking things a lot more seriously and engaging on
a more serious level. I can't believe you went to

(22:14):
this Mother of All Rallies. It took a lot of convincing. I mean, what did you think when you got there and you saw these people, these very fine people?
I was there for one reason and one reason alone,
and that was to try to show Sam, by actually

(22:36):
being there, that I would be his partner in this, um, not in believing these beliefs, but in having these discussions. And he saw for himself. I mean, that was what was so amazing about this whole experience, is that he did see, by being there himself, that these people don't have a leg to stand on.

(22:58):
I always tried to get through to him that if
you're going to have an opinion, you need to meet
the people, You need to talk to the people, you
need to see things with your own eyes. And that's
what this was. He saw it with his own eyes,
and his mind was changed. You're lucky because I imagine
that there are a lot of young people who do
not have the wherewithal to reject this philosophy and to

(23:26):
basically see that it's wrong. I mean, do you ever
think about what would have happened if Sam hadn't had this epiphany? A lot of people wrote in, both to the Washingtonian and also on Twitter, and a lot
of the conversations were from parents whose kids had gone
all the way in terms of the white supremacy movement

(23:49):
and didn't come back, and they were estranged, and
I saw that over and over again after my article
was published, and I just felt again how lucky we
were that it worked out the way it did. And
I think some of that has to do with the
fact that um Sam was so young when it all happened.
He didn't, for example, he couldn't like go out and

(24:10):
get his own apartment, you know, he was under our roof,
and I think that helped bring him back into the fold.
I think the fact that I couldn't go to events myself also helped. Right, and also you had a mom who was learning but also presenting a counterpoint. Definitely. I think a lot of
the parents that have kids like that, you know, immediately

(24:31):
go the "you're insane" route like you did at first,
and never change that route. But they don't realize that
if you tell a kid no, they're just gonna want
something more, I mean, more than anything else. I think
that's the case with a lot of things. What do
you think should be done about this? If I were to give advice to a kid that was going through something like this, I would say that no one ever tells you something just to be your friend,

(24:54):
or at least a lot of the time that's not the case, and that when you read something in the news, there isn't a definitive yes; you need to source your information.
You need to find out where it's coming from. The
thing that I do now is I'll read something from
a conservative news site, and I'll read something from a
very liberal news site, and I'll compare facts and I'll
compile the information that is shared in both of those,

(25:15):
and to me, that is the full story. That's a lot of work, isn't it? I feel like it's worth
it in the long run. I mean, even if it
takes time, it's a good thing to find out the
entire story. How scary is it to think that people
are being radicalized online and then they are going out
and in some cases, as we've seen, killing scores of

(25:39):
people as a result of this indoctrination. It's really scary
for me because, you know, over everything else, I could
have been one of those people if I had gone
down the path, if I had gone down further, you know,
anyone has the capability to be radicalized fully, and I'm no exception to that. No one's an exception to that.

(26:02):
Sam's story was shocking. After all, if a kid from
a stable home and a progressive Jewish family at that
could find himself identifying with white nationalists, what exactly does
that say about the power of this movement to draw
people in? Up next, an expert explains how widespread far

(26:23):
right online radicalization has become. I'm here with Keegan Hankes of the Southern Poverty Law Center's Intelligence Project. Keegan and

(26:45):
his team closely monitor right-wing extremists online. Keegan, I
know that you keep a close eye on these communities
and what's going on in these circles. So how and
where exactly are these people congregating online and what is
it about these spaces that makes them so appealing to them? Well,
the modern hate movement is really marked by a lot

(27:08):
of internet and online activity at this point. So we
see a lot of these communities cropping up across all
of the same major social media platforms that all of us use in our everyday lives, but we also see
them trailing off into more disparate and more underground online
spaces as well, especially as more pressure has been put
on them in recent years. I know a huge part

(27:30):
of the appeal I understand is the use of humor
and irony and memes. How does that play into this?
That's exactly right. Humor, irony, memes, other forms of visual propaganda, video and audio propaganda are tremendously important. There's a whole cottage
industry in the racist movement around this. I think a
lot of this has to do with this culture that

(27:52):
they're building for themselves. There's a very intentional project going
on across these movements to build content that's going to
keep people engaged and is going to keep pushing people towards
more extreme beliefs. And they also then claim, you know,
plausible deniability and say, oh, we were just kidding around,
we thought this was funny. If they get busted, yeah,

(28:13):
you frequently see people say, oh, I was just being
quote edgy, or this was just humor that was taken
out of context. I think what's notable is that a lot of these communities are built around, uh, this activity. So if you think about a platform like 8chan, which we've seen a number of manifestos show up on in the last year or so, um, in these communities it's

(28:34):
more common to see death threats, presented in a joking and almost ironic manner, than it is to not see them. I spoke with Keegan in September, about six weeks after the El Paso shooting. Following El Paso and Christchurch and Poway, there was talk of shutting down 8chan, the site where all three

(28:55):
shooters posted violently racist manifestos. 8chan went dark for several months as it struggled to find a company willing to host it, but re-emerged on the so-called clearnet in November as the rebranded 8kun. Keegan told me more about how the site works. 8chan, I

(29:15):
would first describe it as probably one of the darkest
and grimmest places on the Internet, particularly some of the
boards that are dedicated to politics and more specifically to extremism.
It's an image board that has some text in it,
and this is content that's basically organized in threads and
is almost always dedicated to some form of extremism. There

(29:35):
are participants from all across the world, and especially in the last year or so, what we've seen is basically cheerleading on this site around mass violence. What's the difference between 8chan and 4chan? They're very, very similar. 8chan was designed to be an even more extreme version of 4chan. It's 4chan with even fewer rules around moderation. Almost no content at its inception was taken

(29:59):
off of 8chan. We've seen that change a little bit,
especially as there's been a lot of attention put on
the site for the number of manifestos that have been
posted there. But it is very uncommon for content to
be taken down, no matter how extreme. Do you worry that shutting down these sites will drive them further underground, where they'll be harder for law enforcement and other authorities to monitor? Well,

(30:23):
the first thing I would point out is that sites like 8chan are already incredibly difficult for law enforcement to monitor. The posters are almost always anonymous. It requires a heavy investigative lift to go and unravel some of these threads as they're happening. Um, so it's already a really big problem. The bigger issue with a site like 8chan being online is that it's well known and it's

(30:46):
very notorious, and that people know to go there to
look for this type of content. It has a wide audience,
so whenever I get asked about, you know, the danger
of driving this content further underground, I actually propose an
idea more akin to containment. Right, we should be striving
to make this content as hard to find as possible,
because you are then limiting the potential pool of people

(31:07):
who may be susceptible to its messaging. So you'd be okay to have it go into sort of the darkest corner of the dark web? Absolutely. It also makes it easier for researchers like myself and my colleagues at the SPLC to track these communities. We follow these extremists wherever
they go, and there's this interesting tension in the hate
movement where they can never go entirely underground. So if

(31:29):
your entire project is based around the idea that doing
nothing or not bringing more followers into your movement means
extinction for white people, which is what many of these
individuals believe, then you necessarily have to have some entry
points into it. Well, that brings me to the next question: how do they convert people, and how easy is it for them to have access to vulnerable targets? Well, it

(31:52):
has become disturbingly easy to find vulnerable targets with the rise of social media, in particular in the last decade or so, as we've seen a lot of these tech companies really drag their feet, uh, to enact meaningful policies against hate and extremism and to enforce them. One
of the things that they commonly do is go to
online spaces, whether it be on social media or certain

(32:15):
you know, right-leaning news sites, where they know that
there are people who are near their ideological beliefs but
still part of the mainstream, and they will start using
very targeted propaganda to try to push them further and
further to the right. They call this red-pilling. Is that right? Yeah. So that's common vernacular on a lot
of these sites and in a lot of these communities
for uh, you know, bringing someone into this set of

(32:38):
extremist beliefs. You know, it's reminiscent of The Matrix, of course, right, where you take this red pill and suddenly you see reality for what it is. Do potential recruits share anything in terms of demographics, geographics, socioeconomic backgrounds?
It's a really difficult question, and the reason for that
is it's really difficult to know just how many people

(33:00):
have been exposed to these beliefs. What we see, at
least from monitoring the extremist groups themselves and their leaders,
is a really concerted effort to recruit young white men
into these movements. Uh. And there's a couple of mechanisms
that they use to do that. So one of those
is that they play up this narrative of basically declining
prospects for young white men. You know, they make a

(33:21):
narrative about, you know, what's being taken from them or deprived from them, and they ascribe that to, you know, a target community: say it's this group's fault that you did not get what you expected, or this is the group that took what you thought you were owed. Sort of grievance politics, right? That's absolutely right. Do
you think the number of people who are gravitating towards

(33:43):
these groups has exploded? What role do you think external
factors are playing into this? I.e., political rhetoric that
we hear, the immigration debate, the whole focus on diversity
and inclusion in kind of all aspects of society, and
by the way, changing demographics. So it's interesting you actually

(34:04):
just named the three largest drivers that we point to when we explain why we listed the largest number of hate groups that we've ever counted last year. You know, there were a thousand and twenty of them in the country. And in that number, we saw a fifty percent rise in the number of white nationalist groups. And when we are explaining how that happened, those are a
lot of the factors we point to. Demographic change is

(34:25):
absolutely the biggest driver of these groups and what is giving them so much energy: this tremendous anxiety about the idea that, you know, whites will no longer be an absolute majority in this country. This rhetoric has been very effective in driving people into these movements, and it has coupled
really dangerously with a lot of the toxic political rhetoric
that we've seen from the halls of power, particularly the

(34:47):
White House in recent years. A lot of this anxiety
about immigration, for instance, is showing up in the manifestos of mass shooters. Uh, and it is almost indistinguishable from some of the statements we see from politicians. When it comes to the president, do you think that he has made or sent implicit or explicit messages to these very groups by saying things like there were very fine people

(35:11):
on both sides, or just failing to criticize and come out and say this is wrong? Well, he's done both.
I mean it's been explicit and implicit, and this is
obviously not lost on these communities. I mean they see
the president's language, whether they agree with him on every
issue or not, as the ultimate validation of their worldview.
What can be done about this? Well, there are a number

(35:33):
of things. I mean the first, at a high level,
especially when we're talking about this issue of radicalization in online spaces, you know, we need to demand more of
the technology companies that play such a large role in
our lives. For instance, we should have meaningful policies banning
all of this content on every major social media platform,
and not only that, we should expect them to enforce this stuff.

(35:56):
That is an easy first step, you know. One of the other things speaks to the drivers that you brought up earlier: you know, we as citizens and voters need to be making forceful demands that politicians speak out against hate and extremism, and also candidates should be including these planks in their platforms. And the other piece of this is, you know, we need to find ways at a civil society level to break down these feelings

(36:18):
of social isolation and really do more to counter what
the racist hate groups have been so good at perpetuating,
which is this narrative that you know, demographic change and
diversity in this country and multiculturalism is somehow a threat
to white people. What about the free speech argument? I
know you mentioned social media and basically stopping this on

(36:41):
all platforms. You often hear the free speech argument and
this being a slippery slope, and I'm curious what your
reaction would be to that. Well, whenever I have the
free speech argument presented to me, I bring up a
couple of different points. You know, there's no one stopping
these individuals from going out on a street corner and
yelling racist slogans or trying to convince random passersby

(37:03):
of their racism and you know, the truth in their
worldview as they would see it. However, the difference when it comes to the social media companies is that these are
megaphones that are directed in every direction. Right, this is
just tremendous amplification for these ideas, and members of hate
groups and their leaders have proven that they're very good
at manipulating them. These companies have every right as private

(37:25):
institutions to set codes of conduct and acceptable use policies
against these ideas, and I would argue that many of them,
if not all of them, have a moral responsibility to
do so, given the consequences that we've watched play out
so many times. I would remind all of your listeners
that, you know, what happened in Christchurch, New Zealand,
earlier this year is an absolute atrocity, and it's incredibly

(37:48):
disheartening and was powered by Facebook's live broadcasting mechanisms. The
fact that that was even allowed to happen and the
devastating impact that we watched it have across the world,
uh, is a problem in and of itself. So there's
a lot of work to be done. You know. One
of the things that we would expect from these companies,
and these are two concrete suggestions, is you know, we

(38:08):
want them to be much more transparent about the activity
of extremists on their platforms, and especially making that data
available to outside academics and researchers who study this so
that they can try to find new ways to combat it.
The other is, and this especially speaks to for instance,
Facebook Live and the role it played in New Zealand. Um,
whenever they're rolling out products, we should be testing them

(38:30):
against civil rights concerns, as opposed to just solely looking at how much money they can make from them and the consequences of not rolling them out quickly. You are offering some great suggestions. But when you really think about it, do you think, Keegan, that this can be controlled, or is it just too hard to get a handle on this?

(38:53):
I think it's too important and the consequences are too
devastating not to try. And I think there are some obvious and easy steps for us to begin taking to go
down this path. Um. Whether we ultimately succeed, we'll have
to see. Um. But I just can't accept that the
normal and regular pace of life is now going to
be that we're going to see these violent actions taking

(39:15):
place over and over again in our society. It's, I mean, it's just devastating to watch. And for you, just unacceptable? Absolutely unacceptable. Well, it should be for everyone. Up next, we'll hear from someone who understands exactly what it takes to leave extremism behind, an ex-neo-Nazi who was able

(39:37):
to turn her life around. Angela? Hi, it's Katie Couric. How are you? I'm well, thank you. How are you today? I'm good, thanks. We've talked a lot about how online radicalization works, but experts in this field also talk a lot about what it takes to de-radicalize someone.

(40:00):
Angela King, a former neo-Nazi, has devoted her life to doing just that. Her organization, Life After Hate, helps people disavow far-right extremism. Do you believe the radicalization
process today is fundamentally different than it was when you
were younger? Fundamentally no, just more streamlined, and individuals have

(40:26):
the ability to now become radicalized completely online rather than
in face-to-face meetings. How did your radicalization happen? Well, I was raised in South Florida in a household where, there's no easy way to say it, I was taught
racism and homophobia from as early as I can remember,

(40:49):
from my parents and my family, and by the time
I was an adolescent, I was angry. I had been bullied.
I was very vulnerable, and when I ran into other, uh, kids my age who were already involved in the violent far right, it wasn't a great leap for me

(41:11):
to become involved. I happened to have ingredients in my life and experiences that led to a very negative recipe, let's call it. So, you know, I had familial racism
and homophobia that certainly socialized me in a way to

(41:33):
think that I was better than others and that that
kind of behavior and thinking was acceptable. But I was
also vulnerable. I went through, um, an acute identity crisis
at a very young age because of what I was
exposed to at home and some experiences. I was bullied

(41:53):
both verbally and physically, and I buckled under that pressure.
You know, I honestly felt that my pain emotionally
and mentally was so intense I could not hold it
on my own. And it's only in retrospect that I'm
able to see that as this vulnerable young person that

(42:18):
was struggling and in so much pain, I tried to
literally beat my pain into other people. That's really difficult
to come to terms with. But it's even more difficult
to see the very same things happening on an even larger scale to our young people today. There's a lot

(42:39):
of fear, and I feel like anger has its roots in fear. One of the big factors that I think is contributing to an increase in extremism, and far-right extremism in particular, is changing demographics. It will be a majority-minority population. Absolutely. Communities or groups who are

(43:05):
used to being in positions of power and affluence are
losing those positions or feeling like they're about to lose them.
What is terrible is that we are a society that
is so quick to say this is mine, I'm not sharing,
you're not taking this from me, instead of sitting down

(43:26):
and allowing ourselves to be vulnerable and allowing ourselves to say, hey,
I'm worried about this. There's nothing wrong with being worried
about the future, but when we allow that to translate into violence or into dehumanization... We always talk about learning
from the past and learning from history. It is becoming

(43:49):
clear on a daily basis to me that those lessons
are almost blocked. You know, I don't know if you
have seen or heard about any of the polls asking
something as simple as or measuring, you know, people's knowledge
about something like the Holocaust. That history is slipping away.

(44:13):
And that's terrifying. When you look back at your own journey,
was there a turning point, when you said, this is wrong, I can't do this anymore, this isn't me?
It took me almost a decade before I had that epiphany.
I was nineteen. Oh gosh, I just gave my age away.

(44:37):
I was nineteen when the Oklahoma City bombing happened. I
had already been involved in the violent far right for years,
and that was the first thing that I really remember
kind of shaking me a little bit and even, um, inspiring me to consider my beliefs or my actions. But

(45:00):
that wasn't even enough. What I was involved with isn't
something that you know, you just wake up from one
day and say, hey, you know, nice knowing you, have a nice life, I'm growing up now, or moving on. So it was another few years. When I was twenty-three, I found myself sitting in a federal detention
center because I took part in a hate crime. You

(45:22):
were arrested for taking part in an armed robbery of a Jewish-owned store, and you were sent to prison for that. Yes, I was. And it was sitting in
this federal detention center as a result of that that
I really started to reconsider my life. There was a woman, um,

(45:44):
a woman of color, who kept looking at me this one day, and in my mind I thought, she's going to do something to me, she's planning something, she's going to try to beat me up. And she asked me if I knew how to play a game, and I didn't, so she picked this game up and came
over and sat down next to me and taught me

(46:05):
how to play. And she did that knowing why I
was sitting in that detention center. She did that, knowing
that I was there for a hate crime. She did
that having to look at the racist tattoos I had
on my body because I couldn't cover them all. And

(46:25):
over the course of the time I spent in prison,
I was treated with kindness and compassion by women who
I never would have had the same respect for had
I met them prior to entering prison. And it changed me.
I mean I was changed at the core of who

(46:47):
I was as a human being. Because I was expecting anger, aggression, hatred,
potentially violence, I wasn't expecting kindness. That just absolutely disarmed me.
And it inspired me not only to change who I
was and to take responsibility, but to come out and

(47:07):
do something meaningful with my life and to make a
difference with the unique knowledge and experiences that I had.
How long did you spend in prison? Almost three years. When you got out, I know you started an organization,
Life After Hate. What inspired you to do that? Well,
I did not start it by myself. I am one

(47:29):
of six co-founders of Life After Hate, um, and essentially I did outreach and some consulting for about a decade before I met my fellow co-founders. In two thousand eleven, I was invited to attend a summit in Dublin.

(47:50):
It was at this conference that I first, um, met face-to-face with others who were similar to me, and by that I mean other former violent far-right extremists.
On the last day, we talked about how inspired we
were by you know, the individuals that we met and

(48:11):
the things that we learned, the feelings, you know, the
emotions that we had, and we all remembered what it
was like to go through this disengagement from the violent
far right on our own. And we also talked about, um, coming back to the US and moving forward together
as a group. And one of the men had an

(48:34):
online journal called Life After Hate, and he said, I
have the perfect name. Let's take the name of my
journal and we'll go home and turn it into a
nonprofit organization, and that's exactly what we did. What is the group's primary focus? Our primary program is called Exit USA,
and Exit USA is a program that helps individuals to

(48:59):
disengage from the violent far right. So there is, um, an element of intervention that is involved. There are aftercare and support networks in place. Um, this is not
a linear process. It is different for every individual and
it can depend on so many different things. You know,

(49:21):
why someone got involved, how long they were involved, what kind of violent far-right group they were involved with. We do not go out seeking them, um; individuals come to us, concerned family, loved ones. We get, um, referrals, so

(49:42):
it can be anyone from you know, a teacher, to
a parent, to a sibling, to a spouse, to, um, law enforcement, to somebody from, you know, a human rights group.
We also get individuals who contact us directly and say, hey,
I have been involved in this for this long, you know,

(50:03):
and for whatever reason, I don't want to be involved,
but I don't know exactly how to get out. What
is the success rate? I mean, are you more often
than not able to extricate people from this world? You know, it all depends, um. There are so many factors involved.
Part of it depends on the individual. It is not

(50:25):
at all uncommon to see a certain level of recidivism
with this when individuals get involved in these types of groups.
It's not necessarily because someone wakes up one day and says,
I decided I hate everyone who doesn't look like me today.
A lot of times there are so many other mitigating factors.

(50:50):
So maybe somebody has grievances, maybe somebody's looking for protection,
maybe somebody's looking for acceptance or belonging somewhere, so when
they find these things, they gravitate to the group, and
they're so grateful for having found or filled whatever was

(51:10):
missing for them that they take on the entire identity
of the group. This affects every aspect of life, the
kind of clothes they wear, the kind of music they
listen to, the kind of food they eat, the people
that they associate with. So imagine having all of that

(51:30):
and then disengaging. You lose all of that, So you
lose your entire identity and need to reshape a completely
new one. So a lot of times we see individuals who disengage. We talk about, you know, the difference between disengagement and de-radicalization. So for someone to disengage, they can

(51:53):
disengage from you know, physically being in proximity to the individuals.
They're not going to events any longer. They're not you know,
going to meetings or out you know, physically recruiting people.
But that doesn't necessarily mean that they've left behind the ideology.
De-radicalization, on the other hand, that's at the cognitive level.

(52:17):
So that's deconstructing the beliefs and the ideology and leaving
that behind. Um, so it's a very, very complicated process. Like I said, it's not linear, and it's really hard to go into success and failure, because then we have
to stop and ask, how are we measuring success? How

(52:38):
do we measure what was prevented? So, for instance, we can look at things like the
cost to society. What if we prevent people from committing
mass violence, you know, like a mass shooting. Is it
successful if we prevent someone from picking up a gun

(52:59):
and going out and doing something like that, but they
maybe still have the belief. It's very complicated and it's
something that I think is going to remain complicated probably forever.
Is there anything you would say to someone listening today
who might be worried about a loved one who might

(53:21):
be attracted to them. I can't imagine people attracted to these beliefs would necessarily be listening to my podcast, honestly, but you never know. What would you say to those people if they're in fact out there listening right now? I would tell them that they're not alone, that, you know, things don't have to

(53:42):
be that difficult. We have a community of individuals who understand from having had the experience themselves, and, you know, we are individuals who have gone on and worked hard to be so much more than just former extremists. You know, we're educated, we've worked on, you know, making amends, and

(54:04):
you know, really going for personal and professional growth. So
I would say, you know, to everyone, whether it's an
individual who wants to come out or a concerned loved
one or family member, change is possible. Transformation is possible.
One of the hardest things that we have to do

(54:25):
is to listen to each other and to try to
understand one another's grievances, not to concede our own position
that these beliefs are heinous and they're wrong. But we are not going to just write off the individual human
being because their beliefs are ugly. People can come back

(54:46):
from this. I know this is true, um, probably more than I know anything else in this life, because I've gone through it myself. So I would just say, you know, there is help out there. We're here to listen. We're not, you know, going to judge, and it's possible we can find a way to

(55:07):
get through it. We talked earlier about, you know, the
Holocaust and the lessons of the past. I have in
the past two years had experiences with survivors who have
said things like, it was never supposed to happen again,
and it's happening again. Why? Why are we letting this happen?
Why isn't anyone doing anything? Why don't people care? It's

(55:32):
horrifying to me that we have individuals, an entire generation that's almost gone, their voices are almost gone from
this world. What they survived and what they went through
no human beings should have to go through. And we
are on the same path again. It sure is enough

(55:53):
to motivate someone to keep going. Well, thank you for
the work that you're doing, Angela, and thank you for
having such a profoundly open and honest conversation with us
about your own story and about the larger story that
unfortunately is unfolding in this country. Thank you. I am

(56:16):
one of the lucky ones. Angela is in fact one
of the lucky ones. Clearly she had the strength, support
and newfound empathy that opened her heart to understand others
and ultimately herself. It seems to me people are attracted
to these groups because they're lonely, isolated and want a

(56:36):
sense of belonging. If you need help or know someone
who does, you can go to Life After Hate dot org,
or you can text or call six one two eight exit.
Thank you so much for listening everyone. I hope you
learned something from this episode. I certainly did. By the way,

(56:57):
if you're feeling overwhelmed by news and information these days, and let's face it, who isn't, you can sign up for my daily newsletter, Wake-Up Call, by going to katiecouric.com. And of course I'm a social animal, so you can follow me on Instagram, Facebook, and Twitter. Until next time and my next question, thanks again for listening.

(57:21):
Next Question with Katie Couric is a production of iHeartRadio and Katie Couric Media. The executive producers are Katie Couric, Lauren Bright Pacheco, Julie Douglas, and Tyler Klang. Our show producers are Bethan Macaluso and Courtney Litz. The supervising producer is Dylan Fagan. Associate producers are Emily Pinto and Derek Clemens. Editing is by Dylan Fagan, Derek Clements, and Lowell Berlante. Our researcher is Barbara Keene. For more
(57:42):
and Lowell Berlante. Our researcher is Barbara Keene. For more
information on today's episode, go to katiecouric.com and follow us on Twitter and Instagram at katiecouric. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to

(58:03):
your favorite shows.