
October 12, 2021 29 mins

Robert talks with actual expert Alex Newhouse about deradicalizing far-right extremists, a much more difficult problem than many self-appointed experts want to admit.


Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
What's terrible? Why, me. This is It Could Happen Here, a podcast about collapse, and that's appropriate, because everyone's faith in me as a colleague has collapsed today as the result of a series of horrific clusterfucks on my part. I'm late to the meeting, I accidentally left the meeting when they started recording. Just a, just a complete fucking

(00:26):
shitshow. Speaking of shitshows, my co-host Garrison Davis. How are you, Garrison? I'm the one that saved this. I had to send the guest the Zoom call. I know. I'm not even supposed to be on this call. No, you're not. You're not even supposed to be working today. That's not true. Well, but you're not on this call. Not on this call. But here I am, saving. This

(00:47):
is enough. This is enough witty banter. This is a daily podcast, and now let's bring on our guest for today, Monsignor Alex Newhouse. Alex, how are you doing? I'm doing well. Thanks for having me. I feel like I was pulled in off the street, just, like, bundled into a van, and then, yeah. Yeah, we, uh. You know

(01:10):
how people used to get, like, shanghaied, like, captured, allegedly, allegedly, and forced to work on, on boats in, like, San Francisco and whatnot. We do that with podcasts. I mean, that is actually most of what I've done to the people who work on your podcast. I think, I think I've had everyone from your show on our show now, and it has been very much

(01:31):
like I'm just pulling them in on a string. Speaking of which, Alex, you are one of the hosts of the Terrorism Is Bad podcast, a very, uh, controversially named podcast. Uh, and you work at the Middlebury Institute of International Studies at Monterey's Center on Terrorism, Extremism, and Counterterrorism. On, not of, that would be a different center. Very important,

(01:53):
very important. Yeah, and we're not, we're not bringing you on to talk about how to make explosively formed penetrators. Not, not this time. That is someone else. Yeah, but you are also, you were also an actual games journalist. Yes, yeah, I got my start in this weird space. How do you, Gamergate, how do you feel about ethics in

(02:14):
the games journalism industry, Alex? Uh, always been fine. Like, yeah. Alright, anyway, that's the end of that. Yeah, I do want to actually start there, Alex, because you and I both have something in common, which is that we, we got our start writing in a field that's wildly different from consulting

(02:36):
with, like, governments on terrorism. Like, for me, it was, I wanted to write, like, dick jokes on the Internet, and I just, like, stumbled into a bunch of ISIS propaganda that most people weren't aware of, and, and that started me, like, lecturing at universities and shit. And for you it was Gamergate. I'm interested in kind of you telling your story a little bit to start us off. Yeah, so I was, I was, during undergrad, I interned every

(02:59):
summer at GameSpot, a video game website you may have heard of. It's one of the two big ones, along with IGN. Um, and when I was doing that, I was, so this was, like, right in the, at the beginning stages of, of Gamergate really popping off. And what ended up happening is a lot of the people I worked with, a lot of my colleagues and

(03:19):
friends were just in the blast zone. They were just targeted by the absolute onslaught of, of harassment. Um, and I just had a curiosity, started looking into some of those people who were, who were targeting my friends and colleagues, and it ended up being a lot of the people that we're still talking about today. Uh, you know, it all, all rolls back up to the Breitbart metropolitan area,

(03:40):
if you will. And, um, I don't know, uh, the thing that made me want to, I mean, obviously I've been aware of your work for a while. The thing that made me want to specifically bring you on is you started on a new project to create, like, a video game that, that will hopefully have an ability to help, like, deradicalize people. And I'm, I'm not entirely certain, like, of the details of the project, but I

(04:01):
think it's a fascinating project, because, as, as you know all too well, a lot of this stuff started in gaming, not as a result of anything specifically about gaming, but the kind of, like, socialization that occurs in those spaces and the kind of, like, different communities. And it's been, like, we have, going back to the nineties, evidence of, like, different Nazi groups on the early Internet, like, talking

(04:22):
about, like, these are specific, specific groups and subcultures that, you know, will have an easier time radicalizing and whatnot. But yeah, I'm interested in kind of what actually is going on with this project, um, and, and how you think it's going to look at this stage. I understand it's pretty early in development right now, so I'm not expecting, like, you know, an E3 walkthrough. Yeah, an E3-sized reveal, I wish we had that. Um. Yeah.

(04:45):
We won a grant from DHS and FEMA, their, their terrorism prevention grant program, this year. We just got awarded it, like, literally two weeks ago, so I have not even started work on it at all. But the project will be a collaboration between my center and a nonprofit games development company called the iThrive Foundation. And

(05:06):
basically what we are going to do is, like, build digital scenarios, digital narratives that can be engaged with, uh, within classroom settings. So we're targeting high schools for rolling this out, uh, and the idea is that we're going to give students the ability to take on roles that empower them to better understand how extremism and radicalization work as mechanisms, which will hopefully, the idea is that it

(05:29):
will, it will improve resilience and, you know, civil integrity and all those fun buzzwords within, within high school communities. So we're not necessarily trying to deradicalize already radicalized people, but we're really trying to build community awareness, community resilience to, to radicalization pathways. I mean, this is something I think about constantly, because I get asked this a lot.

(05:51):
You know, I'll get, I'll get emailed questions from people, sometimes in as much detail as, like, hey, I'm, like, a teacher, and here's some things this kid in my class has said, or something he put in an essay, and, like, I'm growing really concerned about him, and, like, what do I do? And my usual answer is, you know, there's a couple of people who I respect that I'll try to direct them to. But I, I don't, I'm pretty good at how people get radicalized. It's something I

(06:13):
spent a lot of time studying. I don't know how, how you, I have trouble figuring out how to break down these pathways, because, like, right, the, the default for a lot of people and for a long time has been, well, you deplatform, right? You, um, you get them off of whatever. And there's, there's, I do certainly think there's, there's utility in that, but there's also, you know, the toothpaste tube effect, the fact that when you, you squash these popular areas where they're able to

(06:36):
spread, they filter off into increasingly isolated communities that develop new terms, they find ways to hide it, and that actually increases, you know, it may, it may reduce the number of people who get radicalized, but the people who remain just get more and more extreme, because they're even more isolated from, you know, everyone else. And I don't know, how do you, how do you, how

(06:57):
do you break that, that radicalization cycle? Like, how do you, how do you stop that shit before it gets, you know, to a tipping point? Yeah, I mean, in general, I'm with you, I'm pretty skeptical of a lot of deradicalization strategies. Uh, and it's, it's, like, an incredibly difficult task to, to pull someone out who's already going down these pathways. And then, like you said, it's also

(07:19):
an incredibly difficult task to make sure that when you are disrupting the radicalization networks, they aren't just disappearing off to some other corner of the Internet, which we know they're doing. Like, one of the reasons why we're, we're working with a video game, video game company is, over the last few years, we've noticed a big migration onto video game platforms, especially big, social-based video game

(07:40):
platforms like Roblox and Minecraft, which are, like, not even remotely prepared to deal with, you know, very well-developed, sophisticated radicalization networks. They have moved over there both for organization and radicalization reasons, um, since mainstream companies have started taking more of an interest in deplatforming them. Uh,

(08:00):
and so we're ending up, like, pretty wildly unprepared for this sudden onslaught of extremists being right in front of kids as they're playing games, or, you know, teenagers or even young adults. So our idea essentially is to use that language, the same language that extremists are trying to adopt, the structures of video games, via the sort

(08:21):
of interactivity there, to better communicate, uh, the, the impacts of extremism, what it looks like, how to identify it, and hopefully how to avoid getting, you know, falling into the traps that are laid, uh, for, for unsuspecting people. One of the issues, and I'm curious your thoughts on this, because we, we, we talk a lot about, like, I

(08:42):
think people have become increasingly aware of how bad Facebook in particular is as a problem with this. It's, it's really where a lot of the boogaloo movement grew, too. And now this stuff is coming out about, like, the data Facebook has had on, just, and this isn't, this isn't, this is adjacent to radicalization, um, the mental impact that it's been having on teenagers, right? Like, just how bad it is for people. And, um, I'm wondering, like,

(09:07):
how do you scale this stuff, I guess, is the question. Like, how do you actually, how do you make the social internet less dangerous? Yeah, I mean, that's, that's going to be extremely tough. And we are even starting very, very small, like, we're building, we're building a narrative platform to target three high schools right now. Um, but the hope is that ultimately what we can do is

(09:28):
build a toolset and, and a platform, like, literally a game platform, that can be used by high school teachers in high school classes throughout the country or throughout the world. Um, the idea will be to hopefully make a new sort of package of different methods and interactive experiences that can be reused into the future. But it

(09:50):
is one of the big open questions that we will hopefully come to some sort of answer for throughout the project, about how do we actually scale this up. Um, but, you know, in general, it is, again, like, one of the biggest open questions right now. One of the reasons why I'm so skeptical of a lot of derad and CVE techniques is they try to go for scale over effectiveness, um, when in reality, one of the

(10:14):
best and only deradicalization pathways that we know of involves people that you know and I know going out and meeting with these people one-on-one and having intensive, frequent communications with them. So, um, there's, as far as we know, there's not a good answer right now. This is a huge area of research right now, because we just straight up do not understand how to scale

(10:35):
up, um, radicalization prevention and deradicalization. I mean, and, you know, what you're trying to do, and, like, reaching kids in high school with something that they're meant to be consuming while they're in school is even such an additional challenge, because I think you and I are both young enough to at least remember that, like, almost nothing that you put before kids in that context in

(10:57):
a school gets through. I can, I, I can think about, like, anti-drug programs and stuff when I was a kid, and how ineffective they were. There was, I had one, one effective anti-drug, like, speech by a teacher, and it was just a teacher whose son was part of this, this, there was this one night in Plano where, like, six kids OD'd on heroin. There was a big Rolling Stone article about it, it was a very famous moment, and her son was one of

(11:19):
the kids who nearly died, and she was, and she, like, just explained, like, physically what happened to him and begged us not to do heroin. And that actually did stick with me. I've never, never shot up anything. Um, but, you know, like, a lot of it doesn't work. And I think part of why, it's this thing I talked about when I tried to explain, like, why ISIS propaganda was so effective, it's, it feels more authentic

(11:42):
than the, than the counter-narrative, right? The counter-narrative, because it's, it's usually focus-grouped, it's coming as the result of, like, some sort of government initiative a bunch of people worked on together, it feels focus-grouped. As opposed to, there's something inherently more compelling about something that just, like, feels like somebody who really gave a shit, cares a lot, put this thing together, even if it's terrible.

(12:03):
And I, that strikes me as a real, because if you're going to be scaling something and trying to reach a lot of people, it's going to have to be something that is put together at scale by an organization. And how do you, I mean, I know this must be on your mind as you're trying to figure out how to craft this thing. I'm just interested in your thoughts on that, really. Yeah, I mean, that exact challenge is what led us to proposing the project, the project that

(12:24):
we are. So the idea behind it, or the, the impetus behind what we did, what we proposed, is, um, the exact problem of students just don't listen to people, whether that's anti-drug programs or anything like that. Often, my, uh, my, uh, feeling about it is they are often resistant to it because it's very negative. It's very,

(12:46):
don't do this, don't do this, I'm setting up boundaries for, for kids and adolescents to act within. It's all very declaratory, very, you know, commanding. Um, there's no, there's no sense of treating kids like people who have control, who have interests, who have motivations. It's all attempting to

(13:06):
restrict them. And so the idea is that we're going to attempt to build a game platform that actually empowers students to operate within roles that have control, that, that have something to say, to give them voices, to give them, um, that sort of feeling of being an established, um, person within a, within a certain scenario. Um, the way that I've been thinking about it is that

(13:28):
we're basically merging video games with, like, the structure of a Model UN conference or something like that. Hopefully it will be a little less nerdy than the Model UN conferences. But that's the idea, of giving people power to make decisions, uh, and, and treating them like actual, you know, operating humans. Yeah, I, uh, I'm wondering, do you have any kind of

(13:49):
models that you're looking at when you think of, like, something that you see as, as kind of worth, I don't know, emulating, maybe the wrong word, but, like, oh, these people I think got it right and, and this was effective? Like, or is this really a situation where you feel like we're kind of in the fucking wilderness here, there's not a lot of great models for what's effective? We are very much in the wilderness. I was expecting you

(14:12):
to say that, like, so much of CVE and derad work of the last ten years has been directed towards trying to essentially recreate the, like, the DARE model or the anti-drug model, just in a different field. Um, and so we're going to be pulling from scenario builders and, like, Model UN and debate and, like, all of

(14:33):
these different models that seem to at least work to get kids engaged with, like, operating in that sort of situation. But it is going to be pretty, I mean, at least from what I understand, it's gonna be pretty new. We're going to be out there really flying blind for a lot of it, um. But we will, you know, we have a pilot phase built in to try to beta test this with, with, um, some of the students.

(14:56):
We're incorporating students and instructors in the actual creation and development stage. So that'll be another hopefully good part of this. We'll, we'll give some students experience with the game development process, which I think will help engage them as well. That strikes me as a particularly good idea, of, like, giving, and also just giving them some agency. So it's not

(15:18):
like, this is a thing that you are forced to consume; it's, like, this is the thing that you can, like, learn something from. I think that's, that's very important. I'm interested in how you see, how you see this, because, like,

(15:41):
again, we kind of both got in around the same time. Gamergate is when I started paying attention to radicalization, too. How do you think it's changed since then? How do you think, like, the nature of, of how particularly, like, younger people are being radicalized has changed? And I guess I'm also interested because I get the feeling that back then it was mostly younger people getting radicalized, and that's no longer the case. Just as we're talking, I

(16:02):
just came across a video on Twitter of a group of anti-vax protesters chasing parents and children away from an elementary school and screaming at them that they're raping their kids with a vaccine. So clearly the problem has expanded. But yeah. Yeah, and honestly, one of the things that keeps me up at night is, when we start, if, you know, knock on wood, we're able to roll this out to more schools, we're going to run into some probably

(16:24):
very resistant parents who have radicalized. Um, yeah, I mean, the big one is, like what you said, like, the radicalization demographics have vastly expanded to incorporate so many more different types of people, so many more ages and even ethnicities and genders. Um, but what we do know is that the hardcore of the, of the violent extremists are

(16:46):
still targeting adolescents. Um, we know accelerationists, for instance, hang out and try to essentially blackpill a bunch of teens, especially autistic teens, especially teens with mental health issues, uh, and bring them into a more violent, more accelerationist posture. Um, so, I mean, I think that has sort of stayed

(17:08):
constant throughout all of this. One of the big, uh, changes has been platforms. You know, ten years ago, it was much easier for a neo-Nazi to operate openly on YouTube or Facebook, but that has thankfully changed. Um, but they have spread out into, like I mentioned earlier, they've spread out into video games. They've spread out into

(17:29):
other sorts of platforms where the social aspect isn't necessarily the first part of the platform, but rather a secondary aspect to it. And they try to engage, um, adolescents on their own turf, on, you know, in a Roblox game or in a, in a video game forum out there. It's not even enough to say, it feels like the

(17:49):
task of reducing radicalization, or, or, not even mentioning pulling it back, just stopping the process, it feels not just like whack-a-mole, but like whack-a-mole when you're surrounded by moles. Um, and I guess that is the thing that keeps me up at night the most, too, is that, like, the problem has gotten, because of how social media scales, I think, in large part, has gotten

(18:10):
so much worse than it ever was. And the, I see these crowds of adults, you know, assembling in, you know, places like Los Angeles and showing up outside of schools to harass people, and, like, I don't know what, I don't know what to do about that. Like, part

(18:30):
of me thinks, um, part of me thinks that the only effective long-term answer is to mobilize a larger number of people to show up to, you know, not necessarily confront those people, but make them, make them feel outnumbered, and maybe they'll stop, and that will start a process

(18:51):
where they, they alter their thinking. Like, I'm thinking kind of back to some aspects of the civil rights movement here, right, where you would have these people show up at schools to try to stop integration and whatnot, and they would be opposed often by, by larger groups, and they would see the size of the marches in the street. And, like, I don't know, I don't even know if it works that way anymore. Like, knowing that, you know, ten to one, people think your stance on vaccines is

(19:14):
stupid and they're willing to show up to, like, yell at you, if that would do anything. But I don't know what, I don't know what's going to do it. Like, I guess I'm asking you, like, can you, have you figured this out? Because I don't know what the fuck to do. Um, but it's, it's, it's not, you can't, we can't close our, obviously you're someone who's trying to confront it directly, but we certainly can't keep ourselves, like,

(19:36):
just pretend it's not going to get worse, right? No, totally. And, um, you know, I often feel like it's almost too far gone. And, you know, frequently I worry that we've already passed some sort of, you know, point of no return on radicalization exploitation of social media. But one of the other things I've also recognized is that

(19:57):
when you're in a space that is dedicated to one type of, confronting one, one method of confronting extremism, very often they will forget about, or deprioritize, or, or even ignore the other types, the other methods. And one of the tasks before us, I think, before we throw

(20:18):
up our hands and give up, is trying to tie together all the different facets of resisting extremism, from the hardcore confrontational doxxing and showing up in the streets counterprotesting, which I think is an essential part of it, to, um, working as hard as we can to try to get tech companies to, to realize what's going on, uh, and then also on the educational side, like what we're

(20:40):
doing with this, with this project. Um, some of the things that make me at least a little bit optimistic is that there is obviously inertia, both intentional and unintentional, at tech companies, but frankly, they are still extremely far behind in understanding how to even do deplatforming on their platforms, how to even identify who to deplatform. Like,

(21:01):
the majority of tech companies are still making content moderation decisions on a piece-by-piece basis, specifically looking at content. Very few of them are doing actor analysis, very few of them are doing social network analysis, very few of them are looking at even the links between, like, off-platform violence and on-platform content. Like, it's the, they

(21:23):
are still very much in the Stone Ages when it comes to content moderation. And that's so, so key when I think about, like, what actually would reduce the harm that these platforms are doing at scale. It's focusing on the actors, um, and, and not just, like, the individual actors, but the patterns that let you tell whether or not someone is, like, that same actor who's

(21:43):
kind of, like, putting on a different hat, so to speak. Um, are you aware of, like, is there any, I, because I have not seen that happen yet. I haven't seen Facebook take that seriously, um, and I have, I have spent some time there. I haven't seen, certainly haven't seen Twitter take that seriously. Um, I haven't really seen, I

(22:05):
don't believe TikTok is, like, they're, they're, they're just, um, like you said, they're going after, they're taking it on a piece-by-piece basis, which is never, there's too many pieces, that's never going to handle the problem. Yeah, I mean, TikTok is crawling right now. They are in their infancy. Um, they don't, they don't have a data-sharing, uh, any sort of data-sharing systems set up

(22:27):
for researchers or anything like that yet. I've seen optimistic signals. So, I think Facebook's approach to QAnon and the boogaloo movement over the past year has been probably the best, the most positive development we've seen on the content moderation front, because they took an actual network-based approach to it. It was hamstrung by a variety of different policy decisions,

(22:48):
but it was still, from, like, a, from, like, a mechanics standpoint, the most sophisticated one any of the companies has actually talked about openly. Uh, and YouTube has followed in their path. They've started taking more network approaches. Um, they, they've taken moderation action against QAnon on a similar basis. But the thing that I want tech

(23:10):
companies to start looking at is applying a lot of the techniques they're using for disinformation and info ops work to extremism and radicalization. It's very similar, but right now it seems to be just easier politically, or just they're further along with doing the large-scale network analysis approaches on disinfo. Um, like, Twitter is doing a lot of that,

(23:32):
but it's all on information operations and fake info, yeah, as opposed to, yeah, people. Yeah. And I, I worry too, because I'm paying attention to, kind of, you know, you have this whistleblower from Facebook, and how that's being politicized, right, how the right is kind of coming at this from a, they're trying to say, like, as Ben Shapiro said, they're trying to, to, um, to censor

(23:56):
alternative media voices and the like. And I, I worry tremendously about the politicization, because, number one, it means that at best we've got, like, three years to get something together before, you know, who knows who winds up in the White House next. But also, if it's just this thing of, like, veering between who gets, who gets

(24:17):
paid attention to, um, based on, like, what is politically viable for Facebook, we're never going to solve the problem. And I, I think I agree with you for the most part on Facebook's response to the boogaloo movement. I mean, I guess I think the problem was that by the time they developed a functional set of responses to it, um, it had metastasized, it had grown, it

(24:40):
had grown strong enough to exist on its own, and a lot of people had gotten exposed. What do you think is the actual, is reasonable to expect in terms of response time from these people? Because with the boogaloo stuff,

(25:01):
it was about, I want to say, about three months, maybe, well, no, it was more like five. It was about five months, in that, from, like, December of twenty nineteen was when I started really noticing it, and then, like, you know, May, when, when stuff really kicked off with the George Floyd protests, was when you started to see action taken, at the tail end of May. Yeah. So I guess

(25:21):
that I'm wondering, like, what is the half-life of this shit? Like, how quickly do you need to crack down on this stuff before it gets to be impossible to contain? Uh, yeah, I mean, that's the biggest limiting factor on the effectiveness of, uh, content moderation in general, but also in particular these new approaches that the tech companies

(25:43):
seem to be experimenting with. Um, my understanding is that part of the, so, I'm not, I'm not defending Facebook by any stretch, I'm not here to be the Facebook rallying group, but my understanding is that they literally did develop an entirely separate approach to taking down the boogaloo movement, so that explains at least a little bit of the delay.

(26:03):
But hopefully, you know, my optimistic side hopes that they will be able to apply it more quickly in the future. Um, the problem is, a lot of the network approaches that have been developed have, like, these very high thresholds for attribution. So it has to be, like, a dedicated network that has crossed the line into criminal activity and

(26:23):
is actively calling for, you know, political violence on, like, a network level. And that, like, we all know that that is, that is, like, the end goal or the end point in... Exactly right. Like, that is the terminal point of the development of these extremist networks. So, you know, one of the, one of the things that we're

(26:46):
working on is trying to figure out a way to convince tech companies that you can and should take action earlier, before it reaches that point. And it's going to be a mosaic of things. It's going to be combining violent extremism with hate speech, with even, like, CSAM, child exploitation stuff, with, um, all of, you know, criminal, criminal conspiracy, network policies. All of those things need to be sort of thought of as pieces in a single, big,

(27:09):
overarching umbrella that we can use to take down networks earlier on. But, you know, it's a, it's a, that's one of the biggest tasks, is just convincing them to think about it much, much earlier. Yeah. Um, all right, well, that's, I think, most of what I wanted to get into today. Is there anything else you really wanted to, like, kind of talk about while you're here? Um, those

(27:31):
are the, those are the big ones, for sure. We will hopefully have more to talk about very soon on how we're approaching this project. Um, it's going to be a pretty big project. It will take two years to implement, but, um, we're pretty excited to see what comes out of it. Yeah. Um, well, people can find you on

(27:51):
Twitter, it's just at Alex Newhouse, right? Alex B Newhouse. Alex B Newhouse, yeah, at Alex B Newhouse. Um, they can check out where you work at, at CTEC MIIS. Um, and yeah, I'm, I'm excited to see, well, maybe we'll have you back on when you, um, when

(28:13):
you, you actually put out the game, but I'm really interested in looking at that. Oh yeah, what was the last thing you brewed? Oh, I brewed a red IPA, and I'm currently brewing three gallons of apple cider. Oh, nice. We just, um, we juiced ten gallons of apples and pears that I just kegged after almost four weeks of fermentation. That, I know. I've been, I've been looking at, I've been looking at apple mills, like, apple presses,

(28:35):
and yeah, I should, I should just buy one. We found one to rent, um, so it's just, like, I don't know, thirty bucks for the day. Uh, and we just gathered up all the apples on the property. But it, it was rad, definitely very lovely. Yeah, we were juicing all of the apples the day that, um, Tiny got shot at that protest in Olympia. So it's just, like, looking at Twitter, seeing there's been a shooting at a

(28:58):
protest, and being like, yeah, I'm glad I'm not working today. Yeah, I'm glad I'm not working today. Idyllic afternoon pressing apples. This is, this is a more enjoyable use of my time right now. All right, well, Alex, thank you so much for being on, thank you for what you're doing, and thank you all for listening. Go with, you know, whoever,

(29:19):
whatever deity, up to you. It Could Happen Here is a production of Cool Zone Media. For more podcasts from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here, updated monthly, at

(29:40):
coolzonemedia.com/sources. Thanks for listening.
