Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
I'm Manny, and this is No Such Thing, the show
where we answer our dumb questions and yours by actually
doing the research. On today's episode, we're trying to figure
out what exactly it would take to change the mind
of a conspiracy theorist. There's no, no such thing, no such
(00:23):
thing. Okay, so I'll start this episode off
with a story that happened to me recently. I was
home in the Great State of Ohio a few months ago.
(00:48):
And you know, when you go back home,
you're meeting up with some people that you haven't hung
out with for a long time, oftentimes people that you
grew up with.
Speaker 2 (00:56):
Uh.
Speaker 1 (00:57):
And so we, you know, a group of friends and I
went to a bar. A couple of drinks in, we're
talking about politics, sure.
Speaker 3 (01:05):
And a lovely conversation to have when you have a couple of drinks.
Speaker 1 (01:09):
Especially in Ohio. Like the people I'm talking about are liberals,
but like they definitely think about politics differently than we do.
Speaker 4 (01:17):
Yes, you know, living in our coastal elite environment.
Speaker 1 (01:21):
Anyway, I can't remember how the conversation got here, but
I made a joke about Pizzagate. Do you guys remember
Pizzagate? Of course. Basically, you guys were at ground zero.
Speaker 2 (01:33):
Yeah, Me and Noah went to college in DC, and.
Speaker 5 (01:39):
I used to work right next door to Comet Ping Pong,
so I knew a lot of people who worked there
at the time.
Speaker 4 (01:45):
Yeah, and then but.
Speaker 5 (01:46):
Yeah, it was crazy because then we were working at
BI when Pizzagate happened, right, and then here I
am editing a video about it, and there's people I
knew and used to work with and see every day
talking about this horrible thing that happened.
Speaker 1 (02:01):
Yeah. Quick summary. During the twenty sixteen election, WikiLeaks published
all these emails from the Hillary Clinton campaign. Conspiracy theorists
went through the emails and claimed that they deciphered some
kind of coded references to child abuse. They decided that
one of the locations of this abuse was in the
(02:23):
basement of Comet Ping Pong, which was a pizza spot
in Washington, DC. And it got to a point where
someone actually went to Comet Ping Pong with a gun
trying to uncover this alleged lair where these activities were happening.
And I think he actually shot the gun at like
a lock that was on a locked door. Fast forward
(02:47):
many years, I'm in Ohio at this bar with some
friends and I make a joke about like barging into
an establishment, quote, like the Pizzagate guy, and one of
the people in the group said to me, you know,
you joke about this, but there's actually a decent amount
of evidence out there that the stories around Pizzagate have
(03:11):
some credibility. Now, obviously I'm a few drinks in and
I decide to press him on this. I say, what
are you talking about? He says that there's a video
on the dark web or something that shows world leaders
participating in child abuse in one of these kind of
basement dungeon areas, like the one they thought was under
(03:35):
Comet Ping Pong Pizza. So I press him a little
further and I ask, you know, do you think that
world leaders who wanted to engage in this activity would
record it on video? And that's kind of when the
conversation went off the rails. He was spewing all these
ideas and if I pushed back even a little bit,
(03:57):
he would just kind of spiral into these different tangents,
and I just, I was blown away by how adamant
he was about all of this, considering how little he
was actually citing.
Speaker 3 (04:15):
So you grew up with this
guy, kind of?
Speaker 1 (04:19):
I mean it's one of those, like, friend
group situations. Like you know, back then in high
school you're hanging out with people because you're in the
same homeroom.
Speaker 3 (04:26):
Yeah, because you don't have a deep connection.
Speaker 2 (04:28):
But did you get the sense from him that this
was his vibe in like high school?
Speaker 1 (04:34):
We certainly in high school would talk about like.
Speaker 3 (04:36):
Loose Change or yeah sure, yeah, yeah.
Speaker 1 (04:39):
So if you don't know listeners, Loose Change was like
a documentary that came out back then about how and
why and the evidence that nine eleven was an
inside job. And I think like believing that nine
eleven was an inside job was like a, like, edgy
thing for me, as like a fourteen year old to think. Obviously,
once I grew a few more brain cells, I realized,
(04:59):
like this is probably not accurate. But yeah, I remember
actually now that you asked me, that we did talk
about that documentary a lot back then.
Speaker 5 (05:08):
Oh yeah, I remember watching it on Google Video. Yes,
it was before YouTube, I just remember the interface.
But that's how Loose Change, I think, was maybe predominantly shared. Yeah,
it came out, which I think was like two thousand
and five, six. But yeah, I remember watching it and just
being like, wow, this is blowing my mind right now.
Speaker 3 (05:28):
Yeah, no one's doing anything about that.
Speaker 1 (05:30):
Yeah, but it made me think about how easy it
is to believe this stuff because they'll they'll say something
that's like, well, you know, the mathematical melting point of
a blah blah blah, and then you see they put
up a wiki screen. We were fourteen. Yeah, yeah, I wonder
if that was just like kind of the starting point.
Speaker 3 (05:49):
Yeah.
Speaker 1 (05:50):
But outside of like that interaction, No, I never really
took him to be that guy. He's like a, like, sports fan.
Speaker 6 (05:56):
Yeah.
Speaker 3 (05:56):
Yeah, he's not saying things on a daily basis.
Speaker 1 (05:58):
It's not on his social media. Yeah, and I and
I think there's there's maybe something to the fact that,
like we were a few drinks in, Like I don't
know that you would just be bringing that.
Speaker 7 (06:11):
Up at work or whatever.
Speaker 3 (06:13):
Yeah.
Speaker 1 (06:14):
A few sentences into this interaction, I was just like, WHOA,
how do you get to this point where you believe
something so wholeheartedly but you fold at even the smallest
amount of pushback. Yeah, because obviously the evidence is
nonexistent. But conspiracy theories like Pizzagate and nine
eleven being an inside job, like, those are very serious
(06:37):
topics and theories that have probably led to real life
harm for people. But if you're someone who believes that,
like the government is hiding aliens or whatever, I'm actually
so okay with that. I would say most conspiracy theories
are kind of dumb and silly and just fun to
talk about. I was actually gonna ask you guys, like,
what are some conspiracy theories that you either used to
(06:59):
believe or you think are more feasible than other ones.
Speaker 5 (07:03):
Have you guys heard about you know, Jim Morrison from
The Doors, the rock band.
Speaker 3 (07:08):
No, I'm sorry, that's some white shit.
Speaker 1 (07:11):
Come on. We're talking to two black guys.
Speaker 4 (07:16):
From The Doors.
Speaker 5 (07:17):
Resident white guy of the podcast. Val Kilmer played him in a
biopic in the nineties.
Speaker 3 (07:24):
But oh okay, I do actually know this: they think he's
still alive.
Speaker 5 (07:28):
Yeah, so he, he was like part of the Twenty
Seven Club, which is like a bunch of rock stars
who passed away at twenty seven. Anyway, there's a theory
that he's still alive and is just living a quiet life
as like some guy named Frank in, like, Syracuse, New York.
Speaker 3 (07:41):
Wow, Wow, did you watch the documentary.
Speaker 4 (07:43):
I think there's a I haven't watched it.
Speaker 5 (07:44):
There's a documentary, and then there's a news story about the conspiracy.
Speaker 4 (07:47):
Yeah, that came out, I think, earlier this year.
Speaker 1 (07:49):
I would love to watch them, because one thing I
will do is fucking watch a lot of these. Like I
don't ever believe in any of them. But there's a
YouTube channel called the I think it's called the Why Files,
and it's just this guy who goes really deep into
random theories and then at the end he tells you
why they're bullshit.
Speaker 3 (08:07):
Interesting?
Speaker 1 (08:08):
What is another one? When I was in high school,
actually I was in the International Baccalaureate program, so it's
kind of like AP but a little bit more vibes
based versus objective kind of fact knowing. But one
of my final papers actually was about the conspiracy that
(08:29):
there were two shooters during the JFK assassination. And they
talk about this in the movie too, that like if
there were only one shooter, the trajectory of the bullet
that hit him would have had to curve or something. Yeah,
that's to say nothing about like who killed him and why,
but like there's like decent evidence that there had to
(08:52):
have been two people that's a fun one.
Speaker 3 (08:54):
I'm pretty anti conspiracy theory as a person. I watched
Loose Change.
Speaker 2 (08:59):
And for you know, forty eight hours, thought that George
Bush did nine eleven and then, like you said,
you like think about it for five more seconds and
you're like, actually, no, that's not what happened.
Speaker 1 (09:10):
Yeah, And like some of it, some of the deep,
like the deeper conspiracy theories have like the nugget of.
Speaker 4 (09:17):
Yeah, well yeah, yeah, that's that's the thing.
Speaker 5 (09:19):
Because even to go back to the nine eleven stuff, yeah, okay,
was George Bush in there planning nine eleven?
Speaker 8 (09:25):
No?
Speaker 4 (09:26):
Is there something to...
Speaker 6 (09:27):
You know?
Speaker 5 (09:28):
Okay, they knew something was gonna happen and did less
than they maybe would have to prevent it.
Speaker 4 (09:32):
Yeah, because of other factors.
Speaker 5 (09:34):
That's more of the believable side of the conspiracy theory,
Like some of these theories are more
fun than the other ones. Like, I think just
the political environment's changed so much, where this stuff is
so much more prevalent at high levels, where like I
used to think Alex Jones is really funny. Yeah, and
like you'd see clips of Infowars and he's talking
about the frogs being gay because of the fluoride or
(09:55):
this or that, and then it's like, well, it's way
less funny when he's talking about Sandy Hook and these
kids not ever existing or whatever.
Speaker 4 (10:03):
It's like, it's just not funny anymore.
Speaker 1 (10:05):
It's not funny because you have to think, You think about,
like what would it take for someone to believe something
like that.
Speaker 5 (10:11):
And then there's actual harm done, you know, versus like, yeah,
if you think the earth is flat, yeah, no skin off
my back and.
Speaker 1 (10:19):
You're an idiot. But it doesn't affect my life.
Speaker 4 (10:20):
That's crazy. But it's like, it truly doesn't matter.
Speaker 2 (10:24):
But to me now it's like, it used to be
funny when Bob was the one dude who believed that. Exactly.
But now the issue is I feel like it's representative
of so much other stuff that's happening, where it's like, oh,
it's not funny anymore that so many people think the
world is flat, because it's like, yeah, it's representative of,
like, it does matter.
Speaker 3 (10:44):
Everyone thinks they're the smartest person in a room.
Speaker 1 (10:47):
I read a good book about the I think they
just did.
Speaker 5 (10:50):
They did a Netflix documentary of it about the Manson murders,
and the theory of this book is that Manson
was like an informant for the government, and then also
he was basically let loose, like they knew he was
doing all these crimes and stuff, but they say they
let him do it to kind of poison the hippie movement.
I think a lot of times these things aren't as
(11:10):
complicated as we want them to be. Yeah, and it's
more like negligence than an active conspiracy.
Speaker 9 (11:19):
Yeah.
Speaker 1 (11:19):
I think you hit the nail on the head there because
it's like, there are such things as conspiracies. The things
that are conspiracy theories are like the more dumb things.
But obviously governments and organizations have conspired to do nefarious
things in secret, and they fuel conspiracy theories. It's just
(11:41):
that very few of these theories ever turn out to
be true.
Speaker 3 (11:46):
Can you give us an example?
Speaker 1 (11:50):
Yeah. So the Tuskegee Experiments, for example, was like a
forty-plus-year-long study with the goal of tracking
the full progression of syphilis in a black community. They recruited
six hundred people into this study in exchange for free healthcare. Today,
the experiments are viewed as at best unethical, but at
(12:12):
worst straight up evil because these medical professionals were essentially
allowing people to die of syphilis, allowing the syphilis to
spread throughout the community, simply so that they could monitor
and document what exactly was going on. The study was
over forty years long, but fifteen years into the study,
(12:33):
penicillin became known as the main treatment for syphilis. And
so these medical professionals were letting people suffer despite there
being a treatment for syphilis. So if you're in that community
and you're looking around and everyone's getting syphilis, and you
know that some study's being done, you might be like,
what the hell is going on? And it wasn't until
(12:55):
nineteen seventy two when a medical professional found out about
this study and leaked it to the Associated Press and
then boom, it's not a conspiracy theory anymore. And finally,
some twenty years after that, President Bill Clinton even apologized
for the Tuskegee experiments because they were conducted by federal
(13:15):
medical professionals. All Right, we did a call out online
to see if we could get some people to send
some conspiracy theories that they believe in and why, and
(13:36):
so before we continue our conversation, I just want to
play some of those for the listeners.
Speaker 4 (13:48):
Hi guys, this is Rich.
Speaker 6 (13:50):
NST heads know me as Rich from episode one. I
think that the US military or US intelligence, so maybe
either like the Air Force or the CIA, is actually
behind or at least fueling the whole idea that Area
fifty one was where aliens were like dissected and UFOs
tested and all of that, because we do know that
(14:12):
that's where the U-2 spy plane was tested. And so I
think that by getting this idea of aliens out there
during the Cold War, it could have saved maybe some
difficulties and tensions that were happening with the Soviet Union
when we were just building stuff.
Speaker 4 (14:30):
To spy on them.
Speaker 10 (14:33):
I think that Benjamin Franklin was a grifter with a
great pr team. I live in Philadelphia, so his image
is everywhere. To a degree I find suspicious. He's credited
with inventing basically everything from electricity to bifocals to modern
day fire insurance. And I don't believe that one little
guy achieved all of this alone. He was never actually president,
but was somehow always there as an advisor, an Elon
(14:54):
Musk kind of fraudster, the ideas guy with a big checkbook.
I just feel like it's extremely possible that everyone
hated him and he was a bad hang, but his
reputation has been scrubbed clean.
Speaker 11 (15:07):
My theory is that Michael Jackson was a castrato. I
think there's a lot of reasons for that. Obviously, you know,
his voice never changed. He was making a lot of
money as a kid with a voice that hadn't changed yet.
I feel like I'm going crazy because it seems very obvious.
Speaker 12 (15:31):
Hey, guys, big fan of your show here. I have
a conspiracy theory. The cosmetics industry is totally evil. That
stuff is essentially junk, synthetic junk. They sell you that,
and then they sell you the remedy to fix the
damage they have caused. The cosmetics industry is selling you
(15:52):
at the same time the poison and the cure.
Speaker 13 (15:59):
I think that Tesla cyber trucks are a government contracting play.
So Tesla does not care about these cars meeting some
big consumer market. They don't need to sell well, they
don't care about bringing down the price. Doesn't matter to them.
Tesla is happy to have these kind of high rollers
kind of test out their weird car, and then in
a few years their plan is going to be to
secure some government contract and make the cyber truck the
(16:22):
official state car of the US.
Speaker 8 (16:24):
That's my conspiracy theory.
Speaker 13 (16:25):
Like I said, maybe not that spicy, but I'm very
attached to it.
Speaker 1 (16:48):
So there's actually a pretty funny study that came out
of the University of Oxford. It took a look at
the viability of some of the more popular conspiracy theories,
like the moon landing hoax for example. Basically, they wanted
to see how many people would have to be involved
in such a conspiracy and also how long it would
(17:11):
take for that conspiracy theory to fall apart, because someone
inevitably would have to let it slip. Essentially, in the study,
this is called the failure time, the amount of time
it would take for the conspiracy theory to kind of
fall apart. The study calculated the failure time by essentially
studying other known famous scandals and seeing how long it
(17:36):
took those to kind of fall apart. All right, So
for the moon landing hoax, all right, we're talking about
peak NASA employment. How many people were working at NASA
at the time, four hundred and eleven thousand employees. According
to the study, it would take approximately three and a
half years before the employees like, you know, let it slip.
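For anyone curious about the arithmetic behind a "failure time" like that, here is a minimal sketch of a simple constant-rate leak model in the spirit of what's described above. It is an illustration, not the study's actual method: the per-person annual leak probability and the exposure threshold below are assumed values, so the output will not exactly match the figures quoted in the episode.

```python
import math

def years_until_likely_exposure(n_people: int,
                                p_leak_per_person_year: float = 4e-6,
                                threshold: float = 0.95) -> float:
    """Years until a secret shared by n_people has probably leaked.

    Assumes each person independently lets it slip with a constant
    annual probability p, so the chance the conspiracy is still intact
    after t years is exp(-n_people * p * t). We solve
    1 - exp(-n_people * p * t) = threshold for t.
    """
    rate = n_people * p_leak_per_person_year  # expected leaks per year
    return -math.log(1.0 - threshold) / rate

# Peak NASA employment mentioned in the episode; p and threshold are
# illustrative assumptions, not values taken from the study.
print(f"{years_until_likely_exposure(411_000):.1f} years")
```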
Speaker 4 (18:02):
That's pretty impressive, four hundred and eleven thousand people in three
and a half years. I would think, not all being paid...
Speaker 2 (18:07):
Well yeah, I mean I'd think about a couple of
months, if that. Oh yeah, that being said, not to
be that guy, but not everyone at NASA needed to
know that it was fake.
Speaker 4 (18:17):
Yeah.
Speaker 1 (18:17):
Yeah, that's a good point. But I would say that
even someone at NASA who wasn't involved with the
moon landing is probably still smart enough to know how
many people it would take to put someone on the moon. Yes,
and so if they knew the team size or whatever, yeah,
you might be able to be like, well that's fishy
if it's only taking five people.
Speaker 3 (18:36):
Yeah, yeah, only on this project.
Speaker 4 (18:39):
Yeah.
Speaker 1 (18:40):
Another conspiracy theory that this study looked at was the
suppression of the cancer cure. So this idea that these
institutions could find a cure for cancer or already have
a cure for cancer, but won't release it because they
can make more money by treating cancer, that would take
three point one seven years to fall apart, according to
(19:00):
the study.
Speaker 2 (19:01):
Well that one, it's like, no, you just say we
have the cure, it's gonna cost you a lot of
money to get it.
Speaker 1 (19:07):
Well, have you guys seen Common Side Effects on HBO?
Speaker 8 (19:11):
No, I need to.
Speaker 1 (19:13):
Yeah, I'll recommend this to our listeners. Basically, it's this
animated show. It's by some of the people who did
King of the Hill, and it's about big pharma suppressing
a magic mushroom that's like a cure all for all
kinds of diseases. It's really funny and it's also really smart.
It makes you think a lot about how the healthcare
system is set up in this country. And honestly, it's
(19:36):
hilarious to hear the voice of Hank Hill being like
a pharma executive.
Speaker 9 (19:41):
They brought me in, a CEO with one job: settle
a huge fucking lawsuit that's coming at us because of
the depression and the suicidal thoughts that were a side
effect allegedly of spumaiva, which you know, no one talks
about how much arthritis we alleviated.
Speaker 1 (19:57):
But fine. So the reason I wanted to discuss
this today is because I recently listened to another podcast
called Alternate Realities. It's by a friend of mine named
Zach Mack. He's a reporter, a journalist and a podcaster.
And the whole show is about trying to convince his
(20:19):
father to stop believing in kind of QAnon conspiracy theories.
Since he has done a three episode piece on this,
I felt like he'd be a good person to get
in here and talk about you know what exactly would
it take to convince someone that their crazy conspiracy theory
(20:40):
is not true? So that's after the break.
Speaker 14 (20:46):
It's called denying us freedom of speech. No, it's not
freedom of speech, it's misinformation. Now, who gets the right
to label it misinformation? When all these things happen, then
you will realize that I'm not as big a crackpot
as you think I am, and that these are not
conspiracy theories. These are reality. You're going to go, wow,
(21:10):
that's amazing. How did he know that?
Speaker 1 (21:14):
That was a clip from Alternate Realities, a three episode
story that you can find on NPR's Embedded series. We're
here with the creator of that show, Zach Mack. He's
a producer, he's a journalist, and he's a homie. He's
a friend of the pod. Thanks for being.
Speaker 8 (21:30):
Here, Yeah, yeah, thanks for having me.
Speaker 1 (21:32):
I listened to the show. It's incredible. It made me
tear up. It made me, you know, roller coaster of emotions.
But for people who haven't listened to it, could you
give us like a little premise, you know, walk us
through what happens.
Speaker 15 (21:48):
Yeah, my father has been falling down the conspiracy rabbit
hole over the last few years, and
a little over a year ago I had confronted him
about it, and of course he didn't want to hear me.
We had just been having these same like circular arguments.
And what he did next surprised me, which is he
challenged me to a ten thousand dollar bet that ten
(22:10):
of his predictions, these very, like, politically apocalyptic predictions that
he believed would all come true by the
end of the year. So we made this bet early
twenty twenty four, and he said, by the end of
the year, these ten things will happen. I bet you
ten thousand dollars that they will. And when they all happen,
you'll see that I know what I'm talking about, and
(22:31):
you have no idea what you're talking about. And I
took a look at the bet, like his proposal, and
I was like, oh, you're on. None of these things...
I didn't believe any of these things were going to happen.
The moment he proposed the bet, I sort of immediately
knew that this would be a good podcast story because
I'm a podcast producer and I'm sort of trained to
look for like the elements that would make a good series,
(22:53):
And so I asked if I could interview him throughout
the year and we could check in, and he agreed
to it, and so that's what the show is, is like,
you're hearing us make this bet, but you're also hearing
how his beliefs are impacting our family, which has
been very difficult, you know, for my mother, for my sister,
for myself, And that's why the series is a little
bit heavy. But he was such a willing participant, and
(23:15):
we joke around a lot, So the series also feels
kind of light because we're joking around, we're betting. Like, guys,
you know, I feel like whenever guys can't settle an argument,
it either comes down to betting or boxing.
Speaker 8 (23:28):
And you know, so it was, yeah, could you.
Speaker 4 (23:41):
Just list like a couple of his... what you
were betting on? Just like maybe three or four.
Speaker 15 (23:46):
Yeah, So some of his predictions were like the US
would come under martial law, that Donald Trump would be
reinstated without an election, that all of these top Democrats
like Barack Obama, Nancy Pelosi, and Joe Biden would be
rounded up and convicted of treason. It was all very
like politically apocalyptic, you know. He thought that an electromagnetic
(24:07):
pulse device would like wipe out all digital communication across
the US.
Speaker 1 (24:11):
And is this officially like QAnon
stuff, or is a lot of it outside of that? Okay.
Speaker 15 (24:16):
Yeah, a lot of it is QAnon stuff. A
lot of it is just like the right wing greatest
hits fantasy, I should say. My father is like a
Christian conservative. Yeah, so it's like kind of playing
in that sphere.
Speaker 7 (24:28):
That sphere.
Speaker 15 (24:29):
But some of it is just like mainstream now, you know,
it's all very mainstream at this point, but yeah, it's
a lot of the greatest hits. None of the things
were like things that you hadn't
heard before.
Speaker 1 (24:40):
Yeah, Well, I was just telling them a story about
like an old high school friend of mine who has
all these beliefs too, about like the Pizzagate and stuff.
But my kind of interactions with him are very brief.
It's like, all right, yeah, you you're kind of crazy,
but uh, I'm going home now. But I can't imagine
what it would feel like for someone to have these
(25:03):
beliefs and it's someone you like grew up with in
your immediate family, like someone you know and love. I'm
kind of curious how challenging actually was it to produce this?
You already knew what you wanted to do from a
blueprint standpoint, but like, how hard was it to execute?
Speaker 6 (25:18):
It was?
Speaker 7 (25:19):
Really?
Speaker 4 (25:20):
It was.
Speaker 15 (25:20):
It was like the hardest thing I've ever had to
do emotionally, Like it was. It was just very challenging
to like write and piece together this series and even
just having to like cut tape of hearing my sister
crying over and over and figuring out how to position
that into an episode.
Speaker 13 (25:36):
It was.
Speaker 15 (25:36):
It was very strange for me as a producer. Normally
I'm working on things that have to do with like
pop culture or tech. I'm never reporting on my own
family and my own emotional state. So yeah, it was
it was quite challenging to make. But it was also
really beautiful because before we started the series, my father
and I were just having circular arguments.
Speaker 4 (25:55):
That were going nowhere.
Speaker 15 (25:56):
Once we agreed to make a show and I had
to interview him, the dynamic of our conversations changed completely
and we were able to have like really deep and
meaningful conversations and just talk without arguing, and I could
just listen to him and try to understand him a
little bit better and how he got here and we
got closer.
Speaker 7 (26:14):
We were not close.
Speaker 15 (26:16):
Before the series, and we got closer than we ever have,
and we've sort of been able to talk about things
that we never could before.
Speaker 14 (26:23):
This is incredibly refreshing to be able to have these
kind of conversations with you.
Speaker 7 (26:29):
Do you feel like you know me better now?
Speaker 11 (26:32):
Oh?
Speaker 1 (26:32):
Absolutely.
Speaker 15 (26:34):
That was like the real benefit, like the positive that
came out of it, but it was super challenging to
make, like, emotionally.
Speaker 2 (26:41):
Yeah, did you feel like the show gave you like
a framework to talk to your dad? Like what was
the difference in the conversations, Like why did you feel
like before you were talking in circles and then during
the show you felt like the conversations were more productive?
Speaker 3 (26:54):
What was that difference?
Speaker 9 (26:55):
Yeah?
Speaker 15 (26:55):
I think because when we were talking in circles, I
was just being like an emotional son, I was just
being like an upset family member, yeah, and trying
to win an argument, totally totally trying to win the
argument and prove how stupid he is.
Speaker 7 (27:12):
You know, it's just, we were just getting like
really riled up.
Speaker 15 (27:14):
Yeah, And then once we started the show, I had
to approach that differently. Right, you can't just like scream
at somebody on the mic. I guess some people can,
but like, that's just not how I conduct interviews. I
wanted to really understand like how he got here, and
I just tried to be as patient and as empathetic
as I could and really approach him with a lot
more curiosity and just asking questions and trying to listen more,
(27:38):
and talking to experts and doing research and finding out
more and more about this world and really just yeah,
taking a genuine interest. I think that that was like
the real difference.
Speaker 1 (27:48):
I want to talk about some of the research and
experts that you interacted with, yeah, to produce the show.
But first we're doing this episode because we want to
find out what exactly it would take to change someone's mind.
You felt like the perfect person to talk about this
because you set out to do that in your show
and I'm not gonna spoil everything. I think everyone should
go listen to it. But it's probably no surprise that
(28:10):
like the ten things that your father laid out didn't happen.
Speaker 7 (28:14):
Oh yeah, he was zero for ten.
Speaker 1 (28:16):
Yeah yeah, far far far yeah. And so I'm curious, like,
after this experience, what do you personally think it would take?
I mean, is the answer simply just like more
of an open mind from the conspiracy theorist?
Speaker 7 (28:33):
I think it depends on who you're talking to and
what they believe.
Speaker 15 (28:38):
When I started this whole process, I sort of just thought, oh,
my dad just has the wrong facts. Let me just
go get them the right facts and replace them, and
he'll be on his way and we'll be set. And
that's just not how it works. That doesn't work at all.
You really have to be almost like investigative and understand
what are these beliefs doing for them? You know, what
(29:01):
are they doing for them? What are the things in
their background maybe that like transpired that kind of got
them to this place. So much of it comes from
a distrust of institutions. My grandfather was a chiropractor, right,
and very embittered towards the medical society and was sort
of like run out of business. So my father grew
(29:21):
up in an anti-vax household with a dad who
you know, hated the medical institution and hated big government.
So it's no surprise that he's also distrusting of these
large institutions and things like that. So I think a
lot of people will have things like that in their background,
and you can follow them down that pathway and start
(29:42):
to understand how they got here, But yeah, you really
have to get under the hood and understand like what
is it, like, what are these beliefs doing for them?
Speaker 1 (29:51):
And yeah, is the idea that like, for example,
not your dad, but like a conspiracy theorist, if they
are getting maybe a sense of community from being on
like all the forums or whatever, is the idea that
maybe if they found a sense of community in, you know,
some other facet of their lives, that they
(30:12):
might not be so inclined to believe the
conspiracy theories? Totally.
Speaker 15 (30:16):
One of the experts I was talking to, who's
a former evangelical pastor, he was talking about how
his parents were into a lot of this stuff. A
lot of the QAnon stuff, and they were talking about
it all the time and how Obama was the Antichrist
and all this stuff.
Speaker 7 (30:29):
And then he said they moved to.
Speaker 15 (30:31):
A retirement home in Florida and like had this really
great community and they're golfing and they're drinking, they're hanging
out with friends, and he just said, they.
Speaker 7 (30:39):
Just never talk about that stuff anymore.
Speaker 15 (30:41):
Doesn't mean they don't believe it, yeah, but they're just
they don't need it. It's it's not of interest. And
so I think a lot of times when you see
like conspiracies, they are doing something for people. Like the
main reasons are access to a community, or it's
Speaker 7 (31:00):
Also like access to esoteric knowledge.
Speaker 15 (31:03):
It's like I know something you don't know, so it
kind of elevates you in status, like I'm up on this.
And you see that behavior all the time, Like it's
not just conspiracy theories, like you see it with hypebeasts. Right,
I got these special sneakers, you can't get these, right?
It's fulfilling the same need, just in a different space.
So there's all these different ways. I think a lot
(31:23):
of times people are just afraid and want to be
told that everything's going to be okay, or they want
to understand the way the world works, and sometimes even
telling them a really scary vision, but like this is
what's going on. It feels definitive. And when things are definitive,
they're a little more calming because you know, you know
the answer instead of just I don't know how this happened.
Speaker 7 (31:47):
I don't know how COVID got out.
Speaker 1 (31:49):
You know that is really interesting because in the interactions
I've had with my kind of like the friend that
I grew up with, there always does seem to be
like an air of like condescension, a little bit of like, oh,
actually you would feel like I felt if you did.
Speaker 4 (32:05):
More reading, Yeah, right, if you were on my level.
Speaker 6 (32:08):
Yeah.
Speaker 7 (32:08):
Yeah, it's like the J. Cole fan. Yeah, yeah, exactly,
But you.
Speaker 1 (32:12):
Have to be on a specific level to enjoy this music.
Speaker 7 (32:15):
This music is good if you're really smart.
Speaker 15 (32:17):
Yeah, it's exactly like the J. Cole fans. Like you
just see it in all these different spaces, and some
people choose to get there through conspiracies, but other times
it's sneakerheads or J. Cole fans.
Speaker 9 (32:29):
Yeah.
Speaker 5 (32:30):
When you were saying that, I was thinking of, like
I feel like when you read interviews, it's like someone
who used to be in like the KKK or something,
and it's like a lot of times it's like it
doesn't seem like they actually even care about race. It's
like they had some upbringing or whatever. And then it's
like they're lonely. And then there's some older guy who's
involved in this stuff and they take them under their
wing and then it's just like boom, seems like a
(32:52):
fun thing to do.
Speaker 4 (32:53):
I guess yeah, it's like it's like more like you're
just bored and like need friends.
Speaker 15 (32:58):
One of the things I've seen is that with QAnon
and a lot of these conspiracies, they cast you
as an active participant in the conspiracies. It's like, hey,
you're not just some dude sitting on the couch. You
are a soldier in a secret war that's being fought
behind the scenes.
Speaker 7 (33:15):
Like you are important.
Speaker 15 (33:17):
So it gives you a sense of importance or it
casts like a lot of times it's like these people
are lonely or you know, maybe they don't feel of use,
and suddenly they're cast as like a protector for their
own family.
Speaker 7 (33:30):
Right, It's like, I know this secret knowledge.
Speaker 15 (33:32):
I have access to this secret knowledge, and now I
can inform you and thus be of use again. And
that's like that, that's a really compelling.
Speaker 1 (33:39):
Yeah, just hearing about it.
Speaker 4 (33:42):
Yes, where do we sign up?
Speaker 1 (33:53):
One question I have is like, there's there are probably
a lot of people in your position, like people whose
parents are kind of deep into this, or maybe
they have a sibling. You know, what would be your
kind of advice for someone who's in this scenario?
Speaker 15 (34:09):
Yeah, I mean, since I've put the project out, I've
heard from literal hundreds of strangers who are like, I'm
going through this exact same thing, and I realized how
ununique my story was, that this is just millions of
households across the country. My advice would just be
try to be curious about what's going on. You know,
(34:30):
to the extent that you can, be empathetic, be
be patient, do your research. And also, I think it
just has to come with a level of acceptance that
you will most likely not be able to change this
person's mind.
Speaker 7 (34:45):
And that's the really.
Speaker 15 (34:46):
Tough pill to swallow, because it's really hard to change
someone's mind, especially since most deep beliefs don't feel like choices.
You know, you think about the things that you believe deeply,
you would probably feel like you don't have a choice
in that matter. I couldn't just tell you, like, Manny,
believe this differently, yeah, right now. It takes
(35:07):
a lot to kind of unwind that. So you sort
of just have to understand, like what is your threshold
of tolerance for this person being in your life, and
you know, being as patient and empathetic as you can
asking questions.
Speaker 1 (35:22):
That's a really good point. I mean, I'm trying to
think of my deeply held beliefs. Like, for example, if
one of you guys were like, well, here's a
couple of reasons why universal health care actually might not work,
I would be like, actually, I hear you, but
I still think we should do it. Yeah, it's like a
foundational belief. And we've
Speaker 4 (35:43):
Been trying to get him to back off this.
Speaker 7 (35:46):
Well, I think you and I have a similar deep belief.
Speaker 15 (35:49):
You couldn't get us off of it. We believe like LeBron
James is the greatest basketball player, right? Like, what would it take?
Speaker 1 (35:55):
It would have to be like one hundred or two
hundred years from now, or someone has built like
a system where we can actually test
it out, where you go back in time and put LeBron James in. Yeah,
that's what it would take. I would
have to watch it. I would have to watch peak
Michael Jordan beat peak LeBron James.
Speaker 15 (36:17):
But then you're talking about one on one, which is
a whole different game. And does Michael Jordan have access
to the diet plan that LeBron is on, and all
the, like, you know, nutrition?
Speaker 1 (36:27):
Yea one hundred Michael Jordan's us.
Speaker 15 (36:32):
Things you believe deeply don't feel like choices, and so
it's it's really hard to change someone's mind.
Speaker 2 (36:38):
So is the goal then to just like make it manageable?
Like you're talking about those people who moved to Florida, right?
It's like they may still believe it, Like the parents
maybe still believe it, but let's not talk about it
at dinner, you know, like you guys, that's your thing.
Speaker 5 (36:55):
Well, yeah, and even like in your family, obviously, the
dynamics between you and your dad is different than your
sister and your dad, and your mother and your dad,
and you know, it seems like you and your dad
are closer than you were before. Your relationship seems in
some ways maybe better but different clearly than his relationship
with the other people in your family.
Speaker 4 (37:15):
I mean, how does that feel?
Speaker 15 (37:18):
So, I mean, just to be clear, when I got
into this, it was to change my father's mind. Like,
I didn't get into this to be more patient with him.
I got into it to win a bet and to
try to change his mind, to pull him out of
this rabbit hole. I think as the year went on
there there was a new level of understanding of how
(37:39):
he got here and why he's here, and also a
deeper understanding of how difficult it is to pull someone out,
and a little bit of a just an acceptance of
what is. But the reality is, like millions of people
are really falling down these rabbit holes throughout the country,
and I don't think we should just all be patient
(38:04):
about it.
Speaker 7 (38:04):
Like, I think this is.
Speaker 15 (38:05):
A real problem that we have to work as hard
as we can to fix. And it's not just hey,
be nicer and more empathetic at the dinner table, right?
Like, that helps me with my father in our
one on one relationship. That does not help the country
and the state that we're in.
Speaker 1 (38:22):
So some of these people with these beliefs are starting
to become elected, so.
Speaker 4 (38:31):
A lot more urgent. Yeah, I think, well.
Speaker 1 (38:34):
Great man, thanks for thanks for coming to hang out
with us. Be sure to go listen to Alternate Realities.
Where can people find that?
Speaker 15 (38:42):
Yeah, Alternate Realities is on NPR's Embedded. Uh, that's,
that's where to hear it.
Speaker 1 (38:46):
Great, all right, we're back. What did you guys think? Well,
you know, obviously I listened to the series.
Speaker 2 (39:00):
Yeah, but it is still a little depressing to hear
that you can do all this research, all this work
and it's still basically impossible.
Speaker 4 (39:11):
Yeah.
Speaker 1 (39:11):
I think I was really I was most moved and
convinced by when he was talking about, you know, replacing
someone's need for community. For example, if someone is in
these like online communities and they basically are there because
they need a sense of community, you might be able
to replace that community with something a little more healthy,
(39:32):
and then they might not be as susceptible to the
conspiracy theories. Now, now, the people he was talking about
that moved to Florida don't necessarily talk about conspiracy theories anymore.
That doesn't mean they don't believe it, but it's not
kind of like ruling their lives in the way that
it was. I mean, I just think about it's like,
forget conspiracy theories. It's like, you know, people double
(39:54):
down on things.
Speaker 5 (39:54):
It's like, well, even if you're wrong about something small,
like how many times have you been in a dumb
argument and you just don't want to say you're wrong
because you're embarrassed, you know, and it's.
Speaker 4 (40:03):
Like it could be some minor things.
Speaker 5 (40:05):
Yeah, It's like it could be some minor thing like oh, yeah,
you forgot to take the chicken out of the fridge
or you know what I mean, it's like something stupid
like that, and just like, oh.
Speaker 4 (40:12):
No I didn't and it's like.
Speaker 5 (40:15):
Yeah, it's like just a dumb thing like that, and
like okay, so this is much more extreme, but like
that makes it even harder to then back down from
because it's like, well, now I've been talking about this
for years, yeah, you know, alienating my family or whatever.
It's like, yeah, I'm not going to just be like
you know what, I read this article.
Speaker 4 (40:33):
Mm hmm, you're right.
Speaker 1 (40:34):
It is so hard. I think it's like something about
it must be just human nature, like something that's so
hard about being like actually I was wrong about this thing.
But I remember some of those early arguments we used
to have at BI when we talk about them today,
I don't even remember what my stance was. That's exactly
and that and that goes to show you how much
I didn't care about the actual argument, but I just
(40:55):
wanted to argue, and like wanted not to say that
I was wrong. Thanks for listening to No Such Thing
produced by Manny, Noah and Devin. Theme song by me Manny.
Shout out to Zach for joining us, and thank you
to everyone who's sent in voice notes. You all concerned me,
but it means the world. Please rate this five stars
(41:17):
on wherever you're listening to this, and be sure to
follow us on our substack at No Such Thing dot show.
Talk to you guys soon.