Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_02 (00:31):
I don't know why I
end up on threads because it's
not really You love a thread.
It shows up on my Instagram algorithm and then I get sucked
into reading it.
Yeah.
This one in particular, this morning, that I saved.
She said, I am this close to unfriending my older sister.
She was quiet about her political stance until Charlie
(00:52):
Kirk died.
And then she showed her true colors and they clash with mine.
Then she and my other sister laughed about the No Kings
rally, which I attended.
They think that I'm a freak.
Trump destroys more than buildings.
He wrecks families.
Fuck him.
And the comment section is like, uh, you're wrecking your family because you can't just agree to disagree.
(01:13):
Imagine turning against your own flesh and blood in order to feel somehow justified in your political stance.
Um, blah, blah, blah, blah.
This says, my sister's opinions clash with mine.
I'm this close to unfriending her.
Trump wrecks families, but look at that.
Like, why are you blaming that on Trump?
SPEAKER_00 (01:33):
Like, clearly,
you're the one with the issue.
Well, and again, I I think that there is like something to be said about maybe a level of narcissism that has come with this political polarization and and divide because it's like, I'm right, you're wrong, I'm not gonna talk to you, I'm gonna
(01:54):
exile you.
You're a bad person now.
I've known you for 20 years, you've been my sister.
Right.
Like we've loved each other, you know me like to my core, but now
I can't talk to you.
And it's again being so your fra— your ego being so fragile that you're adamant that you're so right.
And maybe if more people on all sides could be like, you know
(02:19):
what?
I might not be right on everything, and my side might
not be right on everything.
And um, maybe I shouldn't wrap my identity up with this
political affiliation.
Let's think about the fact that uh maybe both sides are super
corrupt and evil.
And um yeah, I shouldn't be so tied with that.
SPEAKER_02 (02:42):
I think it's
refreshing to walk away and say
I affiliate with no one.
And I think it's an assumption that when we talked about are we
starting?
SPEAKER_00 (02:53):
Yes, like start.
SPEAKER_02 (02:54):
I think it's an
assumption like an assumption
that people may maybe they did, maybe they didn't when our last
few episodes came out.
That like now that we have spoken out about leaving the Democratic Party, they assume we are now the other the other.
Yes, and that's kind of where I want to lead this conversation
(03:18):
today, because there aren't just two options.
This is not a black and white thing, and it is like incredibly refreshing to step out from both sides and to see the hypocrisy on both sides, to see the lunacy on both sides.
(03:39):
But and I say this with tough love, but from where I'm standing, it seems like it's more one side than the other.
Is that bad to say?
SPEAKER_00 (03:56):
No, I don't think
it's bad to say at all.
And I think it's on both.
I think I think that where we're coming from is we were that side, and I think that's why we're talking about it because
we experienced these things.
Is I used to be you.
I used to be you.
I used to think that I was right and I was a good person, and if
(04:19):
you voted differently, not only were you wrong, you were now a bad person, and I judged you and I unfriended you and I did all
these things.
And again, I said it in a previous episode.
It took a mushroom journey to be like, oh my God, I'm the
problem.
And oh my God, I don't know as much as I thought I did.
And oh my gosh, my black and white thinking, yeah, that was
(04:39):
like a trauma response that um maybe helped protect me at a time, but it wasn't, it was doing me a disservice in present
day and to unlearn these things.
And so it's like, even when I speak out about the Democratic Party and people are like, oh my gosh, well, now you're the other
and now you're a Trumper.
(05:00):
And I'm like, yeah, you can try to use that, but my identity is not is not affiliated with either side.
I'm able to like be able to point out and see what I don't like and the good things about both.
So like you're trying to insult me, but it's not an insult.
Right.
I'm in the middle and I think we need we need more of that.
(05:21):
We need more of the bridge builders.
SPEAKER_02 (05:23):
Well, I also think
that you and I are very sure of
who we are.
That too.
So what you think about where I stand doesn't bother me.
SPEAKER_00 (05:32):
Yeah, and you don't
know me because we are not
willing to have a conversation about it.
Now I now if you're wanting to have a conversation about it and have some like thoughtful dialogue and you are able to not just like I don't like talking to people where no matter what I say, no matter what is presented to them, they're not changing
their mind.
(05:53):
They're not being able, they're not able to hear like different points or different perspectives.
Like they're not like nuance.
They can't hold nuance.
It's this is what I believe.
They're very dysregulated, tied to.
This is bad.
I'm good, you know.
It's impossible.
It's impossible to have a conversation.
I've like kind of tried.
(06:14):
I do try to like I do talk to people who are very on the right and very on the left, and I try to be like the devil's advocate.
So if I'm talking to somebody on the left, they automatically assume that I'm a Republican because I'm trying to like get them to maybe get out of that echo chamber or their binary
thinking.
But I also do the same thing if I'm talking to somebody on the
(06:36):
right too.
Like, and there are things that I don't agree with on that, because it's like there's there's gotta be the people in the middle who are holding this shit up because it's going so
far the other way.
And I've I I do think that people who are in the middle get kind of hated on the most because the people who are very
(06:56):
on the right and very on the left, they just don't talk to each other, they just avoid each other.
And so it's like to me, I feel like it's like the people in the middle who are like, well, you know, have you got a little bit of a little bit of a little bit shot from both sides?
SPEAKER_02 (07:08):
Right.
SPEAKER_00 (07:08):
Have you thought
about this on your side?
Like you guys are getting a little, well, it's going a
little too far.
The marker keeps moving, and some people aren't okay with it.
And I think that's okay to have a valid discussion about that and be able to hear what that person has to say, and then you know, you're on this side and you're, you know, hooting and
hollering a little bit too loud.
And have you ever thought about this perspective?
SPEAKER_02 (07:29):
Have you noticed
since we have kind of stepped
away from identity politics?
Have you noticed that sometimes you can feel who is willing and capable of having a conversation versus someone who isn't based
(07:51):
on your other conversations?
SPEAKER_00 (07:53):
Oh my God,
absolutely.
SPEAKER_02 (07:54):
So I think first I
want to talk about somebody who
reached out to us um about a previous episode and something
that we said.
And I don't want to go into details, but she's been a listener for a long time and and and we've had multiple
conversations with her.
Like, we love this person.
(08:16):
I don't want to throw her name out there because this is a
personal story.
She wrote us and she said, I listened to what you said, and I kind of have a different perspective.
Are you open to hearing it?
We're like, Yeah.
SPEAKER_00 (08:29):
And I love when a
conversation starts out.
Absolutely.
Because I'm like, what are you gonna drop?
And I I I want to hear what you're you have to say.
SPEAKER_02 (08:37):
Yeah.
SPEAKER_00 (08:38):
Because I'm I'm open
to changing my mind.
SPEAKER_02 (08:40):
I'm open to it, and
again, maybe I'll get her
permission one day to like maybe talk about her perspective and
what it was on.
But I think we had a very good back and forth dialogue because she gave this different perspective.
And I was like, you know what?
Actually, I didn't think about that.
(09:01):
And I think that that should be a conversation that is had when that specific thing is brought up because there is nuance
there.
This isn't a black and white situation.
There are situations where the context matters, where it's not as easy, the answer isn't as obvious to some people.
But what I'm saying is she was like scared to say that to us.
(09:28):
Yeah.
And she said that.
And she was like, thank you for being a safe place for me to
open up.
You guys have helped me be able to talk about this in a
different way.
She bought The Third Perspective book so she could learn how to
speak up more.
I love that.
Um, and I was like, this was such a productive conversation.
(09:52):
I learned something in it because there was like a back
and forth.
But that's what I mean.
Like putting feelers out and knowing who is safe to talk to
and who isn't.
And I hope, I really hope that our listeners know that we are a safe place and we are open to hearing other opinions and other
(10:15):
sides of things.
Like, I feel like if anything, we've proven that like throughout the years that we're kind of open to anything.
SPEAKER_00 (10:23):
And also, like if
there is a listener who is
listening to us and you say something or we say something and it's it's you don't agree with it, you think it's wrong, and you have like you're willing to have a valid discussion of of why, um, and pr— present information.
We do not claim to be experts.
(10:43):
We'll hear it.
Like we'll hear it.
SPEAKER_02 (10:45):
Sem tents will
change our minds.
Yeah.
We've done that a lot.
Yeah.
Yeah.
So we're we're willing to have conversations and and we're willing to change our minds because isn't that kind of what life is about is being presented with new information and maybe under having a different walking away with a different understanding and knowing that it's not always it's not always
(11:10):
wrong or right or good or evil or black or white.
It there's so much more.
SPEAKER_00 (11:14):
Well, and I think
we're in this time where people
are really scared to talk, and I think that's kind of why we're doing and doing all of these episodes that we are doing, because if we feel like it's just kind of a necessary
conversation right now.
Um not slowing down.
Yeah, and hopefully, hopefully more people by doing this, more
(11:35):
people can be more open to being able to have discussion with people that they think are wrong or people that they highly
disagree with.
And maybe, you know, out of that discussion, there can be some
common ground.
Doesn't necessarily mean you have to end and agree on everything, but maybe there can, again, there can be things where
(11:55):
you can find some commonality in it, or um walk away from the
conversation.
You're like, oh, I learned something, or maybe that changed my perspective a little bit, or something.
SPEAKER_02 (12:06):
Or maybe you can
walk away knowing that like
agreeing to disagree, but still having respect for the other
person.
I think that this would be like really good for people who are struggling with their friendships or family
relationships right now.
SPEAKER_00 (12:17):
Yeah.
SPEAKER_02 (12:18):
I and and I really
mean what I say when I say like
we're only how many months into this presidency?
10 months, almost 11, 11 months, and we have three more years of this shit, and it has not slowed down, and it has only seemed to
(13:37):
have gotten worse politically.
So I think that now is a great time for everybody to learn how to regulate, go touch some grass, but also how to have
these conversations.
I think that that's kind of what we want to do is like teach people how to be open to having these conversations.
(14:00):
With that being said, I have notes.
SPEAKER_00 (14:06):
Y'all, Leah's gone
down a rabbit hole today.
And you want to know what I like about this doing just an episode
with you and I on Riverside.
I feel like we're doing kind of a Marco Polo, but it's like
together.
We're like on FaceTime, Bestie.
That is what this feels like.
So I'm excited for this episode.
SPEAKER_02 (14:22):
I am so excited
about this now.
Yeah, we're trying a different way to make editing a little bit
easier on our end.
And I think this is it.
This is gonna work really well.
I think so.
Okay, so I have two rabbit holes.
One is more about like, I think one is good for everybody.
It's about critical thinking, what it is, how to strengthen
(14:46):
it.
And then the other one, I want to put it in a series called rabbit holes.
Because as I was taking notes and studying critical thinking, and I've never done this for an episode before, by the way.
Like, I don't think either one of us have ever put this much effort into studying for an episode.
SPEAKER_00 (15:08):
Yeah, we are both
going down some rabbit holes.
SPEAKER_02 (15:12):
You picked one, I
picked one, this is mine.
Welcome.
SPEAKER_00 (15:15):
Also, if if while we
do this series, if there are
rabbit holes that you want us to go down that you you're fascinated about, uh message us and let us know because we have been fascinated by these rabbit holes we were coming down.
It it is like this is where I'm like excited.
I was never a great student because I just didn't give a
(15:38):
fuck about like I don't want to write a paper on that shit.
Yeah.
Where now I'm like, holy shit, this is crazy.
SPEAKER_02 (15:45):
Yeah.
So we want to know what rabbit holes you guys think we should
go down.
We don't want to do this often because it's, I'm not kidding
you.
I've like spent weeks taking notes and preparing for this.
But as I was going down this critical thinking rabbit hole, I came across some stuff about propaganda and the history of
it.
And I'm going to put that in a separate episode, and that's
(16:07):
where we're going to put it under rabbit holes.
Because this, I don't necessarily consider the critical thinking stuff a rabbit hole.
It just led me into a hole.
SPEAKER_00 (16:16):
I love a hole.
SPEAKER_02 (16:17):
I love absolutely
fun.
But I also want to do a conspiracy theory section.
And I think that's different than rabbit holes because these are things that like not you can't necessarily prove them
yet.
But I love a good conspiracy theory.
SPEAKER_00 (16:37):
Yeah, usually
conspiracy theories, they're
right.
They're just too early.
SPEAKER_02 (16:41):
Just too early.
Like, what's the have you heard that?
What's the difference between the conspiracy theory and the
truth?
What?
About six months to a year.
SPEAKER_00 (16:50):
See, exactly.
SPEAKER_02 (16:52):
Oh, God.
Okay.
Okay.
So critical thinking.
Let's talk about it because I think we're struggling as a society in this area.
Critical thinking, and I even though I have notes, Christine, like I want you to stop me and say whatever you think and
(17:14):
anything that's like valid.
All right.
So like don't let me just like read off my notes.
Well, I got you.
Okay.
Critical thinking is the ability to question, analyze, interpret, and evaluate information to make reasoned judgments.
It involves breaking down complex information, identifying assumptions, and considering different perspectives to arrive
(17:36):
at a well-informed conclusion.
That is like the definition of critical thinking.
It is using your brain on purpose, not just believing something because someone said it, but actually questioning and analyzing it and looking for evidence before deciding what
you believe to be true.
So if you are like a critical thinker, you are the person who
(17:58):
looks at the facts.
You are aware of your own biases.
Um, you ask, how do I know this information?
And you're open to changing your mind if new information comes
along.
SPEAKER_00 (18:11):
I think that's a big
one.
SPEAKER_02 (18:12):
That is a big one.
And I am guilty of the opposite of critical thinking.
I think we all are and have been, and I still catch myself
sometimes.
But the opposite of it, and this is like a nice way of saying it, is uncritical thinking or blind acceptance.
(18:36):
Now, there are some other words that could go under this category, but I feel like they're kind of mean.
But like naive and um gullible and ignorant?
Ignorant is one.
And I don't like those words.
And I think that as soon as you attach negative connotation to
something, you get defensive.
(18:59):
So I don't want to call people naive.
I don't want to call them ignorant.
I think, like, hey, maybe you should um work on your critical
thinking.
SPEAKER_00 (19:10):
Oh, you you think
that would hit better if you
said that to somebody?
So much nicer.
SPEAKER_02 (19:15):
Yeah, I'm the golden
retriever here.
So the opposite of that, which is uh, well, let's just say blind acceptance, is believing things because everyone says so, letting emotions or authority figures decide what's true for you, avoiding hard questions because they make you uncomfortable, or cherry-picking facts that only support what you
already believe.
(19:36):
So critical thinking is curiosity with discipline.
It's conscious, it's deliberate, it's intentional.
And how often do we speak on that?
Intention is everything.
This is still something we've been talking about from day one.
Yeah.
Blind acceptance is choosing comfort over truth.
(19:58):
And it's usually unconscious behavior.
And I really truly believe we're all, not all, a lot of people are walking around living in their unconscious world and not knowing how to be intentional with anything, not knowing how
to be mindful with anything.
So I feel like this kind of all goes with where we stand with
(20:18):
psychedelics and like going into your subconscious and learning how to think consciously and respond before you react, you
know?
SPEAKER_00 (20:26):
It's it's kind of
like what Joe Dispenza talks a lot about.
I don't know if you guys know who Joe Dispenza is.
He's a big meditation guy, but he talks a lot about how you're
right.
Like we are living very unconsciously.
We're just kind of going through the motions and and to work on making the unconscious things conscious and then reframing
(20:48):
that thought.
SPEAKER_02 (20:49):
Ooh, I really like
that.
Didn't you read one of his books, or you just kind of
follow his work?
SPEAKER_00 (20:54):
Um, I I just follow his work and do his
meditations.
So I I would do I would love to go to a retreat or something like that eventually one day and and you know find people who you
know practice what he does.
Yeah, they're just kind of into this shit and into this, into
this woo-woo.
I love this shit.
It's not woo-woo, which to me I think it's intentionally living,
(21:15):
but you know, yeah, consciously living, living in our consciousness, tomato, tomahto.
SPEAKER_02 (21:19):
Yeah, same thing.
So this is something that I I remember texting you about this
because I was like, oh my God.
Like, here's a fun fact.
Because I've heard you say this so many times.
Like, use your discernment, use your discernment.
You how many times have you said that?
I do say that a lot of times to my kids constantly.
Right.
Well, discernment is the ability to perceive, distinguish, and
(21:44):
judge between different things to determine what's true.
It is the act of applying critical thinking.
It is not the same as critical thinking.
You cannot have discernment without practicing critical
thought.
SPEAKER_01 (21:58):
Ooh.
SPEAKER_02 (22:00):
So it is like the
symptom of critical thinking is
discernment.
Or maybe that's not really, I wouldn't say it's a symptom, but
maybe like the the reward.
SPEAKER_00 (22:12):
Yeah.
SPEAKER_02 (22:13):
If you lack
discernment, this is where
people see things in black and white.
They have a hard time holding two truths at once, um, which
we've talked about before, is called splitting, where it's either you're either good or you're bad.
You're there's no in between, which I think we can all agree that not everyone is all good and all bad.
(22:36):
And we all have good and bad qualities.
SPEAKER_00 (22:39):
Well, and I think
too, when you like there's a lot
of I'm on this side, and so if somebody disagrees with me, then I'm gonna otherize them.
When you maybe have not had a conversation, you're making just assumptions of where they're at or who they are, or you know, it was somebody that you really, really loved, and now you just
(23:02):
it's like, you know, I've experienced it a decade, and uh
it's now I'm bad.
I loved you, but now you're bad, and now I don't like you, and now all of these things we've done that before.
Yes, I have been that person before.
I guess where I'm going with is it is it took some self-accountability to to be like, wow, I'm I'm I'm putting
(23:26):
these people in boxes.
I put myself in a box.
Um, and most people when they make choices, they're not doing it with the intention of being evil.
They're doing it with the intention of trying to do the right thing, trying to take care of themselves or their families
on all sides.
And we're sitting here and we're again otherizing each other when
(23:50):
I think no matter what way people vote, most people think
that they're trying to do good.
Right.
So it's like, who's to say that you are good and they are bad
and vice versa?
SPEAKER_02 (24:03):
I love that you keep
saying otherized.
I have never heard that before.
I don't even know if that's a real word, but I love that.
Because that's like like now you're one of them.
SPEAKER_00 (24:12):
Yes.
SPEAKER_02 (24:12):
You're othered.
Yes.
SPEAKER_00 (24:14):
Yes.
And to me, otherized.
My yeah, I don't even know if that's working.
I'm using, we're using it.
My argument with with that is if that's what you're doing, maybe take a look at that where you think that everyone who votes different or thinks differently than you is bad.
(24:34):
Um, because you're kind of doing what the government exactly what
the government wants you to do.
And you are being hateful towards your neighbor or your
friend or your family member.
Um, and again, I'm I'm I'm guilty of this.
I'm I'm I was a big, big splitter.
Uh, and that's this is a whole nother thing.
(24:56):
But you're to me, I feel like you're feeding right into the
propaganda.
And it's what a what a convenience when we all hate each other and we are come becoming more polarized and more divided instead of what good could we do when even besides our differences, we could be united and be able to talk to
(25:18):
each other and listen to each other and then maybe focus our attention on um the things that we need to focus our attention on, which is not each other, and it's maybe the powers that be
just saying.
Just saying.
I don't know.
SPEAKER_02 (25:31):
Well, it's it's
funny you say that because I
think that like when we when we go down into my propaganda rabbit hole, it's it's by design, it's for a reason.
And if you want to like attach the spiritual woo-woo side to this, um love is the highest vibration when you're operating out of love and when you are operating out of hate and fear,
(25:54):
it is such a low vibration, it is impossible to resonate with
your higher self.
So are you happening that so I just think we all need to take a step back and say, is this for my highest good?
(26:15):
And I think I think it's important to be selfish in these situations because I think that we have said this before, even before we like started talking about politics.
Like, it is not selfish to say, I am going to do what's best for
me and my family and make sure that we're good.
And then when we are good, that creates a ripple effect and
(26:37):
creates good energy for everyone else.
SPEAKER_00 (26:40):
Yeah.
SPEAKER_02 (26:41):
You know, so it's
it's like be mindful of the
energy that you're putting out into the world because that is what's going to help raise our vibration or lower it.
SPEAKER_00 (26:53):
Yeah.
I I'm gonna be a little crass, but there are people that I know who are so angry about the state of the world and who's in power,
who's not in power.
And um, again, I'm speaking from experience, but there is you see so much dysregulation and thinking and acting on fear.
(27:16):
Yeah.
And I am thinking in my head, oh man, if you took that energy and you put it into yourself, oh my god, like then you could really
do some damage.
Good damage, like good damage, yeah.
Like, then you could really like move some shit around in your
life.
100%.
(27:36):
Then you spread that.
But anywho.
SPEAKER_02 (27:39):
And and to speak on
that, I think that we have both
lived through that where we took our focus and our energy inward.
We worked on ourselves.
By doing that, it encouraged our husbands to do the work.
By doing that, we have created families that are healing.
(28:00):
We have broken generational curses and traumas.
Like we are continuing to put good out into the world by
creating other good humans.
Yes.
Because we focused on ourselves.
SPEAKER_00 (28:13):
Yes.
And what better way to put good out in the world by being it and by modeling it and not by saying it.
Yeah.
Like, you know, it's it's something that I've like have had a talk with like a lot of conservative Christians about
where I'm like, okay, but here'sthe problem.
(28:37):
We we both have a faith insomething.
We have we just have differentinterpretations of it, right?
I like one, I think God is a woman.
Um, I think God is in all of us.
I think we kind of create our own heaven or hell.
That's just my personal belief.
I I it's it's up for my interpretation because at the end of the day, who fucking knows?
(28:57):
And we we don't know.
SPEAKER_02 (28:59):
Well, there's a lot
of ancient wisdom, like Toltec wisdom speaks on that.
There is no heaven or hell that you go to when you die.
It's like what you create in your mind.
Yeah, yeah.
No, it's right here.
SPEAKER_00 (29:10):
Yeah, yeah.
SPEAKER_02 (29:11):
We're already it,
we're already in it.
SPEAKER_00 (29:12):
Right.
I don't know where I was going with that.
I'm so sorry.
No, that's okay.
No, no, no, that's okay.
SPEAKER_02 (29:17):
Um, conservative Christians.
SPEAKER_00 (29:19):
Oh, but you know,
it's I just try to offer like a
different perspective.
I'm I will say this.
This isn't where I was going,but I do want to say this.
This is just from my personalexperience.
I have had an easier time talking to conservative
(29:40):
Christians and having a conversation where we don't
agree on much.
I've had a much harder time talking to liberal Democrats on
where we don't agree.
Yeah, there is no conversation at all.
And so my hope is that there are some people who are left-leaning
(30:02):
who are more willing to have some civil discourse without um exiling me, or like a a big thing I notice is if I say something where I have a critique about the left, their automatic rebuttal is to be like, well, Trump.
(30:23):
And I'm like, listen, I don't give a fuck about Trump.
Okay.
Like you're you're like, you're talking to me like I'm a Trumper
and I'm not.
And I'm talking about the left and why I left and went to the middle, and you're bringing up Trump, and that's right now it's
irrelevant to the conversation.
And oftentimes I hear, well, he is so triggering for me because
(30:44):
he reminds me of so and so.
I've heard that so many times.
And to me, I'm like, well, you are having a conversation and
making choices from a veryemotional place.
Not saying that you still don'tyou can still dislike him and
you can still not agree withwhat he says, how he is as a
(31:05):
president, all of this stuff.
That is that is those are valid arguments that I would not disagree with you on, but you're very emotional.
It's coming from an emotional and triggered, wounded place.
And so it's hard to have a conversation with you because I am unable to like focus on the things at hand, and you're
(31:27):
otherizing me, and I'm not.
Like, I don't fall into that.
So, like, you can't do that.
SPEAKER_02 (31:31):
Well, we'll get into
this in a second.
Second, but emotions overridecritical thinking.
Ooh.
They do.
So it's something that we have to be very aware of because if your heightened emotions are causing you to be triggered, you're not going to be thinking critically in that moment.
(31:51):
Yeah.
So it's really important that we like notice where we're feeling emotionally triggered and we sit with it for a minute before we
react.
Um, I wanted to give some examples of like some things just to like show the difference between critical thinking and
blind acceptance.
I'm gonna have a hard time saying that because I wanna it's
(32:14):
it's um all right.
So let's say uh there's a post about coffee causing cancer.
Uncritical thinking would be like, oh my God, I just saw something saying coffee causes cancer.
I guess I'm gonna stop drinking that now.
Like, unless it let's just say like you just saw an article, you didn't even open the link, you didn't even read the article, you just saw a headline and immediately were like, fuck,
(32:36):
now coffee causes cancer.
Guess I'm done with that.
A critical thinking response would be, well, who published
it?
Was it an actual study?
Was it a clickbait article?
Because a lot of times they — let me be very clear.
In public relations and marketing, they know what
they're doing with headlines.
(32:57):
It is by design that it grabs your attention and it invokes an
emotional reaction.
This is something that even we have learned through our podcast
with content.
You have to say something so off the wall with your caption, or maybe create a clip that is like out of context to like grab
(33:22):
attention so people watch it to hear what you're trying to say.
That is like marketing 101.
Yeah, and the media does it.
Everybody does. Like Steven Bartlett of The Diary of a CEO
will do it.
Yes, like his podcast titles are meant to evoke an emotional
response.
(33:42):
So you're like, what the?
I want to listen.
What are you talking about?
SPEAKER_00 (33:45):
Yes, like even the
trailers.
I'm like, oh shit, this trailer for the podcast episode, damn.
SPEAKER_02 (33:50):
Like, this is like
watching reality TV.
This is gonna get crazy.
And then you listen and you're like, oh, they're just having a
conversation.
Like there was no debate, there was no heat in that at all.
But I want to give an example of a time that I did this because I feel so stupid looking back at this situation.
Um, my husband came home and he had just like left his friend
(34:11):
group, and he's just like, Did you know that there are some schools that are putting litter boxes in high schools for kids
who identify as furry?
And I was like, What the fuck?
And I immediately get on Marco and I'm telling a friend, I'm like, oh my God, how crazy is this?
As soon as I left her that message, I was like, hold on.
(34:32):
Let me Google that real quick.
And then I Google it, and it's like, actually, that's been
debunked.
That was a social experiment done by a high school class years ago, and it was to see how fast misinformation can spread
and how fast rumors can spread.
Oh shit.
(34:52):
And I immediately, before I even like, I read that and I was like, fuck, I feel like a fucking idiot because I just got on a Marco and sent that to someone so fucking annoyed, and
I hope they don't believe it.
And I got on there and I was like, by the way, I just Googled
that and that's not true.
Don't spread that shit.
(35:13):
And she was like, Yeah, I knew it wasn't true.
Like, and I was like, I felt like an idiot.
SPEAKER_00 (35:17):
But I think it's,
you know, I'm glad you caught
it, and I'm glad you did your research and then retracted
that.
It's with, you know, my stepdaughters, they have a lot
of fear instilled in them.
And, you know, one day one of them sent a message in the group chat and she was like, Oh my god, like I'm worried about like I'm gonna get drafted for World War III.
(35:38):
And it's just like, okay, you gotta let's get off the TikToks.
And nothing, there's there's nothing happening, but it's she just saw a video and it was salacious and again very fear-based in the way that it was presented.
And then she just took it as truth and internalized it and
(36:00):
was like, oh my god, this is what's happening.
And I think it's very hard in this day and age because we are
so um flooded with information.
And it's like you watch Fox News, you watch MSNBC, and they're telling the same, they're talking about the same event, but the way that they're talking about it are two polar
opposites.
(36:21):
And it's kind of like what you said in a previous episode where it's like you'll listen to one side and listen to the other, and then maybe do your own research and find some truth in
the middle.
Unfortunately, I feel like we're so polarized though, that if it's on our side, then we automatically believe it.
And if it's the other, we automatically dismiss it as like
(36:43):
fake news and it's whatever.
SPEAKER_02 (36:45):
Yeah, I found a
study on that actually.
SPEAKER_00 (36:48):
Do tell.
SPEAKER_02 (36:49):
Oh, I'll tell you
later.
SPEAKER_00 (36:50):
Okay.
SPEAKER_02 (36:51):
This is again,
there's gonna be a lot of
moments like this where it's an invitation for you guys to go back and listen to my rabbit hole episode.
All right, we don't have to do more examples, but I wrote some.
Critical thinking is let me see if this is true.
And uncritical thinking is basically just like I'll take
your word for it.
Page one, done.
Oof.
All right.
(37:11):
What would affect someone's critical thinking skills?
SPEAKER_00 (37:15):
Um, I think fear.
SPEAKER_02 (37:17):
Oh, so much.
No, no, no, no, no, no, no.
Raise your hand, class.
Um, so here's what's interesting.
Critical thinking isn't a skill that someone has or doesn't
have, uh, doesn't have.
Um, it is something that can shut down in the wrong
conditions.
(37:37):
Everybody has it, and it's like a muscle that you have to work
out.
And we'll get to that towards the end, like how to practice
your critical thinking skills.
Seems like common sense, but it's actually doing this and
writing all this stuff down.
I'm like, I don't think a lot of people do this, and I am guilty
of not doing this sometimes.
SPEAKER_00 (37:58):
Oh, I bet it has to
be so conscious and intentional.
You have to be critical, especially in the world we live
in now.
SPEAKER_02 (38:04):
Uh especially in the
world we live in now.
You know, they used to teach logic before the Rockefellers like fucked up the educational system.
SPEAKER_00 (38:12):
You guys, I'm doing
a deep dive on John D.
SPEAKER_02 (38:16):
Rockefeller right
now, and it is why I did I did
not try to, I am like trying to stay out of that rabbit hole.
Please do.
So, but Jason, my husband, brought that up to me because he's reading a book right now about Latin.
He's in law school and he was like, they used to teach Latin.
They used to teach logic.
School was very different.
SPEAKER_00 (38:35):
Well, and also
weren't most people like
farmers, they had land.
If they went to school, it was to be to learn how to read and
write.
But the the like the wonderful thing about that is like you learned from your like family or your community.
And so people were critical thinkers and thought for themselves and worked for themselves, and they taught had
(38:58):
a lot of skills, like handy skills.
Even um, Candace Owens has talked about it, about how we've gotten dumber, where if you like listen to or read the letters that um like 18-year-old boys wrote their moms from World War One, yeah, they were not educated and they were the most eloquent and
(39:22):
articulate.
Articulate letters.
Like, we don't talk like that anymore.
SPEAKER_02 (39:28):
Oh my god.
Jason read a letter to me from this famous, um, he was like a Supreme Court judge back in the early 1900s, and he used to
write letters to people, and there are books that are literally just filled with letters that he wrote.
And I'm like, I'm going to need you to translate every word he said because I don't know what the fuck he was talking about.
(39:51):
And these weren't like letters to other judges, these were letters to friends where he was like, My dearest Edward, it is with great remorse that you know, like the way that they
talked.
We are so dumb right now.
Like the way we're like six, seven, bruh, bruh.
And what the sigma?
(40:12):
Oh my God.
Living with teenagers right nowis like hilarious.
SPEAKER_00 (40:16):
Dude, I I have a
six-year-old that says what the
sigma.
And I'm like, you need to stop it right now.
Jesus.
And skibbity toilets.
Yeah.
What the fuck does that mean?
Let's teach Latin again.
Okay, but I have something.
Can I share it?
It is, um, hang on a second.
I gotta find it real quick.
So here it is.
Okay.
(40:36):
Instead of lol, say that statement has rendered me most
amused.
That's how they talked! I know.
That is legit.
And I want to learn how to talk like this.
Instead of rah, say, my good fellow, I am utterly astounded
by your actions.
Instead of wow, say such marvels leave me thoroughly
(41:02):
astonished.
Instead of OMG, say gracious heavens above, I can scarcely
comprehend such happenings.
The fuck?
Instead of nah.
Say, regretfully, I must refuse your proposition.
Instead of huh?
Say, pray, could you elucidate the matter once more?
(41:26):
Instead of yikes, say, I am thoroughly disquieted by the grievous spectacle before me.
SPEAKER_02 (41:32):
Where the fuck did
you get that?
SPEAKER_00 (41:35):
TikTok, baby.
Oh my god.
And you best believe I saved that.
SPEAKER_02 (41:40):
Oh my god.
Jason would love this because he was like literally talking to me
about how dumb we are.
SPEAKER_00 (41:46):
Like how our con how
I want to learn how to talk like
that, and I'm being dead ass.
SPEAKER_02 (41:51):
Well, you have to
learn how to speak or at least
read like that when you do law, because that's how stuff is old
school.
Yeah, it's that's how laws are written.
SPEAKER_00 (42:01):
Fucking love it.
SPEAKER_02 (42:02):
I try to read some
of this, uh, some of the stuff
that he writes, or and I'm like, I need you to translate this
into crayon for me.
SPEAKER_00 (42:09):
I'm gonna say
something so out of pocket right
now.
If Tony learned how to talk to me like that and write me letters like that, my panties would be dropped.
SPEAKER_02 (42:21):
So wet.
Consider them on the floor.
Yeah.
And it, I mean, it's smart as sexy.
Smart is sexy.
SPEAKER_00 (42:30):
Oh my god.
Yeah.
No, we need to figure this out.
Anyways.
SPEAKER_02 (42:33):
Okay.
All right.
So here are some of the things that could affect someone's
critical thinking skills.
Um, emotions, strong emotions.
When we're angry or scared or deeply attached to a belief, our
rational brain shuts down.
It is the amygdala.
That is where our brain's emotional alarm system is.
(42:54):
The amygdala hijacks logic and it floods us with flight or
fight chemistry.
Fight or flight.
Flight, I said it backwards, butfight or flight chemistry.
SPEAKER_00 (43:04):
Yeah, I think you
see that a lot, just even when I
spoke out against the DNC, the just sheer anger from people.
And, you know, I think it's another thing too.
So much of that stuff you would never say to somebody's face.
And if you did say that that to somebody's face, you'd probably
(43:25):
get socked.
Yeah.
SPEAKER_02 (43:27):
Well, and it's just
so reactive and and I think
about the episode that we did just last week with Amanda um Newton and how she was talking about like the way that our body responds when we're in fight or flight.
And for me, my neck and chest get so red, I get so hot.
I can feel the temperature rising in my face and in my
(43:51):
neck.
And it looks like I'm like breaking out into hives.
And so that's my body like telling me, hey, your emotions are kind of taking over right now.
And emotions also override curiosity.
So we really have to find a way to like not ignore those
emotions, but maybe take a beat.
SPEAKER_00 (44:13):
Yeah, and like learn
to sit with it, sit with it a
second.
SPEAKER_02 (44:16):
It's gonna be
uncomfortable.
SPEAKER_00 (44:18):
Yeah.
And and why are you getting so um triggered by what somebody
says?
We're never gonna live in a world where everyone agrees with us and everyone has our perspective and everyone has our
belief system.
That's it's impossible.
That would be a dictatorship.
SPEAKER_02 (44:32):
Oh my God, you're
right.
But I was gonna say, I think that there is this idea that everyone has that we're all gonna agree one day.
And that is such a fucking lie.
I don't know why anybody on earth would think that.
That is not possible.
Well, and also there is always going to be good and evil in the
world.
There is always, there is never gonna be anybody who agrees with
(44:56):
you 100% ever.
Well, and what a way to set up people you love for failure.
Right.
Right.
So, okay, another thing that affects someone's critical thinking, um, information overload.
Right now, we're like drowning in content and social media
algorithms.
And another thing about the algorithms right now, they are by design created to evoke a reaction out of you.
they are by design created toevoke a reaction out of you.
But not only that, they are different based on what you look
at, what you spend time on.
So my algorithm is very different than my husband's.
Yours is probably, well, ours are probably the same because we send a lot of back shit back and forth.
(45:39):
Yeah.
But like, you know, if you're liking content that is politically biased, it is going to continue to show you that exact same stuff that is going to evoke the same emotional
reaction from you.
And this is not conspiracy, this is fact, this is literally how
(46:00):
an algorithm is built to work.
SPEAKER_00 (46:03):
I don't know about
you, but I've kind of gotten to
the point where I, if anything political shows up on my algorithm, it is calling out both sides.
Oh, and the disappointment in both sides.
SPEAKER_02 (46:18):
Well, I think
because our we have created an
algorithm that sits in themiddle.
SPEAKER_00 (46:23):
Yeah.
Yes, yes.
I'm loving it, honestly.
It's like I've got, you know, Coleman Hughes, who is not a Trumper, has always voted Democrat, but has some opinions
about the left.
Or, you know, there are Republicans who still vote
(46:43):
Republican, but they have opinions about the party or Trump, or, you know, people who are in the middle.
Um not all Republicans are MAGA, by the way.
Yeah.
That's a very good point.
Yeah.
And also I I There's a spectrum.
But I would also argue that there are a lot of Democrats who
(47:04):
left the left as well because it got a little too too extreme.
Yeah.
And they still consider themselves a Democrat, but they're struggling with the um trajectory of where this party
is going.
I've gotten a lot of that.
I've and I've gotten a lot of people who are scared to share their opinion because uh I think the left is, I think they're very
(47:28):
big on cancel culture, and I highly disagree with it.
Yeah.
SPEAKER_02 (47:32):
Sounds like a lot of
silencing.
SPEAKER_00 (47:34):
Yes, a lot of
silencing and a lot of exiling
on that side, I feel like.
SPEAKER_02 (47:39):
So the information
overload is when your brain is
overwhelmed, it defaults to shortcuts.
So it learns to trust what's familiar or emotionally
satisfying.
Um, so fatigue kills discernment.
There was an old Bachelorette, I think her name is Becca.
(48:00):
I follow her now because she's very, she's like a homesteader
now.
SPEAKER_00 (48:04):
I know exactly who
you're talking about.
Dark short hair.
Yes.
SPEAKER_02 (48:07):
Yes.
I kind of love her.
SPEAKER_00 (48:09):
Yeah, I do too.
SPEAKER_02 (48:11):
And she doesn't she
doesn't shave her pits.
She doesn't shave anything.
Okay, like her legs, her pits, nothing.
And nothing wrong with that.
Yeah.
SPEAKER_00 (48:17):
Go on, go on, go
with the phone.
SPEAKER_02 (48:19):
Listen, I was an
esthetician for a very long
time.
So I am I don't like hair.
SPEAKER_00 (48:25):
Can I bring
something up?
I was I was paid to rip hair off your body.
Can have you seen the um SKIMS uh the panty bush?
SPEAKER_02 (48:36):
Yes.
So I used to, there used to be there's a thing called a Merkin.
Do you know what a Merkin is?
No.
It's like when like it's it's a fake bush.
For if there's an actress who doesn't have hair down there and they need to look like they have hair down there, it's literally
a fake bush.
And that's what those panties remind me of.
SPEAKER_00 (48:56):
You should get one
since you've had laser.
Don't say that.
Listen, I don't have to worry about it.
I don't I don't need bush underwear.
SPEAKER_02 (49:08):
Do they make do they
come in pink?
What about rainbow?
Oh shit.
Okay.
Um TMI, sorry.
Yeah, yeah, yeah.
Okay, okay.
So, oh shit.
So she did uh, I guess she has a podcast, and I saw a clip of her saying to her husband in this podcast, I think it's okay to
(49:29):
ignore political content and say, I don't want to see this, change your algorithm so you're not being emotionally charged because it is daily, all day, 25-8.
Got that from you.
Um and so if you know that this is like changing the way you are
(49:50):
throughout the day, making you not as present with your children, if it is like emotionally activating you throughout the day, take a break.
Change your algorithm, stop watching it.
And one of the comments was like, um, that's a very
privileged way to be.
It's not a privilege to choose to consume different content.
(50:16):
You're choosing what — you're not changing, you're choosing.
And so, like, you're choosing to continue to activate yourself.
SPEAKER_00 (50:24):
Since you said that,
yeah, can I also add um in 2012,
the words racism, white privilege, and transgenderism skyrocketed on our algorithms because that's when algorithms
were created.
And so again, I think it's okay to ask yourself like, why does
(50:44):
this keep showing up to me?
Are that many people that racist?
I've experienced racism, but as a whole, I I think most people are, you know, good people and they are not like that.
Are there?
Yes.
I think it's I think for for me, I think it's been blown up so much because of our algorithms intentionally trying to create
(51:08):
divide and polarization and emotion.
And again, for you to otherize someone else.
SPEAKER_02 (51:16):
I would love to do a
rabbit hole in 2012.
Another time.
SPEAKER_00 (51:22):
Yes.
SPEAKER_02 (51:23):
We've texted each
other before and been like, holy
shit, another connection.
Holy shit, another connection.
SPEAKER_00 (51:28):
Mental health
declined.
A lot of these kind of trigger emotional wars.
Polarization starts, like it's It was like the rise of social
media.
Yeah.
SPEAKER_02 (51:40):
But there's so much
that happened during that year
that I'm like, And there's so many dots to connect.
SPEAKER_00 (51:45):
What did you say?
The Mayan calendar ended.
SPEAKER_02 (51:47):
Yeah, but we'll get
into that.
Okay.
It's I feel like there's like we we could keep going.
We could keep going.
SPEAKER_00 (51:52):
Two ADHD girls into a podcast room.
What are we gonna talk about?
I don't fucking know.
We're in 15 different directions in about the span of seven
minutes.
SPEAKER_02 (52:01):
This is why I
thought, like, I'm gonna have to
write notes.
And I want you guys to know, I'm not like directly
reading them.
I am trying really hard to just stay focused and stay on track.
Okay, so strong emotions, information overload, groupthink, and social pressure affect critical thinking skills.
Africa Brooke has talked on this, like how humans are very
(52:23):
tribalistic.
We crave belonging to a group more than we crave truth.
And that is, it is always going to be a part of our biology.
We crave connection.
And hundreds and thousands and thousands of years ago, we all thrived in tribes and communities.
(52:45):
And if you thought something different than your tribe, it often meant you were exiled, excommunicated, sometimes
killed.
That happened because you went outside of what the group
thought.
So it is a survival instinct for us.
We crave belonging, even at the expense of being wrong.
(53:09):
So, groupthink and social pressure.
We're naturally a tribal species.
If your tribe believes something, it's really hard to question it without feeling disloyal or alienated.
It's really hard.
SPEAKER_00 (53:22):
Oh, yeah.
We we're seeing it in real time where people are scared to talk
to us.
People, things we've spoken about, people have been like, oh
my God, thank you.
I'm like so scared to say anything to anybody about this.
And um, yeah, so people, it's there's a lot of with this uh polarization, there's a lot of self-censorship because people
(53:43):
are scared of the backlash of groupthink.
So it's almost like something where it's like, again, you kind of have to make that unconscious, because we're kind of designed to be that way, kind of have to make that unconscious thinking conscious and then reframe it.
SPEAKER_02 (54:01):
And you kind of have
to say, Am I okay if this tribe
exiles me?
It hurts when that happens.
Yeah.
No matter how big or small that tribe is, like it is something
that you have to be okay with.
And that is like something I think a lot of people, the reason they don't speak out is because they're afraid of that.
SPEAKER_00 (54:20):
But I will say, and
I do think that there it there
has to, you have to get to a point where you are okay with yourself and you know who you are, and you like like no matter what somebody tells you who you are, you know that what it what
it you know your truth.
Right.
There is something liberating about being canceled in this
(54:42):
cancel culture because it's it's it's there is like you can throw, oh you're you're not the other, and it's like, but I'm
not though.
You can throw, you know, oh, you know, you you're a brown person and and now you're you're not this, and so like you you don't
get to be brown anymore.
It's like, yeah, I do.
Still am, still am, so no matter what you throw at me, I know who I am and I know what I what I stand for.
know who I am and I know what Iwhat I stand for.
And also I stand on my mistakes, I stand on changing my mind, I stand on that I'm not always right, I stand on that I could change my mind the next episode, I stand on that I'm not an
expert.
So like okay.
SPEAKER_02 (55:28):
There's nothing you
could say that like is going to
make me feel less than or shamed, or you can't shame me into anything, like or guilt me into anything.
SPEAKER_00 (55:40):
Well, and also
there's kind of like why are you
so worked up right now?
Like what what like it's it'snice to not like subscribe to
being in a box or being a labelbecause then it's like I kind of
encompass a lot of differentthings and a lot of different
viewpoints.
SPEAKER_02 (55:56):
That feels like to
me, freedom.
Yeah.
To me, that's where this idea of freedom really, really resides
is within yourself.
SPEAKER_00 (56:08):
Yeah.
SPEAKER_02 (56:09):
I feel the most free
I've ever felt because I'm I'm
unapologetically who I am.
Right.
And it doesn't fit in a box.
SPEAKER_00 (56:19):
And I don't know you
can't put me in a box.
SPEAKER_02 (56:21):
I want to say there
is like a good side to this
because we have both experienced this exile.
I don't think we're big enough to be canceled.
But you know what I mean?
Like we've been unfriended.
We have family members who are like upset with us.
That is a price you pay, but the reward and the other side of
(56:43):
that is you find other people who kind of feel the same and
they felt lonely.
We have very few people that we have these conversations with, but it feels really good.
Well, and also to know that we can.
SPEAKER_00 (56:58):
When you're showing
up authentically with your
opinion or your self-expression, oftentimes I think a lot of the people that are closest to you, it doesn't resonate with.
But then the wonderful thing about social media is then you do find the people who do resonate with you.
And they're like, oh my God, I feel politically homeless too.
I'm in the middle too.
(57:19):
You're attracting a more authentic audience.
It might be a smaller audience or a smaller group or a you know, a smaller circle, but at least it's real where you feel like you can self-express and and other people can do the same
too.
And I feel like it's more fulfilling because it's more
authentic.
(57:40):
Right.
Like who wants in whether you're talking about politics or anything, who wants to feel like they have to self-censor everything, every opinion, every belief that sucks because walking on eggshells with somebody is not fun.
Feeling like you can't show up as your full self, that is not
(58:01):
fun.
And unfortunately, that is what politics has become.
I saw this chick on uh a TikTok or a reel and she wasn't from the US, and she got asked by somebody like, what since moving here, what have you noticed about our political system here?
And she's like, Well, here in America, people don't just have
(58:26):
opinions, their beliefs are so tied with their identity.
So if somebody has a different political opinion than you, you
take it as they're attackingyour identity and you're and and
they're not, where in othercountries they're just uh
opinions.
Where here it's it's it's you know, if I if I have an opinion
(58:47):
about something, let's say kids being on puberty blockers or something, again, a response I got well, well, well, my someone
I know did this.
And I'm like, I'm not attacking that person that you know.
I'm just saying that this is an opinion for me, that I I just
don't agree with it.
(59:08):
I think there's a lot of push for that uh in our shows and in
social media.
And I I don't know why.
If anybody's gonna be having these conversations with my
children, it's gonna be me.
Right.
Not a show.
But again, she took that as a personal attack, and it's not it's not meant to be at all, and it's not meant to be hateful, it's not meant to be discriminatory at all.
it's not meant to bediscriminatory at all.
It's just my opinion.
SPEAKER_02 (59:34):
It's an opinion.
So no, you're right.
And I also uh to add to whatever she said, I feel like it is so impossible to not politicize every fucking thing today.
Why is healthy food politicized?
unknown (59:53):
Yeah.
SPEAKER_02 (59:54):
Why is our our
medicine political?
Why is our healthcare political?
Why is this shit political?
Why is it if you think this, you think you're on this side, and if you think this, you're on that side?
Like, that is such a low IQ way of thinking about anything.
(01:00:16):
I it it it's just frustrating.
I get really frustrated because it shouldn't be political, and
people are like, but it is.
Okay, well, I'm saying it's not.
SPEAKER_00 (01:00:25):
So like I can have a
different belief.
And even if you do think it's political, again, you should maybe learn to have a conversation with people that you don't agree with so you don't stay stuck in an echo chamber and it go deeper and deeper, because unfortunately,
that is what is happening.
There are people who are trying to still have those
(01:00:46):
conversations and they're trying to like work with people on both sides to try to find some like unity and common ground.
And, you know, again, like there was a time where people disagreed, people got married and they didn't have the same,
you know, political affiliation, and it was like not that big of
(01:01:08):
a deal.
And yeah, it's just I don't know.
SPEAKER_02 (01:01:12):
Oh, Jason also told
me they used to teach debate.
Oh my God, I would love thatclass.
You I would thrive.
I would not.
Okay, so let's keep going.
Comfort and certainty.
And we kind of talked about this, but like if you are a critical thinker, it is common to be okay with not knowing.
(01:01:36):
Uncertainty and critical thinking kind of go together.
A lot of times people will trade simplicity for the truth or truth for simplicity because it feels more comfortable to them.
Like having an answer feels good, even if it's wrong.
They would rather know than say, I don't have an opinion on that.
SPEAKER_00 (01:01:58):
Right.
SPEAKER_02 (01:01:59):
I have a different
life experience.
So my opinion is going to be different than yours.
So yeah, I just think that what did I say?
Comfort and certainty.
That's one thing.
Cognitive, cognitive, why am I not saying this?
Cognitive biases.
Our brains are full of invisible filters.
(01:02:19):
So there's different types of bias.
There's confirmation bias, which is where we notice what supports our views and ignore what doesn't.
Um, authority bias, we believe experts, or if someone sounds confident, we believe them, even if they're wrong.
And then there's anchoring bias, which is where you cling to the
(01:02:40):
first thing you heard as the baseline truth.
So bias sabotages critical thinking.
And I will go over this in the next episode, but there is a study where they look at cognitive biases and how it
affects your critical thinking.
And it is like pretty fascinating when you see the
results.
SPEAKER_00 (01:03:00):
Well, and we all
have those people on our social
media where everything that they post about the other side is bad, and everything that they post about their side is like
good.
And it's like it's kind of like this mentality of like my side is going to save the world and your side is going to ruin the
(01:03:22):
world.
And that's all that they post.
So when anything happens in the world, it is an opportunity to talk shit about the other side, no matter what.
And like for me, I want to say to those people, okay, instead of like continuing to go this direction, because it's really
(01:03:43):
not like doing anything for probably many of your listeners unless they already agree with you.
Maybe you should start sayingthree things that you don't like
about your party or yourpolitician you're in favor, and
three things that you wish youwould could change about your
party.
That's that like I would like Iwant to see that shit.
SPEAKER_02 (01:04:00):
But, Christine, that might get them canceled.
No, I think you're right.
I'm just, I'm shitting you.
I'm like giving you a hard time.
But like that would be such a difficult thing for someone to do because then people would assume what they assume about
others.
unknown (01:04:20):
Yeah.
SPEAKER_02 (01:04:20):
When they go against
the grain.
You know what I mean?
Like they're so afraid of judgment because if they saw someone post that, they would be like, up.
SPEAKER_00 (01:04:28):
Fuck her.
SPEAKER_02 (01:04:30):
I this what is the
word you've been using?
Otherized them.
SPEAKER_00 (01:04:33):
Yeah.
Yeah.
SPEAKER_02 (01:04:34):
Like they don't want
to be otherized.
SPEAKER_00 (01:04:35):
Yeah.
SPEAKER_02 (01:04:36):
So it's I agree with
you.
I think we should make, I think that that would take a very
brave person to do that.
SPEAKER_00 (01:04:42):
I think we need more
bravery in that way.
I would like I would love to see that.
Because it's like, you know, with especially with social media and with politics and social media, I'm like, you know, you've been posting the same shit for 12 years, bestie.
Like, it's the same thing.
No matter what, you're anti that person or you know, that side,
(01:05:05):
and you're for this side, and that's all you post about.
Yeah.
And it just doesn't really change anybody's mind because we
know you're just stuck there.
SPEAKER_02 (01:05:13):
Well, and by that
time, and I want to get into
this later, at that point, people already know your
beliefs.
I know.
You don't have to announce them.
Oh.
All your 200 followers know exactly where you stand.
You're not changing anybody's mind because if they disagreed with you, they probably already unfollowed you or muted you.
Right.
So they're not seeing it anyway.
(01:05:34):
Right.
So you're only sharing this stuff in your little echo
chamber who already believes it.
Yeah.
You're not changing anybody's minds.
SPEAKER_00 (01:05:44):
I love how like the
bravery is standing in the um,
I'm in the middle and I'm not sure about everything.
SPEAKER_02 (01:05:49):
Yeah.
Right.
Like, what the fuck?
Crazy.
What?
That's crazy.
Okay, we're gonna start getting juicy here.
Exhaustion, stress, and poor health.
If your nervous system is fried or your brain is underfueled, your higher reasoning doesn't fire properly.
Critical thinking requires energy, um, which literally
(01:06:13):
provides glucose and oxygen to your prefrontal cortex.
So without energy, you are not giving your brain what it needs
to think critically.
It's like offline.
It's like, we're tired.
I don't want to do this right now.
Let's just do what you want.
Do what you want.
Like, that's me being a brain.
(01:06:34):
I think in today's world, it is impossible to not be exhausted,
stressed, or tired.
SPEAKER_00 (01:06:39):
Yeah.
Yeah.
SPEAKER_02 (01:06:41):
You have to work
really hard at not being those
things.
And it seems like you're fighting a never-ending battle.
SPEAKER_00 (01:06:50):
Oh, you have to be
so conscious about it.
And I think there are people who do this rhetoric like, well, why didn't you post about this?
And why didn't you care about this?
And why didn't you?
And it's like, we're tired.
I'm fucking tired.
We're tired, Susan.
Times are tough.
We're tired.
There's something to be upset and sad about.
There's a tragedy all over the world.
We're not meant to see tragedies all over the world.
(01:07:13):
And we do like it's normal and it's not.
It's not normal.
And, you know, sometimes people do care about things and they
don't post about it.
I don't know.
There's that too.
SPEAKER_02 (01:07:23):
Yeah.
Well, and if you want to bring human design into this, there are people with like a defined throat center and people with an open throat center.
You and I both have defined throats.
We are meant to speak our truth.
We are not meant to stand up for the masses.
SPEAKER_00 (01:07:39):
Ooh, okay.
SPEAKER_02 (01:07:40):
People with an open
throat are more likely the
people who are speaking on behalf of others.
I am not meant to do that.
And I there are people out there who are, but it's not me.
And it's not, it might not be you.
And so it is okay to not post something on someone else's
(01:08:03):
behalf because you feel like you're being pressured to do so.
SPEAKER_00 (01:08:07):
It's okay.
And I've done that, and it um caused a lot of mental health issues because I did do those things and I don't think I was meant to do it.
And I think it's okay to not post about everything, not post about the things you care about at all.
And I don't know if you remember.
SPEAKER_02 (01:08:27):
It doesn't mean you
don't care.
SPEAKER_00 (01:08:29):
Right.
Yeah.
I I think about the um black square during BLM and how if somebody didn't post that black square, and I was somebody who was guilty of judging people.
So like I'm, I'm calling myself out in this, judging people who didn't post that black square.
And it's so performative, and a lot of people did it because
(01:08:53):
they felt pressure and they felt like they had to do it.
And, and I think about that time, how there was so much pressure
to do things a certain way and post a certain way.
And if you didn't, oh, and you had any type of a platform, you were canceled.
And I think about if we were to have, if we would have had that podcast at that time, our podcast at that time, like we
(01:09:17):
would have been pressured to do it.
And we probably would have done it.
SPEAKER_02 (01:09:20):
Well, we would have
done it.
Yeah.
Uh you just said like it's never perfect.
Do you also remember during that time, like if you posted it and you did hashtag BLM, they were like, you can't do that because it's it if you or you were either supposed to hashtag BLM or ha or not hashtag it because it was messing with the
(01:09:41):
algorithm?
SPEAKER_00 (01:09:42):
Yeah, I I like that.
So there were like rules around it.
SPEAKER_02 (01:09:45):
So they were like,
if you're posting it, you have
to do it this way.
SPEAKER_00 (01:09:49):
And now we know that
BLM took a lot of the money and
didn't give it to the black communities.
Yeah.
And so people got bullied for that.
Yeah.
Um, I think also people get bullied to give money to different organizations or nonprofits or fundraisers.
And sometimes I get why people, I'm more hesitant about it now because I don't know where that money is going.
SPEAKER_02 (01:10:09):
I want to see where
it's going.
SPEAKER_00 (01:10:11):
Yes.
SPEAKER_02 (01:10:11):
And you know, you
know what that is though?
That's using your critical thinking and not just believing it because they say it.
Like I want to know for a fact that my money is going to go do some good before I just give it to you.
SPEAKER_00 (01:10:25):
Yeah.
The founders went and bought some mansions and shit.
Yeah.
SPEAKER_02 (01:10:28):
Yeah.
That's been proven.
Yeah.
SPEAKER_00 (01:10:30):
Not speculation.
SPEAKER_02 (01:10:31):
That is not a
conspiracy.
SPEAKER_00 (01:10:32):
Yeah, that's a fact.
SPEAKER_02 (01:10:34):
Um, okay, so lack of
practice or education.
So this is just like lack of practice of critical thinking.
If someone was never taught how to question information or analyze evidence, they just default to emotional reasoning or authority because that's what they were taught.
They don't know any different and they don't teach critical
(01:10:54):
thinking in schools.
They really they don't do this stuff.
I wonder why.
It's an, mmm, but wonder why they don't teach critical thinking in
schools.
But like to me, like that's not, it's not because you're stupid.
It's not stupidity, it's not ignorance.
It really is just like you haven't been taught to do this and you have to train yourself to do this.
Um, this one's where, oh, you're gonna trigger.
(01:11:18):
You see where I'm going?
Yeah.
Um trigger.
Okay, let's go back to like this is what would affect someone's
critical thinking skills:
psychotropic medication.
(01:11:24):
Okay, why's that, Leah?
If you don't know what psychotropic meds are, they are psychoactive drugs that affect the brain to treat mental illnesses or emotional disorders.
These medications blunt your emotional range, which
(01:11:46):
indirectly affects your thinking.
If your emotions are dulled, your drive to question can drop.
So your curiosity goes out the window.
Critical thinking thrives on curiosity.
So without that, you can't be a critical thinker.
Antidepressants lower dopamine activity.
(01:12:08):
This is one of those things.
When I read this, I was like, holy shit.
Think about all the people right now who are being diagnosed with ADHD in their 30s and 40s.
And ADHD is low dopamine and norepinephrine.
And antidepressants lower your dopamine.
(01:12:31):
I did not know that.
Did you know that?
SPEAKER_00 (01:12:34):
Well, I know it from
you.
SPEAKER_02 (01:12:36):
Yeah, because I
probably I think I texted you as
soon as I like read this and I was like, holy shit.
Well, didn't know that.
I know they affect your gut health, which is crazy because that's like your second brain.
95% of your serotonin is produced in your gut.
So if you are destroying your gut health, of course your body
(01:12:56):
is going to stop producing serotonin naturally.
Yeah.
That's crazy.
SPEAKER_00 (01:13:00):
I just, I I
constantly go back.
Leah and I are a part of this Facebook group.
Um, and it's SSRIs destroyed my marriage.
And it's just thousands and thousands of people who are sharing stories about how um their partner took antidepressants and it numbed them.
(01:13:20):
Um, it made them numb to kind of everything, and that caused a lot of trouble in their relationship.
And there is just a whole support group of people who have had a um partner who they feel like they lost because of SSRIs.
(01:13:41):
And it's it's really wild.
Yeah, it's it's hard to read some of those stories.
Oh my god, it it it's heartbreaking.
SPEAKER_02 (01:13:48):
And dopamine is what
fuels motivation and your reward
chemical.
So those are like where the curiosity comes in.
Do you know how you know how many dopamine hits I got?
Like doing all this research?
It's like one after the other.
SPEAKER_00 (01:14:04):
But can you talk
about when you were um had post
postpartum depression just a little bit?
Obviously, we're not getting into it.
We talked about it before, but how you know you were in a bad place in your marriage, and like Jason would do things and stuff that would have made you mad and it should have made you upset.
SPEAKER_02 (01:14:23):
Yeah, I know.
I felt nothing.
Completely numbed me.
Like he was, this is during his like addiction phase where he would go out and not come home or not answer the phone or not let me know where he was.
And before I would like lose my fucking shit when that would
happen.
(01:14:45):
Oh my God.
Like I would be so emotionally dysregulated for days after that.
And when I was medicated, I didn't care.
And at one point he said he liked me better when I was on medication because it was easier for him.
(01:15:08):
Because I wasn't getting upset when he did things that should
very much upset me.
SPEAKER_00 (01:15:16):
Even like a show or
movie that would typically make
you cry.
SPEAKER_02 (01:15:19):
Yeah, that's how I
knew I needed to come off is
when I watched a movie and didn't cry.
I cry over everything.
I'm the biggest fucking crybaby in the world.
So I was like, mm, I think I'm a little too numb right now.
And I was on what they would call the baby dose.
The smallest dose of Zoloft that is safe.
(01:15:40):
Quotation mark, safe during pregnancy or breastfeeding, because I was so afraid of taking medication and my doctor knew that.
And so she was like, why don't we put you on the lowest dose?
And I'm like, okay, kind.
I can't even imagine how much more numb I would have become if I continued to use it or if I upped my doses.
(01:16:01):
Because I was already so numbed out.
And here's the thing, it kind of worked.
It it did what it was supposed to do, kinda.
I no longer felt depressed, but I also felt nothing.
SPEAKER_00 (01:16:15):
Yeah, no, I want to
feel joy.
SPEAKER_02 (01:16:17):
Yeah.
I felt nothing.
So yeah, a side effect of a lot of these medications is brain fog, which that is like common information, and slow
processing.
All right.
So another thing, depression and anxiety can affect it.
Uh, depression and anxiety impairs your critical thinking by affecting your range of cognitive function, um, your
(01:16:41):
memory, your decision making, promoting negative thought patterns.
Anxiety can lead to a bias towards threat, intrusive thoughts, or rumination.
And depression can slow information processing.
So it encourages hopelessness and pessimism and all-or-nothing thinking.
And we are one of the worst countries with mental health,
(01:17:05):
not just mental health, but also we're very highly medicated.
The most medicated and the worst mental health.
Make it make sense.
SPEAKER_00 (01:17:12):
Well, and also like
we're so medicated.
How's that working out for us?
Apparently, not well.
SPEAKER_02 (01:17:17):
It's not.
It's not.
Um, and then the last thing, manipulation and propaganda, which I will go over this rabbit hole later.
Media, political groups, marketing, they all use emotional storytelling and social proof to bypass your critical thinking entirely.
And this is, again, I will say, this is by design.
(01:17:39):
It is meant to do that.
If you feel fear, outrage, or urgency, that is often a red flag, that something is trying to shortcut your rational mind.
People don't lose their critical thinking because they are stupid or ignorant or naive.
They lose it because they're tired, scared, overloaded, and
(01:18:02):
you're probably being manipulated.
Okay.
Love that.
Chapter one done.
Actually, I think that was chapter two.
How to strengthen your critical thinking.
All right, this is important.
And I think that this is like a practice that everybody could
(01:18:22):
use.
Okay.
Regulate your nervous system.
So when we say go touch grass, that is literally go out in nature.
If you are stressed or angry and you can't think critically, go breathe, walk, stretch, ground yourself, delay your reactions.
(01:18:43):
And you don't always have to repost or even respond to someone right away.
This is like a really big thing for me.
And I think that like you've pointed this out.
And sometimes I swing too far in the other direction of not responding.
But like I am very much of the belief that I owe no one a
(01:19:04):
response right away, ever.
SPEAKER_00 (01:19:06):
And I love that
because I'm I've I've I've been
very, very guilty of being such a quick responder and a quick reactor when I should have sat on it for a moment.
SPEAKER_02 (01:19:16):
Well, and I also
think that there is a part of me
that wishes I used, I used to wish that I could do what you
do.
But even through human design, I have learned that I am not meant to respond when I am in an emotional wave.
So it is absolutely okay for you to take time to regulate
(01:19:37):
yourself before you give any sort of answer or response.
So yeah, go outside and touch some grass or get in some water.
A calm body and a clear mind is going to help with your critical thinking.
So get out of that emotional state.
Question how you know something.
Practice asking yourself these questions.
(01:19:59):
How do I know this?
Where did I learn it?
Is it a fact?
How do I know if it's true?
What would change my mind?
That's a big one.
That is a very big one.
Because if nothing can change your mind, that is a belief, not
a conclusion.
(01:20:21):
If you have an opinion on something and there are facts that disprove that, but they are not changing, like why would
that not change your mind?
Right.
Like it's like I don't know.
I don't know.
SPEAKER_00 (01:20:37):
But there are people
who I've had discussions with
where I'm like, no matter what I presented to them, nothing, it
doesn't matter.
And I think about like a marriage.
Like I think about if you're struggling with your spouse, you're in a disagreement of something, and if both people,
(01:20:58):
or one person even is like, nope, not changing my mind.
No matter what you present to me, no matter how like you tell me you feel, this is it, this is what I think.
That marriage would never fucking work.
SPEAKER_02 (01:21:13):
Or you'd be
miserable the rest of your life staying in that.
SPEAKER_00 (01:21:17):
Or whoever the other
person is, is they're like,
okay, well, I can't, I gotta walk on eggshells.
I can't, there's there's nothing I can say to change their mind.
There's nothing I can do.
I'm I'm I'm gonna, you know, self-censor or mute, or I'm gonna be miserable, or I'm gonna whatever.
And it's like that's that's an awful, such an unhealthy relationship.
Yet we have normalized that in politics.
SPEAKER_02 (01:21:40):
I think that this
would be a good time for you to
say what you wrote down earlier that Oscar Wilde said.
SPEAKER_00 (01:21:45):
Oh, yeah.
Okay.
SPEAKER_02 (01:21:46):
So I feel like this
fits right now.
SPEAKER_00 (01:21:48):
It does.
So Oscar Wilde said that consistency is the last refuge of the unimaginative.
And, you know, a lot of highly intelligent people can change their mind when presented with different information.
I think about like if like that is a flex.
SPEAKER_02 (01:22:06):
It is a flex to change your mind.
SPEAKER_00 (01:22:09):
Absolutely.
It's like I think it's a flex right now to be in the middle.
And I think about if, you know, we would have met up for lunch and you told me about mushrooms, and I would have just been so closed off because I didn't agree with it.
I didn't know it, whatever it was.
I was not open to hearing it.
Where I was it like my life, I think about my life and where I
(01:22:33):
would be and how much that changed my life because of that conversation where I was curious to what you had to say and was willing to like listen and to hear you.
And I think, you know, we need to make it more common to say things like, oh, I used to think, or, oh, well, that's a
(01:22:53):
good point, or didn't think about it that way.
Yeah, or oh, let me reconsider, or, you know, I've changed my mind given different information.
Or I used to agree with something and now I don't, or vice versa.
I I I wish that we could get to a place, and I hope we do, where that is more um normalized.
And I think a lot of people, they they double down on these
(01:23:15):
points to protect their ego.
But to me, that is um a measure of unintelligence because you're protecting your ego and you're tying your identity to being right instead of just being curious and being willing to learn or being willing to be wrong.
And Albert Einstein also said the measure of intelligence is
(01:23:37):
the ability to change.
SPEAKER_02 (01:23:40):
So this one I think
is really hard for people
because I don't think people understand how much they operate
out of ego unconsciously.
I think that's something that you could ask yourself is like, why do I want this to be true so bad?
And if it turns out to be true, how would I feel?
And this isn't to judge yourself or to criticize yourself or to
(01:24:05):
beat yourself up, but to be aware of something that you might be attached to that that goes much deeper than the surface.
So, like, I'm just trying to think of situations where we have confronted ourselves with truth and been like, oh, that's really uncomfortable to sit with.
(01:24:25):
I had to sit with the fact that I was a villain in my own story, and that was an extremely uncomfortable truth.
But I didn't want to see that because I knew once I saw that there was gonna be some judgment, there was gonna be some shame, there was gonna be some guilt.
And that happens a lot in sobriety when you are faced with this truth that you have an issue.
(01:24:45):
Nobody wants to see that.
An addict doesn't want to see that because if they sit with that, they have to face this uh truth that you weren't always the victim.
And that that sucks because it means you may have been the villain.
And you might feel shame, you might feel guilt, and those are
(01:25:07):
very uncomfortable things, and you you have to it's just a hard thing to sit with.
And I think it's important that you ask yourself, like, why do I
want this to be true so bad?
SPEAKER_00 (01:25:17):
Well, and I think it
is a lot easier to project than
to self-reflect.
Yeah.
I mean, think about that.
It is a much harder job.
I would say it's more worth it to look at yourself, but it's a
(01:25:38):
much harder job to look at yourself and take some self-responsibility for your own suffering, or you know, realize that maybe some of the things that you thought you knew you didn't, or what you thought made you wrong.
Yeah.
Or admitting you're wrong.
Yeah.
SPEAKER_02 (01:25:56):
And I think that
that's what I was trying to tie
back to the ego.
Like our ego is so, so strong.
It does not want to admit fault and it does not want to admit that it was ever wrong about anything.
So I just think sometimes it's hard to like put our ego to the side and be like, hey, hey, hey, it's okay.
(01:26:16):
You're allowed to change your mind.
That's actually more admirable.
SPEAKER_00 (01:26:21):
Yeah.
SPEAKER_02 (01:26:21):
You're allowed to
say you were wrong.
That's more admirable.
Yeah.
SPEAKER_00 (01:26:25):
I've changed my
mind.
SPEAKER_02 (01:26:26):
Yeah.
Okay.
So this one is a really, really, really, really tough one, but diversify your sources.
Ooh, yeah.
Be careful of the I Did My Research illusion.
Let's say you go down a rabbit hole, you read all the articles, you watch all the videos, and you feel like you've thought deeply about it, but all your research came from people who
(01:26:49):
already agree with you.
It feels like critical thinking, but it's actually confirmation bias because you're unknowingly sometimes gathering information that supports what you already believe.
If you want to do that more thoughtfully, more intentionally, you're going to have to deliberately look for
(01:27:10):
disconfirming evidence, not just collecting evidence for your side.
I'm going to give an example because earlier when I was doing some of my rabbit holes, one of the questions that I asked was, is fluoride good for your teeth?
If you Google that, it's going to be all the research that's saying that it's good for you.
If you change your wording and say, Is too much fluoride bad
(01:27:33):
for your teeth, you're going to find scientific research and evidence supporting that.
I've been guilty of this, where you Google something and there's like, see, see, look, it says it right here.
But then you Google the other side of it, and then it's like, oh, but then it also says this.
So you almost have to expose yourself to perspectives and
(01:27:56):
research that you don't agree with.
And this is kind of where I wanted to touch on like the political algorithm that you we were talking about earlier, like where you are creating this echo chamber of people who believe you or who believe the same things that you believe, and you only follow people that believe like that.
I have had to stop myself from deleting people before because
(01:28:20):
they disagreed with something I said or because I disagreed with
something they said.
Think of all the people right now who are like, if you believe
this, go ahead and delete me.
Oh my gosh, yes.
Go ahead and delete me.
You're doing me a favor.
This happened recently where I like wanted to delete somebody because they said something so out of pocket that I'm like, I
can't support that.
(01:28:40):
But then I was like, wait a minute, I'm doing it.
I'm deleting someone because I don't like what they had to say.
And I think that that's an important thing for us to remember.
You're only creating even more of an echo chamber if you're muting and deleting anybody who doesn't agree with you.
(01:29:01):
That's crazy.
SPEAKER_00 (01:29:03):
And I've been so
guilty of that.
Yeah.
But again, I think about that time when I acted out in that way and where I was mentally and emotionally and even physically.
And it was not well.
And well, and I'm go ahead.
(01:29:25):
No, go ahead.
No, you go ahead.
SPEAKER_02 (01:29:29):
I just I wanted to
add to that that in the last
couple of years, you and I have listened to people that we probably never would have given the time of day before.
And I'm talking like podcasts where we're like, we were, I thought we were supposed to hate this person and I actually
(01:29:50):
really like what they have to say.
And I don't agree with everything that they have to say, but they make some really good points.
And I kind of agree with this part, and they're not as bad as I thought they were, you know.
If you follow people across the political spectrum, not to argue, but to understand how they think, it keeps your mind
(01:30:13):
agile, it prevents echo chamber brain rot, and it prevents confirmation bias.
And it might be uncomfortable, but that's the point.
A comfort zone is a beautiful place, but nothing grows there.
So I think it is okay to follow people you disagree with.
(01:30:34):
All right, evaluate your sources and the motives.
Ask who's saying this and why.
What do they gain if I believe it?
And are there primary sources or is this just opinions being repeated?
So what I always say is like, follow the money.
Like if they're quoting studies, like who do you really think that a pharmaceutical company is going to put a study out that
(01:30:55):
says their drug is not safe?
SPEAKER_00 (01:30:58):
Also, do you think a
pharmaceutical company is going
to put out studies where there are, let's say, okay, holistic practices such as plant medicine that has lifelong benefits?
Are they going to put studies into that?
unknown (01:31:17):
No.
SPEAKER_00 (01:31:17):
No, because it
doesn't make them money.
SPEAKER_02 (01:31:20):
Do you know how much
money it costs to do research
and studies?
How much? And to have them published? Millions and millions and millions of dollars.
So a lot of times, these other practices that are not owned by billion-dollar corporations don't have the money or the resources to conduct the studies that they would like to conduct.
(01:31:44):
There have been people who have tried to make studies happen and they've run out of money and they don't get funding and they don't get grants.
And it's like not a simple one-year process.
Sometimes, I mean, they've been trying to get MDMA off the scheduled drug list for how many years?
Like this has been going on since the 90s.
SPEAKER_00 (01:32:04):
Yeah.
SPEAKER_02 (01:32:05):
And it has taken
decades of research and funding
and people backing them up and fundraisers and grants.
Like that is not an easy thing to do.
But if you are a billion-dollar company, you can pay for the
fucking science.
SPEAKER_00 (01:32:21):
And it be in your
favor.
SPEAKER_02 (01:32:22):
And it 100%.
They wouldn't publish it if it wasn't.
SPEAKER_00 (01:32:27):
I think about um, I
don't remember what season it
was, but when you talked about the MDMA study on Oprah.
I feel like that was season one.
That was probably season one.
Yeah.
And can you kind of just like wiki notes it?
SPEAKER_02 (01:32:44):
Oh God.
Um, so Oprah did an episode where she talked about how MDMA
put holes in your brain.
They did the study on monkeys.
Um, these monkeys ended up dying.
Was it monkeys or rats?
It was monkeys.
Okay.
We'd have to go back and find that episode.
That was a rabbit hole I went down a long time ago.
(01:33:04):
Um, and they showed these brain scans of these holes in the brain, which all of this was debunked, but she never came out and said that the information from that episode was not
factual.
SPEAKER_00 (01:33:18):
And what was wrong?
SPEAKER_02 (01:33:19):
Well, so much was
wrong.
Number one, so MDMA, the long version of it is like methyl
dias, something, something,something.
Yeah.
But it is very similar to meth.
Not actively.
The word.
They were giving them meth.
It wasn't MDMA.
It wasn't MDMA.
(01:33:39):
They were giving these monkeys meth.
Not only that, but a typical safe dose of MDMA is between 80 and 120 milligrams.
Okay.
Okay.
That's how much meth they were giving these monkeys.
And a typical dose of meth is like five milligrams.
(01:34:04):
Oh.
So they were fucking killing these monkeys with meth.
It was not MDMA.
And the other thing that was debunked is this brain scan showing holes.
This person wouldn't be alive if those holes were actually holes.
So all of that was debunked, but never corrected.
(01:34:25):
Corrected.
But that is the narrative that everybody ran with.
SPEAKER_00 (01:34:28):
Well, it's
interesting that you bring it up
because I know somebody who is a nurse and she literally brought up MDMA and she's like, well, they put holes in your brain.
And I wanted to say something so badly, but again, it's hard to
(01:34:49):
have a conversation with somebody who is dead set on being one way.
So no matter what I say, I it's it's pointless.
And I and then it's like you feel like you have to put a muzzle on yourself because they're openly talking about something.
And that happens a lot too, where I feel like people are so
(01:35:11):
open to talk about where they're at, but they're unwilling to talk and listen about where the other person's at.
They just assume there's a lot of assumptions going on that they know where you're at and they don't.
And again, it's like to group people in as they're this or that.
Oh my gosh, you're doing people such a disservice.
SPEAKER_02 (01:35:33):
If I'm quiet in the
conversation, you can probably
go ahead and assume that I don't agree with what you're saying.
I just don't know how to say it.
SPEAKER_00 (01:35:41):
Ooh, yeah.
SPEAKER_02 (01:35:43):
Because if we're on
the same page, I can have some
really good, not even the same page.
It doesn't even have to be the same page.
Because I have had conversations with people that I don't necessarily agree with, and they have been very good
conversations.
Yes.
But yeah, no, a lot of people will just assume and talk out loud in front of me.
And I'm like, that is crazy that you would say that right in
(01:36:06):
front of me, and you have no idea where I stand on that
issue.
Yeah.
SPEAKER_00 (01:36:09):
Okay.
My personal favorite is people assume because of the color of my skin and my ethnicity, they assume everywhere I lie.
And again, you might not be as um for diversity as what you think, if that's what you think about people of color.
SPEAKER_02 (01:36:31):
Can I say a
stereotype that I feel like I
fit into for a long time?
Yeah, sure.
So I have been doing pink hair literally since I met my husband 20 years ago, on and off.
It has been, I have like always dreamed of having pink hair.
So I will go on and off.
(01:36:52):
Pink hair, rainbow hair, purple hair, blue hair.
And he finally said something to me.
He was like, it just it's giving white angry liberal.
And I was like, all right, I hear you and I see it now.
And so I think that just made people assume.
(01:37:15):
Yeah.
And I hate that because it's like such a fucking stereotype because I just like the color pink.
SPEAKER_00 (01:37:21):
Well, and I and I
also love when I see somebody
who's like really tatted out and they have like a septum ring and they're a Christian or they're gay and they're a Christian or
breaking those stereotypes.
They break these stereotypes.
And I'm like, I love that.
So I think you should stand in it.
SPEAKER_02 (01:37:41):
I was gonna say, in
a way, it kind of makes me want
to do it anyway.
Because I'm like, don't you put me in the box?
I didn't, I didn't put myself in that box.
Right.
Like pink's been my favorite color my whole life.
I have dreamed of having gray hair one day so I can just dye
it pink all over.
My God, you would eat that up.
Oh my God.
Okay, sorry, sorry.
So back to like the follow the money, follow the influence.
(01:38:04):
Like, I think that that's really important to understand that like not all the science is going to be as accurate as you think because it's biased.
It's being paid for by people who want you to believe a certain way.
And there are a lot of things that they can do to sway the
results, you know?
unknown (01:38:24):
Yep.
SPEAKER_00 (01:38:24):
Just saying.
SPEAKER_02 (01:38:26):
Just saying.
What was also interesting about that, that is not a conspiracy, by the way, that was debunked and these people lost their jobs
because they fucked up so bad.
Oh shit.
It was like a big thing.
So that's not speculation.
That's fact.
Fact.
Okay, learn basic logic and fallacies.
(01:38:47):
This one, it's hard.
You don't need to be a philosopher, but there are logical fallacies.
I had to Google this, and I'm gonna invite other people to do this on their own.
Look up the 10 logical fallacies, what they are and what they mean.
There are examples of this, and they're called like the straw man, ad hominem, slippery slopes.
(01:39:07):
There are examples of these if you Google this information.
It's like learning the language of a narcissist.
Oh shit.
And once you know the language and the way that they word things, you can pick it up.
Like you're you're you've like already worked on your pattern recognition.
So then you can be like, hold on a minute.
(01:39:29):
They're using the straw man strategy here.
I can see it.
And so this doesn't feel right to me.
So it's it literally is like learning what gaslighting is, learning what triangulation is.
Equip yourself with this information so you can see it when it's happening in real time.
I don't know enough about that to go on a tangent, but it makes
(01:39:52):
me want to learn.
SPEAKER_00 (01:39:53):
Okay.
Another rabbit hole.
SPEAKER_02 (01:39:54):
Stay curious, not
cynical.
Critical thinking isn't about being a skeptic of everything, it's about withholding certainty until there's a reason for it.
Healthy skepticism asks, could this be true?
It's not, this is definitely false.
Curiosity keeps the door open.
So curiosity is not the same as skepticism.
Practice in real life.
(01:40:15):
Try critical thinking repetition.
When someone tells you a shocking story, fact check it.
Like I did with the litter box in the high schools.
The shock value in that.
Like if something sounds unbelievable, it might be.
So fact check it.
(01:40:37):
When you feel a strong emotion, whether it's online or in real
life, pause.
Take a breath.
Walk away if you need to.
It is even okay to say, Let me, let me get back to you on that.
I don't know where I stand on that.
You're presenting a different piece of the puzzle that I didn't know was there, and I need to process this
(01:40:58):
information.
Let me get back to you.
All right.
Get comfortable with uncertainty.
I think we already said this.
You can say I don't know without shame.
Like saying I don't know is actually it it's kind of a flex.
Like I get onto my 13-year-old about this all the time because he is his confidence is through the roof, even when he is wrong.
(01:41:22):
And it is so infuriating the amount of arguments I get in
with a 13-year-old.
Okay, well, I am getting, because he's like, yeah, it is true.
And I'm like, did you research it?
And I'm like, did you researchit?
Did how do you know that's true?
Because it's not, Austin.
It's not.
SPEAKER_00 (01:41:41):
I feel this with a
six-year-old, and he'll be like,
This is what this says.
And I'm like, bro, you don't even know how to read.
Fuck, what the fuck?
SPEAKER_02 (01:41:54):
I love his
confidence.
I really, really do.
But I need him to add a little discernment to it.
SPEAKER_00 (01:42:01):
Yeah.
And I need my boy to learn how to read.
SPEAKER_02 (01:42:06):
Okay.
So, like, love the confidence, step it back a notch.
Make sure you're right before you die on the hill that you are right.
It is okay to say you don't know for sure.
Yeah.
That is what I've been having to teach him lately.
Um, and then the last one, reflect and recalibrate.
(01:42:26):
Every once in a while, audit your own beliefs.
Oh, this is a really good practice.
And we did this in our Patreon group chat.
What is something you believed five years ago that you no longer believe?
And I think that's a really good way to track your growth and to
keep your ego from fossilizing.
(01:42:47):
So, with that, I think we can probably end this, but I want to ask that question to both of us.
I don't want to say end this episode.
I want to, there is going to be a part two, and this is going to be the rabbit hole.
That wasn't even a rabbit hole.
The part two is the part that you have no idea about and the
(01:43:09):
history of propaganda, the father of propaganda, why it works so well, and then these clinical and psychological studies that have been done to prove how effective propaganda is.
And this all goes in line with like critical thinking.
I think if we can work on our critical thinking skills and
(01:43:30):
remember what overrides them, we can become immune to propaganda.
We can recognize it when we see it, and we can realize that this is not information that I need to share because I am using my own discernment to say, I don't know if that's true or not.
(01:43:52):
And it's not information that I want to be spreading.
Because at this point, propaganda is not just something done by the government or by ad agencies or marketing agencies.
We're all doing it because we're all spreading the misinformation and spreading the propaganda.
Whether it's misinformation or not, you're kind of playing a
(01:44:16):
role in it without even knowing it.
So I think that that will be an important episode.
And it'll kind of fuck with your heads a little bit.
Can't we?
In a good way.
So tell me something, Christine, or a couple of things that five years ago you believed that now you don't.
SPEAKER_00 (01:44:35):
Um, okay.
So a big one is um, I think the biggest one that stands out is, you know, in 2020 was when I, that was a like late election, and in the morning I woke up, I think, and I I I cried because I was like, I don't want Trump to win.
(01:44:57):
And if Trump is gonna win, we're gonna have to move away.
And I was very extreme in my thinking, and it was very um, it came from a very emotional place.
And I think the change has been being able to find um emotional, more emotional regulation when I am making decisions and forming
(01:45:23):
my beliefs and opinions around things.
But the other thing is um that I was so black and white that the things that I was doing were good and the things that somebody else was doing was evil.
And so I think I've gotten so much better at um being able to find gray and hold nuance in things.
(01:45:45):
And it's it helped me in this election because I wasn't coming from such an emotional place and I was able to listen to people that I thought that I hated, even if I still didn't agree with certain things that they said or where they were at.
And I felt like it a lot of um my opinions now come from just a
(01:46:08):
more just grounded place, not so emotionally reactive and thinking that like I'm right, this is wrong, being like, I don't know.
I'm not sure.
I'm gonna find out, I'm gonna listen to something that makes me uncomfortable, I'm gonna I'm gonna listen to somebody I don't agree with and still be able to like not lose my shit.
(01:46:30):
And before I could not have said that.
SPEAKER_02 (01:46:33):
I've learned a lot
of shit from people that I used
to refuse to listen to.
unknown (01:46:38):
Yeah.
SPEAKER_00 (01:46:38):
So what would you
say your answer is?
SPEAKER_02 (01:46:40):
Well, I have the
first one is an obvious one.
I think before I ever did my first mushroom journey, I was very much like drugs are bad, mkay.
Uh not necessarily because I think I started doing my own like holistic journey before that.
(01:47:00):
Like after my daughter was born, I kind of got into cannabis a little bit for my anxiety.
And then it that was like the gateway to open up.
And then I opened up about that.
Let me tell you something that like has kind of it triggered me before.
And now I'm like, yeah, you know what?
Yeah, yeah.
(01:47:21):
What else am I into?
I when I posted that we were doing this podcast, one of my old friends was like, What new thing are you doing now?
Because I was constantly like, I opened up about my cannabis use and we had a podcast about cannabis, not you and I, but I had a caught podcast about cannabis.
Right.
(01:47:41):
And then a couple years later, I'm talking about psychedelics and opened up about that on my Instagram.
And so when she said that, at first I was like, what the fuck?
That makes me feel really shitty.
You know, like, what new thing are you on to now?
What are you doing now?
Now I'm like, yeah, because I am constantly changing my mind and
(01:48:05):
open to new information.
And I can't imagine going back to 10 years ago when I was so closed off and rigid in my thinking that I would have shut the door to the possibility of where I am now in life.
And I can't imagine that.
I love where I am in life right now.
SPEAKER_00 (01:48:27):
Yeah.
SPEAKER_02 (01:48:28):
And to think that I
might have been stuck as that
very depressed wife with three kids with an alcoholic husband, like that makes me sick to my stomach to think that like there is a change, there's the thing that could have held me back would have been to not have an open mind.
Obviously, my thoughts on psychedelics have changed.
(01:48:51):
And then another thing, and I'm gonna say this because I do think that this could be an episode that we talk about later.
I used to consider myself a feminist.
And then I listened to this podcast about maternal feminism and how very different it is than the feminism that I
(01:49:12):
considered right, and it kind of aligns with where I am now more, and it's not a side of feminism that is talked about because it's almost otherized in the feminist community.
SPEAKER_00 (01:49:29):
I think that modern
feminists have left out maternal
feminists 1000% in this movement.
SPEAKER_02 (01:49:38):
And I think we
should talk about it.
I would love to because that is something that I have completely shifted and changed my mind on.
And I also want to say I am open to changing it again, because
isn't that kind of the point?
SPEAKER_00 (01:49:54):
Yeah.
SPEAKER_02 (01:49:54):
Of life is
constantly growing and evolving
and changing.
And I would think that if I was 70 years old and and my thought process was like, I haven't changed a bit.
Since I was 20 years, I've been the same person my whole life.
I don't know if I want that to be my life.
(01:50:19):
I I mean I I know I don't want that to be my life.
SPEAKER_00 (01:50:21):
It wouldn't be your
life.
unknown (01:50:22):
Because look.
SPEAKER_02 (01:50:23):
Once you start doing
all this shit, you can't go
back.
Like Pandora's box is open.
There's no going back in.
SPEAKER_00 (01:50:30):
I like to say that
uh um I'm awake, not woke.
SPEAKER_02 (01:50:33):
I like that.
I like that.
So I, again, and I would consider us the awakeners, not the
awokeners.
SPEAKER_01 (01:50:42):
Yes, I love that.
SPEAKER_02 (01:50:44):
I love that.
Okay, so if you made it this far, congratulations.
I say you put this stuff to practice, put it to use.
Um, and please listen to the next episode because it is going
to blow your fucking minds.
I can't wait to blow yours.
I know.
And I love this ribside because the whole time I'm like, I know.
(01:51:06):
All right.
Stay curious, be open.
We'll see you guys on the other side.
SPEAKER_00 (01:51:12):
Bye.