Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:08):
Welcome! Hello, Big Talker, episode five five four.
Speaker 2 (00:16):
Yeah.
Speaker 1 (00:17):
Uh, Danny Red is unavailable today. Nathaniel is back to health.
Not perfect, not perfect. We're here, we're here. We're
both fucked. Uh, and actually I mentioned this on my
other pod yesterday, but I bit my lip a week
(00:38):
ago, and something about maybe the illness... my immune system,
like, is not functioning well, and the
placement of it. Every time I talk, it's rubbing against
my canine and every time I eat, I'm like jabbing
it and getting salt on it, and it's just like
horribly inflamed and like damn every time I'm speaking.
Speaker 3 (01:02):
And if you... I wouldn't have noticed from an exterior perspective,
but now that you have mentioned it, I'm thinking to myself...
Speaker 1 (01:10):
Goddamn, this is as big as I've ever seen. I should
sit further away. It's, it's not, it's not the contagious
kind, unless I headbutt you. It's just, I feel like
I notice myself talking weird because I'm holding it,
trying to, you know. But I'm going to push through again.
So you're, you're a brave, brave man.
Speaker 2 (01:29):
You know what we all have pross there?
Speaker 1 (01:32):
Uh, man, a beautiful day in Friday Harbor. Waterworks Gallery,
Unversus Studios.
Speaker 2 (01:38):
Good views up here. It's pretty incredible.
Speaker 1 (01:42):
Yeah, it's Valentine's Day. Valentine's, Valentine's. Are you doing anything? No, not...
we don't, I don't... Quinn and I don't care at
all about it. My dad's birthday is today, so I got to
call him. And Quinn's birthday is tomorrow. Yeah. And now
she will be thirty-eight. Thirty-eight, yes,
(02:03):
that's my age too. Yeah, yeah. I
got her a nice fifteen-pound gold label Wagyu brisket
from Snake River
Speaker 2 (02:12):
Farms. And okay, that's a... yeah, that's a lot of meat.
Speaker 1 (02:15):
Yeah. Our buddy Aaron is cooking it up. You know,
she's not... you know her, she's an introvert. So I'm like,
what do you want to do, like, throw a big
bash and invite everyone? And she's like, no. Like, yes,
that sounds great, we need more of that. Yeah. So we're
just gonna have a little, a little dinner with the
homeschool crew and eat some brisket. Nice. Yeah, but nothing. Yeah,
(02:38):
Valentine's Day is a non entity.
Speaker 2 (02:41):
Yeah, it's just such a Hallmark holiday.
Speaker 3 (02:43):
That even the thought of celebrating it makes me think about, like,
like the people that are out there that have done
like absolutely nothing to show affection or love for one
another throughout the year. Yeah, and then they must be
just like freaking out, be like fuck, but thank god,
(03:03):
we've got this one.
Speaker 1 (03:04):
Day, and we're gonna make up for it here. That's
a cynical way of looking at it. But I think
you're right, though. I think people use it as an
excuse to, like, well, get flowers on Valentine's Day, like...
Speaker 2 (03:13):
I went to Rite Aid and got you some chocolates.
Speaker 1 (03:16):
Yeah, you like the shitty... yeah. No, it's, it's
like anything else in our society. It's a fucking honey pot.
You know. It's just made to force you to feel
things that you don't want to feel and buy things
you don't want to buy.
Speaker 2 (03:32):
Okay, love doesn't exist.
Speaker 3 (03:34):
We're not convinced that friendship does either, So.
Speaker 1 (03:39):
No, feelings in the... like, like the manufactured guilt
about not buying, yeah, fake chocolate, you know what I mean. Yeah,
it's like I and like you said, I try to
show my appreciation and love for my partner as often
as humanly possible, and yeah, to have one day about
it is.
Speaker 3 (03:59):
It's also just that, like, the formula of
Valentine's Day is, I mean, it's very rigid, right? I mean, like,
I guess what is presented as this thing that you're
supposed to do is like flowers, chocolates, dinner.
Speaker 1 (04:14):
Jewelry, like, bath stuff... yeah, it is very strange.
Speaker 3 (04:19):
I don't like doing things, like, as a collective
like that, that are all the same.
Speaker 1 (04:25):
Well, it feels like mind control, you know. It feels
like, take your prescribed romance pill. Yeah, eat your prescribed
romance goo. Yeah.
Speaker 3 (04:36):
Christmas kind of feels the same though they all yeah, yeah,
just like the pressure to like get everyone these fucking gifts,
especially at a time when, like, the country's melting itself
and, like, things are not in, I feel like, the
greatest place just globally in general. So yeah, it feels
super surreal to be like, oh, but you know, here's
(04:59):
this free gift.
Speaker 1 (05:01):
Let me spend a bunch of money on trinkets, objects.
Speaker 3 (05:05):
Even though I do like doing that. One of my
favorite things to do is very Michael Scott-esque. It's
like when people are like, Okay, we're gonna get people
gifts this year. We're gonna try to kind of like
not spend any more than this, And then I'll just
sit there and I'll be like, m yes, you're nodding,
and in my head, I'm like, I'm going to spend
way more than that on this person,
(05:26):
and I'm going to be the better gift giver despite
what we've agreed upon right here. I will break the game.
Honestly, for me, yeah, Christmas is tough, because I'm like,
I appreciate a good gift giver, you know, I appreciate
when people Maybe it's because I feel like nobody takes
(05:48):
the time to like.
Speaker 1 (05:49):
Really know you or like think about what you might like.
Like if someone gave me a knife every year for Christmas,
I'd be stoked, you know. Like socks are cool. Yeah,
but like my sister's a really good gift giver. She's
very thoughtful and like knows what, yeah, what people like.
But yeah, just the pressure to like make sure we
(06:10):
get everybody. I've started doing.
Speaker 3 (06:13):
I did this, this past year actually, where I
kind of realized that the pressure of that holiday in
particular was so much that to even have like the
mental bandwidth to try to be thinking about like, this
is this person, these are the things they like. If
I wanted to get them a gift, this is what
I would do. It's just like, you, you don't have
anything, like, left over in the tank in order
(06:35):
to kind of figure out what that's going to be.
Speaker 2 (06:37):
So instead, throughout the year, well.
Speaker 3 (06:40):
Just when I, when I saw something, where I thought
something came to mind, I'd either put it on a list
so I kept track of it, or I just,
I just got it. And like, I had essentially just
built up... like, I was good to go by the
time Christmas rolled around, basically, for like most people, which
felt really, really good. That way, it gives, you know,
(07:00):
you more time to, like, if you wanted to really,
like, hone in on something for, like, that individual person,
something that you think that they need or want but
they don't even know it. Like, that's the best, you
know, you can, you can come up with that.
Speaker 1 (07:13):
Yeah, yeah. Anticipating, anticipating needs is a love language.
Speaker 2 (07:18):
I used to like to.
Speaker 1 (07:19):
Make things for people, and I haven't had the bandwidth
or time to make anything for many years. But yeah,
like crafting things. Yeah, I did that a few Christmases.
Speaker 3 (07:31):
I think I made people some some cutting boards. I
made a friend like a bed frame. But yeah, like
having the capacity to do that obviously.
Speaker 1 (07:39):
is... Hold on, you made your friend a bed frame?
It was weird.
Speaker 3 (07:43):
Yeah, it's such a big... I realized that the
size of it actually was, maybe also like not crossing
a boundary per se, but like it was a little
bit of a strange one.
Speaker 2 (07:56):
I just wanted to make something cool for them.
Speaker 3 (07:58):
And I was like, yeah, they're sleeping on a mattress
on their floor, all right, let me get them.
Speaker 2 (08:03):
Like, oh, that's totally different.
Speaker 1 (08:06):
That's like, I went, I went by your house, and
I noticed yours looks a little weak for the pounding you
guys might be doing.
Speaker 2 (08:13):
This one's really sturdy oak.
Speaker 1 (08:16):
Actually, I mean I didn't walk by like per se,
but I did. I was already in their house. I was.
I was sorry, I was. I was there.
Speaker 2 (08:26):
I was over visiting them.
Speaker 1 (08:29):
In the house.
Speaker 3 (08:30):
I was already inside, and I was just like walking,
you know, down the hallway, and like I poked my
head into the...
Speaker 2 (08:39):
I can't... this always... This sounds really bad.
Speaker 1 (08:41):
I, I mean, I was... their door was open, anyway.
There was a little bit of jimmying and picking locks.
Speaker 3 (08:48):
It was open, and yeah, I could clearly see like
their their sleeping situation, which looked almost akin to like
if I were to walk into any like back alley
behind a restaurant in Seattle, and you would see like
a gathering of like maybe newspapers and some other items
that could warm you potentially just kind of scrunched together
(09:10):
in order to kind of stave off the night.
Speaker 2 (09:13):
That's what the.
Speaker 3 (09:15):
Bed situation looked like. So I was like, I'm gonna
make that person a bed frame. Well, and I did, and
as far as I know, that bed frame is still
in use, so perfect.
Speaker 1 (09:25):
It was a good gift. Yeah, I mean, but it
was a weird gift. I'd be stoked to receive a
custom bed frame. Yeah, yeah, yeah, Well you also have
the gift of fine craftsmanship available to you. So
I was talking about more like chopping up tree rounds
and coating them as coasters. You know that's nice too, though. Yeah, No,
(09:46):
I think I would burn like some ferns and shit
on them.
Speaker 3 (09:49):
They were, yes. Erica and I do some wood burning occasionally. Yeah, that seems
like, cool. I don't know how you're supposed
Speaker 1 (09:55):
To get it seems like it's really hard to get
clean lines, and you always get like the burning like
outside of the thing that you're actually trying to trace
out though. Yeah, yeah, it's there's I mean, there's different
levels of machines and tips obviously, and the different wood
reacts differently. I've noticed, and I've thought about like doing
(10:17):
a slower, deeper burn and then like a little sand
to take off the Yeah, you know, that makes sense. Actually,
I have a wood burning project that I'm in the
middle of and have been for about eight years. So
no kidding, we gotta pick it up. Yeah, what is it?
It's just a round and I'm burning. We have a
(10:39):
we have a meal prayer, like a food prayer that
we say and I'm just I'm just burning that into
the round.
Speaker 3 (10:46):
Oh, actually, I... maybe, you may have mentioned this a
couple, a few years ago, I think. Is it sort
of like a... it's not like a lazy Susan that,
like, spins?
Speaker 1 (10:55):
Is it... no, that, that was something I was talking about.
More like a calendar, right, I wanted to
make for Quinn. This is just like a wall hanging.
That's... our food prayer is... oh god, what is it?
I'm on the spot now. I'm not eating, that must
be it. I'm not, I'm not primed to think about it,
but yeah, it's like I tried to do some like
(11:17):
easy calligraphy and then just woodburn over it. That's pretty cool,
and then you just get bored because wood burning is slow. Yeah,
slow process.
Speaker 3 (11:26):
Sometimes... I, like, subscribe to a lot of YouTube
woodworkers, and some of these people have the craziest, like,
like the laser burners, oh yeah, and 3D printers and
CNCs, and just all the things, all the
gadgets and those, but especially like the laser where I
(11:47):
think it's a laser cutter, I guess, but it can
do wood burning and stuff, and man, they get super.
Speaker 1 (11:52):
Crisp designs on those things. They're pretty cool.
Speaker 2 (11:56):
Yeah, well that's kind of.
Speaker 1 (11:57):
Like the, the AI of handcrafting, right? Yeah, it's too clean,
I think so.
Speaker 3 (12:05):
But then you know, I've talked to some people about
it and because I'm just like, you know, is that
taking it a step too far? Like are you really
doing like the work at that point? And they do
make a good point.
Speaker 1 (12:17):
They've said a lot of the times that like.
Speaker 3 (12:21):
SketchUp or CAD or 3D modeling, in
order to create, like, to put that path to a
CNC machine to have it cut out. That kind of
stuff is like, that's that's a skill for sure, like
of its own I have. I've never dabbled beyond like
doing some SketchUp. And I think after the first like
forty five minutes, I was like, no, too hard now,
(12:43):
like threw my keyboard down and I was like, Okay,
I guess I'm just gonna walk around on my knuckles
and draw things on pen and paper and then be
like this is good.
Speaker 1 (12:52):
This is what you might want... I'll try to
do something close to that. Well, that's that's funny because
I was having this conversation yesterday about the use of
AI as a tool in your arsenal versus you know,
whether or not it's going to take over and change
the value of the artists in general, you know, And
like I was saying, I've used chat GPT a bunch recently.
(13:14):
I used to be a purist, like I don't trust them,
fuck it, but I've been using it a lot for
like the tedious like grant application or like draft me
up by laws for a nonprofit, or like make me
some like contracts for space rental, like just stuff that
requires specific legal verbiage and, like, a specific professional-looking format. Yeah, totally,
(13:37):
and it's it whips it out and then you just
put in, you know, whatever you need. So I think
of it that way as like it's just a tool
that saves me time. Yes, And I've only delved a
little into the I guess the creative aspects like and
not creating with it, but I loaded some, some of my
lyrics into it, and I was like, act like a
(13:58):
creative writing teacher and like analyze these bars. Yeah, that
was fun and I felt like it got what I
was going for. And then we were talking about you know,
it's different with images, and the question was... And
I was thinking about you when we were having this
conversation because I remember you telling me that when you
(14:19):
were doing your poetry, like you have to craft the
words in such a way to elicit the images that
you want and like to get the vibe that you want,
so that, you know, that takes some skill and art
to produce.
Speaker 3 (14:35):
You know, it was still it still kind of sucked, like, yeah,
it was not very good at it. And this is
like a year ago now, so I assume it's probably
gotten better and the tools for actually kind of like
input versus output have gotten closer together. But man, it
(14:56):
was it was kind of just a frustrating process because like,
not a surprise, I don't think it'd be a surprise to
you or anyone else that AI is just not good
at like what's the word that I'm looking for here?
Oh my god, Well, creative things in general, but.
Speaker 2 (15:17):
Esoteric. It's not good at the esoteric.
Speaker 3 (15:20):
And so yeah, when I would write poetry and then
I would kind of if I just did the poetry
like write into it, like create an image based upon
this thing, it would just be kind of a composite
average I think of what it contained from the words
(15:42):
that were used, but also from I think like what
other people on the servers were kind of like doing
image based for their prompts, because like a popularity kind
of thing, Like people like this sort of style, They like.
Speaker 1 (15:56):
This kind of thing.
Speaker 3 (15:57):
So unless you were like laser specific and in addition
to, here's my poem, please create an image based upon it,
but also telling it very specifically a type.
Speaker 1 (16:09):
Of image that you have in mind, and like.
Speaker 3 (16:14):
Grammatically how you're stacking your sentence structure as well.
Speaker 1 (16:18):
It was just kind of like.
Speaker 3 (16:21):
Babysitting a machine that is not built to do that
kind of thing.
Speaker 1 (16:29):
And I stopped.
Speaker 3 (16:30):
I stopped using it one because like each time I
had to do that was like just a really long process.
But also, yeah, a lot of the AI art is
so shitty. Like, I hate AI art so much. It
looks so terrible. And it's like... I like using it for,
I've used it for like privacy policies and like you said,
(16:56):
like legalese kind of stuff. Especially Erica, she uses it
for some, like, therapy paperwork and stuff like that. It is
great at that. But Jesus, yeah, the art side of things,
it feels wasteful, because supposedly every time you queue
a prompt to chat GPT, it uses, like, an insane
(17:18):
amount of water or you know, electricity, and so like
it's not environmentally very feasible for like.
Speaker 1 (17:26):
Our, our grid. And I've heard that, and I don't
understand it.
Speaker 2 (17:33):
And it's like the equivalent energy that it uses.
Speaker 3 (17:36):
And because you know, you've got like all these servers, right,
and they need cooling, which takes like electricity, it takes energy.
A lot of these are also water cooled, right, so
they have to keep them at a specific temperature. So
it just takes a lot of energy in order to
kind of to get this thing. Well, yeah, I don't
know how it's different than like the Internet though, because
(17:57):
like, tons of people use the Internet at the exact
same time, and you would think that that would also...
maybe, maybe it does. Maybe that's terrifying, and that's kind
of like, the whole time it's just been using, like,
a ton of... But supposedly, yeah, with, like, chat GPT,
every time you do a prompt, it uses a crazy
amount of resources.
Speaker 1 (18:19):
And so, I don't know how I feel. Who is
saying this, though? Uh,
Speaker 2 (18:25):
I mean, I think I've heard it in a few places, anyway.
Speaker 3 (18:27):
I can't like bring to mind the exact source that
I'm thinking of, but I think I specifically remember reading
about it on.
Speaker 1 (18:36):
It was like an ecological blog basically.
Speaker 3 (18:38):
They were, they were talking about interviewing Sam Altman,
the CEO of chat GPT, and he was saying that
in order to scale chat GPT like they would need
to build like nuclear power plants basically because there's just
(18:59):
they don't have enough. They don't have enough energy to
bring to bear if it continues to grow and you know,
need more information in order to get better and better,
and with the increased volume of people that are going
to be not only using it but like relying on it,
And so there was like they were saying that it
(19:21):
was just like an energy choke point that was going
to be coming up. But then, like, DeepSeek uses,
like, insanely... like a very small percentage of energy as
compared to chat GPT. So it seems like it's possible.
Speaker 1 (19:40):
It's so hard to say, and all this stuff is
so like there's so many giant agendas and giant bottom
lines involved in this stuff that like I get so
cynical about any, anyone who's too
(20:01):
like passionate about their selling points or like yeah you know,
I'm like I don't know, dude, Like but I'm also
an idiot.
Speaker 3 (20:09):
So now I feel that. And I kind of started
out very experimentally with chat GPT and AI for like
art purposes, but it just definitely, I think, given the
prevalence of like what it can do and how many
people are using it specifically.
Speaker 2 (20:30):
And this is subjective, but Jesus, like.
Speaker 3 (20:33):
There's so much terrible AI art, like, out there that's
being produced.
Speaker 1 (20:37):
Well, let's game this out real quick. Like let's say
that we could agree maybe that, or I will lay
out the premise that any technology we have the military
industrial complex has ten or twenty years advanced, right, like
they they trickle out tech to us, usually for consumer
(20:58):
and like commerce reasons, right. And that makes me feel
like if what we're seeing with this AI push and
like all this hype about, like, is AI gonna become
Skynet and take over the world, or like
is it dangerous and all this stuff, and like what
we're seeing is the most kind of goofy, rudimentary versions
(21:20):
of it, and like it's gotten better. Like I've seen
some you know, I've seen some images and videos that
I'm like, oh, that's, like, that's close. There is still
something that triggers that. Maybe it's like the Uncanny Valley type thing.
It's like that's not like I know instantly when I
scroll to an AI video, like I don't know if
your algorithm does this, but it'll be like a normal
(21:41):
scene and then, like, morph into weird, disgusting...
Speaker 3 (21:44):
Shit. Gordon, like, over a stove without a shirt and
black lipstick, and got flour going everywhere. And I
don't think... I remember seeing that.
Speaker 1 (21:55):
But so then I think, like, okay, if if military
Industrial complex has this tech ten years in the future,
like then I can't trust any video I see on
the Internet at all, you know, like all or the
news or the TV like anything. I don't even know
(22:16):
if people are really making TV shows anymore, you know.
Speaker 2 (22:19):
Yeah, it's hard to I think.
Speaker 3 (22:20):
It could definitely get to that point. It's weird right
now because I think like a lot of the videos
that I see that are AI, the ones that are
really good, you know, well, I guess people used to say,
like when it was first coming on the scene is
like it looks very realistic, but look at the hands,
look at the face, and like if you pay attention
(22:41):
to details within like the whole image or something, there's
usually things that are like out of place that maybe
on first glance you wouldn't notice. But if like you
play it a second time and you kind of like
pause it, you can see that, Okay, it was clearly fake.
But now I think it's gotten good enough where that
isn't so much an issue for the AI and it's
figured out kind of how to work through some of
(23:01):
that stuff, and a lot of people just
have seemingly high success rates of spotting an AI video
just like really quickly.
Speaker 2 (23:10):
It's not even looking.
Speaker 3 (23:11):
It's just like a vibe, just like something internal that
they are processing about that video.
Speaker 1 (23:18):
That.
Speaker 2 (23:18):
Yeah, Like it's like they just get it like naturally.
Speaker 3 (23:22):
And maybe that's like also kind of a generational thing,
you know, like my parents' generation was not super computer literate. Yeah,
so they could, they could see these AI videos and
they have... Remember my dad watching the fake UFC fights
on YouTube, which ended up being a person playing a
video game?
Speaker 2 (23:41):
No... what story? No, I didn't... what? No.
Speaker 1 (23:45):
I didn't tell you that?
Speaker 3 (23:46):
Oh my god, Well this is a perfect example. Yeah,
so sorry, dad, he's not listening. It doesn't matter.
Speaker 1 (23:54):
I'll tag it.
Speaker 3 (23:55):
Like when I was probably like twenty four or twenty five,
I came back from the city. I would come back
in the summers and I would work, and you know,
went over to my parents' house for dinner right when
I first got back, and I was trying to get
like a rundown, Oh what have you guys been up to?
Speaker 1 (24:10):
What's new in your lives?
Speaker 3 (24:11):
And my dad had recently been getting really into UFC
and MMA fights. But I was like really curious how
he was doing that because they don't have like satellite
TV or subscription service of any kind to do that. Yeah,
he said, well, you know, I've found some forums online
that you can go to, and they actually a couple
guys on there told me that I could just go
(24:34):
to YouTube and watch these fights. And I was like, okay,
maybe if they're, like, old enough or something, then
they would release some of these fights.
Speaker 1 (24:42):
Yeah, And he was like.
Speaker 3 (24:43):
No, no, no, no, I mean I can watch like
I can watch fights that are going on like basically
live on TV at the same time.
Speaker 1 (24:50):
So I was a little suspicious, but I was like,
all right, Dad, Yeah, that's that's good. I'm glad you're
enjoying that. He's like, oh, I'll prove it to you.
Speaker 3 (24:57):
And so he went into the living room and like
I'm in the kitchen, and my parents' living
room is like, you can kind of, like, see through
the kitchen into the room. The TV's, like, right there
on the screen, and I see my dad, like, clicking
around and, like, searching, I don't know what he typed in the search
box, and he pulled up a video, and it's like
(25:21):
I was a distance away, probably like twenty feet away,
and so I couldn't really see exactly close up like
the details of the image, but it seemed like the
fight that was happening was like pretty repetitive and that
like just it was at the same like camera like
one camera angle, and.
Speaker 1 (25:39):
I was like, uh, it's a little weird.
Speaker 3 (25:42):
So like I put down my knife, stopped preparing food,
and I got closer and I noticed, like in the
corner of the screen there was an EA Sports logo
that was just like revolving. I was like, Dad, this
is a... you're watching a video game right now. You're,
you're literally watching somebody who decided to record themselves playing
this video game. And he could not... you could not
(26:04):
even tell. You've been watching it for six months! And
I told this to him. Of course he was like, no, no, no,
that's... this is, this is real. And I was like, no,
it's not. I'm pretty sure that like Joe Rogan and
Harrison Ford have never had a fight.
Speaker 2 (26:19):
From them like the moat.
Speaker 3 (26:20):
Yeah, that's the kind of thing that, like... bothers... good lord,
like, what? Yeah, so that was interesting.
Speaker 1 (26:29):
Interesting one well see and like now yeah, I don't
know again, it's like and you've heard you know, it's
like that that theory of like.
Speaker 2 (26:41):
We could have say we.
Speaker 1 (26:45):
I don't know, bombed a small country out of existence,
you know, and like just keep running social media from
their feed through AI or from their location through AI,
and like how long would it take anyone to notice that,
like this place was obliterated if you know, like here's
(27:05):
the you know, the latest feed from totally you know
it's this island. I mean obviously if people try to
travel there, it would.
Speaker 3 (27:11):
Be one thing, but yeah, but otherwise how would you know,
especially maybe like if you reach your points where like
the government is actually controlling the media enough to like
they don't even let certain images out.
Speaker 2 (27:24):
I've seen this stuff about.
Speaker 3 (27:25):
Like North Korea, where like they let a select amount
of interviewers or news people like in a year, and
they bring them to like like their Google office and
people are like at these computers, but he like notices
that nobody is yeah, or like.
Speaker 1 (27:41):
Giant empty restaurants and yeah, for sure, Yeah, very controlled.
Speaker 3 (27:44):
Definitely, you can definitely fake it, so it's doable.
Speaker 1 (27:48):
But it's scary. Yeah, Well, there's a lot of scary
things right now, and a lot of it you know,
has to do unfortunately with those you know, if you
if you tie in what the climate change push is
(28:08):
being used for you know, and this is a touchy
subject around here because everyone's very ecologically minded and climate
aware and all this stuff, and like, obviously you want
to be obviously you want to prioritize the health of
the planet and this and that. But like again, for me,
(28:30):
it always comes down to follow the money. Like who's
who's making billions and trillions of dollars off these green
regulations or these like certifications for companies that get tariffs
or like tax breaks, and like the company who gives
them the green label gets you know, it's like there's
there's such a vast amount of money being moved around in
(28:53):
the name of like sustainability and climate change that like
I can't help but think that eventually, if not already,
the the mission that spawned these things is gonna morph
into the mission of getting a lot of money.
Speaker 3 (29:11):
Yeah, you know, I man, I don't even know how
long ago it was, but it just seems like there
was a time where.
Speaker 2 (29:19):
Like the term conspiracy theory to me was, yeah, it was.
Speaker 3 (29:26):
Kind of decoupled a little bit, and I could say, Alice,
that's an interesting theory. But like, you know, like validity,
like I you know, I don't know about that.
Speaker 1 (29:35):
I don't.
Speaker 3 (29:36):
I don't need to, like look at it's not useful
for me to think about that. But like more and more,
the theory part of conspiracy theory, I think has been
like shorn from that title, and I think it's just
conspiracies which are just very much more real and like
at this point, I don't, I don't even know. Nothing,
nothing sounds all that crazy. We got freaking plasmoid
(29:59):
orbs, guy. We got drones flying around, the guy that
controls most of the water in California after these
freaking fires. Like there's all sorts of, like, crazy synchronicities
almost that seem to be happening. And so it's just
like I don't know, man, Like these don't seem that
far fetched any more.
Speaker 1 (30:17):
And it shouldn't be, because I mean, we all know
that the CIA, the CIA was responsible for undermining alternative
research with the term conspiracy theory, like they launched that
as like yeah, ay of like denigrating questioning things and like, oh,
(30:39):
these people are kooky and wacky conspiracy theorists. But like
you know, conspiracy is a very simple definition. You know
that's found all over Like you have to conspire with
people to get anything done, so like everything is a conspiracy,
you know, like we are conspiring to make this podcast.
Speaker 2 (30:58):
You know, it's not like.
Speaker 1 (30:59):
We're having to make this podcast.
Speaker 3 (31:02):
And for anyone that didn't realize we're going to be
talking about conspiracies.
Speaker 2 (31:08):
Kids, Oh yeah.
Speaker 1 (31:09):
But, like, you know, the mal intent and the, the baseless,
like, the crazy aspect, is manufactured in there, and like
you know that I'm sure that the flat earth and
all the other stuff doesn't help, you know, but it's
smoke screens to distract from the actual conspiracies of the
(31:33):
systems that we're forced to live in, you know, and
the conspiracies that allow like even just like Nancy Pelosi
is a billionaire from stock investments that she had inside
information on. Like that's not a theory, that's a fact.
Speaker 3 (31:51):
There's a company that exists now which follows trades by
US senators and congress people, and it allows you, for
like a subscription amount of money per year to basically
automatically like it will trade stocks the same ones that
they're trading.
Speaker 2 (32:08):
Oh shit, So that's cool, yeah, but interesting.
Speaker 3 (32:11):
It's interesting because like you know, they have to declare
all of their trades and the amounts that they were
for and yeah, like they could be saying something like,
you know, one thing, but then you see how they're
trading, and it's like, that's the exact opposite thing of
what you just said, because you just heavily invested in
Palantir Technologies, you know, or, like, military... military, and you're
(32:34):
the person that's up here talking about like wanting to
like not provide as much money to the military industrial complex,
and yet like you're spending your money here and you're
making a lot of money doing it.
Speaker 2 (32:43):
So it's yeah, it's.
Speaker 1 (32:44):
Always that fall of the money thing. And like one
of the good things about the Internet and AI and
technology is like this stuff is becoming more transparent in
a way. You know, Like I'm sure there's elements of
like obfuscation, too. Yeah, since, like, the amount of information
we have is so much higher that, like,
(33:07):
there's more to sift through. But like it's harder to
hide who you are. It's harder to hide what you
do if you're a public figure.
Speaker 2 (33:15):
Like there's going to be there's going to be proof.
Speaker 1 (33:20):
And that's that's why I enjoy you know, whether or
not Joe Rogan is like a deep state operative or
like a just a you know, no matter what the
opinions on him is. I find it helpful to be
able to listen to long form interviews with people who
are in charge of shit, you know, like because if
someone talks for two hours, it's really hard to keep
(33:43):
up pretense about certain things. You can get a sense
of like who they are and what they believe, and
like maybe people can... yeah, yeah, because there are a
ton of people who are very hard to tell when they're lying. Yeah,
and most of the people in those positions probably are
sociopaths, maybe. I miss... I miss it when he could...
Speaker 3 (34:05):
Actually, it doesn't seem like, with the people he brings on,
he ends up talking with them about the thing that
they're known for much of the time. Now he gets
like weirdly stuck on, you know, talking about in every episode,
like COVID or the vaccines or it's just weird because
like so.
Speaker 1 (34:22):
Much time wasted.
Speaker 3 (34:24):
This guy's like a like a Nobel Prize winning like
astrophysicist or something. It's just like, why don't we ask
him about planets, Joe? Yeah, we don't need to
be talking about, like things that they're not part of.
Speaker 1 (34:36):
So that's very frustrating to me. Yeah, well, you know, he's
he's also just a person, you know, and like he
was pretty he got pretty uh crucified on the vaccine
issues, and, like, having, having people on that, you know,
went against the, the story, you know,
So like he probably has some personal amount of feelings
(35:00):
like vindicate me, you know, And I think it's I
mean a lot of it's born out to be vindicating
to the people who are like, hey, maybe this is rushed,
or maybe this is not safe and effective, or maybe
this is like a money grab or you know, any
of that stuff. You know.
Speaker 2 (35:18):
Yeah, I don't know.
Speaker 1 (35:19):
But then again that's like what sources, you know, what
are you going off of? Like, again, I can't, I
can't feel very strongly about many things anymore, like burnt
out in the COVID and the BLM era of like
I got heavily invested in in my opinion being heard
and like riding for the side that I thought was accurate,
(35:42):
and like, yeah, it doesn't get you anywhere, you know.
Speaker 3 (35:46):
Yeah, I think that's in a way, maybe that's by design.
Like, you've heard... have you heard the term, like, hypernormalization?
Speaker 1 (35:53):
It's like I think it was.
Speaker 3 (35:58):
It was a term that came about during like Soviet
era Russia, where like the deep state was essentially just
like completely taking over it.
Speaker 1 (36:09):
Was it pre-Berlin Wall coming down?
Speaker 3 (36:12):
I don't remember, but it's a term that they basically
describe as like loading you with so much information, whether
it's true or false, that it just completely
like overwhelms the people, and eventually they get to a
(36:34):
point where things are so bad and they're getting so
much worse that it.
Speaker 1 (36:39):
Just becomes the normal.
Speaker 3 (36:41):
And instead of people eventually just trying to like rationalize
that and sort of like fight against it, the entire
country or nation can just be like crumbling and eroding
beneath their feet. But the only thing that people are
experiencing at that point is just apathy. Yeah, just just
sheer apathy. They don't they feel completely powerless and useless,
(37:04):
and no information is useful anymore.
Speaker 1 (37:06):
They don't know what to believe.
Speaker 3 (37:07):
And yeah, yeah, it's probably somewhere close to where we
are now.
Speaker 2 (37:12):
I think a lot of us for sure.
Speaker 1 (37:13):
And I would say definitely by design, you know, like
if, if what the shift is going towards is this, like,
one world government, you know, with central bank digital currencies
and you know, the the governing body of the world.
(37:37):
All these pieces that are happening right now could easily
support that, you know, like bringing on Elon and and
giving them all these powers and all this weird shit
and like and like all this, you know, this slow
drip of revelation of aliens and all this different stuff.
It's like all this leads very easily to and so
(37:59):
we've decided that, like the world is united against the
other planets, and so we need to have a world
governing body. And luckily we have all these experts in
place already in America, and like we're the world leader anyway.
So let's just you know, yeah, and we already know
that the billionaire and trillionaire parasite class, their existence is
(38:25):
global anyway. It's not like they're confined to China or
Africa or, you know... like, trillionaires exist in a
global sphere. They can they can be anywhere, go anywhere,
work anywhere, make money anywhere. All of their deals are
tied up with global movements. Like, there are no borders
for the billionaires and trillionaires, you know. So, like, laws
(38:47):
don't exist for them, laws don't exist. They influence the
policies to keep their profit margins where they want them, you know,
it's like, yeah, these are just facts, you know, and
they are conspiracies. But Scott Galloway, like he's been on
like newscasts and stuff. He's a... I think he's an economist,
but I've seen him talk. He seems like a really
(39:08):
smart guy.
Speaker 2 (39:09):
Like what he has to say.
Speaker 3 (39:10):
He, he references this study based upon, like, happiness as
relative to, like, income levels, and the study was trying to
Speaker 1 (39:25):
Figure out like how.
Speaker 3 (39:28):
Happiness relates to how much money you make, and like
I forget what the number was, but it was a
surprisingly low number, like past if you make past three
million dollars a year or something like that, diminishing returns
for happiness after that, And like, after hearing that, I
was just like, well, what the hell, Like why why
(39:48):
do you feel, why do people feel the need to
just, like, want to make, like, so, so, so, so
much money? Like, because if it's not actually even getting
you anywhere now, not helping you achieve anything additionally,
it's not providing you any more happiness, then, it's just like,
you are a sick, sick puppy.
Speaker 1 (40:04):
Well, because, like... it's the sociopathy. Because you have to
be a certain amount of disconnected from empathy to hoard
that much treasure like a fucking dragon, you know, like
if you had... And, and the problem with most, most
of us as far as like understanding conspiracies or or
(40:26):
accepting conspiracies is like we put ourselves in the place
of those people as like I wouldn't do that if
it was me. You know, I'm a decent person, you know,
so like the people who are in those positions of
power wouldn't do that either, yeah, you know, like that's
I feel like that's how my parents are as like, well,
we're good people, you know, so like we would never
(40:48):
do that to someone totally.
Speaker 2 (40:49):
And it's like okay, but.
Speaker 1 (40:51):
At what point do you think that there are or
like what percentage of people do you think around you
have a pretty low number of actual dollars that they
would trade for other people's lives, you know, if you
stand like would you kill a hundred people for a
trolley paradox four million? You know, yeah, I know a
(41:14):
lot of people who would kill people for way less
than four million dollars, you know. Like so if we're
getting up to the billions, you know, and it's like Okay,
we have a chance of harming let's not even say killing,
Like we might harm or mutilate or maybe just adversely
(41:34):
impact five million people, but we'll make ten trillion dollars.
Speaker 3 (41:41):
Like, yeah, I mean, I tend to think that, uh,
that's kind of more of the exception out in
the world, just based upon, like, what I've seen. For
most people, I tend to kind of... I tend
to think on, like, a surface level, less of people just
as, like, a whole humanity. But I'm usually surprised at
(42:04):
groups of people: when, like, they have to make hard
decisions like that, they make what I would think is,
like, the morally right choice more times than I would think,
especially when stakes are high and, like, yeah, money is
on the table and stuff like that. It seems like
people, somewhat surprisingly, have, like, more of a capacity
for, like, the good of, like, a collective. In
to for like the good of like a collective in
what context?
Speaker 1 (42:31):
Uh let's see here.
Speaker 3 (42:36):
Oh, even, even something silly, like, like Beast Games. You
know, you've seen that, that show with MrBeast? I
haven't seen it, but I've, I've heard of it.
Speaker 1 (42:45):
Yeah, there's a surprising amount of times on that show
where like somebody can be offered like an extreme amount
of money to just totally you know.
Speaker 2 (42:59):
Someone else.
Speaker 1 (43:00):
I see where you're going and and yes, I think
you're right, and maybe this is premature, and I'm going
to cut you off because if I don't, I'll lose
the thread. But like, I think that that is a
function of those people being normal people put in those positions.
(43:22):
That's like the predator class has made it to that
point because they clearly have no compunctions being like, like,
what are the traits of a CEO like or like
a billionaire, Like you have to be ruthless in business,
you have to be compassionless and like, you know, like
you have to be a killer, a wolf, you know.
Like, those are the qualities, is what I'm saying.
Speaker 2 (43:43):
Yeah, that that's that's kind of like that's the exception.
Speaker 1 (43:46):
But not the rule. I think most more like normal
quote unquote people.
Speaker 3 (43:50):
They do not have the capacity and the lack of
empathy and the lack of conscience to be able to
actually cause that kind of negative effect on one or more.
Speaker 2 (44:00):
Yeah, but there are people out there. Scary, scary.
Speaker 1 (44:03):
Well, it's because we're all living together in the same
plane of existence. Like those people are on a whole
nother plane of existence, Like even even multi millionaires can
just you know, they don't have to. They just don't
think the same way, that, like, things don't have the
same value. It's too easy to us.
Speaker 2 (44:21):
It's it's too easy.
Speaker 1 (44:22):
And, like, when you figure out that you can
get almost anything you want, experientially or, like, object-wise,
with money, like, your threshold for, like, what gets you excited
changes, you know. Like, if you, if you could just,
if you and I just decide, like hey, let's walk
out the door and walk up to the air airport
(44:45):
and like take a ken More plane down to Boeing
and like take a jet over to Singapore, and like
we don't have to pack anything. We don't have to
plan anything because we don't need to. Like all we
have to do is show up with our bank cards
and like we can get a new wardrobe, we can
go straight to the nicest hotel. Like the thinking is
(45:06):
so different, Like if I were to go to Singapore,
it would be I would have to save for months
and, like, the itinerary, you know what I mean.
Speaker 2 (45:14):
Like it's just the.
Speaker 1 (45:16):
Actual existence of those people is so different and disconnected
from most of us. And like the circles of people
that they run in who can also do that also
have that perversion of values or perversion of like things
don't mean the same thing to me anymore, because I
can I can buy anything I want, I can do
(45:37):
anything I want. You know, if I want to step
on a child's head while receiving a blowjob from a
ninety year old man, like that's very easily accomplished with
a dollar amount, yeah, somewhere in the world, you know,
and like yeah, maybe.
Speaker 3 (45:51):
Like removing removing strife, Like if you don't have to
try for anything anymore, then you just get bored. And
when you get bored, you do some pretty awful things.
I don't know, it's a blanket statement. And then you throw
power in there or like authority. You know, there's those
Stanford studies or whatever about like you give well the
Stanford prison experiment for one, but then like the ones
(46:13):
where you give someone a lab coat and like the button,
you know, and like yeah, and like people's people change
very quickly with authority, with authority.
Speaker 2 (46:26):
And like, but that doesn't explain why people I don't
know why.
Speaker 1 (46:30):
But like I see a lot of people that are
basically just like working class or maybe less, that are
total simps for the billionaire class.
Speaker 2 (46:40):
And it's like what are you Why are you defending them?
Speaker 1 (46:43):
You are crazy? It's because the American dream is we're
all one or two occurrences away from being those people,
Like we could be the billionaires if we blah blah blah.
Speaker 2 (46:55):
You know, yeah, and that's, that's the
Speaker 1 (46:58):
Trap of American exceptionalism, like we could be millionaires
next week if things go our way, you know, and like,
you know, we want to see them out in the
world and like, yeah, I don't know, it's very odd.
I mean, and there's you know, and then we we.
Speaker 2 (47:16):
Don't have time to get into it too deep.
Speaker 1 (47:18):
But then you pull back from normal human movements and
stuff to that potential of like, well, there seems to
be a lot of independent research done that points to
like a very specific thread of you know, esoteric religion
(47:39):
or like cult beliefs that that is shared in these
upper echelon parasite classes. That like who's to say that
they don't all, you know, whether through bloodline or through
class zeitgeist, Like who's to say that they don't all
(47:59):
subscribe to these weird cult beliefs, And like, you know,
there's that I say this all the time, but like
that the saying I think it was a JP Morgan
or somebody who was like, you know, millionaires don't use astrology,
but billionaires do, you know, like the the the understanding
of the power of things like astrology or energy or
(48:24):
manifestation or like then the darker elements of like blood
sacrifice or any of those things that could potentially harvest
an energy that we're all kind of vaguely aware of. Like, oh,
there is more to us than just like my body
on the couch next to you. You know, like there's
there's these elements that are hidden from us, maybe purposely
(48:45):
maybe you know, maybe not, but like we've all had
the experiences that show us that there's deeper stuff, and
like it would behoove the ruling class to erase that
from our curriculums and our understandings of ourselves, and like,
but if they're engaging in it, you know, if they're
putting together these big spectacles of karmic energy and like
(49:07):
harvesting stuff you know in whatever way, that actually looks
like I don't know.
Speaker 3 (49:11):
Well, yeah, I mean I think the fact that like
a large group of people that just happen to be
extremely, extremely powerful, and singular throughout the entire world, participated...
Speaker 1 (49:21):
I mean, like people used to kind of.
Speaker 3 (49:24):
Laugh at the idea of like the Bohemian grove, but
there's you know, lots of videos of rituals that are
performed there, and I don't know what's going on, Like
there's not enough context.
Speaker 2 (49:34):
I can't say what, but definitely something's going on. That,
at least, is very clearly obvious.
Speaker 1 (49:41):
So, I've never worn robes with my friends in
front of a giant owl. That's not a Friday night activity,
could be just anything, you know. Uh, and then, then
it gets to, like... I abhor, I abhor a coincidence theorist.
Like, it frustrates me to no end, and, like, like, yes,
I I'm a conspiracy theorist, but like is it worse
(50:03):
to be a coincidence theorist? And feel like.
Speaker 2 (50:07):
Oh, it's just like you know, I feel like that's
the easy way out.
Speaker 1 (50:11):
I feel like that just... what do you call that?
Speaker 3 (50:13):
It's a thought ending imperative, Like you are in a
position where you can just say, well, you know, I
don't I don't know that that's that's true. There's just
not enough evidence. And maybe, maybe it's, like, maybe both,
Like right, there's a little bit of like both that
are true here in a situation, and it just like
shuts down the conversation.
Speaker 1 (50:30):
You know.
Speaker 3 (50:30):
They don't have to defend any kind of argument. They
just have to say, like that doesn't sound right.
Speaker 1 (50:34):
Thought-terminating cliché, yeah, cliché. Yeah, and, like, you know, even
take the mysticism or the occult stuff out of it,
you know, if it feels too woo-woo.
But like.
Speaker 2 (50:46):
Just the fact that.
Speaker 1 (50:48):
All of the presidents of the, of the United States,
and, like, all of the wealthy people, you know, they're
either Freemasons or, they, you know, Skull and Bones at
Yale. You know, like, these are all things that
have a barrier of entry, and that barrier of entry
is usually class or bloodline. Yeah, you know, like bloodline
(51:10):
is important. That's why alumni is a thing in those circles.
And like yeah, it's just it's I feel like it's
disingenuous to say that, like, it's...
Speaker 2 (51:21):
Just too much.
Speaker 1 (51:22):
It's like, once or twice maybe is a coincidence, but, like,
so many... almost every one of them is a pattern.
Speaker 3 (51:28):
Yeah, like, you know, like a pattern recognition
person would say, well, this is too
much to be a coincidence, at best a statistical anomaly.
But I guess if it happens over and over again,
it's like, no, there's something there.
Speaker 1 (51:45):
And what about the things that these ultra rich people
are doing that are not known to us, Like it's
like say the layouts of like a city like Washington,
d C. The layout of it, you know, and it's like,
what do they call it? Geomancy. Geomancy, yeah, yeah, or
things like And I think I probably brought this up
(52:06):
before too, but like Michael Bloomberg putting his office in
the City of London, which is Londinium from Roman times,
and the same area that was Londinium is the City
of London inside of London proper, and it has a
different you know, like the governing body, the Lord Mayor
(52:28):
of the City of London has his own it's like
the Vatican, like it's its own thing inside of London,
and like the Queen couldn't just go in there, like
they're not subject to the laws and the rules of
the UK or any of that. Like there's a lot
of international bank headquarters in the City of London apparently,
and like Michael Bloomberg is putting his new offices, which
(52:51):
will be the biggest construction project, like right on the
site of the Temple of Mithras, you know, and like
you know, like why is he doing that? You know,
is that a coincidence? Is that something that he just
thinks is cool? You know, it could be and it
could not be, And I don't know.
Speaker 2 (53:08):
That's wild. Do you... how much, how much time do
you have? Actually, none more.
Speaker 1 (53:12):
You have none more? I mean we could we could
do we could do five ten more minutes. There's not
a line of people clamoring to come by in the galleries.
Speaker 3 (53:19):
Yeah, okay, I see. Yeah, Well I was going to
ask like, yeah, like what like in your recent life,
like what is like either the kind of quote unquote conspiracy.
I don't even know that they're conspiracies anymore, like we've
just talked about, but like, what is the most interesting
(53:40):
or uh compelling kind of conspiracy that you've been thinking
about lately.
Speaker 1 (53:49):
Like as a thought exercise or as something that affects
me daily.
Speaker 3 (53:53):
So just, I guess, uh, something that, like, that resonated
with you, something about it was, like, it made connections,
where it just, like, yeah, that feels, that feels right.
Like, this... I've been thinking about several
Speaker 1 (54:06):
Of these things, and it just made like a connection,
the one that feels true to me. And granted, like
I am a broad stroke, intuitive feeler, like I'm not
good at citing my sources. I couldn't explain to you
the details of most things. Like, if I feel
it resonating as true with me, that's that's something I feel,
(54:28):
And, like, that's why, that's why I don't ride for them
super hard, because I.
Speaker 3 (54:31):
Mean it could also yeah, just be something that completely
just like blew your mind, like, oh my god, what
the hell?
Speaker 1 (54:37):
I really like the the layers of ancient civilizations that
are being peeled back that throw off the normal timeline
of our history of man and like all the academic
understandings of our existence as people. Like, things like Göbekli
Tepe, or, or any of the things that point
(54:57):
to like, hey, we don't have the timeline correct officially,
you know, like there's clearly things that point to civilization resets,
like a lot of those mud flood theories of like
you know, maybe the planet gets reset every so often,
(55:17):
and like civilizations have built themselves up to large, grand
levels of technology and existence and then you know reset
and a lot of that, mostly like you know, the
Graham Hancock and Randall Carlson, like, Younger Dryas impact theory,
and and just just the understanding that like we don't
(55:42):
know as much as we think we know, and the
stubbornness of people holding on to the official story. Yeah,
that stuff's interesting to me. I really fucking get wigged
out looking at the patterns of clouds in the sky
these days over the last few years, Like it just
looks to me. And I say this all the time.
(56:03):
I'm sure you've heard me say this, but like the
skies just don't look normal to me anymore, Like a
lot of those striations that look like, either when they
do it with sand or with other mediums, water even,
like when you, you know, when you play a certain
frequency or tone. Like, it's cymatics. Yeah. Like, because I
(56:27):
and and it's because I look at the sky all
the time since a kid, like I was a kid
looking for shapes in the clouds. I was daydreaming out
the window.
Speaker 3 (56:36):
Like I feel like I gaslight myself there because like
I yeah, when I was a kid, to me, the
same area that I'm in right now, I seem to
remember the skies being like just a completely deeper shade
of blue than they are. But that's not really science
or anything. That's just... and then it's like, yeah, that's
just, like, maybe that's the cloud, the cloud pattern specifically.
Speaker 1 (56:59):
You know what I'm talking about, Like those fucking super
common like wavy.
Speaker 3 (57:04):
Thing like when it's a super overcast day and it
like any day threatening to rain but.
Speaker 1 (57:09):
No, no, it any day, Like I mean, I take
pictures of it because I get it bother me, you know,
and like I'm a visual you're literally becoming an old
man shouting at the clos I am the tinfoil hat dude.
Oh I heard an anecdote of where we got tinfoil
hat from that was creepy too. Apparently the CIA like
(57:31):
implanted a chip in some guy's brain and like it
shorted him out, and he like was a very smart individual,
so he was wrapping his head in tinfoil to like
block the receivers for the chip. Could be bullshit, don't know,
that's, uh, hearsay. But anyway, like, and it's, you know,
(57:52):
in my in my head, it's linked with the the
spraying of the heavy metals you know in the sky
or weather control stuff and the instant you say anything
about chem trails or spring heavy metals, people are like, well,
you're a nutcase, you know, like that's fucking dumb. And
it happened to me as a kid. I had a
Donald Duck comic about cloud seeding.
Speaker 2 (58:12):
I brought it.
Speaker 1 (58:12):
I like brought it up in an argument with my
folks, and they were like, cloud seeding isn't real, and
I was like, yeah, it is. And I brought down
the Donald Duck comic as proof and they were.
Speaker 2 (58:21):
Like, oh, yeah, retard.
Speaker 1 (58:23):
But, like, cloud seeding has been used since the thirties
and forties, you know, like people put heavy metals in
the sky to produce precipitation. You know, like Dubai spends
billions of dollars a year on it. You know, like
there's there's weather modification programs constantly in effect around us.
You know, you can go to many sites and see
what active programs and like get the sources, get the
(58:46):
actual government websites to these things, and like, yeah, the
HAARP facilities. You know, like this year, all of the
aurora borealis, I was like, you guys think that this
is normal? It's creeping south. Like, we just, we just
see it now all the time; it's normal for
us now, when in my thirty-nine years... like, yeah,
it's just, it feels like nonsense to me.
Speaker 3 (59:08):
Have you seen the uh kind of in that vein?
I guess I haven't really noticed the clouds all that much.
But the thing that I've noticed in the past two
years is a lot of the places I've been the
snow is really weird.
Speaker 2 (59:23):
The snow is super weird.
Speaker 1 (59:24):
There are these tiny little crunchy balls. When it
snowed here, like two weeks ago or whatever, I was like,
what the hell is this? I wasn't letting my kids
eat it.
Speaker 3 (59:33):
Oh god, okay, just in general, don't let your
kids eat any snow, whether or not you think that
it looks normal.
Speaker 1 (59:43):
But I don't think it's normal. I don't know. Like, yeah,
I should have asked why.
Speaker 3 (59:49):
Actually, I really want somebody who is like a meteorologist
or someone, to explain something, like, how is it
that suddenly, seemingly, because I've seen it several times in
the past two years and I hadn't seen
it anytime prior to that, the snow is just like, yeah,
(01:00:11):
these little pebble-sized, almost styrofoam.
Speaker 1 (01:00:16):
Kind of consistency. Yeah, bits and pieces. And maybe,
maybe, you know, like maybe there are, you know,
very legitimate scientific explanations for stuff like that
that I don't know. I'm sure that's true. But
like, I see patterns like this all the time, and
all that looks like is cymatics to me, you know.
(01:00:37):
And like, we don't know what the four G and
five G towers do to the environment, you know,
like we don't know what this amount of
those frequencies in the air does. Yeah, you know, it's
like just in a constant state of being like, well,
(01:00:59):
we'll give it a shot, we'll see. But we also
won't see, because, just like with any vaccine concerns, you
won't know nine years down the road, seven years down
the road, six years down the road, like you could
never trace it back to that.
Speaker 2 (01:01:12):
Yeah, luckily for them, too.
Speaker 1 (01:01:17):
Yeah, but anyway, those are the things that get me
fired up, but to the point where, like, you know,
Quinn's like, hey man, what, are you just
gonna, like, freak out about it and, like, ruin your
day by looking at the sky? Like, then they've won,
whoever they are, they've ruined your time. Right, fine, you know. Yeah,
(01:01:38):
but, like, luckily for us out here, we've got,
you know, the beauty of cross-current breezes and ocean
and lots of trees. And, you know, because then I
tell myself, well, you know, if the frequencies and the
heavy metals were really terrible, the plants would probably
(01:02:00):
not like.
Speaker 3 (01:02:00):
It, you know, and, like, indicator species, like, oh my god,
what is it, like the lichens and mosses and stuff like that.
But like I just said, I just took, like,
a metals test, and I was really, really good.
So, like, I don't have any crazy stuff happening.
Speaker 1 (01:02:16):
Well, nobody would have said you were very metal.
Speaker 3 (01:02:19):
What, even, even, like, if I saw, like, right now,
those guys were super masked up and it's like, oh,
something weird is going on, I'd be like, is there a.
Speaker 2 (01:02:27):
Better place to be than that?
Speaker 3 (01:02:29):
No, like, if you can't do anything about it, you know,
what use is it to you? Like, I mean, you know,
you're screwed either way.
Speaker 1 (01:02:35):
And then, you know, and then it's like, the
people who are like, they would never do that, or,
like, people would never spray heavy
metals over populated areas.
Speaker 3 (01:02:45):
And my mom, growing up in New Jersey, they had,
like, planes that would come in and, like, dust with, oh
my god, like, the Agent Orange stuff or whatever, in
order to keep the mosquito population down, just before they
knew that it was having, like, an effect
on people. But regardless, they would have scheduled evenings
where these crop dusters would fly over the city and
(01:03:07):
just drop these chemical agents. And it's like, you know,
my mom and her sister and her niece, they all
have the same type of cancer, which.
Speaker 2 (01:03:18):
Yeah, could be genetic.
Speaker 3 (01:03:19):
Pretty suspicious, but all at the same time, it's just
like, if you're all in the same area... And, like,
I feel like, yeah, just dropping chemicals on a population,
even if it's for a certain thing, control mosquitoes, create
rain because the area is too dry, stop wildfires, that
kind of stuff is just, like, it's probably got
some unintended consequences.
Speaker 1 (01:03:39):
Or, or Bill Gates dumping all that aluminum and heavy
metals in the air in Arizona to, like, try to
see if he could dim the sun. Oh my god,
I'm pretty sure they're suing him for that, like,
some city, or the state maybe, is suing him,
because he didn't ask them. Oh my god. And, like,
my dad, I talked to him, he's, you know, my
pops, he's very much, like, intellectual, like, show me
(01:04:03):
the sources, and if it's not published in
Scientific American or, like, you know... Yeah, we were
talking, he's from Saint Louis and he was reading
up about, did you know that the CIA put
blowing machines on the top of project buildings in Saint
Louis and sprayed heavy metals over the poor Black areas
to see what it would do?
Speaker 2 (01:04:23):
And I was like, yeah, dude, I did know that.
Speaker 3 (01:04:25):
It's interesting because like now like we actually have things
that are starting to be like declassified, and the things
that are being declassified to me are crazier than any conspiracy.
Speaker 2 (01:04:36):
Theory that I... exactly, and it's just... exactly. You could
go down the list of.
Speaker 1 (01:04:42):
Verifiable, just the CIA or the government or whatever,
like all of these things that are verifiable, the Gulf
of Tonkin, you know, like all of these operations that
sound like the plots of movies, even things like Operation Paperclip,
where we brought in all the top Nazi scientists
into our government and gave them positions of power, you know,
(01:05:06):
like, yeah, because, what, the Nuremberg Trials only prosecuted
like eleven Nazis, because all the rest of them were
running Disneyland, you know, like Wernher von Braun, the head
of NASA. Yeah, you know, like Operation Paperclip is real,
and it sounds like science fiction or like a movie.
Speaker 3 (01:05:24):
It sounds like the easiest thing to discredit as
a conspiracy theory, because it's just so much... there's no way, there's.
Speaker 1 (01:05:32):
No way we did that. Or the operation, was it Northwoods?
I don't remember, where we had sanctioned,
within our military, like, a false flag attack on a
plane full of Americans as an excuse to, like,
bomb Cuba or some shit. You know, it's like, these
(01:05:54):
things are all verifiable now. So, like, for someone to say,
we probably aren't doing that now like we did that
back then, yeah, we wouldn't keep doing stuff like that.
We wouldn't keep doing Tuskegee-like experiments. We wouldn't keep
doing, you know... Like, why? Yeah, because we've evolved as
(01:06:15):
a species and we're smarter and more moralistic? Like, that's
clearly not true, you.
Speaker 3 (01:06:20):
Know, I think we just like like colorful red buttons.
Speaker 1 (01:06:27):
We love bread and circuses, you know, like there's.
Speaker 2 (01:06:30):
A temptation, I understand.
Speaker 3 (01:06:32):
I think people probably aren't really they get to a
point where they're not thinking about the consequences because it's
just like, you know, what happens if I do this thing,
especially like if it has a chance to make money,
to be able to kind of capitalize or commodify in
some kind of way in order to get certain groups
of people ahead.
Speaker 1 (01:06:51):
Yeah, and then, and then, like, just that thread of sociopathy,
like all the experiments done on orphans or Alen syndrome
people or the mentally ill, like the countless declassified experiments
like that, that basically amount to torture, you know,
like, just to see if we can learn something for
(01:07:13):
science and progress. Like, and you think that people wouldn't
engage in that now? Like, what gives you that confidence?
Speaker 3 (01:07:22):
Maybe that's one area where AI will actually kind of help.
Yes, I'm thinking about, like, one of the
areas that I'm excited about with AI would be
the medical things, like, it's able to kind of advance
medical science and, you know, spot cancers earlier on.
Speaker 1 (01:07:41):
Or well, that's what our boy Mark Anderson has done.
Speaker 2 (01:07:45):
Oh, that's right. Yeah, that was, yeah, I guess that was.
Speaker 3 (01:07:50):
It was a long time ago that we talked about that.
Not that I'm aware of, I guess, but, like, even now,
is there, is there, like, a test that a person
can take to actually get that? Or, I feel
like they're still rolling that.
Speaker 1 (01:08:04):
Out, like, it's not available, or it's still in
trial or something. We should bring him back. Yeah, that'd be cool.
Yeah, check in again. Well, that was a good, that was
a good roundabout little... Yes. Sorry that.
Speaker 3 (01:08:16):
Wasn't more focused. That was like a conspiracy. Hey, we're
shooting the ship viral rabbit hole. If you don't like it,
don't that's true. You gotta before we before we go.
Speaker 1 (01:08:27):
Do you have a conspiracy that's been lighting you up
recently that you... Well, I.
Speaker 3 (01:08:31):
Do have, like kind of I was talking to you
before I hit the record button. But I some things
that have been going on in my life for years
and years now that at certain points, you know, I'd
almost just forgotten about. And then somebody had asked me
(01:08:52):
a question that made me like remember this thing? And
then I said that and I started thinking about it
like years ago. It just like that was kind of weird,
like what was that? And then it building kind of
up into things happening additionally to you after that, and
I started kind of like looking into some and just
like figuring some stuff out, or if not figuring out,
(01:09:14):
definitely finding sources of information that we're kind of corroborating
that we're just kind of like little like neurons or lights, you.
Speaker 2 (01:09:24):
Know, kind of firing above my head.
Speaker 3 (01:09:26):
And I've just kind of been having some interesting
thoughts and ideas come my way about some stuff that.
Speaker 1 (01:09:34):
But it's not... keep going, keep going. Is this, like,
paranormal in nature?
Speaker 2 (01:09:40):
It's definitely paranormal in nature.
Speaker 1 (01:09:45):
And yeah, it's, it's basically, how many, since I was
like fourteen, so it's been like twenty four years
of stuff, off and on again. On this island only,
or wherever you go? Yeah, other places as well. Give
us a little, give us a, give us more.
Speaker 3 (01:10:04):
Uh, yeah, I'd need to connect everything, otherwise
I'm just gonna, it's gonna be... and I don't
have the time today.
Speaker 1 (01:10:11):
But basically, I think what I could say is that.
Speaker 3 (01:10:15):
When I was younger, something happened to me
that was really weird, and at the time that it happened,
even as it was happening to me, like, in the moment,
(01:10:35):
it really wasn't until after that thing happened that I
was kind of processing things and connecting the dots of
what it was that happened. And as I was processing things,
I was kind of realizing what was wrong, like, with
it, or, you know, like, how abnormal or, you know, paranormal,
(01:10:59):
like, what I was witnessing was, because.
Speaker 1 (01:11:01):
Was it an encounter or abduction? An encounter, I would say,
probably is the closest.
Speaker 2 (01:11:07):
Thing, but then you know, having things happen.
Speaker 3 (01:11:14):
After that, which you know, like I had no reason
to believe I had any connection to it whatsoever, but
in frequency kind of continuously happening over the years, and also.
Speaker 1 (01:11:30):
Not even just to myself but those.
Speaker 3 (01:11:32):
Like, around me, and having it just be like this
arrow of a timeline that all points back to this one thing,
because stuff like that wasn't happening kind of before this,
and it was a really weird kind of thing. But
I had been listening to a couple of different podcasts
(01:11:54):
where people were talking about some similar things, or similar,
like, encounters, I guess, and what was happening to them afterwards,
and not just to them, but, like, to people
around them. Or, and even weirder, some of these people
(01:12:16):
had, like, third-party investigators, like people that were just
kind of, like, building, like, a case file and, like,
studying and trying to find, like, patterns and stuff like that.
And once those people started researching or even talking about it,
things would start happening to them. So it's almost like
(01:12:37):
there's some kind of weird transitive property.
Speaker 2 (01:12:41):
Yeah it's really weird, but yes.
Speaker 1 (01:12:42):
It's kind of. Yeah, it's been, it's been.
Speaker 3 (01:12:45):
It's been blowing my mind a little bit lately, and
I'm hoping it's nothing.
Speaker 1 (01:12:49):
Well, let's get let's get Danny Red back and we'll
we'll dig into this. If you're down next time, I'd
love to hear about it.