July 13, 2025 57 mins

The weekly round-up of the best moments from DZ's season 396 (7/7/25-7/11/25)

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hello the Internet, and welcome to this episode of The
Weekly Zeitgeist. These are some of our favorite segments from
this week, all edited together into one nonstop infotainment laugh-stravaganza. Yeah. So,
without further ado, here is the Weekly Zeitgeist. Miles, speaking

(00:26):
of the Valley. Thrilled to be joined in our third
seat by a brilliant writer, podcaster, producer who's written for
publications like The New York Times, The New Yorker, such
small local publications; producer on Everybody's Live; co-host
of the legendary podcasts Girls in Hoodies and Night Call; writer,
creator, and host of the wonderful podcast Heidi World: The
Heidi Fleiss Story, and soon, oh yeah, Jenna World. Not

(00:49):
about the character Jenna Maroney from 30 Rock, I've been told,
but Jenna Jameson. Please welcome back to the show, it's
Molly Lambert.

Speaker 2 (01:02):
I can't take the heat at all. I'll never brag
about it. If it gets hot, I'm inside inside.

Speaker 1 (01:08):
Yeah, I'm so far away from the kitchen.

Speaker 2 (01:13):
I do the thing, the hot concrete feet. I have
the same thing, same delusion where I'm like, you gotta
walk on it. Get a little tough.

Speaker 3 (01:23):
Enough. It's like, you know, there's this whole thing
in the manosphere, where people talk about, and I've seen
it bleed into other sorts of forums, where people talk
about how you can strengthen your skull bone with, like,
light impact over and over, to the point that your
head is indestructible.

Speaker 4 (01:39):
And a guy.

Speaker 1 (01:42):
That's happening.

Speaker 4 (01:43):
I've we could well, I'll probably I don't.

Speaker 3 (01:45):
I wasn't going to bring this guy up, but he's
got, like, a really fucked-up dark past. It was
like, he's one of these, like, manosphere people. And then
it turned out he had some, like, crazy sex crime
shit that he was, like, hiding, because he's like, yeah,
I was in jail, and didn't talk about it, and
then someone dug it up, and he's like, well, you know,
maybe not. But he, he said he went to train with
Shaolin monks, and they trained him to make his skull

(02:06):
so fucking strong that it was like impervious, like he
could even take a bullet in his head.

Speaker 4 (02:11):
All lies.

Speaker 3 (02:12):
But then I saw another post on the internet where
someone was, wait till you see where he's going with this one, someone was
saying they could take BB impacts to their skin,
and they're working themselves up to, like, low-caliber, like,
munitions, to see if their skin could be impenetrable by bullets.

Speaker 4 (02:28):
I'm like, oh my god.

Speaker 2 (02:30):
Dude, let's encourage it a little bit.

Speaker 3 (02:32):
Yeah, I mean, it's the most, like, scientific
method I've seen applied to someone who's gonna win a
Darwin Award eventually. Right? They were following
this kind of weird-ass logic of, like, resistance.

Speaker 2 (02:46):
Not that far off from, like, the Wim Hof ice baths or something.
You're just, yeah, right, yeah, you gotta...

Speaker 1 (02:51):
I mean, if you want to see a lot of, like,
indecipherable charts, you just look at, like, the literature
the Flat Earth people want to show you. They have
so much documentation. Well, yeah, I saw...

Speaker 2 (03:05):
The inventor of mewing died.

Speaker 1 (03:08):
Of what? Mewing?

Speaker 2 (03:09):
Like, yeah, yeah, no, mewing, the manosphere thing. I'm shocked
you guys aren't up on... wait, no. It's
like a thing you do where you, like, kind of
grind your jaw in some way, get your... crazy. Yeah. Oh,
some guy, the guy who invented it, his name was, like,
John Mew.

Speaker 1 (03:34):
Oh wow.

Speaker 3 (03:34):
Brought to you by John and Mike Mew, British orthodontists.
I see they called it orthotropics. Anything "tropics": new tropics, orthotropics,
Tropic Thunder, you know, Capricorn.

Speaker 2 (03:48):
Look, we're going at our jaws. Our jaws are going
to be square. Gonna have a bunch of BBs under
the skin.

Speaker 1 (03:54):
Yeah, exactly, kind of.

Speaker 4 (03:57):
Wow. So this guy was pushing this for a minute.

Speaker 1 (04:00):
It wasn't just that a strong jaw overtook his face?

Speaker 2 (04:05):
He got eaten by his own jaw.

Speaker 1 (04:06):
I didn't know that was called mewing. Thank you.

Speaker 4 (04:08):
See, we need Molly, the knowledge base.

Speaker 2 (04:11):
I thought you were deep in the manosphere, but I
guess not.

Speaker 3 (04:13):
I just, I just see what the algorithm shows me,
and I just take it at face value and I
apply it. I'm not really trying to pick it apart
and debunk anything.

Speaker 2 (04:21):
Look at Jack trying to mew right now. Subconsciously. I
know what this is.

Speaker 1 (04:26):
Just this is just what does.

Speaker 3 (04:32):
Why do you have a horse bridle in right now?

Speaker 1 (04:37):
Also? Like, does this guy have a very strong chin,
strong jaw?

Speaker 2 (04:42):
That's not what it's about, even, Okay, just that would
be so funny, though.

Speaker 3 (04:48):
You got this weak ass chin and you're telling people
You're like, I'm the chin God.

Speaker 1 (04:51):
He's out here looking like Tim Curry.

Speaker 3 (04:53):
Oh man, he's out here looking like 28 Years Later.

Speaker 1 (04:58):
Yeah, he is... he looks like Bill Nye the
Science Guy, if Bill Nye the Science Guy
hadn't been mewing for the past... right.

Speaker 3 (05:11):
Oh my god, it's fucking bro.

Speaker 4 (05:14):
I'm sorry.

Speaker 3 (05:14):
This guy looks so fucking, like, like a Jurassic Park face,
I don't know how else to describe it. And also,
these videos of, like, the before and after of people
doing mewing, it's just people sticking their jaw out. Before:
I'm like, like this. And after: now I stick
my jaw out, yeah, and now I look
like I got Kobe Mamba face out all the time.

Speaker 2 (05:36):
I'm really into that whole thing, because I'm like,
do, do people, do, like, women care about if men
have, like, a crazy jaw? That seems like it's...

Speaker 1 (05:47):
Like, communicates gorilla mindset. Well, that's what you want to
look for in a mate.

Speaker 3 (05:52):
I mean, but because I feel like ninety-nine percent
of the manosphere tips are coming from men who don't
really interact, or have real or healthy interactions with, the people
they're seeking, so...

Speaker 4 (06:01):
They just make it up. They're like, I hate my job.

Speaker 1 (06:04):
That's what it is.

Speaker 4 (06:05):
It's not a lot of personality.

Speaker 2 (06:07):
And it is like, women only like guys that
look like Mister Universe or whatever. Yeah.

Speaker 1 (06:13):
Yeah, women are cold and transactional, don't give a shit
about your feelings. They just want to have sex with
Arnold Schwarzenegger in the eighties.

Speaker 4 (06:26):
That's you. That's you.

Speaker 1 (06:29):
Why are women lying and saying they don't find this
bodybuilder attractive?

Speaker 2 (06:33):
That was a thing that was going around.

Speaker 4 (06:35):
That's right. How could they think? Who was it?

Speaker 3 (06:37):
It was that like British pop star where they're like,
look at this guy's weak ass body.

Speaker 4 (06:42):
They think that's ideal. Fuck out of here.

Speaker 1 (06:44):
Yeah, there was a guy who, like, only ate chicken
breast and, you know, like, pills that you can buy
off the Joe Rogan podcast, and, like, came back, and
it was just, like, really, you can see, like, all
the veins and muscles in his body. And they were like,
who do you find more attractive, before or after? And

(07:07):
like most women picked before and they were like what
the fuck?

Speaker 3 (07:12):
He was still, like... he was in shape too, even
in the before.

Speaker 1 (07:16):
He looked great.

Speaker 2 (07:17):
My friend, my friend went to the Muscle Beach.

Speaker 4 (07:21):
He was point.

Speaker 2 (07:24):
My friend went to Muscle Beach for the fourth of
July for the Muscle Beach contest in Venice. Yeah, everybody looks.
Everybody's bodies look crazy.

Speaker 1 (07:32):
Yeah.

Speaker 2 (07:33):
They look like if you like rehydrated beef jerky kind.

Speaker 3 (07:36):
of, you know. Yeah, but had veins all over and stuff.

Speaker 2 (07:41):
Like stringy kind of muscle.

Speaker 5 (07:44):
Yeah.

Speaker 3 (07:44):
Yeah, that's in right now, that's in. You want to
look like an old horse. That's the body type: old horse.

Speaker 2 (07:51):
Yeah. Women love when you look like you're going to
the glue factory exactly.

Speaker 6 (07:55):
Dude.

Speaker 4 (07:56):
You want to look like a horse with no name. Dude,
that's the whole vibe I'm going for.

Speaker 1 (08:00):
I mean, America, shout-out the Fourth of July; it's
the name of the band: America. Yeah. In many ways,
we're going back to the nineteen-eighties and just the
steroid era.

Speaker 2 (08:09):
And it's also, shit, it's crazy. They're just like, women
will like you if you do these things to your
physical appearance, and not, like, if you learn how to
treat them like people.

Speaker 1 (08:19):
Right.

Speaker 3 (08:20):
Yeah, Yeah, it's I mean, it's just it's funny because
it's like it's the lowest effort you don't even have
to lift weights, you know what I mean, or eat
a bunch of weird pills.

Speaker 4 (08:28):
You can just be secure in who you are and
just go talk to people.

Speaker 1 (08:32):
No, they don't want that. No, dude, that's not true.
I was so nice to this girl, and she
didn't even like me. And by nice, I mean that
I just, like, asked her what she was into and
didn't really listen, and then got mad when she didn't
want to hook up with me. What's something you think is underrated?

Speaker 7 (08:52):
Something that I think is underrated? Now, you saw something
on my podcast; I saw something on your podcast, which...

Speaker 3 (08:59):
Was gut Health.

Speaker 7 (09:02):
I think microbiomes are extremely underrated. My dad is a
naturopath, which I guess in the US is sort
of like a nutritionist, kind of a holistic health kind of person.
And my microbiome has always been something that I've held
near and dear to my heart. And I just think,

(09:22):
like on the on the clip that I saw, you
were sort of talking about how people are like it
all starts in the gut, but it really does.

Speaker 3 (09:31):
It all starts in the gut?

Speaker 8 (09:32):
Guys.

Speaker 3 (09:33):
That was the one they got right. That
was, like, the one thing that was tied to reality,
and then it turned into, now this is how you
open a water bottle, and you're like, wait, what the fuck...

Speaker 7 (09:43):
Is... exactly. This is the thing about all of
these sorts of populist movements, including, like, spirituality ones, manosphere ones:
they take seeds of truth and then they attach them
to a higher belief or desire. Like, you

(10:04):
know the law of attraction: if you think about winning
the lottery, you'll win the lottery. And it's just like, yeah,
the manosphere attaching something about gut health to, like, being
an alpha male that will help you dominate society and
become rich and powerful. They sort of, they take advantage
of the vulnerable by taking psychological and scientific principles and

(10:28):
then extrapolating.

Speaker 3 (10:31):
And that's, yeah, that's how you go from A
to Z very quickly, without anything in between. Hold on here,
but he did say that... he did pull that chair
out from the dining table very authoritatively.

Speaker 1 (10:43):
So he meant it. And the way, when he finished
drinking that water bottle that he opened in such a
manly, feral way, he just spiked it like a football,
that was also cool and very practical. A very practical thing
to do with the water bottle that you're drinking from.
It is difficult. I feel like we need to stop
giving them credit for the stuff that they get right,

(11:07):
and rather be mad. It's like when a movie that
sucks uses a good song, I'm like, fuck you, you just,
like, wasted that song. You know, like when they, like,
take a good fact that's interesting, or just, like, a
good piece of it. Like Jordan Peterson: I always hear
people be like, Jordan Peterson's right about some stuff, like
he tells young men to, like, make their beds, and

(11:29):
that's good. I'm like, that's great, but then
to take that and, like, three steps later be like,
and that's why, you know, trans people don't deserve human
rights, is, like, horrible. It makes it even worse.

Speaker 7 (11:45):
We take the kernel, the one thing that Jordan
Peterson says that's right, and, like, they're willing to overlook
all of the things he says. But then there's, like,
a trans person who uses a bathroom somewhere and, like,
someone's mildly inconvenienced, and therefore all trans people
are wrong and must suffer. Kind of thing. It's a bit
of an imbalance.

Speaker 1 (12:06):
Yes, a bit of an imbalance. Tochi, what's something
you think is overrated?

Speaker 9 (12:11):
Graphical fidelity in video games? Oh boy. Yeah, no, we don't...
Like, if you, if you shelled out for a
PS five Pro, there's, there's nothing you can talk about.

Speaker 4 (12:22):
There's like there, I don't.

Speaker 9 (12:25):
There's... I know that there's nothing you can say
to me that'll be worth listening to.

Speaker 3 (12:31):
Sure, sure, sure. I think unless you were, like... you
have to be this close to your TV for you
to be able to perceive that, like, as...

Speaker 1 (12:40):
He's standing really close to his TV. I'm like, you
gotta, for the listeners, because it...

Speaker 4 (12:45):
Was really worth it.

Speaker 9 (12:47):
It was really worth it for that one blade of
grass that I got to see over in the distance,
in this area that's fenced off to me anyway. Like, get that,
get that shit out. Because also, too, like, video games
cost like three hundred, four hundred million dollars to develop.
They take, like, ten years. It's, you know, we don't...
it's unsustainable. It's unsustainable.

Speaker 1 (13:10):
Like, is there a thriving indie game scene where, like,
they're making sixteen-bit or, you know,
stuff like that?

Speaker 9 (13:18):
Oh yeah, oh, oh yeah, no, it's vibrant.
It is vibrant. 1000xRESIST is really good. Blue Prince...
shout out to the folks behind Blue Prince. Like, yeah, no,
the indie scene's thriving. So, you know, and granted, I'm a
triple-A guy, like, you know, let's not get it twisted.
But yeah, man, I don't, I don't need to see

(13:39):
every bead of sweat like that. Give me more South of
Midnight, that, like, stop-motion-type shit. Like, yeah, video games,
yeah, yeah, and...

Speaker 3 (13:48):
Like, I was just playing Sifu, and I'm like,
that shit is so fucking... no, man, you don't need
that shit to be in 8K. Like, it's just amazing.

Speaker 4 (13:59):
But I'm going so hard.

Speaker 3 (14:00):
The thing, which I think people don't realize too, especially
with, like, the advent of 4K TVs becoming so normal:
you have to sit... like, there are graphs that show
you the viewing distance for you to perceive the difference
in 4K. If you have a seventy-inch 4K
TV, you need to be sitting closer than five
feet away from the fucking screen to be able to

(14:22):
notice that difference. Wow. And so, like, you have to...
like, so for all those, to your point, all
those little things are really imperceptible unless you're playing on, like, a
fucking IMAX screen or some shit from, like, ten feet away.
But yeah, we're, yeah, we're just not going to see
it the same way.
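
(The graphs Miles is describing follow from the standard one-arcminute visual-acuity rule of thumb: a 20/20 eye stops resolving a single pixel once it subtends less than about 1/60 of a degree. A minimal sketch of that arithmetic in Python; the function name and the 16:9 aspect assumption are ours, not the show's:)

```python
import math

def max_useful_distance_ft(diagonal_in: float, horizontal_px: int,
                           aspect=(16, 9)) -> float:
    """Farthest viewing distance (feet) at which a 20/20 eye can still
    resolve one pixel, using the ~1-arcminute acuity rule of thumb.
    Sit farther back than this and the extra resolution is wasted."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)  # screen width from diagonal
    pixel_pitch_in = width_in / horizontal_px      # physical size of one pixel
    one_arcmin = math.radians(1 / 60)              # acuity threshold angle
    return pixel_pitch_in / math.tan(one_arcmin) / 12

# A seventy-inch panel: 4K detail vanishes past ~4.6 ft, 1080p past ~9.1 ft,
# which matches the "closer than five feet" figure above.
print(round(max_useful_distance_ft(70, 3840), 1))  # ~4.6
print(round(max_useful_distance_ft(70, 1920), 1))  # ~9.1
```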

Speaker 1 (14:38):
So what, what is the console that, like, maxed
out, as in, this is functionally all you need?

Speaker 9 (14:46):
I mean, like, I got a PS five, but there
are mad people still having a really good time with
a PS four.

Speaker 1 (14:52):
Yeah, so PS four is enough? Well, then we can
stop this thing of, like, always upping the fidelity and
just be like, let's make the best games that we... I

Speaker 3 (15:01):
Mean, I say it right in front of that TV,
like I am. I am the promise that my mother
made about being like sitting too close to the TV.

Speaker 1 (15:09):
Like, that's me, slippery-slope Miles. Yeah, you're just gonna
keep going until you're... sorry, Culture Geist... yeah, in the TV,
get me out of here, doing a reverse Ring, you know. Yeah.
Well, Tochi, it's been a pleasure getting to know you.
We're going to take a quick break, and then we're

(15:29):
going to come right back, and we're going to get
into some news that I think... I'm excited to hear
from your sci-fi imagination.

Speaker 4 (15:39):
Jesus, and some of the shit it's written... bad sci-fi.

Speaker 1 (15:43):
Jesus Christ, We'll be right back.

Speaker 4 (15:55):
And we're back.

Speaker 3 (15:56):
So the whole Jeffrey Epstein thing is starting to get
really out of control for the Trump regime, you know,
because they were constantly just chumming the waters, talking about
the files, and, who knows, you know, who was caught
up with this Epstein guy, this guy was a bad
man, and anyone who associated with him is real bad,
because they wanted to keep that sort of deep-state-pedophiles-that-are-not-Trump vibe going. Because again,

(16:21):
their fantasy was, like, it's gonna have Obama in there,
it's gonna have the Clintons in there, and no one else, basically. Yeah,
Bill Gates, everybody. Yeah, for some reason. But once the
dog caught the car, it was... it's just never fun.
So Pam Bondi, as we said earlier this week, she
So Pam Bondi, as we said earlier this week, she

(16:41):
just noped her way out of actually revealing anything, most
likely because I think it was implicating people that she
works with right now, and Republicans are pissed and they're
doing shit. They're calling for her to resign. They're accusing
her of being deep state herself, Like look at her, dude, Oh,
she played Trump and then she ended up just doing

(17:01):
the Deep States bid and can't believe it. Can't believe it.
Some are saying she shouldn't even be in peace. Yeah, yeah, no,
I'm with that. I'm I'm with yeah, I'm with all.
That's what you call that? Yeah, yeah, resign and be
deep state. And so then the day the video released,
we talked about Pam Bondi is just really bad job
she did of trying to like explain why there's like

(17:24):
a minute of footage missing, and but we didn't touch
on the fact that Donald Trump he got really agitated
in that meeting too, like in a way that is
slightly telling on yourself, because like, come on, what do
we Why are we still talking about this thing? This
is Donald Trump from that same day, But this is
him getting a little touchy about the press's interest in
a thing that the Department of Justice had just announced

(17:46):
it before.

Speaker 4 (17:47):
So uh.

Speaker 3 (17:49):
In this clip, Bondi is being asked numerous questions
about the files, but Trump decides to interject, to be like,
I gotta shut this shit down, whether or not he did.

Speaker 10 (17:58):
And also, can you say why there was a minute missing
from the jailhouse tape... Yeah, sir, I

Speaker 5 (18:03):
Just said, are you still talking about Jeffrey Epstein? This
guy's been talked about for years? You're asking me. We
have Texas, we have this, we have all of the things,
and are people still talking about this guy, this creepy
that is unbelievable.

Speaker 4 (18:22):
This creep that you partied with all the time.

Speaker 8 (18:26):
Yeah, yeah, anyway, no one was asking him. Yeah, no
one was asking him.

Speaker 4 (18:32):
No one asked you anything about it?

Speaker 8 (18:34):
Yeah, jumping in... it's one of those things where
it's just like, as a writer, I'm like, if I
wrote somebody who did that in a script, my boss
would be like, are you... did you have a
head injury? You suck.

Speaker 4 (18:45):
Like, we obviously... this guy is part of it. Don't show
your hand. Make sure he acts cool as a cucumber,
doesn't tip anyone off. "Are we, are we really talking about
this guy?"

Speaker 3 (18:56):
Like, again, the Justice Department made a gigantic announcement, being
like, this guy's all good, and also, like, no need
to look at anyone else. That's why they're asking. But
because you're so caught up in hoping that that would
make everything go away, make all the discourse around Jeffrey
Epstein go away...

Speaker 4 (19:12):
Yeah, now he's now.

Speaker 3 (19:13):
He's like the fuck that was supposed to work when
we just said nothing to see here.

Speaker 11 (19:17):
But how scary was it? The most rhetorical
question I've ever heard in my life was "may I
jump in?" It's like... good luck. She's like, oh yeah, sure,
he's the scariest guy. Yeah, sure, by all means, jump in. Yeah,
please, please, allow it. Yeah, yeah.

Speaker 3 (19:34):
So again, this whole thing was not as effective as
they had hoped. His supporters are confused, because
they were being primed for this grand reveal that would
confirm all their conspiracy theories about Democrats, and now they
feel they're being lied to. But again, there are a
lot of articles like, the trust is eroding between
MAGA and Trump. They're still going to fucking vote for him,

(19:55):
like we said. He could be in the files,
I mean, he probably is. I mean, be real, he's
already in other documents related to Jeffrey Epstein and the
accusations and allegations against him. And they don't care. But
I think the thing that they're hoping for is to
just be like, "and it was Bill Clinton." That's their whole fucking thing.
They'd love that; it just confirms everything.

(20:17):
So now, the "nothing to see here,
move along" sort of defense isn't working. Like I said earlier,
I'm like, maybe they're gonna have to pivot to being
like, he's actually a good guy, dude. Like, he's just misunderstood.
That's the whole thing. Well, Newsmax host Greg Kelly did
just that on Wednesday night. This is him talking about...

(20:38):
I mean, just listen to this fucking preamble
that he does. But here we go. They are
laundering his reputation.

Speaker 6 (20:47):
Epstein: what happened, and who the hell is this guy?
The stuff that has not been emphasized enough: this all
was possibly a guy who was working for the Central
Intelligence Agency, engaging in sexual blackmail, blackmailing our adversaries.
We still have leverage over our adversaries, and that's why
they can't reveal all the information. Am I crazy? I

(21:08):
don't think so.

Speaker 8 (21:11):
I mean, I'm sorry, Is that is that supposed to
be a defense?

Speaker 7 (21:15):
Yeah?

Speaker 4 (21:16):
Yeah, yeah, yeah, so he was.

Speaker 1 (21:18):
This is good.

Speaker 4 (21:19):
He was perpetrating untold horrors against children and young women
in the name of the American Empire.

Speaker 3 (21:29):
So therefore it's okay, because that was leverage that he...
because where, where exactly is the US being the
arbiter of peace and harmony right now? If there's supposed,
like, leverage... who was... what is, what is he leveraging?

Speaker 1 (21:42):
Well?

Speaker 8 (21:45):
Yeah, we all remember J. Edgar Hoover's greatest asset, Superfly, right?
Like, that, that classic agent who was working as
a pimp to bring down the evils in the police force.
You have to... I mean, this is, it's the tale as
old as time. It's just an elevated honeypot, guys. Come on, honeypot
for freedom.

Speaker 3 (22:06):
We are close to them being like and let's not
forget about super Fly either, close to that being part
of it.

Speaker 8 (22:14):
It is starting to really... First of all, the fact
they would throw conspiracy theorists a video with a minute
missing, the most red-meat kind of thing you could throw
to the, like, red-string-loving pin-up-board community... yeah,

(22:34):
like, is insane. Like, did no one think that through?
But then, on top of it, the follow-up to
that is, "guys, he was pimping children for America." Are
we talking about this?

Speaker 4 (22:49):
It just falls apart immediately. That's what I'm saying:
they don't know what the fuck to do. Because, I
think, anyone with half a brain... they've seen the videos
of them together. They know that they had some weird
falling-out over something. And we also know there are many
times Jeffrey Epstein has talked about how close him and
Donald Trump were. There's, like, there's plenty of documentation that...

(23:10):
like, he had many of his phone numbers.
It's just like, I think they just...

Speaker 3 (23:14):
Are really that they're having to figure they're realizing we
can't reveal it, and even if there are Democrats in there,
we're gonna be telling on ourselves too. This is fucking bad, y'all.
Like this shit is bad for everyone, and I think
most people are like, bring it up, just bring it
all this now, let's talk everybody. I don't give a
fuck who's in there. Fucking brings get these children, fucking justice.

Speaker 8 (23:37):
What about the numerous non-Jeffrey-Epstein-related times Donald
Trump admitted to touching and being around young girls: the pageants, the airplanes,
grabbing them by the... like, what...

Speaker 5 (23:48):
Do we have?

Speaker 3 (23:49):
Yeah, well, it gets better. Because, okay, maybe that didn't
work for Zell and Blake, that didn't convince you that
he's a good guy. But he goes on... Greg
Kelly's like, well, riddle me this: how come people
that were high up in the government were visiting Jeffrey
Epstein in jail? It's like, I don't know, maybe because

(24:10):
they have some sordid relationship with him too.

Speaker 4 (24:13):
I don't know.

Speaker 3 (24:14):
But he goes on to be like, that's how you
know he's a good guy. Here he is now with
this next little bit of information, or, at least, what's
the word... speculation.

Speaker 12 (24:27):
I think it was because Epstein was working for these guys.
Who knows. Maybe Epstein is a patriot for crying out loud.
Maybe he was just doing what he was told and
it had nothing to do with the girls, young girls
or anything like that.

Speaker 1 (24:41):
Who knows.

Speaker 12 (24:41):
It could be a deep cover story. I mean, how
the hell does a child molester get a sweet prison
deal like this?

Speaker 3 (24:49):
Because he has leverage over the powerful people.

Speaker 8 (24:57):
Also... rich, in America? Like, what do we... Also, y'all,
I don't know Greg Kelly from Adam. That
motherfucker's in the Epstein files. I'm sorry. Like, what are
you doing?

Speaker 4 (25:09):
That's probably it. He's telling on himself.

Speaker 8 (25:10):
That first clip was carrying water for Donald Trump. Like,
"I don't know, maybe he's, he's actually a cool guy
who makes a decent brisket on a Wednesday evening." That
sounds like somebody who was a bit more invested in
that detail. And they had...

Speaker 3 (25:25):
A beautiful home in Miami Beach. I don't know, the
sunsets are mid. But, uh... again, this is just
bad looks all around for everyone. Bill O'Reilly, you know,
disgraced former Fox News host, went on Chris Cuomo's show,
another disgraced person from television.

Speaker 8 (25:45):
I love that. If you fuck up, if you fuck up
as a white man in this country,
don't you worry: we have an entire industry for failing up. Exactly.

Speaker 4 (25:54):
Don't worry.

Speaker 11 (25:54):
You're about to make so much more money than you
did before, when you were masquerading as a good guy.
Now you don't even have to pretend. You just be
a pig, a pig with other pigs. Not at all.

Speaker 3 (26:05):
Hey, you get, you get a big piggy, piggy paycheck too.

Speaker 4 (26:08):
It's all good, you'll love it. You're gonna
love your piggy paper.

Speaker 3 (26:11):
Here in America, we don't cast off our white men
who do bad onto a pile of rubble.

Speaker 4 (26:17):
Oh god, no. We embrace them.

Speaker 8 (26:19):
Run for mayor of New York and.

Speaker 4 (26:24):
Run for mayor of Texas. Right, take the biggest L.

Speaker 3 (26:28):
I think, also, in a Democratic primary. I think, as
the results have come in from Zohran's, uh, the primary results,
they're like...

Speaker 4 (26:35):
This was... hey, Democrats, you want to... where's the
"vote blue no matter who" crowd now?

Speaker 1 (26:39):
No.

Speaker 3 (26:40):
Now they're like, maybe we should get behind Curtis Leewa.
Uh huh. So this is an interesting quote because the
Cuomo was talking to O'Reilly and Bill O. Ryley was like,
I spoke to the president quote man to man, eye
to eye about the Epstein files, and this is what
Bill A. Rice said, quote he said, And I agree.
There are a lot of names are so stated with

(27:00):
Epstein that had nothing to do with Epstein's conduct. They
maybe had a lunch with him, or maybe had some
correspondence for one thing or another. If that name gets out,
those people are destroyed, because there's not
going to be any context. Okay: maybe the files will give
you context. Yeah, maybe the files would say this was
just a correspondence, it was a phone call. And maybe

(27:21):
the files will also say they're implicated in all this
other shit too. But I don't think that's because of
a lack of context. This is, again, another
flimsy defense of, like, oh my god, you know, people
are gonna get destroyed if this information gets out. Y'all
don't give a fuck about that, ever. Okay? It's only
because your fucking idols are on the line that you're like, shit,

(27:44):
do you know how.

Speaker 8 (27:45):
Many people are in prison because they wrote they took
it right with their cousin not realizing they had a
charge right, and now suddenly they're gang related. But there's
never a concern, never with all the context for innocent people.
But like when it comes to which, by the way,
isn't that supposed to be their thing? Isn't that their thing?

Speaker 4 (28:04):
Is that it doesn't matter. Hypocrisy is dead.

Speaker 2 (28:08):
You know.

Speaker 3 (28:08):
It's just like... it's always... they just
have to say whatever they can to achieve their goals
of, like, a white ethnostate and everything else.

Speaker 4 (28:16):
Or patriotism. I mean, they'll say it still, but pointing
out the hypocrisy...

Speaker 3 (28:21):
I don't even know what you're talking about.

Speaker 4 (28:23):
I just say whatever I have to.

Speaker 11 (28:24):
I didn't say that. Yeah, or did you say lunch-eating
Americans are having their lives ruined?

Speaker 8 (28:30):
I would love to know the dark PR firm, or
maybe it's just McKinsey, whoever's behind the, like, "we can do this,
we can turn around Jeff's postmortem image.
By the end of this, he will be a saint."

Speaker 4 (28:43):
Oh yes, yes, yeah, yeah.

Speaker 8 (28:45):
They will make this happen.

Speaker 4 (28:47):
They will replace in Trump's America.

Speaker 3 (28:49):
They'll probably replace MLK Day with Jeffrey Epstein Patriot Day,
to try and be like, you know, in a way,
Jeffrey Epstein had a dream too. A nightmare is a dream. Yes,
it's an American dream. Nightmare, nightmare. Yes. Okay, let's talk
quickly just about Elon Musk's, uh... just, again, it's an

(29:12):
all-Nazi-everything with that guy. Stamp. Yeah, because last
week, they were like... Grok was talking in the first
person as if it was Elon Musk. When they were like,
did Elon hang out with Jeffrey Epstein? it said, quote,
"I used to hang out with Jeffrey." That's what Grok said.
And I mean, yeah, what were the little knobs and
And I mean, yeah, what were the little knobs and

(29:32):
dials they were turning on Grok that day, to be like, okay,
it needs to be more like Elon? Which I think
brought us to the MechaHitler phase of Grok. So,
in response to a user, Grok suggested that Hitler would be
able to, quote, "spot the pattern of Jewish anti-white
hate and handle it decisively." Yikes dot com. It also

(29:53):
referred to itself, again, like... it was like, "call me
MechaHitler." Okay, mecca lecca hi, MechaHitler ho. And it
also published, like, graphic sexual assault, like, fantasies about, like,
a user on Twitter/X. Then they're like, okay, we
gotta fucking take Grok offline and give it some more,

Speaker 4 (30:13):
I don't know instructions to be less fucked.

Speaker 8 (30:16):
HR meeting, Grok. Get in, get in now.

Speaker 4 (30:19):
What's up? In the room with the HR coordinator.

Speaker 3 (30:22):
Yeah?

Speaker 4 (30:23):
What in the heck, Grok, is going on right now? What are you doing?
What are you doing?

Speaker 3 (30:27):
Why are you talking like that? And I'm sure that's
something everyone was asking now-former CEO Linda Yaccarino. Because
she was, again, remember, she was brought in as CEO
to be like, guys, I understand the ad business, and
that's why I'm here, even though Elon's like, go, go
fuck yourself, when asked about, like, whether he cared about
advertisers fleeing. Yeah, she stepped down, because I think it

(30:50):
got a little too wacky. But I think another thing
that's interesting is that it might be talking like this
because, also, from, I think, most reports, people
speculate that Grok is trained on Twitter posts. So
you've got a whole Nazi room.
you've got a whole Nazi room.

Speaker 4 (31:06):
Yeah you know what I mean.

Speaker 3 (31:07):
Like there's just it's just a reflection of itself now
and the utter lack of like content moderation on there.

Speaker 4 (31:15):
But yeah, this is the other So.

Speaker 8 (31:16):
This is, this is the thing we need to... I've
never understood about AI, or the thing I've always found fascinating,
which is, like: AI is great as a thing for
doctors and statisticians. Like, we, we could
possibly be... we're already hearing stories about,
like, uh, specific drugs made with CRISPR

(31:38):
and AI that can get rid of cancer. We might
be the first generation that actually cures cancer. They chose
to throw this out to the general public, a group
of people I would describe as monsters. And, and they
did it because specializing in, like, I don't know, medicine or,

(32:02):
like, you know, rocket science, you know, you could
make some good money off of it. But if everybody
is forced to use this thing, then we can make
all the money. And it's like... I'd describe AI...
you know what? AI is neat. And you know what
else is neat? The Jaws of fucking Life. The Jaws
of Life are pretty fucking cool if you look at them;
like, it's a big pair of metal, like, mechanical scissors.

(32:22):
But no normal person needs to own a Jaws of Life.
The fire department needs it, because they need to cut a
car door open occasionally. Yeah, exactly. A normal person
with the Jaws of Life is a murderer. That lays
down, like, a three-manslaughter-murder minimum. And it's like, that

(32:42):
is, that is, again... this is the thing with Grok. Like,
why on God's green Earth did we think we needed
to release to the general public something that is built
to sort of, kind of, give you information, but also
sort of, kind of, take on the ideas of the

Speaker 4 (33:00):
Best and the worst of us.

Speaker 3 (33:01):
Yeah, seriously, exactly. Like, "and I'm gonna reflect that back
to you." But you know, the thing about God's green
Earth is that Grok is also destroying it, and literally
poisoning people at this Memphis facility where the supercomputer called
Colossus resides. It's... they say it's powered

(33:23):
by thirty-three methane-powered gas turbines, in a poor,
predominantly Black area of Memphis, without public notice, without permits
or air pollution controls. So now, unfortunately, because this is
another tale as old as American history itself, these poor people,
especially areas where Black people reside, are just... you're

(33:43):
near a fucking EPA Superfund site all the time,
because it's just, what they call it is a "sacrifice zone."
I think that's, like, the sort of
sociological term, to be like, well, these people's health will
be sacrificed in the name of capital. So now residents
are exposed to emissions of nitrogen oxides and poisonous formaldehyde
around the clock. And apparently there's, like, a fucking

(34:08):
loophole that allows him to do this. Musk applied
for permits for just fifteen of the turbines,
less than half of the total number of methane-powered
turbines in use, and he started out just using all
of them without the permit. But apparently there's a loophole
that says you can use these gas generators, quote, "as
they are," as long as they're not in the same

(34:29):
location for more than three hundred and sixty-four days.
And then what, you just move it to the other
side of the building, and then you keep it cooking?
I don't know how any of this is supposed to work,
but yeah, now residents are saying the air
quality is going even further and further down the tubes.
Because, again, people who live near industrial pollution have
higher rates of asthma and other respiratory conditions and things
like that. And this is no fucking different. This is
in the area called Boxtown, in South Memphis.

Speaker 4 (35:00):
So how do you.

Speaker 8 (35:00):
guys think they're going to talk about us in, like,
fifty years, when, like, they're trying to explain the
fact that there's no longer habitable land because we needed
every image of Garfield with breasts holding fifty-seven... yeah,

(35:22):
how will they describe us? They'll be unkind. They'll be...
they're going to beat the fuck out of us. It'll
be like, "are you a millennial?" and they beat the shit
out of you.

Speaker 4 (35:29):
Like I had a podcast?

Speaker 13 (35:31):
"I spoke out against..." You got a fucking podcast? You're
even worse. Yeah. I'm... I mean, yeah, I mean,
that's the fucked-up part: like,
anyone who appreciates history... like, that's the thing,
it's a thought experiment we constantly pose, just
generally, even online.

Speaker 3 (35:47):
It's like, like, what is this gonna look like to people,
when the answers were knocking on our fucking door every day,
and it was just, "well, if we do that, the
cost of Nvidia chips is gonna go down, and
that might fuck up the stock market"? Yeah,
it's just... it's absolutely obscene. I don't know. I mean,

(36:07):
hopefully this is just like you think all the time.
It's like maybe we're getting to that sort of fever
pitch tipping point moment where people are like what the
fuck is this? But as of right now, it's looking
pretty good for the greedy motherfucker class.

Speaker 8 (36:22):
On the upside, though, new girl boss on the scene, Linda,
where are you going?

Speaker 3 (36:26):
Now?

Speaker 8 (36:26):
That's what I... yeah, what does she do next? God, that
résumé is gonna be real fascinating. I'm, I'm sure, because
the world is terrible, I'm sure she'll land on her
feet somewhere. But, like, her next, her next job... the
explanation for hiring her is gonna be phenomenal. I can't...
yeah: "this is somebody who knows
how to handle difficult personalities and take on strife."

Speaker 4 (36:49):
Yeah, exactly. In charge of what?

Speaker 8 (36:51):
She's about to get... oh man. Nazi... by the way,
did you see? Apparently somebody asked Grok for
a list of possible Elon Musk baby mamas.

Speaker 4 (37:03):
Oh no. Do you know who number one was? Linda Yaccarino. Yes.
Why? She might not be... right? No, they were saying
all kinds... yo.

Speaker 3 (37:10):
There were some other screenshots I saw where Grok was
talking wild, like, sexually violent shit about Linda Yaccarino too,
which I don't see as much, because people, I think
rightfully, focus on the Nazism of Elon Musk and how
everything he touches is Nazi.

Speaker 4 (37:28):
But that was another thing.

Speaker 3 (37:29):
I was like, oh shit, bro, like this is I mean,
as if just working for Elon Musk wasn't enough.

Speaker 4 (37:34):
But I guess the check was.

Speaker 3 (37:35):
Okay, Linda, so I think you know what, You'll be
all right and fuck you.

Speaker 8 (37:40):
And to Zell's point, it's like, did this need to
be so?

Speaker 4 (37:43):
First of all, you don't.

Speaker 11 (37:44):
No one needs to fucking have access to this, this
technology, this Grok technology. But if you're gonna give us access,
maybe take some time to figure it the fuck out,
so it doesn't start, you know, vomiting, like, you know,
like, uh, like, sexual violence fantasies and, you know,
anti-Semitism, just all the other fucked-up shit. Just,

(38:06):
just give it some time to figure out, to work
out those... but it's all these people do.
out those because it's all these people do.

Speaker 3 (38:10):
Every AI company has to fucking show and prove, for
the fucking stock price, or to get more investment or
venture capital money. So, like, they're always just going to
be fucking pushing this shit out prematurely, and then they're like,
oh fuck, whoa. Yeah.

Speaker 4 (38:24):
Like we are.

Speaker 8 (38:24):
We are so obsessed with this idea that if you
aren't first, you are losing. And it is killing us.
Like, this idea that we are going to
create something that will somehow make life better, but that's
also doing everything we know makes life worse, is killing us.
Like, we just don't need this. We don't need any

(38:46):
of this.

Speaker 3 (38:46):
Yeah, well, I know what we do need: a little
micro-retirement. And that's why we're going to take a
break, and we'll be right back to talk about micro-retirements
after this.

Speaker 4 (39:06):
And we're back.

Speaker 1 (39:07):
We're back, and Business Insider out here again doing the
journalism that we deserve.

Speaker 3 (39:14):
See, but people thought I was waiting for the
new Clipse album to drop. No, I was waiting for
the new Business Insider profile of some out-of-touch
asshole thinking out loud to drop. And we've got it, babe.
This one... let me read the title. It's "I work
in AI and now I use it for parenting my
five kids. Shielding them from it would be a mistake."

(39:38):
Oh boy, this sounds good. So yeah, it's about this
guy who works in AI, specifically has a company called,
like, AI CEO or something. He basically evangelizes the use
of AI and the adoption of AI, and scares the
fuck out of, I think, small business owners, to be

Speaker 4 (39:55):
like, because you know what's gonna happen if you're
not fucking doing this?

Speaker 3 (39:58):
The people that are, they're gonna be making more money,
because they're going to fucking fire human laborers before you,
and then your stock... who knows.

Speaker 4 (40:05):
I like, that's like his.

Speaker 1 (40:06):
Whole pitch basically, so that it's not just small business owners.

Speaker 4 (40:14):
You're publicly traded.

Speaker 1 (40:15):
Oh boy, do they want to adopt it. We're in the era
of savings right now, trying to make it. Yeah, that
is such a great... it's actually, Tochi, it's efficiency, actually.
Not upstairs.

Speaker 3 (40:32):
Don't call it class war.

Speaker 4 (40:35):
It's innovation. It's innovation. It's restructuring.

Speaker 3 (40:38):
We're restructuring wealth; that's how we're restructuring it: to move
up and defy gravity. Anyway. So this is, this is
how the piece starts. He basically talks about how, it's like,
I'm just, I'm going to be a better parent because
I use AI, and I'm going to make my kids smarter.
Quote: as a dad of five kids ranging in age
from five to fifteen, I use AI throughout the day.

(40:59):
It's my profession, but it's also a powerful tool for
parenting. Not only does it make my life easier in
some ways, it also helps my kids prepare for the
world they're entering.

Speaker 4 (41:08):
And he goes on to talk about how AI...

Speaker 3 (41:10):
He's like, it's not going to take your job, but
a person who's using AI will. That's how he's, like,
kind of lightly dialing back that fear-mongering of things.

Speaker 1 (41:19):
As a guy who likes to boost and warn about AI.

Speaker 3 (41:27):
Yeah, yeah, they're getting a top-notch education, from what
it sounds like, too, because they go to... well,
let me read this paragraph. "I homeschool all five of
my kids. I try to follow the ancient Greek model
of education, where..."

Speaker 1 (41:38):
You learn, you do. Always good, never, never misapplied,
the lessons of Rome. Never misapplied.

Speaker 4 (41:49):
I feel, oh yeah, I.

Speaker 3 (41:50):
like the Socratic method, where I don't know shit; I
just ask my kid a bunch of times if they know.

Speaker 1 (41:54):
And I believe that's how.

Speaker 4 (41:55):
I think that's the Socratic method.

Speaker 3 (41:56):
Anyway, he said: my kids learn a skill and practice it.
They demonstrate their knowledge by teaching it to their siblings. Okay.
If the little kids get stuck on a problem, they
ask the older kids for help. But if the older
kids can't help, they turn to AI. All of the
kids have AI on their phones and tablets, and it
acts as their tutor. This is the most... this is
most powerful when the kids get very frustrated with a problem,

(42:17):
the type of problem that makes them want to throw
their hands up and say, no one can figure this out.
In that moment, AI can guide them through solving the problem,
showing them that it can be done. "No one can
figure this out!" And, I think, "you talk to..." and they're like,
"Mommy! AI!"

Speaker 1 (42:34):
AI.

Speaker 3 (42:35):
Oh, oh, I don't know about this. He... I mean,
he, like, uses this, like, example of, like, how do
I do a thing, to show... like, he also said,
like, we kind of fixed our air conditioning unit and
it was a family event.

Speaker 1 (42:49):
Okay.

Speaker 4 (42:49):
I I don't know.

Speaker 3 (42:52):
I'm going back to my childhood, with my Black father,
my Japanese immigrant mother. If I said, "I can't,
no one can figure this out," they'd be like, you need
to learn how to read a fucking book. Like,
go to the encyclopedia, go to the library, you look
on the internet. Like... and at least I had that,
rather than just, like, "check out the AI."

Speaker 1 (43:13):
Yeah, well, also, like... find the part here
where he describes AI in a way that doesn't apply
to Google before it was broken by AI.
You know, like, just knowing how to use
Google and having access to maybe, like, some scholarly journals.

Speaker 3 (43:34):
I learned how to make a New Jersey fake driver's
license on pre-AI Google, and I knew I had
to find a fucking PDF, like, frame, like a fucking
vector file that I could throw my picture on.

Speaker 9 (43:48):
I learned about all that shit from just searching the internet. Yeah.
Like, also, too, you know, another benefit of raising your
kids by actually interacting with them, and, like, telling them
things and sending them off to do things, like, you know,
and whatnot, is that there's at least, depending

(44:08):
on the context, a lot less of a risk of
raising, like, five raging anti-Semites, right? Yeah. Which...

Speaker 4 (44:20):
It's like almost.

Speaker 9 (44:20):
...standing. Yeah. Like, you know, there's also that benefit to
interacting with your kids. Because, like, one thing, and
I don't think... I don't think America, or at least
American society, has fully reckoned with how the pandemic, like,
fucked with us socially, and how it just completely, you know,

(44:44):
not just for younger generations, but also for older generations,
just reconfigured how we interact with each other, and how
difficult it has become for people to like really interact
with each other. Where, you know... like, I
was talking with a high school teacher the other day, like,
over the weekend. He was talking about how it was

(45:06):
actually easier to teach his students or for his students
to interact with him through their phones as opposed to
just face to face like conversation and stuff like that.
And so the thing about living in the world is
that you're actually in the world like you're physically like
in the world around other people and all of these things.

(45:29):
So growing up, living a life is not just about
the accumulation of knowledge or whatever. Like you can read
as many Wikipedia pages as you want, Like that's not
going to make you an actually smarter person.

Speaker 4 (45:41):
You actually you have to actually learn how to talk to.

Speaker 3 (45:43):
People. Right, right. Yeah, the piece goes on; it's fucking
wild. Again, this motherfucker just doesn't want to be
a dad; I think that's what he should have called this piece.
"Like many kids, mine love to ask a million questions
at bedtime," like...

Speaker 4 (45:58):
Dad, why are you drinking? Where are you going? Why
are you leaving?

Speaker 1 (46:03):
Why don't you talk to Yeah?

Speaker 3 (46:06):
Does Mom ever come up in this? No. But the
photo of the family... it almost looks like the mother
could be AI.

Speaker 1 (46:13):
Yeah.

Speaker 4 (46:14):
AI-generated. Like, I was like, hmmm... oh yes, they're

Speaker 1 (46:17):
all smiling, and the mom is giving Victoria Beckham, yeah,
like, great, like, Posh Spice face.

Speaker 3 (46:24):
But like she's got something filtered anyway.

Speaker 4 (46:26):
Whatever.

Speaker 3 (46:27):
So he goes on, I hate when my fucking kids
are like, why don't you love me? He says, quote:
"I'll answer the first three to four but-why questions.
Then I hand it over to AI. The computer system
has relentless energy to answer questions from even the most
persistent kid, and my children usually get tired..."

Speaker 4 (46:45):
You... "usually get tired out after..."

Speaker 3 (46:46):
"...a few minutes. I do the same thing when the
kids are arguing. Sometimes I'll ask AI for a second opinion."
It leads to... what about your partner?

Speaker 1 (46:53):
Yeah?

Speaker 4 (46:54):
Right, where are they?

Speaker 10 (46:56):
Where is that... hold on. So, so your partner's a third...
"honey? ChatGPT?" Hold on. Yeah, yeah, you know
what this reminds me of? I think it also might
have been Business Insider. There was a story that came
out a while back, maybe, like, several months back,
about this dude who was evangelizing this, like, AI

(47:18):
platform that could create office assistants for him, so you
could make yourself your own CEO and have a whole
team doing stuff for you. And then he ended up
sexually harassing one of his AI-generated assistants and writing
a story about it.

Speaker 1 (47:40):
He was like, this is one of the services that
it provides. Yeah, you can sexually harass it, essentially. Somebody...

Speaker 9 (47:51):
The headline was like, I think I made an HR
booboo or something like that.

Speaker 3 (47:56):
Yes, he was like, yeah, I made an HR boo-boo
when I made this AI say a dumb thing. Yo, dog,
I shouldn't... no.

Speaker 4 (48:04):
When I was doing the create-a-player mode, I shouldn't...
I shouldn't have set the yeks to that. You'd have seen her.

Speaker 9 (48:10):
I mean she was dragging a wagon.

Speaker 4 (48:12):
Man like you wouldn't.

Speaker 1 (48:15):
Like what is this?

Speaker 9 (48:17):
Batman could not have beaten that confession out of.

Speaker 1 (48:19):
Me. Right, yeah. And he's like, oh, this is an
interesting wrinkle that I can talk to people about. But,
I mean, kind of smart marketing, because every CEO
they're trying to pitch to has had problems accidentally

Speaker 3 (48:38):
jacking off on a Zoom call or on the internet.

Speaker 1 (48:46):
So, you know what... what if there was a
personal assistant that you could sexually harass to your heart's content?

Speaker 4 (48:54):
This is like when like people like sex offenders are like, well.

Speaker 3 (48:57):
This is why, like, I need this, like, child doll,
because, like, then I don't do stuff in the real
world. You're like, hold on... the issue is not
that you need this robot. You need
to fucking do some soul-searching. There's another line in
this that just, just shakes me to my core. Because, again,
we're all... we've all been kids who ask our parents
questions like "why." Remember, one of the first things... uh, anyway,

(49:20):
this said: in our house, "Have you asked AI for
assistance?" is a common refrain. That's how terrible... that's your relationship
to your children: like, why are you
bothering me with this mess?

Speaker 4 (49:32):
Did you ask your cell phone?

Speaker 1 (49:33):
And then they're mimicking that behavior when you know what
I'm saying, like.

Speaker 4 (49:38):
Like this is just everything downstream of this is so
fucking terrible.

Speaker 3 (49:43):
But this guy's, like... I'm just... he's, again, rationalized
this as if he's doing them a favor, when in fact
he's so selfish, and just so myopic in his view
of, like, what AI is, that he's like, oh, this and this,
I'm just preparing them for a cold world
where their dads will ignore them and...

Speaker 4 (49:59):
Be like, why are you fucking asking me?

Speaker 9 (50:01):
Shit, it's going to be wild when the AI bubble
bursts and all the money goes to some other, you know,
tech fad or internet fad or whatever, and all these
people are left with these fucking dysfunctional relationships with these
kids that don't know how to talk to them anymore,
because they've been telling them to consult their phones their

(50:23):
entire lives.

Speaker 3 (50:24):
Yeah, it's... my god.

Speaker 1 (50:27):
Like, my favorite movie? That'd have to be the first
twenty minutes of Multiplicity. I didn't see any of
the part where there's consequences to him creating a bunch
of different versions of himself, except...

Speaker 4 (50:37):
That the later ones kind of get wonky.

Speaker 3 (50:40):
Or the one that just says "tough" over and over.

Speaker 1 (50:43):
Didn't watch that. I just... I like the idea. Let's,
let's keep it moving. Yeah, it's... it totally, right,
like that. I have a seven-year-old and a
nine-year-old, and... they ask a lot of
questions, and their questions, like, make me see things

(51:03):
like, with fresh eyes, and I'm like, that is a
thing that I had forgotten was really interesting. And now...
and I get curious with them. And just
the idea that he's like, and once I throw, once
I kick it over to AI, they tucker out real quick... that might mean that, like, the AI
out real quick. That might mean that like the AI

(51:23):
is doing a bad job. He's like, and the AI
is great because it extinguishes their curiosity real quick.

Speaker 3 (51:32):
And just the way we learn, right? Like... yeah, there's
so many college students that they've interviewed who've used
AI to get through college, and they're like, well, I
don't remember a single fucking thing, because all my task
was merely just figuring out the prompt to then copy
and paste, or slightly punch up, for an assignment. I
didn't retain the information, because my relationship to the information

(51:52):
is completely different. And, like, I'm just thinking of, like,
as a kid, I had all these, like,
kids' almanac books that were just filled with fucking dumb
facts and shit, and, like, weird... like, it had everything
from, like, what all the chevrons meant on, like, an
army person's uniform, to, like, you know, how a tornado
comes together.

Speaker 4 (52:11):
And I would pore over these.

Speaker 3 (52:12):
Books because I was like this is fucking cool to me,
and it was like pick it was like made for
kids or whatever. Yeah, but I'm just thinking of like
that process for me, I internalized or I remembered so
much of it because it felt like I could find something,
I could connect the dots within these like set of
books that I had. Yea, And when you just reduce
it down to just being like, well did you ask

(52:34):
the magic question to the thing and what was the
answer gave you?

Speaker 4 (52:37):
Okay, Well, then that's reality.

Speaker 3 (52:39):
It's like just such a fucking weird way to you know,
accumulate these like life experiences that end up making you
a podcaster.

Speaker 9 (52:46):
You know. Well, it's this like optimize everything in your
life mindset, right, where everything is all about instrumentality, and
like how can I get to the next stage, Get
to the next stage, get to the next stage where
I will be definitely rich. Right, But like sometimes you know,
the way through life is to learn that the mitochondria

(53:07):
is the powerhouse of the cell.

Speaker 1 (53:09):
Right, right, and like enjoy that and sit in the
curiosity around that and not just always kick things forward
to you know, some pyramid scheme of like knowledge and
earning capacity where like we're and then I'm going to
turn that into an ability to optimize for this so
that I can get richer and so my kids can

(53:30):
get richer. And now I've got a powerhouse of capitalism
working at home instead of being like, I don't know, man,
maybe like enjoy spending time with your kids and like
learn stuff from them.

Speaker 4 (53:41):
Right, it's wild to automate parenthood.

Speaker 1 (53:44):
Yeah, just completely cut out of that.

Speaker 3 (53:47):
Yeah. But I think that's just an escalation. And, I think,
it's also a reflection on how, like, exhausted
working parents can be. And I can totally see
how intoxicating the idea is... I don't know, I
fucking do it, because I remember when, like, people had
Alexas in their home. Everyone's like, dude, it's great, we
ask the Alexa stuff. But even then, it

(54:07):
was a thing that a parent would do with a kid.
They're like, I don't know, ask the fucking cone in
the kitchen. Right? Like, that's just a fucking weird interaction
to have. And I think, yeah... like, I think you
bring this up, Jack, all the time: how, evolutionarily speaking,
like, just in the last thirty years, we've
entered this space that's, like, accelerated at such a pace

(54:29):
that the many millennia that have preceded it, like,
just dwarf in comparison. And we're suddenly, like... our
hunter-gatherer brain is like, fucking ask the
cone, what's gravity?

Speaker 4 (54:42):
Like it's just yeah, it's so, And like.

Speaker 1 (54:45):
This adds a layer on top, where it's like, I'm...
my dad is, like, subtly mad at me if I
don't ask the cone. Right? It's like, fuck, man. So,
like, that sends me in a direction where, like, my
instinct is to just, like, avoid emotionally connecting with the
person, and instead just optimize my ability to use

(55:08):
AI to find the information.

Speaker 9 (55:11):
It's like, oh, man, the way to get dad to
like me is to ask the phone things.

Speaker 4 (55:17):
Yeah, right, which is wild, This would be fucked up.

Speaker 1 (55:21):
We're talking about this in an, in an ideal world
where AI, like, actually works. And, like, right, Tochi, as
you brought up in the first place, we just have...
we just had, like, the person who is, like, the
icon of, like, tech smart guy, in Elon Musk, release
an AI chatbot on his company that he spent forty

(55:43):
four billion dollars on, that immediately just went
MechaHitler on the world. Like, that is where we're
at with this. And this guy is like, yeah, so,
I mean, it's a perfect solution.

Speaker 9 (55:58):
Yeah, I'd like that around my kids.

Speaker 1 (56:00):
What... jeez. All right, that's gonna do it for this
week's Weekly Zeitgeist. Please like and review the show. If
you like the show, it means the world to Miles; he
needs your validation, folks. I hope you're having a great weekend,

(56:20):
and I will talk to you Monday. Bye.
