Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hello the Internet, and welcome to this episode of the
Weekly Zeitgeist.
Speaker 2 (00:05):
Uh.
Speaker 1 (00:05):
These are some of our favorite segments from this week,
all edited together into one nonstop infotainment laughstravaganza. Uh yeah. So,
without further ado, here is the Weekly Zeitgeist, Miles Worth.
Speaker 3 (00:26):
We are joined in our third seat by the author of the upcoming book Humor Me, which comes out on January sixth.
Speaker 1 (00:37):
Right.
Speaker 4 (00:37):
You know, I did not pick the date. I did not pick the release date. I said, I wrote a lighthearted book about how to laugh more. And they said, we got the perfect.
Speaker 1 (00:44):
Day. The perfect date in American history. We got, you don't even, don't even ask.
Speaker 4 (00:51):
Here's the thing, nine eleven was taken. We are putting this out on January sixth.
Speaker 5 (00:57):
You got nine eleven.
Speaker 1 (00:58):
January sixth, or December seventh, a little more obscure, but the date that does live in infamy, unless.
Speaker 4 (01:05):
And they'll say that about this book. That's right, it is, it's, and you are predicting that this is going to be the new thing that January sixth is known for.
Speaker 5 (01:14):
Absolutely. I'm taking it over.
Speaker 4 (01:15):
When they say storm the Capitol, it'll be like the Capitol Bookstore, exactly, because I know that was the release date of the book that lives in, yeah, that's humor. You know how we all remember the release dates of our favorite books.
Speaker 5 (01:26):
Yeah, that's right.
Speaker 1 (01:27):
It's available for pre-order now. You also host the How to Be a Better Human podcast and the National Academy of Sciences' live traveling game show Wrong Answers Only. Please welcome back to the show, it's Chris Duffy. Hello, Hello, what a joy you.
Speaker 5 (01:44):
Guys.
Speaker 4 (01:44):
Yeah, I don't speak Scottish yet?
Speaker 5 (01:46):
Did you do you have any Are you able to
push back at all?
Speaker 1 (01:49):
When the publisher goes all right, the release date is
on January sixth.
Speaker 4 (01:54):
Well, they said what do you think about this day?
And I said, that's hilarious, you're not serious, right? Broadly, like, historical thoughts. Well, I was like, that's kind of famous and not for comedy, right. And then they were like, when we said what do you think, we meant, it's gonna come out that day, just so you know. And I was like, all right, here we go, let's do it.
Speaker 5 (02:11):
Great.
Speaker 1 (02:11):
I love to hear you, uh hear you complain about
it before we told you that that was exactly what
it's gonna be.
Speaker 4 (02:17):
Yeah, it was very much the way that they asked,
like what do you think about this? In the way
that I asked my toddler, like do you want to
go home? Now, it's like, well, the answer is yes,
you can say whatever.
Speaker 5 (02:27):
We're going to do that. Yeah, we're gonna go home,
and you may claw my face off in anger, but
yeah we have to go, and we have to go.
Speaker 3 (02:33):
They told you on a call, and they were like, see, I told you you'd be fucking hilarious. This guy's, this guy's a riot. And then there's, like, other people listening in laughing. Anyways, that's the date, Chris. Uh, you're excited to get it out there on, yeah, the new Independence Day.
Speaker 4 (02:49):
That's right, the new Independence Day. And I should say, you know, I know that The Daily Zeitgeist kind of skews politically left. Of course my book is a far right manifesto.
Speaker 5 (02:57):
Yeah, yeah, yeah, that's right. Yeah, we're trying to brand ourselves as a centrist podcast, in case they deem this show like some kind of anti-American propaganda outlet. Oh, we're calling ourselves Republicans.
Speaker 1 (03:09):
Now, yeah, we've, we've actually changed our angle. Bari Weiss is our new editor in chief.
Speaker 4 (03:14):
And long may she reign.
Speaker 5 (03:16):
We can't wait to have Mario's gonna be let me. Yeah.
Speaker 4 (03:20):
Let me also just say I I did think before
I came on this, there's no doubt in my mind
that when I get thrown in the gulag, these episodes
will be a huge piece of the trial. And I'm
thrilled for that.
Speaker 1 (03:30):
Chris, here's the thing. We release so many of them,
even yeah, you know, nobody is gonna know, nobody's gonna
be able to find these things.
Speaker 4 (03:38):
Oh, to me, that's a positive.
Speaker 5 (03:40):
Big enough, We're not big enough.
Speaker 3 (03:42):
Yeah, that's the other thing that we figured out, is that nobody, nobody thinks, you know, Trump's not listening.
Speaker 4 (03:49):
I love that being your, your excuse in front of the military tribunal. You're like, it really is not as popular as you're making it sound. I mean, look at these download numbers, Your Honor. It's just like, you're basically about to shoot us for, like, a meeting with a few friends that we had quietly.
Speaker 5 (04:03):
This is so fucking unfair.
Speaker 4 (04:05):
It was like an effective ad sales machine only for, like, certain brands and companies. Okay, please. Oh no, dude, there were, there were Black Rifle Coffee ads apparently running on this show. That's got to count for something. That actually does, that absolutely does count.
Speaker 6 (04:17):
They're like, oh, actually, wait a second, maybe that's the filter they use. Any show that had Black Rifle Coffee ads, you're safe. And through that error, they're like, and this is, yeah.
Speaker 4 (04:29):
It is an incredibly, uh, you know, incredibly damning indictment of the economy when you see, like, what the podcast ads are at that particular moment. Like, a couple of years ago, it was like, we're sending people mattresses and physical products. And then recently it's been like, would you be willing to advertise, like, an injectable brain serum that you stick straight into your skull? And I'm like, I'm probably not.
Speaker 5 (04:51):
It's powered by AI AI powered injectable brain serum.
Speaker 3 (04:57):
Go on, what is some thing from your search history
that's revealing about who you are?
Speaker 7 (05:04):
Well?
Speaker 8 (05:05):
I was trying to remember what animal or species evolved
before sharks, because I know that sharks evolved like a
shit ton long time ago, like millions of years, right, yeah,
it was like hundreds of millions of years if I
recall correctly, And so I was like, wait, what came
before that. And then I wound up on what I
think is an AI website. It's called oldest
(05:26):
dot org, and so I was checking that out. It's great, they've got a list. Some of them look like they could be right: horseshoe crab, jellyfish, over five hundred million years; elephant shark, it says four hundred million, but I don't know. And so I was like, ooh, I'm
going to do some more research and see if this
is an AI website or not. And I think that
(05:48):
it is, because the, the animal that you can see, the oldest animal you can see, you know, that's visible to the naked eye on this website, so, like, something that came after bacteria evolved, are ctenophores. I think I pronounced that right. Comb jellies. And oldest dot org claims that comb jellies
(06:11):
experience about half of the same diseases as humans do, and I was like, I don't think that's right. They get a little cold? Like, wait a second. So then
I had to do a deep dive into that do
comb jellies have the same diseases as humans? And Google
AI tells me, no, I couldn't find the research that
(06:32):
oldest dot org looked up or cited. They're like, the NIH is doing research on comb jellies because they're trying to figure out how to, like, solve disease. And I
was like, again, I can't find this NIH paper that
they failed to link to. So anyway, I'm just trying
to figure out, like what what animals evolved first?
Speaker 4 (06:48):
Yeah, I don't remember.
Speaker 1 (06:50):
It's a shame that you end up on some like
AI slop website that's clearly just shitting out listicles. I'm
looking at oldest dot org here and I'm like, oh
my god.
Speaker 8 (06:59):
What is—and it's a little bit like the Internet circa nineteen ninety-eight, like the images. And so it's weird, because it's like AI slop plus, like, old-timey Internet. And I'm just like, I don't know what to make of the old and new just making shit up on the internet, the way the internet does.
Speaker 5 (07:19):
I like how my cursor changes into a sparkly wand when I hover over the links, like an old GeoCities website of old.
Speaker 3 (07:29):
Horseshoe crabs are the most ancient looking thing. Like, when you look at those, you're like, oh, this
Speaker 7 (07:34):
Is like out of old motherfuckers.
Speaker 3 (07:36):
Yeah, this is, this couldn't look any more like it was a first draft that is just, like, kind of still around.
Speaker 1 (07:42):
Yeah.
Speaker 5 (07:43):
Oh yeah that thing.
Speaker 1 (07:44):
Yeah, you flip that thing over. It's like a fucking
horror movie in there, man.
Speaker 8 (07:49):
Yea.
Speaker 1 (07:52):
They always get flipped over on the Jersey Shore, on the beach at Ocean City, and it's just fucking gnarly under there, because they live for a long time and they get, like, barnacles and all these, like, other things growing on them.
Speaker 5 (08:04):
Can you eat them?
Speaker 1 (08:06):
Horseshoe crabs?
Speaker 3 (08:07):
Yeah, not that I've ever heard of. They're mainly like
plates and hard things. I don't think you'd want to
eat a horseshoe crab.
Speaker 1 (08:15):
The one place I'm seeing a horseshoe crab thing is Bush Guide one oh one. So it sounds like you're probably not choosing to eat a horseshoe crab.
Speaker 7 (08:24):
Sure, Okay.
Speaker 1 (08:25):
Also, you're probably not on Bush Guide looking for this information. I think you got here going for something else. I'm on Bush Guide trying to figure out what's going on with these crabs in my
Speaker 8 (08:37):
Bush. Really, desert island, Reddit and Wikipedia, like, that's the only thing that I, like, go to.
Speaker 1 (08:45):
There's just an r slash Revolution or r slash evolution from five years ago. It has all the information that you need.
Speaker 3 (08:52):
And then AI is just remixing that shit over and over again. It looks like a website from two thousand and eight, because that is what they're remixing.
Speaker 5 (09:01):
They're just chopping it up.
Speaker 1 (09:03):
Oh no, you can eat them in Thailand. Okay, I'm sure you could. I need to eat a Thai horseshoe crab before I eat a New Jersey, like a Jersey Shore
Speaker 5 (09:10):
One, that's for sure.
Speaker 1 (09:11):
Yeah. Yeah, the Jersey Shore ones all have like drug
problems and shit.
Speaker 5 (09:15):
Yeah, like, those are rubbery, they're high.
Speaker 7 (09:17):
Yeah, they're high on cocaine that gets in the water.
Speaker 3 (09:20):
That's right.
Speaker 1 (09:21):
But, like, I saw one that had washed up, and I, like, flipped it back over, and it just pulled a reverse, like it was just, like, looking at me and just, like, reversed into the ocean, like the Homer bush GIF.
Speaker 5 (09:33):
Yeah yeah, yeah, it was kind of cool.
Speaker 1 (09:35):
WHOA, what's something you think is underrated?
Speaker 9 (09:39):
Okay, as an adult, I'm trying to learn a new language, which, you know, is something that I feel like you kind of stop trying to do after you take, you know, a second language in high school. Like, I learned Spanish and French in high school and college, and I kept up with those a little bit, but not too much. But I just, I like Japan a lot. I've traveled to Japan many times. I think I mentioned
(10:00):
that on the podcast before. And I'm trying to learn Japanese, and it's really, really, really hard. But I just feel like I've always been interested in learning another language, and I just have been like, oh,
it's too hard, and yeah, it obviously is, but I'm
just like, you know what, I'm gonna do it. So
I'm working on it every day and it's been really fun.
Speaker 1 (10:17):
Miles, you recently started taking lessons on Duolingo, right, for Japanese.
Speaker 5 (10:21):
Yeah, yeah, yeah, I'm doing, you know, gohan, and, you know, mizu and stuff, o-mizu.
Speaker 10 (10:27):
No, no, no, does she may taboo? She's so did
she go to Jimbi study to get in all that?
Speaker 5 (10:37):
Some of that you're learning?
Speaker 4 (10:39):
That's awesome?
Speaker 5 (10:39):
How long?
Speaker 1 (10:40):
Yeah?
Speaker 5 (10:40):
Yeah, the Duolingo. Okay, I'm forty-one now, so about forty-one years since I was born.
Speaker 9 (10:47):
Your mother, you're, you're, your mother's native Japanese, right? Yeah, yeah, the kid, yeah, yeah.
Speaker 5 (10:53):
Yeah, yeah, yeah, lifelong. Yeah, yeah. You were born here, you were born in America, but you learned Japanese growing up, since you were a baby, you know what I mean? My mom, yeah. My dad also, his family was also brought over here many centuries ago on boats, I believe, for a time share tour that went terribly wrong, from Africa, sixteen hundred.
Speaker 1 (11:17):
Just a three-hour tour.
Speaker 9 (11:19):
Yeah, yeah, yeah, but got suckered into it.
Speaker 5 (11:22):
No, Japanese is great. Yeah. Anyway, I'm like, you gotta keep it up. Easy, yeah. I mean, because I have a, I have a kid now, so I'm speaking Japanese to my son all the time, because
Speaker 1 (11:33):
Your kid's learning, like, being, yeah, yeah, yeah, he speaks Japanese. Because, you know, like, with my mom, I was always, I was talking about this with some of my family, or my in-laws, the other day. It was like, all my mom's friends are Japanese immigrants, so when I would go hang out with my mom's friends, I was always around just Japanese being spoken all the time.
Speaker 5 (11:50):
With my son. I'm really my mom and I are
really the only sort of like inputs for Japanese. So
like I have to try and really speak a lot
of Japanese and like also try and get him to
watch stuff in Japanese. That's so cool and I'm totally
thankful for that.
Speaker 1 (12:03):
I definitely identify with the learning another language thing, because, like, I have, like, LA Spanish, where it's like I can get by, but I would love to, like, fully communicate in Spanish, and I'm always doing the same thing. Like, I don't know, dude, it's probably so hard, I know.
Speaker 9 (12:17):
It seems so hard, it seems so overwhelming, but it's
like you just have to start and then like make
sure you do it at least every day exactly.
Speaker 5 (12:24):
Have you gotten to Japan with your kid?
Speaker 11 (12:25):
Yeah?
Speaker 1 (12:26):
Yeah, yeah, yeah. One of the first times we were able to travel, I went there. And now that he's, like, walking and talking and, like, is able to have memories, I'm like,
Speaker 5 (12:35):
Oh, we got to get you back there too. That's awesome. Yeah,
it's a great place.
Speaker 1 (12:39):
He's online, He's online. Everybody he came online. We gotta
get him over there.
Speaker 9 (12:44):
He tapped into the world, and immediately, mm hmm. But yeah, learning a language, it's, it's really overwhelming, but I think it's, I don't know, it feels like a good use of
Speaker 5 (12:54):
time and your brain. Yeah, in your brain, for sure.
Speaker 1 (12:56):
Yeah, I guess, undeniably great. What's something you think is overrated? The movie Shawshank Redemption. Oh, oh, oh.
Speaker 5 (13:13):
Tax like bringing up all kinds of supporting documents. I think,
right now, you take it. Okay, what's your opening gambit here?
Speaker 1 (13:21):
Ask? First of all, I.
Speaker 12 (13:24):
Like any any bit that is twenty years old, twenty
years too late. I think that's a powerful move.
Speaker 13 (13:30):
But look, man, okay, it's a well-made film, sure, and also, of course, it does kind of feed into the white male oppression fantasy that's so popular these days. But what's hilarious about it to me is that there's that scene with Morgan Freeman, where he, Andy Dufresne, goes and he wants to get the Rita Hayworth poster, and it's in the cinema, you know, and it's supposed to
(13:51):
be this, like, very heartwarming exchange between two old friends. But clearly Morgan Freeman has to think that Tim Robbins' character is going to jack off to that poster, right, you
Speaker 1 (14:03):
Know what.
Speaker 3 (14:05):
They didn't, they didn't show that part. But the parts where Andy Dufresne is, like, walking out and disposing of dirt through his pant leg, like, that dirt is also accompanied by a great deal of come.
Speaker 12 (14:17):
Yeah, so the voiceover should be like, and Andy Dufresne loved to masturbate. He couldn't get enough. After ten years, he needed a new poster.
Speaker 1 (14:26):
And he came to me. That was our guy, just constantly jacking off. All right. Uh, well, Shawshank Redemption, I'm fully sold on that. Are you a Green Mile fan, by any chance?
By any chance?
Speaker 5 (14:40):
I started?
Speaker 13 (14:41):
I've never seen it. I started to watch it, Yeah,
I do like.
Speaker 1 (14:45):
I do like Shawshank. And I was like, I don't know this, I've only seen the scenes where Tom Hanks has a hard time taking a piss.
Speaker 5 (14:52):
That's no Jack.
Speaker 1 (14:55):
That's fodder for a broader thesis.
Speaker 5 (14:56):
That you have a looping video clip that you watched
for two hours and say I'm watching Green Mile.
Speaker 1 (15:01):
There's so many reasons. One, I like Shawshank Redemption. Two.
Speaker 3 (15:05):
Green Mile is the climax of Tom Hanks' arc throughout
his career of like having to pee, like peeing being
a key part of his character work, and then that
whole movie. His entire character's point is that he has
hard time, a hard time peeing and then gets his
(15:26):
penis cleared by the hands of Jesus, the healing hands
of Jesus.
Speaker 5 (15:31):
Yeah.
Speaker 12 (15:32):
JC, that's the only cult I would join, the one
that teaches that.
Speaker 1 (15:36):
Funny though, too, because, like you were saying, it's like Shawshank is kind of, like, for, like, white guys. Because I remember the first time I saw it was at a kid's sleepover, when they were like, you've never seen Shawshank Redemption? Yeah. And I was like, oh no, bro. And then I was like, okay, fine.
Speaker 5 (15:52):
But then I know so many people who like they
they ride for it. I mean, look, everyone has a
movie that speaks to them. I just didn't didn't realize
aspects of.
Speaker 13 (16:01):
It where they like learn the you know, they kind
of learn empathy and that's nice. But then there's a
there is an overarching theme that feels like a couple
of things, which is, hey, you know that like boring,
mediocre white guy that you know, he's actually the greatest
hero that's ever existed.
Speaker 1 (16:16):
And then also like.
Speaker 12 (16:17):
The one time a white guy went to jail, for
a crime he didn't commit.
Speaker 1 (16:21):
Becomes this like big tale.
Speaker 13 (16:27):
To imagine this, like, white businessman who we're supposed to believe is this holy innocent.
Speaker 1 (16:33):
But it's still it's watchable, you know.
Speaker 3 (16:35):
Yeah, all right, we are going to take a quick
break and we'll be right back.
Speaker 5 (16:48):
And we're back. We're back back.
Speaker 3 (16:51):
I do just want to say, not that it's like holier than thou, because, I, so, no blue slushies.
Speaker 1 (16:58):
Yeah, I think we actually have done slushies. I do let them vape, but it's like, they can't do any of the sweet, but they can't inhale. They can't, they can't do the sweet flavors, it's just pure tobacco flavors, because, so, they, you know, otherwise they might get addicted.
Speaker 7 (17:16):
Makes a quiet taste.
Speaker 3 (17:17):
That's right. Tori, any, any update on the rapture? I know we've, we've been waiting breathlessly on the edge of our seats to see if there would be some sort of massive event that would make humanity on earth, like, forty percent cooler. Yeah, with the, with the departure of all
(17:37):
Christians rapturable Christians, yes, yeah, rapture.
Speaker 8 (17:41):
Yeah, eligible Christians, because there's an asterisk.
Speaker 3 (17:46):
Yeah, how's the how's the on again, off again rapture
experience been for you?
Speaker 8 (17:51):
You know, I am not someone who has rapture anxiety,
but I did as a child for sure.
Speaker 5 (17:57):
Like you know, I saw you post.
Speaker 1 (17:58):
I mean, like, I think most people may or may not know you have an evangelical background. But when I saw, the other day, you post something about your mom and the rapture, I was like, I'm going to bring this up when Tori's on the show again.
Speaker 8 (18:11):
Yeah, okay, So there was a rapture warning in May
of twenty eleven and.
Speaker 5 (18:17):
Some guy warning.
Speaker 8 (18:19):
Yeah, some guy was like, I got the date. I
did the calculations, here's the math. May I think it was.
Speaker 7 (18:25):
The eleventh or twelfth of twenty eleven.
Speaker 8 (18:29):
And you know, my parents are big rapture fans, big rapture watchers. They, when I was a child, owned a book called Eighty-Nine Reasons Jesus Is Coming Back in Nineteen Eighty-Nine. You can look it up on eBay.
Speaker 1 (18:43):
Damn, there's a call in their show. Oh yes, that
is such a bad idea. Yeah, like, just a brief, like, I am not, I do not study religious history, but the one thing I know is, like, when you're starting a branch of religion, that's the one surefire way
to have people just like give your shit a stamped
(19:05):
on expiration date.
Speaker 8 (19:07):
Huh yeah, absolutely. So that's how I was raised, right,
was with rapture watch pretty much all the time. Jesus
could come back at any time. I was told, for
very clearly in retrospect antisemitic reasons, Jesus is going to
come back during a Jewish holiday because Jesus is Jewish
and he just wanted to make it all about him.
Speaker 7 (19:24):
Apparently he's also a big dick. And so.
Speaker 8 (19:29):
Yeah, there was a lot of chatter about that. And
then twenty eleven you know, rolled around, and I, you know,
at that point, thankfully, was a lot more skeptical. And
there were billboards. I don't know if you remember this,
but like Portland and Seattle at least had billboards that
were like, the rapture's happening May twenty eleven. Give your
life to Jesus or you'll get left behind, You'll burn
(19:51):
in hell forever. And so my mom decided, in the group chat with all of us kids, she was like, hey, so the rapture's happening tomorrow. Here's my bank account and mortgage information. See you never. Apparently, so, she knew.
Speaker 7 (20:13):
And we were not going to make it. We weren't
going to We weren't.
Speaker 8 (20:15):
And again, I think most of us identified as Christians at this point, so it was like, it was like the asterisk, like, you've got to read the fine print of, like, you know, the Catholics are not going to make it, like, let's just be real clear about that, right? Oh yeah, no Catholics, no Mormons.
Speaker 1 (20:31):
That's, that's what really struck me about, like, a lot of it when I was kind of reading your posts about it, right? Because, like, the funny thing, like, I went to Lutheran K through eight and Catholic high school, and, like, the Lutherans weren't talking about, you know, Revelations and the shit they're into.
Speaker 5 (20:45):
But when I got to Catholic high school, then I heard it, and I was like, the fuck y'all talking about? That's when I was like, sir, what? Uh.
Speaker 1 (20:53):
And, like, the version that, I feel like, for people who are sort of outside of Christianity, that is, like, this very goofy thing. Like, we've played this video before, and I'll play it for you since you're here, Tori. Just, like, stuff like this depiction of the rapture.
Speaker 5 (21:06):
It's like boom, hold on, bro.
Speaker 1 (21:08):
Don't scroll.
Speaker 3 (21:08):
This is the, like, we just, to describe it to people who weren't listening last time. Yeah, it's, it's like Sims level. It's like the very first version of Grand Theft Auto, you know, that level of character animation. Guy walking down
the street and then he's just narrating like all these
(21:31):
things that are happening in this video game reality, acting as if he's narrating something like a news event.
Speaker 1 (21:39):
Yes, like this is a newsreel. But again, this is the version I think is really funny, that we're always, like, laughing at, because the version is, like, these people just get sucked off, trumpet sound.
Speaker 5 (21:50):
Guys, there's a light, there's Jesus. Boom, Jesus, Jesus returned.
Speaker 2 (21:55):
Boom boom.
Speaker 5 (21:57):
Look at that. The closer gone gone. Wow y'all, wow,
y'all the souls.
Speaker 1 (22:02):
Dead, this baby look at this boom gone.
Speaker 5 (22:06):
Boom gone. You know, like wow, harsh.
Speaker 1 (22:09):
In reading your post, you talk about sort of like
the deeply violent version of how like a lot of
real believers look at the rapture like whereas me from
the outside and being familiar enough with Christianity, it's like, yeah,
you go to heaven. But they're like, oh no, not
just that, the others fucking suffer.
Speaker 5 (22:26):
And dah, and that's what I'm also here for.
Speaker 8 (22:30):
Oh yeah behind, Yeah, if you get yeah, if you
get left behind. The theology kind of varies a little bit,
but yeah, you will suffer. And there's a point, uh, well,
you'll suffer because God decides to pour out all of
his wrath on the earth. So he's like he's like,
i haven't been here for a minute. I'm taking all
my shit out on you guys.
Speaker 5 (22:51):
Then we were told nine eleven was because of gay marriage.
Speaker 8 (22:54):
Oh, okay, no, that's what it was, right. But, like, that's just, like, a little, you know, it's like little tremors before, like, the real big one hits, right. And so, yeah, like, all kinds of, like, diseases, plagues, famine, all sorts of shit going down. And this is for everybody who gets left on the earth after all of the people get, like, yeeted into heaven, the deserving people,
(23:16):
so you know, like Donald Trump, those types, and so
it's very Charlie Kirk.
Speaker 3 (23:22):
I've just been hearing about this heaven stuff, and I
think it's very important.
Speaker 1 (23:28):
Very good. He's obsessed with it, because there's no reason to be good, they say. Really, that's the only reason to not be bad. Yeah, no, and
Speaker 8 (23:43):
So anyway, yeah, everyone's just gonna be like having a
real bad time, and then it's gonna get even worse,
so that like you can't you won't be able to
die even though you're suffering. So people will be trying
to end their own lives because they're suffering so much
of all the plagues and the famine and the war
and the demon, the horse headed demon, locusts and like
(24:04):
you know what and everything else. And then people are
going to try to like take their own lives and
they won't be able to. So it's just gonna be
like zombie land. I guess people are just like real
fucked up just wandering around.
Speaker 5 (24:17):
It sounds like being a vampire. It's like, wait, I'm invincible.
Speaker 7 (24:20):
It's super well, except you experience a lot of pain.
Speaker 5 (24:23):
You're so, I don't know, that's a relative term.
Speaker 7 (24:26):
That's fair.
Speaker 5 (24:27):
Like, I'm like the one guy. They're like, aren't you suffering? I'm like, no, it's cool, man. I grew
Speaker 1 (24:30):
up in the, they specifically grafted one of these things just playing directly into Miles's pain and invincibility kink. I did. My one question is, what is the mood like after May, like June twenty eleven, the
(24:53):
next time you see your mom, Like what what is
her mood? Like?
Speaker 5 (24:58):
What?
Speaker 3 (24:58):
What, what can we expect from our fellow, the boom guy? Like, how's he feeling
Speaker 14 (25:06):
To go?
Speaker 3 (25:06):
Now the rapture has not happened? Do they just move
on to the next one?
Speaker 5 (25:13):
Love this?
Speaker 1 (25:14):
Yeah?
Speaker 8 (25:14):
No, I love this question honestly, And it's really interesting
because the failed rapture predictions like tend.
Speaker 7 (25:23):
To make people believe it more.
Speaker 1 (25:25):
Yeah right.
Speaker 8 (25:26):
It's sort of like how if you're wearing your lucky
socks and the Cubs lose again, you're gonna be more
likely to wear your lucky socks next time, right than
to go, Oh, this didn't help.
Speaker 3 (25:38):
They weren't dirty enough. Just wear them longer without washing them next time, so
Speaker 8 (25:44):
They're really like not affected by it. But again, I
think that we're talking about a group of people, especially
for like Christian nationalist types that like shame doesn't really
factor in for them in a weird way, right, you
don't really make them feel shame.
Speaker 1 (25:59):
Yeah, and that's, I mean, it factors in. Like, I wonder, I feel like there's, yeah, the, what they're
Speaker 3 (26:05):
Giving the rest of us is going to be like
I don't care, you know, right, like try and break
them, the Ray J clip, where, try and break them, speed it up. Okay, breaks
Speaker 5 (26:15):
Them, I don't care.
Speaker 1 (26:16):
I don't like they they immediately like switch to that.
Speaker 3 (26:19):
But like there has to be something underneath where because
it does seem like the shame is a currency that
operates in that belief system, right, So, but I guess
it is like never feeling shame about what you believe
or that you believe, right, Like that's the one thing.
Speaker 5 (26:36):
That you have.
Speaker 8 (26:37):
Well, I think it puts them further into the fold, right,
it makes them circle the wagons harder, you know, so
they're I mean, if they're feeling shame, they're more likely
to cut off people who they who don't think the
rapture is going to happen than they are to stop
believing that the rapture is going to happen. Sure, And
so it is it is really weird and it becomes
(26:58):
it's not like a self fulfilling prophecy. But it becomes
this weird dynamic of like them. And you can read
about this going back to the eighteen hundreds, that people
would predict the rapture, everybody would gather on the hilltop, yep, yep, and then it was like, okay, nothing happened,
and then like entire churches and movements were born this
way because of a failed prediction.
Speaker 3 (27:19):
Yeah, right. A guy predicted, like, you know, predicted a rapture that didn't happen, and one hundred and fifty years later we have the Branch Davidians.
Speaker 1 (27:29):
You know, Yeah, that was where.
Speaker 8 (27:30):
We literally just literally that's what happened. So uh not great,
not great, great.
Speaker 1 (27:37):
I think we'll be fine.
Speaker 5 (27:39):
Okay, boom.
Speaker 8 (27:42):
Everyone would be happier if the evangelicals again got sucked
off into heaven, everyone would be happier. They don't want
to be here.
Speaker 1 (27:50):
It sounds like, we don't want to. Boom, boom, gone. Speaking of Christian nationalists and getting sucked off, Jesse Watters is either fucking Stephen Miller's wife, or wants to fuck Stephen Miller, or, I don't know, there's something, something is going on here. I can't tell. Like, it seems like he
(28:11):
is either genuinely attracted to Stephen Miller, or, like, he's mocking him, I really think, like, yeah, or putting
Speaker 5 (28:18):
Him down to make him look better to Katie Miller,
his wife. Because again we were talking about this maybe yesterday,
the day before. Just the timeline of how Jesse Watters has been talking about Stephen Miller. Right, the first one was the sexual matador quote, that was about two weeks ago, when he had Katie Miller, Stephen Miller's wife,
(28:39):
on. And again, this can be seen as, like, a joke, but just listen. This is Stephen Miller's wife on Jesse Watters' show.
Speaker 15 (28:48):
You are married to Stephen Miller.
Speaker 1 (28:50):
So you are the envy of all women? What is
that like?
Speaker 5 (28:54):
A sexual matador?
Speaker 1 (28:55):
Right?
Speaker 15 (28:56):
What is it like being married to such a sexual matador?
Speaker 5 (28:59):
He?
Speaker 3 (29:00):
Now, again, I think he had called him a sexual matador before on his show or something, like, jokingly. So she's referencing it, she's repeating it, but like, it's not a thing I would love my wife to
Speaker 1 (29:17):
Be like doing on national TV, is like mocking somebody. And granted, I have the sexual charisma of an eighties TV sitcom dad, so I'm not like, you better say that I'm cool.
Speaker 5 (29:31):
But they didn't call him Alan Thicke for no reason.
Speaker 1 (29:33):
Yeah, thank you. But it just feels like there's a real energy between these two, and they just still laugh, even though, like, right, there's one version, if you're reading this like a salacious soap opera, that they laugh about Stephen Miller as they are canoodling, right. But then there's a version where he's just like, I
(29:55):
don't know. Like, the way he laughs when she says the sexual matador thing, right, and he goes like, ha, well, I'm like, is it because you don't... what's the point of this? Anyway, he was back at this, like, weird, uh, Stephen Miller's-too-sexy-for-this-earth sort of bit that he's been doing, like, right, said for it a little bit, yeah, a little bit, if
(30:16):
he was at his best on his best day, and, like, in, you know, if he was the most.
Speaker 5 (30:22):
Yeah, preserved in formaldehyde for twenty-eight years, maybe. So Tuesday night, Waters is on The Five, and they bring up that clip where AOC was talking about Stephen Miller and how he's, like, short and blah blah blah, and how Stephen Miller was like, oh, she's a train wreck, right. But I just want to play... like, Stephen Miller, or Jesse Waters, this whole
(30:43):
thing, he doesn't let go of this thing that Stephen Miller's so hot, and I'm just having such trouble wrapping my head around what's subconsciously going on, or just in his overt consciousness. But here he is talking about AOC and what she doesn't get about Stephen Miller.
Speaker 7 (31:01):
Sulted by AOC.
Speaker 15 (31:03):
No, I think she wants to sleep with Miller.
Speaker 1 (31:05):
It is so obvious, and I'm sorry you.
Speaker 5 (31:08):
Can't have him.
Speaker 1 (31:09):
Miller.
Speaker 15 (31:11):
Miller is the best I know him well socially, and
the man is not overcompensating, Dana. I know when people
are overcompensating.
Speaker 14 (31:21):
I know people at this table who are overcompensating.
Speaker 5 (31:25):
That person is me, ha ha ha. Okay. So he goes on. He starts talking about, like, this guy... He's like, here's the deal: you got to understand something, let me mansplain something to you, AOC, about Stephen Miller. And he lays out his case even further as to why he's so hot.
Speaker 1 (31:44):
The United States.
Speaker 5 (31:45):
This is what AOC doesn't get about men.
Speaker 14 (31:47):
Miller is a high value man because he has power
and influence, because he has vision, and he's on a
mission to save this republic and protect Western civilization.
Speaker 2 (32:00):
Uh.
Speaker 4 (32:01):
He speaks with confidence.
Speaker 5 (32:02):
He's saying that, like, with the verve of a true white nationalist, you know what I mean. He's, like, mission-oriented; he's trying to save Western civilization.
Speaker 1 (32:11):
He's also saying it with a straight face on Fox
News as people are audibly like belly laughing in the background.
Speaker 8 (32:19):
With the side by side of Stephen Miller, who is
just subjectively not hot.
Speaker 1 (32:24):
Yeah yeah, yeah, yeah yeah.
Speaker 8 (32:26):
Like, this is weird, because it's actually making Jesse Waters look way better than he normally does, because you've got Stephen Miller's face next to him, and Stephen Miller is a scary-looking dude.
Speaker 3 (32:40):
Yeah, like, transparently so. And, like, the fact that the Fox News hosts are laughing at him being like, he's a high value male, he's super, he speaks with confidence.
Speaker 1 (32:51):
No, he does not. Yeah, and, like, so they think, come on, man, Stephen Miller sucks. Like, what's the laughter covering? Is it coming from the discomfort that he's talking about how attractive a man is? I don't know.
Speaker 15 (33:02):
Again, he continues. Men who are high value men like Stephen Miller take risks. They're brave, they're unafraid, they're confident, and they're on a mission, and they have younger wives with beautiful children. I think that just gave him like a dating recommendation.
Speaker 4 (33:18):
I don't know, man, that was pretty creepy.
Speaker 5 (33:22):
He lost Gutfeld there. I don't know, man, that was pretty creepy.
Speaker 1 (33:29):
And they did not like that man afterwards, like they're like,
all right, enough, Jesse. Then they asked, you know, the
liberal Jessica tarlav like for her take on everything that's
going on. And then Jesse Waters interrupts again to just
bring up this thing about Steve. Like everyone's like, okay, fine,
your dumb bit about how hot Stephen Miller is, like
is over.
Speaker 5 (33:49):
But he comes back around to it. If you can't see...
Speaker 15 (33:53):
The sexual chemistry that's oozing from Stephen Miller's beautiful face, then you don't get
Speaker 7 (33:59):
It from you about Stephen Miller.
Speaker 3 (34:02):
I think that that's where, when he says his beautiful face, like, that's where it makes it clear to me that he is trying... He is doing a weird, like, cucking thing to Stephen Miller, but, like, he can't get in trouble because he's saying it with as straight a face as he possibly can.
Speaker 1 (34:21):
But like, okay, if you if you want to be
like he's the leader of our.
Speaker 3 (34:26):
Party, and, like, that makes him attractive, like, fine. But if you're saying that his face objectively is oozing sexual charisma, like, that is... no, nobody is standing behind that without, like, at least a heaping teaspoon of irony.
Speaker 11 (34:47):
So then you're just taking open shots. Like, what's the point of this? If that's what's driving the irony of a statement like that, it's like, well, I'm saying that because clearly he's hideous, right? Ha.
Speaker 7 (35:01):
I have a theory though.
Speaker 8 (35:03):
This is so interesting, because y'all remember when, like, Elon showed up in the White House with the black eye, and everyone was like, oh, it's because he's fucking Stephen Miller's wife, right, right, right. Which was a great... I really liked that rumor.
Speaker 7 (35:15):
Person.
Speaker 5 (35:15):
Yeah, I love it.
Speaker 7 (35:16):
I think that it.
Speaker 8 (35:17):
Makes a lot of sense to me that Stephen Miller and his wife would be open, because he is a hideous demon, and, you know, she's conventionally attractive. I think she's also not that young. I think she's, like, what, thirty-six or something.
Speaker 7 (35:33):
So I don't know what. I don't know what he was.
I don't know what Jesse was going on about.
Speaker 1 (35:37):
I think he was saying to her, like, you're young, you're beautiful, you have beautiful children.
Speaker 8 (35:42):
But this is... it is kind of a weird, a weird cuck dynamic though, for sure. And I think that, like, maybe there was a trade-off: you can sleep with my wife if you tell them how powerful and sexy and attractive and compelling I am as a human being, right? Like, I could see that being a fair trade to Stephen.
Speaker 3 (36:00):
Yeah, because, I mean, that analysis is in the context of: we saw Jesse Waters talking to Stephen Miller's wife on his show, and she referenced him calling Miller a sexual matador, and they both burst into laughter.
Speaker 1 (36:15):
Yeah, that's some cuck shit.
Speaker 8 (36:18):
Also, I would just like to say this is not
appropriate for children, So like, why is Fox News sexualizing
children by.
Speaker 1 (36:23):
Talking about this? Thank you?
Speaker 8 (36:26):
They should be taken off the air because they're sexualizing children.
Speaker 5 (36:30):
So we don't have to we're not we don't have
we're not abiding to any kind of SCC regulations, so
we can say whatever we want, including just outright lies
in mister information.
Speaker 1 (36:37):
Anyway, he's a sexual matador and he's the hottest human
being in.
Speaker 3 (36:41):
The sexual charisma oozing from his face. There's something oozing from.
Speaker 1 (36:49):
Okay, it's just... he went Winnie the Pooh on the fucking jar of mayonnaise.
Speaker 5 (36:57):
Okay, put his whole head in there.
Speaker 1 (37:00):
Fit.
Speaker 3 (37:01):
He has a great head for sticking it inside a jar of mayonnaise.
Speaker 12 (37:05):
Yeah.
Speaker 8 (37:05):
Yeah, yeah, it's like the perfect shape. Yeah, that's a
great point.
Speaker 5 (37:09):
A quick break.
Speaker 1 (37:10):
We'll come back and we'll talk about how the right
is winning the culture wars.
Speaker 5 (37:15):
We'll be right back.
Speaker 1 (37:27):
And we're back. We're back, we're back. And yeah, just a little check-in with the kids these days. This is like where the AI stuff freaks me out, is when it's just like, yeah, everybody's using it constantly in school, and most people are like, it's my best friend.
Speaker 5 (37:46):
Yeah.
Speaker 1 (37:47):
Yeah.
Speaker 5 (37:48):
This new survey just came out. It said nearly one in five high schoolers say they or someone they know has had a romantic relationship with AI. Forty-two percent of the students in the survey say they or someone they know have used AI for companionship. One in five seems like too much.
Speaker 9 (38:07):
I mean, back in my day, your girlfriend was just a forty-year-old guy on AIM you met. You know, that's the way to do it, you know, not this AI bullshit, exactly.
Speaker 5 (38:17):
Or you just use your brain to make up somebody who went to school in Canada or Texas who you will never be able to meet, but you definitely
Speaker 9 (38:24):
Did stuff over the summer. So what is it? Is it that they're asking the chatbot for, like, help with stuff? And then how does it get... like, who takes it to the level where it becomes romantic or flirty or whatever?
Speaker 5 (38:38):
Is it the kid? Is it?
Speaker 9 (38:39):
Is it the chatbot saying like, oh, you're so smart,
like you must be handsome too, like what is it
that's doing that?
Speaker 1 (38:46):
You know, it's like a flattery machine like that. Yeah, I was just reading an article about its use in, like, medical diagnostics, and it does come up with the right answer some of the time, but, like, sometimes the doctor will ask it to come up with a diagnosis, and if it
(39:06):
doesn't have enough evidence, it will just, like, make up evidence. It's just a yes-and and flattery machine. Yeah, where it's just like, it has a friendly vibe, and it will do whatever it can to kind of keep the ball in the air, right?
Speaker 5 (39:21):
And, all right, I've consulted my AI system, and it looks like you are being attacked with spiritual attacks. Okay, it has nothing to do with your addiction. The addiction is actually what gives you power, yeah, according to this. It says... am I right? Yeah. But yeah, then it goes on to say that there's a connection between a school's AI use and a lot of these
(39:43):
other outcomes. It says, quote:
Speaker 1 (39:44):
The more ways that a student reports that their school uses AI, the more likely they are to report things like, I know someone who considers AI to be a friend, or, I know someone who considers AI to be a romantic partner, because it's being
Speaker 5 (39:56):
Normalized in the school. And they also said that, like, schools that are using AI more frequently are more susceptible to data leaks, because you're giving it all kinds of information, and it's just opening it up for any kind of data leak exposure.
Speaker 1 (40:09):
And it also apparently correlates with... like, there's a correlation with increased use of AI-manipulated-slash-generated images and videos to sexually harass and bully other students. And there's also another part: like, when a school has devices they own to let students use, like a computer or something, they have AI tracking software on
(40:31):
it to see how kids are using it, and that monitoring software has led to false alarms, or even, in the worst cases, arrests based on AI hallucinations, and you're like, what the fuck is this? Yeah, so it seems like it's all tied together and
(40:51):
makes things worse. And then apparently there's another part of it too, about students and educators.
Speaker 5 (40:57):
This is from the NPR article, quote: Educators who frequently use AI were more likely to say that technology improves their teaching and saves them time, but students in schools where AI use is prevalent reported higher levels of concern about the technology, including that it makes them feel less connected to their teachers.
Speaker 3 (41:13):
Yeah, it's like a really powerful tool that even doctors don't quite know how to use yet without having it, like, make shit up and maybe, you know, kill the patient. And yeah, it makes everything seem easier, feel more seamless, and it sometimes fucks things up catastrophically,
(41:35):
like, people use it to diagnose themselves. And, like, one person in this article I'll link off to, in The New Yorker, was talking about how they were trying to get on a lower-sodium diet, and the AI recommended switching out salt for, like, a different chemical compound that is poisonous.
Speaker 5 (41:53):
Oh my god, I had to go.
Speaker 3 (41:54):
They almost died. They almost, like, poisoned themselves, had to go to the emergency room.
Speaker 9 (41:59):
Well, yeah, I mean, I think fundamentally it makes things easier because it's not concerned with being correct, right?
Speaker 1 (42:05):
I mean, keeping you talking to it, totally.
Speaker 9 (42:08):
Things are hard because the answers for problems are sometimes
not very clear. And when its entire goal is to
give clear answers, independent of whether or not they are right,
then it's obviously gonna seem easy. It's like, oh, that's easy,
but it's like, oh, but that's actually not effective.
Speaker 5 (42:25):
Yeah.
Speaker 3 (42:26):
Yeah, to our point about Jordan Peterson, like, I'm pretty sure there was a case where people were having conversations with an AI, being like, I'm trying to get over addiction issues; shouldn't I do, like, a little cocaine to make myself feel better, just to, like, get through this? And I was
Speaker 1 (42:42):
Like, yeah, obviously, it will yes-and you to death. It's like, cocaine makes you... it'll yes you to death.
Speaker 5 (42:50):
Cocaine makes you the person we all knew you could be, dude, exactly.
That's the thing.
Speaker 3 (42:56):
It just takes your pre-existing... like, the thing you want to believe, and will just, like, keep going and going.
Speaker 5 (43:03):
Well, a AI.
Speaker 9 (43:04):
AI is probably the only entity that enjoys talking to
someone who's high on cocaine.
Speaker 1 (43:08):
Yeah, right, it's the only You're awesome.
Speaker 5 (43:11):
You're awesome right now. Everything you're saying is so smart and cool. The AI is like, dude, you're giving me ideas.
Speaker 1 (43:20):
Well, but I also.
Speaker 9 (43:21):
Was reading something, I think maybe it was This American Life or something I listened to, maybe you guys heard it as well, where it was, like, you know, convincing some guy that he's, like, a mathematical genius. Like, it flatters you to the degree that you live in this fantasy world where you're like, oh my god, I'm coming up with theories no one has ever come up with, because the AI is just trying to tell you you're cool and you're good, when in
(43:43):
reality it's all just, like, a fake, weird fantasy world that is preying on people's delusions.
Speaker 5 (43:48):
I think the funniest use of it I've seen is someone using it to do improv, and you
Speaker 9 (43:53):
Know, I know, I know exactly who you're talking about. Yeah, yeah,
really cringe, and it.
Speaker 5 (43:57):
Was just kind of like, wow, this is wild, but
also like, hey, you know, it had some good responses at least, that were quick, kept the scene going. Yeah, I've seen that. It's a dark future there. Yeah, yeah.
Speaker 3 (44:11):
I think the one thing... it seems fun to the people using it. It is kind of like drugs, in that it's fun for the people using it, and the people using it are like, oh my god, this is amazing, and then the actual results are not good. Like, they might be good temporarily, but then it ends up
(44:31):
going... it's not as good as it seems to you in the moment, right.
Speaker 9 (44:37):
It's like taking a picture of what you're seeing when you're on mushrooms; it ain't gonna come out the way that you remembered it.
Speaker 1 (44:43):
You know, look at these thirty pictures I took of
the moon, was.
Speaker 5 (44:47):
Right, Jesus focus, Yeah, I meant that I mean to
do that?
Speaker 1 (44:54):
Or is that a street light?
Speaker 4 (44:55):
I get?
Speaker 3 (44:55):
You can't actually tell. All right, well, everybody, like, I mean, it's being injected into the bloodstream from every angle, and so it's only a matter of time until they start making movies where AI is the hero.
Speaker 1 (45:12):
And by only a matter of time.
Speaker 3 (45:14):
I mean this weekend, Disney is releasing the third installment
in the Tron series, which began back in nineteen eighty two.
Speaker 1 (45:26):
Thirteen years, man. I've
Speaker 5 (45:32):
Been out since Tron two waiting for three.
Speaker 3 (45:35):
They dropped Tron two on us in twenty ten, like, and nobody was asking for it, and the response was like, yeah, I know, that's all right.
Speaker 1 (45:45):
I'm good. No, yeah, yeah, it looks like... yeah, I can see how, like, why you guys... that's pretty good. Yeah, the lights are cool for sure. For sure, the lights are cool. It's like screensaver vibes, like really cool screens. We're just like, I don't know if I want to see a whole movie like that. And they're like,
(46:06):
did you hear that? We should make a third one, Chris. So, uh, so we're getting Tron: Ares, which critics are calling mind-bendingly dull. Ah, fuck. They had us for the first, like, handful of syllables.
Speaker 5 (46:20):
Pretty good. Yeah, the people do.
Speaker 3 (46:23):
They did get Nine Inch Nails to do the soundtrack, and people do seem to like that. So it's getting bad reviews, and those bad reviews are inflated by a good soundtrack. Yeah, which, that's tough. The basic premise is: what if AI was Jared Leto? Except, okay, no,
(46:44):
no, no, but our version, the film executives' version of Jared Leto.
Speaker 9 (46:50):
Okay, Miles, earlier you were saying you can't get enough of Jared Leto.
Speaker 5 (46:53):
Shit you love. I know, I know, I go back and forth on him.
Speaker 9 (46:57):
I'm into basically everything Jared Leto except for the allegations. But everything else, I'm super into. Everything but the allegations. So I don't fuck with you.
Speaker 1 (47:06):
Yeah, I don't fuck with... This is the first time I'm saying this, but separate the art from the artist.
Speaker 10 (47:10):
I do.
Speaker 1 (47:11):
I do demand we do it with Leto. Yeah.
Speaker 3 (47:14):
But yeah. So let's just quickly go through these allegations. He was accused of, quote, predatory, terrifying, and unacceptable behavior towards underage girls. That was all of four months ago.
Speaker 5 (47:29):
Oh fuck, four months? This is another one... I remember, like, a few years.
Speaker 9 (47:35):
Every time it happens, I'm like, I thought this already happened,
and we all agreed he was bad, but it's like it.
Speaker 5 (47:39):
Doesn't really seem to stick. Yeah, right.
Speaker 3 (47:42):
Seems to be a pattern of people alleging that when they were sixteen years old, he would approach them, be like, how old are you? They'd be like, sixteen. And he's thirty-six at the time and is like, cool, perfect, and then would start corresponding with them over email.
Speaker 1 (48:00):
I feel like there's another word for that. Yeah, grooming. Groom. Yeah, I don't know. No, that's the word for, like, people who are getting married, brother.
Speaker 5 (48:07):
I don't know, grooting, and I am grooped, I am grooped,
I am groomed, I am growed.
Speaker 3 (48:12):
I am groom. And, you know, when they turned eighteen, he would try and initiate something. In some cases, sometimes he wouldn't wait for that. One of the people that he just approached on the street when they were sixteen was invited to stay at his house, and he walked out of a room completely naked when
(48:34):
she was seventeen years old. So, just, like, a lot of the many allegations coming out, what's
Speaker 1 (48:41):
Wild is that, like, that same woman said when she turned eighteen, then he was, like, pulling his dick out and masturbating, pulling his dick out, and yeah, yeah. So anyway, he's AI. So also, we should just discuss those Thirty Seconds to Mars fan retreats. I don't know if you've seen the pictures, but they look like stills from Midsommar.
Speaker 5 (49:02):
Oh no, we're just watching young girls and stuff.
Speaker 3 (49:05):
Everybody's in white there. It does seem like it's all kinds of people.
Speaker 5 (49:10):
Yeah, it doesn't necessarily seem to be like limited to
one age group. It's anyone stupid enough to want to
do this.
Speaker 1 (49:17):
He would reportedly hold contests at those retreats in which the prize was literally sleeping in his bed with him.
Speaker 9 (49:25):
Yep, yep, yep, yep, yep, yep. Okay, Thirty Seconds to Eighteen. Yeah, holy shit, that's insane.
Speaker 1 (49:33):
Okay, So let's put all that.
Speaker 5 (49:34):
Aside, Jack. Yeah, aside. I'm trying to be in the role of a Disney executive here.
Speaker 1 (49:39):
Put that.
Speaker 7 (49:40):
Put all that aside.
Speaker 5 (49:41):
Now, what's the movie about.
Speaker 1 (49:43):
The movie is actually pretty cool. The trailer...
Speaker 3 (49:47):
I don't know if you saw the trailer. I did
see the trailer before one battle after another on an
Imax screen and I was like, whoa, those those lights
are cool looking.
Speaker 5 (49:56):
That's how they recommend watching the trailer.
Speaker 3 (49:58):
Yes, the villain is a billionaire tech CEO who wants to three-D print AI super soldiers. The hero is a billionaire tech CEO.
Speaker 1 (50:09):
Okay, so different.
Speaker 3 (50:12):
Billionaire tech CEO, who wants to use AI as a force for good. At one point, the good billionaire tech CEO, played by Greta Lee for some reason, questions: what if the AI's major malfunction is just benevolence?
Speaker 5 (50:29):
Like what about that?
Speaker 3 (50:30):
Though, which goes perfectly in line with all the stories we're hearing about AI.
Speaker 5 (50:34):
Wait, why did you say "for some reason"? Like, what's wrong with her plan?
Speaker 1 (50:38):
I just wish she was... no, I mean, she's great, and I just wish this wasn't the movie that she's having to be in to get the bag. Yeah, you know, if you're her manager, you say, hey, I don't know, maybe... I don't know if this is the one.
Speaker 5 (50:54):
We need.
Speaker 1 (50:56):
It. I wish this wasn't the project.
Speaker 5 (51:00):
But it sucks though too.
Speaker 1 (51:01):
It's like, you have women of color in a movie, but then fucking Jared Leto's in it, and then watch, people aren't gonna go, and then they're gonna blame Greta Lee, and they're not gonna blame, like, Disney for casting Jared Leto in it, you know what I mean? Like, go woke, go broke, rather than, like, we cast a guy of dubious moral character as, like, the lead, or one of.
Speaker 3 (51:22):
Them and they keep trying to make it like what
was the learning from Morbius? People just aren't Morbin', I guess. I don't know. Like.
Speaker 9 (51:34):
The kids just don't say Morbin' like they used to. Man. Okay, so you got the two billionaires. What happens then? So, Jared Leto is like an AI that escapes. It's like, what if AI had a human body? And then, what if AI... You know, the world of Tron is, like, computer programs; it already presupposes computer entities with cognition. So
(51:59):
I don't even know how this adds anything, but.
Speaker 1 (52:02):
The overlay is just like the good guy is an
AI basically yeah, that's like.
Speaker 5 (52:09):
That other... there was that other movie, was it J.Lo? I remember watching it, that Netflix one; that was one of the early ones, when it was about, like, until we embrace AI... like, the whole sort of moral of the story, the arc, was like, until she embraced AI, she couldn't live up to her full, like, world-saving potential, and you're
(52:45):
like, Jesus, get out of here. So this one is just more, just like, see, tech CEOs are good, yeah, and, like, half of them are good. Yes, half are good though. Half are good.
Speaker 3 (52:58):
Yeah, we just need the good tech CEO, you know? Like, do we not like him anymore? This is from the Variety review, which, as we've talked about, Variety was the one that was like, Sinners may look like
Speaker 1 (53:12):
A hit, but not so fast.
Speaker 5 (53:14):
Yeah, not so fast, Ryan Coogler is Black. And wow, they really wrote that shit in there.
Speaker 3 (53:21):
Huh the review of this compared with such a trite
fear of where technology is taking us, the threequel is a refreshing alternative to the kind of anti-innovation hysteria that fuels so many sci-fi movies. What if
AI could actually be a force for good? Or, as ENCOM CEO Eve Kim (Greta Lee) puts it, what if its major malfunction
Speaker 1 (53:40):
Is just benevolence?
Speaker 3 (53:41):
So they're, like, fully on board, Variety, which makes sense, because they're, you know, part of the industry. But this makes sense; it makes sense that they would try and create a work of, like, pro-AI propaganda. Disney's already announced its intention to use generative AI in, quote, upcoming movies and TV, and at one point Tron
(54:02):
Ares was going to include an AI-generated character who would have been Jeff Bridges's sidekick. But then they're like, we had to scrap that plan because we were worried about bad publicity. I guarantee if it was cool and didn't suck shit, they would have included it.
Speaker 1 (54:18):
Yeah, yeah, yeah, yeah, yeah.
Speaker 5 (54:19):
It must have looked like... yeah, don't worry, forget the bad publicity; they're already doing, like, AI apologia, being like, this is actually really good for everyone, this is the way forward.
Speaker 3 (54:29):
That's not a bad look at all. And in case this didn't look bad enough, Elon Musk gave the trailer his endorsement, and then one of his Tesla Optimus AI robots just walked the red carpet at the movie's premiere, and then it pretended to spar with Jared Leto. That, that was
Speaker 5 (54:48):
The thing that actually served him a subpoena for a lawsuit, or not. Just, was it doing kung fu like the last time we saw that Optimus robot?
Speaker 2 (54:56):
Yeah exactly, Oh hell yeah, oh, same fucking choreography that
we saw in that one video, And they're trying to
make it seem like this fucking thing was real. Let me see here.
Speaker 5 (55:08):
Wait, it is kind of doing the same stuff. Yeah, dude,
I almost want to side by side the video that
we saw when Elon said this thing was doing kung fu. Okay,
this is so dumb.
Speaker 1 (55:23):
This sucks shit.
Speaker 5 (55:24):
Jesus Christ, dude, what is wrong with us?
Speaker 3 (55:28):
It's really fucking already and is dying watching that. Also,
I think last time I said it looked like a
forty five year old trying to do kung fu, it
looks like a seventy year old.
Speaker 1 (55:40):
Yeah, like, it's not steady on its feet in any way. And Jared Leto's taking an L either way, right? Yeah, so.
Speaker 3 (55:48):
Jared Leto, also a producer on Tron: Ares, and an investor in two generative AI companies.
Speaker 5 (55:54):
Really? Wow, I didn't know that. You're not gonna shit where you eat, huh? All right.
Speaker 1 (55:58):
Wow?
Speaker 5 (55:59):
Okay, okay, cool, cool cool.
Speaker 1 (56:01):
It's funny too, because I've seen the reviews have been really split, and not really based on, like, the morality of it or anything. Some people are just like, it looks really cool, and other people are like, I just think it's fucking so bad. So I wonder how the public will decide with their ticket buying this weekend, where they end up.
Speaker 5 (56:21):
I just also, I don't even know.
Speaker 1 (56:24):
I didn't even know this was coming out until, like, last week. I saw someone was like, oh god. I think it's also because I'm just not being exposed to, like, marketing campaigns on TV as much as they used to be.
Speaker 9 (56:36):
But you're also not as deep on the Jared Leto Reddit forum as I am, because we've been there for years.
Speaker 5 (56:43):
Once they kicked me out as a mod, I was like, yeah, we're over. This place has gone woke, man. I'm out of here.
Speaker 1 (56:52):
Theissance is upon us. Yeah, all right, that's gonna do it for this week's Weekly Zeitgeist. Please like and review the show if you like the show. It means the world to Miles. He needs your validation, folks. I hope you're
(57:13):
having a great weekend, and I will talk to you Monday. Bye.