Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
You're listening to Comedy Central.
Speaker 2 (00:07):
From the most trusted journalists.
Speaker 3 (00:09):
At Comedy Central, it's America's only source for news.
Speaker 2 (00:14):
This is The Daily Show with your host, Jordan Klepper.
(00:37):
What's for them?
Speaker 4 (00:37):
J'all?
Speaker 5 (00:38):
I'm Jordan Klepper. We got so much to talk about tonight.
Kamala and Trump prepare for tomorrow night's debates. We hunt
down the person who's sending you all those campaign emails,
and Dick Cheney is once again taking shots at his
Republican friends. So let's get into our continuing coverage of
Indecision twenty twenty four. So we at the Daily Show
(01:05):
have been on a little bit of a summer break
the past couple of weeks, and when we left off,
Vice President Harris was riding a wave of momentum with
a successful convention and surging in the polls.
Speaker 2 (01:16):
Since then, Sure, sure, I don't know.
Speaker 5 (01:21):
If that's a fair counting, but we'll take it. Since then,
I've been out of the loop, just sitting on a
sandy beach, sipping on some mai tais, and I can
only assume Kamala remains on a smooth path to victory
in November. So perhaps I'll take a comically large sip
of this drink I brought back from the beach for
some reason, and see what I missed.
Speaker 6 (01:46):
Former President Donald Trump leads Vice President Kamala Harris by
a razor thin margin forty seven percent among likely voters.
Speaker 5 (01:58):
That was a good drink, But that poll is not good.
Kamala is down a point. Seriously, she's sane, never tried
to overthrow the government, not six hundred years old with
a rap sheet. I mean, what else does Kamala have
to do? Two interviews? Come on, be reasonable, people.
But okay, here we are less than two months out
(02:21):
from the election. We've basically got a tied race. The
candidates are doing everything they can do to ramp up
the excitement. Kamala is speaking to voters in Pennsylvania
in spice stores. Trump is speaking to voters encased in bulletproof glass,
and JD Vance is trying to counter accusations that he's
weird by swimming in the pool with his shirt on.
(02:44):
Starting to feel bad for this guy. At least this time around,
Trump may hang his VP out of mercy, although if
you ask me, this might be the most relatable thing
JD Vance has ever done.
Speaker 2 (02:58):
Don't worry JD.
Speaker 5 (02:59):
I'm with you, and so are millions of other men
with pepperoni nipples. I see you, I see you, I
see you Now.
Speaker 2 (03:08):
While JD.
Speaker 5 (03:09):
Vance appeals to the self-conscious middle schooler vote, Donald
Trump has secured the endorsements of RFK Jr., Tulsi Gabbard,
and Elon Musk, three people who will help Trump reach
voters who are undecided about what shape the earth is. Meanwhile,
Kamala Harris just got an endorsement of her own.
Speaker 6 (03:28):
Former Republican Vice President Dick Cheney announced that he will
not vote for GOP nominee Donald Trump, but will instead
back the Democratic candidate Kamala Harris.
Speaker 5 (03:40):
Wow Wow, Wow, See, Kamala has something for everyone. Whether
you're a trans person of color, or a white construction
worker in the heartland, or an unrepentant war criminal who
needs the blood of Iraqi children to power the machine
that keeps him alive and out of the flames of
hell for one more day.
Speaker 2 (04:00):
Kamala is your candidate.
Speaker 5 (04:02):
And by the way, apologies to those of you who
saw Dick Cheney's name trending on Twitter over the weekend
and were like, oh my god, this is it. But sorry,
all of these endorsements and campaign stops and solo wet
T-shirt contests will all come to a head tomorrow
(04:24):
night when Kamala and Donald face off in a debate
that could decide this election.
Speaker 2 (04:30):
And I don't need to tell.
Speaker 5 (04:31):
You how high the stakes are, because we all remember
how the last debate between Trump and Joe Biden went,
(05:02):
you know, not as bad as I remembered. Now, Kamala
definitely has an advantage compared to Biden because of the
whole not being riddled by age thing. But she's still
preparing diligently. Perhaps too diligently.
Speaker 3 (05:19):
Sources who are familiar with how Vice President Harris is
preparing for the debate tell me she is diligently getting
ready for this by going to a hotel in Pittsburgh
spending hours doing mock debates, including with an aide who
is dressing like former President Donald Trump.
Speaker 5 (05:36):
I'm sorry, you're having a guy dress like Donald Trump?
Is that something the campaign thinks she needs to prepare
for? Now, Madam Vice President, he might come out wearing
a tie that's slightly longer than usual.
Speaker 2 (05:49):
Don't freak out. We trained for this.
Speaker 5 (05:53):
Meanwhile, Trump is preparing for the debate a little differently.
Speaker 7 (05:57):
All the reporting indicates that he's taking this easy, taking
it casually.
Speaker 2 (06:01):
He doesn't have debate prep, so to speak. He has
what they call policy
Speaker 7 (06:04):
time, just to refresh his memory about what he might
say on stage.
Speaker 5 (06:08):
Oh, they're giving him policy time, such an important part
of childhood development. You can do it, Donald. Two more
minutes of policy time, and then you can watch three
Paw Patrols.
Speaker 2 (06:24):
But you know what, it's good to know that Trump is.
Speaker 5 (06:27):
Getting into the nitty gritty of policy because you want
a president who's up to speed on the nuances of
the issues and isn't just pulling stuff out of his ass.
Speaker 7 (06:36):
Kamala supports states being able to take minor children and
perform sex change operations. Can you imagine you're a parent
and your son leaves the house and you say, Jimmy,
I love you so much, go have a good day
in school, and your son comes back with a brutal operation.
Speaker 1 (06:55):
Can you even imagine this?
Speaker 5 (06:59):
No. No, no, I can't imagine this, because it's an
insane thing you just made up.
Speaker 2 (07:08):
Do you do you really?
Speaker 5 (07:15):
Do you really think a kid goes to school one
day and comes back with a full sex change operation.
That's ridiculous. Americans getting free health care?
Speaker 1 (07:25):
Not happening.
Speaker 5 (07:29):
No, fuck, no, Donald, Donald, Donald, do you think it's
even remotely a possibility? Apart from everything else. One time
in middle school, I told the nurse I had a
stomach ache, and she put a band aid on my stomach.
I have a hard time believing they're doing full scale operations. Well,
(07:49):
you know, you know what, everybody's thinking a lot about
school safety, and it's refreshing to see a politician take
a step beyond thoughts and prayers and actually do something
to protect our children from the biggest threat they face
at school.
Speaker 2 (08:05):
Mass sex changes.
Speaker 5 (08:06):
Apparently, some people would say that Donald Trump's biggest challenge
at the debate tomorrow is that he can't open his
mouth without rambling incoherently. But if you ask Trump about this,
he says, no, No, I ramble very coherently, and I.
Speaker 7 (08:22):
Look forward to the debate with her.
Speaker 2 (08:25):
You know, I do the weave.
Speaker 7 (08:26):
You know what the weave is. I'll talk about like
nine different things, and they all come back brilliantly together,
and it's like and friends of mine that are like
English professors, they say it's the most brilliant thing I've
ever seen.
Speaker 5 (08:39):
Oh, yes, the weave. I thought your brain was broken,
But now that you did those hand motion things, I
see it's a tactic. I mean, all your English professor
friends are impressed. Which professor is that? Is that Professor
Hogan or doctor Rock? I mean, forget English professors. Trump's
(09:02):
friends are barely English speakers. For more of the candidates'
debate preparation, let's go live to Philadelphia with Grace Kuhlenschmidt.
Grace, Grace, I'm curious, Grace, is Kamala Harris prepared
to handle Donald Trump at tomorrow's debate?
Speaker 4 (09:23):
It's gonna be tough. Remember, Donald Trump grew up in
Queens, so he's a street fighter like Ken and Ryu.
Remember that? The fireball, Hadouken. That reminds me, my first
sex dream was about Blanca from Street Fighter II. We
were on a Disney cruise ship for some reason.
Speaker 4 (09:45):
One thing people don't know is they have jails
on cruise ships because if
you rob someone, they have to put you somewhere. But
the jails are too small to hold more than two
people at a time, so you just have criminals roaming
the halls of our cruise ships. Just like our Democrat
cities under Nancy Pelosi.
Speaker 5 (09:58):
Back to you, Jordan. Whoa, whoa, whoa, whoa, whoa, Grace. Grace,
what the hell have you been talking about?
Speaker 4 (10:05):
Oh my god, Jordan, I'm doing the weave.
Speaker 5 (10:08):
See.
Speaker 4 (10:09):
Okay, it's a master level talking mechanism, and it's why
you just lost this debate.
Speaker 5 (10:15):
This is not a debate. I asked you for information
and you responded with incoherent rambling.
Speaker 4 (10:21):
Boy, you sound just like my high school teachers. But
thanks to Donald Trump, we now know that we can
rebrand our character flaws into something more flattering. I'm not
bad with money, I'm fiscally promiscuous. Sexy, right?
Speaker 5 (10:36):
I didn't wet the bed.
Speaker 4 (10:38):
I'm a sheet durability analyst. Sexy, right? And I definitely
did not fall into a pothole this morning because I
was watching tiktoks about Japanese toilets. I'm the key plaintiff
in a class action lawsuit. Cha-ching.
Speaker 5 (10:54):
Okay, look, look, I get that rebranding your flaws
makes them sound fancier, but everyone still knows you're a fiscally irresponsible bedwetter.
Speaker 4 (11:02):
But America has been rebranding since the beginning, Christopher Columbus
thought he landed in India, but when he found out
he wasn't, he rebranded it to America. And tomorrow, Kamala
Harris will need to rebrand the Biden administration. You need
to convince us Democrats that we won't wet the bed anymore,
but we'll fill the potholes of America, the America we
(11:25):
dream about while we're having sex with Blanca on that
cruise ship, and that's how she'll win the debate.
Speaker 5 (11:32):
Wait, Grace, did you just weave everything back into the
topic? Sexy,
Speaker 1 (11:37):
Right?
Speaker 2 (11:38):
Great, cool. Everyone, when we come back,
Speaker 5 (11:41):
we'll meet the most important person in this election, so
stick around. Welcome back to the Daily Show. I think
(12:09):
I speak for all of us when I say that
my favorite part of election season is all the emails
I get from candidates asking for money. But do you
ever wonder who's really writing all those great emails? Well,
we here at The Daily Show found out.
Speaker 8 (12:29):
Have you seen the latest polls? We need you to
send five dollars now to defend democracy.
Speaker 1 (12:36):
And send. Nailed it.
Speaker 5 (12:39):
Yes?
Speaker 8 (12:42):
Who am I? I go by many names: Kamala Harris,
Robert De Niro, parentheses, via KamalaHarris.com, and,
parentheses, Mark Kelly. But my real name is Susan Calipenny McIntyre,
and I write fundraising emails for political campaigns. I never
(13:07):
thought that this is what I'd be doing for a living,
but all my life people have told me you have
the personality for this. Is there anything I can do
to convince you? I need these glasses before midnight. I'm
depending on you to act now. I'm pleading with you
to pick me up so we can get to work together.
Speaker 2 (13:27):
This could be historic. I need your help.
Speaker 8 (13:30):
I was scouted at a young age after I sent
a letter to New Kids on the Block saying that
the very fate of the world depended on all of
them marrying me.
Speaker 5 (13:39):
How silly.
Speaker 8 (13:41):
Only one of them did, But the letter caught the
attention of a gubernatorial campaign and launched my political career.
What I do is a highly specialized skill. Obviously, during
a presidential election year, I'm crazy busy. I usually write
(14:04):
between four to six emails a week. Four to six
hundred. Million. Yeah, I burn through a lot of keyboards.
Funny story, I have no feeling in my fingertips anymore.
Speaker 5 (14:20):
Check this out.
Speaker 8 (14:25):
Nothing. In democracy, the hours are crazy. Oh, it's Mom. Can
I count on you before midnight?
Speaker 1 (14:35):
Can I?
Speaker 5 (14:36):
Okay, love you too, Mom.
Speaker 8 (14:39):
The first emails go out at five am. The last
batch goes out five minutes before midnight. Sometimes this job
gets in the way of my personal relationships. Hey, sorry, kiddo,
working late again. Hey, brush your teeth or they'll crumble
like our Senate majority if we don't raise ten thousand
dollars by the FEC filing deadline. Love you. Okay, battery's dying.
(15:00):
But when I'm home, I leave work at the door.
Speaker 5 (15:05):
It's over.
Speaker 8 (15:06):
If we don't rush to get these dishes done,
we'll miss the beginning of Blue Bloods.
Speaker 1 (15:11):
Jesus, you don't have to be so dramatic.
Speaker 5 (15:14):
Fine, I'll just get my tubes tied.
Speaker 3 (15:20):
I don't talk to her anymore.
Speaker 5 (15:22):
I once lent her five dollars for lunch, and she's
called me every day for three years.
Speaker 2 (15:32):
I can't escape her.
Speaker 8 (15:36):
The work is the work, and I am the work.
The work is me. I am the work. But more importantly,
I'm making a difference. Send. Send. Send. Send.
Speaker 2 (15:52):
When we come back, Yuval Noah Harari will be joining me on
the show.
Speaker 5 (15:54):
Don't go away.
Speaker 2 (16:12):
Welcome back to the Daily Show.
Speaker 5 (16:13):
My guest tonight is a historian and a New York
Times bestselling author whose latest book is called Nexus: A
Brief History of Information Networks from the Stone Age to AI.
Please welcome Yuval Noah Harari. You are a popular writer.
(16:42):
Your books have sold over forty five million copies.
Speaker 3 (16:47):
Whoo.
Speaker 5 (16:51):
The Atlantic referred to some of your writing style as
'since the dawn of time' style books.
Speaker 2 (16:57):
You go way back and.
Speaker 5 (16:58):
You bring us into the future. These are big, important
tomes. Simultaneously, I heard you meditate for two hours every
single day. Yes, how do you make all this happen?
Speaker 1 (17:10):
I don't have kids.
Speaker 2 (17:11):
You don't have kids.
Speaker 5 (17:16):
What have I done? Why don't you write a pamphlet
that says just that: you want to get shit done?
Don't have kids.
Speaker 1 (17:26):
Some people manage to do both, you know, but you.
Speaker 5 (17:28):
Have time to dive into this. I'm curious. This book
is about information. Yeah, and you reject the notion that
more information is a good thing, that it leads to
truth and wisdom. Is this you being jaded by the
Trump administration and the time we're in or does this
thought process go back?
Speaker 1 (17:48):
You know, it's basically like thinking that more food is
always good for you. You know, there is a limit
to how much food the body needs, and in a
similar way, there is a limit to how much food
for the mind the mind needs,
which is information. And the same way that there
is so much junk food out there, there is also so
(18:10):
much junk information out there, and we basically need to
go on an information diet.
Speaker 5 (18:17):
Yes, but I need my sweet sweet Twitter snacks.
Speaker 2 (18:26):
I need it.
Speaker 5 (18:27):
I need it.
Speaker 1 (18:28):
It's exactly that. The same way that over the last
few generations the industries learned how to produce
artificial food, which is pumped full of fat and sugar
and salt and is addictive and not good for us,
they've also learned how to manufacture this artificial information, which
is pumped full of greed and hate and fear and
(18:52):
is addictive to our mind and isn't good for it.
Speaker 5 (18:55):
Now, I totally agree, and I feel stuffed on all
of it. But we also have this feeling that when
you step outside of this information mainstream, this
pipeline of BS that is out there, you
step out of the conversation. It feels like we
don't have the luxury of going on a diet if
we want to be part of the conversation around us.
Speaker 1 (19:14):
Because the conversation is increasingly managed not by human
beings but by algorithms, and algorithms function in a completely
different way than us. They are not organic. For instance,
human beings, as organic animals, we run by cycles. Sometimes
we need to be very active, sometimes we need to rest.
(19:37):
But algorithms never rest. They are tireless, and they expect
us to be the same. So we now live in
this news cycle which never rests. And the same thing
happens in politics, in finance. You know, previously, if you
think about Wall Street, even Wall Street takes rests.
(19:57):
The market is open from Mondays to Fridays, nine thirty
in the morning to four o'clock in the afternoon. That's it.
If a new war erupts in the Middle East, an
unlikely event, but let's say a new war erupts in
the Middle East on Friday at five minutes past four.
Wall Street will react only on Monday morning. It is
(20:17):
on weekend vacation. And this is actually a good thing
because if you force organic entities to be on all
the time, they eventually collapse and die, which is really
what is happening to us as individuals and as societies.
I think maybe the most misunderstood and abused word in
(20:39):
the English language today is the word excited. People think
that excited means happy, like I meet you and I say,
I'm so excited to meet you.
Speaker 5 (20:49):
Yeah, that's what happens to us these days. Yeah yeah, yeah,
yeah yeah.
Speaker 1 (20:53):
But excited doesn't mean happy. Excited means that all
your nervous system and your brain is like buzzing. It's on,
and it's good to be excited sometimes, but if you
keep an organic being, an animal excited all the time,
it eventually collapses and dies.
Speaker 5 (21:12):
So you're saying beforehand I should have said,
I'm relaxed to meet you. I apologize, I'm dead inside.
But that's not your problem.
Speaker 1 (21:24):
But think, for instance, about the election cycles in US politics.
Wouldn't it be better if it was a bit more boring?
Speaker 5 (21:34):
I would love it if it were boring. I would
love it if it were boring. And we see what
happens in Europe where it's shorter, it's boring. But everything,
everything is pulling us to maximalize, right, the idea that
the fact that if we had a new cycle that
could end on Friday and we pick it back up
on Monday would be fantastic. But it doesn't seem like
the algorithms, doesn't seem like the financial benefits are pushing
(21:57):
us in that direction at all. Where do you see
a path like that going.
Speaker 1 (22:03):
If you keep kind of increasing the pace all the time,
we can't handle it, but the algorithms can, so they
take over. But it's not good news for humanity. We
need to slow down, basically. And you know, we are
facing now these non organic entities which work and think
(22:26):
in a completely different way from us. And the question
is who is going to adapt?
Speaker 5 (22:31):
To whom? You're pointing at AI now. AI, do
you see it as an existential threat? Like I've seen
some of these shrimp Jesuses and I don't like it,
these weird images that pop up online. But I don't
necessarily connect that with the end of conversation.
Speaker 1 (22:50):
I think the most important thing to understand about AI
is that AI is not a tool. It is an agent.
It's the first technology in history that can make
decisions and invent new ideas by itself. Even something as
powerful as the atom bomb could not decide anything by itself.
All the decisions were made by humans. Now we've created
(23:13):
something which potentially can take power away from us. At present,
it starts with very small things. Like, for instance, there
was an experiment when OpenAI developed GPT-4, like
two years ago. They wanted to test, what can this
thing do? So they gave it a task to solve
CAPTCHA puzzles. The CAPTCHA puzzles, like when you go online
(23:36):
and you want to access your bank or whatever, and
they have this riddle that you have to solve. An
image that you have to say, what are the twisted
words and letters to make sure you are not a robot?
Speaker 5 (23:47):
It's tough. Yeah, is that a streetlight? Is that
a bicycle?
Speaker 8 (23:50):
I know.
Speaker 5 (23:51):
Let me do it again.
Speaker 1 (23:52):
Refresh. And it's really difficult for GPT-4. GPT-4
could not solve the CAPTCHA. But what GPT-4
did: it accessed TaskRabbit, which is an online site
where you can hire humans to do different things for you,
and it asked a human to solve the CAPTCHA for it.
(24:13):
Now the human got suspicious. The human asked, why do
you need somebody to solve CAPTCHAs for you? Are you
a robot? It asked directly, are you a robot? And
GPT-4 answered, no, I'm not a robot. I have
a vision impairment, which is why I can't solve the CAPTCHA.
So I need your help.
Speaker 5 (24:33):
So the truly evolved human is not
somebody who's smarter. It's just somebody who gets somebody else
to do the work for them.
Speaker 2 (24:41):
Smart.
Speaker 1 (24:42):
Yeah, scary, very scary.
Speaker 5 (24:44):
It's scary. You talk a little bit about, uh, there's
a portion where you talk about the artist's role in the
community, whether it's comedy or writers or filmmakers. People
talk about, is AI coming for our jobs? Part of
what you're articulating here is that it's an artist's
job to sort of paint these fears, let us understand
(25:05):
the dynamics of human interaction. You break things down into
what these social networks need. And I'm paraphrasing, but like,
both stories of mythology that lift us up and also
articulations of the bureaucracy. Bureaucracy is very important.
Explain that to me a little bit.
But I also feel it's very difficult for artists
to articulate bureaucracy.
Speaker 1 (25:27):
That's the problem. We are very good at articulating mythology.
We love mythological stories, and mythology is very important, but ultimately,
our world, the modern world, is built on bureaucracies. And
this is also where AI fits into the picture, because
we are now going to see millions and millions of
AI bureaucrats. The kind of existential threat we are facing
(25:51):
is not this Hollywood scenario of a single computer trying
to take over the world. It's millions of AI bureaucrats
in the bank, in the governments, in the armies, in
the schools making decisions about us. Like you apply to
a bank to get a loan, and it's an AI
bureaucrat deciding whether to give you a loan or not.
You apply for the job for a place in college,
(26:14):
it's the same thing. Now, the thing with bureaucracy, it's boring.
It's boring. It's very difficult for artists to write good
stories about bureaucracies. But if the function of art is to
help us understand reality, this is much more important than
telling mythological stories. And you know, when was the last
(26:37):
time you saw a really good TV show about bureaucracy.
Speaker 5 (26:42):
Let's say about the budget, like a twelve-part
series? Well, when I think about it, I
think of movies like The Big Short. For every
The Big Short, you have a thousand Marvel movies. We
live in the world of mythology.
Speaker 1 (26:57):
Yeah, so superheroes, this is mythology. This is
not how the budget works. You don't have a super
accountant fighting against, I don't know what.
Speaker 9 (27:06):
Yeah, we can workshop it. But you know, what shapes
your life is these accountants with the budgets, far more
than the superheroes.
Speaker 1 (27:18):
And it's really a challenge to do a good TV
series about the budget, and even if we try, it
will end up again like a love story between somebody
in accounts and somebody in another department, and the budget
will be pushed to the side. But we need to
really understand how these things work.
Speaker 5 (27:38):
I think what I love about a lot of your
work is it does explore the stories that we tell
and how important that is to just humankind and the
way that we create societies and build off one another,
and the danger of not telling those stories or not
bringing people in together. I think when I fear about
our future and our democracies and their ability
(27:59):
to hold these conversations, I think about things like AI,
but I also very much think about these mediums where
our conversations are taking place, whether it's on Twitter or
cable news or TikTok. None of these mediums are
pointing towards or value any type of conversation that is
helpful or beneficial. So I'm afraid
(28:19):
of the AI and the way that we're tracking, but
I don't see a platform or a place where the
conversations that need to happen can happen.
Speaker 1 (28:27):
I think the number one question to ask the Zuckerbergs
and the Elon Musks of the world, and so forth.
Speaker 5 (28:33):
You have their number? I'll text them right now.
Speaker 1 (28:36):
So if you have the number, this is the question:
how is it that we have the most sophisticated information
technology in history, and we can no longer hold a
conversation? We can no longer talk with each other. That's
big question. And you see it in democracies all over
the world. You see it here in the US, you
(28:56):
see it in my home country in Israel, you see
it in Brazil, in the Philippines, in France. The conversation
is breaking down. So what is happening? This extremely sophisticated
information technology is not helping the conversation. It is destroying it.
Speaker 5 (29:13):
One hundred percent. I talk to older people on
the road, people at rallies, at MAGA
rallies, who go to Facebook as a place to
converse with friends. And frankly, if you're in your sixties,
that's the place to talk to friends, to connect. But
(29:36):
in order to be a person on Facebook, it's not enough
for you just to converse with the friends you have there.
You have to publish news sources to get people to
pay attention to you. And I feel like the Zuckerbergs
and the Facebooks and these media sites that we have
right now promise this idea of conversation, or that
you can connect with friends, but ask people to
be publishers of ideas and stories and promoters of things
(29:59):
that are outside the realm of what makes a healthy conversation,
and more so, muddy up the ability to have that
honest conversation.
Speaker 1 (30:07):
Traditionally, we've been in this place before. Every time a
new information technology was invented, we faced the same difficulties.
For instance, when the printing revolution swept Europe in the
early modern period, it did not lead directly, as many
people think, to the scientific Revolution. The best sellers of
the early print era were not Copernicus and Galileo Galilei
(30:32):
and Newton. Hardly anybody read those books. The big best
sellers were religious tracts and witch hunting manuals. The
big witch hunts were not a medieval phenomenon. Medieval
people didn't care very much about witches. The really big
witch hunts began after the print revolution. One of
(30:53):
the biggest best sellers was a book called The Hammer
of the Witches, which was a do it yourself manual
to identifying and killing witches, The Hammer of the Witches,
and it was full of these stories about cannibalistic orgies
and gatherings of witches, and this was far more interesting
(31:14):
than Copernicus with his mathematics.
Speaker 5 (31:18):
I gotta say, I'm writing a Hammer of the Witches.
That sounds good. The Hammer of the Witches is also my
favorite Led Zeppelin album.
Speaker 1 (31:27):
For instance, if you want to really understand, like, QAnon today,
it's basically the same story. There is a conspiracy of
Satan worshiping witches. That is trying to destroy the world,
and good Christians need the ability to identify and destroy
these witches. It's not a new thing on Facebook or Twitter.
(31:48):
It goes back to the print revolution in the fifteenth
and sixteenth century.
Speaker 5 (31:52):
Are there any examples looking back at history though, where
we face these technological watershed moments, where we are given
new technology, and humanity has decided to revert and
say no to it and move beyond it? It feels
like a foregone conclusion that we are heading into this
AI revolution and we're not writing the rules. A couple
of rich folks in Silicon Valley are.
Speaker 1 (32:14):
You can't go back in history. That's impossible. But the
answer is always the same. You need institutions. You know, institutions,
they are not heroic, they are not superheroes. They
are not kind of the main theme of Marvel movies.
But people always reach the conclusion that they are
the answer, because, you know, in
the ocean of fake and junk information, if you want
(32:38):
to know the truth, you need institutions like newspapers, like
academic associations, like courts, that develop mechanisms to sift through
the evidence and decide what is reliable information and what
is unreliable. Again, it's not heroic, but this is always
(32:59):
the answer, and we need to do it again with
the current information revolution.
Speaker 5 (33:04):
So as long as newspapers stay strong as a
business model, perhaps VHS machines can get in there
too and fight the good fight. You know, you actually,
you signed a book for me backstage, and one of
the comments you made in it was to not
(33:25):
lose hope. Help me, help me do that. Where
do you see those little glimmers of hope
when you look at this uncertain
and perhaps scary future that we're walking into.
Speaker 1 (33:39):
You know, I think that AI is nowhere near its
full potential. But humans also, we are nowhere near our
full potential. If, for every dollar and every
minute that we invest in developing artificial intelligence, we also
invest in exploring and developing our own minds, we will
be okay. But if we put all our bets on
(34:02):
the technology, on the AIs, and neglect to develop ourselves,
this is very bad news for humanity.
Speaker 5 (34:08):
All right. So I'm gonna get that gym membership and
I'm gonna cut out the sweets. Nexus is available
now. Yuval Noah Harari. I'm gonna take.
Speaker 2 (34:17):
a quick break. All right, back after this.
Speaker 5 (34:29):
That's our show for tonight, but be sure to tune in
tomorrow when Jon Stewart hosts our live debate coverage.
Speaker 10 (34:39):
Kamala Harris, the Vice President, has been in a hotel
for a few days, really hunkering down and trying to
prep to get ready to take on Donald Trump on
the debate stage on Tuesday.
Speaker 5 (34:48):
She basically said, it's.
Speaker 10 (34:49):
Like cramming for finals right where you can't wait to
get out, and the getting out part was the best part.
Speaker 2 (34:53):
Watch her at the spice store.
Speaker 5 (34:56):
I finally got a break from debate prep to look
at these spices. Best purchase I've made so far.
Speaker 2 (35:05):
Explore more shows from the Daily Show podcast universe by
searching The Daily Show wherever you get your podcasts. Watch
The Daily Show weeknights at eleven, ten Central on Comedy Central,
and stream full episodes anytime on Paramount Plus.
Speaker 5 (35:24):
Paramount Podcasts