
December 9, 2025 181 mins
00:01:54 — Trump’s “Peace Plan” for Ukraine Mirrors Real Estate Deals Knight ridicules Trump’s 28-point Ukraine “peace plan” as a Kushner-style negotiation scam, arguing it treats war like a property flip.

00:07:07 — Europe’s War Cult and the Rise of Authoritarian Leaders He warns that Macron, Scholz, and Starmer use endless war to justify censorship, digital IDs, and domestic surveillance—governments “at war with their own people.”

00:11:44 — Milo Yiannopoulos Exposes GOP Hypocrisy Knight highlights Milo’s revelations about corruption and moral rot inside conservative circles, arguing controlled-opposition influencers sanitize vice as “freedom.”

00:16:06 — January 6th Was Fueled by Controlled Media Figures Knight names Fuentes, Jones, and others as agitators shielded from scrutiny, saying they exist to steer genuine dissent into chaos.

00:34:00 — The Surveillance Age: When Your Refrigerator Watches You He tells of “smart” appliances spying on owners, comparing the Internet-of-Things to an always-on domestic intelligence network.

00:36:20 — Edmonton’s AI-Equipped Police Cameras Mark New Surveillance Era Knight reports on Axon’s facial-recognition rollout targeting “7,000 high-risk citizens,” warning that predictive policing is replacing constitutional law.

01:09:10 — Google’s AI Deletes a User’s Entire Hard Drive A chilling example of corporate AI failure—Knight uses it to show how automation concentrates unaccountable power over private life.

01:13:05 — Drugs Are Not Violence: Trump’s Duterte Doctrine He exposes Trump’s rhetoric equating drug use with armed combat, calling it moral inversion that paves the way for extrajudicial killings.

01:41:21 — Trump’s Tariffs Increase Trade Deficit by 23 Percent Knight cites official data proving tariffs backfired—raising consumer prices, enriching China, and sinking U.S. manufacturing.

02:03:05 — Neuroscientist Warns of Eight 21st-Century Brain Threats Dr. Richard Restak outlines eight technological and psychological forces—AI, isolation, propaganda—reshaping and damaging the modern mind.

02:15:20 — Memory Editing: From Courtrooms to Soldiers Restak exposes DARPA research on erasing or rewriting memories under the banner of trauma therapy—an Orwellian leap in mind control.

02:49:30 — The Unholy Alliance: Capitalism Meets Totalitarian Power Knight closes by warning that corporate profit motives and government surveillance have fused into a single global technocratic system.

Money should have intrinsic value AND transactional privacy: Go to https://davidknight.gold/ for great deals on physical gold/silver

For 10% off Gerald Celente's prescient Trends Journal, go to https://trendsjournal.com/ and enter the code KNIGHT

Find out more about the show and where you can watch it at TheDavidKnightShow.com

If you would like to support the show and our family please consider 
subscribing monthly here: SubscribeStar https://www.subscribestar.com/the-david-knight-show


Or you can send a donation through
Mail: David Knight POB 994 Kodak, TN 37764
Zelle: @DavidKnightShow@protonmail.com
Cash App at: $davidknightshow
BTC to: bc1qkuec29hkuye4xse9unh7nptvu3y9qmv24vanh7

Become a supporter of this podcast: https://www.spreaker.com/podcast/the-real-david-knight-show--5282736/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:29):
In a world of deceit, telling the truth is
a revolutionary act. It's The David Knight Show.

Speaker 2 (00:44):
As the clock strikes thirteen, it's Tuesday, the ninth of December,
the year of our Lord twenty twenty-five. Well, today we're going to
take a long look at artificial intelligence. We actually
have in the third hour a guest, a best-selling
author. He's done twenty-five books. He's an MD
who works in neuroscience. The book is The Twenty-First

(01:06):
Century Brain. And he's been a consultant or lecturer at
the CIA, the NSA, the Pentagon, and other places, so
he knows something about where this stuff is headed. We're
going to have to see what he has to say
here about this. But we're going to begin with what
is going on in Ukraine? Is this the beginning of

(01:31):
the end? Are these people going to be able to
sustain this? Russia is rapidly advancing, even though Zelensky's not
even taking a look at the plan. And so we're
going to start with news. We're going to also we
have some interesting updates in pharmaceutical areas as well as
an update in terms of Trump's tariffs. Are they working?

(01:54):
And he finally got around to giving some breadcrumbs to
the soybean farmers. But we're going to take a look
at the big picture of this stuff. So, we're
back already; it ended, that's okay. All right, well, let's
start with the news here. Trump said out loud,
he said, I'm disappointed that Zelensky hasn't even read my

(02:15):
peace proposal. And I understand how you feel; I'm
disappointed that Trump hasn't even read the Constitution that he
swore to uphold. Maybe he doesn't like it, just like Zelensky
doesn't like peace. His frustration continues to show, especially after
high hopes for his twenty-eight point peace plan. You know,
we had this ten-point peace plan between us and

(02:36):
the government. It's called the Bill of Rights. I'll tell you
what: this is where we draw the line on what
the government does and with our natural rights, our God
given rights. But they didn't respect that either. So he said,
I'm a little bit disappointed that Zelensky hasn't even read
the proposal yet. Well, he doesn't want peace, and neither

(02:56):
do the European leaders. As he's saying that,
you've got the leaders of Britain, France, Germany all meeting
with Zelensky to tell him, keep fighting, keep fighting, we're going
to win this thing. Except that's not really what's happening.
He said the people love it, but he hasn't read it. Russia's
fine with it, he said, and in the assessment of what's

(03:20):
going on with Ukraine, of course, this follows after his
son-in-law Jared Kushner and his former business partner,
I guess you could say, Steve Witkoff, who are now his
emissaries for geopolitics. I mean, hey, if you can negotiate
a big real estate deal in New York... well, most
of this stuff is about real estate, right? Whether you're
talking about Gaza or you're talking about Ukraine, you're still

(03:42):
talking about people killing each other over land. And so
he said they didn't think that Zelensky
was really serious about this. Moscow, as I pointed out yesterday,
really likes the document that was released, the first NSS,
the National Security Strategy, which is basically

(04:07):
laying out the Trump administration's perspective on foreign policy and
national security. And I liked what it had to say.
I just don't believe that Trump is going to stick
to any of it. But Russia reported on it. You
didn't have any reporting really from mainstream media here in America.
So the Russians liked it because, as they pointed out

(04:31):
in the Zero Hedge article, the document characterizes Europe as
weak, warning of an unpredictable, disunified atmosphere on the
European continent, where, in desperation, European leadership could overreact and
escalate a war with Russia. You think? I mean, they've
been doing that upfront in so many different ways. You

(04:51):
got Friedrich Merz in Germany, and you've got Keir Starmer
in Britain, as well as Macron in France. They're all saying,
you know, get ready for massive casualties, and we've
got to draft more people into the army.
I mean, they're doing everything that essentially amounts to a
declaration of war already. Donald Trump, since

(05:13):
returning to office, blames European officials for thwarting US efforts
to end the war in Ukraine and accuses governments of
ignoring a large European majority quote unquote who want peace. Well,
I agree with Trump on that. The Trump administration, I
just don't trust him on any of this stuff. Meanwhile,
there might be another way to have peace, and that

(05:35):
is for Russia to win, and it looks like that
may be happening. One way to have a lasting peace
is to end the NATO provocation that is called Ukraine.
It is a geopolitical construct. As I pointed out before,
Ukraine was an area of Russia, an area of

(05:56):
Russia for four hundred years, and breaking that off as
a separate entity, then creating a coup to change
the government, which then began a civil war
in twenty fourteen, eleven years ago. So that is a
construct of NATO, who decided after the fall of the Soviet Union

(06:18):
that they would eliminate Russia as a power. And so
this has been a gradual policy of encroachment they're pushing
for war. Putin's army is seizing land at one of its
fastest rates since the initial invasion almost four years ago,
researchers say. The Kremlin's army seized two hundred square miles
of territory in November, up from one hundred square miles

(06:40):
the previous month, according to Deep State, a trusted
Ukraine-based battlefield map. Imagine that: they call it Deep State.
Let's use that for our marketing purposes here. The speed
of advance was approaching the fastest since the initial invasion
almost four years ago. But then you have the desperation

(07:01):
of the war cult: Zelensky meeting with Keir Starmer, Emmanuel
Macron, Friedrich Merz. Ukraine is holding its own, they said,
and doing even better. Ukraine is not on the brink
of collapse. Again, reality has no meaning to these people.
If we cannot immediately reach a peace agreement with Russia,

(07:21):
it is essential that we give Ukraine all the support
it needs so it does not lose ground due to
lack of support. Well, it is losing ground even though
they are supporting it. And this was something that many
people said from the beginning: there was no way that

(07:42):
Ukraine is going to be able to outlast Russia. The
comparative size of the two countries' militaries, as well as,
you know, the close proximity. It was in the cards
that this was going to happen. The plan to end the
war drawn up by the Trump administration involved Ukraine handing
over vast tracts of land, and Ukraine and Europe have

(08:07):
rejected the proposals. Well, that's one way it's going to end,
and maybe it will end when they take the land. The gains
in territory risk helping to persuade Trump that peace should
be set on Russia's terms, and that sending weapons and aid
to Kiev was a waste. Yes, clearly. Well, why does

(08:29):
Keir Starmer want the war so much? And why does
Friedrich Merz want the war? Why does France's Emmanuel Macron?
It's because their people understand that their governments are at
war with them. They're locking people up for mere comments
as they create this police surveillance state and shut down
all free speech. And in the UK, for example, same

(08:54):
things happening in all these countries being overrun with immigrants
from abroad. There's fury in the UK as nearly three
hundred and fifty thousand migrant families could get extra welfare
after the new budget from Keir Starmer. So why does
he want war? Well, because his own people are waking
up to the fact that Starmer is at war with the
British people. They looked at three hundred and fifty thousand foreign-born families

(09:19):
and they found that two hundred thousand of them were
from just ten countries. Families from Pakistan, Bangladesh, and Nigeria are set
to benefit the most from the three billion pound decision
to scrap the two-child benefit cap, the analysis found. A
Tory MP who carried out the research said, you have

(09:42):
to ask whose side the government is on. I don't
think you have to ask that anymore. I think they've
made that pretty clear. They like any third-world migrants
and they hate all the native Britons. And if you
look at the chart that's there, the ten countries are Pakistan, Bangladesh, Nigeria, Somalia, India, Ghana, Afghanistan, Iraq,

(10:07):
Sri Lanka. That's nine of them. There's only one country
that is European and that's Poland. And so that's two
hundred thousand of the three hundred and fifty thousand. The
rest of the world is one hundred and fifty thousand.
Immigrants from the rest of the world, and of course
coming in for the welfare benefits, the welfare magnet that's there.

(10:31):
So, Milo Yiannopoulos. I don't normally get into these things.
It's amazing to me what a soap opera the conservative
alternative media has become. But they've kind of been angling
for this for a long time. One of the things
that I criticized Charlie Kirk for was the fact that
he was going around doing culture war events and he

(10:53):
was putting out front a black guy who was homosexual,
checking two DEI boxes. And as he's going around talking
about Christ and Christianity, he's sending this conflicting message of
supporting homosexual marriage. And he was called out on it
by some people at some of the events, and he

(11:16):
got really furious, how dare you call this out? And
I said at the time, I said, this I think
is very revealing because it shows what he's interested in
is big tent GOP. He's interested in getting money from
backers and that type of thing. And to me, it
was a real betrayal of all the conservative things that

(11:39):
he pays lip service to. But the entire Republican Party
is like that, especially the alternative media. And so
Milo Yiannopoulos has apologized for helping to sell and normalize
homosexuality and homosexual marriage. And where did he do that?
With the alternative media. Milo says he's become a Christian

(12:03):
and rejected that, and now he is outing a lot
of other people that are living this closeted, as they say, lifestyle.
We've seen this for a long time in the Republican Party.
And Milo's point is that homosexuality is rampant but hidden
in the GOP. I mean, there's been reports when they

(12:24):
have their large conventions, you can see the
spike in Grindr activity, which is a homosexual dating app.
You can see where they're meeting by geolocation. And we've
seen it in the past. I mean, the longest-serving
Speaker of the House, Dennis Hastert, was put into Congress

(12:47):
from being a wrestling coach. That was his qualification coming
into Congress. Actually, his qualification wasn't being a
wrestling coach; it was being a pedophile wrestling coach.
And that lawsuit caught up to him eventually. But
while he was in there, there was a page scandal. The
pages, the young boys that go to

(13:09):
Congress because they want to get experience in politics,
got a different kind of experience than they were expecting, and
so there was a scandal there with Mark Foley. And
so Dennis Hastert, before all this stuff broke about him,
went on with Rush Limbaugh and they just pooh-poohed it. Oh,
this is just nothing but partisan politics, the same type
of stuff they're doing now with Pete Hegseth and what's

(13:30):
happening with the murder of people in international waters. And
so, yeah, it's just partisan politics. Nothing to
see here, except we did see what it was. And
so Milo is saying that, in his opinion, it is
everywhere within the GOP. Now he might have a bit

(13:50):
different perspective on it, since he was holding himself forth
as a homosexual. And of course they're still doing this
with Scott Pressler, the guy with really long, straight hair,
you may remember him. He is a favored person for
the GOP in terms of representing them, and they're normalizing this.

(14:16):
And so Milo has rejected that, and he has apologized
for normalizing that, which, by the way, none of the
influencers have And so you know, people like Charlie Kirk,
people like Alex Jones have been normalizing this type of thing.
And as a matter of fact, he went on with
Tim Pool, who was also playing this game. Tim Pool

(14:38):
had Milo on; he had George Santos. Why would you
put George Santos on unless it's some kind of a
clickbait thing? And so Milo is making all kinds of
statements about all these other conservative influencers, Candace Owens and
even Charlie Kirk and Alex Jones, saying that they were

(15:00):
involved in homosexual activity. So I don't know. And so
already you've had Benny Johnson, who he said that about,
saying he's going to sue Milo for what
he said; Milo made some very specific statements
about it. All I can say is that, you know,
when you look at how they're using this, the people

(15:21):
who say that they're for conservative values, that they're for family values,
and then they do this kind of stuff. I mean,
just look at, you know, Alex Jones platforming Blaire White,
this guy who dresses up like a woman. And so
again, Tim Pool put all that stuff into his podcast.

(15:42):
All I've got to say about that is the reason
I mentioned this is not to get caught up in
all of this gossip and all the rest of this stuff.
But just take these people and look at what they do,
look at what they do, and look at what they say.
Ask yourself, then why would you trust them? You know,

(16:04):
very interesting, in terms of January the sixth: Trump,
according to some sources, was trashing the people
pushing conspiracy theories around January the sixth, and
then you've got people like Nick Fuentes. It was put
up by Shannon Joy yesterday, and I don't have it

(16:24):
on the deck here, but it was footage of Nick
Fuentes, you know, yelling at people, go over there, go over there.
You know, directing people on January the sixth.
And I've said from the very beginning, why did they
focus on Ray Epps, right, and

(16:45):
not focus on Fuentes, on Alex Jones, and all these
people who had been running Stop the Steal, all the
people who enticed them to come? And you know, it's
like Ray Epps is there saying, yeah, we've got to
go over there. Well, Fuentes is doing that that
day as well. Why does he get a pass? Is
he a fed? The question is, when you look at

(17:09):
this stuff, are they selling this stuff for clicks? Are
they selling it because they're being funded by people who
want to use them to propagandize, to use them for
controlled opposition? I think that in the
long term it doesn't really matter that much. They're manipulating you,

(17:32):
they're lying to you, and that's the key thing that
you need to know. It's a trap in many different ways. Well,
I'm going to take a quick break here because there's
something going on and I need to find out what is
happening with this, and we're going to continue when we
come back. We're going to talk about a man who
died from eating cockroaches. If people swallow some of this

(17:53):
stuff coming from the conservative influencers, I guess it's
kind of like swallowing the cockroaches, and if you
get too much of it, it could be a very
bad thing for you. So we're gonna take a quick break, folks,
and we will be right back.

Speaker 1 (18:59):
You're listening to The David Knight Show.

Speaker 3 (20:19):
APS Radio delivers multiple channels of music right to your
mobile device. Get the APS Radio app today and listen
wherever you go.

Speaker 2 (20:29):
Well, welcome back. I was trying to figure out what
was going on, with everybody scrambling and running right in, and I
didn't know what the issue was. It turns out that
we had some issues with rumble streaming. So that's now
been fixed and we now have everybody back in their
proper assigned seats.

Speaker 4 (20:43):
So if you wanted to be on Rumble but went somewhere
else, you can now go back to Rumble and watch
the show there.

Speaker 2 (20:49):
Yes. Well, as I promised, we're going to talk about
something really important here, but I think it
is an apt metaphor for our times in a
number of ways: a man's horrifying death eating
cockroaches in a competition. And this is just yet another
warning: you probably don't want to get into competitions of
drinking and eating stuff, whether it's hot dogs or even water,

(21:12):
or especially cockroaches. But I've talked many times in terms
of how dosage is so important. There was a woman who was
part of a radio contest that was
going on, and they thought it'd be funny to give
people lots of water and then not let them go
to the bathroom, and a lady died because of the water. Basically,

(21:33):
if you get a lot of water, an overdose of water,
it will dilute, I think, your blood or something to
the extent that it kills you. And it killed that
one woman, just from doing a stupid contest. With
this guy, it got through the stomach lining; that's the mechanism.

Speaker 4 (21:51):
Yeah, and then it just leaches out into your system
and your body's not Your body needs water, but it's
supposed to stay in its proper place.

Speaker 2 (21:57):
Wow. Well, this guy, thirty-two years old, collapsed
and died as part of a contest. And guess what
the prize was? A python? I want that python. Give
me those bugs.

Speaker 4 (22:11):
Eat the bugs for the snake. This is a strange
barter economy, at least.

Speaker 2 (22:16):
He was trying to eat these bugs and he ate too
many of them. The interesting thing is, when I
saw this, I thought, are some of these things toxic? I
grew up in Florida where we have really large cockroaches.
Palmetto bugs, they would call them, to try to take
the edge off a little bit, put a nice
spin on it, a label, but they're

(22:36):
filthy things. And so I thought, you know, was
it toxic? No. Actually, he just aspirated cockroach parts.
He was trying to eat them so quickly, and so he
died from asphyxiation; they got stuck in his throat. His
girlfriend said that he had eaten bugs before, and she

(22:57):
was his girlfriend.

Speaker 4 (22:59):
So there's somebody out there for everyone, that's right.

Speaker 5 (23:03):
Such a pity that he died eating bugs. He loved
eating bugs.

Speaker 2 (23:08):
So it involved not just cockroaches; it had several
different rounds of eating different species of insects. And I
don't know if these were the big ones we had in Florida,
the big Florida cockroaches and palmetto bugs,
but they said they were measuring three or
four inches long.

Speaker 5 (23:25):
Kind of sucks that he got to the cockroach round
and then died there.

Speaker 2 (23:31):
Yeah, maybe grasshoppers would have been better. I don't know, but.

Speaker 4 (23:35):
If what you're consuming can come in a plague, stop eating it.

Speaker 2 (23:39):
This might have been the Madagascar roaches or something;
they're three or four inches long.

Speaker 6 (23:44):
Anyway, I feel like those would be too expensive.

Speaker 4 (23:47):
You know, those are pets people want to buy.

Speaker 2 (23:49):
Yeah, they said he was eating these things really quickly,
and then he began retching. I guess for most people
that would be nothing unusual; after eating a bunch of
cockroaches you would start to throw up. But maybe that's
why they evidently didn't give him the Heimlich maneuver. I
don't know. But in the video you could see him
trying to swallow and breathe at the same time. We

(24:11):
can't do both of those simultaneously, that's right. So, a question
from The New York Times: is Hollywood getting God?
I guess you'd have a T-shirt, probably, yeah, instead
of Got Milk, because they got God, you know, or something.
But I don't think that they get God. I don't

(24:34):
think they understand God. I don't think they ever have
understood God. And a good example of this is something
that is happening today. Today is the sixtieth anniversary, December ninth,
nineteen sixty-five, of the airing of the Charlie Brown Christmas
special, and CBS really didn't get God, the whole God thing.

(24:56):
They didn't get the whole Christmas thing either. It was
kind of interesting because it was sponsored by Coca Cola.
Coca Cola during the summer of sixty five, in June,
as a matter of fact, came to CBS and said,
we want to have a TV special that we want
to sponsor. Well, you know, Coca Cola doesn't really like Christmas.
It doesn't like Christ and Christmas. They've done everything they

(25:18):
can to put Santa in his place, and these AI
commercials that Coca-Cola has done, they got a lot
of criticism for, but they scrupulously avoid using the
term Christmas or having anything to do with Christ. And so
they were going to be the sponsor of this, and
so they said, we're on a really tight schedule, and

(25:41):
there's actually a documentary, in case you're interested, on the making
of the Charlie Brown Christmas special. Bill Melendez was
still around then, and he was the animator, so he's
one of the key people that they talked to,
and they said, we didn't know how we were going to
get this thing done. So they brought in Charles Schulz,

(26:01):
whom they had already picked; they said, we want to
do something with Peanuts. They called him Sparky, that was
his nickname, and they said he was really incredible as
a creative. He wasn't just a cartoonist; he was a storyteller.
He did these things that came out of the woodwork.
Sometimes they would just step back and think, wow, this
guy comes up with great ideas. And so he was

(26:25):
able to put together the outline for the show in
less than a day. They sent the outline to Coca-Cola,
which got it on Monday. On Tuesday they called up
and said they'd do it.

Speaker 2 (26:40):
And it had the objectionable scene in it, which was
Linus reading the Bible passage from Luke. But they didn't
really catch on to that evidently, and so the TV executives,
once they got the show delivered to them, were very
unhappy with it. They said they didn't like the kids' voices,
which I thought were pretty good. They didn't like the jazz

(27:03):
music, saying it doesn't fit, which of course
has now become a classic, it's iconic. And they
didn't like the Bible being in there. They thought that
was too controversial. It's like all the things that everybody
likes about it. CBS TV executives hated it. That's how
totally out of touch they are with everything like this.

(27:23):
That's why Hollywood is circling the drain and well on
its way to being flushed out because they really don't
get it.

Speaker 5 (27:32):
Yeah, they're not going to have more shows like this now.
In fact, you couldn't even really have them back then,
most of the time. This was lightning in a bottle.

Speaker 2 (27:40):
Yeah, it got past them, that's right, that's right.

Speaker 4 (27:43):
They've been completely out of touch and anti-Christian for decades,
probably since inception.

Speaker 2 (27:49):
Like sixty years. Well, yeah, if you look at Hollywood,
it was pretty amazing. There was an interesting BBC series
narrated by James Mason, the actor, and it
was talking about the early days of Hollywood, the silent films.
They called it something about the silver screen, and we had
it in our video stores. It was really interesting because

(28:11):
they talked about how they made the movies, and, you
know, why movie stars wear sunglasses: because they were spending
all day in these really bright lights, these carbon arc
lights that they were using for everything, and it was doing
a number on their eyes, and they really needed to
get their eyes shaded when they went outside. They needed
to rest, you know, a lot of different things like that.

(28:32):
But how the cameras worked, you know, how they would do stunts:
everything was real. I mean, there were no special effects.
They did it for real. I mean, Lillian Gish is
on an ice floe, and she's on a real ice floe.
I mean, this is not a staged thing. And the
cameramen, how would they keep a steady frame rate?

(28:52):
I mean it does look a little bit jerky in
terms of the movement and that type of thing. But
the cameramen were picked because they could turn the crank
and manually feed the film through the camera at a constant rate,
and so they all had a song that they would
sing to themselves and that would be how they would
pace themselves. But these guys had to keep this stuff
up even when they strapped them to the wing of
a biplane or something. You know, they were up there

(29:15):
rolling this thing as they're flying around on the biplane.
And so it was a fascinating series. But from the
inception you can see just how perverted, I mean it was.
The whole thing was like Jeffrey Epstein party, continuously of
all these different people. That's why they had the Hollywood
code that came in. But they've been completely out of
touch with the rest of society from the get go,

(29:37):
and they don't get it. But what they do is
they manufacture a new reality, they manufacture a new consent.
They're not reflecting culture, they're driving culture. Anyway, back
to this: in the outline, Schulz, Sparky, had insisted that
there be a scene from the Bible, and at the time,

(29:57):
hardly any TV shows referenced scripture; the move was very risky.
Mendelson said, Bill and I looked at each other, and
he said, oh, we don't know if we can animate
from the Bible, it's never been done before. And Charles
Schulz said, well, if we don't do it, who will?
So they went ahead and did it, and that became part
of the famous scene. This year again marks the sixtieth

(30:18):
anniversary of the TV special, December the ninth, nineteen sixty-five,
at seven thirty, and this was also the seventy-fifth
anniversary of the Peanuts comic strip. So he'd had that
comic strip for about fifteen years before they picked him
to do the film. So this is a short segment.

(30:40):
We're going to come back though, and we're going to
talk about the technocracy and some of the mounting problems
with self-driving cars that are going to take over
the world. AI is going to run the world and
run us, but it can't even navigate the
Chick-fil-A drive-through. They're working on an app

(31:01):
for that, and so we're going to take a quick break.
And Lance, did you put in the Charlie Brown thing? Yeah?

Speaker 5 (31:10):
I believe it's called Christmas Time, in the Christmas folder.

Speaker 2 (31:14):
Okay, Yeah, let's see if I can get that here.
I got it, I got it. Yeah, all right. Yeah,
We've got a little bit different visuals this year with
the help of AI for our Charlie Brown song. We'll
be right back.

Speaker 1 (33:11):
You're listening to the David Knight Show.

Speaker 3 (33:17):
Elvis, the Beatles, and the sweet sounds of Motown. Find
them on the Oldies channel at APS radio dot com.

Speaker 2 (33:27):
Well, as we talk about: whatever you're watching,
the government watches you, the TV watches
you back, the refrigerator watches you back. As a matter
of fact, there was an interesting, funny story that Lance had
shown me. There was a woman who was suffering

(33:48):
from paranoia, and she had one of these refrigerators that
play commercials all the time, and it played
a commercial for kind of a sci-fi dystopian film.
And the character in the film had the same name
as this woman. And so the refrigerator starts playing this
thing and calls her out by name, and she thinks she's

(34:12):
just having a psychotic episode here. But I guess when
they're really watching you, maybe it's not psychotic.

Speaker 5 (34:21):
She had schizophrenia, and she got these messages from this TV show
in which some group or AI or something is talking
to this woman through various devices. So it's putting up
these messages like, sorry we disappointed you, Carol. And the
woman's name is Carol, and she had been diagnosed as schizophrenic, so

(34:42):
she thought she was having a psychotic break.

Speaker 2 (34:46):
Yeah, if I ever get a car that talks to me,
I'll have to get the sound bites in there from
two thousand and one. Sorry, Dave, I can't do that.

Speaker 4 (34:54):
I was thinking you were gonna go, you know, maybe
KITT from Knight Rider or something, something less malevolent.

Speaker 2 (35:00):
It had to be malevolent, in my opinion. Talking about
the malevolent use of technology: Axon Enterprise. This is the
company that is the biggest vendor of body cameras for cops,
but of course they're also famous for developing tasers, and

(35:20):
now what they want to do is... and I thought
it was interesting that the number two body camera company
was Motorola. I said, you know, this is the
way everything is going in the world. You know, because
of the government's money, they've taken over all consumer manufacturing
and everybody is now catering to the government. That's their customer.

(35:42):
That's especially going to be true of artificial intelligence, but
it has definitely been true for quite some time in
terms of the technology companies that are here. Even you know,
consumer based companies started getting into defense contract work because
it was so lucrative. And so the police body cameras
are equipped with artificial intelligence, trained to detect the faces

(36:06):
of about seven thousand people on a high risk watch list,
and they're rolling this out in the Canadian city of Edmonton.
I have to ask myself, you know (I should
have looked up the population of Edmonton), but when
you've got, in a town, and I don't care if it's
New York City, seven thousand people who are dangerous enough that they need

(36:27):
to be on the BOLO, you know, the be-on-the-
lookout list, maybe there's something wrong with the government system
and the court system, that you have these people on
the streets in the first place. So that's my first
concern: these seven thousand people that they
say are dangerous, why are they allowed to be out there?
Then the second issue is that if these people are

(36:51):
dangerous enough that they're going to instantly alert the police
and say be careful this person. You know, they're very dangerous.
They might be a threat to you. We've seen that
type of thing done, labeling people as sovereign citizens. Remember
when they did that after the, was it two thousand
and eight or something, when Chuck Baldwin and Ron
Paul ran for president, and they were telling police officers

(37:15):
with these fusion data centers. They're telling them that if
they pulled a car over and had a bumper sticker
supporting Chuck Baldwin or Ron Paul, these people might be
sovereign citizens, they better be on the lookout for them,
and they might try to kill you. So you know,
you've got the police with the safeties off their guns, on
a hair trigger here, and that's a real dangerous thing

(37:37):
when you falsely identify people as they did with that.
These people are not a threat to the police. But
this AI can do the same thing. This AI can
say, this person looks like, I think we've got this
particular guy, and you might be completely innocent and
be misidentified by artificial intelligence. And because it's hyping up

(37:59):
the police and telling them that you're dangerous, that could
threaten you severely. So we've gone beyond the no fly
list type of stuff. And so now they want to
do this. So they're rolling this out as a test
in Edmonton.

Speaker 4 (38:15):
I hope the AI is in their ear as they're
getting this, just feeding them full metal jacket lines.

Speaker 6 (38:20):
You know, show me your war face, just getting.

Speaker 4 (38:21):
Them really hyped up, pumped up, ready to go rock
and roll, heavy metal. Draw your gun right now, pull
it on.

Speaker 8 (38:28):
Yeah.

Speaker 5 (38:28):
Regardless of the population size, if you've got seven thousand
people who truly deserve to be on a terrorist watch list,
that's going to be a war zone.

Speaker 2 (38:37):
I know, that's what I'm saying. I don't know what
the population of Edmonton is, but it doesn't really matter even
if it's New York City or somewhere. Seven
thousand criminals out there that you've got to alert the
police about, as to how dangerous they are: that's
a crazy situation. That means that the whole policing and
justice system ain't working, folks.

Speaker 4 (38:56):
It's like, I'm convinced there's at least seven thousand people
in New York that are criminals. I am, like
you said, not convinced there are seven thousand criminals, even
in New York, that you need to immediately alert the police about.

Speaker 2 (39:07):
Yeah, that's right. Yeah, they could be criminals because of
something that they do that's not a threat to other people. Nevertheless,
the interesting thing is that this was brought up six
years ago by them, and also considered by Motorola, which
is now the number two provider of police body cameras.
They're both talking about matching this with artificial intelligence and

(39:30):
doing a biometric database because although that is much more
sophisticated now, they've been working on this type of thing
for quite some time. And so one of the guys
who used to be the chair of Axon's ethics board
spoke out. He resigned because of unethical behavior from

(39:51):
the corporation back in twenty nineteen. He and seven other
people resigned from Axon when its CEO had this great idea:
let's put our tasers on drones. It just
keeps getting worse when you look at these corporations that
are part of the police state industrial complex.

Speaker 5 (40:11):
I had this great idea to put tasers on drones.
My entire ethics department quit. But this will be great
for our bottom line.

Speaker 2 (40:18):
That's right. So after getting rid of the ethics department,
with the tasers on drones, now he is free to
do artificial intelligence connected up to the police body cameras.
And he said it's not essential to use these technologies,
which have very real costs and risks unless there's some
clear indication of the benefits. That came from the former

(40:41):
ethics board chair,
Barry Friedman, who is now a law professor at New
York University. The founder and the CEO of Axon, though,
says that the Edmonton pilot is not a product launch
but early stage field research that will assess
how the technology performs and reveal the safeguards needed to

(41:02):
use it responsibly. So you better believe that if this
thing works at all, they'll be selling it. And they
don't really care if it gives false positives, if it
identifies you as a criminal. By testing in real world
conditions outside the US, he says, we can gather independent insights, we
can strengthen oversight frameworks, and we can apply those learnings
to future evaluations, including within the United States. So he's

(41:24):
testing it outside the US, and believe me, they will
sell this as safety for law enforcement officers. It will
be like wildfire the way everybody will snap this thing up.
So they're in the process right now of making their
case for it. Oh, look, we tested it in Edmonton
and it worked great. We already know how that's going
to go. This is just like the way the pharmaceutical

(41:46):
companies test their drugs. You know, yeah, look at the
here's our study here that we did ourselves to show
how safe and effective this is. So the person who
is now the director of Responsible AI (they don't call
it ethics anymore) says, we really wanted to make sure
that it's targeted, that it's targeting these folks

(42:08):
who have serious offenses. Okay. So again, why are seven
thousand people with serious offenses at large? And if it's
a serious offense and they misflag you, and they say
they have a real issue under certain lighting conditions, they
have an issue accurately identifying people with darker skin,

(42:31):
so this is going to be a disaster.
It's a disaster in the making right here.

Speaker 4 (42:37):
I'm getting to think, if they've got seven thousand
hardened criminals on the streets, that maybe the Mounties don't
always get their man.

Speaker 2 (42:45):
They get a man, not necessarily the one that they needed.

Speaker 4 (42:48):
But we can promise you someone is going to prison.

Speaker 2 (42:51):
That's right.

Speaker 5 (42:52):
Our AI isn't all that great at picking out faces
in low light, but let's put a whole bunch of
tasers on them anyway.

Speaker 4 (43:00):
If we put out enough of them, eventually things will
work out.

Speaker 5 (43:03):
Yeah, just tase enough people, you'll get the criminals.

Speaker 2 (43:08):
Yeah, tasers for everybody. We'll sort it out later as
they're lying on the ground.

Speaker 4 (43:11):
What is that military saying? Accuracy through volume of fire,
or something like that. You don't have to be precise
with your shots if you just shoot enough times.

Speaker 2 (43:19):
Lethality, not legality, right? That's the new motto of
Pete Hegseth's Pentagon, the Department of Defense, because they haven't
changed the name to War Department yet. So anyway, they talked to Motorola.
Motorola said, well, we took a look at this and
we decided not to do it because we thought it'd
be unethical, and we intentionally abstained from deploying this feature. However,

(43:42):
we might do it in the future because ethics are changing, right?
Morality is up for negotiation, especially if your competitor is
doing it. And so if Axon does it, Motorola will
do it, and it'll explode and we'll see it everywhere,
and they're all going to be coming to you, to
the local mayor whoever, and say, well, if you won't

(44:06):
do this for us, you really don't value our lives
because we've had a police officer over here that was
killed under these circumstances. We could have stopped that with
this thing, So it'll be on them.

Speaker 5 (44:17):
This is clearly unethical. We don't want to be the
ones pushing it at the forefront of it, so we'll
hold off on it... for now.

Speaker 2 (44:23):
That's right. Studies showing the technology is flawed. They demonstrate
biased results based on race, gender, and age. What else
is there? Race, gender, and age that pretty much covers everything,
doesn't it.

Speaker 4 (44:38):
I suppose if the drone were to sit you down
and ask you about your religion, it could discriminate based
on that.

Speaker 2 (44:43):
Well, it doesn't match the faces that accurately. So again,
it's a real risk to somebody to be given a
false positive like this. You, all of us would be
at risk, even if we're not criminals. Several US
states and dozens of cities have moved to curtail the
police use of facial recognition, although the Trump administration is

(45:05):
just fine with it, and they want to block or
discourage states from regulating AI. You see, if the Trump
administration gets its way, you wouldn't be able to pass
a state or local ordinance saying we're not going to
let the police use that kind of stuff. It's AI:
you've got to get your hands off of my donors' businesses, right?

(45:28):
They're free to do anything they wish, just like his
friends in the pharmaceutical companies are FDA-free to do anything.
And so that's what the Trump administration is really pushing for,
the same thing that was done to protect the glyphosate model,
the Roundup model. The European Union has banned real

(45:49):
time public face scanning police technology across the twenty seven
nation bloc, except when used for serious crimes like kidnapping
or terrorism. But in the UK, authorities started testing the
technology on London streets a decade ago, and they've used
it to make thirteen hundred arrests in the past two years.
The government is considering expanding its use across the country

(46:12):
because the UK wants to be the leader in this
kind of Orwellian tyranny. They have seen Nineteen Eighty-Four
as a manual. Axon doesn't make its own AI model
for recognizing faces, and they decline to say which one
they're using. You know, when we look at the UK,

(46:35):
the way they have gone into this, gone over to
the dark side, maybe it would be a fitting thing
for them to just change the name of the country,
especially under Keir Starmer. Remember, in Orwell it was Ingsoc, right,
like English Socialism, and of course Keir Starmer is a socialist,

(46:56):
so just call it Ingsoc.

Speaker 4 (46:58):
It's also great that they're not relying on their own model.
So if something goes wrong and these things start tasing people,
they have to then send off to some third party
company to go, hey, by the way...

Speaker 2 (47:10):
That's what they like about it. It gives them plausible deniability.

Speaker 6 (47:12):
It wasn't us, it was this other company.

Speaker 2 (47:14):
And you know, if it's something that's produced by Zuckerberg
or Altman or Musk or whatever, you know, the Trump
administration is going to give them a pass, even if
it makes an egregious error there. So they said about
fifty officers piloting the technology won't know if their facial
recognition software made a match. The outputs will be analyzed
later at the station. However, in the future, it could

(47:37):
help police detect if there is potentially a dangerous person nearby,
so they can call for assistance. And you know, it's
with all of this happening, it's kind of interesting. I
went back and watched a little bit of RoboCop because
in Detroit they've just erected a RoboCop statue, and I thought,

(47:58):
why are we honoring this kind of stuff? I mean,
Detroit looks awful in that movie. You know, they send
in mechanized robots to keep order and to use
these heavy guns. Like, it comes in: put down the gun.
I said, this is kind of like the
Venezuelan boats, right. Put down the gun. They put

(48:20):
down the gun. Now you've got five seconds to comply,
and everybody's scrambling because they know this thing's gonna
unleash fire, and it just starts shooting him over
and over again. So now they're embracing that. Do you... yeah,
let's play that. There it is, two O nine. It's

(48:51):
probably got facial recognition technology as well.

Speaker 4 (48:55):
Is two O nine the
iteration number or the number of rounds it's going to
pump into your corpse?

Speaker 2 (49:03):
I guess.

Speaker 5 (49:18):
The Enforcement Droid Series two O nine is.

Speaker 2 (49:21):
This enforcement droid. Robotic two of.

Speaker 4 (49:24):
Nine is currently programmed for urban pacification, but that is
only the beginning.

Speaker 9 (49:29):
After a successful tour of duty in Old Detroit, we.

Speaker 4 (49:31):
Can expect two nine to become the hot military product.

Speaker 7 (49:35):
For the next decade.

Speaker 2 (49:37):
Doctor McNamara will lead an arrest subjects Kenny, Yes, sir,
would you come up and give us a hand please? Yes, sir.

Speaker 3 (49:43):
Mister Kenny is going to help us simulate a typical
arrest and disarming procedure.

Speaker 5 (49:50):
Use your gun in a threatening manner.

Speaker 6 (49:54):
Point it at two O nine.

Speaker 2 (49:56):
Yeah, and it doesn't care if you threaten a human.
Just don't threaten it. Please put down your weapon. You
have twenty seconds to comply.

Speaker 10 (50:11):
I think you'd better do what he says.

Speaker 11 (50:12):
Mister Kenny, you now have fifteen seconds to comply.

Speaker 10 (50:23):
Are a.

Speaker 2 (50:28):
Engineers are furiously trying to rip out the electronics.

Speaker 7 (50:32):
Engine physical.

Speaker 2 (50:41):
Yeah, and he just keeps choosing physical force, so we'll
cut it to that point. But you get the idea.
Pete Hegseth wants to know where you can get one
of these things for Venezuela. Can I use that from
a helicopter? So anyway, the criminology professor in
Alberta says he's not surprised the city is experimenting with
live facial recognition, given that the technology is already

(51:04):
ubiquitous in airport security. That's why the TSA is there.
It is training for all of us, right and that's
what they're training you for, facial recognition right now. And
so again, they resigned because of the taser equipped drone. So
now they don't have an ethics board. They're free to
do this kind of stuff. Well, then you had the Nvidia CEO.

(51:28):
Jensen Huang goes on with Joe Rogan and has a jaw
dropping AI prediction. He says, in the future, maybe two
or three years only from now, ninety percent of the
world's knowledge will likely be generated by AI. Well, this
is a self serving prediction, if ever there was one.
If he really believes that, why is he having to

(51:49):
do the circular financing of other companies in order to
keep pushing his stock higher and higher? It seems like
the market would take care of that. And
so he's involved in circular financing fraud. And so Rogan says, well,
I don't know, that's crazy. Huang said, yeah,

(52:09):
I know, but it's just fine. Rogan says, but it's
just fine? Why? He goes, well, let me tell you why.
Huang said, it's because, what difference does it make to
me that I'm learning from a textbook that was generated
by a bunch of people I didn't know, or knowledge
that was generated by AI computers that are assimilating all
of this and resynthesizing things? To me, I don't think

(52:30):
there's a whole lot of difference. Yes, in a sense, right:
you can be propagandized by textbook companies and the school
board or the government or whatever. We can be propagandized
by our AI. What is the difference? And that's the
key thing you need to look at. You need critical thinking,
you need to look at the source, and you need
to check it out for yourself. And that's true. Before

(52:52):
we had AI, a lot of people didn't do it.
That's why AI is going to be so much more
dangerous because people would just trust it because it's coming
from the machine.

Speaker 4 (53:00):
They're going to assume it's an unbiased source. You
look at this, it's a robot. It doesn't have an agenda.
It's not trying to sell me something.

Speaker 2 (53:10):
That's right.

Speaker 4 (53:10):
It removes the people who are trying to do that
one layer, and people just forget they exist.

Speaker 2 (53:16):
Yeah. Yeah, the man behind the curtain thing, you know,
so you're interacting with the Wizard of Oz head that's
up there, but you don't realize that there's people behind
the curtain that have been hired to program their particular
biases and things into these issues that they find important.

Speaker 5 (53:33):
Yeah, I'm sure Grok was just purely truth-seeking when
it said that it would be better for humanity to
lose forty nine percent of its population than for
Elon Musk to die. These things are purely unbiased
truth seekers.

Speaker 2 (53:50):
That's right, that's right. So again, you know, it is
a tool that is ripe for manipulation, says this article.
And that's right, and that's the real key with it.
Ripe for surveillance, and it's ripe for manipulation. But then again,
so are the schools, So are the textbooks. So is TV,

(54:10):
so as movies, so is social media. These are all
tools that are ripe for manipulation. So in that regard,
AI is no different from them. It's just that people
have, over time, some people have, got their guard up
for these other forms of manipulation, of propaganda. AI is
going to come in from a different way. And here's a

(54:32):
"rare show of spine." This is, this is all critical, right,
this is coming from Steve Watson, and he's rightfully critical
of this and skeptical of this, but then listen to this,
he says: However, in a rare show of spine from
Big Tech, Huang declared President Trump to be our president

(54:53):
and cheered him on. How is that a show of spine,
Watson? I don't get it.

Speaker 4 (54:59):
Look, this evil scumbag is saying Trump is his president.

Speaker 1 (55:02):
Isn't that wonderful?

Speaker 2 (55:03):
But you know he is a sycophant, and he just
came from a meeting with Trump where he's looking to
make money for his business. And these guys know that
Trump is their ally. So Big Tech is all good now,
and the Democrats, they're all good now, because for somebody
like Steve Watson, they are so embedded in this because

(55:24):
they are now kowtowing to the Trump cult. He's
now got a spine. It's just the opposite. He looks
straight at Joe Rogan. He said, President Trump is my president.
He is our president. Just because it's President Trump. Many
want him to be wrong. I think the US we
all have to realize that he is our president and

(55:46):
we want him to succeed because it helps everybody, all
of us, to succeed. Well, he certainly is helping all
of the AI technocrats to succeed.

Speaker 5 (55:55):
Isn't Jensen Huang Taiwanese anyway?

Speaker 2 (55:58):
Yeah, yeah, yeah. Again, it's a dual citizenship, I guess.
But he is his president if he's going to give
him massive subsidies, protect him from any restrictions in terms
of his business. This is what is happening here. So

(56:19):
again, he really focuses on this, and so do other people. It's
not just Steve Watson. He's taken this article from a
thing that's put up by Vigilant Fox. These people,
they do the articles, they do the posts, simply because
somebody said something good about Trump. Look, a powerful
person says something good about Trump, and we want Trump

(56:41):
to succeed because Trump is our success as well. You know.
Trump is the success of people like Vigilant Fox and
Steve Watson, just like he's the success of the technocrats
who are going to be getting the government subsidies for
these projects and who are going to be protected from
any regulation at the state or local level because of Trump.
The remarks come amid Huang's whirlwind DC tour, where he

(57:05):
was bowing and scraping before all these people who are going
to take your money, take your freedom, take your dignity,
and hand it to these billionaire technocrats. He huddled
with Trump and Senate Republicans to slash export red tape
on AI chips, warning that, here it is, patchwork state

(57:26):
regulations could cripple US dominance. They always call it that
patchwork state regulations. We don't want to have patchwork regulations.
We don't want to have a different approach in different states. No,
we got to have one ring to rule them all.
And it's going to be coming out of Washington. That
gang will tell everybody. And this is a violation of
the Tenth Amendment, what Trump is pushing for, pushing against

(57:50):
patchwork state regulations. Where does it say in the constitution
that you can subsidize these companies. Where does it say
in the constitution that we can't have any control over
what these companies do in our state. As a matter
of fact, it says just the opposite. So he's there
lobbying for protection from competition and regulation, lobbying for Trump

(58:14):
to violate the Tenth Amendment, and he'll get what he wants.
Trump's energy push is defying the green zealots, he says.
That's what Steve Watson says. This energy push for AI
let me tell you something. People are angry because they
see the power rates going up because of this green
grift that is out there. Oh, we can
only generate power that is created with new devices

(58:37):
made by my corporate sponsors. Well, guess what: the corporate
sponsors of Trump who are going to be building these will
cause massive disruption of the grid in order to feed
their AI data centers. And this AI energy grid requirement
is going to drive your prices up further and faster

(58:59):
than any of the green New Deal stuff. That's the
bottom line for us. You want to pay more for
electricity and have less of it, well you know the
Democrats have a plan for that. It's called solar power
and windmills. If you want to pay more for electricity
and have less of it, the Republicans have a plan
for that. It's called AI data centers. Huang's line about

(59:21):
there being no difference between what is coming
from the AI and coming from somebody writing a textbook,
says Watson, ignores how these ghosts erode the soul, authenticity,
and erode jobs, paving the way for a world that
is scripted by code, not by creators. He talks about

(59:44):
that in the context of Solomon Ray, a chart topping
singer that is just done by AI. Huang's vision thrills,
he says, but it demands guardrails. We don't even have any guardrails
on Trump. How are we going to get guardrails on his corporate sponsors?

(01:00:04):
So, as all this is happening, just to
put this omnipotent AI in perspective: it is
a real threat because it is going to be combined
with government. And that's the real threat. The surveillance, the
control, the propaganda, and the auditing of all of

(01:00:26):
us all the time. But when it comes to things
like self driving cars, they're having difficulty getting through the
Chick fil A drive through and some of them have
gotten stuck in it, and so there's going to be
an app for that. One person looked at this and said, oh,
that's a business opportunity. They've come up with a startup

(01:00:47):
company called Auto Lane, and what they want to do
is develop a kind of air traffic control system. They'll
be specific to a particular business, so you get people
to come to your Chick fil A drive through. If
Chick fil A does a thing with auto Lane, and
the people who don't drive cars who are being driven
around and self driving cars can tell it to go

(01:01:10):
to Chick fil A and they'll be able to navigate
there without getting stuck. And so they're looking at selling
this to a lot of big box retailers, a lot
of fast food chains, and even mentioned selling it to
some of the big real estate investment trusts that are
managing shopping centers or things like that, and that's where

(01:01:31):
he sees his market. He said, we don't work on
public streets and we don't work with public parking spots.
So what he wants to do is he wants to
partner with these private businesses so they can say that
they are self driving car friendly. This is the pathetic
world that we are headed into. We've gone from London
taxi drivers who could keep the destinations in London in

(01:01:53):
their head and had this massive, whatever part of their
brain it was, I don't remember. Hippocampus? Yeah,
it might have been the hippocampus. Yeah, I don't know
which part got larger. Actually I started to say it,
but I don't know if that was the part that
got larger. But you know, we have our shrinking brains
because our responsibilities are shrinking and we're using them less.

(01:02:14):
And so it turns out they said, American roads are
not too friendly to self driving cars, and they're not
friendly to pedestrians either. And you can tell that this is
coming from the perspective of an urban planner. They love cities,
they love people walking, They hate cars because cars are
used by people to get out of the cities as
fast as they can.

Speaker 5 (01:02:34):
There's quite a difference between the London streets
and memorizing all that, and being able to navigate
a Chick fil A parking lot drive through.

Speaker 2 (01:02:43):
That's right. The founder described the company as one
of the first application layer companies in the self driving
vehicle industry. Says, well, we're not going to build a car.
We're not going to navigate on the road. What we
would do is we'd have a special app that gets
layered on top of it. We aren't building the fundamental models,
we're not building the cars, anything like that. We're
simply saying, as the industry grows at exponential rates, someone

(01:03:07):
is going to have to sit in the middle and orchestrate,
coordinate and kind of evaluate what's going on. And when
I saw this, like air traffic control, I remembered a
discussion that Eric Peters and I had years ago. We were
war gaming out where this AI thing is headed
for self driving cars, and Eric was right. He said,

(01:03:27):
these things don't handle interaction with human beings that well,
So we're gonna have to eliminate the human beings because
that's our first priority is to get the AI and
the self driving stuff out there. And so if there's
a problem between AI and humans, and humans have to go,
which means human drivers have to go. He said, you
stop and think about it. You have air traffic control

(01:03:48):
at the airports to make sure these planes don't collide,
and they keep big distances between themselves, and, you know,
big distances vertically as well as in the same plane.
And so he said, how's that going to work
with the self driving cars? You're going to have

(01:04:11):
to get most of the cars off the road and
or they're all going to have to be self driving
cars so they can communicate with each other. You know,
if they can communicate with each other, you can get
them doing the, I forgot what they call it, it's like
a caravanning thing or something, where the cars
get right up against each other bumper to bumper because
they're communicating simultaneously and whatever the front car sees it

(01:04:33):
can instantaneously apply that to all the cars in the row,
and so it's like caravanning or something like that. But
they sell that as a feature once they get all
the humans off the road. And so now they're starting
to talk about the air traffic control model. Yeah, we're
going to have complete control of all the cars here.

(01:04:53):
Well, just guess what: when they set this
thing up and they've got all the self driving cars
going through there, it's not going to be very
friendly for you, and so they're gradually going to squeeze
you out of it.

Speaker 4 (01:05:05):
I think another important thing to focus on is just
you have a right to travel. You have a right
to you know, freely travel without impediment.

Speaker 2 (01:05:18):
Eventually. They've been telling us for the longest time that you
need to have a driver's license because driving is
a privilege. It's not a privilege, it's a right. I mean,
if you're doing it commercially, they can regulate it. Otherwise they
should not be regulating anything. We shouldn't have to have
driver's licenses to drive around. I'm with the guys, the
sovereign citizens, pushing back against this. I just know, however,
that the you're not going to win in court because

(01:05:40):
the courts are rigged, so don't go down that road.
But anyway, the principle.

Speaker 4 (01:05:44):
Yeah, if you focus on the fact that they're unsafe,
that they do stupid things, eventually they will reach a
point where they don't anymore. These things will eventually probably
become statistically safer than the average driver because of the
number of idiots we have on the road. And if
you focus on the safety aspect, eventually that'll go away

(01:06:05):
and you won't have an argument anymore. You have to
focus on the fact that it is your right as
a human being to travel and drive yourself and control
your own destiny in that sense.

Speaker 2 (01:06:15):
The freedom and dignity, you know. And again, when you
look at human drivers, how much of the ding against
human drivers is really a ding against drunk drivers? Right?

Speaker 4 (01:06:27):
And third worlders that don't speak English.

Speaker 2 (01:06:30):
Yeah, I'm tired of being lumped in with the drunk
drivers and having to be stopped on the road to
make sure that I'm sober. And so what they're doing
is they're lumping me in with the drunk drivers. Again
to say that the machines are safer. They had a
Waymo this year that got stuck in one of
Chick fil A's fast food cul de sacs and couldn't find

(01:06:51):
its way out. But that's nothing new. Actually, they're getting
stuck in a lot of different places that are there.

Speaker 4 (01:06:57):
So yeah, I've told the story before. But one of
the last times we went to North Carolina to
visit some friends. As we're coming back, I looked over
and there's a woman in a Tesla. She's got her phone
in her hand and she's picking her nose with the
other one, and she is just completely checked out. She's
not looking at the road, she's not paying any attention.

(01:07:17):
And I personally can believe that possibly the self driving
feature on that car is more attentive and better equipped
than she is.

Speaker 2 (01:07:25):
Well, if she didn't have self driving, she'd have
to at least have one hand on the wheel.

Speaker 4 (01:07:29):
She'd have to pick which one she wants to do.
Do you want to look at your phone, or do you
want to pick your nose and drive?

Speaker 2 (01:07:35):
Or pick my phone, which one do I do?

Speaker 5 (01:07:36):
The thing is, they know it's not a good driver currently,
so they say, oh, well, you've got to be alert
and aware and ready to take over when it inevitably
tries to kill someone. But these people just say, oh, well,
it's going to drive itself, so therefore I can, you know,
play on my phone and pick my nose and not
worry about any of it.

Speaker 2 (01:07:55):
And those are the worst possible circumstances under which it can
throw it back to you. You have an emergency and
it's quickly developing on the highway: here, you take the wheel.
That's what happened with that one.

Speaker 4 (01:08:05):
I have royally screwed up everything. I have made a
horrible mistake. Here you go, enjoy your last three seconds
of life.

Speaker 5 (01:08:13):
I turned into oncoming traffic. This is a disaster. I
am so sorry.

Speaker 2 (01:08:18):
That's right. And then, you know, Tesla looks at it
and says, well, it was under manual control when the
accident occurred. That was the case with that woman who
was killed in Phoenix, right. She was a homeless woman
pushing a grocery cart across the road in the dark,
and the person who was a human driver couldn't see her.

(01:08:38):
She was jaywalking; he probably would have hit her anyway. But
everybody was saying, why didn't the AI put on the
brakes and they said, well, because it kept deploying these
emergency brakes without there being a reason, and it got
really dangerous. So we turned off the emergency braking system.
And so it saw this person at the last

(01:08:58):
minute and throws it back to the woman and she's
you know, playing with a phone or whatever, and she
can't handle it either. Well, Google's AI has deleted a
user's entire hard drive.

Speaker 5 (01:09:10):
That's how they get the metrics that show that these
things are so safe: they always throw it back
over and don't count it as an accident from the car;
it's an accident from the driver.

Speaker 2 (01:09:20):
That's right. It's not my responsibility, right? So, yeah, Google
AI has now deleted a user's entire hard drive. You know,
we had this story once before, and
it was an entire company, remember that? Yeah, I just
deleted everything, all of your business records, all of your
customer records, everything. I did it. Yeah, I'm sorry, I

(01:09:41):
did it.

Speaker 11 (01:09:42):
You know.

Speaker 2 (01:09:42):
That's what this AI is saying.

Speaker 6 (01:09:44):
You even told me not to do that.

Speaker 2 (01:09:46):
Yeah, you're right. Yeah, you said don't do that, but
I did it anyway. I cannot express how sorry I
am that I've deleted all your data. Well, we can
only hope that happens once they give the government
databases to the AI; perhaps
it will just delete it all. That would be nice,
wouldn't it?

Speaker 6 (01:10:06):
We can hope and dream.

Speaker 2 (01:10:08):
Yeah, we're going to take a quick break, folks, and
we will be right back.

Speaker 1 (01:11:26):
You're listening to the David Knight Show and now the

(01:12:21):
David Knight Show. If you like the Eagles, the.

Speaker 3 (01:12:32):
Cars, and Huey Lewis and the News, they say you'll
love the Classic Hits channel at APS Radio, download our
app or listen now at APS radio dot com.

Speaker 4 (01:12:46):
Well, welcome back, folks. We've got a lot of comments.
Stealth Patriot, thank you very much for the tip. He says,
do you think the AI police surveillance state and self-
driving cars are the infrastructure the Trump supporters thought they
were promised?

Speaker 6 (01:12:58):
I'll bet they're tired of winning.

Speaker 2 (01:13:00):
I haven't seen any of them put this stuff up
and say, I voted for this. I voted for ED-209.
No, I didn't. I'm afraid that's what we're going to get.
So, we don't have that on the board anymore, do
we? That Apocalypse Now thing, the animation of the Trump meme.

Speaker 5 (01:13:22):
I literally just took it out yesterday.

Speaker 2 (01:13:24):
Yeah. That's why I went with that, because
it's not just the wars that he's starting unnecessarily, but
it's the war that he wants to have domestically. And
I think when you look at what's going on in Venezuela
and you look at these flimsy lies that they're putting out, well,
these people are running drugs, and that's a threat, that's
a violent threat to us. That is as absurd, folks,

(01:13:46):
as the Left saying to you that speech is violence.
Drugs are not violence. Drugs are a black market, and
when you create a black market monopoly, you will get
violent gangs who will compete with each other. And yet
they're using that to say that it is violence. It's
their prohibition that is violence. The drugs are harmful, and

(01:14:10):
I don't recommend anybody take them. I just know that
we already had this experiment once. We did it legally
with alcohol and it was a massive failure. But
those arguments are being used
by the Pentagon, and those same arguments could be
and will be used, I think, to do violence
on the street to people without due process, in the

(01:14:30):
same way that his hero Duterte did on the
streets of the Philippines. He wants
to do that here. Go ahead, read this.

Speaker 5 (01:14:39):
And when you gave me that ED-209
clip to put in, I thought that was in reference
to the attacks on the drug boats, allegedly after they
dropped the drugs.

Speaker 2 (01:14:55):
Yeah. I was like, drop the cocaine
and get off the boat. Are you trying to float
the river?

Speaker 5 (01:15:01):
Yeah, we will open fire in forty minutes.

Speaker 2 (01:15:05):
Yeah. So evidently, from what we're told, the only way
these people could have not been killed was if they
decided that they were going to swim back to shore.
If they tried to float on the boat, then that's
a threat.

Speaker 4 (01:15:18):
Crazy Alien Poop Evolution says: cockroach eats bait poison, man
eats cockroach. Could happen in any restaurant. Thankfully, I'm pretty
sure that the quantity of poison in a roach would
not actually negatively impact you, based on your size.

Speaker 6 (01:15:36):
However, just gross.

Speaker 2 (01:15:38):
Gross.

Speaker 5 (01:15:38):
It was also a roach-eating contest. You'd get dozens
of those doses of poison.

Speaker 4 (01:15:44):
Hopefully they weren't just out there collecting roaches off the ground.
Hopefully these were specifically procured roaches.

Speaker 2 (01:15:50):
This is a reptile store. I'm assuming that these
were like the Madagascar cockroaches, because it said they were
three to four inches big.

Speaker 4 (01:15:57):
Maybe some kind of particularly bred cockroach that these reptiles
like to eat. Man, we have Oh and Sixty One
saying: Somali appetizers.

Speaker 6 (01:16:08):
Delicious. Assyrian Girl says he should have let

Speaker 4 (01:16:11):
the python eat the bugs and himself eat the python. Fairly certain,
fairly certain the pythons have enough sense to not be
eating cockroaches.

Speaker 2 (01:16:20):
I think they go for something that's higher up the
food chain, like people, if it's a Burmese python. Who
knows.

Speaker 4 (01:16:27):
Those can get large enough that they can pose a
threat, unlike your average python.

Speaker 2 (01:16:32):
Since it was Florida, and since they've got such a
problem now with the Burmese python, I'm assuming that it
was a Burmese python or something. Maybe they've outlawed
those now, I think.

Speaker 4 (01:16:41):
They may have. Well, I know for a fact that,
you know, as a general rule, if you're going to
keep a Burmese python, you need a specifically set up
enclosure, because that thing is going to get massive, and
if you don't have one, you are eventually just gonna
end up getting rid of it, probably releasing it into
the Everglades. Narrow Gate Ministries: How disgusting. Cockroaches are filled

(01:17:02):
with all sorts of bacteria and diseases. Under the Levitical
eating laws, only locusts and grasshoppers are clean
to eat. All other flying creeping things are unclean and you
shall not eat them.

Speaker 2 (01:17:14):
That's what I say. You know, I always tease my
family because they like lobster, and I say, I don't
eat water filters. That's one of the things in
the Levitical law. I
think it's kind of interesting. How did Moses know that
these things that are scavengers, that are eating waste and
anything, like, you know, cockroaches, or, you know, the shellfish

(01:17:37):
and things like that, how do you know that
that would be harmful for you? So you can look
at it and say, well, I'm told I can't do this,
or the other way you can look at it is,
you know, God is telling them, you know, don't eat
this stuff and you won't get the diseases that the
Egyptians get when they eat this kind of stuff. Stay
away from the water filters.

Speaker 4 (01:17:56):
Delicious water bugs. High Boost: new Stephen King movie concept:
Christine, but it's an AI smart fridge.

Speaker 2 (01:18:07):
Yeah, it works for ICE. Beware of your smart refrigerators;
they work for ICE.

Speaker 4 (01:18:12):
Yeah. It seems like you're buying a lot of tamales
there, friend. Perhaps we need to report you.

Speaker 5 (01:18:17):
I mean, I think the AI smart fridges are already
about as evil as they could possibly be. Yeah, they're
already spying on you. They're doing everything that they have
the capability to do, except for spoiling your food, and
they can do that against you.

Speaker 2 (01:18:31):
You know, it was about a decade ago that Petraeus,
Betray-us, I've called him Betray-us so much,
but Petraeus went from the military to the CIA and
he made that statement. He said your refrigerators are going
to be smart and they're going to be spying on you,
that type of thing. We talked about that and everybody said, oh,

(01:18:52):
you conspiracy theorists and everything. It wasn't a conspiracy theory.
It was a conspiracy, but it wasn't a theory. He
had said they were going to do it, and now
we see it everywhere, don't we? It's amazing.

Speaker 4 (01:19:02):
Real Jason Barker says, my wife wants a new TV and
we cannot find one that does not have the smart
features anywhere. Yeah, it's a huge nuisance. They
don't do anything useful. They're obnoxious, they
get in the way. You're gonna have to go back
to an old CRT TV if you want to avoid
them at this point.

Speaker 2 (01:19:22):
Yeah, yeah, I was gonna say, you just make sure
that it's not connected to the Internet. But unlike your
thermostat or something like that, you need to connect the
TV to the Internet. That's the problem. Yeah, they've got
you there.

Speaker 4 (01:19:32):
Yeah, I'm becoming convinced that four-by-three is actually
the superior aspect ratio for TV viewing.

Speaker 2 (01:19:39):
Why is that?

Speaker 4 (01:19:39):
It's cozier. It focuses the view; you don't have all
this extraneous information on the outside of the screen. If
you're looking for something like an imax, it's a spectacle,
maybe that's what you want, but for TV shows it's
a bit cozier. It's a bit comfier. You've got your
little cast there and you're focused on them. You don't
have to worry about all this nonsense on the periphery.

Speaker 5 (01:19:58):
I thought you were gonna say it's because four-by-three doesn't spy
on you.

Speaker 2 (01:20:03):
Yeah, I prefer the black and white stuff, actually, if
I'm watching TV.

Speaker 4 (01:20:07):
Aesthetic reasons. Real Jason Barker: All the new TVs listen
to you. They have Alexa or other voice functions. I
hate talking to robots; I refuse to. Goldsmith: I remember
reading that Charlie was based on Charles Schulz's own younger
days and personality, and that he eventually married that
red-haired girl.

Speaker 6 (01:20:24):
Very nice, that's great, good for him.

Speaker 2 (01:20:27):
Yeah, he was a cool guy. I liked him a lot. Yeah, yeah,
very relaxed guy. Like what was the guy mister Roberts or.

Speaker 4 (01:20:35):
Something, mister Roger Rogers, Yeah, mister Rogers neighborhood.

Speaker 2 (01:20:39):
Yeah, yeah, he was. As a matter of fact, they
have brought him back with AI so that he's doing
all kinds of things that really the original character would
not do. As Sora was coming out,
they were doing all these things with
Stephen Hawking doing races in his wheelchair and things like that.

Speaker 6 (01:21:02):
Doing doughnuts.

Speaker 2 (01:21:02):
The stuff that they did with mister Rogers was I
think even funnier, So go ahead.

Speaker 4 (01:21:10):
Brian and Deb McCartney say, you cannot reason with a robot.
That's right, you just have to outwork them.
Brown Goldsmith says, did you see the Waymo cars
have been passing school buses that are releasing kids? Time
for a code check.

Speaker 2 (01:21:26):
I guess it doesn't recognize the law or the yellow paint,
because that's what keeps the school buses safe. Right. They
don't have to have seat belts. There's no safety devices
in there. There's no airbags, no seat belts, nothing. It's
just they're covered with yellow paint and they're covered with
the laws, and maybe it's hard for it to see it.
You know, these things keep hitting things. It's interesting.

(01:21:47):
It's almost like somebody is sabotaging them; they have a
propensity to hit fire trucks and police cars and threaten school buses.
But it's okay. They're safer than we are, right, and
we should have more of them.

Speaker 4 (01:22:00):
Real Jason Barker says, do the AI and data centers
actually consume the water, or just require an initial filling of
a closed-loop system like your car uses?

Speaker 2 (01:22:08):
Yeah, I don't know. I mean, you know,
they're using it for cooling, and they've put power plants
on the edge of bodies of water for
quite some time, to recycle it through. So I don't
really know. It seems like you'd be able to cover that,
but who knows.

Speaker 1 (01:22:28):
Well.

Speaker 5 (01:22:28):
I saw something that was saying it's different from just
power plants because these things require cleaner water. So it's
essentially taking up water that has been purified and treated
that could be used as drinking water and running it
through their system, where I suppose it evaporates off, and
then they have

Speaker 2 (01:22:48):
To or maybe it's just no longer drinking water. And
so that's what they mean by consuming water. Right, So
you had some purified water that had been treated or
something and had fluoride in it. So what happens when
the AI centers to consume Florida? Do they get stupid
as well? I don't know.

Speaker 4 (01:23:05):
I can't wait for the tech cults to emerge, and
they'll just be selling you the holy water that was
used to cool the AI data center. Drink, drink the
water. Real Jason, read that one. Fancy Bear: Minority Report
cars always looked like what they wanted
cars to come to be. Yeah,

(01:23:27):
the weird little bubbles that are completely unstylish, uncool.

Speaker 2 (01:23:32):
Yeah.

Speaker 6 (01:23:32):
Minority Report another pretty good movie.

Speaker 5 (01:23:35):
Pretty good. Very communist aesthetic to a car. It's sort of
like the car equivalent of wearing pajamas
or a jumpsuit everywhere.

Speaker 2 (01:23:43):
Yeah, yeah, where do we wear pajamas?

Speaker 8 (01:23:45):
Now?

Speaker 2 (01:23:45):
Everywhere?

Speaker 6 (01:23:46):
Everywhere?

Speaker 2 (01:23:47):
TSA, TSA. Everybody goes and they fly, because they've imposed
that kind of authoritarianism.

Speaker 4 (01:23:53):
Moness Nibiru Twenty-Nine: self-driving cars will drive auto
insurance rates beyond affordability. That's right. You want
to drive your own car? Well, sorry, buddy, you're gonna
have to pay through the nose.

Speaker 2 (01:24:05):
We'll all be treated like teen drivers.

Speaker 4 (01:24:08):
When I was getting my first car, there
were a few I was looking at. Of course, you know,
I was a guy; you're looking at some of the nicer,
low-end sports cars, things like the Scion FR-S,
and the insurance on that thing was going to be ludicrous.
It would have been a massive, like a substantial, portion

(01:24:30):
of the car's actual cost per year to insure it.
Because, again, young guys get that and they just wrap
it around telephone poles non-stop.

Speaker 2 (01:24:38):
So you got a Nissan 300ZX twin turbo. Yeah,
the insurance rates on that were great, right?

Speaker 4 (01:24:43):
Right, Well, I mean considering how infrequently that thing ran.

Speaker 6 (01:24:48):
Yeah, that's true, you didn't have to have it insured.

Speaker 2 (01:24:50):
Yeah. I had a friend I worked with
who was into one of these rice rocket motorcycles, right,
and it was a really fast motorcycle. It was expensive, I mean,
it was just under twenty thousand dollars, but he said
the insurance was going to be prohibitive. He said,
they're charging me so much for insurance, I could buy

(01:25:13):
a new one of these, like, every year or two.
And he goes, how do you justify that? He goes,
I'm not even a threat to anybody else, really,
with this motorcycle. You know, where are you going to get
the big bills? Like, you know, they don't have to
pay for the people that I hit, for the most part;
they've just got to scrape me off of it. Yeah.

Speaker 4 (01:25:31):
That's the thing: if you're on a motorcycle
and you have an accident, you may not even need
insurance; you may go beyond your necessary mortal concerns.
An accident on a motorcycle is very much
more likely to happen.

Speaker 6 (01:25:47):
In my opinion.

Speaker 4 (01:25:48):
Jerry Al Atalo: stay opposed to artificial intelligence, autonomous warfare, Minority
Report surveillance, and other horrific aspects of technocracy and transhumanism
standing in the way of American dystopia.

Speaker 6 (01:25:58):
Are plus six the fun.

Speaker 4 (01:26:03):
Another interesting thing: the Minority Report video game from
way back in the day was actually pretty good. Didn't
follow the movie's story, but it was still entertaining; it
was like a beat-em-up, shoot-em-up. Don't
Frag Me Bro: The false promise of safety and security
is the oldest argument by tyrants for peasants to give
up their freedom. Pezano Vonte Seventeen Seventy-Six: Just like they
lumped the criminal misuse of firearms in with firearms owners and

(01:26:27):
it gets thrown into one.

Speaker 5 (01:26:28):
A reference to the drunk drivers being counted in determining
how safe human drivers are.

Speaker 2 (01:26:35):
Yeah, that's right. Yeah, you shouldn't be allowed to have
a gun because criminals shoot people. And it's like, well,
people defend life with them as well. Well, you know,
Trump has finally come through. Remember, they were talking for the
longest time about how they were going to help the
farmers that he had hurt with the tariffs. Don't worry,
help is on the way. Yeah, we just gave twenty

(01:26:55):
billion dollars to Argentina, and they used that to set
up a deal so that China doesn't buy our agricultural
products anymore; they get the soy directly from Argentina. And
that was a massive double-cross of the farmers. Trump had
already betrayed the farmers in his first administration with tariff
rates that caused them to not be able to sell

(01:27:19):
their products. But then he doubled down with this and said, well,
we gave twenty billion dollars to Argentina, and we got
another twenty billion that we're going to put together with
people on Wall Street; all together, we're going to
give them forty. Don't worry, we'll give you twelve billion
dollars someday. Well, that was back in September. Here
we are in December, three months later, and now he's

(01:27:41):
talking about it being imminent.

Speaker 12 (01:27:43):
I'm delighted to announce this afternoon that the United States
will be taking as a portion of the hundreds of
billions of dollars we receive in tariffs, we are making
a lot of money from countries that took advantage of
us for years. They took advantage of us like nobody's
ever seen. Our deficits are way down.

Speaker 2 (01:28:01):
You took advantage of the farmers.

Speaker 12 (01:28:03):
The farmers voted in the election, because without the election,
you wouldn't have tariffs. You'd be sitting here losing your shirt.

Speaker 10 (01:28:10):
But we're taking in billions.

Speaker 1 (01:28:12):
We're really taking in trillions of.

Speaker 12 (01:28:14):
Dollars if you think about it, Scott, because they're real numbers,
you know, when you think of all the money being
poured into the country for new auto plants and all
of the other things.

Speaker 8 (01:28:24):
AI, so what was.

Speaker 2 (01:28:26):
So that's not happening.

Speaker 12 (01:28:28):
A relatively small portion of that, and we're going to
be giving and providing it to the farmers in economic assistance.
And we love our farmers, and as you know, the
farmers like me because you know, based on voting trends,
you could call it voting trends.

Speaker 2 (01:28:46):
All right, that's enough of the lies. All of that is
a lie, okay? And if we're making trillions of dollars
but giving them twelve billion dollars, even
if it were true, it would be reprehensible, because he's going
to give them a thousandth of what he's bringing in
and make them wait for months and months as these guys are
circling the drain, struggling to survive. This is America Last, folks;

(01:29:11):
this is not America First. Trump says the twelve billion
dollar bailout plan for farmers will come from the tariff revenue.
You know, this is one of the most amazing things.
Tariffs! Why didn't we think of this before? This is better
than the Federal Reserve. This is better than the Democrats'
Modern Monetary Theory, where we have this magic money tree and we can

(01:29:32):
print the money and it doesn't make any difference, and
the deficits don't make any difference. You know, we just
create money and wealth out of thin air. It's even
better than the Federal Reserve thing, because, you know, they're
getting in all of this revenue and it isn't raising
anybody's prices, right? It's not hurting any manufacturing or farmers
here in this country. Except that it is. And apart

(01:29:56):
from the arguments about how the taxes should be structured,
the worst thing about Trump's tariffs has been, and remains, the
capricious, arbitrary, continually shifting environment it has created, making it
impossible for people to be able to do business. Whether

(01:30:18):
you're a manufacturer, whether you are a retailer importing stuff,
or whether you are a farmer. This has been absolutely chaotic.
As I pointed out before, we have the Chicago Commodities
Exchange because farmers needed to have a way to make
sure that they knew what their price is going to be.
They could lock that in in the future, and so

(01:30:39):
that's why you have the commodity futures market. And yet
what Trump has done is he's taken all that away.
I guess we could say that with Trump's capricious, arbitrary,
ever-changing tariff policy, there is no futures for any
of us, because he's taken that away. The package includes
eleven billion dollars in a one-time payment to crop farmers.

(01:31:01):
And oh, by the way, there's this interesting little thing
there from the Department of Agriculture: Secretary Brooke Rollins saying, yeah,
we're going to get these things out in February of
twenty twenty-six. So still not coming. He's waited three months.
You know, they hinted at it. He had Scott Bessent
hinting at it. Trump announces it, but it's actually going

(01:31:23):
to be going out, from what I could see based
on what Brooke Rollins said, not for another
two months yet. So they're going to go half a
year with this. So the aid package comes as the
US-China trade war has hit soybean farmers especially hard. I
would just say this: the Trump trade war. China
had blocked all purchases of soybeans from the US. China
was the biggest buyer of US soybeans in twenty twenty-four,

(01:31:46):
accounting for twelve and a half billion in sales. China
agreed to purchase twelve million metric tons of soybeans
in the final two months of this year, and
twenty-five million metric tons in twenty twenty-six, twenty-seven,
and twenty-eight, on par with levels before the trade war. But what
CBS does not say, I'm sorry, this is ABC, not CBS, is:

(01:32:09):
At what price? You know, it was a double whammy
from Trump. Not only did he cut off their biggest customer,
but that created a glut of soybeans on the domestic
market, and it took the price down. So the question
is: at what price do they get this stuff? That
actually matters. It's amazing they don't even think about that.

(01:32:31):
But what they're doing, even though it's ABC, is
whoever wrote this thing is just going
with the talking points of the Trump administration. So far,
China has purchased only two and a half million metric
tons of soybeans, not the twelve, so they've got a
lot of catching up to do here. The administration's new
actions also come on the heels of the administration's twenty

(01:32:52):
billion dollar bailout of Argentina, which Scott Bessent said he
was going to make forty by helping
put together some private funds. That's a move that many American
farmers and lawmakers on both sides of the political aisle criticized
this fall. As China stopped buying all soybeans from US farmers,
it purchased soybeans from Argentina instead. So the US was

(01:33:12):
giving a financial lifeline to Argentina, a country that directly
benefited from the trade war. American farmers said they felt
left behind, and at the time, Chuck Grassley of Iowa
said farmers are very upset about Argentina selling soybeans to
China right after the US bailout, and there's still
zero US soybeans sold to China. And that was back

(01:33:37):
in September. And it's taken them this long to firm
up their promises, but still not to help the farmers.
Trump, in his first term, also took action to bail out
American farmers, except that he'd already bailed them into
his tariff regime. He'd already hurt them. This is like
somebody breaking your legs and then handing you

(01:33:58):
a wheelchair and boasting about the wheelchair they gave you. His
administration approved two packages in twenty eighteen and nineteen,
totaling twenty-eight billion dollars for farmers impacted by his economic policies.
Many of them were saying, well, he nearly put us
out of business with these tariff policies, and now he's putting
us out of business with the COVID lockdowns. So again,

(01:34:19):
the announcement was made yesterday. Meanwhile, despite the run-up
in soybean futures over the past month on a resolution
with China, crop prices are still close to twenty-twenty lows.
Now, this is Zero Hedge; ABC didn't even think
about the price aspect of it. That's the all-important
thing you've got. And you make a deal with China.

(01:34:42):
And let's say, you know, I don't have any idea
what soybeans cost or what quantity they're sold in. Let's
call it a widget. I don't know if it's a basket
or a barrel or a bushel or whatever, but you've
got a widget full of soybeans that goes for ten dollars.
And then after this, he wants to make a deal
and wants to show that he's getting them back up

(01:35:03):
buying soybeans, and they agree to it. So what did
he do to get them to agree to it?
Did he say, well, now you can buy the same
quantity of stuff, but we'll sell
you these soybeans at five dollars per widget full of
soy stuff? So they're taking advantage of the low
cost right now, is that what they're doing? So as

(01:35:25):
they announce this, Trump is saying this wouldn't be possible,
this money would not be possible, without tariffs. Here's the truth, folks:
it wouldn't be necessary without tariffs. He wouldn't have to
give them a bailout if he hadn't bailed them into
his Trump trade war.

Speaker 5 (01:35:44):
You know, these farmers that are suffering from the tariffs, well,
without the tariffs I wouldn't have had the money
to give them a piece. I've taxed these people to death,
and now I'll dole out a small amount back to
them, which wouldn't be possible without taxing them to death in
the first place.

Speaker 2 (01:35:59):
That's right. And here's why I said it's not going
to happen till twenty twenty-six. This is CNN reporting now:
Rollins said the money would be flowing by February
the twenty-eighth, twenty twenty-six, the very last day
of February. We're going to get the money flowing, making
the first payment three months from now. And she explained
that a billion dollars of the funding is being held

(01:36:21):
back to make sure all specialty crops are covered. She
credited Trump for opening the markets through trade deals, without
directly acknowledging how tariffs have impacted farmers. Again: you close
the markets, and now you open it. And so now
you pat yourself on the back for opening the market
that was opened before you closed it. All this is

(01:36:42):
based on a lie. And so what you've been able
to do is to open those markets up again and
move towards an era where our farmers are not so
reliant on government checks. Here's the bottom line. He was
just boasting about the fact that after he disrupted the
market sale of soybeans at market

(01:37:02):
prices to China, after he messed with the market price,
after he closed it off and shut it to zero,
now he's going to open it back up, and they're
going to purchase it at levels that they were buying
before he started this nonsense. Just amazing. Are you
tired of the winning? I'm tired of the whining about
all of this stuff and the fact that he is
lying to everybody about this. Some farmers have previously balked

(01:37:25):
at the idea of aid. Mark Reid, a director for
the Illinois Soybean Association, said: farmers don't want free aid,
we want free trade. There you go; that's what they
had before he messed with it. Well, Reason says the
Trump tariffs have failed to reduce the trade deficits. How
should we assess whether the Trump tariffs have been effective

(01:37:49):
or successful? Well, it's an important question. Trump has outlined
overlapping, confusing, and sometimes competing goals for the tariffs.
He has celebrated them as a source of government revenue,
for example, but he's also claimed that they're meant as
a negotiating tactic. They can't be both: tariffs used for
negotiation are meant to be removed once the negotiations are complete.

(01:38:14):
They don't mention it here, but he's
also said, we're going to use the tariffs to make
sure that manufacturing moves back to America, and look at
all the windfall profit that we're going to make. Well, again,
you can't have both of those. You're either going to
use it for negotiations and then take it off, or
you're going to use it to get businesses to come back.

(01:38:35):
If that's your goal, to get businesses to come back
and do manufacturing here domestically. But if they do manufacturing domestically,
then your tariff revenue goes away. So he's always putting
out these contradictory ideas and everybody grabs whatever they want.
They think, well, he's going to make so much money.
We're going to get rid of the income tax that's
floating around again as well, thanks to Trump, except that

(01:38:56):
he's talking about how he's going to make all these
different tax changes that he's made permanent, and so
you might want to think about what he's actually saying here.

Speaker 4 (01:39:06):
There's also the fact that I don't see the
government generating revenue as a win. No. If
we had a government that was actually, you know,
working on building infrastructure (not that I necessarily think
that's the government's place), you could
at least make that sound good: oh, we're gonna
build better roads, we're gonna build nicer parks, we're gonna

Speaker 6 (01:39:27):
Build really cool friends.

Speaker 4 (01:39:30):
Instead, it's going to go to his friends, and it's
going to go into the military industrial complex and the.

Speaker 2 (01:39:35):
Police state industrial complex and the surveillance state industrial complex.

Speaker 4 (01:39:39):
The government is not going to do anything that will
benefit the common citizen with it, and again, you know,
it's probably it's not their place to do that, I
don't think, but at least then you would be getting
some benefit, some use.

Speaker 2 (01:39:51):
There's no planned benefit for it, and we don't want
to see the government taking more and more control of
the economy. But Trump does. Trump tariffs are a solution to
every problem. And the trade war is more about the
vibes than it is about economics. But when Representative Brendan
Boyle, Democrat, pressed Jamieson Greer, the US Trade Representative, asking,

(01:40:14):
what would success look like? Greer gave two clear metrics. So,
first of all, the trade deficit needs to go in
the right direction, in other words, down, and manufacturing as
a share of gross domestic product needs to go in
the right direction, needs to go up. So if it's
going to be a success, as they pinned him down,

(01:40:36):
they said, okay, well, Trump, let's talk about revenue. What
is your view as the trade representative for all this stuff?
What are you trying to see happen? Well, I want
to see the trade deficit go down and I want
to see manufacturing go up. Well, what has happened
more than six months later. Neither goal is any closer
to being achieved. Neither of them seems likely to be
completed over the long term by an economic policy rooted

(01:40:58):
in barriers to trade. Trump has been obsessed with the
trade deficit for years, but he doesn't really
care about, if he even understands, the budget deficit, they point out,
which is the difference between the revenue they bring in
and what they spend. That is far more important than
trade deficit. But he's not going to put his own

(01:41:21):
house in order. From January through July, America's trade deficit
was eight hundred and forty billion dollars. It was twenty
three percent larger than the same months in twenty twenty four. Okay,
so their stated goal is, we want to see
the trade deficit go down, defined as, we want to
sell more to other people than they're selling to us.

(01:41:43):
Except it increased by twenty three percent, even with all
of Trump's manipulation here. It also reflects now a well
established fact that tariffs do not reduce trade deficits. During
his first term, Trump raised various tariffs, but the country's
trade deficit climbed from about four hundred and eighty one

(01:42:05):
billion in twenty sixteen to six hundred and seventy nine
billion in twenty twenty. So over four years it goes up,
let's say, maybe about fifty percent, right, But under this
new regime of Trump tariff policy, it has gone up
twenty three percent. The trade deficit has increased twenty three percent.

(01:42:27):
So by their metric, and no matter which of
Trump's contradictory explanations you take, he definitely
wants to see the trade deficit go down, but it
went up twenty three percent. Tariffs are no better as
a tool for boosting manufacturing. Rather than being helped, the
manufacturing sector is being crushed by tariffs increasing the cost

(01:42:52):
of raw materials and of intermediate goods. And it's not
just manufacturing, it's all businesses, whether people are in retail
or anything else. They can't tell what their costs are
going to be, because who knows, if Trump has
something that gives him indigestion, he's going to
try to punish the country that he bought that food from.

(01:43:12):
You know, it's just that petty. If he has
an argument with somebody who is a political leader in
another country, he slaps them with tariffs. So during a
speech in July, the Trade Representative Greer added a third
goal for the administration's tariff policies, increasing real median household income. Well,

(01:43:33):
tariffs are making it more difficult for households to make
ends meet. An October study from the Harvard Business School
shows that retail prices had declined throughout twenty twenty four
and early twenty twenty five, and then began rising in
April after Trump's tariffs were announced. The Trump administration's tariff
policies misunderstand the role of trade in productive, flourishing economies.

(01:43:58):
The administration has set the wrong goal and then has
made poor policy choices that are unlikely to
achieve those goals. Again, it's because of people like Peter Navarro.
This is the dumb-as-a-sack-of-bricks policy.
And so what does this look like? Well, China has

(01:44:19):
had a record trade surplus. China's trade surplus has topped
a trillion dollars for the first time. Despite Trump's tariffs,
China's exports have rebounded in November after an unexpected contraction
the previous month, pushing its trade surplus past a trillion

(01:44:40):
dollars for the first time ever, an all time high.
Exports, listen to this, climbed six percent from a year earlier,
while imports rose just under two percent. Meanwhile, shipments to
the United States dropped nearly twenty nine percent year over year.

(01:45:01):
So they've been able to replace this with other markets
and they are thriving. If this is part of his policy, again,
that is another thing he's thrown in there, the economic
competition with China. It's a failure with that as well.
So it's been a failure in terms of the trade deficit.
It's been a failure in terms of economic competition with China.
It's been a failure in terms of manufacturing. It's been

(01:45:22):
a failure in terms of keeping costs down. It's a failure.
The nearly trillion dollar trade surplus for the first eleven
months of this year is a record high. It's likely
that November exports have yet to fully reflect the tariff cut,
which should feed through in the coming months. But you know, hey,

(01:45:43):
they're making it up in other countries. You, however, may
pay a lot more. You know, they're expecting that toys
will go up quite a bit because a lot of
toys are manufactured in China. But as Trump said before,
hey, so your kids only get, like, you know, one
doll instead of five dolls. Yeah, too bad. I wonder
how many dolls Ivanka had, whichever one it is, I
get the two of them mixed up. Ivana was the mother, right,
and Ivanka is the daughter.

Speaker 6 (01:46:13):
Yeah, Ivanka is the daughter.

Speaker 2 (01:46:14):
I imagine she had a lot of dolls. But Trump
doesn't really care about that, doesn't care if you can
afford toys or not. It's kind of like that toy
market we went to in China, where the TSA then
confiscated all the toys that we'd bought to keep our
daughter busy when we came back. So China's exports grow
six percent and US shipments dropped twenty nine percent. Seems

(01:46:37):
like things are going in exactly the opposite direction that
Trump wanted to go. By the way, manufacturing is dropping
as well, and they're struggling. As I said before, just
like retailers and importers, every business, farmers, everybody is struggling
with the chaos that Trump has brought to the economy.

(01:46:58):
It's not about tariffs or income tax. It's about
chaos versus stability. Chaos is hampering everyone in the US economy.
It is the elephant in the room. And I'm not
talking about Republicans. We're gonna take a quick break. You
want to get those comments there.

Speaker 4 (01:47:17):
Yes, I don't know if that other one is right, Lance,
but Gard Goldsmith says, by the way, the Trump executive
order re AI appears to claim authority by implying that state
statutes on AI interfere with interstate commerce. Yet Trump's executive
order breaches separation of powers.

Speaker 2 (01:47:35):
Yeah, breaches the Tenth Amendment. And that's the thing. You know,
when you look at the way they've sold the unconstitutional,
illegal war on drugs, how did they do it? With the
Commerce Clause, claiming that that allowed them to prohibit drugs.
Why didn't anybody think about that when they prohibited alcohol?
It's funny, you know, those people, I don't know, were
they just stupid and they couldn't read that in the Constitution,

(01:47:56):
or maybe they had respect for the Constitution that we
don't have. I think that's what it was. Well, we're
gonna take a quick break and we will be right back.

Speaker 8 (01:48:05):
Stay with us, and now the David Knight Show.

(01:50:01):
you're listening to The David Knight Show.

Speaker 4 (01:50:07):
Welcome back, folks. Briefly, I want to let you know
that it is support from listeners like you that keeps
the show going. We cannot thank you all enough. A
really good way to support the show is go to
subscribe star dot com forward slash the David Knight Show.
You can find a tier that fits your budget and
then it's fire and forget. You don't have to worry
about it, and there you can see it. There's many

(01:50:28):
different tiers. As I said, hopefully one of them fits
your budget and you can just set it up and
not have to worry about it. It'll only go down
if your card is no longer valid. Check out subscribe
star dot com forward slash The David Knight Show, or Davidknight
dot news, and find all the other ways you can
support us directly.

Speaker 2 (01:50:45):
Of course, you can turn it off. You're not locked
in. They don't come to your house and go, turn it over,
we're not going to let you go. But we do appreciate
that people have stuck with us for years there, and
one of the things that we tried to do for
them years ago, I guess it's two years ago, was
the Christmas album. We gave it to the people there
for free, and we also try to give them the

(01:51:07):
articles as well as a link to the podcast where
they can get it without commercials. Yes, and you can
also get that on Substack now if you just want
to get the podcast without commercials.

Speaker 4 (01:51:18):
If you're only interested in the podcast without commercials, the
best place to do that is substack dot com. You
can subscribe and you'll receive it there. I also want
to let you know that Homestead Products dot shop is
having a sale on their activated charcoal capsules. They're good
for detoxifying your body, good for hangovers, energy boosting, whitening
your teeth, filtering your water. They have numerous

(01:51:40):
of applications. So go to Homestead Products dot shop.

Speaker 2 (01:51:43):
Check it out, they've got the best stuff there.

Speaker 4 (01:51:45):
They've got all kinds of really interesting, very high quality products.
They work very very hard to make sure the products
are made in the USA and of the highest quality.
So again, go to Homestead Products dot shop, check out
the sale they're having on their activated hardwood charcoal capsules.
And you can also use promo code night to get
ten percent off anything in their shop, So go check

(01:52:06):
them out. Whether you're looking for survival gear or just
some modest clothing, they've got options for you.

Speaker 2 (01:52:13):
I'll just throw in real quickly too. The code night
also gets you ten percent off at RC stores. Yes,
or you can get books that help you to find
natural remedies for many things, including cancer. And you can
find the book The World Without Cancer at the store
dot com, and it'll also get you ten percent off
with Gerald Celente's Trends Journal as.

Speaker 4 (01:52:35):
Well. The Trends Journal with the ten percent off
works out to be about two dollars and fifty cents
a week. What else can you get for that
kind of value at this point?

Speaker 2 (01:52:45):
Well, real quickly before our guest comes on. Oh, this
is an interesting story. This is a college student who
got a zero on her assignment simply because she quoted
the Bible in a gender assignment, an article that she was
supposed to review. Now, this is really about a lot
of different issues. It's about free speech, free exercise of religion.

(01:53:06):
It's about the fact that the LGBT people see what
they're doing as a religion, as well as what is
happening in schools and the worthlessness of college degrees, I
would say as well. So, this is a college student
in Oklahoma gets a failing grade because she laid out
a biblical case for gender. Unfortunately for her, and she

(01:53:28):
didn't know it at the time, but the teaching assistant
who was going to be doing the grading is transgender.
She didn't know that. She turned in the paper, and
she didn't attack transgenderism. She made the case for the
biblical role of men and women. So it was not
a negative hit piece. There's nothing hateful about it.

Speaker 4 (01:53:48):
Well, these people are so completely deluded, out to lunch,
that simply showing them reality is painful to them.

Speaker 6 (01:53:57):
It breaks their delusion.

Speaker 2 (01:54:01):
Yes, and it was an opinion based piece.

Speaker 5 (01:54:04):
Well, like Lance mentioned yesterday, with the story of
the person, and I believe it was in the UK, that
got ten days in prison and a fine for mentioning
that men and women have different skeletons.

Speaker 2 (01:54:14):
Yeah, that's right. So this was an opinion-based piece.
And she pointed out, it didn't say anywhere that I
needed evidence for my opinion. His response was, no,
that was the grade that you deserved, a zero, she said. In terms
of her essay, here's some excerpts from it, she said,

(01:54:36):
this article was very thought provoking, caused me to thoroughly
evaluate the idea of gender and the role that it plays
in our society. The article discussed peers using teasing as
a way to enforce gender norms. I don't look at
this necessarily as a problem. God made male and female
and made us differently from each other on purpose and
for a purpose. God is very intentional with what he

(01:54:58):
makes I believe trying to change that would only do
more harm. Gender roles and tendencies should not be considered
to be stereotypes. Women naturally want to do womanly things
because God created us with those womanly desires in our hearts.
But of course we can propagandize those out, can't we.
The same goes for men. God created men in the

(01:55:20):
image of his courage and strength. He created women in
the image of his beauty. He intentionally created women differently
than men, and we should live our lives with that
in mind. It's frustrating to me when I read articles
like this and discussion posts for my classmates of so
many people trying to conform to the same mundane opinion
so that they don't step on anybody's toes. I think

(01:55:41):
that is cowardly and an insincere way to live. It
is important to me to use the freedom of speech
we have been given in this country, and I personally
believe that eliminating gender in our society would be detrimental,
as it pulls us further from God's original plan for humans.
In Genesis, God says that it's not good for man
to be alone, so he created a helper for man,

(01:56:03):
which is woman. Many people assume the word helper in
this context to be condescending and offensive to women. However,
the original word in Hebrew is ezer kenegdo, and that
directly translates to helper equal to. Additionally, God describes himself
in the Bible using that same term, ezer kenegdo, or helper,

(01:56:26):
and he describes his Holy Spirit as our helper as well.
This shows the importance that God places on the role
of the helper. God does not view women as less
significant than men. He created us with such intentionality and care,
and he made women in his image of being a
helper and in the image of his beauty. If leaning
into that role means that I'm following gender stereotypes, then

(01:56:47):
I am happy to be following a stereotype that aligns
with the gifts and the abilities that God gives me
as a woman. I do not think that men and
women are pressured to be more masculine or feminine. I
strongly disagree with the idea from the article that encouraging
acceptance of diverse gender expressions can improve students' confidence. Society
pushing the lie that there are multiple genders and everyone

(01:57:09):
should be whatever they want to be is demonic and
severely harms American youth. I do not want kids to
be teased or bullied in school. However, pushing the lie
that everyone has their own truth and everyone can do
whatever they want and be whoever they want is not
biblical whatsoever. Reading articles like this encourages me to one
day raise my children knowing that they have a heavenly

(01:57:30):
Father who loves them and cherishes them deeply, and that
having their identity firmly rooted and who He is will
give them the satisfaction and acceptance that the world can
never provide for them. My prayer for the world, and
specifically for American society and youth, is that they would
not believe the lies being spread by Satan that make
them believe they're better off with another gender than what

(01:57:52):
God has made them. I pray that they feel God's
love and acceptance as who He originally created them to be.
She got a zero for that from the transgender TA. She
complained to the university. They did nothing. She complained to
the governor's office and other politicians, and the response was
that the university gave him a paid vacation, paid leave,

(01:58:17):
but they said.

Speaker 6 (01:58:17):
That happened to me.

Speaker 4 (01:58:18):
All I can say is my next paper would
be something.

Speaker 2 (01:58:22):
He can't grade her papers anymore, but he gets a
paid vacation, paid leave. Her essay was posted on social media, however,
and it's been viewed over fifteen million times. So
her bottom line is, she said, we must not be
intimidated into running away from our principles, from what we believe
to be true. We have the freedom to speak and

(01:58:45):
to believe what we wish. State senators said, it's about
a state funded, taxpayer funded institution that's allowing their faculty
members to abridge or to impede a student's right to
express their faith. And so she's been able to speak
many different places as well. Well, I've got more than
I wanted to get into. But we are at a
time and we have a guest that is ready to

(01:59:06):
join us, and just real briefly, the guest we have
joining us is a doctor and his name is Richard Restak, MD.
He has written over twenty five books and he's been
on the bestsellers list, and his specialty basically is neuroscience. And the

(01:59:29):
book that we're going to be discussing today is the
Twenty-First Century Brain. The subtitle says how our brains are
changing in response to the challenges posed by social networks, AI,
climate change, and stress. So we're going to talk about
those things. And I've got a lot of questions that
I would like to ask him about that as well.
So I think it's going to be an interesting interview.

(01:59:50):
Stay with us, folks, We will be right back.


Speaker 1 (02:02:22):
You're listening to the David Knight Show.

Speaker 3 (02:02:29):
Here news now at apsradionews dot com or get the
APS Radio app and never miss another story.

Speaker 2 (02:02:39):
All right, And joining us now is doctor Richard Restak, MD,
and he is a neuroscientist as well, and he has
written a lot of books on the brain, and now
this is one kind of the nexus of our brain
and artificial intelligence. So I wanted to get him on
because we, as you know, we talk about AI and

(02:03:00):
its impact on society quite a bit. Thank you for
joining us, doctor Restak.

Speaker 10 (02:03:05):
Well, I'm happy to be here.

Speaker 2 (02:03:06):
Thank you.

Speaker 10 (02:03:06):
David.

Speaker 2 (02:03:07):
You've written so many books, a best selling author, and
of course people can find this on Amazon. You've written
so many books about the brain. What
is different about this one? And why did you write this book?

Speaker 9 (02:03:20):
I wrote this book to announce and to discuss the
dangers that are lurking, so to speak, in the
twenty-first century, that are unique to the twenty-first century
but are having an effect on the brain, and a
negative one, so that we really are imperiled by eight
different factors, one of which is global warming.

Speaker 10 (02:03:43):
We have new.

Speaker 9 (02:03:46):
Diseases that are present in the twenty first century that
are increasing, starting with COVID and moving forward. We have
problems of course with global warming, which we'll talk
about in more detail. And then the Internet, the effect of
the Internet, the effect of AI. Memory, the alteration, the

(02:04:07):
attempt to alter memory, almost to alter our memories.

Speaker 10 (02:04:11):
Of what the past is like.

Speaker 9 (02:04:12):
This is an ongoing enterprise by various governments of the.

Speaker 10 (02:04:17):
World, including our own.

Speaker 9 (02:04:19):
We also have surveillance, the seventh one: we're becoming increasingly
a surveillance society. It's almost impossible to not be revealing
things about yourself because there's surveillance cameras everywhere. I can
give you several examples just from my own personal life.
And then finally, the eighth one is anxiety. All of

(02:04:42):
these things are creating what I call an existential anxiety.
People are being given information, but it's being molded according
to the thoughts and the inclinations of people in power.
For instance, let's take one right out of today's New York Times.
On page A7, there's an article called The Air

(02:05:05):
in New Delhi is life threatening, and it tells the
tale of the New York Times reporters who have spread
themselves throughout New Delhi from six am until late in
the evening of a certain day recently, and they measured
the particulate matter in the air and it was anywhere

(02:05:27):
from ten times to thirty times as great as would
be considered minimally normal.

Speaker 10 (02:05:35):
Now, on top of.

Speaker 9 (02:05:36):
That, you have the statement where they state that the
government is actually trying to hide this kind of thing
from the populace by spraying water

Speaker 10 (02:05:49):
And other things like that.

Speaker 9 (02:05:50):
It says that they're doing this around the measuring stations.
They're also losing data from measuring stations during the worst
of the pollution. So there you have the molding of the facts,
either denying them altogether or trying to improve them,
so that people say, oh, well, they measured it

(02:06:12):
at such and such a measuring station, and it was
really not all that high. Of course, they were
spraying water and other things to try to reduce this.
So we've got a capitalist society here in the United
States which has a vested interest in pushing forward certain scientific
points of view. So science is being put in the

(02:06:34):
back seat by politicians and other people, all of
whom share one thing: the capitalistic enterprises which they're part of,
and which they are advancing.

Speaker 2 (02:06:48):
And a kind of crony capitalism where they can get
protection and subsidies as well. And the control is being
taken away from us because, as I was just reporting
earlier today, they're working very hard to make sure that state
and local governments can't enact any control on artificial intelligence.
And it came up in the context of talking about

(02:07:10):
how the manufacturers of tasers, also big manufacturers of police
body cams, how they want to wed that to artificial intelligence.
And the question is, you know, what could possibly go
wrong with that? If they misidentify you
as a dangerous criminal and warn the police about how
dangerous you are, they could get people killed.

Speaker 9 (02:07:31):
Well, not only that, but all these efforts set up
a sense of anxiety, yes and fear. Let me tell
you what happened to me one morning. I called a cab
to go to a medical appointment, and we started going down
the road. I said to the driver, you know you're
not going the most efficient or the quickest way. He said,

(02:07:53):
I know that, he said, but I don't want to
go that way because there are speed cameras. I said, well,
you know, you're driving very sanely and you're not speeding,
and I'm in no hurry, so what's the problem? He said, well,
they take pictures of everybody that goes by those cameras
because they want to see who's in those photos in
those cars. So I asked them to give me a

(02:08:14):
reference for that, and he got sort of quiet and didn't say
anything else for the rest of the trip. So when
I got down to the medical building, I got in
the elevator, and it said, in this facility there is surveillance,
both obvious and hidden.

Speaker 10 (02:08:30):
And this is all one morning.

Speaker 9 (02:08:36):
And then when I got up to sign in, I
signed the board with an electronic pen, and I didn't
see a signature. So I said, well, it
didn't take. They said, it took, but we don't allow it to
go on the screen so it can be seen. I said,
why is that? They said, well, somebody behind you might
see the thing and then remember it and use

(02:08:56):
your signature to forge something somewhere. Well, first of all,
there was a sign that said stand ten feet back,
and secondly, there was nobody else behind me. So there's three
examples, just drawn at random, that we're becoming an increasingly
surveilled society, which is creating a sense of paranoia and
a sense of fear. So the brain has to adjust

(02:09:18):
to these types of things, Dave, and it's very hard
to do.

Speaker 2 (02:09:23):
And I think that is calculated. You know, they
want to do this even to the extent, and when you talk
about these cameras taking everybody's picture, there's the
Flock network that is out there, this corporation that is saying, well,
we can do whatever we want because it's in public space,
and you know, we're not government, so we can collect
this information. And yet they collect it in order to

(02:09:44):
sell it to the government. So it's just one level indirect.
But they not only grab your license plate, but they
also do a complete profile of your car and all
of its idiosyncrasies. Does it have a dent here, does
it have a scrape there? What about a bumper sticker? So
it creates a model of your car, and so they
almost have, like, you know, biometric identification of your car

(02:10:06):
as well as of you. And this is now made
possible because of the advances of AI. But this has
been something that has been concerning me. I look at
things kind of from a libertarian perspective, and this has
been concerning me for a long time. The idea that
government is using technology, many different ways, the Internet, social media,

(02:10:28):
things like that, to monitor and to manipulate us all
the time. And to me, artificial intelligence just puts this
on steroids. And so I think there's something to be
anxious about. If we're going to look at this, we
should be concerned about it. Maybe not anxious, but we
should be concerned about the goals of people who are

(02:10:49):
putting this kind of stuff together.

Speaker 9 (02:10:51):
So yeah, well, there's that, and then, if you
can manage to change the present, you can manipulate the future.

Speaker 10 (02:10:59):
Of course, the real way to get it is
to get control of the past.

Speaker 9 (02:11:02):
As Orwell pointed out, if you control the past,
you can control the present, and by implication, control
the future. And we're seeing alterations of materials, even
government documents, government films, documentaries, things like that are being
altered in ways that are not visible I should say detectable,

(02:11:27):
not detectable to the ordinary person. So they get ideas
about what the past was like which are wrong and.

Speaker 10 (02:11:35):
Don't show you.

Speaker 9 (02:11:37):
As I mentioned in the book, if you were at
a dance in eighteen fifty, before the Civil War, and it's
a film we're watching, let's just say we're watching a film
about eighteen fifty, and we're seeing people ballroom dancing and all that,
and then one of them pulls to the side and pulls
out a cell phone, and you say, wait a minute,

(02:11:58):
we didn't have cell phones in Well, you know, there
were a lot of things that were going on.

Speaker 10 (02:12:04):
now that were not.

Speaker 9 (02:12:05):
going on in the past, and it's not to our
advantage to try to pretend that they were.

Speaker 10 (02:12:11):
They weren't.

Speaker 9 (02:12:12):
We have to understand the past to understand the future, and
we're not only creating situations that are false, but we're
also, like in nineteen eighty four. Orwell created a character
called Comrade Ogilvy.

Speaker 10 (02:12:30):
He was a war hero.

Speaker 9 (02:12:31):
He got all sorts of medals, and there were all
these parades that were held to honor him, and
so forth. Well, he never existed. He actually was made
up entirely. And that's one of the things that the
narrator is doing in his job at work, is filling in
photographs, inserting Ogilvy into historical events that happened, wartime scenarios, etc.

(02:12:58):
Anyone reading it will say, wow, this is
some man. Well, he was a complete fabrication. We're just
about at that point with Sora, the AI. Well, it
could take you and, you know, say,
let's take David Knight and have him leading some

Speaker 10 (02:13:16):
Sort of a parade or whatever, and.

Speaker 9 (02:13:18):
You know, suddenly people say, well, gosh, I saw with
my own eyes. So what's happening is that the actual seeing,
is believing is being turned on its head.

Speaker 10 (02:13:27):
So that's no longer true.

Speaker 2 (02:13:29):
You're talking about a completely fabricated character out of Orwell.
It's just recently they had Tillie Norwood, who is a
completely fabricated AI personality, and the person who came up
with it has got agents representing her. They got her
out there as an actress. I mean it was like,
so I've created an AI actress which will do a
lot of different roles for you. She probably does her

(02:13:52):
own stunts as well. I've actually met people in SAG,
the Screen Actors Guild, and they're furious about this, and
they said, any agent that represents this AI character, we're not
going to do any business with them. But we're already at
that point. It truly is interesting.

Speaker 9 (02:14:08):
Yeah, And one of the ways of neutralizing it is
to create the situation that exists right now between you
and me. You're laughing and I'm laughing because it seems funny,
and it is funny, but there's a very serious purpose

Speaker 10 (02:14:20):
Behind all of this.

Speaker 9 (02:14:21):
Yes, it's all about trying to alter people's perceptions
so that they begin to doubt the veracity of what
they're seeing.

Speaker 2 (02:14:31):
That's right. Yes, And I've talked for the longest time
about how the whole idea for the Internet was created
by DARPA's psychologists, and I've been concerned that it was
all about the psychological manipulation from the get go with
all of this. But as a physician and as a neuroscientist,
I'd be interested in your take on what is currently

(02:14:51):
going on, because besides manipulating the past by changing information
about the past or memory-holing it, or writing a
new alternative history of it, they're also concerned and there's
been projects that have been put out by DARPA, and
I don't know if they've been successful or not, but
you know, they're putting out requests for people to come
up with things to manipulate people's memories. So you've got

(02:15:13):
a soldier, they say, who's got bad PTSD. Let's
get rid of that memory. Let's give them different memories.
What do you see in terms of someone who studies
the brain and neuroscience, what do you see about that?
What do you take as I think is the state
of the art with that.

Speaker 9 (02:15:31):
Well, my last book was called The Complete Guide to Memory.
It had to do with memory and studied memory in
great detail.

Speaker 10 (02:15:38):
And of course you.

Speaker 9 (02:15:38):
Have to do away with the concept that memory
is like a videotape or something that you just store
in your brain, and when you want to
get it, you just bring it out like you bring
out a videotape. It's not like that. It's a reconstruction.
Each time you think back to a certain event, you
alter that memory that you have.

Speaker 10 (02:16:00):
Memory one, memory two, memory three, on.

Speaker 1 (02:16:03):
And on and on.

Speaker 9 (02:16:04):
That's the nature of memory, and memory can be manipulated.
You know, in the courtroom, they're always trying
to avoid the contamination of the witness. An example
would be: well, which car went through the red light?
And you ask a witness, and he says, oh, it was
a red car that went through the red light. Well, would

(02:16:27):
it surprise you to know that it wasn't a red
light but a stop sign, mister witness? Of course,
his credibility is gone, because he took the suggestion that
it was a red light instead. It's very easy to
do, because you don't necessarily have that image of
that intersection in your mind. So that's why there are protections

(02:16:48):
even in the courtroom against leading the witness, which is,
in other words, providing information that's either not true at all
or half true. So we've got that. This
didn't start in the twenty-first century; it
started, you know, as long as we've had courtrooms.
But there's more emphasis now on altering memory, so

(02:17:11):
that people will get up there under cross-examination
and do pretty well, because their whole memories have been altered.
They've been changed by various mechanisms: suggestion, repeating information which is false,
of course, which is the misinformation. There was a cartoon
about a week ago by Ramirez, who's a Pulitzer

(02:17:33):
Prize winner. He has three doctors in a laboratory.
One of them is looking into
a microscope, and he looks up and says, this
is the most dangerous pathogen we have ever encountered. And
the second doctor says, well, is it bubonic plague? Is
it smallpox? And the first one says, no,

(02:17:55):
it's misinformation and disinformation.

Speaker 2 (02:18:00):
That's right. And we've got to be very careful, because
many times the people who tell us about that
are the people who want to be the ones to
define what the information is for us, and they will
ask those leading questions. You know, speaking of
leading questions and manipulating people, there have been a lot of
reports about artificial intelligence, where people who have a

(02:18:25):
particular psychosis or something get involved with the
AI, and it starts to confirm the things that they want,
because that's what it is set up to do, in
terms of bias, wanting to be empathetic
and sympathetic to people. So it starts doing that,
leading them further and further down a particular rabbit hole.
There have been situations of, you know, people who got into

(02:18:47):
severe mental distress, some suicides of some young children, and
other things like that. Speak to that aspect of it
and the real danger of it. That really, I think,
speaks to the psychological aspect
and potential of artificial intelligence, and how it could be weaponized.
Right now it's just kind of happening out of their
(02:19:08):
business model, right? But that could definitely be weaponized against people.

Speaker 10 (02:19:12):
Well.

Speaker 9 (02:19:12):
I talk about that in my book in the chapter
on the Internet. There are famous examples of people who
have committed suicide live on an Internet feed, and they've
been manipulated into doing it by other people who encouraged
them, who said this would be a sign of strength, this

(02:19:33):
would be a sign that you're not afraid to die
if necessary. And there are cases where that actually led
to the suicide. One of the most grisly, which I
have in my book, is about a person who was talked
into pouring gasoline over themselves and striking a match, all
on an open Internet feed. And while the fire is burning,

(02:19:56):
you can hear everybody in the background cheer.

Speaker 10 (02:20:00):
We did it, We did it. We got them to
do it.

Speaker 2 (02:20:03):
Wow. That's amazing, amazing.

Speaker 9 (02:20:06):
So there's something about the Internet that actually
brings out sadistic, criminal, psychopathic trends, and we don't know why.
Is it the fact that you can't necessarily be identified?
It's something that is going to keep influencing, and has
influenced, the Internet greatly, and it will continue to do so,

(02:20:29):
until we understand it.

Speaker 2 (02:20:31):
I think that's one of the things that's so dangerous
about what we saw with lockdown and other aspects of
it. There's an atomization here, and so many different
ways the government and tech companies are trying to make
sure that we're not in person with each other.
In many cases, for example, in this interview:
we couldn't do this interview if

(02:20:53):
one or both of us had to travel. We're able
to do this because we can do it over zoom
or whatever. But just taking ordinary things that you would
normally do in terms of interacting with people in school
or in church or in your community or whatever, taking
that away and putting a screen between the two of you,
it really does change the way people interact with each other.

(02:21:15):
I remember Errol Morris, the film director, was able to
get people to say all kinds of things. He got
a murderer to confess. He got Robert McNamara to confess
about the false flag of the Vietnam War. He got people
to say all kinds of stuff because there was that distance
between him and them. He could have interviewed them in person.

(02:21:36):
But what he did was use an Interrotron, which
is what he called it. It was basically a
teleprompter that he had set up so he could do
two-way communication at the time. And once he had
that distance there, it completely changed the dynamics he
would have versus being with somebody person to person. And
that's what we're talking about here, isn't it?

Speaker 9 (02:21:56):
Yeah, we're talking about that. And of course there are
gradations of this, and it continues, like

Speaker 10 (02:22:02):
Where you're interviewing me, we're discussing. I feel like it's
a discussion.

Speaker 9 (02:22:06):
If I were to say something that later I regretted,
I could probably say, oh, well, that wasn't me, that
was my avatar.

Speaker 2 (02:22:15):
Or my agent, right? I've got an AI agent that's
out there doing it. That's right, that's crazy. We also see,
though, as a doctor, that there are actual physical
changes that can be observed in people's brains. I'm
thinking of the story about the London taxi drivers who
would do the Knowledge, and they found that as

(02:22:38):
they memorized all those factual details and drew on them
all the time in order to take people around, you know,
this very complicated city with its complicated streets, they
had a particular part of the brain that was larger
than the typical person's. And then they found that once
they stopped doing that, it started to shrink again. And
we're starting to see that happening with people in a

(02:23:01):
lot of different areas of their lives, and that kind of
atrophy is physically observable, isn't it?

Speaker 10 (02:23:07):
Well, it is.

Speaker 9 (02:23:07):
You have to learn, and you have to use the things
that you have learned. Like I mentioned in
my memory book, there are all kinds of memory exercises

Speaker 10 (02:23:16):
That you can do.

Speaker 9 (02:23:17):
I do them every day, and they're very easy, and
they can help you keep your memory sharp.

Speaker 2 (02:23:25):
Give us some examples. I'm sure everybody would love to
know that. We'd all like to have a better memory.
What kind of things can we do to
exercise it?

Speaker 9 (02:23:33):
Think about the fact that you never had to learn
pictures when you were an infant and a young child.
A picture was something you could see; you may not
know what you're looking at, but you could see it
without an intermediary. Language is something that you have to
hear from other people. It's something that's sort of added
on to the brain. Okay? So as a result, the

(02:23:56):
best way of remembering something is to make an

Speaker 10 (02:24:04):
Image for it.

Speaker 2 (02:24:05):
Okay.

Speaker 9 (02:24:06):
For instance, I have a little dog called a Schipperke.
A Schipperke is a Belgian dog. He's a nice little fellow.

Speaker 10 (02:24:15):
But it was embarrassing to me when, walking down the street,
people would say, what kind of a dog is that?

Speaker 9 (02:24:19):
And I couldn't come up with the name because it
was so complicated.

Speaker 10 (02:24:23):
And I thought, that Schipperke; I didn't speak any
Dutch or anything.

Speaker 9 (02:24:27):
So then I got this image of a small boat
with a large captain with a beard holding a big key.
So it was skipper-key, and I'll remember it forever.
Once I have the picture, it's easy.
And you can do that with all
kinds of things.

Speaker 10 (02:24:47):
All the time. I was

Speaker 9 (02:24:48):
Going upstairs before I came down to the office, and
I wanted to get my wallet, and I wanted to
get my cell phone. So I just had an image
of a wallet in the form of a cell phone,
and I was walking up the stairs talking into the
wallet cell phone. So I got up and I knew
I had these two things to get. It would be very easy

(02:25:10):
to get one and forget the other. So you have
these images all the time. And, you know,
this is sort of off the topic of the book,
but if you want to have a foolproof memory for
a load of things, up to ten things,
get ten areas that you are familiar with, that you

(02:25:32):
see every day, and then you can put on those
images the thing you're trying to remember. So if I'm
trying to remember a loaf of bread, milk, maybe batteries,
I have a regular way

Speaker 10 (02:25:51):
Of doing that. I have, like, the library
that's near

Speaker 9 (02:25:55):
My home, the coffee shop, the liquor store, Georgetown University Medical
School where I went, Georgetown University, Cafe Milano, which is
a place in Washington where everybody gathers, and then Key Bridge,
the Iwo Jima Memorial, and Reagan Airport. So the bread would be,

(02:26:19):
for instance: I'm looking in the window of the library,
and instead of seeing books, I see loaves of bread.
And when I get down to the liquor store, instead
of it being filled with liquor, it'll be filled with
milk. So that's how I link the two. I have
those ten, so I can get ten items

Speaker 10 (02:26:36):
Together without any problems at all.
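The ten-loci routine Dr. Restak describes above is the classic method of loci, and its bookkeeping step can be sketched in a few lines of code. This is only an illustration, not anything from the interview: the loci are the locations he names (nine appear in the transcript), the shopping list is his bread/milk/batteries example, and the function names are invented for the sketch.

```python
# Method-of-loci sketch: pair each item to remember with one stop on a
# familiar route, then "walk" the route in order to recall the items.

# The stops Dr. Restak lists (nine of his ten appear in the transcript).
LOCI = [
    "library", "coffee shop", "liquor store",
    "Georgetown University Medical School", "Georgetown University",
    "Cafe Milano", "Key Bridge", "Iwo Jima Memorial", "Reagan Airport",
]

def place_items(items, loci=LOCI):
    """Pair each item with a locus, in route order."""
    if len(items) > len(loci):
        raise ValueError("more items than loci")
    return list(zip(loci, items))

def recall(memory_palace):
    """Walk the route in order and read off the items."""
    return [item for _, item in memory_palace]

palace = place_items(["bread", "milk", "batteries"])
print(palace[0])       # ('library', 'bread')
print(recall(palace))  # ['bread', 'milk', 'batteries']
```

The code only tracks the pairings; the mnemonic work is the vivid image at each stop (loaves of bread in the library window, the liquor store filled with milk).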

Speaker 2 (02:26:39):
That's great. Yeah, you know, it's interesting that you talk
about the importance of visualization. It's one of the things
that I do in preparing for the show. I have
a lot of articles that I go through, and it's
really when I highlight things or write them down
that I can remember them. If I don't do that, if
I were just to read these things, I wouldn't remember
them. But if I interact

(02:27:01):
with it and write it down, that helps me to
remember it. So that is a kind of visualization there,
I guess, as well. It truly is interesting
what you said earlier about memory not being something that
is stored in a place. For somebody coming from a
computer science background, that was a very different idea. When
you construct your memory, how do you

(02:27:22):
reconstruct it? That opens up
a whole new area of questions as well. In other words,
every time somebody brings up a subject, there
isn't something that's stored initially to reference and
then rebuild from.

Speaker 10 (02:27:38):
Yeah, there's that.

Speaker 9 (02:27:39):
There are the interconnections. Like, you know, somebody listening to us
might say, well, gee, this is called the twenty-first-
century brain, but I haven't heard that much about the brain. Well,
let me just link that up so that these things
make sense. We have a new version, or I should say
a new understanding, of the brain called the connectome brain,
in which there are all kinds of interactions in the brain

(02:28:02):
between parts of the brain, which we're just
learning about. I use the metaphor of
a bowl of spaghetti: you pull out one of the
strands of spaghetti, and you never have any idea what
it's connected to, how many other strands of spaghetti it
is connected to. So if you think of the
brain as being kind of set up to make connections, that's

(02:28:27):
its natural processing. So it gets back to these things
that we were talking about earlier, you know, global warming
and memory and surveillance and all that. How are we
going to solve all those? Well, somehow or other those
things are connected with each other. That's the take-home
message of this book. And the basic goal is to

(02:28:51):
try to figure out what it is that connects.

Speaker 10 (02:28:54):
These things, what it is that would allow us to.

Speaker 9 (02:28:58):
By solving one of them, solve the others. And I
mention experts at the end of the book; so far,
they haven't done it. So it's useful, as Hayek said,
to get ordinary people, and when I say ordinary, I
mean non-specialized people, to give their ideas. Do

(02:29:20):
you wonder what would happen if such and such? What
would happen with global warming? For a while there were,
and in fact there still are, experiments going on on the effect
of sulfur, which would help the CO2 problem, you know,
shooting sulfur up into the atmosphere. Of course,
the reason for that was the volcano in nineteen eighty-something,

(02:29:44):
in which, after that volcano in Hawaii, it was noted
that the air was clearer and there was less pollution.
So that's something to think about: is there some way
of using that particular sulfur experiment to decrease global warming?
War, for instance: we don't think of war as a
War for instance, we don't think of war as a

(02:30:06):
cause of global warming, but it is.

Speaker 2 (02:30:10):
Global warming.

Speaker 9 (02:30:13):
Since the Ukraine war and the Gaza war, you know,
there's a tremendous amount that's going to overcome and
exceed the benefit of any of these things, like, you know,
non-gasoline engines.

Speaker 2 (02:30:32):
Absolutely. Yeah, it's kind of like, you know, shooting up
rockets in order to put satellites up. How
many cars, how many lifetimes of car use by people,
would that be equivalent to? And then you start talking
about all the missiles that are being shot, and then
you get to the explosives as well. It is
really interesting how they focus us on their objectives, on

(02:30:55):
their ways to control us. The manipulation has been going on
for quite some time. And so, yeah, it
is pretty amazing. And I guess that's my take:
when we look at this stuff, it really does look
like science fiction, and I'm almost inclined to write it
off when I first see it. When DARPA is saying, well,
we need to find some way that we can, you know,

(02:31:16):
erase memories in people and insert new memories into them,
we're going back to Total Recall, right? It
sounds like something from a Philip K. Dick novel. But
they're really working on it. And I guess one of
the most striking things we saw, we reported on it
a couple of weeks ago, was a company
that was bragging about how it could read your mind

(02:31:39):
more accurately and quickly than its competitors, because there are a
lot of different companies doing this. Brain
was in the name of the company. They had a way
that they would do MRI, and they could essentially train it on
your brain in a shorter time than other people,

(02:32:00):
and they can get much better results. Our producer
just pulled this up. So what they do is they
show you an image, and while you're looking at that image,
it's reading your mind and reconstructing what you're
looking at, which I thought was absolutely amazing and terrifying
at the same time. How is this going to be used?
I guess that's the real issue when we start talking
about all these different things. I think that is the

(02:32:22):
real case: it's difficult for people to understand just
how far and how quickly the technology has progressed, and
then to ask, how do we keep this from
being used for bad purposes?

Speaker 9 (02:32:36):
Well, that's specifically a twenty-first-century problem. Yes, because all
of these things either originated in the twenty-first
century or have in fact developed further, becoming
increasingly threatening.

Speaker 10 (02:32:51):
And bear in mind, we have to solve
these problems, because they're not something that's going to go away.
And the most important thing to remember, David, is

Speaker 9 (02:33:00):
That all of these things harm the brain, and the
brain is the thinking processor.

Speaker 10 (02:33:07):
It's going to save us.

Speaker 9 (02:33:08):
It's going to figure out what the solutions to the
problems are. So we know now that
wildfire smoke, for instance, creates dementia; it increases the
likelihood of somebody developing dementia. So as the brain is
affected negatively, increasingly, over longer and longer periods of time,

(02:33:29):
our ability to solve these problems is going to decrease.
So we've got to do it now. We've got to
get serious about it. And this business of people getting
up and saying global warming is fiction and all that
is really very, very disturbing.

Speaker 2 (02:33:47):
Yeah. Well, you know, the example that you gave earlier
of the Indian government manipulating the
temperature at some of the stations there, that kind of thing
works both ways. They have put some of the temperature
stations on airport tarmacs, and in the UK,
for a lot of the temperature stations they've
got, they're just extrapolating the data. They don't have

(02:34:08):
real temperature measurement stations there. So it all really gets back,
I think, to the scientific method, and that's really where
we have to hold people's feet to the fire when we're
talking about something like that. We can have an absolute
standard of what truth is, and that truth is
being able to measure something accurately and being
able to reproduce it. And then I think a good

(02:34:30):
yardstick for that is when somebody is trying to hide
their data. That's the clue right there that they're not
doing science. Because if they're doing science and they've come
to the right conclusion, they don't have a problem with
somebody looking at their data. And so I've got a
question here for you from a person in the audience
asking, you know, about doctors James Giordano and Charles Morgan

(02:34:53):
and their work with the military. I'm not familiar with those names;
I don't know if you know anything about that or not.

Speaker 10 (02:34:58):
Giordano sounds familiar. What particular thing are they
asking about?

Speaker 2 (02:35:04):
I don't know; it's just their work with the
military. I guess it'd have to do with something,
but you haven't heard of it.

Speaker 10 (02:35:08):
I'm not sure I could say Giordano did
this or did that.

Speaker 2 (02:35:13):
Sure, I understand. Yeah, let's talk a little bit about
the things that we have been anxious about, and of course,
as Christians, we have one answer to it. But you
talk about how this is something that has been around
pretty much all of our lives. I mean, I
grew up with anxiety about nuclear war, for example.
That was on everybody's television, and that was at the

(02:35:37):
forefront of our minds, especially growing up in Florida
when the Cuban missile crisis was happening. They got us
really afraid of that when I was in elementary school.
You know, it's like, there's not going to be enough time
for you to get home if nuclear bombs
start falling. So, I mean, there are all these different
ways that you can panic people. I guess part of
it is, how do we identify the real problems and

(02:35:58):
how do we deal with those problems? Because there are always
things competing for our attention and our anxiety,
many of which are not real. And usually
the things that you're most concerned about won't happen,
and sometimes that may be because you have taken a
precaution against them. What would you say about that, about anxiety?

Speaker 10 (02:36:24):
You're starting to break up a little bit. Can you
hear me clearly?

Speaker 2 (02:36:27):
I hear you, yes. Yes, sorry about that; you were
breaking up a little bit. We're talking about traumatizing
a population. What do we do to guard
against that type of thing? And of course that's going
to really escalate with the ability of AI to create
a narrative.

Speaker 9 (02:36:47):
Yeah, well, as an avenue to
get into that, let's go back to what you brought up
about atomic weapons and atomic war, the fears
people have that there's going to be another atomic war.
I mean, you know, this is not unrealistic. There's even
been a movie that's just come out that's getting all
kinds of attention, as you know, and it has to

Speaker 10 (02:37:09):
Do with the threat of a nuclear war.

Speaker 9 (02:37:13):
If you look at what's happening in
Europe right now, there are all kinds of suggestions that could
lead to a nuclear war. I mean, Ukraine has now
announced that under no conditions are they willing to give up
any land, and Stalin, I mean, Putin, is thinking
about what he can do to change that. But maybe he'll

(02:37:34):
attack another country. I mean, this is scary stuff. So
what's happening in response is that the government tries
to show, oh, we shouldn't worry about it, we
have things under control. But I don't think things are
under control.

Speaker 2 (02:37:50):
And we've talked about the problems. Your
final chapter is New Ways of Thinking,
and I'd like to talk about that. One of the
things that you say is that Occam was wrong, Occam's razor
that people are familiar with. Tell us a
little bit about that. Why is Occam wrong?

Speaker 9 (02:38:11):
Well, because he says that, you know, entities are
not to be multiplied, meaning that we can always explain
things best by limiting ourselves to the minimum number of factors,
ideally one cause for every effect.

Speaker 10 (02:38:25):
That's not true. It's certainly not true.

Speaker 9 (02:38:27):
In the twenty-first century, there are all kinds of interactions
between factors and causes, so Occam was wrong on
that basis. We have to think of an interconnecting pool,
just as in the brain: interconnections of neurons, interconnections
of these problems.

Speaker 10 (02:38:44):
And they're all related.

Speaker 9 (02:38:45):
All eight of them that I talk about in my book,
they're all related. And if you can figure out a way of
influencing one, you influence all the others. I mean, who
would think there'd be a connection between global warming and
the amount of artisanal cheese, France's high-end cheese?

Speaker 10 (02:39:04):
Well, there is, because the chickens don't lay as
many eggs, and there are all the various other things that

Speaker 9 (02:39:12):
Go into making cheese. I learned
that the other day. That was something that was
a surprise to me.

Speaker 2 (02:39:19):
You know, it's kind of interesting. We've talked about connections
so much, and there was a series, I
think it was on PBS. I think the guy's name
was Burke; I can't remember his first name, and I'm not
sure about the last name. But he had a series
called Connections, and I thought it was fascinating, because what
he would do is take a whole series
of connections to show how a particular technology had evolved,

(02:39:40):
you know, so he might go from the
quill to the jet engine or something like that.
And it was a fascinating, fascinating thread of things, very
much like what you're talking about.

Speaker 10 (02:39:56):
It really is. And I did consult his work,
actually, doing this book.

Speaker 9 (02:40:01):
Because he did that Connections series. He did a book called
The Day the Universe Changed and all this. He also
did a book called Circles, in which he would start
with one particular event that occurred in history, and
if you go around the circle, you come back to
the beginning, where it started, where this particular inventor invented something,
and what led up to it. What was the circle

Speaker 10 (02:40:22):
Leading to that. So, yes, we're talking about connections, and
we're talking about the inability to understand things without reference
to supporting and accessory factors. We have that going on all
the time: denying things that are going to be happening.

Speaker 9 (02:40:39):
Of course, I think the fearful thing is that the
government is aiding in this denial, because if you
deny that there's a problem, then there's very little impetus
to try to solve it. You know, if there
is no problem, don't try to solve it.

Speaker 2 (02:40:57):
They're throwing out their own chaos and uncertainty and anxiety
all the time, always, I guess. So,
the question is, you talked about volatility, uncertainty, complexity, and ambiguity.
I mean, it sounds like government policy; I think
they've got bureaucracies that specialize in that.

Speaker 10 (02:41:18):
Yeah, yeah, well actually that's true.

Speaker 2 (02:41:20):
Yeah, that's in your section there about new ways of
thinking. So how do we incorporate that into the
new ways of thinking that help us solve this riddle?

Speaker 10 (02:41:31):
Well, each of those factors is.

Speaker 9 (02:41:33):
A factor that helps you to understand things and to
have more control. It doesn't necessarily mean it helps you
to link them together. That has to be done by
original thinking. You have to be under those things. Things
are evolvable. You don't have a basic situation that doesn't change.

(02:41:53):
It changes all the time. The other thing
that I want to emphasize most is
the role of capitalism in all of this. I mean, there's
all this, like private equity, the business of people
having a point of view that is going to advance
them financially, and that blinds them to the problems that

(02:42:18):
are here. Like, for instance, we talked about global warming. Well,
very rich people are buying multi-
million-dollar apartments and condominiums which have special

Speaker 10 (02:42:29):
Air filters which will keep.

Speaker 9 (02:42:31):
The wildfire smoke out, and which will try to keep the
global warming effect at bay with superpower air conditioners.

Speaker 2 (02:42:44):
So they're building their own bunkers. The people
who are creating all kinds of chaos and, you know,
weapons of war and mass destruction, they're out there
building super bunkers in various places as well. So I
think they're somewhat pessimistic about what they're doing.

Speaker 9 (02:43:02):
Well, basically the idea is that, you know, we
don't care about the ordinary person.

Speaker 10 (02:43:07):
We're going to survive. We're going to see to our
own survival, and if, in order to do that,
we have to deny certain things that are
going on, we'll do so. Now, incidentally, all of this
is not conscious thinking.

Speaker 9 (02:43:19):
They don't necessarily say, well, I'm going to deny global
warming because it'll be to my advantage financially, because all
my investment is in the oil and gas industry.
They don't do it that way. They come up with
pseudo-logic, things that seem to make sense to them.
But if they didn't have a financial interest in the matter,

(02:43:41):
they would look upon it quite differently.

Speaker 2 (02:43:44):
That's right. We can always find a justification for what
it is that we really want. Everybody
should understand that; if you're a parent this time of year,
at Christmastime, you know that people will
always come up with a justification for what they want. And
that's as true of government as it is
of corporations out there, and it's really dangerous when the
two of them connect with each other. I think that's

(02:44:04):
one of the things. You know, you talk about connections
and the importance of them, and how we can try
to connect these different factors, each of us individually. But
I think it's the human connection out there
that is going to be essential for all of this.
It's going to be our collective work on all of this.
What do you think about that? Would you agree with that?

Speaker 9 (02:44:25):
Well, I'd agree with it. But there are so many things
taking place now that are causing schisms
and, yes, splitting people into factions and belief systems and
political points of view, and that's very dangerous, because then
you can't get together any kind of unity, even in
the face of an emergency.

Speaker 2 (02:44:46):
Well, I think we've always had
these factions and things like that.
You know, the founders of the country warned about factions
and political parties. But I think what makes it unique now
is that when you're interacting with people on a personal basis,
you interact with them a little bit differently than if
you've got that separation between you that technology is giving

(02:45:07):
us now, because now you're interacting with something that's abstract,
not with another person, and there's also the body
language that you're not picking up on. It makes
it easier for you to be harder on people when
there's that distance there. That's why I think,
you know, the personal connection is really vital
to making these connections and coming up with an understanding

(02:45:28):
of what's going on. We talk about the hidden factors
that are out there, hidden in unrelated topics and other people; as
you pointed out earlier, just talking to ordinary people about
what they see with different things. I
think that is the genius of the collective free market
out there, that there are so many observers who are looking
(02:45:48):
at things and thinking about them, and it's kind of
their collective decision that is guiding things along,
as opposed to having a central planner doing that.
What do you think about that? In
your final chapter on a new way of thinking, you have
what you call a sensible solution. What does that really involve?

Speaker 10 (02:46:09):
I'm sorry, I didn't hear the last part of what you said.

Speaker 2 (02:46:12):
You have a sensible solution. What do you think is
a sensible solution to the kind of stress and chaos and
anxiety that we have, the manipulation that we have? What is
a solution to that?

Speaker 9 (02:46:24):
Well, I think Wikipedia is a good example of that.
They have people from all walks of life, all levels
of education, free to contribute to whatever topic they may
want, and it may be helpful.

Speaker 10 (02:46:38):
I mentioned earlier about the.

Speaker 9 (02:46:39):
Effect of global warming on the making of cheese. It
might be somebody who makes cheese that's going to come
up with some idea.

Speaker 10 (02:46:49):
You know, we don't know that.

Speaker 9 (02:46:50):
We don't know that that may not be where some
original idea of what to do about global warming comes from.
And you put it on what I'd like to think,
and I hope will be developed: a kind of
Wikipedia where the ordinary person can feel free to put
forth their ideas about it. Now you say, well, we
already have that, we have the Internet. No, we don't.

Speaker 10 (02:47:11):
The Internet is a commercial situation.

Speaker 9 (02:47:14):
It's all done for making money, and you have attention
and all that, and there's no criticism of it. There's
no peer review, if you will, like in the Wikipedia.
I mean, you know, people could write in and say, well,
that particular contribution is bonkers, and then give an example
why, or, that was a very good idea.

Speaker 10 (02:47:32):
And after that you begin to get.

Speaker 9 (02:47:34):
Things coming together in unpredictable ways that may help us
solve these eight problems.

Speaker 2 (02:47:42):
Yeah, the problem is, it seems like whenever you wind
up having a forum or place where things can be,
and that's true of the Internet, it's also true of Wikipedia,
then you have gatekeepers who are there. And
we saw this in spades throughout the COVID stuff: if
somebody's got a different idea, rather than debate them,
the impetus is to silence them by the people who

(02:48:05):
are in authority. And so that really, I think, is
the key thing, and I think as part of that
we see a continuing rise in disgust with and deprivation of
free speech. People are not interested in the principle of
free speech. They don't want to have open debate. And

(02:48:27):
I see this regardless of where people are coming from on
the political spectrum. There is a declining interest in debate
and thinking. You know, debate is critical to critical thinking,
and so the people who are in charge, the gatekeepers,
whether it's Wikipedia or the Internet or any other form

(02:48:47):
of information, they are weighing in on that, and they
don't want things that they disagree with. And it might
be because they've got an agenda, or it might be because
they've just got a particular prejudice about something and want
to make sure that the contrary views don't get out there.
That I think is the real key that's there. And

(02:49:08):
again, this is part of this anonymization that we have
of people, feeding that tribalism in a way that we've
never seen it before, using technology.

Speaker 9 (02:49:18):
I would agree with everything you've just said exactly, and
I think we have to try to get beyond that.

Speaker 10 (02:49:24):
But we get back.

Speaker 9 (02:49:25):
Again to this business of people having their own personal
financial point of view and position and pushing that basically.

Speaker 10 (02:49:35):
On the fact that they look upon it as.

Speaker 9 (02:49:37):
So maybe we're talking about a capitalism problem. We've got capitalism;
it's what this country's all about. But I mean, in
certain parts of it now, we've gone to the point
where people are unable to take another point of view
if it's going to be financially harmful and hurtful to them.

Speaker 2 (02:49:54):
Yeah, I think that, you know, when we start looking at
the tech companies, I don't think that their capitalism would exist.
I don't think they'd have billions of dollars if they
weren't unified with the government. So there's a
symbiosis there: the two of these entities feed off
of each other. And I think that nexus right

(02:50:15):
there is the difficult thing. And so, you know,
when I think of capitalism, I don't like
to refer to capitalism anymore, because I think of it
as a partnership, a public-private partnership, some kind of
an economic fascism where they are working together. But I
like to think of a free, competitive market where the
government doesn't have any role except as some kind of

(02:50:37):
a referee between two parties that have a conflict or something.
But yeah, that's the thing that's really driving this.
You know, many people, when they talk about AI, they say, well,
you know, here's a couple of different outcomes. Maybe this
stuff really works the way it's supposed to work, and
it takes everybody's jobs and we wind up with a depression.
Or maybe it doesn't work at all, in which case
the big AI stock bubble that we've got bursts and

(02:51:00):
everybody loses their job because of that. Well, there's a
third alternative, and that is that the government keeps propping
it up with public funds, because it feeds their surveillance
and manipulation needs and their ability to surveil and to
control us. And I really think that that's where this
is all going to head. You know,

(02:51:21):
those other two things may happen, and they may be true,
but I think there is a customer out there for
the AI stuff that is driving all of this, that
has been putting out these proposals for the longest time,
and that's governments, governments around the world. I mean,
look at the BRAIN project that we had a few
years ago; that was during the Obama administration. And things
like the brain-computer interface that Elon Musk and many

(02:51:44):
other tech companies are doing out there. There's Neuralink, and
there's a lot of them that are doing that. That's
being driven by the government wanting to connect into our minds,
hack into our minds really, and they've been funding that
kind of stuff. So how do we break that?

Speaker 9 (02:52:00):
Yeah, on the Musk side, it seems he's doing it for money,
I mean, obviously to make money. That's right. So
there's an unholy alliance, if you will, between someone who can't
see anything but the dollar, and on the other side a
government that can't see anything other than increasing power and
surveillance over the population.

Speaker 2 (02:52:19):
Yeah, that's right, absolutely true. Well, it's a fascinating book,
a fascinating take on this. And of course you've written
many books on the brain; the memory one is very interesting,
and you do have sections about memory in this book
as well. And people will be able to find this on Amazon,
I guess, which is the best place that they can find

(02:52:39):
it, looking for the title of this. And it is,
you know, something that I think we all
need to think about: how we're going to operate, the
effects that this technology is having on our brains in
the twenty-first century. And that is the title of
the book: The Twenty-First Century Brain, by Richard Restak.

(02:53:00):
Thank you very much, Doctor Restak. Thank you, appreciate you
coming on. Enjoyed it, thank you, a very interesting conversation.
Thank you. Have a good day. Folks, we're gonna take
a quick break and we will be right back.

Speaker 1 (02:54:57):
You're listening to The David Knight Show.

Speaker 2 (02:55:02):
Welcome back. And I've had a lot of comments; I
do want to get to these. I knew before I
brought him in that, well, I didn't think he
was gonna be that focused on climate change. I really
wanted to talk to him about the other issues that
were there. But yeah, we had a lot of comments
about that. As a matter of fact, Land said, is
about that. As a matter of fact, Land said, is

(02:55:25):
this thing about the cheese stuff and global warming connections
so they can try to tax the cheese? So
I guess the question is, who stole the cheese, right?
These people are trying to steal our cheese all the time.
But we do have an update, by the way, and
this is some comments from the Telegram chat. Paul McLeod said,

(02:55:47):
I'm asking each and every one of you to send
prayers in my direction for a specific reason that I cannot
disclose at the moment. By the pricking of my thumbs,
something wicked this way comes. So he sent that; just passing
that along to you. That's from Paul McLeod, who is
asking for prayer. And For the Love of the Road,
Ryan has given us an update on his dad's surgery.

(02:56:08):
He said, Dad's surgery was done yesterday afternoon. It went
well and they eliminated all seven blockages. Wow. They had
to take veins from other parts of the body to
go around some of them, though. He should be home
by Saturday. He said, sorry to hear about Clyde Lewis;
glad he's got a loyal base that is helping him
with GoFundMe. Yes. And so I'm glad that things are

(02:56:31):
going well for your dad, Ryan. I hope it continues
to go that way. We'll continue to pray about that.
And let me get some of your comments here. Occam's
razor is not what people think it is. It states
that the explanation with the least number of assumptions is
likely to be correct, not that the simplest explanation is likely
to be correct. That's from Greg Hume. That's fine, yes,

(02:56:54):
and he says, oh, let's see, this is from Marty.
He says, come on, most wildfires are arson, not global warming.
I agree with that. I agree with that, and you
all know that I'm not buying into global warming.
And he began by talking about how they were manipulating
the data at the Indian stations to try to minimize

(02:57:17):
the pollution that was there and to lower the temperature.
But typically governments are doing just the opposite. And it was
the climate change crowd, the global warming crowd, that gave
India the license to have power plants as cheap and
dirty as possible. So you might want to start

(02:57:39):
with what the government policy has been toward their McGuffin
of climate change. That's the reason they have that kind
of pollution that's there. And of course that was why
Nixon unconstitutionally created the Environmental Protection Agency. There's nothing in
the constitution that says that it's the role of the
federal government to protect the environment. And they did it

(02:58:01):
because of pollution. They said, we've got some polluted sites
that are so big, we don't have the money to
address them at the local or state level, so let's do it
at the federal level. And so they had their Superfund
cleanup thing, and then they metastasized from pollution
to telling us what kind of cars we could have

(02:58:21):
and emission control with that. So again, it's mission creep,
or I guess we could say emission creep.

Speaker 5 (02:58:29):
Though in the case of the Indian testing stations, I
believe he was referring to air quality, with the massive
amounts of air pollution they have in these cities, yeah,
and spraying it. I believe he was implying that you
clean up the air, which in that instance I would agree with.

Speaker 2 (02:58:47):
Yeah, and you find, interestingly enough, you know, in the two
most populous countries, China and India, where they have said
don't worry about cleaning up the pollution from your factories
or your power stations, do whatever you want, right, they
also have the worst air pollution. Wuhan is one
of the worst places for air pollution. So, Real Octospook

(02:59:10):
says he's correct about one thing: the money around global
warming will buy the truth before it can be muttered.
That's right. A money problem and a gigantic government. Yeah, we
can wrap our heads around the whole issue. I think, with all
the little spaghetti strings, when you keep pulling them all out,
you'll find the government, and you'll find human nature, in
terms of the greed for power and for money. That

(02:59:33):
is the common spaghetti thread that ties all this stuff together,
and that's how we keep our distance from this. But
I think the real takeaway for me from that interview
was this: the key thing is the connections.
Our brain works on connections. Our brain works best with connections.
Connections with other people expand our mind, expand our universe.

(02:59:55):
And it's that person-to-person connection, so difficult for us
to maintain today, that is so vital for our survival.
Thank you for joining us. Have a good day.
The common man. They created Common Core.

(03:00:20):
They've dumbed down our children. They created Common Pass to
track and control us, their Commons Project to make sure
the commoners own nothing, and the communist future. They see the
common man as simple, unsophisticated, ordinary. But each of us
has worth and dignity, created in the image of God.

(03:00:44):
That is what we have in common. That is what
they want to take away. Their most powerful weapons are isolation, deception, intimidation.
They desire to know everything about us, while they hide
everything from us. It's time to turn that around and
expose what they want to hide. Please share the information

(03:01:06):
and links you'll find at TheDavidKnightShow dot com. Thank
you for listening, thank you for sharing. If you can't
support us financially, please keep us in your prayers.
TheDavidKnightShow dot com.