Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
And hello, Well, what a great Tuesday show we have.
I'm telling you, you know, when it comes to Tuesday shows,
this is a pretty packed one, don't you agree, Kim Kim,
you know Tuesday shows and it's a great one. Yeah,
this really is a great one. I mean we typically
have I would say, you know, every team has a superstar,
(00:25):
and we have one, and that is David Cay Johnston, everybody. So he will join us, the Pulitzer Prize winner, the investigative journalist, now professor of law. He will join us in hour two, so that'll be pretty special. In hour one
we talk about AI. I don't know if you've heard
about AI, but it's pretty big. It's catching on. Where
(00:46):
am I? Speaking of eyes? These are my credibility glasses.
I need to put them on for instant credibility. Thank
you everybody? Yeah, I know, I know. And the AI
expert is And how do you say his name, Kim
h How do you say it rah Rahu Pore? And
(01:08):
you say it that way because he says it that
way or because you're guessing. I love it. It's very
I know, very on brand for the show. We could
ask his publicist or him his name.
Speaker 2 (01:24):
I just did I know, I know, it's late in
the game, but I did not. By the time he
comes on, you will have a pronouncer.
Speaker 1 (01:33):
Isn't that great? But we are on and already mentioning
his name, so you're you're right by the time he
gets here, we're not going to blow his name as
you're right. Yeah. Anyway, Tony is here as well, and
Tony's been working hard scrambling to get us a video.
We'll share with you the video of Donald Trump, who
is the President of the United States, talking about sending
(01:56):
the military in addition to the National Guard into more American cities. It's for our safety, you know what I mean. You
can never be too safe, and that means you can
never have too many military troops on the streets. How
about that World Series game last night? It went, was it eighteen innings, Tony? Yep, eighteen innings. It was unreal.
(02:22):
I went to the Lakers game, taken by my darling
friend and voiceover agent Mike Schallbetter, who walks into the
Crypto dot Com Arena. And by the way, the fact
that we could have in our country a major arena
in Los Angeles called the Crypto dot Com Arena tells
(02:45):
you all you need to know about America right now.
I guess the bullshit dot com arena name was taken: the Crypto dot Com Arena. He walks in like he's the mayor, happy. It was wonderful. And then he walks, walks, walks,
and as you're walking to the seats, you're thinking, all right,
(03:06):
you know, our seats are going to be good. They'll
be down here in this front type section somewhere you know,
I don't know, sixty rows back, seventy rows back is
really close to the court. He keeps walking and walking
and walking, and we're like three rows back from the court.
It was incredible, you know. I felt like a Make-A-Wish kid. You know. It's like, so we're like, wow,
(03:27):
this is just the greatest. So and there at the game,
and I love this guy who I've known for so long.
He's been my agent for so long. He said, oh yeah,
I have had these tickets for fifteen years. I'm thinking
you've had them for fifteen years and this is the
first time you've invited me. I mean, I'm just saying it.
I don't mean to be ungrateful, but I'm just going, you.
Speaker 2 (03:49):
You know, uh, how many other people are in front of you on that list?
Speaker 1 (03:54):
List, I don't get it, don't. I don't. I don't
get it. I guess there are a lot of other
people who are more important clients. Yeah, but it didn't
eclipse the fact that I was quite grateful and I
had a really great time. So anyway, I come all
the way home and the World Series game is still
going on. It's all tied up at five, and I'm
(04:14):
watching the end of that game. Only that game doesn't
end for another full nine innings of baseball. So we're
sitting there and then on the East Coast, it got
to close to three a.m., right, Tony?
Speaker 3 (04:26):
Yeah, Yeah. And they also, I know they had
in Toronto, inside the baseball stadium there, they had like twenty thousand for a watch-along inside the stadium.
Speaker 1 (04:36):
Did all of them make it to three am? Right?
Even in the stadium, you saw all of these people
and they were, you know, all of a sudden, you know,
you're not seeing as many. And well, the seventh inning, when they stopped selling booze, all of that, that's right. So you had a whole game without food or beer.
Speaker 3 (04:53):
No. And I found the best video from someone. Like, you'll get a kick out of this. It's the guy in the control room. You'll get a kick out of it. I saw this. I have to show you this. I saw this last night because, you'll see, when you hit the victory music... Look, it's on.
Speaker 1 (05:13):
It's an Instant Replay. That's the same machine I use. Wow,
I use that same machine. Yes, the ding comes off,
that comes off that machine. That is one. It literally
says when we're in better shape. But I don't think
we're in wildly better shape. I love it. And he
(05:36):
has the same scrawl. Yeah, yeah, that I do. That's
exactly what all the... I have little Post-its for
the different banks of sounds. And that's great, Tony. That's
a great pull, really really great. Yeah, two games rolled
into one. Linda is right, it was pretty crazy. Wow. So, uh,
(05:57):
I'm going to go to the game tonight. I just
got a I just bought a ticket. It just seems
like history, you know. Yeah, I'll be honest with you.
I'm not a you know, you can imagine. I came
up a huge Giants fan, so Dodgers are not necessarily
but I also recognize and I you know, National League
root for them, and my friends are all Dodgers fans
(06:19):
and stuff. And more to the point, there's this guy Ohtani, who's going to pitch tonight. I mean, he's he's
an extraordinary figure in the history of baseball, and I'm
excited to see him. And so I got a really good seat for less money than it's going for. I guess I got a bargain on it.
Speaker 3 (06:38):
So yeah, so that's really... I got to see him in regular games in Anaheim, which was nice, you know,
when he was still an Angel.
Speaker 1 (06:44):
Yeah, yeah, yeah, that's right. He started with the Angels, and they were really extraordinarily willing to give him a chance when everyone else was like, pitching and hitting? No, it's weird. Yeah, I mean, how about that. I mean the Dodgers, though, really turned him into a franchise, you know, I mean, he is a commercial franchise. So I heard they ran out of baseballs, had to scramble to get more out of storage, says Salva the Shoemaker. Is that right?
(07:05):
They ran out of pitchers too? I think, I don't know. Yeah, they definitely ran out of pitchers. I mean they began to run out of pitchers. But Joe Fish, Salva the Shoemaker, Anthony, we hear from Salva the Shoemaker. Ron Cook: got your money's worth on that one last night, that's for sure. Anyway, the show is filled with
(07:28):
a lot of stuff that is related to, of course, Donald Trump and his navigating the trade routes across Asia.
But it's also it's also packed with some things that
are related to the government shutdown, which is starting to
take its toll. So we'll get to that as well.
(07:50):
They practically ran out of innings also, that's right, went
through eighteen pitchers total for both teams, says Louis. Wow. Yeah,
it was quite That's what I mean when I say
it's history, Like I kind of wanted a ticket just
because it's historic, you know. So So there's that. Bill
Gates made a statement, says West Theory, and the right
(08:13):
is greatly twisting his words saying that he no longer
is for fighting climate change. I didn't see that. Have
you been able to corroborate that?
Speaker 2 (08:21):
No, Tony's furiously looking it up right now.
Speaker 1 (08:24):
Oh yeah. And meanwhile, Cecilia says, on the way back
from the Vet Oncologist, listening to your show makes the
drive home easier. Well, well, Cecilia Misery loves company, and
my little Charlie has lymphoma, and we're very bummed
about that. I love him very much, spend a lot
(08:46):
of time. We listened to the nine innings of baseball with Charlie on my lap. So it was a real... and he's not a lap cat. That's how
I know he's hurting, you know, because he's wanting to
be right on me. You know, he's It's pretty brutal
the process of seeing that. But anyway, I'm I'm feeling you.
As the kids used to say, have you seen the
(09:08):
video of Donald Trump wandering around confused in that event
in Japan? The Japanese Prime minister had to guide him.
He had no idea what he was doing.
Speaker 2 (09:18):
No, I haven't. We were watching a speech earlier. But we have, we have the video of him wandering aimlessly.
Speaker 4 (09:26):
I have it, Oh do you?
Speaker 1 (09:28):
Tony is the ninja and he's been able to find
it and we'll share it with you now. If you're
watching on YouTube, you'll see it. Otherwise we'll describe it
to you. So this is the the room. They're playing
the national anthem, obviously, and whereas where's the Donald? He's
(09:52):
not there.
Speaker 2 (09:53):
The Emperor's palace.
Speaker 3 (09:54):
Sorry, let me find that the other one, the one
I marked it and that it changed, So let me
see what.
Speaker 1 (09:59):
I can do. Teresa Wilson meantime says, hey, Mark, it's Parks's birthday today, eighty-two and going strong. Can you fit in a happy birthday to them? We are devoted listeners and PayPal donors. You sure are, thanks, Teresa. Teresa
Wilson is a wonderful og of this show. She and
Parks both big supporters. He is eighty two years old,
(10:23):
still going strong. Happy birthday you get there. Very cool? Yeah,
pretty cool man. Eighty two seems like a great goal.
I'd like to just make it to eighty two. I'm
telling you, it just... see, life gets harder and harder.
I haven't slept in months. It's just brutal. I don't know.
(10:44):
There are all kinds of reasons that are sweet for that,
and then there are reasons that are horrible for that. Yes,
well this is the thing that we're going to show you.
Jim Shields. Trump also gave a speech where he advocated
using steam to power navy ships.
Speaker 2 (10:58):
Yeah he did. I did see that. Yeah, he said, what's better, electric or steam?
Speaker 1 (11:04):
It was about the catapults. Actually, oh, he was saying
that the, oh, the catapults... because linear motors sometimes can be problematic.
Speaker 3 (11:13):
I see, and steam just runs off of the nukes,
you know.
Speaker 1 (11:17):
He was, oh, I see, he was talking about the
nuclear So yeah, the aircraft carriers as well. So that's
it's just basically, it's just yeah, whatever, the science is ridiculous.
Speaker 2 (11:28):
He got a lot of applause when he said steam.
Speaker 1 (11:29):
So I will say that, well, there's a big, there's a big crew. If I'd known that it was that easy to get applause by just saying steam, I mean.
Speaker 2 (11:40):
Yeah, steam, and they all went crazy.
Speaker 1 (11:42):
He did, I love steam, train says wow, all right,
who knew there was this big, this big steam push.
I didn't realize that. James Bliss with a ten dollar Super Sticker, then. Big shout out to James Bliss. What, you know? We love names on this show. Great last name, great name,
James Bliss. Are you kidding me? I love it? Yeah,
(12:03):
I saw this. Amazon is laying off a bunch of people. This comes, quote, just days after competitor Target announced it was laying off eighteen hundred corporate roles. This is from Teacher Lori G. Yeah, that actually is part of
a series of layoffs that are going to affect different
(12:24):
sectors of the economy. But Kim will have that in the
news a little bit later. But the layoffs are massive
and we're talking about tens of thousands of people being
laid off. So all right, without any further delay, thank
you for being here. Smash the like button, and we will... The Mark Thompson Show. Trump is saying that he's prepared
(12:45):
to send more than the national Guard into US cities.
That's right, who really can stop him? You begin with the National Guard, and before you know it, you're sending everybody in, everyone.
And he talked to the troops in Japan and he
said he would escalate his orders to active duty branches
of the military if he thinks it's appropriate. You know,
(13:08):
we have the sound here he is and a little
bit of what he was saying, the.
Speaker 5 (13:16):
Drum tight, and it's a beautiful thing. And our people
in the service way and you know, people don't care
if we send in our military, if we send in
our National Guard, if we send in Space Command. They
don't care who the hell it is. They just want
to be safe. And we have safe cities. Now we're
starting in Memphis, and Memphis was a disaster. It's been there,
(13:38):
they've been there for two weeks and it's a whole difference.
So the crime is less than half and within a
month it'll be gone, getting rid of all the bad ones.
And we're going to go into Chicago. We're going to
go into our cities. We're going to clean them out,
we're going to straighten them out, and we're going to
have safe cities because you want to protect safe cities.
We're going to have beautiful, safe cities. And it's happening
very quickly and very easily. Actually it's easy for us,
(14:01):
it's hard for them. And we have to have a
little more help. It doesn't matter really, we could do
as we want to do, but it would be nice
to have more help from some of the Democrat governors
that don't mind. In Chicago, two weeks ago, four people murdered,
eleven people shot. This weekend, it was like terrible, much
(14:24):
worse than that. And then we have a governor that
stands up and says, oh, it's wonderful. It's not wonderful.
And what we're doing is we're going to make it
totally safe. It'll be very safe very soon. And we're
doing that with all of our cities that are troubled.
We have cities that are troubled. We can't have cities
that are troubled. And we're sending in our national Guard,
and if we need more than the national Guard, we'll
(14:44):
send more than the national Guard, because we're going to
have safe cities. We're not gonna have people killed in
our cities.
Speaker 1 (14:51):
Yeah, he does make the facts up completely, as you know,
but he tips his hand of course. You know. The
reality is he's right. He can do whatever he wants,
and you can see him doing whatever he wants in
so many different ways. He can send troops into American cities.
He does it. I don't think it's a fight against
(15:12):
crime at all. I mean they don't even go into
the crime-ridden parts of the cities that they are occupying.
They make a move because they want to get us
accustomed to the fact that he can send these troops in.
I mean, this is really like a break the glass,
pull the lever so the alarm goes off. Thing that
a president can do in times of like civil war
(15:35):
crisis or in times of an American city really melting down.
See Los Angeles after the Rodney King verdicts and the
riots that followed. See the American cities that were on
fire from Detroit on in the sixties, you know, as
(15:55):
a result of civil rights protests. These are times when
you needed some reinforcements. And this is a performative flex
on the part of this administration to get you used
to it, because I think ultimately the endgame is that
they're messing with the election. They're messing with turnout. That's
a whole series of topics that we will explore in
(16:16):
the next few weeks. But the reality is that part
of this is the occupation of various parts of the
country by troops that are then well, they're under orders
from the president. And the last thing I'll say, just
because I mentioned troops, is that this whole thing we
(16:37):
told you yesterday about this donor who came through, this
billionaire donor who is very maga, very right wing, he
is supposedly paying the military during this government shutdown. What
was the number one hundred and seventy million. I mean
that sounds like a lot of money, because it is
a lot of money, but it's not enough to pay
the military. I mean the military if you pencil that
(17:01):
out as like ninety dollars a person. But what it does,
and this is the important thing you should take away
from it, is it does set the table for a
private army. I mean, in other words, a rich guy
comes in, he pays a bunch of these military people.
Because the government is shut down, they no longer get
their checks. That looks like sort of an angel making
(17:26):
a support move to the US government, when it really
is creating a situation in which you can have private
industry supporting a private army, and that private army isn't
even bound necessarily by the laws and strictures associated with
the Constitution, etc. So I guess what I'm trying to
(17:47):
say is that as the water gets muddied, these are
the sorts of things you have to keep your eye on.
So that was Trump talking about the fact that he,
you know, he's cleaned up things in short order.
He did meet with Prime Minister Sanae Takaichi. Did you
(18:10):
Is that how he says it, Kim, That's how I
would say it, Mark. He is trying to get to
some kind of you know, they had a deal in place.
This is the story with the Japanese. I don't spend
too much time on this, but and then the Japanese
(18:31):
had to agree to this investment, and the way the
deal is laid out. I was reading about it. The
Americans are making them invest six hundred and fifty. I
have the numbers. In fact, I might even have them
close by, but not important. It's it's a chunk of
(18:51):
money in America, but there is no stipulation as to
how that money is spent. So it's the oddest kind
of deal because even the Japanese are going, we're not
even sure that this money is going to be used
in ways that actually benefit us as Japanese investors in America.
(19:12):
It's kind of like a shakedown.
Speaker 2 (19:14):
Are they investing in the ballroom?
Speaker 1 (19:15):
I mean again, the money can be used for any
number of things. Yeah, anyway, that's what's that's the Japanese
part of it. And now he's you know, obviously the
China deal is a centerpiece of what's going on.
Speaker 2 (19:34):
The Japanese deal involves the rare earth minerals.
Speaker 1 (19:38):
Right right, Well, the Japanese deal, is that what you said? Yeah, no,
the Japanese the Chinese deal involves.
Speaker 2 (19:46):
The rare earth minerals, the Japanese one.
Speaker 1 (19:48):
There might be there might be an aspect of the
Japanese deal that is similarly, you can, I mean you
can set us straight on that, Kim. You have all
the powers of the yeah, exactly, speaking of and speaking
of things that you look up. There is a an
Elon Musk alternative to Wikipedia, because Wikipedia is woke. Elon Musk, what, yeah, Yeah,
(20:15):
this is Grokipedia. It's an alternative to woke Wikipedia. And
Grokipedia is a creation of Musk's AI chatbot Grok. Human volunteers write and edit articles for Wikipedia, but here
you've got AI handling the whole thing, artificial intelligence designed
(20:38):
to be closer to conservative political views. They will be
populating Grokipedia. Musk announcing that he is working on
an alternative to Wikipedia. After a suggestion from David Sacks, a friend and fellow tech investor who's the Trump administration's AI and crypto czar, Musk has decided the project in
AI and cryptos are, Musk has decided the project in
political terms is the best way to profile it to
the American people. So Wikipedia is woke and Grokipedia is
sort of the answer to woke. So if you don't
(21:23):
like the New York Times and NPR as sources of articles.
That's what he specifically calls out. You'll like Grokipedia. It's
a new way to understand the universe. It differs from
Wikipedia in at least one major respect. There are no
clear human authors. Volunteers write and edit Wikipedia, often anonymously,
(21:45):
but Grokipedia has its articles fact checked by Grok. All right,
so visitors to Grokipedia can't make edits, and you can
make edits on Wikipedia. You know that you can suggest
edits from a pop up but that's the best you
can do. But some Grokipedia entries, they say, are at
(22:10):
least initially now based on Wikipedia. So there is a
kind of It's it's the Internet, it's the wild West, right,
But anyway, Grokipedia is the new Wikipedia, but it's anti-woke.
Speaker 2 (22:25):
So I wonder what happens if you typed in DEI
on Grokipedia. Wikipedia would probably say, you know, diversity, equity, inclusion. On Grokipedia, does it have bad things to say?
Speaker 1 (22:40):
That's a great question. I mean, you're right it could be,
and you could say DEI is a you know.
Speaker 2 (22:48):
Type in institutional racism. What happens on Grokipedia?
Speaker 1 (22:51):
Right? Yeah? Right? The erroneously framed view of liberal media that they're, you know, exactly. You can... The only reason I'm here is because you were a friend. That's it. That's our Elon. I'm told he used to play cards with David Sacks. He used to play cards with him all the time. Now he's like this, I don't know.
(23:12):
He's kind of a villainous character, I think in all
of this. But Grokipedia is the new AI move from
Elon again, completely populated by AI facts, AI facts I should say, by AI, not by human volunteers.
Did you find out the answer to the Japanese rare
(23:33):
earth mineral question?
Speaker 2 (23:34):
I didn't look it up because I knew I was right,
But I will say that I did. I did get a pronouncer for mister Powar.
Speaker 1 (23:43):
Well, mister Powar is here, so we can just ask mister Powar how he says his name. I don't know.
You didn't look up the rare earth thing? Why not, Kim?
Speaker 2 (23:51):
Because I already knew I was right about it, and
I thought we might not.
Speaker 1 (23:54):
Really? Well, why can't you just... Tony, look it up, please. Jesus. All right, this is... I'm sorry, mister Powar, that you have to hear this. This is... it's not happy.
It's not a happy time when mommy and daddy fight.
Speaker 2 (24:06):
From Reuters: US, Japan leaders sign rare earths, nuclear power deal ahead of Trump-Xi meeting.
Speaker 1 (24:13):
Was that so hard? Kim?
Speaker 2 (24:15):
There you are. I didn't have to look it up because
I already knew it.
Speaker 1 (24:19):
You have to prove it once it's been a question,
just like, have you played scrabble before? Kim?
Speaker 2 (24:24):
It's my favorite game.
Speaker 1 (24:25):
Okay, Well, then you know that there's a challenge in scrabble, right,
even though you know that it is a word that
there is. I'm so I was challenging.
Speaker 2 (24:34):
I was saying, that was a challenge throwdown. That wasn't a...
Speaker 1 (24:37):
Challenge per se. It was a simple query. So well,
you of course turned it into a big family dispute,
and I don't know why you. All right, Kim, do
I win?
Speaker 2 (24:47):
Is that? Did I win the challenge?
Speaker 1 (24:48):
I've got... I've got Raoul here, and I'm pretty sure that's how he says his first name, but I know it's... Rahul. Okay. Yeah, there you go. Do you want to check, you want to check that?
Speaker 2 (24:59):
I mean, I asked for a pronouncer. Asked and answered.
Speaker 1 (25:03):
Well, you... and it's forty seconds before the guy comes on. You asked... you tried to ask. I did, three days ago. Yes, but yet I asked. So, what is his actual pronunciation, please?
Speaker 6 (25:14):
Rahul Powar.
Speaker 1 (25:16):
Okay, all right, very good, all right, everybody, when he
gets here, just pretend we already knew, even though he's heard this entire conversation. But it'll just play better
for the radio. Thank you for being here. Please smash
the like button like a boss. Hit it hard, Hit
it now. It costs you nothing. I don't know why you wouldn't have hit the like button so far, but I do
(25:37):
appreciate you hitting the thumbs up. It's great to have
you here. And as a part of this conversation, David Cay Johnston in hour two, the Pulitzer Prize winner, will weigh
in on a number of things that are going on.
But now it's time to talk about AI. This guy
is an AI risk expert and entrepreneur. He served as
(25:59):
the head of Advanced Products and Innovation at Thomson Reuters.
He is also part of the founding team and principal technical architect of Shazam, creating its original iPhone app. Tony, you're gonna love this guy. Please, a warm Mark Thompson Show welcome for Rahul Powar. Rahul, how do you actually
(26:24):
say your name?
Speaker 4 (26:25):
Kim got it right?
Speaker 1 (26:26):
Rahul. I know you're new to the show, but
please try to avoid saying Kim got it right if
you can, in the future. I had this story about Grokipedia,
a creation. It's really of the moment. It's the answer
(26:47):
to Wikipedia that's been created by Elon Musk. It's a
weird place to start in this conversation, but just because
it is sort of the story of the day, just
out of that world of Elon Musk and AI. So
this is a Wikipedia completely populated by facts that are
generated and checked by AI. Your thoughts on that, and
(27:09):
then, if you will, contextualize this in this world of AI and the growing dependence on AI, and essentially, on that platform anyway, the total takeover by AI.
Speaker 7 (27:20):
Yeah, I mean, I guess everywhere we turn we see AI in some shape or form. I think specifically Grokipedia, I don't have strong thoughts about that. I don't think it's
going to be particularly impactful because you can think of
AI as a sort of way in which to kind
of compress all of human knowledge into like a machine.
And you know that machine is today mostly a brain
(27:42):
in the box. So you give it some inputs and
it's going to throw some outputs back out at you.
To think that, you know, Grok, as much as, you know, I like that particular AI model personally, the fact that it could actually condense all of human knowledge into its brain and then spit it back out on the internet, you know, effectively as a Wikipedia clone, and have it be sort of better or more effective, you know, irrespective of all the political ideology that's sort of plastered around it.
all the political ideology that's sort of it's blaster around it.
I don't think that's that's really going to be a
sort of viable step for the future of human interaction
with machines, which is what I care about, right So
I think it's very marketable given the climate at the moment.
But you know, I try not to pay too much
(28:23):
attention to those sorts of things the.
Speaker 1 (28:28):
World of AI. As you've interacted, you've really tried to,
as you say, frame AI and if you will even
handle AI, that is to say, control AI within the
framework the broader framework of human interaction, Like where does
AI fit into our existing culture, society, politics, et cetera.
(28:53):
Speak to that and some of the things that you've
actually created some inroads in these areas.
Speaker 7 (28:59):
Yeah, so that's probably a very long conversation, but I
think at a high level, what I get excited about
is when machines have some sort of positive impact on
the human experience. And you know, people often wonder as
they think about all the things that I worked on,
how the Shazam guy ended up in what I do today,
which is really a cybersecurity company, And the journey is
(29:21):
actually pretty straightforward in my mind, because I'm always excited
about the idea of technology having some sort of really
positive impact on people. And it's great that, you know,
when I talk to people about Shazam, it makes them smile.
You know, it was a piece of technology that probably
had a pretty positive impact on them in some way
or form.
Speaker 1 (29:40):
Let me stop you for a second, Rahul, because now I have to follow up and then we'll get back, but every good conversation I think is built of a
couple of digressions along the way. So here we digress
to Shazam. Shazam is both, and it does make you smile, brilliant, and also it seems as though it's an overwhelming type of technology. Like, how does that technology work to know all of these different things that it has to know from various pieces of music that span all the different genres of music? It's a remarkable thing. So give us quickly a moment on that Shazam creation. Now,
give us quickly a moment on that Shazam creation. Now,
I know you were taking the Shazam technology and making
it the app on the phone, but that's the whole game.
Speaker 7 (30:24):
Yeah, I worked on many bits of that, but if you think back to what it's trying to do, it's trying to take many lifetimes' worth of music, which, you know, we as individuals could probably never listen to, and trying to find in that a ten to fifteen second snippet of song that's out there. And the algorithm was invented by a gentleman who's actually still in the Bay Area, I believe, Avery Wang. And it's a really sort of
(30:45):
brilliant piece of technology and it's not AI and it's
actually very old at this point in time. I think
I probably saw the first prototype maybe back in two thousand,
which you know is going to be what twenty five
years ago, which is pretty crazy to me, but you know,
turning that into something that has had such a big
impact on people's lives has been a great part of
(31:08):
the journey, and I think when I look back on
it now versus the kind of technology that we have
and are kind of acclimatized to, it's a totally different
generation of technology. But what's cool about it is it
made people smile, and in many ways, we're getting to
the point where, with AI as we use it, you know, today, it knows about us and speaks our language
(31:32):
in a manner that machines could never do in the past. So,
you know, I had the advantage of trying ChatGPT, maybe it was twenty twenty or something, the playground.
Speaker 4 (31:41):
It was a super private beta.
Speaker 7 (31:44):
And I fired up this thing and I looked at
it's like, what is this machine? And then when I
tried it for the first time, I was totally blown away.
I never thought I would see that kind of human
machine interaction in my lifetime.
Speaker 4 (31:55):
So it completely.
Speaker 7 (31:56):
Changed what I believe machines could do today. And now
we're seeing the sort of full impact of that all
the way from uh, you know, groquet high at one extreme,
you know, to the way people use chat GPT for
all sorts of things that you know, we never thought
we talked to machines about. So it's a pretty amazing
thing to witness over these twenty something years to see
(32:16):
where machines have taken us.
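For listeners curious how a ten-second clip can be matched against more music than anyone could ever hear, here is a minimal sketch of the constellation-style audio fingerprinting idea that Avery Wang's algorithm popularized: pick out spectrogram peaks, hash pairs of peaks, and vote on time offsets. It is an illustration of the general technique only, not Shazam's actual production code, and the function names and parameters are hypothetical.

```python
# Hypothetical sketch of constellation-style audio fingerprinting,
# not Shazam's actual code.
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import maximum_filter
from collections import defaultdict

def fingerprint(samples, rate, fan_out=5):
    """Turn raw audio into (hash, time_offset) pairs."""
    # Time-frequency picture of the clip.
    freqs, times, spec = spectrogram(samples, fs=rate, nperseg=1024)
    spec = np.log1p(spec)
    # Keep only local spectral peaks ("constellation points").
    peaks = (spec == maximum_filter(spec, size=(15, 15))) & (spec > spec.mean())
    points = sorted(zip(*np.nonzero(peaks)), key=lambda p: p[1])  # order by time bin
    hashes = []
    # Pair each anchor peak with a few later peaks; hash the geometry.
    for i, (f1, t1) in enumerate(points):
        for f2, t2 in points[i + 1 : i + 1 + fan_out]:
            hashes.append(((int(f1), int(f2), int(t2 - t1)), int(t1)))
    return hashes

def match(clip_hashes, index):
    """index maps hash -> list of (track_id, track_time). Vote on time offsets."""
    votes = defaultdict(int)
    for h, t_clip in clip_hashes:
        for track_id, t_track in index.get(h, []):
            votes[(track_id, t_track - t_clip)] += 1
    return max(votes, key=votes.get) if votes else None
```

A catalog is fingerprinted once into the index; at query time only the noisy snippet is fingerprinted, and a pile of votes at one consistent time offset identifies the song, which is why the match is robust even in a loud bar.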
Speaker 1 (32:18):
You've just hit on something though, that I think is
the whole game, which is the way you just mentioned,
the way I was able to interact with the machine,
or the way the machine interacted with me. There is
a it's illusory, I'm sure on some level, but it's
really effective, a way in which the machine does seem
to know you. Like if I asked, you know, for
(32:42):
our next interview, you know, what do you think I
should ask Raoul about? And it will say it will
even say, well, based on your first interview, Mark, I
think you've still you've left out the following blah blah
blah blah. So my record, it's an odd almost relationship.
It's built in a very simple way. Yeah, yeah, I know.
Speaker 7 (33:01):
It's it's it's not more complicated than kind of speaking
our language.
Speaker 1 (33:05):
Right.
Speaker 7 (33:06):
I think it's amazing when we think about it, because
we've been so used to machines being machines and human
beings being human beings. When you start to interact with
them in the way that you interact with fellow human beings, the lines get super blurry. I mean, obviously
we've we've seen a lot about you know, the way
people use and misuse AI today. It's it's you know,
(33:29):
ultimately it's some of what we do. We're obviously seeing people using AI to try and generate fake content or impersonate brands, those sorts of things, out on the Internet, which is the sort of... which are the dangerous use cases of AI. It allows, it allows bad actors to basically take AI, an Internet machine that industrializes this kind of human-scale interaction, for nefarious purposes. And that's actually one of the reasons why I got into cybersecurity to begin with. It was like, it was a question of, hey,
begin with. It was like it was a question of hey,
I you know, I think technology is going to be
super powerful, but also, uh, there are going to be
some really difficult challenges for us generally to try and
keep the technology and services that we interact with safe.
(34:13):
And I think this is this is this is actually
quite magical when they can they can listen to us
and you know, I think, you know, talking to ChatGPT in text is like one thing, but we can clearly
see cases where it can see like us, and it
can hear like us, which I think starts to open
up a whole bunch of these multimodal kind of interactions,
(34:34):
which makes it feel kind of magical. And I think
you hit on the point of, you know, maybe it's
just a simulation of intelligence.
Speaker 4 (34:41):
I think the jury is out, and you know, genuinely.
Speaker 7 (34:43):
Whether they do think, and whether they're just pretending to
be us, whether they can be creative, whether they can't
be I think for the general population, that doesn't really matter.
If it's if it's something that adds value to the
sort of thing that they're trying to do. If it's
helpful to them in some way, it's probably good enough
to be extremely useful to them.
Speaker 1 (35:00):
That's such a great point. There's an almost esoteric
conversation going on about is awareness machine awareness? Might it
actually you know, turn off humanity while it you know,
saves itself. And I've seen and read compelling arguments for
these various ways in which it is capable of doing that,
(35:23):
that is to say, turning off humanity and kind of
creating a machine world. But then as I read more,
at least in one piece, I remember it seemed as
though the developers were guiding it that way, and it
seemed as though it was being nudged a bit, and
so its desire for self-preservation, if you will,
was kind of being programmed in. Can you speak to
(35:47):
that for a second, that kind of the esoteric sense
of being that these creatures may have, these mechanical creatures.
Speaker 4 (35:54):
Yeah.
Speaker 7 (35:54):
Well, I think the reason why there's so much discussion
about this is because, it's safe to say, we currently don't
Speaker 4 (36:01):
Really know how they work.
Speaker 7 (36:03):
I think we're sort of sitting in a world where
they're unreasonably effective, but we don't quite fully understand why.
Speaker 4 (36:10):
So there's a lot.
Speaker 7 (36:11):
Of you know, there a lot of papers, a lot
of speculation around, uh, you know, the quest for are
they truly quote unquote intelligent and what are their limits
and so on and so forth. And you know, clearly
we can see what some of the limits are. I
would note, you know, I think most of the models
that consumers have access to are very suggestible. So you know,
they they clearly if you set them off on a path,
(36:33):
they will run around, you know. Underneath, they're completion engines.
So if you set them off on a path, they're
just going to try and do what you set them
out to do. You know, I think it's you know,
clearly they don't have a lot of agency as it
currently stands. So if you if you kind of set
it up in a way that makes it suggest that
they are, you know, in trouble or they're at risk
and so on, then clearly, you know, where you look
(36:54):
at the internal monologue, there'll be some stuff in there
that's all the sort of feels a bit sky neettish,
I guess, but I think the reality is right now,
they're fairly they're fairly useful machines. And even if progress
on you know, gen AI and large language models stops today,
which clearly it's not given the trillions of dollars that's
(37:15):
really in the ecosystem, but even if it were to
stop today, the amount of utility these tools have for
us that we're still not fully utilizing is in my
mind at least, you know, a decade or two's
worth of innovation to be deployed in the world.
Speaker 1 (37:30):
When you see the Palantir data set that is associated
with tracking Americans and tracking terrorists and tracking drug runners
and tracking fill in the blank, the good and the bad.
And that's just one example, but Palenteers are really I think,
(37:51):
good example, it seems to me like the clear and
present danger of AI in the wrong hands, if you will,
really begins to be a narrative that's a reality.
Speaker 8 (38:04):
Yeah.
Speaker 7 (38:04):
I mean, you know, I saw some data recently that
I don't think as a as a as a species,
we've ever invested in something quite as much as we're
investing in AI right now. And I think that's entirely reasonable. So,
you know, whether you look at the public markets and
say there's a bubble or not, I think that's a
that's an entirely separate issue. I think the impact of
the technology we've built today is certainly gonna last my
(38:27):
lifetime in terms of how that reverberates across all of
these across these principles, and so you know, clearly there's
there's an arms race building. You know, there's different that
different geopolitical powers at places. Supply chains are you know,
super intricate and usually very concentrated. Famously, so there's a
(38:48):
different geo political powers are you know, basically trying to
control the AI. The one thing that they're trying to
do is to sort of be in pole position for
this alleged singularity. The moment where AI tries to and
can improve itself, because clearly whoever gets there first
has an AI that can improve itself, which means that
(39:09):
no one else can catch up, right in principle, So
people are spending an enormous amount of energy getting there. Personally,
I'm not sure that that's actually going to happen with
the technology that we have. It's not super clear that
just making everything bigger and you know, using more memory,
using more power, using more end video chips is actually
(39:30):
going to create this sort of like beyond human intelligence.
Speaker 4 (39:34):
Though I could be wrong.
Speaker 7 (39:37):
I think, you know, in some ways, China's not chasing that.
China's chasing actually something that's quite.
Speaker 4 (39:42):
A bit more prosaic.
Speaker 7 (39:43):
They're just trying to take the stuff that they have
and make it useful. It's not as exciting and saying, hey,
we're going to have like a super intelligent AI sitting
in a data center somewhere, but it does actually create
you know, quite a bit of economic value for their
billions of people that they have. So it's a very
different philosophy, and I think it's kind of interesting to
see how all of this is going to play out.
Speaker 1 (40:01):
That's a fascinating distinction, really fascinating distinction. I want
to ask you about something that's happening right now. A
couple of things, actually, but I'll start with with the
government shutdown. There are massive flight delays. I mean, there
were seven thousand yesterday, and you've actually done some work
in this area, and I'm just wondering the nexus of
technology and in this case, the human problem which is
(40:26):
created by humans. The shutdown of the government and the
fact that there are fewer and fewer people staffing these
various critical positions, and the obviously life and death world
of air traffic control. Can you speak to that the
nexus of the technology and that, Yeah.
Speaker 7 (40:42):
Well, so this is a good example of us building
an entire sort of system of systems that.
Speaker 4 (40:47):
Run our world.
Speaker 7 (40:48):
And you know, in many cases, we don't really fully
realize just how much we rely on them until they
stop working for us, right, And you know, I think
that's one of the challenges that we're going to have,
you know, government shutdown notwithstanding. I'm expecting, you know,
that's that's going to come to pass at some point
in time. But in a similar way, we're building all
of these systems on the back of these technologies, and
(41:09):
we're having these debates, so you know, how many people
do we need, how much is AI going to sort
of automate these jobs away? What can it do, what
can it not do?
Speaker 1 (41:16):
Etc.
Speaker 4 (41:17):
And we're very much.
Speaker 7 (41:17):
At the early stage of this, so we're all kind
of trying to find our way governments, private companies, public companies,
us as individuals, you know, just trying to figure out
how all these tools fit into our world. Like I
said before, I fully expected, like five or six years time,
we'll look back at this period and say, oh wow,
(41:38):
there was just so much stuff we did back then
that we just don't do anymore. And hopefully, you know,
I'm optimistic, brand myself as sort of optimistic technologist. I think, well,
we'll work through some of that. I think we'll come
out to a world where, you know, AI is probably
going to be more helpful and more impactful in our
lives simply by taking a lot of the drudgery away,
(41:59):
the thing that you know, we shouldn't be doing as people,
but maybe we've gotten accustomed to. If we can take
some of that away, we can spend more of our
time on engaging with other human beings.
Speaker 1 (42:09):
Well, specifically in the case of air traffic controllers, Might
that be something that AI could completely take off the
plate of humans.
Speaker 7 (42:17):
I would imagine at some point in time, once we
have a better understanding of how these systems work, where
their strengths and weaknesses lie, especially as we get to
these very sort of safety conscious industries. I think it's
it's I'm happy that it takes longer for technology to
(42:37):
get there than, you know, in some other realms, like, you know, creating ad copy, great, you know, creating marketing videos, fine, you know, the cost of failure is relatively low.
When you're starting to go into medical technology, highly regulated businesses,
sectors that are so critical to the safety of you know,
hundreds of thousands of people. You want to move slow
(42:58):
and don't break things.
Speaker 1 (42:59):
Sure, because AI makes mistakes, right, I mean, you know
it's sort of baked in that it's going to make mistakes.
Speaker 7 (43:05):
Yeah, we're still very much learning how to control it,
and you know, there are a bunch of techniques and
people are getting better at it.
Speaker 4 (43:12):
You know, the people who actually do.
Speaker 7 (43:13):
This on foundational research are spending a lot of the
time and energy and trying to look inside the brain.
I mean, you know, you can think of ourselves right
after knowing ourselves for I guess at least tens of
thousands of years, we still don't really understand how we work, right,
And AI is not that different. We've built, you know,
basically a replica of the way our brains kind of
work in a simplified manner and unsurprised, and we can't
(43:34):
figure it out. So I think it'll take a little
while before we can be confident about it.
Speaker 1 (43:37):
But you know, you've touched on something. I just
think it's so simple and yet so critical, and that
is it's the language, the way in which this technology
relates to us that on the consumer level vaults it
into this sort of revolutionary status. But the rest of
(43:59):
the stuff, the medical, the legal, the aviation related aspects
of applications of AI, that becomes a much more tricky
game to provide litmus tests for and to really evaluate
because you can't afford to have a margin of error
(44:21):
of any of any substantial statistic, right, I mean, it
has to be you know, less than one percent type
of thing.
Speaker 7 (44:30):
Yeah, No, absolutely, it's it's it's actually more difficult in
some ways and easier in others, because in some
ways it's it's easier because it's testable. You know, quite
often if you're talking about things like diagnosis, if you're
talking about things like assessments, you could actually build a
data set to say, hey.
Speaker 4 (44:50):
This is good, this is bad.
Speaker 7 (44:52):
It's more difficult to do that when you just talk
about general human interaction, right, if you're just chatting with it,
and it's you know, giving you, I don't.
Speaker 4 (44:59):
Know, relationship advice.
Speaker 7 (45:01):
I'm happily married, so you don't have to worry about
that anymore. But I know a lot of people use
it for that sort of thing. Sure, it's it's more
difficult to build a testable hypothesis to say.
Speaker 4 (45:10):
That it's doing the right thing.
Speaker 7 (45:11):
Now, obviously there's safety testing and guardrails and all that,
and all the major model makers spend a lot of
time and effort on. But in some other use cases
it can be it can be simpler, it can be
more yes, binary, Yes, it's good, no it's not. However,
that being said, in order to actually make sure that
it's it's good and and I've found that as human beings,
we have a higher standard of excellence.
Speaker 4 (45:33):
For machines than we do for ourselves.
Speaker 7 (45:34):
So you know if we talk about you know, when
we that the machine can't make a mistake, but you'll
forgive a human being, right, And I think that's definitely
true for AI models too. So we need them to be,
you know, significantly more reliable than our best people, especially
in these very very high stake safety critical aspects, before
we can really hand over the keys to them.
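To make the "testable" point concrete, here is a minimal sketch of the kind of binary evaluation being described: score a model's answers against a small labeled set and gate on an agreed error budget. The example cases, the ask_model callable, and the threshold are all made-up placeholders, not any vendor's real data or API.

```python
# Hypothetical sketch of a binary "this is good, this is bad" evaluation set.
# The cases, ask_model, and threshold are placeholders, not real products.
from typing import Callable

EVAL_SET = [
    ("Chest scan shows a 2 cm spiculated nodule.", "refer for follow-up"),
    ("Routine blood panel, all values within normal range.", "no action needed"),
]

def passes_gate(ask_model: Callable[[str], str], threshold: float = 0.99) -> bool:
    """Score the model on the labeled set and gate deployment on the result."""
    correct = sum(ask_model(case) == expected for case, expected in EVAL_SET)
    score = correct / len(EVAL_SET)
    print(f"{correct}/{len(EVAL_SET)} correct ({score:.0%})")
    return score >= threshold
```

Relationship advice has no such answer key, which is why the open-ended, chatty uses are much harder to certify than the narrow, checkable ones.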
Speaker 1 (45:56):
I have one story that just jumped out at us
I wanted to share with you. In March, three months
after being forced out of his position as the CEO
of Intel and sued by shareholders, Patrick Gelsinger took the reins at Gloo, which is a technology company made for what
he calls the faith ecosystem. It's like Salesforce for churches,
plus chatbots and AI assistants for automating pastoral work and
(46:20):
ministry support. So this is a pivot to technology associated
with spreading the word, you know. And it's interesting to
me to see AI used and you can imagine there's
quite a bit of money behind this, and there are
one hundred and forty thousand faith, ministry and nonprofit leaders
(46:42):
who are now being targeted for this kind of AI
technology that will interact with them. It's sort of Silicon
Valley comes to the pews of the churches, and it
was to me just a sign of the times, and
I wanted to get your comment.
Speaker 7 (47:00):
Yeah, so this was fun because I ran across this.
Speaker 4 (47:03):
For the first time last week.
Speaker 7 (47:04):
A colleague of mine, who I think lives right next
to them, said to me, have you seen this? And
I was like, wow, that's pretty cool. I mean, it's
interesting that you know, now you know that AI is
really everywhere, right, because it's it's not really just a
thing that happens in the Bay Area. Everyone wants a
piece of the action. Everyone wants to interact with it
(47:25):
on their own terms. And I think building a foundational
model that's kind.
Speaker 4 (47:29):
Of faith based or at.
Speaker 7 (47:31):
Least is tuned to the interactions that you know, the
Christians across the States really would would get value from
is interesting to see. I mean, I can't think of
many other technologies, apart from maybe dating, that would have
had sort of such this kind of like segmentation quite
(47:51):
so clearly. And I think we tried it and it's
it's quite cool. I mean, it's you know, I'm not
a very religious person myself.
Speaker 9 (47:58):
But.
Speaker 1 (48:00):
Give yourself a couple of months with the AI.
Speaker 7 (48:02):
Yeah, it was pretty cool. I mean we asked it some questions on the sort of, you know, boundary of science and religion, and it gave some pretty balanced answers.
Speaker 4 (48:09):
I was actually quite impressed.
Speaker 1 (48:10):
Oh that's really that's really great. Well, you're just a
great ambassador for all of this. I mean, I must
tell you, AI in the world of journalism has been
an embattled technology. There's a lot of plagiarism, there's a
lot of ripoff of really hard work that's being done
by investigative journalists and and those who are brilliant writers,
(48:33):
et cetera. So you know, AI has picked up some
black eyes in that area. And that's an area in
which I, you know, know a lot of people and
and and interact, but I also see the virtues and
the wonder of it. So I'm hoping that it can
navigate some of that stuff, the negative stuff, to get
to what you talk about, which is the great potential.
Speaker 9 (48:54):
Yeah.
Speaker 7 (48:54):
No, I mean, clearly, it's it's trained on it's trained
on the best examples of what humans can offer, right? And so if, you know, well-written pieces, books, et cetera,
et cetera.
Speaker 4 (49:04):
That's not in the training data.
Speaker 7 (49:06):
Then it's really not going to be a very good model.
There was actually quite an interesting article posted recently about
AI that was heavily trained on you know, just social
media content, and they found that it actually got significantly
and measurably worse at reasoning, which is which is quite
(49:26):
funny and I think it probably reflects maybe some of
the brain rot we have as individuals sort of being
online every day. But it was interesting to see that
sort of express statistically.
Speaker 4 (49:35):
In the way the model got worse.
Speaker 7 (49:37):
And what was interesting is they couldn't make the model
better by training it with with sort of good content anymore.
So there's some really interesting things out there. Clearly it needs,
you know, good examples of the best of what we
as humans can offer so that it can really be
the best of what it can offer. But at the
same time, you know, there's there's great work out there
that is, you know, either copyrighted or the intellectual
(49:58):
property of people across the world, and we kind of
at some level we need that to make these models effective.
But at the same time, it shouldn't just be a free-for-all. So you know, I think this is
just part of the entire frankly national problem.
Speaker 4 (50:14):
Just legislative framework that we're going to have to work through.
Speaker 1 (50:16):
Sure, it's really a new technology, and I think that
the regulatory instruments aren't there necessarily yet. And you know,
I think of in our last minute or so, I
think of the ASCAP BMI licensing overall fees that are
paid by radio stations. Let's say they're playing music, and
you know, there's a just an understanding that we're going
(50:37):
to use this music, but we're going to pay this
annual fee. And I think at least at a minimum,
there should be something like that to compensate all these
many people. I just don't know how that arithmetic works out,
since you're talking about what is, you know, quantifiably like almost an infinite number of things written, right? I mean,
I don't know how you work out that. You pencil
(50:58):
that out. It's got to be a pretty big number.
Speaker 7 (51:01):
Yeah, No, it's it's a it's frankly, this is a
really difficult problem.
Speaker 4 (51:04):
I mean, you know, performance rights stuff.
Speaker 7 (51:06):
I mean I have I have some knowledge of that,
having worked on a very similar problem back at Shazam
for specifically some of the companies mentioned, So I understand
how that works. I think if you look at you know,
the entirety of the music business, and specifically that rights spot. It's such a tiny, tiny slice of what we're talking about now, and its sort of utility to humanity is, you know,
(51:30):
so proportionally different. I think it's quite difficult to say, Well,
you know, it's everything out there. How do we isolate
the value of a particular, you know, currently copyrighted piece
of work in this massive body of everything that humans
have ever produced and put some economic.
Speaker 4 (51:43):
Value to it.
Speaker 7 (51:44):
I really don't know that we have a process of
doing that, but we're gonna have to figure something out right.
Speaker 1 (51:51):
We don't have it, and that's the problem. But I'm
excited for our next conversation, and we've just finished our
first conversation. I hope you'll come back, Raoul. Is
there anything that we can link to if people want
to follow up with you in any of the existing
technologies and efforts that you have underway.
Speaker 4 (52:09):
Yeah.
Speaker 7 (52:09):
So, at Red Sift, we're mainly focused on using a lot of these technologies to try and help our customers, who are usually brands, be secure online, so effectively stop them from
being impersonated to their customer base, to their supply chain,
to their investors. Whoever that might be, because there's a lot
of that going on, and AI is basically industrializing the
(52:31):
process in which this works. So we try and harness
some of that AI capability for goods, put it in
the hands of the good guys, try and monitor the
Internet at scale and harden their public face so that
it can't really be so easily impersonated by people. And
if there's anyone listening, if id you know, maybe is
experience some of this, or you know, kind of interested
in the kind of solutions that might be available to
(52:52):
protect their business, then redsift dot com has a bunch
of SaaS solutions for you.
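Hardening a brand's "public face" against email impersonation typically starts with published authentication records like SPF and DMARC. Here is a minimal sketch, assuming the third-party dnspython package, of checking whether a domain publishes a DMARC policy; it illustrates the general idea only and is not a description of Red Sift's actual tooling.

```python
# Minimal sketch: look up a domain's published DMARC policy.
# Assumes the dnspython package; not Red Sift's actual tooling.
import dns.resolver

def dmarc_policy(domain: str) -> str | None:
    """Return the DMARC policy (none/quarantine/reject) a domain publishes, if any."""
    try:
        answers = dns.resolver.resolve(f"_dmarc.{domain}", "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None  # no DMARC record published at all
    for rdata in answers:
        record = b"".join(rdata.strings).decode()
        if record.lower().startswith("v=dmarc1"):
            # Pull the p= tag from e.g. "v=DMARC1; p=reject; rua=mailto:..."
            for tag in record.split(";"):
                key, _, value = tag.strip().partition("=")
                if key.lower() == "p":
                    return value.lower()
    return None

if __name__ == "__main__":
    print(dmarc_policy("example.com"))  # e.g. "reject", "quarantine", "none", or None
```

A domain with no record or a p=none policy is far easier to spoof in email than one publishing p=reject, which is the kind of externally visible posture an automated monitor can score at Internet scale.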
Speaker 1 (52:57):
Redsift dot com. We've had it in the chat, we
have it on screen, and I'll have it under this
video with a hot link to it if you want
to check out more. And I look forward, Rahul Powar, to our next conversation. As I say, really a cool,
interesting guy, and you've got some great takes and I
love your sense of optimism because we don't get a
lot of that, so really great stuff. Thanks for stopping through, Raoul.
Speaker 4 (53:21):
Very happy to be on.
Speaker 1 (53:23):
Rahul Powar, you say. Rahul Powar. Wow. Loved it.
Loved it, and Kim came up with the pronouncer
just in time, even though I probably was only yeah,
I was eighty percent there, maybe seventy percent. There a
lot of good love for Rahul in the chat, Raoul.
(53:45):
If you want to feel good about yourself, go back
and watch the feed and you can read about how
great you are. The audience really embraced you. I'd say
the audience has embraced Rahul more than they've embraced me,
which is a what I know, it's disturbing, but look,
(54:05):
the reality is the reality. So, oh, David Cay Johnston
is here. I was killing time. I'm so sorry he's there. Wow. Anyway, well,
thank you, smash the like button for Rahul. That
was the tech part of our show, to talk a
little bit of technology and AI also some politics in there.
So if you missed any of it, go back and
(54:26):
check it out. We'll drop it as a separate video
as well. So, David Cay Johnston. This guy is a Pulitzer
Prize winning investigative reporter. He's written books on Donald Trump,
that detail the rise of Donald Trump, the way Donald Trump has done business, also books on how the richest
Americans have benefited from the tax code and legal system
(54:47):
as it's set up in this country. Of course, all
of that has been supercharged under what is a completely,
in my judgment, corrupt administration on a level we've never
seen before. He is the co-founder of DCReport dot org,
now professor at Rochester Institute of Technology, the great David Cay Johnston,
(55:07):
Hello, Mark. Hello, sir. All right, uh, so much to
touch on. The government continues closed, and it serves, it
seems those in the highest levels of government well to
keep the government closed. How do you game this out?
Speaker 8 (55:27):
I gotta tell you, I don't know. I mean, I
expect this will become the longest shutdown ever. The longest
currently is Donald Trump's, and right now it's the second longest.
I did think today as I was listening to something
coming from one of Donald Trump's targets, that they may
be up to something interesting about SNAP benefits, the Supplemental Nutrition Assistance Program, what we used to call food stamps,
(55:51):
to address hunger, which is just another word for poverty.
Donald has not provided one dollar of federal emergency management
aid for the fires in Los Angeles. I wonder if
they're going to decide to reach into the nearly six
billion dollar reserve fund for snap and provide benefits for
(56:15):
poor people to eat in red states but not in
blue Wow, that would be so blatantly discriminatory. I can't
imagine very many federal judges would go along with that.
But it will be interesting to see. Poor in a blue state, you can starve, and if you're in
a red state, you're going to get your food assistance.
(56:38):
It is the kind of thing I can see from this administration.
Speaker 1 (56:41):
Absolutely, there's a sense in that way that this administration
can do whatever they want to do. You could see
it in the flex associated with the military, but you
can see it in the sacking of the treasury for
two hundred and thirty million dollars that never really has
to be adjudicated in any formal way. Trump just says,
(57:04):
I am the aggrieved party here. I was harassed by
this legal system. I go to my DOJ, Pam Bondi,
Todd Blanche, my former attorney, and then I have to
sign off on it. Well, I think it's a righteous payout,
and now I want two hundred and thirty million. That
is the most bald faced, corrupt notion of power. Though,
to speak to what we're talking about, I can do
(57:25):
whatever-I-want, that I have seen.
Speaker 8 (57:28):
Well, as Mel Brooks playing the French King says in his History
of the World, it's good to be king. And you
know who's going to challenge him. I mean, the Justice
Department is completely under his control. The Supreme Court majority
is in his pocket. I just got off the local
public radio here in Rochester and there was a caller
(57:48):
who was very upset because I had said that we
have a Supreme Court majority that consists of five white
supremacists and someone who I don't think is frankly qualified
to be a traffic court judge. And the person was
very upset, saying, well, you know, there's no evidence that
they're white supremacists and that that's just ridiculous. And I said,
you know what, Chief Justice Roberts' first job was out
(58:10):
of law school. He was hired by the Justice Department
in the Reagan administration to suppress black votes. And this
isn't news. This was widely reported at the time. You
can go read about it. There are books that talk
about it, there are law review articles. I mean, it's robustly established.
And if you take a job with the government to suppress
(58:30):
votes of black people, I think it's more than reasonable
to call you a white supremacist.
Speaker 1 (58:37):
What then, with this concierge court that you've described, it
seems to be on the same page. Yeah, I mean
the court.
Speaker 4 (58:46):
I love that.
Speaker 1 (58:47):
That's what he's expecting, right, he can he can bump
everything up to the Supreme Court, and the Supreme Court
never really lets him down. You know. I wonder then
a couple of things on a grand scale, Well, I
wonder what is in our future in terms of elections,
we talk about getting out for the midterms, we talk
about the general. It strikes me in more and more
(59:08):
revelations about how he had pressured Pence. You know,
you just saw in the last day that Pence
had contemporaneous notes associated with conversations with Trump, saying,
you're a wimp if you approve this election. You can't
certify this election. You realize that he has no interest
in there being free and fair elections in America any longer.
(59:29):
And so this is all of a kind. I become
more downcast about the future every day, despite the fact
that there's a big pushback. We saw it in the
streets with No Kings, et cetera. I mean, there's definitely
a popular rising against Donald Trump right now.
Speaker 8 (59:46):
There is, and you're seeing some of his closest supporters,
like Marjorie Taylor Greene, distance themselves from Trump. I'd point
out that in her congressional district, more than forty percent
of the residents are on Medicaid, so she has
a rather powerful incentive here to separate from Trump. But now, I'm an
(01:00:08):
if-you-have-lemons-make-lemonade guy. I don't particularly
like lemonade because I don't like sugary drinks, never have. The
Trump administration's demand that red states redraw congressional boundaries to
favor Republicans and, more importantly, to make sure that black
Democrats cannot get elected, I don't buy the argument of
(01:00:31):
gloom and doom about this. I see in that an
enormous opportunity to weaken Donald Trump. It does require the
one thing that we're sort of not used to doing
in the age of the Internet, and that's actual work
in the field. But there are districts in Texas, Alabama, Mississippi, Florida,
North Carolina, the Confederacy states that right now are fifty
(01:00:56):
five percent or more Republican, and so you can reasonably
expect Republicans will get reelected. But if you want
to gerrymander black Democrats out, you're going to
have to dilute those districts. And now you're going to
have a bunch of districts that are going to be fifty one,
fifty two percent Republican. Well, if Democrats, independents and disaffected
(01:01:23):
Republicans band together and turn out the vote, they can take Congress back
with a big fat margin. You know, instead of thinking about, well,
we'll get a three vote majority for the Democrats, you
could end up with a twenty, thirty, forty vote majority.
But you know, we need to remember that you have
(01:01:45):
to do the work. In my law class today, I was
explaining to them the history of corporate law,
and I pointed out that the Boston Tea Party, one
of about thirty tea parties, was not as we're all
taught in school, about high taxes. It was about a
tax exemption and a favor for the friends of King George,
(01:02:09):
and it involved imposing a monopoly. You could only drink
British tea when three out of four cups of tea
being drunk at the time on the East Coast, or
at least in Boston, were Dutch tea, because it
was better. And the people who gathered at Faneuil Hall
on the day of the most famous tea party in
Boston Harbor. It was December, middle of December, it's cold,
(01:02:33):
it's raining. The hall was packed and there were several
thousand people who stood outside in the rain for twelve
hours as the crowd was worked up to go and
break open the chests of tea and throw
the tea into the harbor. Getting people registered to vote,
making sure they stay registered, and then getting them to
(01:02:55):
the polls on election day, even if that means you
volunteer with someone else. Two people to drive
five people to the polls isn't that much work. But
if people aren't willing to do the work, then they're
gonna lose their democracy. And we need to instill that.
You know, TV ads move the needle almost nowhere, three
(01:03:16):
tenths of one percent is what I've read in the
literature of the political scientists. You want to replace Donald Trump,
encourage the redistricting. Yes, some of the black lawmakers may
not be re-electable. That doesn't mean that a Democrat, an
anti-Trump Republican, or an independent can't win the race,
(01:03:37):
pick smart candidates who can resonate with the voters. And then,
in the saying from the old Times newsroom, starting with
the Watergate era, GOYAKOD: get off your ass, knock
on doors, do the work, do the work. You want
to live in a free country, you got to do
(01:03:58):
the work. And I know I'm a broken record about this,
but more than a million people died for this country.
So suggesting you should arrange to take Monday, Tuesday and
Wednesday of the first Tuesday in November next year off
to help with an election. That's not a sacrifice of
any consequence, and if you do it right, you'll have
(01:04:19):
a good time.
Speaker 1 (01:04:21):
Yeah. Well, there is a sense of community about those
things that you're talking about. I wanted, because you
are talking about elections, to mention the feds, and
I hate to always seem to default to what's going on,
but I just feel that there's really an aggressive move.
You talked about the gerrymander, and I love that
you pointed that out. I think that there's even a
term for it, when you gerrymander yourself to the point
(01:04:43):
that you screw yourself by the gerrymandering, because you know,
it's dummymandering, I think, is what it's called. But anyway, yeah,
so I love that you've touched on that. And in
California there's this Prop fifty, where essentially it's
a referendum on whether there should be a redrawing of
districts, and that is leading in the polls. But
the FEDS, this is my point, are sending now election
(01:05:07):
observers to California. You know, Pam Bondi is sending members
of her team from the Justice Department who will monitor
polls in six jurisdictions leading up to this November fourth election.
And now you've got the Attorney General of California, Rob Bonta,
saying that the state will dispatch its own observers to
monitor the federal election observers because there is a sense
(01:05:31):
and I think they're going to see a lot of
this in the elections to come, of leaning on the
polling places and leaning on voters in America from the FEDS,
from those in power.
Speaker 8 (01:05:42):
And this is not a new thing. Up until I
think it was twenty years ago, the Republican Party was
under a national order that they could not do certain
things they had been doing. I think it was a
national order. It's possible it was only New Jersey. They
couldn't call up black voters the day before the election, saying,
if you show up at the polls, we're going to arrest you. I'm
(01:06:02):
from the police department and we've got an arrest warrant
for you, and other just outrageous efforts to deter the
turnout for the vote. Pam Bondi has exactly zero evidence
that there's anything rigged or improper going on in California elections.
(01:06:24):
Bonta has to make a decision about whether he wants
to go into federal court at some point, probably later
rather than sooner, and seek an order saying federal
monitors will not be around the polling places because they're
inherently intimidating. I will be surprised if the Attorney General of
California doesn't seek, at the least, an order that there are
(01:06:46):
going to be no masked ICE agents or other federal
agents anywhere near polling places, because that would be an
effective way to turn people away and make them afraid,
stopping cars, demanding you identify yourself, where were you born?
Stuff like that. But look, the Trump administration, it's very clear.
(01:07:09):
You know they aren't willing to try to win in a free and
fair election. They know that in a free and
fair election, especially as Donald's numbers are now at least
ten points below his vote last fall, that
he's in trouble. He's in deep trouble. And elections under
(01:07:32):
the Constitution are run by the States unless Congress has
a law to the contrary taking control of those. If
Trump tries to get a quickie law passed to bring
federal supervision of all elections, it will be very interesting
to see if any of the Republicans suddenly find their
spines and say no.
Speaker 1 (01:07:53):
I mean, this is a really provocative question, David K. Johnston.
You know, I think they've really slept on everything. I mean,
the tariff thing which threatened the US economy, and so
it threatens them by extension, because you know, you'll pay
a political price if the economy gets turned on its side.
This tariff thing. They kind of just sat on their hands.
(01:08:16):
Let the president do this. The sending of National Guard
troops and legit military Marines in LA, and now he's
talking in Japan. We ran the clip before, maybe we'll
run it again in a moment where he's saying I'm
you know, I don't want to limit it to the
National Guard. I want to actually send in regular troops
(01:08:36):
to various cities and to restore order. Still, you get
no congressional pushback. So the idea somehow that they
might actually pass, under Trumpian pressure, the kind of
legislation you're talking about which would give the feds control
of the elections.
Speaker 8 (01:08:52):
Yeah, they might. That is a distinct possibility, and that's
why I said it'd be interesting to see if a
few of the Republicans who aim to be serious constitutionalists
stand up. We just saw a congressman today say that
some of what Trump's doing is outside of the constitution.
So there's some minor cracks in Donald's demands for absolute
(01:09:17):
obedience by members of Congress. But they're minor, and Speaker
Johnson is so completely obedient that there's no work being
done in the House. This hasn't happened before with no
work being done in the House. And of course the
reason is that when the House comes back in, he will
(01:09:39):
be obliged to swear in the new Democrat from Arizona.
She will be the two hundred and eighteenth vote on a
discharge petition to get the Epstein files, and you know
Ockham's razor. Why is Donald Trump and his administration working
so hard to keep secret the files? He promised to
(01:09:59):
bring them out. I had people today and on another
show elsewhere, you know, with all sorts of conspiracy theories about this,
like the Biden administration knew there's nothing there because they
didn't go after Trump over this. The simple answer is
because it's damaging to Donald Trump. That's why they're not
releasing it. There's no other reason. And to
be clear, for those people who don't know the law: the
(01:10:23):
Trump administration has absolute authority to release the Epstein files,
redacted or unredacted. A whole lot of law professors, former prosecutors,
civil attorneys for the federal government have all said exactly
what I just said. They have total discretion to do this.
So they're choosing not to do it.
Speaker 1 (01:10:44):
Why exactly, if there's nothing damning about it, then why
wouldn't you release it? I mean, his name is
all over any file concerning Jeffrey Epstein like a bad
smell, because he was a best friend of Jeff's. I mean, well, and...
Speaker 8 (01:11:01):
If you're thinking, well, there are other people, we shouldn't
bring in these other folks who are in there, okay, I
don't like that idea, but I'll tell you what: redact every
single name in there except Donald Trump's.
Speaker 1 (01:11:15):
I think it's gonna work the other way, David K. Johnston.
I think they're dry cleaning for Donald Trump's name.
Play a little bit, if you would, Tony, of the
president now. It seems to be like what he's
really trying to do, David, with these trade deals is
he's just trying to make, uh, everything go back to
(01:11:38):
the way it was. It's like what you and I
talked about months ago. If you just hadn't touched anything,
he would have been fine. Instead, he has these reflexive
moments of impulse where he just imposes these ridiculous tariffs.
So he's meeting with the Japanese and the Chinese just
trying to get back to good on tariffs.
Speaker 8 (01:11:56):
He has no legal authority to impose any of this. Article One,
Section Eight, Clause One of our Constitution says the Congress shall
have the power to impose taxes, duties, and imposts. A
tariff is a kind of tax. It falls under duties.
Technical stuff you don't need to worry about. But he
has no authority to do this, and these foreign governments
(01:12:18):
they know that, they know that their customers at the
border are paying the tariffs, and they're all filing under
protests to get their money back. Now, I just had
something shipped to me where they made an additional tap
on my credit card to cover the tariff. If you're
a consumer, you're probably never going to get your money
(01:12:39):
back because various retailers who you buy something through are
not going to go through their records in all likelihood
and go through eight million purchases and say here's your
seven dollars and fifty cents or your fifty two dollars.
So we'll be net losers at the end of the
day here. And remember that Donald keeps attacking the Canadians.
(01:13:01):
I was in Toronto over the weekend for my seventh
child's fortieth birthday, and boy, everywhere we went, what did
we hear from Canadians? They are really ticked off at
the United States. You know, the United States is no
longer our friend. We need to have new friends. China
(01:13:24):
is going to be one of our new friends. We're
gonna make better friends with Japan, you know. And what's
wrong with you Americans? You know, that you're leaving this madman,
this crazy person there. And they're also insulted that
Trump asserted that Canada is an insignificant part of trade. It's
our biggest trading partner, bigger than China, it's
(01:13:46):
our biggest partner. They really are very angry, and you see it in all
sorts of commercials. Normally I pay no attention to ads, but Jennifer
and I watched the World Series in our hotel room
and there were all these jingoistic ads, Canadian jingoism, though
they were subtle. But whether it's selling beer or hotel
(01:14:07):
rooms or pizzas, Canadian, Canadian, Canadian, you know, we're good people.
We are. And that's, in the long run, very very bad for
the United States.
Speaker 1 (01:14:21):
Well, the fact that you talk about them is so angry,
and they really are generally a pretty you know, move along,
get along crew. They're like the most laid back, wonderful.
They're like Americans, but less angry, you know what I mean.
Speaker 8 (01:14:32):
Well, Americans with more civilization. You don't find any filthy
bathrooms in Canada, and you can find bathrooms you can
use everywhere. Nobody says to you, are you buying something?
Speaker 1 (01:14:43):
Yeah? I mean, Canadians are boycotting with the purchasing power
they have, says Obi Wan. Yeah at that as well.
Speaker 8 (01:14:52):
There's a wonderful book it's almost impossible to find because
they're only like five thousand copies, called Canada The Almost
Perfect Country. But if you can find a copy of that,
maybe it's now on Google Books. You can get an
e version of it, a PDF e version of it.
It's really quite a good book about Canada and how
they're just they're more civilized there.
Speaker 1 (01:15:14):
Well, the trade situation is kind of what you were
alluding to, or what they were alluding to, the world
is being pushed to China and to alternative trade relationships
because of the embattled nature of the US president. I mean,
it's just crazy. Who wants to deal with that? And the deal you
make this week may not be good next week.
Speaker 8 (01:15:35):
No one in our lifetimes has done more to advance
China's effort to dominate the world, the way we have for
the last seventy five years, than Donald Trump. It's
astonishing what he has done. And by the way, one
little point, I don't want someone to get upset about this.
If you're an indigenous person or what they call first
(01:15:55):
peoples in Canada, you certainly don't think that they were
much better than we were. Not quite as bad, but
pretty damn.
Speaker 1 (01:16:02):
Awful to Indigenous people.
Speaker 8 (01:16:04):
Talking about the way you generally see things in the world
today: as you walk through downtown Toronto or you're out
in Brampton, where we were most of the time, you
just see a much more thoughtful, civilized economy. Nobody
in Canada loses their home over health bills, and the
(01:16:28):
Canadians are doing just fine economically. The thing I was most
struck by is, every time I crossed the border, it's
about ninety miles from my home, you get into Ontario
and there's this construction everywhere. Apartment buildings, houses, townhouses, factories,
logistics centers, laboratories. They are really on a tear. I
(01:16:49):
can't remember where I've seen a place in the US
that was like that since the seventies.
Speaker 1 (01:16:55):
The way in which you know both countries could do
well with the good relationship and trade that just seems
to be lost. He always needs an enemy, and he's
chosen our friends and primary trading partners to be enemies with.
So it's it's inexplicable. And if you're the guy who
can explain it, because you know your client better than
(01:17:16):
I mean, you've written chapter and verse on this guy.
But let me get to the money. Also, the Argentine
bailout forty billion dollars, which is just done to take
care of the hedge fund investors who are heavily into Argentina.
Speaker 8 (01:17:34):
And that was all about a fascistic politician who's failed,
where more than half of Argentinians are now living in
poverty because of his policies. A bailout of him politically,
which was important, but it's also a bailout of these
speculators on Wall Street who were going to lose ninety
(01:17:54):
percent of their money. Now they have a chance
to get off with a much lighter haircut. They're still
not going to get everything they wanted, but that's the
way Trump operates. You know, We're going to take care
of our friends. I'm President of the United States for
people who support me and my friends and everybody else.
Speaker 1 (01:18:12):
Yeah, I mean, it appears that that is the case
with Argentina. And then he turns to Latin America with
warships when it comes to Venezuela, and we've had these
extrajudicial killings, you know. And again, maybe
some of them were, we were talking about this, I
think, last conversation, some of them might have been drug
(01:18:34):
boats, or might have been, but it appears that there were actual
fishermen who lost their lives in all of this.
Speaker 8 (01:18:39):
You know, on the flight to Malaysia, Trump
was asked by a reporter, well, if you think that
these people are an invasion of the US, why don't
you ask Congress to declare war, and Trump said, I don't think
we need that. We're just going to keep killing them.
He has no legal authority to kill them. That's why
one of the most senior military officers resigned last week.
(01:18:59):
He hasn't said why he resigned, but people around
him who know him have told reporters who cover the military
that that's why he resigned, because he was not going
to carry out illegal orders. And there was somebody in
the Trump administration. I did not catch who they were,
but said, well, of course they're drug dealers. They were
(01:19:21):
out on the Caribbean in the middle of the night.
Excuse me, do you understand nothing? The fish the fishermen
want to catch to eat or sell, when the sun
is out, dive down to one hundred feet or so
because predators will eat them. When it's dark at night,
they come to the surface to eat the jellyfish and
(01:19:42):
the other things they dine on, if they're predators, up
at the top. And that's why the fishermen are out
there in the middle of the night. And then Trump said, well,
you know, they really have to hit these with
missiles because our Coast Guard cutters are too slow to
catch the boats. That's equally just stupid, because we
(01:20:04):
have satellite imagery and drones, so we know where your
boat is if you're heading in our direction. And if
you're heading in our direction with an outboard motor and
Puerto Rico is six hundred miles away, you're not heading
to Puerto Rico. You can't get there. But what you
do is you send out your coastguard cutters and you
array them so that they run into you. And what
(01:20:26):
do you do then? You tell them, heave to. If they
don't do that, you fire a machine gun or cannon across
their bow to send the message. And if they don't
do it, then you shoot the bridge. Sure. But
you know, Trump, the draft dodger knows nothing about stuff
like this. But that isn't something you need to have
(01:20:47):
had any military experience to be able to figure out.
It's just Donald assumes the poorly educated will accept his premises.
Speaker 1 (01:20:56):
Well, he does state them so vigorously. And truly, what
we're talking about here is not a military exercise. These
drug interdiction moments, I think, fall into the world
of law enforcement, international law enforcement. I understand that there
might be a military backup, but this military thing,
(01:21:18):
it's almost fetishized by this administration and by Trump, and
of course Hegseth loves it because he is, you know,
completely one dimensional. He's really just a talking head from
Fox who's running the Defense Department and is just excited
to have this chance to produce a military flex. But
there's real danger here, David. I mean, of some kind
of war, or even combat, limited though it might be,
(01:21:42):
in Venezuela and Colombia.
Speaker 8 (01:21:44):
Yes, and you know, turning your friends angry, turning your
friends against you. Tell me somewhere in the world that
that's a smart strategy, whether it's personal relationships, business relationships,
or international diplomacy. It just goes to show Donald Trump has
no idea what he's doing. He's in way over his head.
He's being enabled by people around him who are themselves
(01:22:09):
not competent. I mean the basic things we know that
always happen when you're setting up a dictatorship. It doesn't
matter left or right. It doesn't matter if you're Mao
or President Xi in China, Stalin, or Franco in Spain, Mussolini, Hitler. First,
you put family in key positions, because you can trust family.
(01:22:31):
Then you deliberately appoint incompetent people who will be
totally loyal to you, like Pam Bondi, Kash Patel and
especially RFK Junior. Then you do outrageous things to establish
your invincibility: I can do anything. And that's Trump's claim.
(01:22:53):
You know, I have the right to do anything I want.
Those are his words, not mine. And if no one
stands up to him, that becomes the reality of it.
And what we should really be troubled about is all
of these Republicans in Congress who know better, not every
one of them, some of them think Donald Trump is
(01:23:13):
the best thing since sliced bread, and maybe better than
sliced bread.
Speaker 9 (01:23:17):
But the.
Speaker 8 (01:23:20):
Failure to protect your own institutional interests, you know, or
as I taught my law students the other day, you know,
we went through word by word, every word in the Constitution.
Article Two, took it apart, and then I said, now,
let me reduce all this to a simple, memorable line for you.
Article two says, the President of the United States is
(01:23:40):
the errand boy of Congress who will do what is
directed by Congress. And he's commander in chief of the military,
that's all. And yet he's acting as if they're subordinate
to him, and we're subordinate to him. No, I'm sorry,
we're his boss, not him. He's not our boss. That's
why we don't have kings. But all of this failure
(01:24:05):
to understand our Constitution, the principles of freedom and liberty,
this is the result of decades of cutting back on
the quality of teaching, saying, oh, it's too hard for
my children. You know, you're being mean. They have to
do all this homework. You want my child to read
the Constitution, that kind of thing, and we're paying a
(01:24:29):
price for that, and it's a terrible price that we're paying.
Speaker 1 (01:24:35):
Well, I feel people just generally know Second Amendment if
they're you know, if that's their jam, or they know
First Amendment if that's their jam. I mean, after that,
I think there's sort of a general breakdown. But you
know as we finish up, David, I just I think
it remarkable that there can be sort of a bald
(01:24:55):
faced corruption. And I really feel as though that corruption,
maybe for the Democrats, because they're the opposition party in this
flawed two party system, that may be the Democrats' message
if they want to pick it up.
Speaker 8 (01:25:09):
If the only message the Democrats have is Donald Trump
is corrupt and incompetent, idiotic, and a threat to your
liberty, they're going to lose. That's too pointy headed, that's
too college educated or independent thinking. What the Democrats need
to do, and I'm not
saying they shouldn't do that too, but the message they
need to have is, here's what we're going to do
(01:25:32):
for you. We will make your life better. We will
bring down the cost of drugs. We're
not going to set up a business to profit off
the cost of drugs as Trump is doing. We are
going to create an economy where wages are going to grow.
We are going to protect the environment so that you
(01:25:53):
are not going to have a heart attack in twenty years
because of particulates in the air that bit by bit
are sickening you without your even realizing it. What are you
going to do for me? Is the reasonable question to
ask the Democrats. Don't just tell me you're going to
deal with Trump. You're expected to deal with Trump. That's like
saying I should be praised because I changed diapers when
(01:26:16):
I have little kids. That's ridiculous. You know, of course
you have to change diapers. And the job is what
are you going to do for me? And so let
me leave you with something. I hope people keep in mind.
The Rand Corporation study that I don't think I inspired,
but came out after my books about the economy and
all the changes in income distribution and wealth distribution, said
(01:26:40):
that in twenty eighteen, when Donald Trump was in his
first term, if we had had the same distribution of
income as nineteen seventy three, the peak year for the
ninety percent, people's average income in the ninety percent would
have been two thirds higher. So if you made a
hundred thousand, you would have made a hundred sixty seven thousand.
(01:27:01):
If you made fifty thousand, you would have made eighty
three thousand, five hundred dollars. Imagine how much better everybody
in the ninety percent would be if they had two
thirds more income. People wouldn't be worried about paying their bills,
paying for electricity, they'd have some savings, they'd be able
to spend some money on things to bring joy in
(01:27:23):
their lives. That's what the Democrats need to say, that
you have your pockets systematically picked by the Trump Republicans
or the Trumpublicans, and we want to change that.
We're in favor of creating wealth. We're really good in
America at creating wealth. But we need to have a fair
system for how the fruits of our economy are distributed.
(01:27:48):
And that doesn't mean we're giving money away to
people for doing nothing. It means we're going to change
the rules that suppress your wages, your benefits, raise your costs.
We want to go back to nineteen seventy three when
you were much better off.
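For anyone who wants to check the arithmetic Johnston quotes from the RAND study, here is a minimal sketch in Python. The two-thirds figure and the hundred-thousand and fifty-thousand dollar examples come from his remarks above; the function name, rounding, and everything else in the sketch are illustrative assumptions, not anything from the study itself.

    # Minimal sketch of the "two thirds higher" arithmetic quoted above.
    # The 2/3 boost and the example incomes come from the remarks on air;
    # the names and rounding here are illustrative assumptions only.
    def counterfactual_income(income_2018: float, boost: float = 2 / 3) -> float:
        # Income if the 1973-style distribution had held, per the quoted figure.
        return income_2018 * (1 + boost)

    for income in (100_000, 50_000):
        print(f"${income:,} -> about ${round(counterfactual_income(income)):,}")
    # Prints roughly $166,667 and $83,333, in line with the 167,000 and
    # 83,500 figures cited in the conversation.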
Speaker 1 (01:28:04):
Yeah, that's a great, I mean, that's a great takedown.
I hope that corruption gets into that conversation alongside, as
you say, what is a real solid message. And I
hope that, you know, there's a political process that can
follow it all, because it seems, and we'll talk
about this as we continue, David, weekly, you know, there's a
(01:28:26):
real effort to derail it, you know, so it doesn't
matter what you say, we're running the show now, and
you can, you know, save it, save it.
Speaker 8 (01:28:33):
And if you want to say what we don't like,
we'll come after you.
Speaker 1 (01:28:36):
Yeah, I mean, it's a real threat. Thank you, sir.
Always great to visit with you, David K. Johnston. Yeah,
see you next week.
Speaker 2 (01:28:44):
All right, smash it with your iron rod.
Speaker 1 (01:28:57):
I've come from regular stock. Here's a reason that this
place is fun.
Speaker 5 (01:29:08):
Don't never use that way.
Speaker 1 (01:29:14):
What are the porn stars doing? Mark?
Speaker 6 (01:29:22):
They gave me a lot of money for having the attitude.
Speaker 7 (01:29:27):
What do you say is the political dogma that they're
trying to shove down our.
Speaker 8 (01:29:33):
Throats straight up right into your grabom.
Speaker 1 (01:29:39):
What the hell is going on in the United States
of America. I'm loving it, loving it, loving it. I
thank you for joining us. If you haven't yet, please
respectfully smash the like button like a boss. Smash it
with your iron rod, do it hard, do it like
you care. It costs you nothing, and in this
(01:30:00):
ridiculous world of YouTube that we live in, it actually helps
the show. Tony, I have a question for you. My
signal during the David K. Johnston conversation was degrading occasionally.
Was that from David, or is that because my signal
is a degraded signal.
Speaker 3 (01:30:19):
I saw you fine. I saw David kind of
had a couple dropouts.
Speaker 1 (01:30:22):
Okay, yeah, so I feel better because I am always
assuming that's me, and I had a little bit, I
think, at the beginning. But you've been good since the
beginning of the show. Yeah, okay. Wow. Well,
thank you very much. I am excited about Kim's news. Kim,
will you update us? Hurricane Melissa made landfall in Jamaica.
I saw winds of one hundred and eighty five miles
(01:30:43):
an hour. I mean that's extraordinary.
Speaker 2 (01:30:46):
Yeah, so.
Speaker 1 (01:30:49):
I'd love to I'd love to get a quick snapshot
of what's going on. And then Doctor Daniel's coming through.
Speaker 2 (01:30:54):
Doctor Daniel will be here. It is flu season,
and so he'll talk about vaccines and who should
get them. And also I wanted to ask him about
that whole situation of Trump admitting that he had an MRI.
So we'll check in with him on
that. One thing that we were going to get to
that we didn't, and I know Tony found it, I
(01:31:15):
think, is, someone had talked about the video of
Trump wandering aimlessly in Japan, and, oh yes, did you
ever find that video?
Speaker 1 (01:31:22):
Yes, so if you're just joining us, we mentioned this
and we said we'd rally it for you. Let's see it now, Tony.
This is Trump wandering aimlessly. You know, if Biden had
done this, they would have eaten him for breakfast. But
Trump can wander aimlessly, and sometimes he just makes it
look cool.
Speaker 10 (01:31:40):
So here's the Japanese leader kind of trying to direct him.
Speaker 2 (01:31:54):
She goes over, and maybe he was just trying to
talk to the band people while they were playing. He loves
crowd work.
Speaker 11 (01:32:00):
I mean, he's a really great one for crowd work. Yeah,
I don't think all right, she really.
Speaker 2 (01:32:19):
Yeah, where's he going? Oh? Another trying to show him
where to go? Okay, oh, he's got to go to
the receiving line. Here.
Speaker 1 (01:32:31):
There he goes past the press and then there's yeah,
he really, I don't know if he's wandering aimlessly, but
it can certainly be construed as an
aimless kind of walk around that room. But it also
looked like he was just kind of uh, receiving all
of the different people who were there. I mean, yeah,
(01:32:52):
it looked like he was just doing a loop of
the room.
Speaker 3 (01:32:54):
Yeah, obviously, you know, the camera can't see
that corner that he did walk towards, as well.
Speaker 1 (01:33:00):
Well, Right, I mean, look, I'm not I'm not a fan.
I've told you, but I'm also not going to take
a cheap shot if I feel like Yeah, but I look,
I like any video of him meandering about. And also
I always think to myself, he's thinking, when can I
get back to like stacking money and you know, going
(01:33:21):
after brown people or whatever, you know, whatever I really
like to do. I know he I mean, look, let's
be just brutally honest. Trump's in it for the money. Okay.
Trump's into everything for the money. He's got very very
simple transactional relationships with everything, and sadly that is bad
news for the American people and bad news for America.
(01:33:44):
But all of this stuff that he's doing right now
is kind of like, God, this is just the stuff
I have to do before I make this deal that
enriches my family further. So I know they're probably those
who go, come on, Mark, that's too much of
a simplification. Of course it's an oversimplification. But
I'm telling you it's all transactional with this president and
(01:34:04):
so yeah, money and staying out of jail. Who said
that? CC Rider? Yeah, that's exactly it. Yeah. So we
think we did this, Mindy, we shared it yesterday, didn't we,
Trump dancing with the band when he got off the plane? Remember, Tony,
we shared it? It was part of that wrap package that
we did. We didn't. We didn't air it.
Speaker 2 (01:34:25):
We were going to, we didn't.
Speaker 1 (01:34:28):
Yeah, that was pretty funny pre show.
Speaker 2 (01:34:30):
Maybe you can rally that he hopped out, he's you know,
he's jamming, he's dancing that.
Speaker 1 (01:34:36):
Yeah, yeah, yeah. Yeah.
Speaker 2 (01:34:39):
We also have a video of him today. I don't
know if we showed it of him arriving aboard that
aircraft carrier in Japan and he comes in on these
fancy planes and then they have this elevator down and
the, you know, the music swells and the crowd claps
and the crowd goes.
Speaker 1 (01:34:56):
He's got to hear Tony, show it to you here.
You well, this is the stuff that he loves, you.
Speaker 5 (01:35:09):
Know, yeah, stuff a president of the United States, Donald J.
Speaker 9 (01:35:14):
Trump.
Speaker 1 (01:35:18):
Yeah, and that's stuff he loves, coming down on the
elevator in there, you know, there's the big reveal. So
the, uh, so you get the picture, and pictures of
him dancing. I mean, he's the president. There are
some ridiculous, crazy, you know, how-is-Trump-president moments,
and there are also these scary moments and things he says.
(01:35:42):
And we try to be kind of equal opportunity sharers
when it comes to that. So we bring you both.
Here is Donald Trump arriving in Malaysia, and this is
what we're talking about. There's a band playing there and
Donald Trump is he's getting into it. He's getting into it. Yeah.
Speaker 2 (01:36:02):
Is that the leader of Malaysia who's next to him dancing?
I presume yes, fun fun times.
Speaker 1 (01:36:09):
I mean, it's festive. Have any of you
gotten off of Air Force One arriving in Malaysia? It's
a festive moment. Yeah. Good for him. Yeah, it's, uh,
I'm saying I've never seen anything like that. It's pretty,
pretty crazy. So that was Donald Trump dancing. Look
at all of that dancing, they really
(01:36:30):
get into it.
Speaker 2 (01:36:31):
I'd be willing to bet my lunch that there's alcohol involved.
Speaker 1 (01:36:34):
I think there is. Wow, that's you just run.
Speaker 2 (01:36:43):
At least if he's in a good mood dancing, it
means he's not blowing up boats and sending the National Guard.
We're directing his attention to the music
and the fun.
Speaker 1 (01:36:52):
Right, I'm concerned. I hope you're right. Uh anyway, all right,
Kim's news and the good doctor. The like button...
Speaker 2 (01:37:01):
Your iron rod.
Speaker 1 (01:37:02):
Share the show, Share the David K. Johnston Conversation. We
continue Mark Thompson Show, The Mark Thompson Show.
Speaker 2 (01:37:16):
It's the Mark Thompson Show. I'm Kim McAllister. This report is
sponsored by CoachellaValleyCoffee dot com. Hurricane Melissa is battering Jamaica.
It is one of the most powerful hurricane landfalls on
record in the Atlantic Basin. Life threatening flooding, several feet
of rain, and landslides all expected from Melissa as she
(01:37:37):
crosses the western part of the island nation. The Deputy
chairman of Jamaica's Disaster Risk Management Council is urging people
to seek shelter and stay indoors as this storm, with
one hundred and eighty five mile per hour winds crosses
the island. He said, Jamaica, this is not the time
to be brave. This is going to be a bad one.
(01:37:59):
Israeli warplanes are carrying out powerful strikes in the Gaza
Strip, that according to a statement from Prime Minister Benjamin
Netanyahu's office, after he accused Hamas of violating the ceasefire.
Netanyahu ordered the new attacks after hostage remains
returned yesterday were identified as those of a deceased
hostage who was previously recovered from Gaza in twenty twenty three.
(01:38:23):
So it's unknown if that played a role in these attacks
or what's happening with that, but it looks like the
ceasefire is definitely being broken. The Department of War says
the United States conducted more strikes on alleged drug boats.
Secretary of War Pete Hegseth saying three lethal strikes on
four vessels were carried out in the Eastern Pacific yesterday.
(01:38:44):
A total of fourteen people were killed in those strikes.
One person did survive. The United States has struck several
suspected drug boats in the Caribbean as well as the
Pacific Ocean, drawing criticism from some Democratic lawmakers, as well
as the presidents of Venezuela and Colombia. Democratic-led states
and Washington, D.C. are now suing the Trump administration as
(01:39:06):
millions are set to lose food assistance on Saturday. This
as the federal government shutdown stretches to the four week mark.
On Saturday, funding runs out for food aid programs for
more than forty million Americans. The poorest among us will
be in dire straits. Visitors to federal parks, speaking of
the shutdown, might want to avoid the public restrooms because
(01:39:27):
many across the United States haven't been cleaned since the
shutdown started weeks ago, so the facilities are a little
worse for wear. Some parks, like Rock Creek in Washington,
D.C., have set up porta potties. Sanitation crews from
the National Park Service are among the nine thousand plus
of the agency's fourteen thousand five hundred employees furloughed.
Speaker 1 (01:39:49):
This is the you know, the practical aspect of the
government shutdown. I grew up in Washington, so I know
Rock Creek and I know the bathrooms well. I mean,
I was using them for decades. But I
would just say that, you know, this is the stuff that
is not thought about and will never be thought about.
Just the fact that the defacing of national monuments, that
(01:40:10):
because there are no park police around because there are
no rangers around, and all the kind of things that
you can undo, but it's just going to cost you
more money on the other side. And then there are
things you cannot undo, you know. I mentioned the defacing
of monuments that was the case during the last shutdown
during the first Trump administration, and there were things that
were lost forever, I mean defaced permanently. And so again,
(01:40:34):
the bathroom's overflowing, the government coming unraveled, you not being
able to get anybody at the Social Security Administration for
those things that you need desperately, and of course the
monies as you suggest, Kim, that have run out for
snap recipients and those who desperately need public assistance. It's
(01:40:54):
a travesty. It's a travesty. It's grotesque, and the
fact that it's pure politics is despicable.
Speaker 2 (01:41:02):
I know, doctor Daniel has arrived, so I'll just run
through the headlines of the next few stories just so
people know what's going on. President Trump is appealing his
New York hush money conviction in the Stormy Daniels case.
Remember those thirty four felonies. Trump's lawyers claim the charges
should never have been brought against Trump. They say the
case was politically motivated. Of course, he was found guilty
(01:41:23):
in connection with making hush money payments to Stormy Daniels.
Right before the twenty sixteen election, the Attorney General of
Texas announcing a new lawsuit against the drug makers of
acetaminophen. Look at how I said it. It
rolled right off the tongue. I
Speaker 1 (01:41:39):
Said, known as Thailand. All, oh yeah.
Speaker 2 (01:41:44):
The Republican Attorney General Ken Paxton says he's suing the
manufacturers for deceptively marketing Tylenol to pregnant mothers, which
is an accusation that's been made by President Trump. Tylenol is
one of the only over the counter pain medications
considered safe for pregnant women. But the Attorney General of Texas
is claiming that the makers have ignored the evidence that
(01:42:08):
acetaminophen...
Speaker 5 (01:42:09):
Oh no, no, as Tylenol.
Speaker 2 (01:42:14):
Yeah, it could cause autism and ADHD. And we'll ask
the good doctor about that. House Republicans asking for the
Justice Department to launch an investigation into the use of
an autopen by the Biden administration, specifically Pam Bondi looking
into the use of the autopen for pardons. California's special
(01:42:35):
election on Prop fifty, the measure that could reshape California's
congressional districts, has seen over four million ballots returned so far.
Interestingly, it's about evenly split when you look at registered Democrats and
Republicans returning those ballots. So America is getting skinnier. The
weight loss drugs may be a contributing factor.
Speaker 1 (01:42:55):
Well that we should ask doctor Daniel about that. Yeah.
Speaker 2 (01:42:58):
According to a self-reported Gallup poll, for the first
time in more than fifteen years, obesity rates in the
United States have fallen from almost forty percent in twenty
twenty two to thirty seven percent.
Speaker 1 (01:43:10):
Wow, that coffee, though. I'm washing down my Ozempic with it.
I gotta do it.
Speaker 2 (01:43:17):
This report sponsored by Coachellavalleycoffee dot Com.
Speaker 1 (01:43:22):
I've been.
Speaker 2 (01:43:25):
Oh is it ever good? Coachella Valley Coffee dot Com
the Amazing Coffee, Amazing Tea. Please go to the website.
They have copious tasting notes listed so it's easy to
figure out what you might like. And we have a
super special secret code for the Mark Thompson Show. It
is mark T. No spaces mark T will get you
ten percent off. I'm Kim McAllister. This is the Mark
(01:43:47):
Thompson Show, The Mark Thompson Show.
Speaker 1 (01:43:52):
Come on, everybody, We're gonna finish strong. Got a real
doctor in the house. Quickly, a note: the super
chats and super stickers are still live, as we are
a live show every day for two hours, two to
four Eastern time. Pray for the poor, and that they're receiving
(01:44:14):
food from somewhere. This is so terrible and we all
feel this in one way or another, says unvarnished Clarity Sessions. Yeah,
I agree, I mean, and it's more than well. I'll
just say a lovely sentiment, pray for the poor. But
I'm angry for the poor. I'm angry about this. This
is outrageous, absolutely outrageous. Copious is a ding word, says Nulofidious. Yeah,
(01:44:40):
take it. And my doctor said, do not take I
you prof And I'm going to ask our doctor. The
official doctor of the Mark Thompson Show is doctor Michael
Daniel is coming up in seconds.
Speaker 2 (01:44:55):
He's having a bit of a connection issue, so we're working.
Speaker 1 (01:44:57):
On it, all right. That is not related to acetaminophen.
Speaker 5 (01:45:04):
Commonly known as Tylenol.
Speaker 1 (01:45:06):
Oh yeah, okay, it's not related to Tylenol. Although the connection...
but Myra says, thank you for supporting independent media. That
is something worth noting. We are independent media. We do
need your support. Without your support, we go away. I
mean it's that simple. So thank you guys for your support.
(01:45:28):
Ten dollars is the kind of support we're talking about.
From James Bliss, who has the coolest name in the
chat today that I've seen, anyway: Kim, please give Mark a
time out. He needs to go to his room. Well, I
didn't... good day, sir. I didn't realize that.
Speaker 2 (01:45:46):
Do you feel like you need a break?
Speaker 1 (01:45:48):
I don't know. This is a man with anger. This
is a man who has anger towards me. I didn't realize that.
I'm sorry. It's the anger that I have and the
resentment I have is buried under other anger and resentment.
That is really the way my world is set up.
Got to dig down, Yeah, what's that? You got to
(01:46:09):
dig down deep to find it? You got to dig
down two levels. I just got off the phone with
the IRS after calling for ten days, says Chaplain Friend,
to solve some tax issues. I was calling them about issues that
were generated by AI by mistake, per the IRS representative. Speaking
(01:46:30):
with the representative, she was frustrated, sad and tired. I'll
bet she's not getting paid. It's crazy for these workers right now.
I mean, it's brutal for them. That's brutal. Yeah yeah, yeah,
yeah yeah yeah.
Speaker 4 (01:46:43):
On the uh.
Speaker 1 (01:46:46):
The boycott, or I should say, you know, the shutdown
and those who were not showing up, there was... oh,
it went away. I thought it was a good comment.
Speaker 2 (01:46:57):
Uh.
Speaker 1 (01:46:58):
Shadow producer Calvin Wong, I believe, suggested, on
the bathrooms that are no longer in service in the
federally maintained national parks, he said, take a crap
in the woods like a bird does, which is always a...
thank you very much for that tip. Yeah. And
(01:47:21):
then somebody else suggested always keeping a roll of toilet
paper in the car.
Speaker 2 (01:47:27):
That's smart, is it?
Speaker 1 (01:47:29):
I think it's a bit.
Speaker 2 (01:47:30):
That's a boy scout maneuver. Always be prepared.
Speaker 1 (01:47:33):
You know, it is smart. It has never really occurred
to me. I don't know. All right, we've got
the doctor back. All right, let's do this, the Mark
Thompson Show. This guy is an honest-to-goodness ER doc.
He takes time out of the ER, he scrubs
down and he spends time with us once in a
(01:47:53):
while when he visits. We do it in a segment called.
Speaker 3 (01:47:59):
He just said he's getting out. He's going to try
to connect via his phone. He just literally, just as
you were introducing him... Hey, yeah, I was really
Speaker 1 (01:48:09):
winding up like a World Series pitcher with
that intro, and now... well, that was just... Anyway.
Michelle Walton said, I lost my twenty three dollars a
month food assistance, and my Medicare Advantage healthcare plan is now
nonexistent. I had to find a new plan. Copays
are high. I can't afford to lose any more weight.
(01:48:30):
I'm ninety four on the dot. That's how much Michelle weighs.
Ninety four pounds. Jeez, these kinds of stories are sad
and they should make you angry. And again, this is
really what's happening, for political reasons. Now we have doctor
(01:48:53):
Daniel who is a doctor of Internet connections as well
as a doctor of the medical profession. As
he sets up, I'll read this last comment from CC
Rider, and thank you for a ten dollar super chat,
(01:49:14):
because the super chats are one of the ways in
which we support ourselves. So a big shout out
to CC Rider. CC Rider says, please share
the legal citation alleging the use of an autopen is
a crime. I'll wait. Yeah, this is Trump being angry
saying I want you to find something on Biden and
(01:49:37):
using that autopen. You know that's the tantrum that they're
responding to. So hi, Mark and Kim says velvet. Velvet
is a strong name. I love that as well. All right, Um,
Judy says I lost my twenty four dollars a month
food stamps too. When I ordered groceries over the weekend.
(01:49:57):
I couldn't apply any EBT. Yeah, I mean that's the
and isn't that the problem? All right? Uh? Do we
have him now? Tony looks okay, all right? Uh this
is house calls. No, sorry, I didn't know it will
(01:50:20):
be when we dropped the video in post. This will
all be cleaned up, everybody, don't panic. Tony has control
over the open. Doctor Daniel has an open. Is it
in the system? We've used it a few times. Isn't
it in the system? Kim?
Speaker 6 (01:50:38):
Are you looking for some good medical advice?
Speaker 5 (01:50:42):
Supposing you brought the light inside the body.
Speaker 6 (01:50:44):
Let's hear what a real doctor has to say. This
is Vital Signs on the Mark Thompson Show.
Speaker 1 (01:50:51):
He is a real doctor out of the ER. He
scrubs down to spend some time with us, doctor Michael Daniel.
Look at you. You're a handsome devil with your short-
sleeved... what is that thing called, the smock thing? It's not... yeah,
scrubs, exactly. That's it, scrubs.
Speaker 4 (01:51:08):
I love it.
Speaker 12 (01:51:10):
Yeah, it's just in between a bunch of shifts. So
forgive the wardrobe. The fewer wardrobe changes, the better.
Speaker 1 (01:51:17):
Yeah, yeah, yeah, very very cool. Well, it's great to
have you here. I want to quickly, first, I'm going
to start with your favorite President, Donald Trump, who has
referenced MRI scans. He was like, kind of, it's a
weird brag that he was talking about these MRI scans,
but he did talk about them, So what could they be?
What are they? To what extent do you know anything
(01:51:40):
about this MRI thing?
Speaker 9 (01:51:42):
Well, if you think back to October tenth, when he had
his physical at Walter Reed, the report
Speaker 12 (01:51:48):
After that mentioned that he had had quote unquote advanced imaging,
but they didn't really give it any more details about that.
Speaker 9 (01:51:56):
And then to just suddenly announce in the last couple
of days, oh, by the way, I also had an
MRI and it was perfect,
Speaker 12 (01:52:03):
It just raises a lot of red flags for somebody
who's seventy nine and
is supposed to be at the peak level of
cognition and functioning to run our country.
Speaker 9 (01:52:16):
And as a doctor, you know, an MRI.
Speaker 12 (01:52:18):
Is not part of a routine physical exam at all,
and I just wish they would be a little more forthcoming.
And so you know, that's just going to lead to
a lot of speculation, which a lot of you know,
doctors appearing on
Speaker 9 (01:52:33):
TV have discussed, and we're going to do that here
as well.
Speaker 12 (01:52:37):
And so you know, remember back when he met President
Putin in Alaska and he was walking a little wobbly
down that red carpet, and then there was a couple
other instances where we saw him falling asleep and it
looked like he had a little bit more of a
pronounced facial droop than you would expect. And you know,
when I hear MRI, knowing what's been
(01:52:58):
going on with him, and then especially last night, I
don't know if you saw the video of him in
Japan with the new Japanese Prime Minister walking around their
ballroom there, and he had to be directed every which
way either by her or one of the soldiers there.
Speaker 9 (01:53:14):
He had no idea what was going on.
Speaker 12 (01:53:17):
I mean, I'm sure there's a lot of pomp and circumstance
involved in what you're meant
Speaker 1 (01:53:22):
To do there.
Speaker 9 (01:53:23):
But so going back to the MRI, I mean, I
would I would put money on a brain MRI.
Speaker 12 (01:53:30):
Usually we use MRIs after the fact if somebody's had
a stroke and we want to see if there's evidence
of like smaller microvascular strokes.
Speaker 9 (01:53:40):
that just come from, you know, bad heart disease and
plaque in the brain, just like you would see in
the heart. And so my suspicion is some sort of MRI.
Speaker 12 (01:53:48):
And even if he says it was perfect, meaning there
was nothing acute, I mean, they might be looking for
changes in the brain structure and in the blood
Speaker 9 (01:53:58):
flow, to sort of follow up on what's been
going on with him.
Speaker 1 (01:54:02):
Yeah, I mean, there's nothing damning about having an MRI.
Although it's interesting, because, as you say, I mean, it almost
would have to be. As you're speaking, it makes total
sense that it would be something around the brain, because,
I don't know, he's not MRI-ing his pitching shoulder,
you know what I mean, He's so fascinating them the lesson.
(01:54:24):
It was just weird that he was so open about it.
So he was open about the MRI, he wasn't open
about what the MRI was really about. Tell me about
flu season and the best time to get the flu shot,
I kind of feel like that's got to be close
to now.
Speaker 12 (01:54:38):
Yeah, we like to say flu before boo, meaning you
got to get that flu shot right before Halloween, and
Halloween is Friday. And the reason we say that is
because it takes about two weeks for your immunity to
build up to the point where that flu shot is
providing protection. And so we want that to be well
before people start traveling for Thanksgiving, which you know, the
(01:54:59):
biggest travel time of the year for Americans.
Speaker 9 (01:55:02):
And you've got to give it that two weeks.
Speaker 1 (01:55:03):
And so what kind of protection is it providing this season?
I mean, it does seem to change season to season.
Speaker 12 (01:55:10):
Correct, So if you look at last year, the data
that we had in retrospect was that it was forty
percent effective against preventing symptomatic disease symptomatic disease such that
you would need to be seen by a doctor so
severe enough that you would end up in the er
or seeing your primary care doctor. And then this year,
because we always look at the Southern hemisphere for their
(01:55:32):
experience of the flu season because they have winter before us,
and so it's always usually Australia, and the data this
year shows that it's it's improved, so it's fifty percent
protective against symptomatic disease, and so that means that there's
probably even greater protection against you know, milder symptoms or.
Speaker 9 (01:55:49):
You know, keeping you out of the hospital, and
preventing death, of course.
Speaker 12 (01:55:52):
And so the other thing that's really interesting this year
is that this is the first time you can order
the FluMist nasal spray vaccine to be sent to
your house. I actually just ordered mine. I was going to
give it a shot. I ordered mine last week
and it actually comes next week. But it was, you know,
you put in your insurance information, you put in your
medical information. I was approved within five minutes. You just
(01:56:15):
pay like nine dollars for the shipping costs. And so
that's a nasal spray vaccine.
Speaker 9 (01:56:19):
And the theory is that if you think about all
respiratory viruses.
Speaker 12 (01:56:23):
They usually get into our system via the mucosal membranes,
the lining of the nose and the mouth, and so
putting that vaccine in your nose to increase IgA, which
is the immunoglobulin that protects against viral respiratory diseases,
is supposed to enhance the protection, so that you have more of
a shield right in the area where you usually
get infected. And it seems to be that in theory
(01:56:46):
it could do better with that than getting a shot in the arm. So we'll see what the data shows after the year. But that's a good option for people that just don't like needles.
Speaker 1 (01:56:57):
That's really interesting. And it does work? You're saying you send in a form, you sign away for approval, and then you get it at the pharmacy?
Speaker 12 (01:57:03):
Is that it? No, they mail it to your house and you put it right in the fridge, and the shelf life is maybe like a few weeks, but you know it might be
Speaker 9 (01:57:14):
A little late to order it now.
Speaker 1 (01:57:15):
Yeah, yeah, yeah, I think a week and a half. And I don't mean this as anything, since I don't really know anything about medicine — this may be a totally wild thing — but it seems almost medieval to suggest that just because the flu manifests as a thing where you have a lot of nasal, respiratory stuff, somehow
(01:57:36):
spraying into the nose is a more effective way. I mean, it's your whole immune system that's reacting, isn't it?
Speaker 9 (01:57:42):
Yeah, you're one hundred percent right.
Speaker 12 (01:57:43):
I mean, it just makes so much more practical sense. For these viral respiratory illnesses, why are we only doing these?
Speaker 1 (01:57:51):
And so yeah, yes and no. I'm kind of saying I don't know. Or isn't the injection just working your whole immune system? It's basically reminding your immune system of this thing that may attack it later, isn't it? You don't have to remind your nose; your whole immune system knows. Yes, you're right.
Speaker 12 (01:58:10):
With the shot in the arm, it sort of spurs the entire immune system. But again, when we talk about specific frontline protection against respiratory infection in the nose and the mouth, the theory is the nasal spray does more to boost IgA than a shot, just because
Speaker 9 (01:58:29):
It's right there.
Speaker 12 (01:58:30):
I mean, eventually the shot in the arm is gonna, you know, catalyze the IgA and your immune system to mobilize to your nose and your mouth to provide protection. It just seems like, I don't know, we'll see. Well, I'll be the guinea pig for how well this works, because I always get the flu shot. I never get the flu, knock on wood. So we'll report back.
Speaker 1 (01:58:49):
That's great, that's great. My last question is to double back to our fearless leader, Donald Trump. He was peacocking about it. He called it an IQ test that he was acing, but in reality it was a cognitive exam, the sort used to assess someone, I think, after a stroke
(01:59:12):
or after some kind of cognitive hits that he's taken. So, you know, I don't know, he challenged AOC to take the same kind of test that he's taken, and it was an odd thing to peacock about. And the MRI that he was talking about, that we mentioned, was sort of adjacent to that same test.
(01:59:35):
So it's kind of consistent with the story you were
talking about involving the possibility of a stroke or assessing
for a stroke.
Speaker 9 (01:59:42):
Yeah, No, I think you answered the question. I mean
he's confusing the two tests.
Speaker 2 (01:59:46):
I think he is.
Speaker 12 (01:59:49):
When he was challenging AOC, I think in his mind he was thinking of an IQ test, right. But the test that he had was for cognition, the test we use to screen for, like, early onset of dementia.
Speaker 9 (02:00:04):
So that's the test that he got.
Speaker 12 (02:00:06):
But I think he was, you're right, peacocking, thinking that he was challenging somebody on an IQ test. Obviously AOC is not at the stage where she's taking cognitive tests yet.
Speaker 1 (02:00:18):
So I mean, this is the, you know, "person, woman, man, camera, TV" thing. It's that kind of basic thing: they give you a bunch of numbers, and you have to repeat them back in reverse order. These are the kinds of things you do. It's not an IQ test, it's a simple cognitive test. So it's funny that he's — look,
(02:00:41):
he's a creature of his own universe, and everybody has told him that he knows everything about everything. He'll tell you that he knows everything about everything, and they all just bow. So when it comes to medical science, it becomes particularly concerning. You have RFK Junior, who's also completely subordinated himself to Trump. And then in closing, I
(02:01:03):
wonder if you can just touch on what's happening at HHS. I mean, I guess it's all alarming, but I'm wondering if you see any daylight. Have they backed off a bit on any of their recommendations, or is it all just pretty grim?
Speaker 12 (02:01:19):
Yeah, you know, to bring it kind of full circle to Trump's visit to Walter Reed on October tenth, just to remind people, he got the flu shot and the updated COVID vaccine that day too. And so it's kind of funny that you have your president getting the shot that your HHS secretary is saying not to get and trying to limit access to. So just think
Speaker 9 (02:01:43):
about that for a second. So at the end of the day, I mean, I think it's all performative. It's all performative politics for their base.
Speaker 12 (02:01:52):
But you know, like we've said before, all of RFK Junior's kids are fully vaccinated too.
Speaker 1 (02:01:57):
And so, you know, have the recommendations for pregnant women changed because of this whole Tylenol thing?
Speaker 12 (02:02:04):
No, the official recommendations are still the same, and we have, you know, multiple medical societies, including mine, the American Academy of Emergency Physicians, along with the American College of Obstetricians and Gynecologists, and pediatrics.
Speaker 9 (02:02:19):
You know, we're all in line on this: the recommendations do not change, and Tylenol is still the safest option.
Speaker 12 (02:02:25):
And just to be honest, in the ER, you know, there's that moment — we see a lot of pregnant women, and they have a fever, they have pain, and you suggest Tylenol, and
Speaker 9 (02:02:35):
there's this, like, slight hesitation I've noticed.
Speaker 2 (02:02:37):
I feel it too.
Speaker 9 (02:02:38):
I'm like, oh, boy, Like I wonder what they're gonna say.
And you know, for the most part, nobody has been resistant.
Speaker 12 (02:02:45):
I think people get it. And the big caveat, especially, is when pregnant women come in with fever. It's so dangerous to not treat that fever. It's dangerous for the mom if that infection continues unabated, and you're allowing that fever, that illness, to rage. It's
(02:03:05):
a danger for her, it's a danger to the baby. You know, most people do still seem to accept the medical recommendations of the doctors standing in front of them, which was great to see. I mean, I've only had a couple of patients where, you know, they've expressed, well, I've heard there's this concern about Tylenol.
Speaker 9 (02:03:22):
But you know, it's an opportunity for us to educate.
Speaker 12 (02:03:26):
And to explain, and people are receptive to it when I politely educate and explain why it's still important.
Speaker 9 (02:03:34):
So, yeah, that's where we're at.
Speaker 1 (02:03:37):
Just tell me this, it'll remind me: what's the most common thing you see in the ER? Oh
Speaker 12 (02:03:44):
boy. Well, for me, and the data supports it, it's abdominal pain, like stomach pain.
Speaker 9 (02:03:51):
That's the most common reason people come to the er.
Speaker 12 (02:03:54):
And I expect, you know, after the Dodgers game last night, we're going to see a lot of people coming in with just stress-induced abdominal pain, or, you know, from heavy drinking or just binge eating in front of the TV watching that eighteen-inning, seven-hour classic.
Speaker 1 (02:04:12):
Wow, James, that's why you're in the er because of
binge eating on the couch watching the game. That's a
good reason to be in the er. Doctor Michael Daniel,
thank you. You can follow doctor Michael daniel on Instagram
and across social media. Daniel is dai g n au lt.
(02:04:32):
I'm spelling it for people that are listening on Spotify and iHeartRadio. He has got a lot of stuff out there, so google him and you'll, I think, learn something along the way. Very good information. And we love that you stop through here every so often. Thank you, Doctor Daignault. Thank you, Mark. Good to see you. Okay,
good to see you, sir. That is Vital Signs for the week.
Speaker 6 (02:04:52):
Check your vital signs again next time, only on The Mark Thompson Show.
Speaker 1 (02:05:02):
Yeah, he's a good doctor. He's a good doctor. Yeah, yeah. So, uh, tell me what I am missing and where — oh my god, it's gotten late. I know, I didn't even notice. Geez. Well, I guess I have to wrap. I have another interview to do in
(02:05:22):
a couple of minutes. I've got to — you've got to wrap this up. Well, it's been a very productive visit. I feel it's gone very well. The New York Times is reporting, says Randy, that the US sank four more boats operating in the Pacific. There has been no evidence or information provided that we actually know who it is that we're killing. That is concerning, to say the least.
Speaker 2 (02:05:45):
As I reported earlier, fourteen dead in these attacks against
these boats. One survivor in this one.
Speaker 1 (02:05:53):
Robert Schwartz says Trump doesn't want anything to be done
on the government shutdown. He can do what he wants
if the House is out of the way. Seems the
House members would not relish giving away their power, but
alas they have. Certainly it's true MRIs don't work, says Dirk Diggity, if you're smarter than the MRI. Right, I forgot.
(02:06:18):
oh man, what a state we are in. I can
only tell you that tomorrow brings a really special show.
I believe John Roffman returns tomorrow, everybody, and also a
true hero in the world of getting legislation passed despite
(02:06:40):
the headwinds of big money and big politics. Doctor Jennifer Conrad.
She's a veterinarian who got legislation passed in California that
was opposed by some of the biggest money in that field.
We'll talk about how that happens and how the difference,
really the difference can be made by one person. And
(02:07:01):
Belinda Weymouth will save the planet one Wednesday at a time; she is here tomorrow with It's the Planet, Stupid. So that's happening tomorrow as well. Thank you
all for joining today and supporting the show in all
the ways you do. Thank you, Tony; thank you, Sam Stevens, for The Mark Thompson Show. Bye-bye. Some fun,
(02:07:23):
some heavy, and more, but we are out of time until tomorrow. Bye-bye.