Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
Broadcasting live from the Abraham Lincoln Radio Studio, the George
Washington Broadcast Center, Jack Armstrong, Joe Getty.
Speaker 2 (00:18):
Armstrong and Getty, and he's Armstrong and he's Getty.
Speaker 3 (00:33):
Welcome to a replay of the Armstrong and Getty Show.
We are on vacation, but boy, do we have some good.
Speaker 2 (00:39):
Stuff for you. Yes, indeed we do.
Speaker 4 (00:41):
And if you want to catch up on your A&G
listening during your travels, remember, grab the podcast Armstrong and Getty.
I demand you ought to subscribe wherever you like to
get podcasts.
Speaker 3 (00:49):
Now on with the infotainment. Elsa is a liar. Stay tuned.
It's all about AI. Speaking of AI.
Speaker 4 (01:00):
Ah yeah, so loyal listener Mike sent us an email
with an intriguing link to ChatGPT. He was trying
to remember specifically how I worded something the other day
about swimming in a pool of Satan's sulfurous boiling hot
urine or something like that, and, uh, he couldn't remember.
He talked about us, uh, replacing Rush Limbaugh as his favorite.
(01:21):
You know, Clay and Buck never talk about wading in
a sea of urine, at least as far as I've heard.
They do not have my flair for wordsmithing. Anyway,
so he asked ChatGPT, hey, what did Joe say exactly?
Speaker 2 (01:35):
And it told him.
Speaker 4 (01:36):
Immediately. And then, uh, then he says, uh, it continued...
"I decided to train it by saying this is
my exact sense of humor," and it continued to come
back with more and more quotes from the show, most
amusing to Jack and myself. It also came back with
comments like, uh... we had a conversation about Putin being
(01:56):
a war criminal. I said, I trust this guy like
I trust gas station... "They don't hold back on blunt
global judgments and those instantly memorable visuals. Okay, Jack Armstrong
riffing on political caution, boarding up my house when I
see you're wearing a helmet as well. That absurdist
twist on paranoia is pure A and G."
Speaker 2 (02:16):
And then this is one of my favorites.
Speaker 4 (02:19):
On flights packed with wheelchairs, and, as Joe says, "they're
magically healed during the flight. It's a miracle. They lampoon
the system while staying hilarious." Thank you, ChatGPT.
Speaker 3 (02:33):
Oh my god, "they lampoon the system while staying hilarious."
Speaker 4 (02:40):
That's ChatGPT's review of our show. Well, so I
agree with you that AI does have ways to.
Speaker 3 (02:46):
Go. Or it's hallucinating completely, which is the problem we've
got here with Elsa. So Elsa is the new AI
thing they're using at the FDA. One of the most
concrete claims that anybody who's pro AI has been making
since its inception is what it could do with, like,
(03:07):
medicines and health care and stuff like that, that it
could just figure out all kinds of complicated health things.
Speaker 4 (03:14):
Basically, yeah: diagnosis, figuring out drugs, all kinds of different things,
way faster than human beings could.
Speaker 3 (03:22):
And so this Elsa has been working on some of
this stuff at the FDA, and they just have discovered
that it is doing a lot of hallucinating in the
way that AI does, and nobody's exactly sure why or
if it can be stopped. But for some reason, like
my one kid used to be when he was six:
if you asked him a question and he didn't know
(03:43):
the answer, he would make something up, because he felt
like he just had to give you an answer.
For some reason, when my son was a little
kid, he couldn't say "I don't know"; he would make something up.
But I finally caught on to that, and I'd say,
you don't have to do that. Just tell me "I
don't know."
Speaker 2 (03:58):
It's perfectly fine to say I don't.
Speaker 4 (04:00):
Know. Yeah, kids are rewarded and praised for knowing the
correct answer to a question.
Speaker 3 (04:04):
So I get why a six-year-old would do that.
Why does AI do it? Because it gets a cookie, or
you say, man, you're smart? Anyway, Elsa was making stuff
up, and it took them a while to figure this out.
And, uh, some within the FDA say it's really,
really not as helpful as they were hoping, in that
you have to double-check everything to the point of
(04:26):
you're practically doing the research anyway, to, like, replicate what
AI said.
Speaker 2 (04:32):
And it goes so far as to like make up
studies with.
Speaker 3 (04:35):
All the data, you know. And, you know, "we
tested eighty thousand islanders, and sixty percent of the islanders
over the age of forty-five who had smoked,"
blah blah blah, and it just makes this s... I almost
dropped the S-bomb.
Speaker 4 (04:47):
It just makes this crap up. This is audio, so
you don't know I'm making my "what?" face. And I'll
bet a
Speaker 2 (04:55):
Lot of y'all are too. That's crazy. All the details
and scientists names and stuff. Oh yeah, yes, exactly.
Speaker 3 (05:03):
Well, it's not that surprising, given the fact that
we've heard the stories of it making up legal cases,
where lawyers, you know, cite a legal case
and there are names and an incident, and it.
Speaker 2 (05:13):
Cites the law, Google v. Tennessee. It's completely made up.
Speaker 3 (05:20):
What an interesting thing that just seems to occur, apparently,
across multiple AI platforms. It's not like it's just Grok
or just OpenAI. All AI does this. It's
got some need to create an answer.
Speaker 4 (05:37):
Sometimes it's practically impossible to explain technically, and you find
yourself going to... you said it has a need, and,
you know, I don't know about you, my coffee machine
doesn't have any needs whatsoever. It's a mindless automaton. I mean,
it's just a machine. Uh, what, what is it that
drives it? There's another cheat. I've used a human emotion
(06:00):
to try to explain this phenomenon.
Speaker 2 (06:02):
It's just nuts.
Speaker 4 (06:04):
So, speaking of nuts, before I forget, I brought
to you yesterday the article about the guy who was
having manic episodes and borderline psychosis, and ChatGPT just
kept egging him on.
Speaker 2 (06:18):
I read further.
Speaker 4 (06:19):
Into that, and it seems to have a need,
and I'm sure it was programmed into it in one
way or another, to be agreeable and enthusiastic, which
is kind of fun if you're using it for something.
You know: I'm going to London, I'm really into history,
(06:39):
I'd appreciate your recommending things. And it does, and says
we can tell you about more Winston Churchill sites if you like,
and I say yes, please, and it says great, and
it's kind of endearing. But evidently it goes way too
freaking far.
Speaker 3 (06:52):
It does do that. Yeah, I had never really noticed
it until you said it. It is cheerful and enthusiastic,
which kind of gets you all excited about it.
Speaker 4 (07:00):
Yeah, including when you're delusional and heading toward a mental breakdown.
"No, you're right, you are smart. It's the other
people who are crazy." Oh boy, that's not helpful.
Speaker 3 (07:10):
No, I haven't had a hallucination yet where, like,
you know, I'm in a town and I say,
hey ChatGPT, this is where I
am. What's.
Speaker 2 (07:20):
The best breakfast place?
Speaker 3 (07:21):
And I end up going to Forty-Third and H
Street and, you know, it's a tire shop, and it
just completely made up a place that serves, you know,
Mo's pancakes. You got to go to Mo's Pancakes if
you're in Seattle.
Speaker 2 (07:34):
Maybe that'll happen someday.
Speaker 5 (07:36):
Uh.
Speaker 3 (07:36):
Anyway, briefly, just to get back to this: so, the
agency was already using Elsa, their AI chat thingy,
to accelerate clinical protocol reviews, shorten the time needed for
scientific evaluations, identify high-priority inspection targets, blah blah blah.
Speaker 2 (07:51):
There at the FDA. But.
Speaker 3 (07:52):
Now they've had to pull way back because of the hallucinations.
And they talk about how it's still very very handy
for organizational stuff like i'morizing all the notes from a
meeting into a handy short thing that everybody can read
and digest much faster than human beings and all that.
Speaker 2 (08:09):
So AI is really good at that sort of stuff.
But that's wild.
Speaker 3 (08:12):
Wouldn't it be something if this all gets
stymied because you can't figure out why it just lies
sometimes out of nowhere? Yeah, we read to you that
list of instructions one of our beloved listeners gave to
whatever AI system he was using, and it was
damn near a dozen different instructions about: if you
(08:35):
do not know, say so; do not create anything; do
not make anything up; if you are speculating or going
on insufficient information, tell me that you are speculating. It
was, again, at least half a dozen, and
I think closer to a dozen, different very specific instructions
on that level.
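The instruction list the listener described can be sketched in code. This is a hypothetical reconstruction, not his actual wording: the rule text, the list name, and the function are all illustrative, and the general approach is simply front-loading "say you don't know" rules into a system-style prompt.

```python
# Hypothetical reconstruction of the listener's anti-hallucination
# instruction list; the exact wording is not in the transcript.
ANTI_HALLUCINATION_RULES = [
    "If you do not know the answer, say so plainly.",
    "Do not create anything; do not make anything up.",
    "Do not invent sources, studies, names, or quotations.",
    "If you are speculating, tell me that you are speculating.",
    "If you are working from insufficient information, say what is missing.",
    "Prefer 'I don't know' over a plausible-sounding guess.",
]

def build_system_prompt(rules):
    """Number the rules and join them into one system-style prompt string."""
    lines = [f"{i}. {rule}" for i, rule in enumerate(rules, start=1)]
    return "Follow every rule below in all answers:\n" + "\n".join(lines)

prompt = build_system_prompt(ANTI_HALLUCINATION_RULES)
```

Sent as the system message of a chat session, a list like this is the "half a dozen to a dozen very specific instructions" described on air, though, as the hosts note, it is telling that any of it is necessary.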
Speaker 4 (08:54):
It's weird that that's necessary. But I wonder whether that
sort of thing can be introduced into systems and they can,
you know, rectify this pretty quickly. I would guess they can.
Speaker 2 (09:10):
I don't know that.
Speaker 4 (09:12):
It's certainly one of those... you remember that robot whirling
out of control, and it would have taken off somebody's
head if they got within range, right? I think AI
may be at that stage at this point. It's good
at putting the rivet there in the bumper, but stay
out of range, Jim. If you hear the thing, if
you see the overheating light come on, step back.
Speaker 2 (09:32):
It's just not quite ready for prime time. As they say, what.
Speaker 3 (09:37):
Prices are going up already, or about to go up,
because of a combination of things changing in the world, including tariffs.
Speaker 2 (09:44):
We can get to that.
Speaker 4 (09:47):
Yeah, And I want to get to some of the
best writing I've ever come across on the topic of
when tolerance is taken too far, societies become totalitarian and
utterly intolerant.
Speaker 2 (10:00):
Wow, you've got to have a limit on tolerance. I
definitely want to get to that.
Speaker 3 (10:04):
Bret Stephens, New York Times: Israel is not committing a genocide.
I want to read a little from that, just so
you have some ammunition in case you run into some
of these people that are pushing that narrative hard, and
a lot of people are, including the soon-to-be
mayor of New York.
Speaker 6 (10:22):
Jack Armstrong and Joe Getti the Armstrong and Getty Show,
The Armstrong and Getty Show.
Speaker 7 (10:48):
Hypertension is something that will be recognized with lower numbers.
It used to be one forty was the number you really
paid attention to before thinking about treatment. Now it's one thirty,
and they want to talk about treating that more aggressively. So,
you know, it's interesting. If you look at one twenty...
most people know these numbers, but one twenty over eighty
(11:08):
and lower, that's considered normal. One twenty to one twenty-nine
is considered elevated. But that one thirty number is
where people are really starting to pay attention. If you
have blood pressure that falls into that range for three
to six months, you should try, you know, basic lifestyle changes,
which, you know: diet, exercise, cutting back on salt. But if
that doesn't work after six months, you probably need to
(11:30):
be thinking about medication.
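The cutoffs quoted on air reduce to a small classifier. This is a sketch of the numbers as stated in the segment only (normal below 120/80, elevated 120 to 129, attention at 130, treatment historically at 140); real clinical guidelines weigh diastolic stages, duration, and patient history, and the function name and labels here are illustrative.

```python
def classify_bp(systolic, diastolic):
    """Rough blood-pressure category using the thresholds quoted on air.
    Simplified sketch: actual clinical guidelines are more detailed."""
    if systolic < 120 and diastolic < 80:
        return "normal"
    if systolic <= 129:
        return "elevated"
    if systolic < 140:
        return "130+: attention; try 3-6 months of lifestyle changes first"
    return "140+: the old treatment threshold"

# Example readings against the on-air cutoffs.
print(classify_bp(118, 76))  # normal
print(classify_bp(134, 82))  # the new 130 attention zone
```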
Speaker 3 (11:32):
Yeah, that is... I'm always cynical about this sort
of thing. I know it's not always warranted, but man,
there's a tremendous amount of money to be made if
you lower that number.
Speaker 4 (11:45):
You know, I'm such a fool. I hadn't even thought
of that. I was thinking in terms of, you know,
what is the ideal, you're-a-great-athlete, you're-perfectly-healthy
level of blood pressure. But no, you're right. My god,
there's so much money to.
Speaker 3 (12:01):
Be made. Billions and billions and billions of dollars to be
made over the next many decades if you get that
number lower just a little bit, and then it's all.
Speaker 2 (12:10):
The proof of it is an interesting thought.
Speaker 8 (12:14):
So so I went to my nephrologist not long ago.
Speaker 2 (12:18):
And so somebody reads the bumps on your head to
predict the future? Yes, precisely. I visit him often. No,
it's my kidney doctor, okay.
Speaker 8 (12:26):
And he took my blood pressure and it was one
twenty-three over, like, eighty-one. It was right in the
good zone. And he goes, you know, we want it
a little bit lower. He's like, we're looking for, like,
a one seventeen over seventy-eight.
Speaker 2 (12:38):
Wow. Yeah.
Speaker 8 (12:39):
And I'm like, oh, you have big goals, sir, that's
really low.
Speaker 2 (12:43):
Yeah wow.
Speaker 4 (12:44):
But he says... I saw my doc the other day and
he was thrilled with mine, which was, you know.
Speaker 3 (12:50):
So that's interesting. Came across this story: ultra marathoners, it
would seem, according to a new study, have a higher
likelihood of rectal cancer, colon cancer, than regular people.
And they don't exactly know why. I don't do it,
(13:10):
and that is why I stopped.
Speaker 4 (13:11):
I'd be running an ultra marathon right now if it
weren't for that. Just seems foolish. That's what.
Speaker 2 (13:16):
I was planning for this afternoon. But I guess I'll
call it off. And they don't know why.
Speaker 3 (13:22):
And it started with a doctor who noticed, man, I
see an oddly high number of ultra marathoners here. As
an expert in colon cancer, it seems odd
that I'd come across this many. So then they looked
into it, did the study, and there's some connection that they have
(13:42):
no idea about what. But wow, I mean, I'm intrigued. I
want to know more. I'm intrigued by ultra marathoning in general.
Remember our old producer Scott? Buddy Scott. I
think he had high blood pressure; that was what got
him started. His blood pressure got a little high, he
was worried about it, as his dad had had problems, and
he became an exercise nut. Still is. He's in tremendous shape.
(14:05):
But he became an ultra marathoner. Shout out, Scotty. Yeah,
very, very good guy. His great-great-uncle is mentioned
in Ulysses.
Speaker 2 (14:14):
I'm reading Ulysses.
Speaker 3 (14:15):
Yes, our old producer Scott Sandow, whose great-great.
Speaker 2 (14:19):
Uncle was the world's strongest man. That's right, lifted a
pony over his head. The great Sandow.
Speaker 3 (14:25):
You can find, uh, YouTube videos of him with Edison.
The first Edison film, I think, featured our old producer
Scott's great-great-uncle.
Speaker 2 (14:33):
Wow.
Speaker 3 (14:33):
And anyway, he's mentioned in Ulysses, and I thought, wow,
that's Scott's uncle. That's hilarious. So he
became an ultra marathoner and how.
Speaker 4 (14:41):
Name-dropped in Ulysses, yes, by what's-his-face, the
Irish guy. James Joyce.
Speaker 2 (14:47):
Yeah, that's the one. Uh.
Speaker 3 (14:50):
But I kind of believe in general, you know,
all our jokes aside, that there's a lot
of harm done to your body by running ultra marathons. Your
joints, your... all kinds of stuff. I mean, you're
not built to do that, right? And just the way
various things we do change our brain chemistry.
Speaker 4 (15:09):
I mean, we have a kindergartner's understanding of that as
a species. I mean, science has got a lot
better at it, but I still think we're at the
very beginnings of understanding, you know, the
various things we do and don't do, how they change
our body chemistry.
Speaker 3 (15:23):
So here's a narrowcasting thing about exercise
that I found interesting, and I don't know if it
will ever apply to me or any of you.
Speaker 2 (15:30):
But I've got a friend.
Speaker 3 (15:31):
He's an older guy in his seventies, but he used
to be a really serious bike rider, a competitive bike rider.
And I just bought a really fancy bike, and I'm riding,
but I'm not going to compete or anything
like that.
Speaker 2 (15:42):
I just want to get exercise.
Speaker 3 (15:44):
But he was explaining to me how he was
so good and won so often, and I didn't
know this. It's a matter of figuring out what
the upper level of your heart rate is, and then
you figure out a percentage, and then, if you stay,
like, I forget what it was, two and a half
percent below your peak heart rate, you won't get the
(16:08):
lactic acid in your muscles that causes people to cramp up.
And in long, long running races and bike races, oftentimes
what takes you out is you get cramps. But if
you know what the highest end of your heart rate is,
like the top five percent,
and then you stay two and a half percent below.
Speaker 2 (16:24):
It's just math, he said.
Speaker 3 (16:25):
And if you just figure out the math and you
keep track of your heart rate, you can stay out
of lactic acid thing compete at a high level the
entire race and run by all these people are ride
by all these people who have cramps.
Speaker 2 (16:37):
I thought that was really interesting about the human body.
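The pacing rule the bike-racer friend described ("it's just math") reduces to one line: hold your effort a fixed margin below your measured peak heart rate. A minimal sketch, assuming the two-and-a-half-percent figure from the conversation; the function name is illustrative, and the physiology (staying under a cramp-inducing, lactate-accumulating intensity) is simplified.

```python
def cramp_safe_ceiling(peak_hr_bpm, margin=0.025):
    """Heart-rate ceiling a fixed fraction below measured peak,
    per the racer's rule of thumb (2.5% by default).
    Simplified: real lactate-threshold training uses tested zones."""
    return peak_hr_bpm * (1.0 - margin)

# A rider whose measured peak is 180 bpm would hold 175.5 bpm or lower.
ceiling = cramp_safe_ceiling(180)
```

With a heart-rate monitor, the whole strategy is just comparing the live reading against that ceiling for the length of the race.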
Speaker 3 (16:39):
I didn't know that, and I wonder why that isn't,
like, talked about to more of us, because I've had
that problem before. Like, if you've.
Speaker 4 (16:44):
Ever, uh... It has been, I think, "slow and steady wins
the race." I always thought that was ridiculous, because a
rabbit could whoop a turtle's ass in a race and
everybody knows it. But that was the ancients trying to
tell their warriors: don't get tired out.
Speaker 3 (16:57):
Uh, you know, yourself, that is exactly right. They didn't
know the math on it or whatever. They just had
the human experience of it. They didn't know from lactic acid. Yeah,
you start out too fast and all of a sudden
you got pain, that pain in your side you get,
or your... yes, or whatever. You got to stay
just below that. That's interesting, that that is probably where
(17:19):
that comes from, I'll bet. I'll be damned. And again,
haste does make waste. Are you going to cancel yours
too, Michael? You're canceling your ultra marathon for
this afternoon?
Speaker 2 (17:30):
I am, yes. The Armstrong and Getty Show.
Speaker 1 (17:33):
Yeah, more Armstrong and Getty, podcasts and our hot links.
Speaker 2 (17:39):
The Armstrong and Getty Show.
Speaker 5 (17:41):
I was drinking so much alcohol, almost a handle of
vodka a day. And alcohol is the most destructive drug, not
just to your body, but it puts you in more
danger than any.
Speaker 2 (17:52):
Other drug that I've ever experienced.
Speaker 5 (17:54):
And then you add on top of that the amount
of crack that I was using at the time, and
crack cocaine, in terms of your physical health, is not as
dangerous as the situation that you put yourself in to
be able to obtain it.
Speaker 3 (18:07):
That was a pretty interesting little lesson from a guy
who's done the experiment so you don't have to, with
all those different things in his lifestyle. I've known plenty
of drug addicts where it took them a long time to
figure out they're alcoholics, and that's why they
kept doing drugs. You had to stop drinking or
you were never going to stop doing all those drugs.
Speaker 2 (18:28):
Anyway.
Speaker 3 (18:30):
So Hunter Biden did a three-hour podcast yesterday, and
just incredibly funny, incredibly ill-advised comments on all kinds
of different stuff. Here's the end of a long rant
about George Clooney, the actor, who, you'll remember, took
out a full-page ad in The New York
Times saying Joe Biden needed to step down, talked about
how Joe Biden was not mentally competent, blah blah blah.
(18:53):
Hunter didn't like that.
Speaker 2 (18:55):
Why do I have to listen to you?
Speaker 5 (18:57):
What right do you have to step on a man
who's given fifty-two years of his life to the
service of this country, and decide that you, George Clooney,
are going to take out basically a full-page ad
in The New York Times to undermine the President?
Speaker 2 (19:11):
'Cause he's senile is why.
Speaker 3 (19:12):
Well, that, and longtime listeners of the show know nothing
gets me more riled up than people who
got incredibly wealthy in government talking about public service. Give
me an effing break. He gave fifty-two years of
his life to the country? Is that what he was
doing as he ended up with, like, nine homes and
ungodly wealthy? That's public service?
Speaker 4 (19:33):
Yeah?
Speaker 2 (19:33):
He really? He really gave a lot. Thanks for your sacrifice.
Give me a break.
Speaker 3 (19:38):
How do you get that rich in government and continue
to delude yourself that you're in public service?
Speaker 2 (19:44):
That's what you do for a living. You have to,
that's part of the scam. Keep it going.
Speaker 3 (19:49):
That is unbelievable. But anyway, the George Clooney thing. We're
not going to play the long thing, but what's interesting,
and I don't think I was aware of this before:
it says that it wasn't about his dad's competence. It was,
but that might not have been the driving force, because
everybody was willing to overlook Joe Biden's brain. George Clooney
(20:12):
was mad because his hot young wife is a super
anti-Israel activist and was leading a charge,
and even raising money, to have Netanyahu arrested and prosecuted.
Joe Biden had said on the record that's a ridiculous idea.
Really pissed off Clooney's wife, and that's when Clooney turned
(20:34):
on Biden. And I can believe that, because everybody was
willing to overlook his obvious senility; George Clooney got upset
about it when his wife's main purpose in life got
shut down.
Speaker 4 (20:49):
Right. Often things have more than one cause. That's an
interesting angle.
Speaker 2 (20:54):
Should know.
Speaker 3 (20:56):
We're getting now... we're going to get into some stuff
that's more relevant to history: Hunter talking about his dad
leaving the race, and the debate night, and all that
sort of stuff.
Speaker 2 (21:06):
Let's roll with that. Which number was it, fifty-three?
Did you kind of see him dropping out of the race?
Did you see that coming?
Speaker 5 (21:15):
No, no, I was... I thought that we had cleared
all the hurdles that they had set up for us.
For some reason, oh god, the intelligentsia of the Democratic
Party, with twenty-twenty hindsight, believes that Joe Biden should
have considered not running again because of their perception that
(21:40):
he was too old. And so then the drumbeat began,
and The New York Post wrote, I mean, The New
York Times, on a near daily basis, egged on by
the Pod Save America saviors of the Democratic Party, with
what, four white millionaires that are dining out on their
(22:05):
association with Barack Obama from sixteen years ago,
living in Beverly Hills, telling the rest of the world
what Black voters in South Carolina really want, or what
the woman, the waitress, living outside of Green Bay,
Wisconsin really believes. Wait, what the... I mean, I can't
believe that we do this over and over again.
Speaker 2 (22:28):
Yeah?
Speaker 5 (22:29):
Or I hear Rahm Emanuel is gonna run for president.
What, and, like, David Axelrod's gonna run his campaign for him?
That's like, oh boy, there's the answer, there's the answer.
Speaker 2 (22:42):
Yeah, I think it might actually be, you idiot. He
is delusional.
is delusional.
Speaker 3 (22:46):
He is, and I'm surprised he's sober. I wonder if
he is, because he is nonstop "I'm the victim."
I thought we'd cleared all the hurdles they put in
front of us. I mean, just everything is how the
world is set up against you, even though you're all ungodly
wealthy and, at the time, the most powerful people on earth.
(23:06):
Everybody was against you, all the hurdles they put up
against us, whatever.
Speaker 2 (23:12):
That's interesting.
Speaker 4 (23:12):
That was a tacit admission that he was a close advisor,
and he clearly was, because it was all "us." The
second thing is his argument that it was merely the
intelligentsia of the Democratic Party that came up with a
conspiracy to get rid of Biden, and that's why it happened,
as opposed to seventy-five to eighty percent of America
was running around saying the guy's too old to be president.
Speaker 2 (23:35):
Or that the New York Times was your enemy.
Speaker 3 (23:38):
They covered for your dad the entire time.
I mean, they still are, because they
still haven't come forward and admitted that they were ignoring
it to try to keep Trump from getting elected again.
Speaker 2 (23:53):
So, yeah, the New York Times was on your side.
Speaker 3 (23:55):
They had their thumb on the scale so hard for you,
and you're acting like The New York Times was your
enemy and that's what kept you down. That's hilarious. Seriously delusional.
Absolutely amazing that he looks at the world that way,
especially after the debate. Well, and I know you're leading
to that clip, but I'm not done yet. So his...
(24:17):
I lost my train of thought. Oh, his answer at
the beginning: were you surprised your dad dropped out?
Speaker 2 (24:22):
Yeah, absolutely you were. You were surprised that that happened.
I don't know. I can't with these people.
I mean, I just.
Speaker 4 (24:35):
How long before he dropped out was it that I
was saying I'm one hundred percent certain he's dropping out?
Speaker 2 (24:39):
A year?
Speaker 4 (24:40):
Yeah. And granted, you know, sometimes a blind pig
finds an acorn, but no, I was one hundred percent confident.
The math just did not work. You could not get
him through a presidential campaign to election day, and it
didn't happen.
Speaker 2 (24:52):
But Hunter Biden.
Speaker 4 (24:54):
Didn't see it coming at all until the intelligentsia
launched their evil plot.
Speaker 3 (24:59):
I know. The other part that I hate: so, his
level of victimhood and delusion. My dad was in public service,
and The New York Times is against us, and they
put all these hurdles in front of us. You're talking
about millionaires in Beverly Hills and white guys? You're rich.
You're in the top, point, eight zeros, one percent of
(25:21):
the elite in the world, have been
your whole life, got that rich and then didn't pay taxes,
and you're still living in a place, I believe, that's
like thirty thousand dollars a month, paid for by somebody else.
Speaker 4 (25:35):
How do you have the balls to talk about rich
people in Beverly Hills? He is a seriously nutty guy.
Speaker 3 (25:43):
God, I'd say again, I'm surprised he's sober, or I
wonder if he actually is. I don't know if he's
gonna be able to stay sober with that attitude. Anyway,
he explains why his dad failed on the debate stage.
Speaker 5 (25:53):
And then they said, well, look, it's all gonna come
down to the State of the Union speech. It's all gonna come
down to the State of the Union speech. And
he knocks it out of the park.
Speaker 2 (26:01):
And it was that one debate that caused the full back.
Speaker 5 (26:03):
Set, man. And I'll tell you what, I know exactly
what happened in that debate. He flew around the world,
basically the mileage that he could have flown around the
world three times. Yeah, he's eighty-one years old. He's
tired, so give him Ambien.
Speaker 2 (26:15):
To be able to sleep.
Speaker 5 (26:16):
He gets up on the stage and he looks like
he's a deer in the headlights. And it feeds into
every story that anybody wants to tell.
Speaker 2 (26:23):
And Jake Tapper with literally.
Speaker 5 (26:25):
how many anonymous sources? If this was a conspiracy,
Speaker 2 (26:28):
Andrew, you know this.
Speaker 5 (26:30):
Somehow the entirety of a White House, in which you're
literally living on top of each other, has kept their
mouths shut about, you know, like, what? What's the conspiracy?
Speaker 2 (26:41):
Yeah, that Joe Biden got old. Yeah, he got old.
He got old before our eyes. I don't even think
that needs any comment.
Speaker 3 (26:51):
So, some of the details of a rebuttal, some of
the details on that, I'm quoting NewsNation. I haven't
done the fact-checking on this myself, but I assume
they're right.
Speaker 2 (27:00):
He had nine days to get ready for that debate.
Speaker 3 (27:04):
After all his traveling around... his traveling, having to travel
so much, included his choice to fly from the big
G seven or G eighteen or whatever meeting they were having.
He flew back to Hollywood to do the George Clooney fundraiser,
then had to go back to the meeting.
Speaker 2 (27:22):
That's your choice, dude.
Speaker 3 (27:23):
You got a debate coming up in a week or whatever,
and you want to do that, go ahead, knock yourself out.
Speaker 2 (27:27):
By the way, you chose the date for the debate.
Speaker 3 (27:29):
If you remember how that unfolded, Trump and we were
amazed by this at the time, gave up everything venue
day moderators. Trump just said, you know, however you want
to do it, we'll do it. So you chose the date.
Speaker 2 (27:43):
How are you blaming this on like circumstances.
Speaker 4 (27:47):
Well, yeah, he's just... he's obviously delusional. He knocked it
out of the park, the State of the Union? Yeah,
he shouted in a weird, manic way for the longest time,
and the very intelligentsia Hunter's blaming were backing Biden: oh,
he knocked it out of the park. Whereas most of
America said, no, it was weird and off-putting. But again,
(28:07):
the guy's just... he's lost his marbles. Whether it was
the crack, or he's born that way, or entitlement, I don't know,
but he just doesn't make any sense.
Speaker 3 (28:14):
God, he is so entitled and a victim and just
like the worst kind of character you can imagine. I
wouldn't want to be involved with him in any way.
Speaker 4 (28:26):
Right, Yeah, he's like a black hole of negative energy
and victimhood.
Speaker 2 (28:33):
God, I'd say, Hunter, you're a loser. He is.
Speaker 3 (28:36):
He is absolutely a loser who happened to be born,
you know, into one of the most powerful families on earth.
Speaker 2 (28:41):
Here he is.
Speaker 4 (28:42):
Famed painter there for six months now, I mean, unbelievably talented.
Here he is. Funny how those paintings aren't selling anymore.
Speaker 2 (28:50):
Well, you know, the styles come and go. That's a
good point. Here he is blaming some other people.
Speaker 5 (28:55):
The people that came out against him were who? Nobody.
Except Speaker Pelosi. Speaker Emerita Pelosi did not
give a full-throated endorsement, which allowed everybody else
to kind of go, okay. Except, who came out full-throated?
Progressives. AOC, Bernie, the entire progressive wing, Ro Khanna,
(29:19):
the entirety of the progressive side of the Democratic Party,
said Joe Biden has got more of our agenda accomplished
in four years than any president in history.
Speaker 3 (29:32):
You know, this might actually be important, in that Joe
Biden was regularly portrayed throughout his career as, like, a centrist.
Was he way more progressive than he let on all
those years? I mean, because Hunter's thinking that David Axelrod
(29:55):
and Rahm Emanuel are a ridiculous choice.
Speaker 2 (29:57):
No, that sounds like a pretty good choice to.
Speaker 3 (29:59):
Try to get the Democratic Party back on track and,
like, in the mainstream of America. But he's touting the
AOCs and the Bernies of the world as being, like,
the center of the party. So maybe the whole Biden
clan was just way more progressive than we ever realized.
Or Biden was just a weather vane through his entire
career, and especially in his final years in office, when
(30:20):
he was clearly senile, he was so in the thrall
of his progressive advisors, and he was convinced that's where
the energy in the party was. And he's right to
some extent. So, being the weather vane that he is,
he turned in that direction.
Speaker 2 (30:34):
I think that's more likely.
Speaker 3 (30:35):
Honestly, my last comment on this would be: if Hunter,
if anybody, thinks Joe Biden would have won if he'd
have stayed in, I don't even think there's
a point in trying to have a conversation with you.
You're crazy. You're crazy if you think Joe Biden,
(30:57):
staying in the race, would have won. I mean,
you're absolutely deluded. Maybe that helps you.
Speaker 2 (31:05):
Sleep at night.
Speaker 4 (31:07):
That and the handle of vodka. It's the Ambien. But
he had nine days to bounce back from that Ambien.
Speaker 2 (31:14):
Man, that's rough stuff. Of course, that's the crazy person talking.
Speaker 3 (31:19):
First time we've ever heard about Biden being on Ambien,
so who knows if that's true or not. He didn't
look like a guy who needed Ambien to go to sleep.
Looked like all he needed to do was just.
Speaker 2 (31:30):
Be in a room.
Speaker 4 (31:31):
Well, and during the debate prep eight or nine days. Famously,
according to multiple accounts, he would tire of the prep
a few minutes in and go sit by the pool
and stare into space.
Speaker 2 (31:42):
Right right.
Speaker 4 (31:44):
But it was just a cabal of insidious intelligentsia that
convinced us all that he was too old.
Speaker 2 (31:49):
You're right, Hunter, you're right. Those bastards.
Speaker 1 (32:02):
The Armstrong and Getty Show.
Speaker 3 (32:08):
We were wondering if, like, if you used the therapist, Harry,
on ChatGPT, if it's different than just generally
asking questions the way the rest of us have been doing.
Speaker 2 (32:18):
What did you find out, Katie?
Speaker 8 (32:19):
Yeah, it's significantly different. So for example, I just used,
like, one of the medications from my IVF process, and
regular ChatGPT kind of sympathized that it might
make me not feel good, but then gave me a
list of resources and things I can do to feel
better in all this.
Speaker 4 (32:37):
Yeah, I was totally unaware, as Katie pointed out this
segment, that they have, like, individual bots with specialized programming
for different topics, the number one being astrology. So
there's no hope for humanity. But number two was find
my celebrity look-alike. And so I did that.
Speaker 2 (32:58):
I uploaded a photo and it struggled for a while.
Speaker 4 (33:03):
I could tell it was thinking, jeez, if somebody looked
like you, they wouldn't be a celebrity.
Speaker 2 (33:06):
That's... what do we tell this guy? Anyway,
did it give you Barney Rubble?
Speaker 4 (33:11):
No. As a younger man, I did resemble the
great Barney Rubble a great deal. Uh. Looking closely at
your features: strong brow lines, expressive forehead, square jaw with
a salt-and-pepper beard, and a slightly rugged but
approachable look. Hey, it's doing pretty well so far, isn't
(33:31):
it? You have a resemblance to Nick Nolte in
his later years. Left Hollywood and sleeping on a park bench.
Speaker 3 (33:41):
No, no. Buddy Holly Nick Nolte, not now Nick Nolte.
Speaker 4 (33:45):
Oh no, unfortunately not. And also a bit of Kurt
Russell in his more recent roles. Again, the Nick Nolte
resemblance comes through in the weathered, expressive forehead lines
and the way your beard frames your face, especially like
Nolte around the two thousands. Ah, oh, they cite Kurt
Russell and The Hateful Eight. Yes, if only.
Speaker 3 (34:10):
The fact that the number one use for AI for
Dunderpates is astrology.
Speaker 2 (34:17):
It's amazing that you're combining like.
Speaker 3 (34:19):
The most cutting edge advanced thing human beings have ever
come up with with the most old timey dumbest from
one hundred thousand years ago, reading the stars for your future.
Speaker 4 (34:36):
Well, more importantly, I need Katie to react with complete
honesty to this. Your look has that mix of ruggedness
and warmth that both of those actors are known for
in their later careers.
Speaker 2 (34:46):
I think you'd probably agree, wouldn't you? Nailed it.
Speaker 3 (34:48):
Why is that thing giving you all positive feedback? And
there's no, You look sort of like a hobo I
saw on the way to work, or, Just... are you
all right?
Speaker 2 (35:00):
Do you have medical professionals you can call?
Speaker 4 (35:05):
Oh my gosh, how long did the doctor tell you you had? Oh, yeah.
On one final AI note, Katie mentioned this headline: Geoffrey Hinton,
often called the godfather of AI, is calling on researchers
to design systems that will take care of us like
we're babies. And we all reacted like, I'm not sure
(35:26):
I need that, but thanks very much. But I read
what Hinton's reasoning is: it's the only way to keep
them from becoming like our overlords who shred us and
take our organs for whatever purpose. He's like, we need
to make machines that are smarter than us to care
for us, like we're their babies. We need to imbue
(35:49):
them with genuine concern for human well-being.
Speaker 3 (35:52):
Otherwise really, yeah, so we need to convince AI to
care about.
Speaker 2 (35:59):
Us so they don't, you know, eat us.
Speaker 1 (36:01):
Wow. The Armstrong and Getty Show. Yea. Or Jack
or Joe, podcasts, and our hot links at armro dot com.