Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Broadcasting live from the Abraham Lincoln Radio Studio at the George
Washington Broadcast Center.
Speaker 2 (00:07):
Jack Armstrong and Joe Getty. Armstrong and Getty and...
Armstrong and Getty.
Speaker 3 (00:23):
In a stunning development, Canada has declared war on the
United States.
Speaker 2 (00:27):
Let's go to Joe Braxton, who is live at the border.
I'm currently at the border, but there is no war.
Speaker 1 (00:32):
Mom, Dad, I know this looks kind of real, but
it's all AI.
Speaker 2 (00:37):
Grandpa, I'm fine, this is just AI. You don't need
to wire money to anyone. I am not in love
with you, and I do not need your money for
a plane ticket.
Speaker 4 (00:47):
I am AI.
Speaker 1 (00:49):
This fake moon landing footage isn't classified.
Speaker 5 (00:52):
It's AI.
Speaker 6 (00:53):
There could be aliens, but I'm not one of them.
Speaker 2 (00:57):
I'm AI, Uncle Fred. The government has not been taken over
by lizard people. You don't need to send us five
thousand dollars in gift cards because you missed jury duty. Grandma,
it's not real.
Speaker 5 (01:10):
It's AI. It's AI.
Speaker 2 (01:13):
AI.
Speaker 1 (01:13):
It's kind of funny. If only... but that's not the case,
as we all know, in all those instances.
Speaker 5 (01:19):
Boy, they've got to run that on every TV service that exists,
so people see it.
Speaker 1 (01:25):
Well, so I'm about to talk about this new AI
book in just a second, and one of his big
points is governments need to start regulating AI like yesterday
and coming up with some guardrails before it gets unleashed
on the world in all kinds of different ways. Luckily,
the state of California is already on it, Katie.
Speaker 3 (01:43):
Yes, California lawmakers moving forward with a bill to regulate
companion chatbots. It requires reminders every three hours that
the chatbots are not real.
Speaker 1 (01:58):
So first, there are several things I like about this. So,
you have a relationship with the chatbot, which is obviously already,
like, weird at a level of dystopian that we could
only imagine just a few years ago. And it's
already happening.
Speaker 2 (02:14):
And I know people who know people.
Speaker 1 (02:15):
I don't know anybody who's actually doing the relationship thing,
but I know people who say they have friends who
are, like, in some cases, lonely old farmers, which
I find really disturbing, like the last sort of people
you would expect to, you know, get emotionally attached
to a chatbot. But anyway, it's so prevalent that
the state of California felt like they needed to put
(02:36):
in guardrails and step in and have a warning. Every
two hours. No, two and a half hours. No, four hours,
that's not long enough, Jack. Okay, three hours, we'll compromise.
Every three hours it will warn you that you're talking
to a computer, you weirdo.
Speaker 5 (02:52):
So there I am, trading intimacies. Oh, that reminds me,
coming up later this hour: how to fall in love.
Science has figured it out. Wow. I'm trading intimacies with
my sexy, sexy computer, and, or I suppose, a sex
bot in the future, and the State of California is
gonna say, we hate to interrupt your sexy chat with
(03:12):
your computer, but we'd like to remind you it's a computer
and not a real human being. This message brought to
you by Gavin Newsom and the State of California. See
you in three hours.
Speaker 2 (03:22):
And the supermajority Democratic Party. I would love to have
Speaker 1 (03:26):
been involved in that conversation for a number of reasons,
including, I'm as much an expert as anybody
who made that decision, because nobody has any idea where
any of this is going. So my opinion would have
been as valid as anybody's who, you know, wrote that
and voted for it. But first of all, you could
start with: do we need the government jumping in and
(03:49):
telling us.
Speaker 2 (03:50):
I know, you feel really good right now.
Speaker 1 (03:52):
And you're really enjoying yourself, and you're getting
enjoyment at a deep, deep human level, but we want
to tell you it's fake, because we're the state and
that's our job.
Speaker 2 (04:03):
I don't know what I think about that.
Speaker 5 (04:05):
I mean, I think that's a great question. That's the
thirty-thousand-foot question. I mean, we've seen a
steep decline in social belonging, just in general: friendships, civic organizations,
church attendance. All of the guardrails of our behavior,
or a
Speaker 2 (04:24):
lot of them, have gone away.
Speaker 5 (04:27):
And so the state of California said, well, let's say
you're an atheist with no friends. We'll tell you what's
crazy and what's not.
Speaker 2 (04:34):
I don't want the government doing that.
Speaker 1 (04:35):
Well, why wouldn't they? So the government doesn't announce, you know,
every three hours at a strip club, the music stops,
the girls stop dancing, and an announcement from the state comes on:
these girls do not actually like you. They are going
to pretend to love you, but they are getting paid
to do that. Now, back to the hot chicks dancing.
I mean, right, the state doesn't step in to let
you know this is phony emotion.
Speaker 5 (04:55):
Right before we begin the fourth quarter of the Forty-Niners
game, we'd like to remind you that
these young gentlemen are actually rented millionaires and couldn't give
a crap about the Bay Area beyond finding a nice
place to live. They'll be gone the moment their contract
has ended. Now back to the game. Or
Speaker 1 (05:12):
at the end of a lottery commercial, which the state
benefits from: we'd like to remind you, very few people
actually win, and studies show that people
who do win end up less happy than before.
Speaker 2 (05:22):
I mean, when is it the state's role to jump
in on this? So there's that.
Speaker 1 (05:27):
Then the other part, if you're going to accept that
they should, how did they come up with three hours?
I would have loved to have heard that back and forth.
Speaker 2 (05:34):
I mean, that's hilarious.
Speaker 5 (05:37):
I mean, you don't want to interrupt people's intimacies with
their computers too often because that'll be obnoxious.
Speaker 1 (05:43):
But if you wait five hours, they'll be so hooked
there's no turning them around or something.
Speaker 5 (05:47):
Yes, the sweet spot's here. What do you think,
Jim? Three? Okay. You know, there's part of me that
thinks, I don't really think this, but I've had this
thought pop into my head, that the state of California
is Trumpian in that it proposes all this ridiculous crap
to distract from the fact that, you know, the bullet train
and the tension time, I'm sorry, the pension time bomb,
(06:11):
and, you know, a dozen other things: the soon to be
even more incredibly expensive gas prices and the rest of it.
Speaker 2 (06:17):
I don't know.
Speaker 1 (06:18):
So I'm reading another AI book. I'm fascinated by the
whole thing. I think everybody should be. I think it's
going to be a giant deal. But I do need
to read, my next book, I swore last night, has
got to be one that is the other side. And
there are a lot of people on the other side
who say it's not going to be that devastating to
(06:39):
mankind or society. So I need to read one of
those books next, because I've been leaning toward the people
who think it's going to be a big deal. But anyway,
here's the review of this book from Bill Gates. Guy
who ran Microsoft, used to be the richest man in the
world for a very, very long time, and is currently spending a
ton of money on his own AI, you know, creation
that can compete against ChatGPT
(07:03):
and Grok and all the others. Here's his
review, in which he mentions the book.
When people ask me about artificial intelligence, their questions often
boil down to this: what should I be worried about,
and how worried should I be? For the past year...
This book came out a year and a half ago,
which unfortunately is a little dated by AI standards, but
this review is from December, so the review is only
(07:25):
seven months old. For the past year, I've responded by
telling them to read The Coming Wave by Mustafa Suleyman.
It's the book I recommend more than any other
on AI, to heads of state, business leaders, or anyone
else who asks, because it offers something rare: a clear-eyed
view of both the extraordinary opportunities and genuine risks ahead.
(07:45):
And I'm two chapters in and it's already fantastic. What
sets this, uh, blah blah blah blah blah. What sets
this book apart from the others is Mustafa's insight that
AI is only one part of an unprecedented convergence of
scientific breakthroughs happening at the same time. Gene editing,
DNA synthesis, and other advances in biotechnology are racing forward
(08:09):
in parallel with AI. As the title suggests, these changes
are building like a wave far out at sea, invisible
to many but gathering force. Each one of these individual
things could be game-changing for mankind on its own. Obviously,
this is supposed to be... reassuring? Me? No, this is
a scary book. Any of those, obviously: gene editing, DNA synthesis,
(08:35):
whatever that is, advances in biotechnology where you start screwing
with food and cattle and all kinds of different sorts
of things. Any of these could be life-altering for Earth,
each of them game-changing. Together with AI, they're poised to
reshape every aspect of society. And yeah, this is
the first book where he's combined, hey, AI
(08:57):
isn't the only thing out there. It's all these other things,
and it's gonna combine with AI to where countries
or non-state actors can alter crops.
Speaker 2 (09:06):
Maybe in ways that are great.
Speaker 1 (09:07):
That don't need much water but provide, you know,
twice as much wheat on half the water
or whatever. Could be fantastic.
Speaker 2 (09:14):
But also Molio will.
Speaker 5 (09:15):
soon be dealing with an obesity problem.
Speaker 1 (09:17):
But he also could develop some sort of a germ
that will kill off wheat.
Speaker 2 (09:22):
And you, you know, let it
Speaker 1 (09:24):
loose in Kansas and Nebraska if you're the Chinese or whatever.
Speaker 5 (09:28):
Yeah, if anybody needs me, I'll be in the fetal position.
Speaker 1 (09:33):
Oh God, you get into gene editing and all that
sort of stuff. It just boggles the mind. I can't
wait to get further along in this book. And
so, he started,
Speaker 2 (09:43):
I forget which company it was, but it's
Speaker 1 (09:44):
the one that got bought by Microsoft, and Bill Gates,
he had been working for him. But he talks many
times, and Bill Gates actually says in his review something
similar: Look, I'm an optimist. I've always been an
optimist about technology. I've always believed its positives outweigh
its negatives, blah blah blah. But not in this case.
(10:06):
It's basically what he says. And he said, I hope
I'm wrong. I'd love nothing more than to have people look
back on this book years from now and say I was
completely wrong.
Speaker 2 (10:16):
But I don't think so.
Speaker 5 (10:19):
Yeah, I feel like we're giving chimpanzees handguns. I just,
I think it is a tool that we cannot handle.
Speaker 1 (10:29):
I think it's probably significantly worse than that. Yeah. I mean,
so I did adopt this. I heard Charlie Cooke of
National Review say this the other day, why he embraces
technology as opposed to rejecting it. I'm a guy who's
always rejected technology. I may have changed my mind based
(10:50):
on what he said the other day.
He said, if it's gonna happen anyway, you
might as well embrace it. There's no point in sitting
around complaining that you wish the internet hadn't ever happened,
and talking about all the ways that life would be
better if it didn't. It did, and everybody's got it,
and the same thing is with AI.
Speaker 2 (11:06):
It's coming.
Speaker 1 (11:07):
Non-state actors are gonna have it, China's gonna have it,
everybody's gonna have it. So the I-wish-this-wouldn't-happen,
or it'd-be-better-if-it-didn't, or sticking your
head in the sand and pretending it's
Speaker 2 (11:15):
not, doesn't do any good.
Speaker 5 (11:18):
No, there's no benefit in that whatsoever. So yeah, I
think those are two different cases. I absolutely see the point.
I think a lot of the time, when I am
complaining about the effects of the Internet, it's not some
sort of denial that it's real; it's an affirmative
statement of values that I think are more important, or
(11:39):
activities that are more important and healthy than being on
the internet. So, I mean, I do bemoan
the fact that it exists, but that's a tangent
to the fact that, look, it's entertainment
and information, but don't spend too much of your
(12:00):
life on it or it will ruin you.
Speaker 1 (12:02):
Do you think life, define life however you want, society,
the country, whatever, is better because of the Internet, or worse?
Speaker 5 (12:12):
Uh, incomplete grade.
Speaker 2 (12:14):
I think worse. I mean, I know, I think I'm
in the minority on that.
Speaker 5 (12:19):
But no, no, it's obviously a complicated picture.
But the question you have to answer before you answer
that question is: by what standard are we gonna decide this?
Speaker 2 (12:34):
People's happiness as a measure?
Speaker 1 (12:36):
Oh, I think it's the only measure. It's like the measure
you should have for the government.
Speaker 2 (12:39):
And Paul's.
Speaker 1 (12:42):
What's the Thomas Jefferson phrase? You know, pursuit of happiness. Yeah,
I think it's lessened our pursuit of happiness.
Speaker 5 (12:49):
I think it's, are people happy, yeah, with their lives?
And yeah, oh, that is absolutely undeniable. Yeah, people are
less happy. I think it's the fruit of the tree
of knowledge. Yeah, you know, Book of Genesis, whatever.
Speaker 1 (13:02):
I actually haven't debated anybody on this, but
when I hear people debating this, they always get to
the productivity and, you know, how much better email is
than snail mail. Okay, fine, great. Are people happier or
less happy since the internet occurred? Uh...
Speaker 2 (13:16):
Less happy? So what difference does it make?
Speaker 5 (13:19):
Right. You've got to decide what the
bottom line is. The interesting-slash-troubling part of this,
and then we need to wrap it up, is that
the bottom line for the people in charge of this
is the bottom line.
Speaker 2 (13:30):
Right.
Speaker 5 (13:30):
They couldn't give a single crap about human happiness or
children's anxiety or suicide or the rest of it. Some
of them can on an individual basis, but it's about
making money.
Speaker 1 (13:39):
Quick reminder: that chatbot you're having a romantic talk
with is not real.
Speaker 2 (13:44):
Thanks Uncle Gavin.
Speaker 1 (13:46):
Interested in your thoughts on any of this. Text line:
four one five two nine five KFTZ.
Speaker 4 (13:54):
Vladimir Putin, according to President Trump, vowing retaliation after a
one-hour phone call between the leaders. President Trump posting:
Putin did say, and very strongly, that he will have
to respond to the recent attack, and acknowledging it was
not a conversation that will lead to immediate peace.
Speaker 1 (14:14):
And then what did you say to him, Donald Trump?
Since the ball is in your court on this entire war,
did you say, we're putting together a package of sanctions
that will devastate you, so I think maybe it'd be
better if you didn't? Or anything that comes even within
a million miles of a threat or pushback on the
(14:35):
side of Ukraine.
Speaker 5 (14:37):
It will not lead to immediate peace. Yeah, he's going
with, there's no interest in peace whatsoever. It's these guys.
Neither one of them actually wants peace. They hate
each other. So what are you going to do?
Speaker 3 (14:47):
Well?
Speaker 1 (14:48):
Are you gonna be on one side or the other
is one option. I know a lot of you
think stay out of it is the option. I don't
think so.
Speaker 5 (14:55):
So, the Wall Street Journal Editorial Board wants to know
of President Trump: when will he finally take no for
an answer? Senior Kremlin official Dmitry Medvedev, also a frequent
mouthpiece for Putin, talking about the so-called peace talks, quote:
the negotiations in Istanbul are not aimed at a compromise peace
based on someone else's delusional terms. The goal is our
(15:18):
swift victory and complete destruction of the neo-Nazi regime.
Speaker 2 (15:22):
Right. That's one of the written
Speaker 1 (15:27):
pieces for the so-called ceasefire that have to be met:
the elimination of the Nazis. Okay, fine, all the Nazis
are gone, because, what the hell does that mean? But
you have to make Russian the official language in Ukraine,
that's one of their demands. For instance, they pretend they
didn't take any children, so, Ukraine wanting kids back:
Speaker 2 (15:45):
We didn't take any kids. What are you talking about?
Speaker 5 (15:46):
So that's not part of the deal, right? Right. Ah,
it's clear that Vlad Putin, it's pretty clear to me anyway,
Speaker 2 (15:54):
and the
Speaker 5 (15:57):
mullahs in Iran have decided: Trump is so proud of
his ability to make deals that we're just gonna keep
telling him there's a deal out there, but never come
to one. That's what it seems like to me. I mean,
because the talks with Iran are going nowhere. There have
(16:18):
been a couple announcements from the White House that we're
getting close. We're getting close. They're still saying uranium enrichment
is the key to our nuclear program. The Ayatollah
himself said that in a televised speech: the rude and
arrogant leaders of America repeatedly demanded we should not have
a nuclear program. Who are you to decide whether Iran
should have enrichment? Well, the answer is that everybody knows
enrichment's the path to a nuclear bomb. You don't need it
for a conventional, you know, energy program. But they
keep sending Steve Witkoff to negotiate. I just don't
think there are deals to be made here.
Speaker 1 (16:54):
We may have taken a step toward the greatest whizzing
contest in world history.
Speaker 5 (17:00):
Oh my. Plus, science tells you how to fall in love.
Oh, that's nice. Love and whizzing, coming up.
Speaker 2 (17:10):
Ah, what are you... You brought it up! Don't you
'ooh' me. You brought it up.
Speaker 7 (17:17):
Armstrong and Getty. Elon and I left on a great note.
We were texting one another, you know, happy texts, you know, Monday.
And then yesterday, you know, twenty-four hours
later, he does a one-eighty and he comes out and
opposes the bill. And it surprised me. Frankly, I think
he's flat wrong. I think he's way off on this,
and I've told him as much, and I've said it
(17:37):
publicly and privately. I'm very consistent in that. But am
I concerned about the effect of this on the midterms?
Speaker 4 (17:42):
I'm not.
Speaker 7 (17:42):
Let me tell you why: because when the big, beautiful
bill is done and signed into law, every single
American is going to do better.
Speaker 1 (17:48):
Yeah, we'll see. That's Speaker Johnson saying, hey, I'm friends
with Elon, but I think he's wrong. So it started
quite tepid, the Elon backing away, and I thought it
was way
Speaker 5 (17:58):
Over disagreeing, very gentleman.
Speaker 1 (18:00):
I thought it was way overblown last week. I was
wrong, or, the people who guessed where it was going
were right. Anywho.
Speaker 2 (18:10):
So Elon, the last two days, he's amped it up
each day.
Speaker 1 (18:12):
He went, forty-eight hours ago, from, what, the
hellacious abomination or whatever he called it, disgusting abomination,
Speaker 2 (18:20):
then to, yesterday, kill the bill.
Speaker 1 (18:23):
Kill the bill, all day long, urging Republican senators to
vote against the bill. Now, Donald Trump, sitting with the
leader of Germany in the Oval Office, asked about the relationship,
said this: I've
Speaker 8 (18:37):
Always liked Elon, and yeah, I can understand why he's upset.
Remember he was here for a long time. He saw
a man who was very happy when he stood behind
the Oval desk, and even with the black guy. I said,
do you want a little makeup, We'll get you little Mecca.
But he said no, I don't think so, which is interesting.
Speaker 2 (18:58):
And very nice. He wants to be.
Speaker 8 (19:00):
Who he is, so you could make that stay so too.
Speaker 1 (19:02):
I guess, he said, that's enough. So he said,
Elon and I had a great relationship. I don't know
if we will anymore. Which is, I've been following Trump,
we've been following Trump, for a long time. That's the
first part of the turn before you go full-on
scorched earth.
Speaker 2 (19:20):
I don't know if we will anymore.
Speaker 1 (19:21):
He hasn't said anything bad about me personally, which, we
all know, that is the line you cross. But I'm
sure that'll be next.
Speaker 2 (19:31):
I'm very disappointed.
Speaker 1 (19:32):
I've helped Elon a lot. Now, if Elon makes
any comment, which he will, as, again, the Wall Street Journal
called these two of the most powerful men on planet Earth.
They both have the ability, I mean, they've done it
in the past with a lot of high-level people,
they both have the ability to go farther than anybody
I've ever known in my life would ever go in
(19:54):
terms of insulting somebody.
Speaker 5 (19:56):
Well, they are both incredibly powerful, but they are both
incredibly undisciplined. Yeah, I mean, I can name
plenty of the most powerful
people on Earth who would not utter a syllable that
they had not carefully considered.
Speaker 1 (20:10):
Or would never personally attack someone, even an opponent,
let alone somebody on your side. But we've seen
Trump do it over and over and over again, and
Elon is quite undisciplined in this. I think there's a
chance this blows up into the biggest story of all.
I mean, I could imagine some crazy, crazy stuff coming
(20:32):
out of posts between Trump and Elon by the end
of the day or certainly by the end of the week.
Speaker 5 (20:37):
Right, right. His rockets
Speaker 1 (20:39):
blow up all the time. He's not a very good engineer,
I've been told, you know. And then it just
explodes from there.
Speaker 5 (20:48):
The other day, Trump was blasting the head of the
Federalist Society, which recommended Kavanaugh and Amy Coney Barrett and
Neil Gorsuch, who have been absolutely wonderful justices, because now
some of their appointees have been upholding the Constitution, telling
Trump no on some of his plans. That's my point
of view, anyway. And Trump said of, I can't remember
(21:10):
the guy's name, but he said, was it derelict or
scumbag or reprobate?
Speaker 2 (21:18):
It's one of those words.
Speaker 5 (21:20):
He said, he's a reprobate and he probably doesn't even
love America. Right? You're gonna get that, the head of the Federalist
Society, the co-founder of the Federalist Society, you're gonna get that
from Trump toward Elon at some point, and Elon is
going to say something, too. Yes, Michael? I'm
picturing Trump saying, who names their child X?
Speaker 2 (21:36):
Oh yeah, it could be all kinds
Speaker 1 (21:38):
of hell. Hey, fair criticism. I heard he has a
drug problem, whatever the hell. And then Elon will not
hold back at all. So the George Carlin in
me that just watches the world and is amused is
really looking forward to it. The I-want-us-to-be-a-successful-nation
part of me is not so excited.
Speaker 5 (21:59):
Yeah, I've got a bit of a feeling of sad
resignation on my shoulders at this point, like, to hear
it described. So, I would like to apologize to the
American people. Okay, there's not a single chance, it would
be idiotic, to launch you into the whole science-of-falling-in-love
thing now. We don't even have a
(22:21):
fraction of the time left that we need. Why don't
we do it as part of the Armstrong and Getty
On Demand, I'm sorry, One More Thing podcast?
Speaker 2 (22:28):
Cool? That sounds good. Give me the tease.
Speaker 5 (22:32):
If you... This is how to fall in love, and
it can be accomplished in an hour. Have you and
another person hit it off, you like each other,
you'd kind of like for it to go somewhere?
Speaker 2 (22:47):
I can have you in love in an hour. But doesn't,
hasn't the way it's been working been fine for people? No?
Speaker 5 (22:53):
No, no, no. Oh my god, you're a reprobate who
probably doesn't love America. No, it's, you know, circumstances can
intervene and things can happen, and something very, very promising
went sideways and it shouldn't have. Besides, it's the go-go
twenty-first century. Who has weeks and months to
(23:14):
fall in love? This is actually, you know, we're making
light and I'm being very vague, but I found it a
very, very interesting dissection, which is an unfortunate term, but that's
what popped into my head, of how true intimacy is
built, and you can, like, supercharge the process and save
(23:37):
a lot of time.
Speaker 1 (23:39):
Wow, do you want to save time? Or is the
gradual process one of the greatest things that ever happens
in your entire life?
Speaker 5 (23:46):
But, and it has its own advantages that we just
don't sense because we're humans and we're dopes, that is possible.
Speaker 1 (23:54):
And then you've got the story we started the hour
with, where the government of California has decided to step
in. And if you're enjoying falling in love with your chatbot,
every three hours the state of California will remind
you on your screen: remember, you're talking to a computer.
Do not be happy. Now, as you were. The more
I think about the Elon v. Trump feud, the more
Speaker 2 (24:18):
It's gonna be.
Speaker 1 (24:19):
I just, I could go back and dig up
examples of things they've each said to people that are
just so over the top. But if you get it
being aimed toward the president from the richest man in the
world, or vice versa, I mean,
Speaker 2 (24:34):
It takes on a whole... I mean, him saying things about
Speaker 1 (24:37):
I don't know, you know, what's his name, Carson the doctor,
or Marco Rubio, or whoever he's attacked, you know, Rosie O'Donnell.
That's one thing, but the world's richest man, who does
not need to hold back for any reason whatsoever, and
the sorts of things Elon might say about...
Speaker 2 (24:55):
So Trump said, well, let's throw this up, you know,
go ahead.
Speaker 5 (25:01):
But it's, it's less what he says than
what he does. That's the part that's got me feeling
ooh-ish: Trump using the levers of the power of the
executive branch to lash back, hello, at SpaceX or Twitter
or whatever.
Speaker 2 (25:17):
In a way that may be
Speaker 5 (25:19):
unpalatable to the courts and the American people and two-thirds
of the Senate, if you hear what I'm hinting.
Speaker 1 (25:27):
So, this, this might be what kicks it off.
Speaker 2 (25:32):
Or Trump said this.
Speaker 8 (25:33):
And, you know, Elon's upset because we took the EV
mandate, and, you know, which was a lot of money
for electric vehicles, and, you know, they're having a hard
time, the electric vehicles, and I want to say, it
makes the point.
Speaker 1 (25:46):
So he's already claiming that Elon is against the bill
because it financially damages him.
Speaker 5 (25:52):
Not a principled stance against overspending and suffocating debt. No,
it is a selfish motive.
Speaker 1 (25:58):
Which Elon has been talking about for years, talking about, hey, look,
our interest payments are now greater than what we spend
on the military.
Speaker 2 (26:07):
Blah blah blah.
Speaker 1 (26:08):
Oh, one hundred percent defensible positions on the bill, and
Trump claims it's just because it's hurting you financially. That
would get my hair up. That would make me angry.
I could believe it's both, too. I mean, Elon can't
possibly be pleased about the credits. No, no, but he
has been talking about the debt and how it's
(26:29):
unsustainable and it's going to ruin the country for years.
So I don't think he's gonna react well to that statement.
That just happened in the last forty-five minutes. So,
you know, Elon, I'm sure he's changing the diaper on
one baby while he teaches another kid how to ride
a bike, and then he's helping another one with
algebra homework here in the last week of school. And,
you know, he's very busy with his thirteen kids. But
(26:51):
as soon as he hears what Trump said, he might
fire back something. And then it's on, ladies and gentlemen:
Truth Social versus Twitter. Oh boy, let it fly. We will
finish strong next.
Speaker 5 (27:01):
Things are getting weird, and they're getting weird fast.
Armstrong and Getty.
Speaker 1 (27:07):
So every day after the show, Joe and I joke
about how bad the show was today, and maybe we
can do better tomorrow. So our executive producer, Hansen, said,
make a song in that theme, like it's seventies soft rock.
Speaker 6 (27:19):
Let's try to put this one behind.
Speaker 2 (27:26):
Just tamp down the shame. AI.
Speaker 5 (27:31):
Obviously, maybe we do better tomorrow.
Speaker 2 (27:35):
We can only hope unless we continue to be so lame.
Speaker 1 (27:43):
Then a seventies guitar solo after a seventies drum fill.
Speaker 2 (27:47):
Please give the drummer some love. Yeah, if you're old enough. AI.
Speaker 1 (27:53):
But I was just talking to our boss, Steve, who's
roughly our age, and he said he couldn't believe how
dead on seventies rock music this was.
Speaker 2 (28:04):
Here comes the chorus, ladies and germs: Armstrong and
Speaker 1 (28:12):
Getty show.
Speaker 6 (28:17):
Just tamp down the shame, tamping down the shame. Now,
maybe we do better tomorrow, unless we continue being so...
Speaker 5 (28:34):
This is uncanny and, frankly, disturbing. From the Elton
John-esque, a couple of interesting twists on an obvious
chord progression, to the timbre of his voice, which is very
Elton-esque, to the production, it's exactly right. Not to
geek out on you, but just the way it's mixed,
and how loud the drums are, and the amount of
(28:56):
reverb, and the lead guitar, it's all right out of, like,
seventies AM production.
Speaker 2 (29:01):
It's disturbing.
Speaker 1 (29:03):
Man, I feel like it picked up some Styx and
Peter Frampton and all kinds of different stuff from that era.
Speaker 5 (29:08):
Oh yeah, yeah, a little sprinkle of this, a spread of that,
just distilled it down.
Speaker 1 (29:12):
And, as a wannabe guitar player, it makes
me want to cut off my hands. Like, what is
going on here?
Speaker 5 (29:21):
I don't, I don't know. I really don't think
I am an old man yelling at clouds. I really don't.
My animal instincts are danger, danger. You know what,
I don't think I've ever put it that plainly.
Speaker 1 (29:45):
That your animal instincts are... Yeah, yeah. My instinctive reaction
to this is, this is bad.
Speaker 5 (29:51):
This is a threat. It's as simple as that. I've
found various ways to, uh, to phrase that concern, but
that's, that's the
Speaker 2 (30:00):
description I can give.
Speaker 5 (30:01):
It's an instinctive revulsion.
Speaker 1 (30:03):
Well, I mentioned that book I'm reading about AI, and
I'll have more on it when I get through the book.
But he says we should feel that way, and,
uh, not not feel that
way, because something big is coming and we gotta think
about it.
Speaker 2 (30:19):
So you know, you know what.
Speaker 5 (30:20):
I'm reminded, though, of what finally helped me enjoy sports more.
Right after clip fifteen. Michael, gimme clip fifteen. Give it
to me, from the first overtime. Here's McDavid, out for
Nugent-Hopkins out at the point, the shot, Nugent-Hopkins...
Speaker 2 (30:39):
Waved in? Here's McDavid. Listen to that crowd! Game
one. Oh.
Speaker 5 (30:50):
I've been rooting hard for the Oilers. I'm,
the last couple of years, a big Oilers fan. I
love playoff hockey so much. Due respect to basketball fans:
if you understand the game, there's no comparison,
because the decisive play could happen in the first minute,
the twenty-third minute, or the last minute. Every minute
(31:11):
is critically important in a hockey game, as opposed to basketball,
where there's a three and a half quarter exhibition and
then they play hard for four minutes. Plus they play
on ice, which is cool.
Speaker 2 (31:21):
Yes, it's on ice, Michael. It's a good point, the
NBA or the hockey.
Speaker 1 (31:27):
Yeah, so it's got that whole soccer thing there where,
you know, a score is such a big deal. Listen
to that crowd, though. Wow, that was something. So, obviously,
it was in Canada. They haven't, in Canada, you know,
hockey means so much to them, they haven't won
the Stanley Cup
Speaker 2 (31:39):
in thirty-two years, thirty-three years.
Speaker 5 (31:41):
Yeah, the whole country is beside themselves rooting for Edmonton.
So anyway, playoff hockey.
Speaker 1 (31:48):
Oh.
Speaker 5 (31:48):
What I started to say was, the only thing that
comforts me as I watch mankind slide into the abyss
Speaker 2 (31:54):
is that, just like watching
Speaker 5 (31:55):
sports, I don't actually have any effect on the outcome. Well,
I mean, we have a teeny tiny effect doing this show,
but not really, against the great tidal forces that are sweeping
humankind toward, you know, whatever AI-powered droid nightmare.
Speaker 2 (32:13):
It's gonna happen, whether I like it or not.
Speaker 1 (32:15):
I was thinking about that last night, about, you know,
why I'm so fascinated by this and I just can't
get enough information about it. Part of it is that I
would like to have some hand in guiding my kids
through, you know, what the hell their lives
Speaker 2 (32:26):
are going to be like.
Speaker 1 (32:27):
If I can get an inkling. Their lives could
be more different from mine than has ever happened in
one generation in human history. In fact, I think
that might be guaranteed to be true.
Speaker 5 (32:40):
I think that's inevitable. Yeah, I think you've nailed it. Yep.
Speaker 2 (32:43):
Wow, think about
Speaker 5 (32:44):
that. Well, I think their lives may be more different
from yours than yours were from the founding fathers'.
Speaker 2 (32:53):
Wow. Armstrong, Armstrong and Getty.
Speaker 5 (33:07):
Strong.
Speaker 2 (33:09):
Here's your host for final thoughts, Joe Getty.
Speaker 5 (33:12):
Let's get a final thought from everybody on the crew
to wrap up the show for the day. There he
is, Michael Angelo, pressing the buttons. Michael, what's your final thought?
Speaker 2 (33:18):
All right, guys. Coming soon:
Speaker 5 (33:20):
Elon Musk versus Donald Trump in a UFC fight.
Speaker 3 (33:24):
Dana White will be there to promote
Speaker 5 (33:26):
it. Real fight? Or, like... I mean, well, he'd like it.
Speaker 2 (33:33):
For real? He wanted Mark Zuckerberg. Yeah, he challenged Mark
to a real one.
Speaker 5 (33:38):
Got a weight advantage, though. Katie Green, our esteemed newswoman,
has a final thought. Katie?
Speaker 3 (33:42):
There's a new Katie's Corner out at armstronggetty dot
com, and you can see the photo of Drew's birthday
dinner last night, where I wore a shirt covered in
his face. Wow. Lots of his face?
Speaker 2 (33:54):
You're a big fan of hers, I'll bet. He was. Jack, final
thought? For me, I don't want to let that lay
without repeating it.
Speaker 1 (34:01):
My kids' lives are going to be more different from mine
than has ever happened in any generation in human history, and
I think that is probably true.
Speaker 5 (34:12):
My final thought is, I watched the video that a
guy made to distribute to his parents and grandparents, hipping
them to how good AI is, and how this isn't real,
this isn't real, this isn't real, and this isn't real.
I think it's really great and important and cool, and
(34:33):
it's at armstronggetty dot com if you want to
zap it around.
Speaker 2 (34:36):
Is that the one we've got at Katie's Corner?
Speaker 1 (34:39):
Yes, maybe, yes, yes. Armstrong and Getty wrapping up another
grueling four-hour workday.
Speaker 5 (34:45):
So many people to thank, so little time. Go to
armstronggetty dot com for that fine video. It's
really quite amazing. Drop us a note: mailbag at
armstronggetty dot com. Pick up some swag while
you're there, a hat or a hoodie.
Speaker 1 (34:57):
You know, as you heard in the song, so much disappointment,
so much shame.
Speaker 2 (35:00):
We'll try to do better tomorrow. We'll see you then.
God bless America. Armstrong and Getty. This is fabulous.
Speaker 5 (35:07):
Perhaps you also know that hot dog is my favorite
meat, and that none of that came
from a dog.
Speaker 2 (35:14):
But damn it, let's not play games with this. This
is the United States of America, for God's sake. Lie
after lie after lie. Do not listen to the lies.
Speaker 5 (35:23):
This is what will happen to you.
Speaker 2 (35:24):
Necessary. Okay, this is crazy. Yep, that's enough of that.
I thank you. Have a terrific day, Armstrong and Getty.