Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Broadcasting live from the Abraham Lincoln Radio Studio at the George
Washington Broadcast Center, Jack Armstrong and Joe Getty.
Speaker 2 (00:10):
Armstrong and Getty and he...
Speaker 3 (00:15):
Armstrong and Getty. ...the largest AI infrastructure project by far in history.
Speaker 2 (00:27):
The joint data center project is called Stargate.
Speaker 4 (00:30):
I believe that as this technology progresses, we will see
diseases get cured at an unprecedented rate. We will be
amazed at how quickly we're curing this cancer and that
one, and heart disease.
Speaker 5 (00:41):
So leaning on the health stuff, I would too when
you're pushing AI, because it's the easiest to sell to
people as a clear positive, because there's lots of how
is this going to turn out? Or clear negatives that go.
Speaker 2 (00:53):
Along with AI. But anyway, apparently.
Speaker 5 (00:55):
We're committing to being the number one in the world
with AI, just like we've been number one with practically
everything else for the last century. That's the Trump administration
a half trillion dollar investment in AI infrastructure, including building
the world's largest computer and coming up with some sort
of energy source because I still don't quite understand this,
(01:17):
but AI takes a tremendous amount of energy, and where's
the electricity going to come
Speaker 2 (01:21):
From? We'll come back to that.
Speaker 5 (01:24):
But it's a modern just watching some Fox coverage on this,
they're looking at it as the modern space race like
was happening in the late fifties, early sixties between the US
and the Soviet Union, except this time it's us and
China, once again a communist power, and that if we
don't win this AI race, which they say we will
with this half trillion dollar investment, you know, it's going
(01:44):
to be really bad for the United States, really bad
for the world. One of the things Elon's been warning
about on that since the beginning, though, is once this
turns into a space race or a race,
either for safety, you know, who's going to control
the world militarily, or for dollars, then all ethics go
(02:05):
out the window, because that's you know, that's the way
it works. And he's really concerned that, you know,
the downsides of AI won't be managed well when
it's a competition, which I'm sure he's correct, but I
don't know why you're going to stop it.
Speaker 1 (02:20):
Oh, I think that's so clearly true. There's almost no
need to say yes, I mean, exactly, absolutely. I'm a
little, uh... Elon. Well, we've got a couple of clips
of some of the tech gurus at the White House
talking to Bret Baier yesterday.
Speaker 2 (02:33):
Do we want to play one of those?
Speaker 5 (02:34):
Let's... we're talking to Sam Altman, who's one of the
OGs of AI.
Speaker 6 (02:40):
What this means for AI and the future for the
US investment here.
Speaker 4 (02:44):
This means we can create AI and AGI in the
United States, in America. It wouldn't have been obvious that this
was possible, but I think with a different president it
might not have been possible. But we are thrilled to
get to do this, and I think it'll be great
for America. It's great for the whole world.
Speaker 2 (02:56):
Tech bros, oligarchs, et cetera. You should be scared, very scared.
Speaker 5 (03:01):
Here's Baier talking to another tech billionaire, Larry Ellison.
Speaker 6 (03:04):
Larry, you talked about what can come out of this
as far as healthcare and other things. People are starting
to get their head around AI, but not fully.
Speaker 7 (03:15):
When you go to the hospital these days, if you're
using a modern system, excuse me, if you're
using a modern system, uh, the doctor will get a
little note written by AI describing what the purpose of
the appointment is, and then that note will have your latest
lab tests and your, uh... so the doctor
(03:35):
is prepared when they meet with the patient.
Speaker 3 (03:38):
Uh.
Speaker 7 (03:38):
The AI will actually listen to the conversation between
the doctor and the patient and make recommendations and improve
the likelihood that the patient is going to.
Speaker 2 (03:48):
Get a high quality of care.
Speaker 7 (03:49):
It's easier for the doctor, it's better outcomes for the patient. Uh.
It's really a revolution in medicine, but it's a revolution
in many other industries as well. Medicine just
touches us all.
Speaker 5 (04:01):
Yeah, yeah. Joe said earlier, and this is absolutely right.
They do have a tremendous financial interest in this succeeding.
So while it's true I think that there'll be tremendous
medical benefits.
Speaker 2 (04:19):
I mean, you're really yada-yada-ing
Speaker 5 (04:21):
The end of civilization by focusing on the medical stuff.
Speaker 2 (04:27):
Yeah, it's it's complicated.
Speaker 1 (04:30):
Yeah, they're trying to raise amounts of money that I
can't even comprehend. So naturally, this is both an announcement
and a huge sales pitch. It's interesting that Altman and
Ellison and the banker from SoftBank, who is a Japanese fellow,
what's his name. It doesn't matter. I can't remember. He's
a banker. They met with Trump. Trump was singing their praises,
(04:53):
singing the praises of the project. They were talking about
what a great president Trump is. The banker dude was saying,
he's right, this is a golden age. We're starting a
golden age of American blah blah blah. And Elon immediately
tweets out that they don't have the money. He said,
I have it on good authority SoftBank has only secured,
like, well under ten billion dollars. And they're
(05:14):
talking about a five hundred billion dollar project. I gotta
believe Trump and Elon may have a short, uncomfortable conversation
about that.
Speaker 5 (05:25):
What does he mean they don't have the money? So
Trump announces this half trillion dollar project. That's not all
tax money. I assume when there's a big space race
thing that taxpayers are paying for it.
Speaker 2 (05:36):
Uh No, it's an investment.
Speaker 1 (05:37):
It's a private joint venture to build advanced data centers in Texas.
SoftBank CEO Masayoshi Son was the third guy there.
Speaker 5 (05:49):
And Elon just flat out says they can't afford to
do what they're claiming they're going to do well.
Speaker 1 (05:53):
He says they don't have nearly the financing they claim. Well,
you do like they did, what they've been trying
to do with the bullet train in California, and it's worked.
You get far enough down the road, well, we can't
stop now.
Speaker 2 (06:04):
And that's when you bring in all the tax payer money.
Speaker 1 (06:07):
Or it reminds me of you're trying to get a
movie made and you say, well, Dustin Hoffman is attached,
he's ready to sign for the lead role. As you
try to get you know, somebody else to jump on board.
So I just think it's interesting and undisciplined that
Elon would weigh in in that way. Why would you
bother? I happen to know they don't have nearly enough financing.
Speaker 5 (06:27):
Wait, come on, wait a minute. I think Elon is right.
There's no way around this though. But I think
he was right, because at the very beginning of AI,
when he was with Altman, right, at OpenAI, the open
whatever it was, and they wanted to be open and
not have it be about profit and all that sort
of thing, so they could make sure about the ethics
(06:48):
and not letting AI get out of control.
Speaker 2 (06:50):
And ruin the world.
Speaker 5 (06:51):
Well, Elon quickly left because they decided they
wanted to be about profit. Elon started his own thing.
Other people started their own thing. Google's invested billions of dollars.
Bill Gates has invested billions of dollars in those. It's
its own space race within the United States for who's going.
Speaker 2 (07:05):
To be the best AI out there. And then we're.
Speaker 5 (07:07):
Competing against China. But it's a lot like the nuclear
weapon or whatever. We want to be the winners. I mean,
I think I think a lot of bad stuff is
going to come out of AI, but I'd still rather
have it be our company's version of bad stuff than
China's version.
Speaker 2 (07:22):
China. All China is going to try to do is
take over the world. Yeah, I would say the best
case is mutually assured destruction.
Speaker 5 (07:30):
The best case, let's quote Joe Getty on this today,
best case scenario is.
Speaker 2 (07:35):
Mutually assured destruction. Yeah.
Speaker 1 (07:37):
I will stand by this statement. Like with nuclear arms,
you're never going to have an exclusive. So the best
case is, you know, you can wipe them out and
they know it and vice versa.
Speaker 2 (07:46):
So everybody stays cool.
Speaker 5 (07:48):
Well, and if we ever reach the singularity point or
whatever they call it, where AGI happens, where AI is
training itself. I don't even know what that means then,
because I would think you can't keep the Chinese version
and the US version separate, can you? Won't they start
learning from each other, and then they're taking over
(08:10):
the world and, as Joe always points out, draining us of
our vital juices.
Speaker 1 (08:13):
Well right, yeah, chaining us to the walls of laboratory
slash factories and draining our juices. Yeah, I do want
to hear about their nefarious purposes. Bret Baier did
get into some of the possible downsides of the AI
revolution with Sam Altman.
Speaker 2 (08:26):
Let's hear this, this is good.
Speaker 6 (08:28):
Sam, for the people who are concerned about AI on either
the work and job front or just it's scary, what
do you say to them that there are rails to
make sure that it doesn't go out of bounds?
Speaker 4 (08:41):
Yeah, there are hundreds of millions of people using these
tools today. It'll go to billions. People are using it
and really depending on it for all sorts of great things
in their lives. Like any new technology, we do have
to put some rails on it. And you can imagine
ways it could go wrong, but if you look at
the safety record of what we've been able to ship so far,
and what I think we'll be able to do in the future,
the benefits of this tremendously outweigh the downside. We have to
be responsible how we do this. We have to build
(09:02):
it carefully. But I think people are really good and
people will do, on balance, incredible things with this technology.
The scale of this investment, obviously is huge, and what
I think that says about the likely progress of the
technology, at least what all of us believe, is correspondingly huge.
But I have enormous faith we'll figure it out.
Speaker 1 (09:20):
Let's play "when the genius is an idiot." Who can
pick out the phrase where he's lying?
Speaker 2 (09:28):
I don't know which I think.
Speaker 1 (09:29):
I think people are really good and I think it'll
turn out.
Speaker 2 (09:32):
Fine, and the good always excuse me, sorry, the good
will outweigh the bad.
Speaker 5 (09:39):
You see, nuclear power can power an entire city, it's
cheap and it's clean. It also can blow up that
entire city. Similar situation. Luckily, so far we have not
blown up the world. I think it might be somewhat
inevitable that that happens with nuclear weapons, but this is
the way AI is. Okay, great, so it can identify
cancer earlier. If it eliminates eighty percent of jobs and
(10:00):
everybody's out of work, I'm not sure that's a win
for humanity.
Speaker 1 (10:05):
Well, and the evildoers unleash computer viruses and hacks and
the rest of it through AI that close all
the hospitals where you'd be getting that advanced cancer treatment.
Speaker 2 (10:15):
Sam.
Speaker 1 (10:17):
Of course, there's no stopping it. It's like the atom. I remember,
and you've actually read more than I have about this,
but in the early days of the atomic programs, there
were these raging arguments over whether humankind could handle
this, if we were quote unquote playing God.
Speaker 2 (10:33):
The answer is yes, yes we are.
Speaker 1 (10:35):
And the answer to the question of whether we can
handle it is barely well right.
Speaker 5 (10:40):
And if you've seen the Oppenheimer movie or read the
book it was based on, yeah, they were having
very similar discussions, with Oppenheimer's conclusion being what we're just talking
about with AI: it's gonna happen. There's no stopping it.
The ability to do it exists. So do you want
the bad guys to do it? Or do you want
to be the lead in it? That's the only question.
(11:02):
And the people who are unwilling to work on the
Manhattan Project because it could be.
Speaker 2 (11:05):
Used for war are simpletons.
Speaker 5 (11:07):
And people who are unwilling to, you know, be involved
in this because it could be used for bad things.
Speaker 2 (11:11):
You're a simpleton.
Speaker 5 (11:13):
China's gonna do it, North Korea is going to do it.
Russia's doing it as fast as they can. Would you rather
they lead the world in this or US?
Speaker 2 (11:19):
Right right?
Speaker 1 (11:19):
It reminds me of that dividing line I keep bringing up.
There are roughly two kinds of people. Those who recognize this
is heavy and it's ugly and it hurts my heart,
but it has to be done and they do it.
Or those who say, I don't care what you say.
I'm not going to do something ugly. I don't care,
(11:41):
and they just can't deal with it like an adult.
It's, again, there's part of me that thinks I've got to
reread the Book of Genesis, with the tree of the
fruit of knowledge, I know.
Speaker 2 (11:57):
Feels like it.
Speaker 5 (11:58):
There's a Genesis movie coming out. I think they're making
a movie or a mini series all about Genesis. I've
already made a car in a band. Final thought on this,
I don't obsess about AI as much as it probably
sounds on the radio show. But I am sort of
obsessive about the idea of people who think it's not
(12:18):
a big deal. It's a huge freaking deal. There's nothing
I can do about it. So that's why I don't
obsess about it. But it is a huge freaking deal.
Speaker 1 (12:27):
Just because there's hype doesn't mean it's not a huge deal.
Speaker 2 (12:32):
I hope.
Speaker 5 (12:33):
One of the reasons I want to stick around,
not only for being around for my kids, is
Speaker 2 (12:37):
I want to see how this all plays out. I
want to live long enough to see how this all
plays out, because it's going to.
Speaker 5 (12:42):
Be incredible, and I only have to live like maybe
another ten years to have a pretty good idea how
it's going to turn out.
Speaker 4 (12:48):
I think people are really good and people will do,
on balance, incredible things with this technology.
Speaker 2 (12:53):
Yeah, it is one of the funniest things I've ever
heard in my life. There's Oppenheimer right there.
Speaker 5 (12:58):
I think, on the whole people are good and we'll
do incredibly good things with this technology.
Speaker 2 (13:04):
Kim Jong Un is good, Xi Jinping is good.
Speaker 1 (13:07):
All those people chopping up children with machetes in Africa
are good, and they'll do good things.
Speaker 2 (13:12):
On balance. Wow. I'd like to hear him respond to
that. More on the way, stay here.
Speaker 5 (13:20):
I almost hate to say it because we've been pretty
heavy on Elon today, although he is our co-president. A
quote from Elon Musk about perseverance I thought was pretty good.
Coming up, We're going to get to Trump's order around
DEI being taken out of our federal government. Fantastic and
(13:41):
the specifics are great. So stay tuned.
Speaker 2 (13:44):
Love it, love it.
Speaker 1 (13:45):
Can't wait for the NFL playoffs this weekend. The Mighty
Philadelphia Eagles still in the thick of it. And we
give to you Cherelle Parker, the mayor of Philadelphia.
Speaker 2 (13:56):
Hee S Bigos, Let's go Birds. You gotta play that again.
Speaker 1 (14:06):
He cow g y S.
Speaker 2 (14:10):
Bigos, e e Berts e l G s e S Eagles.
Speaker 1 (14:21):
Her bio mentions that she was briefly an English teacher
in New Jersey. I wonder why that career ended. Holy Cow,
one party politics town.
Speaker 5 (14:36):
If she got one letter wrong, you know, you're up
there on stage with a microphone and probably trying
to do a bunch of things at once at the game.
Speaker 1 (14:43):
She wasn't even close. She got the ete ow g
S S big. All right, you explain to me. She,
again, she did get the E right. Explain to me, how
Speaker 2 (14:57):
The hell you go to the S T E L G S E S, Elgises. She's got to be impeached for that.
Speaker 1 (15:14):
G S.
Speaker 7 (15:18):
God.
Speaker 5 (15:18):
If I'm a radio station in Philadelphia, like,
I might just, uh, you know how like
radio stations go all Christmas music? I might go all
that clip for a full day, just playing that over
and over, go to commercials, come back play it some more.
I love that idea. That is shocking.
Speaker 2 (15:40):
Yeah, that's pretty good. I'm trying to listen to people
in the background. Were they spelling exactly as she was spelling?
Or were they trying?
Speaker 1 (15:46):
They're no, they're spelling it right. Ah. If there's more
to that clip, can somebody find a longer version of that?
Speaker 2 (15:53):
Yeah?
Speaker 5 (15:53):
Did somebody shout it out? E O W G
Speaker 1 (16:00):
S?
Speaker 2 (16:02):
How do you go early?
Speaker 5 (16:06):
I think you should have gone yes a couple more
times there at the end.
Speaker 1 (16:11):
Elks, says Elks. There was nary an A to be
found in that spelling either.
Speaker 5 (16:22):
Oh God, and she was a good school teacher. That's fantastic.
I don't get the controversy over the birthright citizenship EO
other than the president doesn't get to change the Constitution.
I get that controversy, but the wanting to do away
with it, I don't quite get. What's your argument for it?
(16:43):
Keeping birthright citizenship, yes, keeping the Constitution the way it is,
what's your argument for it? If you have one, text
line four one five, two nine five, KFTC.
Speaker 1 (16:52):
I'm familiar with a couple of them. We can trot
them out in a minute or so. Suffice to say,
it absolutely needs to be interpreted because the language in
the Constitution is pretty fuzzy.
Speaker 5 (17:02):
Right, this is exciting: if you're in the federal government
in DEI, you're done at the close of business today.
Speaker 2 (17:08):
How about that? Armstrong and Getty.
Speaker 1 (17:14):
And yes, that salute was evocative of things that we have.
Speaker 2 (17:16):
Seen through history. It was quick.
Speaker 4 (17:19):
I think our viewers are smart and they can take
a look at that.
Speaker 2 (17:23):
But it certainly was. It's not something that you typically
see in American political rallies, I'll put it that way. No, no,
it was not something that you usually would see.
Speaker 8 (17:30):
That was a Nazi salute. And he didn't just do
it one time. He did it twice for emphasis. And
if you talk to anyone, the historians, folks who actually study
the Nazis and study this actual kind of disgusting display,
they will be very clear about what that was. And
he should not just apologize, he should be condemned for
those kinds of actions. So gross, disgusting, But more of
(17:51):
what we can expect I think from Elon Musk and Donald.
Speaker 5 (17:53):
Trump, that is hilarious. Elon Musk's alleged Nazi salute.
And the same crowd that had no problem with college
students all around America in favor of beating down or
killing Jews, which is pretty Nazi like behavior, wouldn't call
them Nazis. But Elon's wave to the crowd clearly a
(18:14):
Nazi salute.
Speaker 1 (18:15):
I've got to admit it's funny. I'm sitting here feeling
like a football coach, and I got a playoff game
next week the weekend, and I'm getting game film of
my opponent. They're like playing slap ass in the huddle,
and some of them aren't even wearing spiked shoes and
they don't have any pre planned place. If the left
(18:38):
media is that dopey and stupid, oh, they're gonna be
down for a long time.
Speaker 2 (18:45):
Yeah, I agree.
Speaker 5 (18:46):
Good, keep it up. Good luck with that, as they say,
good luck in your senior year, as Joe Biden would say.
Speaker 2 (18:52):
So, hey, here's an on-air meeting real quick.
Speaker 1 (18:54):
We've got the audio of the SNL cold open handy,
which is kind of on the same topic.
Speaker 2 (19:02):
Sure we could do that hour four. Yeah, let's do that.
Speaker 7 (19:06):
Yeah.
Speaker 2 (19:06):
And if you don't get hour four you gotta go
somewhere or what have you. That's fine.
Speaker 1 (19:09):
Subscribe to the Armstrong and Getty On Demand podcast
and you can listen to hour four later. It is
funny they mock. They brutally mock MSNBC on Saturday Night Live,
believe it or not.
Speaker 5 (19:21):
So The New York Post is reporting today that, see,
the CN boss... the CNN boss, there's two Ns, CNN boss.
Speaker 1 (19:28):
It's like the mayor of Philadelphia over there, Mark, you
can't even spell CNN folks.
Speaker 5 (19:33):
Ah. Mark Thompson, who runs CNN, had told Jake Tapper
and Anderson Cooper and one hundred other journalists there not
to express outrage during the Trump inauguration, which is a
smart idea. Hey, tone it down, don't be angry. Half
the country voted for the guy. He's got the highest
approval rating he's ever had. Plus it's just not good journalism.
(19:54):
So let's just treat this, you know, describe what's happening.
That was a good idea, but then you got, later
that day, Elon Musk with his weird chest thump thing,
why ever he did that, it is not a Nazi salute, that's
the only thing I do know, and them treating it like obviously
a Nazi salute. So I had a good friend who
(20:14):
used to do this whenever I would get anywhere near
catastrophizing something, and they would always say, well, let's take
that further, like try to come up with an example.
I think you're gonna lose your but what if this happens,
I don't know. I would lose my job. And what
happens if you lose your job, I'd have to get
another one, And then what would happen? Then I would
(20:35):
have another job and then bubba I mean just I mean,
because it takes all the wind out of it sometimes
if you just take things step by step like that.
Speaker 2 (20:43):
I love that.
Speaker 5 (20:44):
Yeah, it's really good. It really works on a lot
of things. And I wish whoever was sitting there with
the analyst talking about it being a Nazi salute had
done that with them: Okay, let's take this further. If
it was a Nazi salute, what does that mean? I mean,
go further with it. So are you saying he's a
member of the Nazi Party, he believes in the Final
Solution of ending all...
Speaker 2 (21:04):
Just do you believe Trump knows that? You believe?
Speaker 5 (21:06):
Do you believe he did that to signal the Nazis
in the arena.
Speaker 2 (21:11):
That didn't already know that he was.
Speaker 1 (21:13):
How has he covered up his overt Nazism until now?
Did he just slip up by doing that or has
he now come out of the closet, in your opinion? Right, right. Well,
of course on MSNBC, for instance, they would say, well,
he's just finally he's emboldened by Trump's victory and now
he can express his true feelings of Nazism.
Speaker 2 (21:32):
Okay, that's what I wish someone would ask him.
Speaker 1 (21:34):
Do.
Speaker 5 (21:34):
You don't even need to say they're wrong. Just make
him spell it out very directly. So you think he,
as a closet Nazi, has now gone ahead
and adopted all of the Nazi-like symbols and arm
gestures of eighty years ago, and that he's willing to do
it out loud in front of the big national audience
(21:57):
at an arena because he's so comfortable with being a
Nazi. That's what you're telling me just happened? Because it
sounds pretty silly if you lay it out like that.
Speaker 1 (22:05):
Oh yeah, yeah, it's beyond silly. Full credit to
Dennis in Lincoln, California, who pointed out he's watching the
news and AGB has replaced AOC. Not the buxom half-wit
progressive cutie, but the phrase. Uh, what does AOC
(22:27):
stand for?
Speaker 2 (22:28):
Abundance of caution? Oh?
Speaker 1 (22:30):
AGB is the new media phrase. A growing backlash as
the Elon Musk blah blah, a growing backlash as Donald
Trump declares the Fourteenth Amendment, a growing backlash as Trump
pardons... He's right, absolutely right. Look for that phrase. That
might be... You know what it is, it's the new
"Republicans pounce."
Speaker 2 (22:51):
Trump backlash.
Speaker 5 (22:52):
Trump ran on a lot of things, including putting an
end to DEI in the federal government, and yesterday he
put out his memorandum to do just that. The
name of it is "Ending Radical and Wasteful
Government DEI Programs and Preferencing." Executive order,
Speaker 2 (23:09):
January twentieth twenty twenty five. He signed this on day one.
Speaker 5 (23:12):
One of the things I like about the way Trump
operates is he puts things in regular-people language,
because a lot of times your governmental stuff you can't understand.
Like section one, this is the purpose and policy: The
Biden administration forced illegal and immoral discrimination programs, going by
the name Diversity, Equity, and Inclusion, DEI, into virtually all
(23:34):
aspects of the federal government, in areas ranging from airline
safety to the military. This was a concerted effort stemming
from Biden's first day in office, and he talks about
Biden's executive order. Pursuant to Executive Order one three
nine eighty five and follow-on orders, nearly every federal
agency, blah blah blah, blah blah blah. So by five
thirty today, close of business today, all DEI
(23:57):
officers of any kind in all departments, they're
Speaker 2 (24:00):
On paid leave. And that's the end of that, right right,
Love it. It's important.
Speaker 1 (24:06):
It's going to take a long time to disentangle this
insidious philosophy from the universities and schools, but at least
it's been hit with a big face full of Roundup
in the federal government. And again what you have to
understand is good, well meaning people think DEI is about
making sure there's no discrimination against people. It's not that
(24:31):
it's a neo-Marxist takeover of institutions masquerading as racial understanding.
They understand that you're a good person who's against racism.
They're exploiting that with this philosophy. I suggest, for the
umpteenth time, very strongly you read at least part of
James Lindsay and Helen Pluckrose's brilliant book Cynical Theories. It's
(24:54):
about critical theory in all of its incarnations.
Speaker 5 (25:00):
It gets down to language where it talks about the
way all hiring and promotions will be done. Oh, right
here: shall reward individual initiative, skills, performance, and hard work,
and shall not, under any circumstances, consider DEI or DEIA
factors, goals, policies, mandates, or requirements. Meritocracy.
Speaker 1 (25:20):
In other words, right, and just the idea that always
being obsessed with race and constantly lecturing people about race,
study after study. And it's funny, given my previous argument
that DEI is not what it claims to be anyway.
Speaker 2 (25:36):
The funny part is what it claims to be.
Speaker 1 (25:39):
It's a miserable failure at it. It doesn't help black people. It
Speaker 2 (25:44):
Alienates and angers everybody.
Speaker 1 (25:46):
It makes the workplace less workable and less pleasant. It
fails by every measure at what it's pretending to be.
But it isn't that anyway.
Speaker 5 (25:59):
If Kamala Harris had been elected, we'd have
gone way down, further down that road for four more years,
you know, and maybe eight or twelve, you never know.
And it would have been so ingrained and deep
into the federal government it would have been very hard
to get out.
Speaker 2 (26:16):
Now we got a shot at getting it out. I
mean that that's a big difference right there.
Speaker 1 (26:20):
Yeah. I just hope people start to understand what it
is and what it isn't, because it could come back
so easily if good, decent but not very aware people
think, well, it means not being a racist.
Speaker 5 (26:32):
The left does such a good job of naming things
in an, oh, pleasant-sounding way, and it works so often.
Speaker 2 (26:37):
I need to start doing that in my own personal life.
Speaker 5 (26:40):
Yeah, like, the example for you: like, if
you wanted to buy new golf clubs, but
you present it to your family as the
Speaker 2 (26:50):
"Healthy Father Who Will Be Here for the Future" Act. Perfect.
Speaker 5 (26:56):
And that's why I'm buying new golf clubs with
family money: because I'm gonna be healthy for
the future and be here for you in your old age.
Don't you want me to hold your grandchildren? You can
what's the word, work that in there as well?
Speaker 1 (27:08):
That would be good. But uh yeah, that's a perfect illustration.
You've got to reword everything.
Speaker 7 (27:12):
Oh?
Speaker 1 (27:13):
They are fabulous and they turn equality into equity and
people think, well, same three first letters. Of course the
mayor of Philadelphia doesn't know that, but uh, it starts the
so it must mean roughly the same. And in the
same way illegal alien became illegal immigrant, became undocumented immigrant,
which became whatever, which then is now a migrant, or,
(27:33):
for some, undocumented worker, unpapered migrant, fairy dust angel dreamers.
Why do you adopt the left's verbiage? Don't. Stop it.
Gender-affirming care: you're a sucker if you don't know
how you're being played through that term. I gave Bret Baier
the rough side of my tongue on that, and they
(27:55):
have not used that term again on Special Report.
Speaker 5 (27:57):
Well, I believe I was heard. And the best of
all time: pro-choice, not pro-abortion, pro-choice. It's,
"Joe, what, you don't like people making choices?"
Speaker 2 (28:04):
What kind of awful person are you?
Speaker 7 (28:06):
Oh?
Speaker 2 (28:07):
Ah, the hypocrisy.
Speaker 7 (28:08):
Uh?
Speaker 5 (28:08):
Quickly before we go to break. Somebody, a friend of
mine, just texted me this, having arrived at their place
of employment: knees weak, arms are heavy, ready to leave
work already or whatever.
Speaker 2 (28:20):
Eminem said.
Speaker 5 (28:24):
It's funny how some days you just don't have it,
and it's always difficult to nail down. Sometimes it's clear
like you went to bed too late, drank too much
or something. But aside from that, some days it's just
like why am I? Why am I not in the
groove today? Whether I'm at the gym or work. Like,
what, I can't get on my horse today?
Speaker 2 (28:43):
What is the deal? No mojo?
Speaker 1 (28:44):
Yeah?
Speaker 2 (28:44):
I got no mojo?
Speaker 1 (28:46):
Yeah, I know, I know, it's just it's it's the
human condition.
Speaker 2 (28:51):
Yeah.
Speaker 5 (28:52):
And I find as you age, the days without mojo
seemed to come more often.
Speaker 2 (29:00):
Yeah.
Speaker 1 (29:01):
Yeah, I just... I have a friend, and I'll
stay vague on this, but, like, we have a
job millions want but only hundreds have,
he said, our running joke is we're living proof that
you can grow to hate your dream job. Not in
(29:23):
any sustained way. I love what I do, but there
are days you think, no, seriously, I will do anything
but talk about the news today.
Speaker 2 (29:33):
But you know, it comes and it goes.
Speaker 7 (29:35):
Yeah.
Speaker 5 (29:35):
I always use that line on my kids for all
kinds of different stuff. It doesn't matter what it is.
You know, I want you to clean your room. I
don't want to clean my room. You know who else
doesn't want to clean the room? Everybody? I don't feel
like going to school today. You know who else doesn't
feel like going to school? Every kid in America or
just whatever. You know who else doesn't want to go
to work today?
Speaker 2 (29:54):
Me and everyone else. So there you go. Yeah, welcome
to the human race. I've got a lot more on
the way. Stay here. Armstrong. Costco workers
Speaker 1 (30:06):
Have voted to authorize a nationwide strike. Yeah, in response,
Costco has threatened to replace them with Kirkland brand workers.
Speaker 2 (30:17):
That's a pretty good joke.
Speaker 5 (30:19):
Hey, By the way, I know I'm late to the
party on this and I'm opening myself up for mockery here,
but that's what I do. And I realize if you
have an Android phone, this has been around for years,
you Apple haters, But the iPhone just got the technology
fairly recently, and I even more recently started using it.
So I took a picture of myself at the Washington Monument
in Washington, DC, and I really liked the picture, and
(30:41):
my thirteen year old said, well, why don't you get
that guy out of there? There's a guy like standing
behind me. Otherwise it was just me and the monument. Ah,
And he grabbed my phone and he did it. He
showed me how to do that, and now I'm obsessed
with it: taking a picture and eliminating the garbage
can behind you or the person you don't want or whatever.
Speaker 2 (31:01):
It's effortless.
Speaker 5 (31:02):
It's amazing. It's like magic. He just had to show
me how; it's just incredible. On the, uh, you know,
where you adjust the cropping and everything like that,
Speaker 2 (31:13):
There's a little eraser emoji thing there and you tap.
Speaker 5 (31:16):
On that and you just use your finger and you
just kind of color in what you want to erase,
and it figures out, oh, you mean that person, and
it just takes them out and fills in with the.
Speaker 2 (31:25):
Background like magic. It's amazing.
Speaker 1 (31:28):
Randos in the background. That ill-advised ex-boyfriend that
your friends warned you about, but you ignored them.
Speaker 5 (31:34):
Yes, there's lots of people that have pictures like that.
Oh I love that picture of Yosemite. It's just says
my ex and he was a jerk.
Speaker 1 (31:41):
You take him out. You still have the picture of
you at Yosemite. Yeah, uh, interesting. I gotta get used to that.
I gotta do that.
Speaker 5 (31:49):
I feel bad for people who used to do that,
you know, and get paid great salaries for it, because
it was a tremendous talent.
Speaker 7 (31:56):
I know.
Speaker 1 (31:57):
I've known people in the arts, like graphic artists, commercial
artists and stuff like that, and they were smart and
they got wise to computer art when that became the thing.
I mean, they put down their pens and markers and
the rest of it and picked up a mouse and
they got good at that. And now it's like any
dip with a keyboard can type in, give me Donald
Trump riding a white stallion with an ice cream cone
(32:20):
in his hand, and.
Speaker 2 (32:20):
There it is. For some reason, yeah, for.
Speaker 1 (32:23):
Some reason, speaking of advertisements and that sort of thing,
I thought this is crazy.
Speaker 2 (32:28):
It's a report on.
Speaker 1 (32:31):
Advertising in a variety of... it doesn't really matter. But
four percent of young Americans report that ads infiltrate their dreams.
Whoa. Commercial messages find their way into their dreams? I
Speaker 2 (32:47):
Don't think that's ever happened to me, or at least
I'm not aware of it. Yeah, that's obviously what I
was going to ask.
Speaker 1 (32:52):
I don't recall ever having a dream like like that?
Speaker 5 (33:02):
And why, why do young people say it's happening and
it's not happening to us? Do they spend that much
more time taking in ads
Speaker 2 (33:08):
Than we do?
Speaker 1 (33:09):
They don't go into real depth on that. The polling
was of, it was like, over eleven hundred Americans
age eighteen to thirty-five. Twenty-two percent of respondents
experience ad-like content in their dreams between once a
week and daily. Another roughly twenty percent report such occurrences
(33:30):
once a month to every couple of months.
merely passive. The survey reveals that these dream based advertisements
may be influencing consumer behavior in tangible ways.
Speaker 3 (33:40):
Well.
Speaker 5 (33:41):
Two thirds of... yes. So am I having, like,
I'm having a dream where I'm flying? Love those dreams.
Oh man, I have a dream where I'm flying and then
all of a sudden a dream voice comes in, like,
flying? Try Southwest Airlines, go to Southwest Airlines dot com.
Speaker 2 (33:55):
That sort of thing.
Speaker 1 (33:56):
Oh yeah, that's hilarious. No, but they say that advertising
corporations are desperate to figure out how this works so they.
Speaker 2 (34:08):
Can do it more.
Speaker 5 (34:09):
I'm sure we'll get back to this sex dream right
after this message from Nike. Nike has new shoes that
could get a girl like this in bed with you.
Speaker 2 (34:16):
Wow. Yeah, I mean, good luck with that approach.
It's hilarious.
Speaker 1 (34:20):
But can you imagine you're sitting there, you're watching the
NFL playoffs and some weird ad comes on with like
imagery like what the hell was that? And a Honda
car and you're like, honey, did you see that ad?
Speaker 2 (34:32):
That was weird?
Speaker 1 (34:33):
And then for the next seven nights you dream about
it because they've somehow tapped into the neural codes involved.
Speaker 2 (34:39):
Katie.
Speaker 8 (34:40):
Wow, I'm just thinking these aren't dreams, they're nightmares.
Speaker 5 (34:43):
Yeah, it's just... so advertisers thought, we've got eight
hours we're missing out on out of a twenty four
hour day. We need to figure out how to advertise
to them in their dreams in their sleep.
Speaker 1 (34:53):
I used to have dreams, especially as a kid, where
some monster, bad person was chasing me and I couldn't
outrun them. This dream brought to you by twenty four
Hour Fitness. Get on the treadmill, Joe.
Speaker 5 (35:04):
If you miss an hour, get the podcast, Armstrong
and Getty On Demand. Armstrong and Getty.