
December 1, 2025 36 mins

Hour 4 of A&G features...

  • Strikes on Venezuelan drug boat
  • "Man-keeping"
  • We need a plan for AI
  • Final Thoughts! 

Stupid Should Hurt: https://www.armstrongandgetty.com/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Broadcasting live from the Abraham Lincoln Radio Studio at the George
Washington Broadcast Center, Jack Armstrong and Joe Getty, Armstrong and
Getty, and he's Armstrong and Getty. I don't know that

(00:24):
that happened, and Pete said he did not want that.
He didn't even know what people were talking about. So
we'll look, we'll look into it. But no, I
wouldn't have wanted that, not a second strike.

Speaker 2 (00:37):
So that's a story that's burbled up over the last
couple of days. So when we blasted, I think
it was the first drug boat we blasted, after we
blasted it, there were a couple of people clinging to
the boat, and then, according to The Washington Post, Pete
Hegseth ordered those two people to be blasted again, which
would be a violation of all that is good and decent.

(01:01):
Here is a report on NBC about that. Then we'll discuss.

Speaker 3 (01:04):
The Washington Post, citing two sources with direct knowledge, reports
Defense Secretary Pete Hegseth issued a verbal directive to target
a boat suspected of ferrying drugs, with one source saying
the order was to quote kill everybody. But the Post
reports when it was discovered two people on board had
survived the strike, a Special Operations commander ordered a double tap,

(01:27):
a second strike that killed the survivors, one source telling
the Post the follow on strike was to clear the
debris of the wreckage for other boats, not to kill
the survivors.

Speaker 2 (01:38):
Now, these are anonymous sources from the Washington Post. Theirs is
the original reporting on this. Nobody has confirmed those reports
that I'm aware of. Yet every story I saw about
this today, NBC, whoever I was watching, they'll say,
we have not independently confirmed these reports from the Washington Post.
Pete Hegseth says that is not what happened. That was

(01:59):
news to him. Trump says, if it did happen that way,
I would not have wanted it, which is interesting, But
I'm sorry.

Speaker 4 (02:06):
Who said, was that Pentagon sources who said the second
strike was to clear debris? So that was acknowledging a
second strike occurred.

Speaker 2 (02:15):
Okay.

Speaker 4 (02:16):
So the only question then remaining is, were there survivors
clinging to that wreckage, and did the US military know
it before they fired the kill shots at people who
were obviously no military threat?

Speaker 2 (02:29):
The thing, that is a war crime, and the crime
is murder. The thing I really don't know, that needs
to be nailed down, I haven't heard anybody talk about it,
how much, how many people are involved in this communication
between Pete and whoever? You know, how many people are
down the line from Pete to somebody else, to somebody

(02:49):
else to somebody else who actually pulls the trigger on
the aerial shot from the drone? I mean, I don't,
I don't know how many people are involved there. And
then, is there any record of that communication? Do they
record that? Is it verbal? I don't

Speaker 4 (03:03):
Know. Any of a number of questions. Yeah, the Free Press
is writing about it. According to the Post, mission
commander Admiral Frank "Mitch" Bradley ordered the second strike specifically to
kill the two survivors, so he, he would certainly be
guilty of the war crime, and depending how direct
the order was from Pete Hegseth, he might.

Speaker 2 (03:25):
Be as well, I guess, you know, allegedly. And then.

Speaker 4 (03:33):
It's worth pointing out we have Republicans joining in the Hey,
we've got to figure out what happened here.

Speaker 2 (03:38):
Cry. It's not just, you know, the usual suspects, right.
You only open an investigation in the House of Representatives
if the majority party wants it, and they've opened
investigations in a couple of different committees in the House
of Representatives because the Republican committee chairs have said, yeah,
we want to look into this. Also a Senate committee
looking into it. And is it just a coincidence that

(04:03):
the week before this story broke, the Democrats
put out that video of you do not have to
follow illegal orders, you do not have to follow illegal
orders, and that whole dust-up. Is it just a
coincidence that that happened right before this information came out?
That would be a hell of a coincidence.

Speaker 4 (04:20):
Sure seems like it, because this attack allegedly happened on
September the second. So yeah, you absolutely could believe that
there were leaks and people were hearing, oh, this was
a clear war crime in the blasting-the-drug-boats
thing, and that's why they put out that video.
It makes more sense now.

Speaker 2 (04:41):
Right, because it kind of seemed like it came out
of nowhere. Although, would they be after this, what is it,
Admiral whoever? Yeah, Admiral Bradley. You know, I'm pretty
cynical about our politics today. I'm not sure how many

(05:04):
of these Democrats who are making a big deal out
of this are actually concerned about these Venezuelans clinging to
the boat, versus they want a scalp in the Trump
administration, to damage the Trump administration, which is more important
to them. So I feel like some admirable, no, admiral,
I did it. Yeah, I feel like, I feel like some

(05:26):
admiral whose name nobody had ever heard before would be
enough for them to feel like they got a victory. Yeah.

Speaker 4 (05:35):
I don't disagree with anything you said, but I don't
think it's appropriate to look at this story through the
lens of what is the motivation of Democrats, whether it
is holy or unholy? If a war crime was committed,
we got to get to the bottom of it.

Speaker 2 (05:51):
Yeah, I guess I was assuming that it didn't happen
the way it's being portrayed in the Washington Post, and
we don't know that. Yeah, I don't know.

Speaker 4 (05:59):
No idea. In fact, were you going to play
our two Congress fellas?

Speaker 2 (06:03):
I wasn't, But we can. Let's do it. Sixty three
and four back to back.

Speaker 4 (06:06):
Mike Turner and Don Bacon, both Armed Services Committee guys
or House Intelligence Committee guys, both Republicans.

Speaker 5 (06:14):
Congress does not have information that that had occurred. Both
the chairman of the Senate Armed Services Committee, chairman of
the House Armed Services Committee, and ranking members have opened investigations.

Speaker 2 (06:24):
Obviously, if that occurred, that.

Speaker 5 (06:26):
Would be very serious, and I agree that that
would be an illegal act.

Speaker 6 (06:31):
If the facts go to where the Washington Post article
takes it, well, then.

Speaker 2 (06:36):
We'll have to go from there.

Speaker 6 (06:37):
But if it was as the article said, that
is a violation of

Speaker 2 (06:42):
The law of war.

Speaker 6 (06:43):
But if people want to surrender, you don't kill them, and they
have to pose an imminent threat. It's hard to believe
that two people on a raft trying to survive would
pose an imminent threat.

Speaker 2 (06:54):
I would just like to have somebody point out that
if it turns out this didn't happen, how about a
little pressure on the Washington Post to not put out
this sort of crap, because that happens a lot too,
where you get these stories with anonymous sources. They become
a big deal for a couple of twenty-four-hour
news cycles. Then they kind of drizzle away and nobody
news cycles. Then they kind of drizzle away and nobody

(07:14):
ever gets packed.

Speaker 4 (07:15):
Three straight years during the Russia collusion hoax. Please,
have some shame, you scumbags.

Speaker 2 (07:19):
No kidding. And then nobody ever circles back to say, hey,
that stuff didn't turn out to be true.

Speaker 4 (07:24):
That's not cool, right, right, just take down the Washington Post. Sorry,
Jeff Bezos. I know you wasted a lot of money
on it, but I'm sorry. We're shutting it down. Yeah,
we'll just have to wait for the facts to come out.
It's troubling. We need to be.

Speaker 2 (07:38):
Better than the worst people on earth as a country. Well,
it's getting, it's pretty hard to imagine either Pete or
the guy underneath him thinking that was okay. Yeah, yeah,
it is. I mean that would be extraordinary.

Speaker 4 (07:52):
I mean if Pete said something to the effect of kill
them all, which I think is in the Washington
Post story, I can't remember the phrase exactly, and this
guy interpreted that to kill hopeless, helpless survivors, yeah, that
would be an enormous error in judgment, the sort of
which results in, you know, prosecutions.

Speaker 1 (08:15):
Uh.

Speaker 4 (08:15):
The commander overseeing the operation from Fort Bragg in North Carolina,
this Admiral Bradley, told people on this secure conference call,
according to the Washington Post, that the survivors were still legitimate
targets because they could theoretically call other traffickers to retrieve
them and their cargo. According to two people, he ordered
the second strike to fulfill Hegseth's directive that everyone must

(08:39):
be killed.

Speaker 2 (08:41):
So, I don't know anything about this. So if you're
in that line of work, you're the sort of person
that pulls the trigger on the drone to blast these
people in the boat, are you taught all of this
in some sort of training, or are you just supposed
to know it from watching movies? I don't know.
Would you have been taught, now, if you hit the

(09:04):
boat and there are survivors, they're no longer a threat,
so it would be illegal to kill them? Are you
actually taught that? I would hope so, but maybe,
maybe you are, maybe you aren't. I don't know. If you know anything
about this, like you served in the military and you
got that sort of training, let us know on the
text line or the email.

Speaker 4 (09:21):
Yeah, I'm looking to see if we have any decisive,
knowledgeable emails. No, just a bunch of dopey comments online.

Speaker 2 (09:36):
Well, this is a Grade A scandal. If it can
be substantiated in any way, it would drive Hegseth out
of his job and perhaps have some sort of trial
or something.

Speaker 4 (09:50):
I don't know. Here's a Trump loyalist: Joe, you're
missing a possibility. Maybe the Dems made that illegal order
video because they were going to play into a false
story about Hegseth.

Speaker 2 (10:04):
See that's yeah, that's a stretch.

Speaker 4 (10:10):
And this from Robert: So the WaPo reports something quoting
anonymous sources. Why is that given any credibility at all,
given the history of the WaPo with fabricating things? The
dude was just put in place, that would be the
admiral, on August first. Navy SEAL. I'm guessing he knows
the law and is not afraid of the Secretary of Defense.
This is an entire argument against a straw man to

(10:32):
put and keep it in the news. Unless there's proof
this order exists or something illegal happened, it's just people
trying to whip up news. Yeah, you know what, and
he says our media is so broken, I would agree
with you, and both Don Bacon and Mike Turner said
if this is true, it's a serious deal. But all right,
now we're in the let's-let-the-facts-come-out
part of the process.

Speaker 2 (10:56):
I'm kind of interested. I often like what Barry McCaffrey says,
retired general. He's usually on NBC. Can we hear sixty seven?
I'd just like to hear his take on this.

Speaker 7 (11:05):
The easy question is whether or not we are allowed
to use lethal force against people clinging to the wreckage. They
have submitted, there's no threat to the US military forces.

Speaker 1 (11:18):
That seems to me.

Speaker 7 (11:19):
It needs to be investigated, needs to be confirmed. It
seems to be clearly an illegal order and a war crime.
And this admiral should have, when he allegedly got
this verbal order from the Secretary of Defense, should have
put it in writing, said we will not comply and
explained why. So that one's straightforward. It's no different than

(11:40):
a Nazi submarine machine gunning survivors in the water of
a sunken ship in World War Two.

Speaker 4 (11:49):
For which they were prosecuted at the Nuremberg trials and.

Speaker 2 (11:52):
Executed, man. And so how fast does this stuff happen, too?
So the first shot on the boat went, how much
time was there between the first strike and the
second strike? Probably not a lot, I'm guessing.

Speaker 4 (12:06):
Yeah, the WaPo account is it took a while for the
smoke to clear, and that everybody was surprised to see
survivors, and the order was given, hit them again. But again,
at this point, I feel like the entire question is, okay,
did that stuff actually happen in the way it was described?
If so, okay, then the process begins. If not, let's

(12:27):
move on with our lives after kicking the hell out
of the WaPo for false stories.

Speaker 2 (12:32):
Again, there needs to be better reporting on how this
whole thing works, and I'm surprised that's not included. That's
one of the reasons that makes me very skeptical about this,
because if I'm the reporter receiving this information from some
anonymous source, I'm asking, Okay, how does the chain of
command work here? Pete talks to who, who talks to who?
Who actually fires the rocket or the missile? Is all

(12:53):
of this videotaped? We've seen the video of the original
strikes on a lot of these boats. Are all of
them videotaped? So are you claiming there is a videotape
of this out there? Also, I would ask those questions
if I was the reporter receiving this information, and include that
in my report and say a videotape exists of this, for instance,

(13:13):
because that would be a big deal. It's very easy
to get to the bottom of. Then whichever committee just
needs to say, okay, we need to see the videotape, like
by noon today. I'm almost always surprised at how long
it takes to answer these questions. Seems like they should
be answerable, like in an hour.

Speaker 4 (13:32):
Yeah, of course, but everybody lawyers up and drags their
feet as much as possible. I mean, you can have
the most legitimate request in the world for information from
the government, and it's just, it's.

Speaker 2 (13:43):
Like, you know, I don't know, mining for rare earth minerals.
You do have Republican committees in theory trying to protect
a Republican administration and a Republican SecDef and all that.
I would think they'd all want to, yeah, let's get
the video out, this is what happened, unless something bad happened. Yeah,
interesting thoughts, text us please, four one five, two nine five, KFTC.

(14:07):
I might want to get super involved in the AI
argument because it might be the most important thing that's
ever happened in the history of the world. Want to
talk about that coming up next segment. Became aware of this.

Speaker 4 (14:17):
New book, a kinder, gentler feminism. Well, the title
is actually The Dignity of Dependence, a feminist manifesto, that
Caleb Bart reviewed. Super interesting. We don't really have time
to get into it now, but the author, who's a woman,
talks about, you know, feminism got way off track, but

(14:40):
it got off track when it began to deny women's
essential differences from men. She observes with great acuity, depth,
and wisdom that much of feminism today seeks to make
women more like men insofar as they are
autonomous and impenetrable, blah blah blah, and just that
that's a very perverse way to look at empowering women.

Speaker 2 (14:58):
There's no, make them into, there's no doubt that that's true,
though. A lot of modern feminism is just, anything that's
like being a woman is wrong. Yeah, yeah,
it's just crazy.

Speaker 4 (15:11):
And then that reminded me of something else I'd seen,
and I'd mentioned earlier in the show that in the
midst of my holiday revelry and a lovely time with
my family, I made the mistake of clicking on an
article in The New York Post, which was just a
perfect example of Internet snippy bitterness, superiority trolling.

Speaker 2 (15:28):
The rest of it. Here's another good one.

Speaker 4 (15:30):
This is a Vice article that came out a couple
of months ago, but now it's gone viral.

Speaker 2 (15:35):
It's about mankeeping. I will quote.

Speaker 4 (15:38):
Mankeeping describes the emotional labor women end up doing in
heterosexual relationships. It goes beyond remembering birthdays or coordinating social plans.
It means being your partner's one-person support system, managing
his stress, interpreting his moods, holding his hand through feelings
he won't share with anyone else, all of it unpaid, unacknowledged,
and often unreciprocated.

Speaker 2 (15:59):
What are we talking about? Well, mankeeping.

Speaker 4 (16:02):
And as Nellie Bowles writes, hmm, when I was young,
it was called loving someone. Man or woman, you do a
lot of that stuff for your partner, but now we
call it mankeeping and we hate it.

Speaker 2 (16:13):
Oh do we hate it? Holding his hand ew through feelings. Eh,
he won't share with anyone but you. Ooh. And it's unpaid.

Speaker 4 (16:22):
This is worse than an internship.

Speaker 2 (16:25):
Took the words right out of my mouth. That's that's
called love?

Speaker 4 (16:28):
What, that is a viral, whatever that means, article
on the internet that's really making the rounds. Young women can say,
I don't need that, this

Speaker 2 (16:37):
Mankeeping. Good grief. Unplug the effing Internet.

Speaker 4 (16:43):
I'm gonna miss buying stuff online, but I'll schlep to
the store. I'll write letters, I'll call people on the telephone.
Unplug the Internet.

Speaker 2 (16:52):
Well, it's too late to unplug the Internet. Is it
too late to unplug the AI, the idea of
AI taking over the world and destroying mankind? It probably
isn't too late, but we'd all have to rally together,
which I might be interested in trying to start. What
do I need? A sledgehammer, a pry bar, and gasoline?

(17:13):
What do I tell me?

Speaker 3 (17:14):
And I'll bring up day tune, Armstrong and Getty.

Speaker 8 (17:19):
Retailers are hoping to add a new spin to their
sales events, leaning into artificial intelligence. In the weeks leading
up to peak holiday spending, ChatGPT maker OpenAI
partnered with Walmart and Target. Over half of shoppers said
in a recent survey they plan to use AI this year.
Amazon's offering its own chatbot called Rufus, where
you can take a screenshot of a shopping list and

(17:40):
automatically add those items to your cart.

Speaker 2 (17:43):
Oh, how exciting is that? That's not that great. All
the discussions we're having around AI are missing the main
topic, which should be discussed probably constantly, and maybe, maybe
the number one political topic in America, if not the
top couple. As it's been pointed out by a number

(18:07):
of the people I've been watching and listening to and
reading over the last several weeks about AI, these chatbots
we're all using, they're like a web page as opposed
to the Internet. It's got nothing to do with what
AI is going to become or the impact it's going
to have on the world. The fact that ChatGPT

(18:28):
is a cooler Google, you know, that sort of thing,
and it's kind of misleading people into thinking as to
what AI actually is. So I've been on this kick
for quite a while. If you listen, you know, I'm
big into AI and read about it and listen to
podcasts about it and everything like that. The book
that came out fairly recently, I mentioned before the break,
by Eliezer Yudkowsky, which has gotten a lot of attention

(18:50):
in AI circles. It's called If Anyone Builds It, Everyone Dies.
He was one of the biggest proponents of AI
over the last several decades, one of your leading cheerleaders
for AI. If you ever read anything or watched a
show or anything like that, or any network television show

(19:11):
Oprah in the afternoon, whatever, and somebody was talking about AI,
it was probably him, up until recently when he decided, no,
we can't control this. Superintelligence is absolutely going
to happen. We're creating a beast significantly smarter than us.
How do we think that's possibly going to turn out
to our benefit? We need to stop it immediately. And

(19:31):
he wrote this book, If Anyone Builds It, Everyone Dies,
and he's trying to get a whole bunch of people
on board to do something about this. Elon said the
other day he thinks there's a ten to twenty percent
chance that AI destroys mankind. Why in the hell would
you build anything that there's even a ten percent chance

(19:52):
that destroys all mankind? It seems crazy, that is.

Speaker 4 (19:58):
I mean, the answer is so so clear and so unimplementable.
At the point that it gets real good at curing cancer,
for instance, we all love it and then stop right,
but it's unimplementable.

Speaker 2 (20:12):
I was watching it.

Speaker 4 (20:13):
I mean, for the obvious reasons, we don't have the
cooperation of everyone on Earth.

Speaker 2 (20:17):
I was watching a, I'm gonna start wearing t-shirts
from the doom people because I'm one of them. I'm
a doomer, definitely, that believes that it's all going to
go to hell. Doom Debates is a website I've been
watching a lot. Practically everybody involved in these arguments is within

(20:39):
sixty miles of this radio studio, and I wish we
could get some of them on the air, the Doom Debates,
where they have some of the leading people on to
discuss various sides of it. I was watching a doom
debate between this guy, Max Tegmark, who I've mentioned a lot,
I've read a couple of his books, Life three point
zero and a bunch of different stuff, he's an MIT scientist,
with this other guy who's one of your leading AI researchers,
who thinks, you know. And the main pro argument

(21:03):
from this dude is there's just no regulating it anyway.
I mean, how the hell are you going to regulate it?
Which he might be right about. But Tegmark and
a lot of others, and this is what I might
want to get involved with personally, they're trying to get
people's attention and maybe have marches or something to try
to alert the government. We need to come up with

(21:26):
a plan. We're just screaming one thousand miles an hour
toward developing a beast or whatever you want to call
it that is absolutely going to doom humanity and nobody's
putting any brakes on it. What the freaking hell are
we doing?

Speaker 4 (21:45):
Yeah, twice already you've used the word something. And I'm
not saying that we shouldn't be trying to get people's
attention so that they're willing to do something. Then the
obvious next step is, what thing?

Speaker 2 (21:58):
Well, Tegmark, Tegmark and other people's argument, and he's been,
he's been trying to lobby Congress, but there's just not
enough public knowledge out there, will, to really have any,
any heft yet. And that's why I was wondering where maybe
we can come in, or I can come in or whatever,
get the media involved to alert people to what could happen.
Would just be to say, everybody's got to stop. Open

(22:18):
AI, ChatGPT, Elon, Zuckerberg, you gotta stop, no more,
until we get our heads around this and come up
with some sort of regulations. One of the points being
made is we have way more regulations on sandwiches in
America currently than we do on AI. It's not even close.

(22:39):
There are sandwich, there are almost zero, almost zero regulations
on AI at this point. Right. And you know, if
you're a listener to this show, you know it's not
like me to want to be pro any kind of regulation.
But as he points out, you're not allowed to just
make any kind of drug you want to and put

(22:59):
it out to the people. But we have no regulations
on AI for what we're just going to unleash on humanity.

Speaker 4 (23:07):
Does, I assume, he get to my next, you know,
devil's advocate type question, which is, okay, if we regulate it
but China does not, then where are we?

Speaker 2 (23:18):
That always gets sticky also, and that's one of the
pushbacks from people. You'd have to have some sort of
world pressure, same way we had around nuclear weapons, with
inspectors going in, or you have to announce when you're,
blah blah blah, all kinds of different things. Yeah, I
don't know if that's the thing. Don't know what the
questions and what the somethings are yet, which does not

(23:39):
in any way deny that we ought to be trying
to come up with them. But boy, it's a head scratcher.
I'd say one of the arguments is that we're
the smartest beast on earth. Every other living organism lives
at our pleasure. It's only because we as a society,

(24:02):
and this is kind of in the modern age, that
we've decided we want to let chimpanzees live and we
should value them and not just murder them for their
teeth or whatever. They live at our pleasure, even
though they're the second smartest beast on earth. Why would
you think that that's not going to occur when there's
a smarter thing than us, that we're not going to live

(24:24):
at its pleasure, whether it wants us to be around
or not. Makes sense to me. Yeah.

Speaker 4 (24:31):
Yeah, I've got this horrible thought that when the super
AI is developed, the first thing that's going to happen
is Kim Jong Un is going to empty everybody's bank
accounts worldwide. It's all going to flow into the North
Korean treasury and that'll be it. That'll be plenty, can
you imagine?

Speaker 2 (24:49):
Yeah, I don't know about that. I don't think any
individual is going to have any control. All the
experts say this, no individual is going to have any
control over super AI. It will do whatever the hell
it wants. It's not going to do the
bidding of the Chinese or North Korea or us or
any other human. It's going to do whatever it wants,
and what it wants, nobody has the slightest idea.

Speaker 4 (25:09):
May I hit you with an intriguing email from our
friend JT in Livermore about the dangers from superintelligent
general AI. I think it comes down to this question:
would an intelligence vastly superior to human intelligence base its actions
on the most base, animalistic behaviors and emotions, or would
it be driven by a higher understanding and intelligent empathy?
Claiming that a super AI would attack us out of

(25:31):
desire for self-preservation, or out of paranoia, or out
of indifference to the value of life, presupposes that
the basest emotions would be the dominant drivers of the
AI's actions. But don't most higher thinkers believe in empathy,
helping those less fortunate, the sanctity of life, and the
beauty of life? Wouldn't a super AI be more likely
to adopt those higher forms of enlightenment rather than the
basest motivations of a selfish, scared toddler? And then, by

(25:54):
way of illustration, at the end of the fabulous original
Blade Runner movie, the last ultra-advanced replicant has almost
every reason to kill the human that has killed all
of his friends and that was trying to kill him,
but he chose to let Harrison Ford's character live,
because it believed in the beauty and sanctity
of life.

Speaker 2 (26:13):
Yeah, one would, that'd be fantastic if it went that direction,
but I don't know how you count on it. The
example was used that when Germany started two world wars,
they were pretty much the most sophisticated, advanced society on
planet Earth, with the finest arts and writers and everything else,
and they went completely off the rails and did the

(26:35):
things that they did because intelligent beings are capable of
doing that and convincing themselves they're doing the right thing.
And a reminder that C. S.

Speaker 4 (26:44):
Lewis, to paraphrase him, put it that the most oppressive
oppression is from people who think they're doing it for
your own good, right. And I could easily see one power
or another deciding that, you know, all of humanity would
be a hell of a lot better off, you know,
under our boot heel.

Speaker 2 (27:01):
Yeah, one example was used. So the alignment problem all
along has been, can you align whichever AIs you're talking about?
And I have become aware that they all use the
term AIs when they're talking about multiples. That is just
the term of art. So whichever AIs you're talking about,
you try to align them with some sort of morals

(27:23):
or decency that your company puts in them. But the
argument was made, and I thought this was really good.
We're programmed with really one alignment completely as human beings,
and that is to stay alive and procreate. Yet
we invented birth control and abortion, which seems to run

(27:44):
completely contrary to the one thing we were aligned to do.
And then we regularly do things that aren't like in
our best interest in terms of eating or exercising or
all kinds of different things. We're aligned to stay alive,
but we do all kinds of things that will kill us.
So you don't necessarily stay on track, which would be

(28:04):
the same problem with whichever AI is built to be
aligned with whatever Elon Musk or Sam Altman's goals are
for the supercomputer.

Speaker 4 (28:14):
So now that you've terrified everybody, what's the latest thinking
on timetables? Is there any predominant opinion on when, you
know, various terrifying or awe-inspiring bench stones, er,
milestones are reached? It's all over the place, but the
bench stone, it's benchmark or milestone, anyway, back to you.

Speaker 2 (28:32):
It's all over the place. But everybody agrees that we
got here way faster than everybody thought we would to
where we are today, ended up arriving much faster than
most predictions. So so far it has been on the
forward end of things happening, as opposed to the back
end in terms of how fast it can happen.

Speaker 1 (28:54):
Now.

Speaker 2 (28:57):
They were generally arguing in the debate I was watching
last night somewhere in the early thirties, which is only
five and a half years from now, between five and
ten years from now that it will arrive. But what
do you think of the idea of trying to raise
public will to do this? Do you think you could
possibly do that? Convince people? And the other thing. I

(29:17):
would actually love to talk to these people about this,
some of the thinkers that are trying to move the masses.
If a whiff of partisanship comes into this, it's over.

Speaker 1 (29:35):
Though.

Speaker 2 (29:35):
If Trump weighs in one way or the other on AI,
forget it, We're done. Or if some pundit, you know,
assigns being pro AI is what Trump wants or being
anti AI is Trump, it'd be like masks and vaccines
and everything else. It's over at that point, right, And
I don't know if there's any avoiding that.

Speaker 4 (29:55):
I just keep thinking back to the hilarious movie Don't
Look Up, which a lot of conservatives didn't like because
it skewered conservatives, but I thought it skewered lefties every
bit as brilliantly. I don't know that you can get
that going. I don't know that you can get people
to pay attention.

Speaker 2 (30:13):
Well, like in Don't Look Up, there was a movement
toward making sure, the meteor is going to benefit so
many people that we need to make sure the meteor
hits here. But what if it hits in the United
States and not in countries where they have, you know,
more inequality, you know, all that sort of stuff, right,
which absolutely will be the topic for AI.

Speaker 4 (30:33):
But, meteor to destroy life, the poor and minorities affected most.

Speaker 2 (30:38):
Yeah, do you think there's a possibility because we have
to take a break here soon, do you think there's
a possibility that you could raise awareness, get people worked
up enough that you could end up with And I've
been against every march that's occurred since nineteen sixty eight,
but if you could, if you could get people like
in the streets and say regulate AI, let's do something

(30:59):
about it. Do you think you could even come close to that? Yeah?

Speaker 4 (31:04):
Yeah, and I think it would probably be useless because
of the China factor, but I don't know for sure.

Speaker 2 (31:11):
So far, we're way ahead of them, We're the leading
edge of it.

Speaker 4 (31:19):
This has to be how people felt when they saw
the mushroom clouds in the forties.

Speaker 2 (31:25):
They thought this can't end well. Yeah, and that's an
argument for, see, that everybody thought that would doom humanity.
But that's been in the hands of a small number
of people, and human beings making the decision. What if
the, you know, the nuclear weapons didn't get to make
their own decisions of what would be best for life
and go ahead and develop independently of human needs. Well, and,

(31:48):
come on, Japan didn't have mutually assured destruction in nineteen
forty five, right, and well, they were on the receiving end.
Anybody with any thoughts on this, I'd love to hear
it for the mailbag tomorrow or on the line or whatever.
It's a hell of a topic. Anyway, we will finish
strong next waning moments of the first day back after

(32:09):
a long vacation in which I gained three and a
quarter pounds. Congratulations to me. Michael, can we hear forty seven, please?
Why do you blame the Biden

Speaker 1 (32:19):
Administration because they let him in? Are you stupid?

Speaker 2 (32:22):
Are you a stupid person?

Speaker 7 (32:24):
Because they came in on a plane along with thousands
of other people.

Speaker 6 (32:28):
That shouldn't be here, And you're just asking questions because
you're a stupid person.

Speaker 2 (32:34):
Love him or hate him, he's always a gentleman,
Donald J. Trump. Are you stupid? Are you a stupid person? Yes?

Speaker 4 (32:40):
Indeed. So that's just, he's not getting more controlled. It
should be interesting going forward. I played that partly as
an excuse to bring this up again. We were talking
earlier about how so many countries around the West have
finally said openly, look, and Marco Rubio made an announcement

(33:01):
about it Thanksgiving week. Look, we're not going to import
people who hate our values from the third world. It's suicide.
And that's unquestionably true. And if you don't understand that
you are a fool or to dopey. Now I sound
like the president to participate in the national conversation.

Speaker 2 (33:22):
I hope I'm not.

Speaker 4 (33:24):
Just came across this stat, brace yourself. The percentage of
children born in Canada who had a foreign-born mother
last year: forty two point three percent.

Speaker 2 (33:37):
No, yes, no, yes, That.

Speaker 4 (33:42):
Proportion has almost exactly doubled since nineteen ninety seven. The
West is rapidly becoming not the West anymore. To support
the economy, politicians and enough the people who live here.

Speaker 2 (33:58):
And I don't understand the crowd that thinks that's automatically a
good thing. Maybe we'll discuss that tomorrow. They're idiots. What's
to understand? They're idiots? Thought I'm strong, I'm strong. You're
ready with Katie Green and.

Speaker 1 (34:17):
Thought Strong.

Speaker 2 (34:20):
Here's your host for final thoughts, Joe Getty.

Speaker 4 (34:22):
Hey, let's get a final thought from everybody on the
crew to wrap things up for the day. There is
our technical director, Michaelangelo. Michael, what's your final thought?

Speaker 6 (34:28):
I'm a little irritated having to work today on Cyber Monday.
I was at home, you know, I wanted to decorate
for the holidays.

Speaker 2 (34:35):
But here I am working on Cyber Monday, of all days. Son of a.

Speaker 4 (34:40):
Katie Green, our esteemed newswoman, has a final thought. Katie? We

Speaker 2 (34:43):
Did Thanksgiving at someone else's house and I only regret
no leftover. Yes, that's the worst part of doing Thanksgiving
somewhere else. I mean the upside you didn't have to
make the meal, you don't have to clean up, but
no leftover sucks.

Speaker 4 (34:55):
If you have any decency, you help with the cleanup.
Point of order.

Speaker 2 (34:58):
Jack, do you have a final thought for us? Yeah, that's
a good point, the no-leftovers thing. Yeah. I mean,
on the other hand, maybe I'm better off without a
bunch of pie sitting around my house and stuffing and
everything like that, as I did gain three and a
quarter pounds.

Speaker 4 (35:10):
Yeah. My final thought is also dessert related. We had
all sorts of different pies prepared for Thanksgiving dinner, but
we had twelve people in the house and family and
all and twelve yeah, and so like, by two o'clock
the next afternoon, it was all gone.

Speaker 2 (35:26):
Damn it. It's a bummer.

Speaker 4 (35:28):
I could eat apple pie for breakfast every day of
my life.

Speaker 2 (35:32):
Heck, yeah, oh, I may some days. Exactly. Armstrong
and Getty wrapping up another grueling four-hour workday, so

Speaker 4 (35:40):
many people to thank, so little time. Go to Armstrong and
Getty dot com. No shipping charge today, free shipping at
Armstrong and Getty dot com, the A&G swag store. Get
your favorite A&G fan, maybe it's you, a little souvenir.

Speaker 2 (35:54):
And you're gonna be running out of time soon to
get it to them before the holiday, you know. We'll
see you tomorrow. God bless America.

Speaker 1 (36:01):
I'm Strong and Getty just on then this bollic.

Speaker 4 (36:14):
Easy for you to say, Signed Stegas Store Segast. That
would have been a good punch, would have been great.

Speaker 2 (36:19):
Oh yeah, there you go. Month is off to a
bad start. A pie, that's your problem.

Speaker 1 (36:28):
Armstrong and Getty