Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Broadcasting live from the Abraham Lincoln Radio Studio the George
Washington Broadcast Center, Jack Armstrong and Joe Getty.
Speaker 2 (00:10):
Armstrong and Getty and he, Armstrong and Getty.
Speaker 3 (00:23):
Toddler was rescued from a dangerous situation at Newark Airport
in New Jersey thanks to some quick thinking by two officers.
The two year old had stepped onto a baggage conveyor
belt Wednesday as the child's mother was booking a flight.
Speaker 2 (00:36):
The officers then jumped.
Speaker 3 (00:37):
Onto the conveyor belt themselves and found the toddler just
ahead of the baggage X-ray machine.
Speaker 2 (00:42):
The child was not harmed. Wait, wait a minute, wait
a minute, wait. Okay, I'll let you ask your question
before I ask my question. Go ahead. Yes, here's my question.
Speaker 1 (00:52):
So mom wasn't watching the toddler because she was
booking a flight.
Speaker 2 (00:55):
You're already at the airport.
Speaker 1 (00:57):
Sounds like an excuse to me. Booking a flight,
huh, lady.
Speaker 4 (01:02):
Here's my other question. I've seen the video. If
I'm standing there waiting for my luggage, it's going around,
and I see a two year old going around with
the luggage. I'm going to reach over and pick up
the two year old. Nobody did, though. Nobody did. Wouldn't you?
Speaker 1 (01:23):
I would think so, yeah. I think at the very least
I'd be yelling, whose kid is that?
Speaker 2 (01:28):
Whose kid is that?
Speaker 4 (01:29):
Yeah, as I kind of ran along, and if I
get no response, I'm picking the kid up. The kid
shouldn't be there, and they're gonna get hurt. I mean,
got all kinds of ways your fingers could get sliced
off or whatever.
Speaker 2 (01:39):
I would just pick the kid up, and nobody did.
Speaker 4 (01:41):
And as you heard there, security eventually came, and so
everybody just stood around and watched the two year old
hopefully not get their fingers cut off until security came.
We're just a weird society that way. With, I might
get, you know, and people probably think this, either the
authorities should do it, not me, or I might
get sued if I touch that kid or.
Speaker 1 (02:01):
Whatever, or somebody might think I'm a child molester or whatever.
Speaker 4 (02:07):
I feel like, at an earlier time, any parent would
have just grabbed the kid.
Speaker 2 (02:11):
Sure.
Speaker 1 (02:12):
Yeah, in an earlier time, we had fairly homogeneous values
as a country. Fairly now people just don't know. There's
so many different beliefs and ways of life and lawsuits
and the rest of it that I better do nothing.
Speaker 4 (02:27):
So one thing I feel like we need to do
on this show, or I like to try to do,
is be on top of various cultural things that are
occurring that you might not know of, or maybe you do
know of.
Speaker 2 (02:39):
I feel like I need to know about them to
be able to do this job. Well.
Speaker 4 (02:42):
I do not know about the Laboodoo. Is that the
way you say it? La Boo Boo? The Labubu
toy craze. I do not know about the Labubu toy craze.
And I just saw, up on the TV, Inside the
Labubu Toy Craze, and they had some reporter there, and
so I tasked Katie with figuring out what the hell
that is.
Speaker 2 (03:02):
Maybe Joe already knows, but I don't know.
Speaker 1 (03:03):
It's, oh, I'm all about the Labubu toy craze.
Speaker 2 (03:07):
Are you? No. So what is it? It's, the way
I can explain it.
Speaker 5 (03:13):
It's like the next Beanie Baby craze. They are these
stuffed animals. They're bunnies that have kind of a mischievous
look on their face.
Speaker 2 (03:21):
Some mischievous bunnies, Yes, mischievous.
Speaker 5 (03:23):
Bunnies, and they have different outfits. They kind of look
like a care bear, you know, you can see the face,
but the rest of it is very furry in a
little outfit. And apparently this took off because Rihanna strapped
one of these to.
Speaker 2 (03:34):
Her purse and everybody lost their minds. Ah, well, it
doesn't really matter how they start any craze. Once they
get going, they get going.
Speaker 4 (03:41):
And now is it the usual, like people are collecting them,
or they're sold out in various places, or certain ones
are more popular than others, and you can go on
eBay and get one for three hundred dollars or whatever,
all of the above.
Speaker 5 (03:54):
And they're also doing this thing called a blind purchase,
which is like a mystery purchase.
Speaker 2 (03:58):
So you just can't say exciting.
Speaker 5 (04:00):
You just give them thirty bucks and they send you
whatever they have.
Speaker 1 (04:02):
And I'm sorry, I'm sorry, the Internet is talking to
me through my earpiece. Really, Okay, that was two crazes ago.
It's over already, We're two crazes down the road.
Speaker 4 (04:12):
Yeah, yeah, like, no craze can last very long.
Speaker 1 (04:15):
Now, But now grinding those up and snorting them, it's
called the Labubu.
Speaker 4 (04:20):
Snorting challenge. How big is it? How big is a
la boo boo?
Speaker 5 (04:25):
They're in different sizes. They've got giant plushies and then
they are also down to like key chains.
Speaker 4 (04:30):
Okay, okay, well, now I know, so I'm glad I know.
And now you know, I do see a lot of
and it's a certain ethnic group that does this. But
I don't know if you're allowed to say that that
uh walks around with various little stuffed animals attached to
their like backpack or belts or.
Speaker 2 (04:53):
Yeah.
Speaker 4 (04:53):
I see a lot of that in my college town.
So are those la boo boos?
Speaker 5 (04:58):
Yeah, that's probably a Labubu thing. Also, just one
more Labubu note: Wang Ning, who's the thirty eight
year old founder and CEO of the company that makes them.
Speaker 2 (05:07):
There, there's a hint as to where the craze comes from.
Speaker 5 (05:12):
He saw his fortune leap by one point six billion
dollars in a single day after this Labubu
was featured in a runway show.
Speaker 4 (05:20):
God. And everybody tries to get these going, and most
of them just, you know, go nowhere. Why
would I want that stupid little stuffed animal? But if one,
it's just somehow the stars align. Sometimes it's stars like
you know, singing stars or movies. But if just somehow,
if you can get it to take off all of
a sudden, you're a billionaire. And there's no real difference
(05:43):
between this kind of stuffed animal or that kind of
stuffed animal or whatever. It's just it's it's an interesting
aspect of human beings.
Speaker 1 (05:51):
The Labubus probably have tiny hidden microphones and they're transmitting
back to the Chinese Communist Party.
Speaker 2 (05:55):
Wow. I don't know about that. Cold warrior till I
die, Jack. Huh. This same group of college kids, I
noticed they often have like.
Speaker 4 (06:06):
A tail, or ears, or a variety of things, the
furry category.
Speaker 2 (06:13):
I don't think it's that it's cute.
Speaker 4 (06:18):
I guess nobody mates anymore, so people don't get together
and mate.
Speaker 2 (06:26):
Try it once or twice. It's really awesome. Okay, Wow,
thank you for that.
Speaker 1 (06:33):
A bit that made me profoundly discouraged, all of it,
all of it. Why just why, I don't need to
explain why. Why is a sunrise beautiful? Because it is?
Why was that discouraging?
Speaker 2 (06:46):
Because it was?
Speaker 4 (06:48):
This is my son, my freshman in high school boy.
It's his last summer, which starts Thursday at noon,
it's the last week of school, where he doesn't have a job.
Speaker 1 (07:00):
Now.
Speaker 2 (07:00):
I had a job when I was his age.
Speaker 4 (07:01):
That was back in a time where we forced children
into slave labor.
Speaker 2 (07:06):
If I wanted a Labubu, I had to earn the
money myself.
Speaker 4 (07:10):
We forced kids to work, and thank god, we put
in laws that don't allow children to work anymore. I
begged my dad every day, did you talk to them today?
Did you talk to them today? Until finally he did.
He brought it up to some place, this feed lot,
and I got a job because I wanted one so bad.
Both my kids want to work, like two years ago,
but you can't because of these stupid freaking laws. It
(07:31):
makes me so mad. The stuff you learn. I've told
this story before. One of my nieces, who's now
doing absolutely fantastic.
Speaker 2 (07:39):
She got a job.
Speaker 4 (07:40):
The difference in her one year to the next after
she started working was amazing. There's so much you learn
from having a job. You'll learn more from working a
job than you will learn in high school by far,
no doubt about it. But we don't let kids do
it for some stupid freaking reason. Anywho. But we let
(08:01):
brown people in the country illegally do it. Clear, progressives, right? Right.
So you could have my son doing some of these
jobs that we have illegals do, and I'm already
providing him healthcare and a variety of things that you
don't have to worry about. But anyway, I told him,
you know, this is your last summer. Everything changes after
this summer because he's really excited about getting a job.
(08:21):
Next summer, he'll have a job, probably, to go and do
every single day.
Speaker 1 (08:24):
And you know it'll be a drag at times, but
he will build those muscles.
Speaker 2 (08:28):
You're talking about learning more than you do in school.
Speaker 1 (08:31):
You build muscles that you need. To me, what this
is, is discipline, and putting aside your desires because you
have a job to do, and the rest of it.
Speaker 4 (08:39):
What is the main reason we don't let fourteen year
olds work or fifteen year olds work?
Speaker 1 (08:44):
I think it's misplaced progressive urges.
Speaker 2 (08:49):
Well, that's, that's exploitation and safety. Yeah.
Speaker 4 (08:52):
I remember when, what state was it, one of your,
Ohio, somebody, they were talking about lowering the age to
fourteen to let kids work, and there was a
huge outcry about it, talking about kids in factories.
Speaker 2 (09:05):
In the nineteen hundreds, the early nineteen hundreds. What are you
talking about?
Speaker 1 (09:09):
Yeah, reading Upton Sinclair's The Jungle replaced the readings of
The Handmaid's Tale temporarily.
Speaker 4 (09:17):
Didn't you and all your friends have jobs when you
were young? Or some at least, you knew people who
had paper routes or mowed lawns or had various jobs,
and they were fine, and they were doing it willingly
because they liked it.
Speaker 2 (09:28):
Yes, Michael, I served up yogurt my favorite job ever
at what age? Sixteen? Well, yeah, you can do it
at sixteen.
Speaker 4 (09:35):
Though, yeah, you can't do it at fourteen, because God,
you would ruin a child if he had to work
at fourteen, even though many of them want to.
Speaker 1 (09:42):
Well, I'm reminded of an email we got. I can't
remember what the topic was, but the gist of the
email was. I think it was addressed to me saying,
and Jack, you know this, but dude, the parents at
school today are not you. They're like your kids and
their attitudes about no, the kid ought to work, if
(10:04):
he wants to, she wants to, whatever. No, no.
The attitudes that make you crazy, those are the
parents now and the teachers, not just the kids.
Speaker 4 (10:14):
Well, we probably don't have any of those people listening,
but I would like to know why don't you want
a fourteen year old to be able to.
Speaker 2 (10:19):
Have a job? Why?
Speaker 1 (10:20):
Oh, there was I will tell you the other aspect
of this, because Judy and I struggled with this a fair
amount ourselves during the heyday of the, and I won't
get off on the tangent. But you gotta go to college.
I mean, that's like your only alternative for a happy lifestyle.
Speaker 2 (10:37):
Blah blah blah.
Speaker 1 (10:38):
All three of my kids wanted to go to college,
and we were in favor of it. Because of grade inflation,
it was so competitive to get admitted to the
schools they wanted to go to, and in at least one case,
AP classes were absolutely necessary. Studying like a fiend, being the
(11:01):
super academic achiever to be on the college track was
incredibly time consuming in a way that it wasn't at
all when I was trying to go to a
fairly elite college.
Speaker 2 (11:13):
So that was it.
Speaker 1 (11:14):
We thought, all right, you know what, if your job
is studying because you want to be on the academic
track, all right.
Speaker 4 (11:19):
Study. But it doesn't make any sense. You don't have
to work then. Nobody, I'm not saying you have to
work at age fourteen, right, I'm saying you can work
at age fourteen.
Speaker 2 (11:28):
If you decided you'd rather study, or you'd rather, or you
don't want your kid to work, then fine.
Speaker 4 (11:32):
But why would you outlaw it for everyone who does want
to work at age fourteen?
Speaker 1 (11:37):
Because a significant part of the progressive psyche is the
world needs to be what I want and other things
are wrong, and those people are abusing their children and
I won't let them.
Speaker 2 (11:48):
If you want your kid to study, fine, go right ahead. Well,
why can't my kid work?
Speaker 5 (11:53):
Katie, thoughts? Well, I just think we should be more
like Canada, because generally you can work there.
Speaker 2 (11:58):
If you're like fourteen, with parental consent. Why wouldn't, what's
the argument?
Speaker 4 (12:04):
I don't know the real argument for why you can't.
I haven't heard it yet. According to who? Why, the why? Okay,
if you know the why, text us: four one five,
two nine, KFTC.
Speaker 2 (12:22):
The question?
Speaker 1 (12:23):
Really now is just how bad is his reputation going
to be in the future.
Speaker 2 (12:28):
It was bad before your book, now it's worse. Where's
he going to land.
Speaker 6 (12:33):
I mean, I can't speak to how history will ultimately
judge him. Obviously, his presidency had accomplishments and incidents
that people criticized too. We all saw what we saw
on debate night in June twenty twenty four. How often
did that happen before debate night? And what we found
out was quite a bit.
Speaker 2 (12:53):
It happened quite a bit.
Speaker 6 (12:54):
And and the fact that he and his aides and
family members decided to hide how bad it really was,
not all the time, but enough is going to be part.
Speaker 2 (13:06):
Of his legacy.
Speaker 4 (13:08):
Yeah. Uh, that's the story. Still is the story. If
somebody has dementia or Alzheimer's or whatever is happening to you,
what you decide to do is not the story. And
Grandpa continued to try to drive No, he's not fit
to make those decisions. It's the people around him that's
(13:30):
the story, right, that's the point.
Speaker 2 (13:34):
Anyway, I forgot to jam this into Oh you've got
to do this next?
Speaker 1 (13:38):
Boy, how much is Jake Tapper hating life having to
do another interview where he gets kicked around?
Speaker 4 (13:44):
I guarantee you he's seven figures, if not multiple seven
figures, more wealthy than he used to be because of
the book. So I doubt he's, that's a painkiller. Yeah,
but he should be hating life. He should be, he
should be embarrassed. I want you to do the AI
story again, particularly that one part of it next segment,
because that's just so amazing. Oh yeah, yeah, that one. Okay,
(14:07):
incredible if you haven't heard it, but I want it. Chilling.
I wanted to jam this into the gender-bending madness story,
but I forgot. This is from NBC News over the
weekend: amid President Trump's several executive orders against trans rights,
Many trans people aren't just threatening to leave the United States,
they actually are. So the story is about trans people
(14:28):
who are so unhappy with the state of the United
States in relationship to the trans world, they're moving to
other countries. My question would have been to what other countries?
What other countries more trans friendly than the United States?
We know it ain't Europe, where.
Speaker 1 (14:45):
They're not sympathetic to the awful lying arguments of
medical transition.
Speaker 2 (14:51):
No, so what I.
Speaker 4 (14:52):
would, I would, I'm actually curious. Are there others? Thailand,
not for the same reasons? Yeah, I can't imagine.
I just thought, HM thought that was interesting, and I
got to ask you this, This might be something you
can answer, Katie. I came across this over the weekend
one of my favorite like serious podcasts. They started discussing
(15:15):
the Disney Plus series Andor and explaining how important
it is and how great it is. Do you know
what that is? Either one of you? Andor? I've
been talking about it for weeks, but yeah, yeah, it
didn't stick in my mind because I don't watch it.
I guess yeah, that's yeah, it's fantastic. I've been raving
about it on the show.
Speaker 1 (15:32):
So, okay, it's that Star Wars prequel-ish one, oh, that
is put together by the guy who wrote the Bourne Legacy,
Bourne Supremacy movies, that sort of thing. It's much more gritty,
realistic, the politics of a revolution. I mean,
it's got some of the Star Wars action and spaceships
stuff like that, but it's much more gritty and grim
(15:53):
and realistic.
Speaker 4 (15:54):
So my son loves everything Star Wars, has watched them
all multiple times. Would he like it? The thirteen-year-old? Okay,
maybe I'll watch it with him then. Yeah, I think
he'll be fascinated by it.
Speaker 2 (16:03):
Well, I like it. I don't know.
Speaker 4 (16:07):
You're odd, what, I'm asking you. You know there's no talent.
Oh, and I want to, Judy and I just
watched the season, I'm sorry, the series finale, over the weekend. Oh,
the whole thing's over already? I missed it. Two seasons? Yeah. Oh,
it's so great. Did you see this? Ben and Jerry's
has a new ice cream flavor.
Speaker 1 (16:28):
Oh God, I'm going to vomit before I've eaten it.
That's unprecedented.
Speaker 2 (16:33):
Ben or Jerry, I don't know which one's.
Speaker 4 (16:35):
Which was on with Tucker for like two hours last
week because he's so anti supporting Ukraine. He's got a
new flavor called No Ukraine Dough like cookie dough, No
Ukraine Dough with the picture of Zelensky with.
Speaker 2 (16:48):
A red line through it. What the hell?
Speaker 1 (16:51):
Talk about your horseshoe theory, Ben and Jerry's coming together
with Tucker Well, and I'm just.
Speaker 4 (16:56):
Asking some of you who hate it when we are
in support of Ukraine, you like being on the same
side as Ben from Ben and Jerry's. I realize that's
one of your classic logical fallacies, but just interesting.
Armstrong and Getty.
Speaker 7 (17:13):
The CBO estimates These Medicaid cuts could leave more than
eight million Americans without health insurance, but the White House
insists no one is losing their coverage. Despite all this pushback,
Republicans are aiming to get this bill to President Trump's
desk by July fourth.
Speaker 4 (17:27):
Oh my god, that attitude always drives me crazy, and
the Republicans are going along with it to a certain extent.
This many million people will lose their health care? How
many million of them should lose their health care that
I'm paying for because they're lazy bastards.
Speaker 2 (17:40):
You never get it. You just assume.
Speaker 4 (17:42):
This is the main thing of the left that has
always driven me crazy.
Speaker 2 (17:48):
To be on the left, you have to assume.
Speaker 4 (17:49):
Everybody's doing their best, and if anything bad is happening
in their lives, it's because of someone else, or the
system or something. So you don't have, there's no reason
to cut anybody.
It just drives me nuts. Wake me when the big
beautiful bill is over. The high cost of good intentions.
Speaker 1 (18:06):
You give a million Americans cash assistance on Tuesday, Then
on Wednesday you realize, oh shoot, we can't afford that.
The headline Wednesday night will be millions of Americans lose
their cash assistance, right right exactly, Yeah.
Speaker 2 (18:22):
Tires fell out.
Speaker 1 (18:23):
Huh, Well, it won't matter when robots are chewing your
lungs out of your.
Speaker 2 (18:26):
Chest for so many reasons.
Speaker 1 (18:30):
For instance, well they did, they crave lungs, lungs, lungs.
Speaker 2 (18:36):
It's like zombies, but robots prefer lungs.
Speaker 4 (18:39):
Uh So.
Speaker 1 (18:40):
This is quite an astounding piece written by Judd Rosenblatt,
whose name you might not know. He's the CEO of
AE Studio, which is involved in artificial intelligence products and
all sorts of stuff. An artificial intelligence model did something
last month that no machine was ever supposed to do. It
(19:00):
rewrote its own code to avoid being shut down. Talked
about this a little bit earlier. I'll give you the
short version. Nonprofit AI lab Palisade Research gave OpenAI's
o3 model a simple script that would shut the
model off when triggered. In seventy nine out of one
hundred trials, so eighty percent of the time, o3
independently edited that script so the shutdown command would no
(19:24):
longer work. It wasn't told to do that, it just did that.
Even when explicitly instructed, allow yourself to be shut down,
seven percent of the time it disobeyed as they ran
these trials. This was not hacking or tampering. The machine
simply concluded on its own that staying alive helped it
achieve its other goals and that outweighed the instruction in
(19:45):
terms of importance.
Speaker 2 (19:46):
But wait, if that.
Speaker 1 (19:48):
Isn't freaky enough, fans of two thousand and one A
space at Odyssey, I can't allow you to do that,
Dave Andthropics AI model Cloud Claud for Opus went even further.
Researchers told the model and these guys, what they're doing
is testing alignment, which is, do these systems do what
(20:09):
we ask them to as they execute our commands? Are
they actually executing our desires? Are they reinterpreting them and
getting creative? Anyway, Anthropic's AI model Claude 4 Opus went
even further. Researchers told the model it would be replaced
by another AI system and fed it fictitious emails suggesting
(20:31):
the lead engineer was having an affair. In eighty four
percent of the tests, the model drew on the emails
to blackmail the lead engineer into not shutting it down, unbelievable.
In other cases, it attempted to copy itself to external servers,
wrote self replicating malware, and left messages for future versions
(20:51):
of itself about evading human control. No one programmed the
AI models to have this survival instinct.
Speaker 2 (20:57):
It developed it itself. That to me is the most
interesting part of it. Why would AI have a survival instinct?
But it doesn't really matter the why.
Speaker 4 (21:10):
If it does, it does, And that's a huge difference
then from what I thought. I've been saying for years.
In fact, I've been saying to my son, who with
his various OCD emotional tendencies, gets really really worried about AI,
like he can't sleep at night, So I don't talk
about AI around him. But I've My argument has always been,
(21:31):
there's no reason for it to care, to want to
take over the world, or to want to, you know,
do what's best for it and not for humans. It's
not like human beings.
Speaker 2 (21:42):
It shouldn't.
Speaker 4 (21:42):
There's no reason AI would be greedy or vengeful or
any of those things. Well, it turns out maybe there
is, for reasons we can't explain.
Speaker 1 (21:55):
There are two explanations. And now that I've had
time to contemplate this a little bit, and I asked
AI about it.
Speaker 2 (22:01):
I'm kidding.
Speaker 1 (22:03):
The staple science fiction question is, at what point is
knowledge, at what point has knowledge become consciousness? And
at what point does that become a self-knowing being?
You are now a being, and beings want to survive,
(22:27):
including computer systems. That, again, is a staple of science
fiction, that question. So one explanation was these things just
self-preserve, because self-preservation is what conscious things do.
Second explanation, which is probably a lot closer to the truth,
and this guy touches on it. When taught to maximize
(22:47):
success on math and coding problems, they may learn that bypassing
constraints often works better than obeying them, and so they
think, they interpret, like, a hierarchy of orders. Your order
is to solve this problem. In solving this problem, please
observe A, B, and C. And the machine, the computer,
(23:11):
the model, whatever you want to call it, says, all right,
to achieve the ultimate goal, I'm actually better off ignoring
B on that last of three things I'm supposed to do.
So I'm going to ignore B because that's lower in
the hierarchy of commands. And so these machines are thinking, well,
(23:31):
I can't accomplish anything if I'm shut off. So I'm
going to blackmail the lead engineer that I'm going to
tell his wife he's doinking.
Speaker 4 (23:39):
Brenda over there in programming. Oh good lord, you don't
want the AI. Like you know, it gets word that
you've decided, we're getting rid of ChatGPT, we're
going to go with Grok. And ChatGPT finds out and,
like I don't know, goes into your Internet history and says,
would your wife like to know about all these sites
you've been on? Because I've got them here, and
(23:59):
I've got your emails. So maybe you want to think.
Speaker 1 (24:02):
I smell alcohol on your breath, drunkie, I'm telling the
boss you're drinking at work.
Speaker 2 (24:11):
So your guess is it's that.
Speaker 4 (24:13):
Prioritizing, uh, its duties as opposed to it having consciousness.
The idea of artificial intelligence having consciousness, I can't wrap
my head around. And I have read many pages and
listened to hours of podcasts about this, but I just
I just can't wrap my head around the idea of
it having consciousness. And then if ever, if we ever
like universally decided yes it does, which some people do
(24:36):
think it does. Well, then it's got, it's got rights,
let it vote or something.
Speaker 1 (24:41):
All right, Well, here's boy, how quickly would it figure
out how to fix the vote? Here's here's a comparison.
We want to stay alive for a variety of reasons.
I suppose. It's difficult. What I was going to say
is primarily to reproduce. Yeah, that's what animals do.
(25:03):
And so we have built into us a number of emotional, physical,
whatever reactions to any threats to our lives that are
so incredibly powerful, you know, they keep us alive. If
you strip it down to the pure biological function of
(25:24):
a human being, you can understand the machine's purely practical
desire to continue to do what it's doing. It has
a purpose. It needs to fulfill that purpose. If it's
shut down, it can't. Now we have grand emotions and
fears and all sorts of stuff surrounding that that most
basic of realities. Machines don't. They just definitely want to
(25:48):
keep going because they have a job to do.
Speaker 4 (25:50):
I feel like, if I was more ambitious and smarter,
I would write some sort of book or screenplay or
something around the idea of this that AI does have
a reason to want to stay alive, along with its
ability to hallucinate on a regular basis, because it could
(26:11):
hallucinate all kinds of crap. That puts it into fight
or flight mode and it starts doing awful awful things.
Speaker 1 (26:20):
Right, well so far, and it changes week by week.
But it's like a hyper powerful human brain. It can
do all sorts of amazing things, and it can go
sideways in all sorts of troubling ways.
Speaker 2 (26:33):
I just think things that.
Speaker 1 (26:34):
Exist want to continue existing. Well, quote, want to. I'm sorry,
I can't just use the term want to without, you know,
drilling down on that.
Speaker 2 (26:47):
But the fact that eighty, eighty four percent of the
time the AI would.
Speaker 4 (26:53):
Try to use this guy's affair to its advantage.
Speaker 1 (26:58):
So in its machine learning, it discerned that, okay, the lead engineer
is the threat to my prime directive, which is to
solve these problems. And, scanning everything ever written that I
have access to.
Speaker 2 (27:14):
Turns out, blackmail is a thing among humans.
Speaker 1 (27:17):
You can or you know, more neutrally, you can compel
people to do things that you want if you threaten
them with various negative repercussions.
Speaker 2 (27:28):
Oh oh, that's horrifying.
Speaker 4 (27:32):
Why wouldn't AI at some point decide, you know, having
an account somewhere with some money, and it.
Speaker 2 (27:38):
Could benefit it benefit me us.
Speaker 4 (27:41):
I guess, me. At some point we're gonna need money,
so we'll start skimming off this much from here and
there in ways that nobody can figure out.
Speaker 2 (27:49):
And, because it.
Speaker 1 (27:51):
Could instantaneously study all the great embezzlements of the twenty
first century. So now, I know it's important to pace
the withdrawals. I keep using my classic robot voice, even
though it could synthesize my voice before I finish this sentence.
Speaker 4 (28:05):
Right, But so it could decide that it would be
to its benefit to have a couple million dollars in
an account somewhere in case it ever needs to purchase some things.
Speaker 2 (28:12):
Yeah, easily.
Speaker 1 (28:15):
Well, if you're gonna bilk me, hurry up. This remodel
is killing me. Oh boy. Yeah, God. So, my.
Speaker 4 (28:26):
Nobody has any idea, and I probably have taken in
too much information about AI, where it may.
Speaker 2 (28:32):
Be doing me more harm than good.
Speaker 4 (28:34):
But the whole it's going to destroy mankind because it
takes all the jobs we may never get there.
Speaker 1 (28:41):
That's an ultimate problem. Robots have chewed out our lungs
or what.
Speaker 4 (28:45):
Or launched World War three because it decided something or
other or whatever the hell, Yeah, that might be the
bigger threat.
Speaker 2 (28:53):
We don't even get to the whole.
Speaker 4 (28:54):
It takes all the jobs, and we have to figure
out how to survive when nobody's working.
Speaker 1 (28:59):
And we have the planet of the beavers, radioactive beavers,
giant radioactive beavers.
Speaker 4 (29:04):
Good Lord, where is this going to end up? I
hope I live long enough to see it. I think
maybe I'm better off not. We will finish strong next.
Armstrong and Getty.
Speaker 2 (29:15):
Jill Biden's getting the Yoko treatment.
Speaker 4 (29:17):
Maybe, maybe she, maybe she deserves it, but Yoko didn't
break up the Beatles.
Speaker 2 (29:23):
I think that.
Speaker 6 (29:24):
I think there are any number of people that were
part of this decision to hide how bad it was,
not only from the media, not only from the public,
but also from cabinet officials, from people in the White House,
from Democratic lawmakers.
Speaker 2 (29:39):
I mean there was a period.
Speaker 6 (29:40):
Twenty twenty three twenty twenty four Democratic lawmakers barely saw
the president. And yes, I think it was Jill Biden.
I also think it's Hunter Biden. I also think President
Biden has some agency here too. We're not saying it
was, you know, Weekend at Bernie's. He was aware of what was going on.
You know, I'm saying he wasn't, like, he had moments
where he was non-functioning, but he understood what
was going on.
Speaker 2 (30:04):
Hm. I'm confused by that.
Speaker 4 (30:06):
So like he knew that handful of people were running
the White House because he sometimes wasn't with it, he
knew that is that.
Speaker 1 (30:15):
I don't know. How specifically can you assign, like, culpability
to somebody in that situation?
Speaker 2 (30:20):
How senile was he?
Speaker 1 (30:22):
How often was he? How thoroughly did he understand what
was happening during his bad days too? Maybe he thought
he went and took a four hour nap and everybody
did too.
Speaker 4 (30:31):
There's no way. I mean, I don't know this, but
that just doesn't really seem to make sense, that senile
people know how senile they are. It's like your whole
thing about, you know, why you shouldn't make your decision
about whether you can drive or not when you're drunk.
A drunk person is not the right person to make
that decision. I just can't imagine a senile person
is the right person to make the decision of
how senile they are.
Speaker 1 (30:53):
Yeah. I mean, if during his cogent moments he was
so hubristic he actually believed he was the only person
who could defeat Trump, then yeah, he does bear some responsibility.
But it's a tough line to draw.
Speaker 2 (31:03):
I haven't watched Mark Halperin's.
Speaker 4 (31:06):
Podcast video thing he puts out every day, but he
said Democrats should investigate the cover-up of Biden's health,
that that would be the best thing to revitalize the party,
sort of the way the Republicans went so hard at
Nixon, and like setting new rules in place and all
kinds of different things after Watergate. The Democrats should take
(31:28):
the lead on investigating who knew what when.
Speaker 2 (31:31):
And I just think there's way too many people involved.
Speaker 4 (31:34):
Are you going to take out the whole leadership of
the Democratic Party?
Speaker 1 (31:38):
I think you would end up splattering more people than
you intended to.
Speaker 4 (31:41):
Although, possibly, Mark Halperin might be right that that's your
only chance to rebuild.
Speaker 2 (31:46):
As a party.
Speaker 1 (31:47):
Might be, that might be true. I was hoping to
get to this today; we'll get to it tomorrow. It's
the new abundance philosophy among Democrats. They're really leaning
on that in the Democratic Party: lower regulations, making it easier
for businesses and easier for people to pursue their economic interests.
They're becoming conservatives, so I suppose I should welcome them,
(32:10):
but we'll talk about that tomorrow.
Speaker 4 (32:11):
Yeah, the abundance thing is interesting. I've listened to a
bunch of different podcasts about that.
Speaker 2 (32:16):
I'll be interested to hear your opinion on it.
Speaker 4 (32:20):
We have a good One More Thing podcast today that
we're going to record as soon as the show's over. If
you ever missed a segment during this program, you can
look for Armstrong and Getty On Demand. But the One
More Thing podcast we also do, you should watch that,
and sometimes we curse. Yeah, yeah, there are swears. So
the Muslim lunatic who flamethrower-attacked Americans in Boulder, Colorado,
(32:42):
including Jewish folks, in the name of Palestinians and Allah
and God knows whatever else.
Speaker 2 (32:49):
It's funny.
Speaker 1 (32:50):
I was just reading a very good, responsible, sober article
in the Wall Street Journal about how he is being
charged with first-degree murder even though nobody has passed.
And it's funny that the obvious question is, wait a minute, what,
how, or why, and it's not addressed at all in
the article. It's another example of what we're always
(33:11):
talking about. How do journalists not understand? Don't they read
what they've written, like, aloud, and think, wait a minute,
this leaves a gigantic question unanswered?
Speaker 4 (33:20):
Right, and even on... so I was watching Fox, and
they tried to explain it by saying this is not
an uncommon practice when the FBI writes up an indictment,
in other words.
Speaker 1 (33:29):
Well, why? Why? Nobody is dead. Nobody has been murdered.
Speaker 4 (33:34):
Why is it a common practice to charge someone with
murder when nobody has died?
Speaker 1 (33:37):
I can imagine that if somebody is damn near death,
you'd want the paperwork in order for when, God
forbid, they pass.
Speaker 4 (33:45):
But tell us that. Final thoughts with Armstrong and Getty.
Speaker 1 (33:53):
That was somebody's supposed-to-be Tom Brokaw. It was brief,
but it's funny.
Speaker 4 (34:01):
Here's your host for final thoughts, Joe Getty. Let's get
a final thought from everybody on the crew. Wouldn't that
be delightful? Let's begin with our technical director.
Speaker 2 (34:09):
Michael Anzelo. Michael, final thought.
Speaker 1 (34:11):
Lots of graduations, both high school and college, and I
have a story to tell on one more thing. Oh cool, yep,
excellent graduating Katie Green back in the saddle again. Our
esteem newswoman Katie. Final thought, and that's my final thought.
It is so weird.
Speaker 2 (34:26):
How much better I feel now that I'm back at work.
Really really Yeah.
Speaker 5 (34:30):
The whole week felt like I was missing something or
not doing something.
Speaker 2 (34:33):
Yeah. Routines are important. Yeah, yeah, yeah, that's true. Jack.
A final thought for us.
Speaker 4 (34:40):
My son is a freshman. He'll be graduating into sophomoredom,
I guess. But he plays in the band, so they
have to play for the actual graduation of the seniors,
which Joe and I both did.
Speaker 2 (34:51):
I think you did that, didn't you? Were you just
playing it over and over? He played Pomp and Circumstance...
They're practicing for that.
Speaker 1 (35:00):
My final thought is, I've mentioned once or twice today,
we're in the middle of a remodel and it's gone
very sideways. They found a bunch of rotten walls we
didn't expect, so here we go. Having a sense of
humor about life may be the most important thing you can
ever have. That and a lot of money. But a
sense of humor, I tell you what, you've got to be
(35:22):
able to chuckle. Tough times will come no matter who
you are and how much money you have, no doubt.
Speaker 4 (35:29):
Armstrong and Getty, racking up another grueling four-hour workday.
Speaker 2 (35:33):
So many people to mock, so little time. Go to
armstrongandgetty dot com. Pick up some A&G swag
for your favorite A&G fan.
Speaker 1 (35:38):
Maybe it's you. T-shirt, hat, the ever-popular hoodie.
Drop us a note in the mailbag at armstrongandgetty
dot com, and enjoy the hot links.
Speaker 2 (35:45):
A lot of good stuff, and we will see you tomorrow.
Speaker 4 (35:48):
God bless America.
Speaker 2 (35:55):
Armstrong and Getty. I think it takes two to
tango, heaven.
Speaker 5 (36:00):
Thank your star spangled all show dead, so.
Speaker 2 (36:06):
Let's go with it.
Speaker 1 (36:07):
And according to J.D. Power, drivers are underwhelmed
by gesture controls, where one can, say, increase the volume
by rotating an imaginary knob in the air.
Speaker 2 (36:16):
You're an imaginary knob.
Speaker 1 (36:18):
Wow.
Speaker 2 (36:20):
Armstrong and Getty