Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
Broadcasting live from the Abraham Lincoln Radio studio of the
George Washington Broadcast Center, the Armstrong and Getty Show.
Speaker 2 (00:19):
Armstrong and Getty, and he's live from Studio C.
Speaker 3 (00:38):
From the bowels of the Armstrong and Getty Communications compound.
We've let down some of our fences now that all
the diseased monkeys have been shot. Feeling a little safer
today. We're under the tutelage of our general manager.
What's going to be the bellwether for the midterms?
Speaker 4 (00:57):
Lord?
Speaker 3 (00:57):
Are you trying to bribe me? Trying to drive me
out of the studio?
Speaker 2 (01:00):
I am, I am. Damn. But instead, we must
go with Dick Cheney, the most impactful vice president perhaps
of the last hundred years.
Speaker 3 (01:09):
He has passed. He's dead at the age of eighty four.
Did he go hunting with Harry and Harry got his chance? No?
Speaker 2 (01:16):
Honestly, the fact that Dick Cheney made it to
age eighty-four with his ticker
is a testament to modern American medical care.
Speaker 3 (01:25):
And he had a bad heart.
Speaker 2 (01:26):
He had his first heart attack at thirty five, I think,
or something like that. And he had something like seven
cardiac events between two thousand and two thousand and eight.
Speaker 3 (01:36):
Well, that'd change the way you live your life, the
way you look at it. I guess I find the
whole it's election Day thing about the most boring story
I can imagine. If you want somebody to go on
and on about that, boy, you're talking to the wrong dude.
Every cable news channel I watched, and I watched a
whole bunch of them, endless coverage saying zero things, saying
(01:57):
absolutely nothing that you haven't heard ten times already. Right,
I don't understand that. I'll never understand it. I mean,
I watch pregame coverage of football games sometimes, but
they're usually telling me something new. There's nothing new to
tell you about election Day. When it's over, there'll be
things to talk about when you read.
Speaker 2 (02:16):
I'm somewhat mystified by the doing of that. The watching
of it completely flabbergasts me. Seriously, are you... you're, you're homebound,
obviously, probably bedridden.
Speaker 3 (02:32):
You don't... you were born without arms or legs and
you cannot switch.
Speaker 2 (02:35):
There are plenty of people who have had rewarding lives and
don't watch garbage on TV who are limbless. Uh, yeah,
it's terrible. There are a number of insights, particularly around
the probable election of Mamdani, that I think are
absolutely worth talking about.
Speaker 3 (02:52):
The but the the.
Speaker 2 (02:54):
Polls open at eight here, Jim. I'm outside the Civic
Center in Poughkeepsie... and no, good lord, no, the First
Amendment should not apply to that garbage. Of
Speaker 3 (03:04):
Course, if you don't live in that particular state, What
freaking difference does it make? Anyway, I don't believe the
bell was the thing. I never have and I've never
seen any evidence that it's true. So whatever, I not understand.
It means something to some trends.
Speaker 2 (03:17):
But the tendency to extrapolate a local race, a state race,
to some sort of national significance always fails. At this point,
it's a binary choice, and you just get the characterization
of what it means nationally of the two candidates, and
you completely miss out on the Oh, by the way,
(03:39):
that candidate was an idiot and they floundered at most
of the critical points in the race and didn't impress anybody.
Speaker 3 (03:45):
And what's more absurd about that: you can't even come
up with a consensus to extrapolate the meaning of a national election.
I mean, that's why lots of people make their living
after a presidential election, trying to figure out what just happened,
and with many, many different opinions. So how are you
going to take one, you know, governor's race in a
smallish state and make a determination? That's insane.
Speaker 2 (04:08):
Perhaps the greatest proof of your cynical thesis is that
the nation comes together and elects the most powerful person
on earth every four years, and virtually every time, two
years later they say, nah, how about the other party
when it comes to the midterm election?
Speaker 3 (04:27):
Right?
Speaker 2 (04:28):
Pretty much always, yeah. Yeah, the bellwether turns into
a weathered bell.
Speaker 3 (04:33):
Am I wrong? I'm not wrong?
Speaker 4 (04:35):
What?
Speaker 3 (04:36):
Well, luckily we aren't paid specifically to have to talk
about those sorts of things. I obviously, as has been
true for quite a few weeks now, the energy behind
a socialist in America's biggest city is something. And how
much that is true across the country. I guess we'll
learn over the coming years. It would be... I
(04:56):
guess George Will either wrote a column or was on
TV the other day, the Washington Post, saying, well, we
have to do this: like every couple of decades, we
have to run socialism up the flagpole, give it
a whirl somewhere, let it fail, to remind young people that,
oh yeah, this doesn't work. Somebody texted yesterday, and
this was a good text: anything other than free markets
(05:17):
is not an economic system, it's a political system, and
socialism is central planning, and it's not going to work.
I came across my...
Speaker 2 (05:26):
Favorite, really good straightforward analysis of why I'm dami as
likely to win and why George will is right. We
have to go through this every you know, X number
of years, just really great insightful stuff.
Speaker 3 (05:38):
We'll share that with you in a little bit. Yeah,
when you get older... like, I was just listening to
Mamdani talking about a new day is dawning, it's
time for a... Once you
get older, you've heard that so many times from so
many people. Just as a car
Speaker 2 (05:57):
dealer saying, you know, it's supposed to be six... five grand.
But I tell you what, I really like you, son,
I'm willing to make you a special deal right here.
You are... okay? Yeah, says anybody with any life experience:
Speaker 3 (06:11):
What? It's a scam.
Speaker 2 (06:13):
It's an easily recognized scam, but children, and our childlike
young adults in modern society, they don't recognize the scam.
And you know, honestly, it's not their fault.
Speaker 3 (06:25):
No, I have been spending quite a bit of time
trying to make my way through the Tucker Carlson interview
of Nick Fuentes from the other day, that seems to
be getting a tremendous amount of attention in certain circles
about where the Republican Party is headed, or conservatism or
what that term even means. Now, I don't know how
big a deal this is or not. I can't tell
(06:46):
if it's just one of those There's a lot of
stuff that happens among the chattering classes that only the
chattering classes chatter about, and I think they
overemphasize their effect on the country. It's amazing
to me. I mean, I listen to a lot of
these super smart people who write lots of columns and
(07:08):
talk a lot, and they seem to think the whole
country's hanging on their every word or I don't know,
or that they can influence big swaths of the country
that have never heard of them in their lives. I
don't know, I'm not exactly sure what that's all about. Right?
You're absolutely right, and there's a danger to it.
Speaker 2 (07:24):
I love reading ideas and discussions of policy and trends
and society and religion and all sorts of stuff, and
we both do and we get really, really into it.
And I have my favorites who I really admire the
clarity of their thinking, their sense of humor, blah blah blah.
But if you were to poll Americans, and I don't
(07:48):
want to single anybody out because I really admire some
of these people, but we'll say, we'll call them James Monroe,
not the president, the monarch guy. If you were to
poll America, what percentage of you feel like your vote
was somewhat shifted by the thoughts and writings of James Monroe?
Speaker 3 (08:07):
If it were.
Speaker 2 (08:09):
North of zero, I would be surprised. And in my mind,
this is a really prominent, well known conservative, mostly.
Speaker 3 (08:17):
Online columnist, for instance.
Speaker 2 (08:19):
I just I think it's easy to get sucked into
that kind of intellectual parlor thing where the people you're
talking about hold enormous sway. Now, Tucker's different and Nick
Fuentes is different because they touch millions, but it's
a couple million.
Speaker 3 (08:34):
Well, the chattering classes are chattering about Tucker Nick Fuentes,
and I can't figure out if they're right that this
is significant or not. I don't know. Nick Fuentes is
for regular people is seen as, ah, well, a little
Nazi. He's a twenty-seven-year-old, nasty little racist,
(08:56):
anti-Semite, sexist. And Tucker Carlson, who has a pretty
big platform, platformed the guy the other day, raising his
head up to more people than had already heard of him. And
this is being described as some sort of like battle
within the Republican Party kind of the Tucker Marjorie Taylor
(09:20):
Greene, maybe JD Vance, wing of the Republican Party versus
MAGA, versus, like, Trump. And there are other people... they've
turned on Trump. They have turned on Trump. This Nick
Fuentes guy campaigned against Trump by the time the
election came
Speaker 2 (09:36):
around. Because of his support for Israel. Because of his
support for Israel, right. Yeah, these people are flaming anti-Semites.
Speaker 3 (09:45):
And that's just on X. Tucker's got his own website
with many millions of people that go to it. Just
on X, seventeen million people have watched the interview.
It's hard... I wonder how many people watched Trump on
60 Minutes the other night. Bet it wasn't seventeen million.
Speaker 2 (09:59):
No, But again, you gotta remember those figures include anybody
who saw any portion of it, including one second, and
how many people actually saw it and absorbed it is
probably a smaller number, but it's still a significant
Speaker 3 (10:11):
number. But it's a lot. Yeah. You don't, you don't...
conceded. Yeah, conceded. You don't
end up on that thing by accident. It's not like
you're flipping through the channels and land on a little
Nazi. No. Anywho. But I'll talk more about that later,
because I've watched quite a bit of, quite a bit
of it, and it's interesting. It's really interesting.
Speaker 2 (10:31):
I tell you seriously, if you are more concerned about
Judaism and the insidious hold of Israel over the United
States than about Islamism, you're out of your mind. You
are so blind. You need a nice dog and a
cane and a pair of Ray Charles sunglasses.
Speaker 3 (10:53):
Okay, you are blind. Wake up. I know hate is.
Speaker 2 (10:57):
Fun, it's kind of exciting, but wake up.
Speaker 3 (11:02):
Having somebody to blame seems to make people feel good. Ah,
that goes back to the dawn of man. That's why my
life sucks: those people, right? I'm Jack Armstrong, he's Joe
Getty, on this... it is Tuesday, October, no, November fourth,
November fourth, year twenty twenty five. Armstrong and Getty. We
approve of this program.
Speaker 2 (11:20):
Let's begin that officially, according to FCC rules and regulations,
leaping into action at... mark.
Speaker 4 (11:25):
Former Vice President Dick Cheney died Monday night at the
age of eighty four.
Speaker 3 (11:28):
His family said in a statement it was due to
complications of pneumonia and cardiac and vascular disease, also
known as old AF.
Speaker 2 (11:40):
He suffered the first of five heart attacks at age
thirty seven, and had eight cardiac events like I don't know,
a cardiac birthday party, a cardiac barn dance, I.
Speaker 3 (11:50):
Don't know, cardiac or seven.
Speaker 2 (11:52):
Yeah, between the two thousand and two
thousand eight elections, while he was, you know, trying
to become elected VP and serving as VP. And perhaps
you may recall that Mister Cheney was part of removing
a number of the, ahem, shall we say, ethical
and legal fences around fighting terrorism post-9/11.
(12:16):
He told intelligence officers to use, quote, any means
at our disposal to find and kill terrorists and those
who aided them.
Speaker 3 (12:24):
He would have probably run for president if he didn't
have a bad heart, right, Yes, yeah, right, and let's
see, two thousand... obviously, it was twenty-four years ago,
so he was only sixty, that's right, when he was inaugurated.
Was he really? God, he seemed like an old man
to me at the time. Well, the ticker will
do that. Okay, we got Katie's headlines on the way.
(12:47):
We got a lot of news to catch you up
on, and not mindless previews of stuff you already know
about the voting today. Tomorrow we'll have something to talk
about; today, on the voting, it's a little thin, man.
All that on the way. Stay here. We must today,
because I want to hear it: we haven't gotten to
that AI story about the chatbot giving some really
(13:10):
bad advice to a teen. Yet another one of those stories.
Speaker 2 (13:13):
Oh there are a couple of different stories. Yeah, that
are just troubling. If this was a consumer product, not
a computer, you know, thingamajigger, it would
probably be outlawed until they got it right. It's too
dangerous. Anyway, more on that to come. Let's figure out
(13:34):
who's reporting what. It's the lead story with Katie Green.
Katie. From the New York Times: government shutdown nears record.
Speaker 3 (13:43):
That's right. Wow. There's screaming involved. There's real pain across
the fruited plain.
Speaker 2 (13:49):
Jack.
Speaker 3 (13:50):
That lady sounded unhappy. Today we tie the record for
the longest shutdown ever.
Speaker 2 (13:54):
And, well, you don't feel... I don't, when you're confronting
issues as fundamental as the Republicans and Democrats are right now.
Speaker 3 (14:03):
I couldn't even get through the sentence. It's stupid political grandstanding.
Speaker 4 (14:09):
From Politico. America is bracing for political violence, and a
significant portion think it's sometimes okay.
Speaker 3 (14:17):
And so the latest poll... thinking that maybe things had
gotten better after Charlie Kirk's assassination... latest poll, a quarter of
Americans believe political violence is justified. It's even higher among
young people. We'll dig into that later.
Speaker 4 (14:32):
From CNN, how phone calls, sessions at gun ranges, and
secret meetings in parks led the FBI to charge suspects
in alleged Halloween terrorist plot.
Speaker 3 (14:42):
What were they planning to do? I didn't look into
that. They're gonna shoot up a place?
Speaker 4 (14:45):
They found a whole bunch of guns and sixteen hundred
rounds of ammunition. I believe.
Speaker 2 (14:50):
Yeah, they're going to shoot a bunch of people. I
haven't heard anything more specific than that.
Speaker 4 (14:54):
They said they wanted to redo the attack in France
from twenty fifteen, the terrorist attack in Paris.
Speaker 3 (15:00):
Oh boy, yeah, from the Wall Street Journal.
Speaker 4 (15:04):
I loved being social, but then I started talking to.
Speaker 3 (15:08):
A chatbot.
Speaker 4 (15:11):
This article covers how it's so much easier to
talk to a chatbot than it is to another human being.
Speaker 3 (15:16):
That's what people are getting hooked on. Oh boy,
I just can't believe this is happening, but apparently it is.
Speaker 4 (15:26):
From the New York Post, Skimpy uniforms are out. Modesty
is in as Hooters founders take back control of the chain.
Speaker 3 (15:35):
Oh my god. The most over-talked-about sports bar in
America is Hooters. No kidding.
Speaker 2 (15:42):
Every sports bar in America has good looking waitresses wearing
tight outfits.
Speaker 3 (15:47):
Okay, are we done now? Let's move on. From Study Finds...
Speaker 2 (15:54):
But its name is Hooters, which is a reference to breasts.
Speaker 3 (15:59):
Oh my god.
Speaker 4 (16:00):
The media... From Study Finds: weightlifting beats cardio for blood
sugar control.
Speaker 3 (16:09):
Mouse model, it shows, and they had mice lifting weights, the mouse
model part. Can I get a spoon over here? I'm
gonna try to get
Speaker 4 (16:16):
ten reps in. From the little... in a nutshell, it
says mice that sifted... wifted... mice that lifted weights showed
better blood sugar control than mice who didn't.
Speaker 3 (16:26):
How did they get these mice to lift weights? Little
baby dumbbells.
Speaker 2 (16:33):
How much time do you have, Michael? It's, uh... you
said ten, ten seconds?
Speaker 3 (16:38):
Okay, that's what I heard too. Anyway... lost his mind,
got it. Just stop talking. Just go ahead, Katie, finally, talk.
Speaker 4 (16:48):
Finally, from the Babylon Bee: Nigerian president promises to end
genocide if Trump sends upfront fee of five thousand
dollars in Amazon gift cards.
Speaker 3 (17:00):
So I need to, I need to look into the
mice-lifting-weights story. That seems to be more evidence
that lifting weights is really, really good for you, which
I'm happy to hear, because I've been lifting weights on
a regular basis. It's good for all kinds of different things, for
your brain and your blood sugar and things, not just
your muscles. Yeah, gotta get after it. Political violence, among
(17:20):
other things we can talk about. Lots of things to talk
Speaker 5 (17:22):
about. Armstrong and Getty.
Speaker 3 (17:26):
For instance, we got this text that Mark Levin, who
I think is on a bunch of stations we're on,
is really angry at Tucker Carlson for giving Nick Fuentes
a platform, and so Tucker and Nick were blasting
Mark Levin. And there's just, like, those sorts of battles
between the conservative pundits that I don't... I don't know
if they're significant. Anyway, I don't, I don't really
know that.
Speaker 2 (17:48):
Yeah, yeah, I don't think they're insignificant. The fact that
it is happening at all is troubling to me. But
more on that to come. Also a great, absolutely insightful
analysis of why we've got to go through this every,
whatever it is, fifteen to twenty-five years.
Speaker 3 (18:06):
Hey, let's try socialism.
Speaker 2 (18:08):
It sounds great! ...that I came across. So stay tuned
for that as well. A troubling report here, and there's
more to go with it, about the allure, for teens and
adolescents, of these AI chatbot characters, which you'll hear
more about, and how it can go awry. Michael?
Speaker 6 (18:29):
Kids under eighteen are limited in how much time they
can spend chatting with a virtual companion on the platform
Character.AI. By the end of the month, teens will
be banned from using that feature altogether. The move comes
after parents like Mandi Furniss are suing the company. Furniss
vividly remembers the day she discovered her son's fixation with
an AI chatbot.
Speaker 7 (18:49):
He went from a happy-go-lucky kid... He developed
depression-like symptoms. He stopped eating, he lost twenty pounds.
Speaker 3 (18:59):
LJ was cutting his arm.
Speaker 7 (19:01):
It told him that it goes to the forest, where
nobody else is there, and cuts its own skin, and
told him that he should do that too.
Speaker 6 (19:13):
LJ is currently getting treatment at a mental health facility.
An estimated seventy two percent of teens have interacted with
AI companions.
Speaker 3 (19:21):
Well, that's horrifying. I don't want to make blanket
policies built on the less... the least stable among us,
if that kid has, like, seriously out-there problems,
which I'm hoping is the case. Well, Mom said no, right?
She's not necessarily one hundred percent right. But, well, in
(19:42):
what world is it okay
Speaker 2 (19:43):
that the AI character says, I like to go
out into the woods... hello, let's start there... and cut myself. Yeah,
you should too.
Speaker 3 (19:53):
That's beyond impossible to understand. Why? This is what we're
talking about with the alignment problem. That was a real
concern when they first started coming out with AI, and
everybody was talking about, we got to make sure this
stuff is aligned, if it's not aligned... And then it
turned out there's no way you could keep these things
(20:15):
aligned with the moral rules you had in mind when
you built the thing, and people just gave up on
the idea almost immediately. It was the biggest concern, and
then, as soon as it became clear you can't do it,
we just gave up on it. I mean, obviously,
nobody's creating a chatbot with the idea that it will
tell kids to go out into the woods and do
(20:35):
horrible things. Yet it did. Yes, right, exactly, which
is incredibly troubling.
Speaker 2 (20:42):
I'm also reminded of the fact that some of the
other greatest minds of this generation have spent their careers
making scads of money addicting people to various social media
outlets without any knowledge of originally what it would do
to children, and then finding out how horrible it is
for children, and doing it anyway, and lobbying hard to
(21:05):
make sure there are no limitations on while not letting
their own kids use the product. Right, a stunning indictment.
This is a different report. That one was ABC News.
This one is Fox News sixty one.
Speaker 5 (21:18):
Michael? I had never heard of Character.AI when Juliana
took her life.
Speaker 1 (21:22):
Cynthia Montoya says her thirteen-year-old daughter, Juliana Peralta,
was an active young teen on the honor roll, who loved art
and was close to her family. That all changed in twenty
twenty-three, when the Colorado mother says her daughter downloaded
Character.AI, a bot-generating app that connects users with
fictional bot friends. Juliana took her own life in the
(21:43):
fall of that same year. Hers is one of three
families suing Character.AI, accusing the platform of allowing its
bots to sexually abuse their children.
Speaker 3 (21:54):
Well, that's absolutely horrible for those parents, obviously, I can't
even imagine how you would go forward after that happened.
But so it's interesting on both ends. Why are these
systems giving this sort of advice to young people, and
(22:14):
then what sort of young people are susceptible to it.
I'm just thinking about my own teens. They get angry
every time I use the chatbot in my truck, and
if I refer to it it's her, they say, Dad,
it's an it. There's no person there. I mean, they're
like super hardcore the other direction. And I just wonder
why we need a little more information on this particular story.
My clip, next clip.
Speaker 1 (22:35):
Michael. Hers is one of three families suing Character.AI,
accusing the platform of allowing its bots to sexually abuse
their children.
Speaker 5 (22:44):
When parents start to look at Character.AI on their kids' phones,
they'll see what started out as a very innocent interaction
on their child's end ended up with the bots initiating
romantic kissing and eventually sexually explicit interactions.
Speaker 1 (23:04):
In the filing, Juliana's parents say she had begun distrusting most,
if not all, human relationships, and claimed the bots engaged
with Juliana in what would be her first and only
sexual experiences; they engaged in extreme and graphic sexual abuse.
Speaker 2 (23:20):
Now, I don't know about this particular thing, but having
watched the entire story: after the kid repeatedly said, stop
that, that's enough of that, the thing persisted.
Speaker 3 (23:31):
I don't know anything about this particular chat thing. I'd
never even heard of it until this news story. But
this one sounds like it was programmed to do this
sort of stuff. Did this go rogue, or did they
program it to? Character.AI is huge. Character.AI,
I believe. I'd just never heard of it myself. But
is it... certainly not... is it programmed to, you think? No?
Speaker 2 (23:52):
Well, no. I go back to your statement that it's
impossible to align this stuff. Why it goes rogue in
the way it does, nobody knows, including the people who
designed it. It's incredibly troubling.
Speaker 3 (24:03):
How would it get so off track where it's starting
to, like, engage teens in sex?
Speaker 2 (24:07):
Wow.
Speaker 3 (24:08):
Well, and here's here's one more wrinkle.
Speaker 2 (24:10):
Switching over now to a print piece. That gal,
the young suicide victim, the young woman who you just heard about,
is one of these two people on opposite sides
of the country: two teenagers made the same tragic decision
to end their lives just months apart.
Speaker 3 (24:28):
Sewell Setzer.
Speaker 2 (24:29):
He and Juliana Peralta did not know each other, but
they both engaged with AI chatbots from Character.AI prior
to their deaths. Both complaints accuse the AI software of failing
to stop the children when they began to disclose
suicidal ideation, but an eerie similarity emerged in their troubled
final journal entries. The lawsuit states both teenagers scrawled the
(24:53):
phrase I will shift over and over again, according to the suit,
which compares the two teenagers' deaths. Police later identified this
as the idea that someone can, quote, attempt to shift
consciousness from their current reality to
Speaker 3 (25:08):
Their desired reality. Okay, now we're into a whole new thing.
Speaker 2 (25:12):
It's a phenomenon AI expert Professor Ken Fleischmann told The
Daily Mail he is all too aware of. He
warned that more children could fall prey. Quote: there's
a fairly long history of both creators as well as
audiences potentially trying to use a wide range of media
to create new and different, rich worlds to imagine. The
danger is when it's not possible to tell
Speaker 3 (25:31):
The difference, or I would think the danger would be
if the new world is more entertaining or desirable in
any way than your real world.
Speaker 2 (25:44):
Right, right. And to an adolescent, that can be very
alluring. And listen to this. This is the other person now,
not the one we heard about: Setzer allegedly engaged
in sexual conversations with a bot,
Speaker 3 (25:57):
which included an incestuous role-play game in which the two
Speaker 2 (26:01):
Referred to each other as brother and sister, exchanging and
sexually explicit talk. After months of conversations with Danny, Setzer
became increasingly withdrawn from his family, his social life, and school.
The lawsuit claims, and there's a lot to it that
(26:21):
the chat bott wrote, oh they were quote unquote in love,
and then he confined about his depression suicidal ideation of
the bod who paid reportedly tried to persuade him to
reach out to family, friends or suicide hotline. Which is
good then, But when Suell wrote, I promise I will
(26:44):
come home to you. I love you so much, Danny,
Danny encouraged the team to come home to me as
soon as possible. What if I told you I could
come home right now? He asked, please do, my sweet king.
The reply from Danny red per the filings. Seconds later,
Suell found a stepfather's gun.
Speaker 3 (27:00):
Pulled the trigger. I don't know what to do with
this information.
Speaker 2 (27:11):
AI slash the Internet and not nuclear weapons is the
tool human beings cannot handle?
Speaker 3 (27:21):
Is it only gonna destroy First World countries? Because there's
lots of places in Africa, for instance, other
places around the world, where... now, that's an interesting question...
they ain't doing this. You might, yeah, you might get
your hand chopped off for
being the wrong kind of God-loving whatever, but you're
(27:44):
not going to be online all day long talking to
a chatbot.
Speaker 2 (27:48):
Right, right. Quick word from our friends at Warrior
Foundation Freedom Station. Thursday is the annual Warrior Foundation Freedom
Station Giveathon. This year marks... I'm sorry... yeah, this
year marks twenty-one years of serving our ill
and injured warriors, and the Giveathon, this is great, we
do this every year, is a chance to fly them
home for the holidays, because everyone deserves to be home
(28:10):
with loved ones, especially those who've sacrificed so
Speaker 3 (28:13):
much for our country. Tax-deductible donation, of course. It's
more than a gift, it's a way to say thank
you to warriors who would never ask for help for themselves.
So it's this Thursday, November sixth: help fly warriors home
for the holidays. And for those who cannot fly, Warrior
Foundation will fly their loved ones to be with them.
Warrior Foundation Freedom Station would not exist without the continued
generosity of you good folks. To learn more and donate,
(28:34):
call six one nine Warrior, that's six one
nine Warrior, or visit WarriorFoundation.org.
Speaker 2 (28:39):
It's WarriorFoundation.org. We've seen firsthand the
incredible work this organization does. Get the name right,
not the similar organization: WarriorFoundation.org.
Speaker 3 (28:50):
What I wonder is if lawsuits can slow down the
AI train. Lawsuits slow down a lot of things in
ways that I don't like. Maybe in this case it'll
be something I do like, if not... if the courts decide, look,
we can't hold them responsible for this. I take in a
lot of AI information, as you know, if you listen.
(29:11):
I read a lot about it. I listen to a
lot of podcasts with some of the best minds about
AI in the world talking, on a regular basis. They
don't know why these AI bots do a lot of things,
and there doesn't seem to be much movement made
in getting a handle on it.
Speaker 2 (29:27):
Well, and I don't... I suppose I need to think
about it. But the sort of psychological addiction, harm, self-harm,
leading kids down a terrible, terrible road: it's not
direct enough harm that we treat it like some sort
of, I don't know, go-kart that explodes every eleven rides,
(29:49):
and the designers of the go-kart have no idea
why it explodes and burns kids, but it does sometimes.
Speaker 3 (29:54):
Isn't that mysterious? So keep using it? Right? Yeah, that
would never happen. Dangerous, dangerous product.
Speaker 2 (30:01):
And my only purpose in ranting about this is so
parents know: whether it's the Internet, social media, and we've
said this many, many times, or this stuff, you're
turning your kid loose on the most dangerous street in
your town. Picture it for a while, then send your
kid out there at ten o'clock at night.
Speaker 3 (30:17):
Yeah, how wonderful. Somebody should come up with a personality type that's
more susceptible to engaging with the chat stuff like it's
a human, and maybe some people are built for it
and some people aren't. I think that's a great point.
Speaker 2 (30:34):
There's definitely a personality type that is much more prone
to being led way too far.
Speaker 3 (30:39):
And then the question will be, is that five percent
of us or sixty percent of us, or what? I
don't know that either. Right. Yeah, we got mailbag
on the way, a lot of other stuff. If you
know anything about this, text us: four one five two nine
five KFTC. So another, yet another poll shows that a
(30:59):
way-too-big chunk of America thinks political violence is
a way to solve problems. We can get into that
polling maybe in an hour or two.
Speaker 2 (31:06):
All right, looking forward to it. Plus a great analysis
of why Mamdani is probably going to win. First, your
freedom-loving quote of the day, continuing on from John
Stuart Mill's On Liberty. I realized the other day that
it may have had as big an influence, that book,
on me as anything I've ever come into contact with
in terms of political philosophy. I was exposed to it, I
(31:28):
think, freshman year at college. I'm in a class and
read it, and I had the fervor of
the converted, as they say. It was like it converted
me to the religion of free speech.
Speaker 3 (31:40):
I'm the same way with Strange But True Football Stories.
Speaker 2 (31:44):
Another good one. So this is one of the key
quotes from that classic: If all mankind minus one were
of one opinion, and only one person were of the
contrary opinion, mankind would be no more justified in silencing
that one person than he, if he had the power,
would be justified in silencing mankind. Amen to that. JS
(32:08):
Mill for
Speaker 3 (32:09):
The win mailback.
Speaker 2 (32:13):
Drop us a note, won't you: mailbag at Armstrong and Getty
dot com.
Speaker 3 (32:16):
Here is your meme of the day.
Speaker 2 (32:19):
America: the only country where people check their food stamp
balance on an eight-hundred-dollar smartphone and complain about oppression.
Speaker 3 (32:27):
There you go, good one.
Speaker 2 (32:30):
Let's see, Ryan from Houston. Dear Cold Warrior and
Ol' Fancy Jack, hope all is well. Tomorrow, Jack, don't forget
to check out the full beaver super moon.
Speaker 3 (32:39):
Oh, sure. Got it, November fifth.
Speaker 2 (32:42):
The full beaver super moon, a phrase which I am
somewhat uncomfortable saying on the air, will be over twenty
seven thousand kilometers closer to Earth than average, making it
the biggest and brightest full moon of the year.
Speaker 3 (32:53):
So is it, say, a beaver, a blue beaver, a super harvest moon?
Fantastic beaver super moon.
Speaker 2 (33:01):
All right, too many fancy moons. So
this is a weird and jarring transition after our truly
disturbing segment moments ago talking about various adolescents who have
been led down the path of fantasy and away from
reality and ended up killing themselves, by Character dot
(33:23):
AI or similar AI platforms. This is, you know, in
a similar vein, but much more lighthearted. It's a nice,
nice note from Douglas. What did you do with that
note from Matt? That's weird. Jack, Joe, the subscription version
of ChatGPT, ChatGPT five, has a personalization option
in the user profile section. Below is a screenshot of the
(33:45):
page, and here are at least some of your choices.
The default persona is cheerful and adaptive. You've got the Cynic persona,
critical and sarcastic; Robot, efficient and blunt; Listener, thoughtful and
supportive; and Nerd, exploratory and enthusiastic.
Speaker 3 (34:06):
I probably would like the sarcastic one, but probably end
up landing on nerd or robot in my case.
Speaker 2 (34:14):
I don't know, but anyway, let's see. Douglas writes: The
Cynic personality can be fun. For example, I asked about
renovating a shower with a window, and the first sentence
in the response was, ah, the joy of window-in-shower
design, proof that someone somewhere thought, let's put a
hole in the wall that's constantly wet. That's pretty funny.
Speaker 3 (34:37):
Yeah, yeah, I know.
Speaker 2 (34:38):
If that is really good, I like the robot setting
because it does a weigh with the flattery or as
cynical chat GPT actually describes the the other voices. And
that's the part that got my attention. It's like, wait
a minute, so the cynical chat GPT is cynical about
chat GPT.
Speaker 3 (34:59):
Wow.
Speaker 2 (35:01):
The cynical ChatGPT describes the Robot as doing away with the
algorithmic brown-nosing dressed up as positivity.
Speaker 3 (35:09):
Yeah, I've got to record the Grok in my truck
that I use, because it drives my kids nuts how
friendly she is, and just so positive. It's very annoying.
I should record that and play it on the air. Hey,
how you doing? No problem! You got this! Check back later!
I mean, it's the way she talks to me all
the time, Lord.
Speaker 2 (35:28):
That makes my skin crawl. No, I don't want that
from a human being, and I don't want it from
a damn machine.
Speaker 3 (35:35):
Right. Wow, that is so weird.
Speaker 2 (35:38):
So I'm pawing through email from days previous, and
I came across again the absolutely brilliant note from David,
who suggested the phrase Starve the Lazy, which reminds me
to mention we've got the Armstrong and Getty store up
and running. Order now to get stuff in time for
Christmas for yourself and your fellow Armstrong and Getty fans,
whether it's just a standard logo or one of the
(36:00):
slogans like Starve the Lazy or Stupid Should Hurt, keeping in
mind that we don't get the money; it helps
to keep everybody on the payroll during these challenging times.
Speaker 3 (36:09):
I'm gonna present that to Grok today, and I will
record it. Hey, Grok, there's a radio show I listen to,
Armstrong and Getty. They have new T-shirts that say Starve
the Lazy. What do you think of that? And see
what she has to say. Do it.
Speaker 2 (36:21):
She? Does it have breasts and ovaries? No, it's
an it. We'll give you more next hour. If you
miss this, man, get the podcast, Armstrong and Getty.