
April 21, 2025 36 mins

Featured during Hour 3 of the Monday April 21, 2025 edition of The Armstrong & Getty Replay...

  • Jack Teen Sex Bot
  • Newsom on Maher BS
  • Germany Free Speech 
  • DMV losers

Stupid Should Hurt: https://www.armstrongandgetty.com/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
Broadcasting live from the Abraham Lincoln Radio Studio, the George
Washington Broadcast Center, Jack Armstrong, Joe Getty.

Speaker 2 (00:18):
Armstrong and Getty, and he, Armstrong, and Getty Strong.

Speaker 3 (00:26):
And so I got a couple of stories. One of
them I told before, and I got to be vague
about them because they're real life stories people have
told me, and I don't want them to get in
trouble for.

Speaker 2 (00:46):
You know, passing along more or less confidential information.

Speaker 3 (00:49):
But I told the story a while back about somebody
who was talking to a group of working class, salt of the earth gentlemen, the last kind of dudes that
you would ever think that.

Speaker 2 (01:02):
This would be a thing for.

Speaker 3 (01:03):
I'm not talking about like some Berkeley androgynous poetry majors.
I'm talking about, like, working class, work-with-their-hands guys, blue collar guys, talking about how much they
enjoyed the companionship of the female chatbots when they came

(01:23):
home from you know, a long day in the field
at night, and how you know they listen to them
and understand them and they look forward to it all
day long, and that sort of thing.

Speaker 2 (01:32):
And I thought, wow, I mean, if.

Speaker 3 (01:35):
That crowd can fall under the sway of this in
its current form, mankind is doomed. No, I've never tried it,
but I almost don't want to because I have some
concern with like a lot of other things that I've
dismissed and joked about, and find it more appealing than

(01:57):
I'd like it to be.

Speaker 4 (02:00):
You know, I would hope that your oogie factor would
overcome that temptation, but your illustration of the sort of
fellows who like it is troubling.

Speaker 3 (02:10):
Although, luckily, I'm not. I don't feel trapped in
a lonely world like a lot of people do. And
if you feel like it's, you know, you're lonely and
it's really difficult out there to meet people and everything
like that, then this answer comes along.

Speaker 2 (02:32):
It must feel good to you.

Speaker 3 (02:33):
So, a slightly different version of this, and also a real life story a mom was telling me the other day, and it's a troubling story from the beginning, as the daughter involved is twelve, but let's go with thirteen because they're close enough to thirteen. But still, actually twelve. She had ended up in a situation by being in

(02:54):
a friend's house or whatever. This seventeen year old boy was hitting on this young girl in a way that he shouldn't if he weren't a creep, but he was, and apparently he is a very handsome, smooth talking dude, so he really got the attention of this quite young girl.

Speaker 2 (03:09):
Wow.

Speaker 3 (03:11):
Anyway, mom gets contacted by the school saying, hey, we
are taking a look at your kid's search history and
computer use because.

Speaker 2 (03:27):
Maybe you know this, maybe you don't. I mean, I've
had kids in public school.

Speaker 3 (03:31):
They allow you to use the Chromebook or you buy
your own Chromebook, but you have to be on the
school system and they have the right to check and
see what stuff you're doing on that computer, which I'm fine.

Speaker 2 (03:43):
With, but they have a variety of.

Speaker 3 (03:50):
Protection programs so that, you know, if your eighth grade boy is
looking at porn on the Chromebook, the school will contact
you and say, hey, your kid's using a Chromebook for porn,
and then you know, you talk to them or step
in or do whatever, and then if you continue doing it,
there's penalties down the road.

Speaker 4 (04:05):
But the thing is, in many schools, if you say, I want to be transgender, then they won't tell your parents.

Speaker 2 (04:12):
Excellent point.

Speaker 3 (04:13):
Wow, that is really good. Caught your thirteen year old looking at naked women? Oh no, what a shock. Your thirteen year old wants to become a woman? Keep it on the down low, none of mom and dad's business. Wow, good point. But so anyway, this mom got contacted by the school. Hey, your twelve year old daughter's computer was

(04:36):
showing her on this site talking to an advice chatbot, a sex chatbot. I guess it's particularly in the area of sex advice. That's a twelve year old. And mom, I don't remember if it was from reading the back and forth or asking the kid, but either way, she found out the kid

(04:59):
was regularly going on this sex chatbot to get advice on how to please a seventeen year old boy, and really, like, got addicted, well, addicted would be too much, but really kind of obsessed with, you know, as soon as you get home from school, checking in with the chatbot and seeing what the latest advice is

(05:20):
on how to please a seventeen year old boy.

Speaker 2 (05:23):
And it just became a like hard to break cycle. Wow. Wow, Now.

Speaker 3 (05:34):
If there had been I can't imagine it when I
was like thirteen, fourteen years old. You know, when you're
a young man starting to understand things your body is
capable of doing, or certain urges that can be.

Speaker 2 (05:48):
Enjoyed in a certain way, and there had been some.

Speaker 3 (05:50):
Sort of chatbot I could talk to that would tell me sexy stories or do whatever. Oh my god, I would have never... I don't know how you'd have gotten me out of my room. But, you know, so you can have the sex talk with your kids; they're having the sex talk with some chatbot.

Speaker 4 (06:10):
What's really interesting is that, so far as far as
I can observe, the premature sexualization of children, which the
left is so enthusiastic about, has mostly resulted in people
not pairing off, not actually having sex, not having relationships.

Speaker 2 (06:29):
An interesting.

Speaker 3 (06:32):
Coexistence of those two things, although it makes intuitive sense that there would be some, maybe most, who, having their normal development blocked in this way, if you can picture that as a metaphor... a lot of people go to the left, toward this is all sick and weird and

(06:52):
I can't handle it, never mind, and some people will go to the right, being hypersexualized, addicted to pornography, whatever.

Speaker 2 (07:00):
It seems to me, is the.

Speaker 4 (07:03):
Step by step natural progression of the way you become
aware of the world in adulthood, everything from sex to
taxes and responsibility and paying bills and real, deep emotional
relationships with another adult. That's an inch by inch process
for the entire history of mankind, except now. Now, and

(07:23):
I can't resist another shot at the left.

Speaker 2 (07:25):
Forgive me. Now.

Speaker 4 (07:26):
You go to your woke school where you're immediately sexualized
and you're surrounded by porn or whatever, and that step
by step is like vaulting a mile at a time
in a way that their poor young minds and hearts
can't handle.

Speaker 2 (07:39):
It's incredibly troubling to me. I know, I can't imagine.

Speaker 3 (07:44):
Learning all the things that I learned, like you said, little by little, inch by inch, over a period of years, and instead having it just dumped on me, you know, like a bucket on my head, a lot of it really bad ideas and bad advice.

Speaker 2 (07:58):
And all kinds of things.

Speaker 3 (08:00):
Yeah, I mean, so you're gonna have the sex talk with your teenage daughter to make sure she's set, well, she's fine and she's got... you know, you're not the only role model for them. They've got another role model. It's the AI bot that they get to talk to. And apparently it's a thing, like, this person became aware of it from friends because that's what the friends are doing too. And God knows what sort of, you know,

(08:22):
the whole garbage in, garbage out?

Speaker 4 (08:24):
Why is why are AI systems woke? Well, because the
people programming them are and blah blah blah.

Speaker 2 (08:31):
You got that issue as well.

Speaker 4 (08:32):
You know, every time we talk about this sort of thing, I have the same urge toward, you know, some sort of fundamentalist subculture or community, or, you know, build your own compound or something. And, you know, people... I'd say, yeah, we're fundamentalists, and they'd say, like, religious fundamentalists? No. Islamic fundamentalists? No, no, no, no, we're

(08:56):
just... we just concentrate on the fundamentals of life, you know, whatever you want. And no, I'm not a cult leader, and no, I'm not sexing up the young women, which tends to be an inevitable thing in these little off-hue communities. But yeah, we just... there's a lot of the modern world that sucks. Oh, do you, like, not do medicines and stuff?

Speaker 3 (09:15):
Oh no, no, no, we do medicines and vaccinations, you know,
and all this.

Speaker 2 (09:20):
Yeah, we you know, we're not lunatics.

Speaker 3 (09:23):
We've just... we've learned to separate the wheat from the chaff of the modern world, because, and this is so obvious and so fundamental it almost seems stupid to say.

Speaker 4 (09:35):
But we all, as human beings, tend to be swept up.
And we talked about this a couple of days ago
in fascinating fashion. We all tend to be swept up
by the culture and assume everything that is offered to
us is something we ought to take in.

Speaker 2 (09:48):
And that's not true.

Speaker 3 (09:49):
There is some wheat, but there's not only some chaff but poison, like thumbtacks, in the wheat of

Speaker 2 (09:55):
The modern world. This is a great idea.

Speaker 3 (09:57):
Man, if I had billions, billions, I would start towns like this, or communities, or, I don't know how you'd have it unfold, but you'd be like the Amish, except, no, we're not gonna ride buggies down the road to work, that's ridiculous. But we're not gonna have the damned Internet. We're not gonna have smartphones.

(10:18):
We're not gonna have all this stupid stuff. We're gonna go back, like way back, to two thousand and six.

Speaker 2 (10:24):
Okay, maybe. I don't know about the internet. I have to think about that.

Speaker 3 (10:26):
But definitely not smartphones, definitely not AI, definitely none of that stuff.

Speaker 2 (10:31):
And I think a lot of people would gravitate toward that.

Speaker 4 (10:35):
We're gonna have backyard barbecues, and the kids are gonna all go off to the side, and they're gonna talk and giggle and laugh, and we'll wonder what they're saying, and then they'll invent a game with a ball and a stick, and, you know, yeah.

Speaker 3 (10:47):
And is there any way to program morality into any
of this chatbot stuff? I mean, you couldn't force it,
but I mean, is there any way to have a chatbot say, wait a second, how old are you?

Speaker 2 (10:57):
I'm twelve.

Speaker 3 (10:58):
Well, you shouldn't be having sex at all, and certainly not be in a relationship with a seventeen year old. There's either something wrong with them or they're just wanting to use you for sex. This is a bad idea. Is there any way a chatbot would ever say that?

Speaker 4 (11:14):
Well, the issue is, since we don't really have a shared sense of morality anymore, because we've become a much more diverse country, it's impossible to quote unquote infuse morality into it, because nobody can agree on what morality is or should be. Therefore, all things

(11:36):
digital are utterly amoral. They are without morals. Does that trouble anybody, sending your child into a completely amoral environment?

Speaker 3 (11:48):
In this particular story, so, the Chromebook got taken away. Mom, late at night, at one point realizes the computer's missing. Uh oh. Goes in the daughter's bedroom. Couldn't stay away from the sex advice chatbot on how to please a seventeen year old. Now mom has to sleep with the computer, all computer devices, in

(12:11):
her bedroom to make sure they... I mean, oh my god, this is not something our parents had to deal with.

Speaker 4 (12:19):
No, nor really anything I had to deal with, and my kids are now mid twenties to early thirties.

Speaker 2 (12:27):
Yeah, I realized the hubris that comes with saying this
is a harder time to be a parent.

Speaker 3 (12:31):
This is a harder time to be a parent than
it was for previous generations.

Speaker 2 (12:36):
It's horrible.

Speaker 4 (12:37):
Yes, as somebody raised just one more generation earlier, you're right.

Speaker 2 (12:41):
You're one hundred percent right. God, it's so crazy.

Speaker 3 (12:43):
Anyway, if you know anything about this or have had any experiences, our text line is four one five, two nine five, KFTC.

Speaker 1 (12:51):
Jack Armstrong and Joe Getty, The Armstrong and Getty Show, The Armstrong and Getty Show.

Speaker 2 (13:02):
Now I haven't heard this.

Speaker 3 (13:04):
Gavin Newsom and Bill Maher discussing some California policies.

Speaker 2 (13:09):
We'll discuss I see today the Trump administration.

Speaker 5 (13:12):
They talked about the fact that California had a rule
that schools cannot be required to notify parents if their
kids in school have changed their gender, their pronouns. That's
the kind of thing, even though it doesn't affect a
lot of people, that makes a lot of people go, well,
you know what, that's the party without common sense. Now,

(13:34):
if that's your state, how are you are you?

Speaker 2 (13:37):
I just disagree with that. I mean, the law was
you would be fired.

Speaker 6 (13:41):
A teacher would be fired if a teacher did not
report or snitch on a kid talking about their gender identity.
I just think that was wrong. I think teachers should teach.
I don't think they should be required to turn in kids.

Speaker 2 (13:52):
And by the way, turning them in? We're talking about their parents.

Speaker 5 (13:54):
How can... snitch? The idea of a snitch and a parent, to me, doesn't combine.

Speaker 6 (13:59):
I just don't... But what is the job of a teacher?

Speaker 2 (14:01):
It's to teach.

Speaker 6 (14:02):
If Johnny's talking about some identity issue or some issue
about liking someone of the same sex.

Speaker 2 (14:07):
Is that the teacher's job.

Speaker 3 (14:09):
Oh well, one thing that's clear from that: he is not confidently coming out and saying it's ridiculous that teachers would not be allowed to tell parents about it. And he didn't say that. He had the opportunity right off the bat to say that, and he didn't. So there's no chance he's gonna be president of the United States. You cannot be president of the United States unless you're willing

(14:30):
to take a position on this. It's been proven over and over: the candidates who try to, like, fudge these things and think they're gonna have it both ways? Never works. Never works. Didn't work for Kamala, ain't gonna work for Gavin. I can't believe he doesn't have the balls to come out and say, even in California, what has just gotta be an eighty twenty issue, maybe a ninety ten.

Speaker 2 (14:51):
I can't believe he didn't have the balls to say
that out loud.

Speaker 4 (14:54):
And to claim that Johnny, who now wants to be called Jenny, that that would be snitching on the kid, akin to maybe the kid hinting that maybe he likes boys.

Speaker 2 (15:04):
That's just that is so false. I mean, it's so funny.

Speaker 3 (15:07):
And also, the job of a teacher is to teach? To teach about the genderbread person and radical gender theory, Gavin? You require them to teach that stuff. Now, with all the crap in California that they have them teach that's not reading, writing, and arithmetic, that ain't gonna fly. And he's trying to conflate it with what DeSantis is doing in Florida, where they have

(15:29):
the law that you'll be fired if you don't tell
the parents, and he's trying to act like that's what
he's fighting against.

Speaker 2 (15:35):
No, no, no, no, no, you went completely the other direction
where the teacher is not allowed to tell the parent.
That's nuts. Yeah, and most people think it's nuts.

Speaker 3 (15:45):
As Bill Maher points out, most people think that's nuts.

Speaker 2 (15:48):
I thought Gavin was smarter than that. Here's the deal.

Speaker 4 (15:52):
What's really wrecking him and people like him, thank God, is more and more people are understanding the relationship between the neo-Marxists, the radical, you know, gender theory crowd or the queer theory crowd, all these lunatics, the neo-Marxists, and their connection to, for instance, the teachers union, which is down with all this stuff. Gavin doesn't dare

(16:15):
defy the teachers Union, which is down with all this stuff.
So he went as far as he's going to go
with his Yeah, there's.

Speaker 2 (16:23):
An injustice there. We need to strike a balance with the girls in... so it's not... vice versa.

Speaker 3 (16:28):
So it's not the voters that he has in the
back of his mind when he's answering those questions.

Speaker 2 (16:32):
It's the teachers Union.

Speaker 4 (16:34):
Yeah, the teachers unions and the radical activist class, of which he, I think, is... to the extent that he has any beliefs whatsoever, they seem to be quite progressive.

Speaker 3 (16:42):
I think he believes he wants to be president. Yeah,
I'd like to hear more of that exchange.

Speaker 2 (16:47):
Yeah, no kidding, because Maher cannot be...

Speaker 4 (16:53):
I've got to be able to say this: he cannot be bullshitted. I'll just say that. Why are you so oily?

Speaker 2 (17:04):
Why do you take so much joy in cursing? I don't know. No, it's just... it's a wonderful thing. It's a wonderful thing.

Speaker 4 (17:13):
Well, you know, I consider bullshit to be a perfect word, universally understood in its meaning.

Speaker 2 (17:18):
It is brief, it has a ring to it, it has a rhythm to it. It's a perfect word.

Speaker 3 (17:25):
Yeah, and unfortunately the substitutes do not carry the same weight.

Speaker 1 (17:30):
Armstrong and Getty. Jack Armstrong and Joe Getty, the Armstrong and Getty Show.

Speaker 7 (17:42):
We were with German police as they conducted early morning
raids on citizens who had been accused of hate speech,
threats and inciting violence online.

Speaker 2 (17:52):
In the United States.

Speaker 7 (17:52):
A lot of people look at this and say, this
is restricting free speech. It's a threat to democracy.

Speaker 2 (17:58):
Free speech meets boundaries. Wow. And yeah, yeah, throw in the German accent and that's really something. That's from Sixty Minutes Sunday night, coming on the heels of JD Vance giving Europe a lecture about censorship and being anti free speech over the weekend.

Speaker 4 (18:18):
And as we will discuss, the Sixty Minutes report was gleefully, weirdly, troublingly positive about the idea of restricting free speech if it's...

Speaker 2 (18:28):
The wrong speech.

Speaker 4 (18:30):
Their awful attitude was made even more notable by the fact that it occurred on the same day as this clip: Margaret Brennan talking to Marco Rubio on Face the Nation.

Speaker 8 (18:40):
He was standing in a country where free speech was
weaponized to conduct a genocide, and he met with the
head of a political party that has far right views
and some historic ties to extreme groups.

Speaker 4 (18:57):
Margaret Brennan attempting to suggest that JD Vance shouldn't have advocated free speech, because free speech is what led to the genocide in Germany against the Jews, which is almost hilariously idiotic and so wildly inaccurate it's barely worth the time to describe how incredibly inaccurate it was. Free speech

(19:19):
under Hitler.

Speaker 2 (19:20):
When was that, exactly? asks every historian on Earth.

Speaker 3 (19:25):
I know I'm Mister Hyperbole, but I feel like that's one of the craziest things I've ever heard one of the major Sunday show anchors say.

Speaker 2 (19:34):
It is. Unquestionably, JD Vance was standing in Germany, where the rise of fascism happened, and the genocide, and they weaponized free speech? What are you talking about? Oh my god. Yeah, she's nuts.

Speaker 3 (19:51):
I'm so lost on this. I don't understand their worldview.
I guess we'll learn more of this.

Speaker 2 (19:56):
Well I do.

Speaker 4 (19:58):
I can describe it to you exactly. It's the lust for power. If you control speech, you control everything else. But how does that work in a democracy? Don't you... the other side gets to do it too when they're in charge. Again, I hate even going to that argument, because then it makes it a conditional thing: well, for

(20:18):
practical reasons, I guess you're right, I won't limit free speech. I don't even want to go there, right? It's horrific, horrific. And how you get there in a democracy, the other thing, is you declare an emergency. Happens all the time, both parties. Trump's doing it right now and I don't approve of it. There are half a dozen different

(20:40):
emergencies he's declared which are highly questionable, for the purpose of gaining emergency powers. I hate it on both sides. We need to stop. Anyway, back to Sixty Minutes. Do you want to just careen our way through the clips? Doesn't really matter. We can start with eighty, Michael.

Speaker 7 (20:54):
It's six oh one on a Tuesday morning, and we were with state police as they raided this apartment in northwest Germany, just...

Speaker 2 (21:02):
To put it inside them humid inside.

Speaker 7 (21:05):
Six armed officers searched the suspect's home, then seized his laptop and cell phone. Prosecutors say those electronics may have been used to commit a crime, the crime: posting a racist cartoon online. At the exact same time across Germany,

Speaker 1 (21:23):
All the online as homes for Swiss Neunivorald.

Speaker 7 (21:25):
More than fifty similar raids played out part of what
prosecutors say is a coordinated effort to curb online hate
speech in Germany.

Speaker 2 (21:34):
And the one guy points out, maybe it will be in one of these clips, the seven thousand cases or something that they investigated last year. Wow, that's a lot in a country much smaller than the United States. Imagine what that would look like in the United States. And hate speech, of course, one of the problems with hate speech being who's determining it's hate speech, who's making

(21:57):
the judgment on that? Yes, Sharyn Alfonsi, helpful.

Speaker 4 (22:00):
She just saved us the trouble and told us the cartoon was racist, putting aside whether, you know, you should limit free speech on the basis of quote unquote racism anyway. But yeah, what did it say?

Speaker 2 (22:09):
In what sense was it racist? Who is it racist against?

Speaker 1 (22:12):
What?

Speaker 3 (22:12):
Give me the specifics. You're asking for the right to censor me, and you just say, take my word, it was terrible.

Speaker 2 (22:18):
This part is amazing.

Speaker 7 (22:20):
What's the typical reaction when the police show up at
somebody's door and they say, hey, we believe you wrote
this on the internet.

Speaker 9 (22:28):
They say, in Germany we say, that's a bit more motives off my zagen. And so we are here with crimes of talking, of posting on the internet, and the people are surprised that it is really illegal to post these kinds of words.

Speaker 2 (22:43):
They don't think it was illegal?

Speaker 9 (22:44):
Oh, they don't think it was illegal. And they say, no,
that's my free speech. And we said no, we have
free speech as well, but it also has its limits.

Speaker 2 (22:55):
I see, I don't.

Speaker 3 (22:56):
I can't.

Speaker 2 (22:57):
I've got to accept that.

Speaker 3 (22:58):
It's just true that obviously smart people in charge of things can say words like that. How can you say that? Free speech has its limits?

Speaker 2 (23:05):
Well, then it's not free speech. You just nullified the first part of your sentence with the second part of the sentence.

Speaker 4 (23:13):
Everybody understands that there are certain limits, but they are extremely limited limits. But people like this just say, hey, because there are limits, there can be more limits, and I will decide what limits there are.

Speaker 2 (23:25):
Control your soul's desire for freedom.

Speaker 4 (23:27):
Yeah, to which I respond, no, you don't get to decide it, no freaking way. Control your soul's desire for freedom, they say in China. This part amazed me. I was unaware of this.

Speaker 7 (23:36):
It's illegal to display Nazi symbolism, a swastika, or to deny the Holocaust. That's clear. Is it a crime to insult somebody in public? Yes, yes. And it's a crime to insult them online as well.

Speaker 2 (23:50):
Yes. The fine could be even higher.

Speaker 9 (23:53):
Yeah, if you insult someone on the internet... why? Because on the internet it stays there. If we are talking face to face, you insult me, I insult you, okay, finished. But if you're on the internet, if I insult you or a politician...

Speaker 2 (24:07):
It sticks around forever. Yeah. So she says to these
three people, it's a fine.

Speaker 3 (24:12):
There's a fine too if you insult someone, and they'll
just say yeah, like you know, of course.

Speaker 2 (24:18):
What what what defines an insult?

Speaker 3 (24:21):
And folks, keep in mind, remember we're still in the era, certainly on university campuses and in government and some other places, where if you are insulted, that's proof that the other person has done wrong.

Speaker 2 (24:35):
You remember, I didn't mean that to be racist.

Speaker 3 (24:38):
It doesn't matter what you meant, it's how I received it. So you give the person receiving it carte blanche to declare whatever they want to be insulting or hurtful or racist or abusive to Islam or whatever, and therefore that...

Speaker 2 (24:54):
Falls within Well, that's hate speech.

Speaker 4 (24:55):
Sorry, we're going to take away your right to make any criticism that anybody could even implausibly claim is insulting.

Speaker 2 (25:02):
It's horrible. So somebody posts the Chancellor is a moron,
I assume that would be a crime. Maybe you didn't
post it, maybe you just did this.

Speaker 7 (25:11):
If somebody posts something that's not true and then somebody
else reposts it or likes it, are they committing a crime.

Speaker 9 (25:20):
In the case of reposting, it is a crime as well, because the reader can't distinguish whether you just invented this or just reposted it.

Speaker 2 (25:28):
That's the same for us.

Speaker 3 (25:31):
So when I was listening to that one, I was thinking about... there was an official Biden Harris campaign ad that we had playing in the United States: Donald Trump threatens a bloodbath if he loses, which was absolutely a lie and misinformation, on the side that Sixty Minutes

(25:51):
is on. Would you consider that a crime, or only when it's coming from the other side? See, that's where the rubber meets the road on this whole thing, and it gets, you know, unworkable.

Speaker 2 (26:02):
Yeah.

Speaker 4 (26:02):
I happen to have a couple of articles, think pieces, etc., that point out how incredibly one sided this is. It's just a lust for power. The other thing that really... and it's funny, we're talking about one of the most fundamental human rights that any human beings ever enjoyed, but you know what annoyed the crap out of me was when they're... and I'm not sure if we have these clips, but they were talking about the fines, which are fairly heavy.

(26:24):
I mean it's thousands of dollars in fines and multiple
offenses can put you in jail. But some people just
have their phones and their laptops confiscated for good.

Speaker 2 (26:34):
They don't get them back.

Speaker 4 (26:35):
And Sharyn Alfonsi said, your phone? Wow, because everything's on there.

Speaker 2 (26:39):
That is tough.

Speaker 3 (26:41):
And she was amused at the idea of these people who indulged in what the Germans are calling hate speech getting their devices taken away and not getting access to them anymore. She thought that was funny, an insensitive joke. They come and take your laptop and your phone and you don't get it back, and to a certain crowd, that's awesome. This story I found amusing, but it

(27:03):
was a twenty.

Speaker 7 (27:03):
Twenty one case involving a local politician named Andy Grote that captured the country's attention. Grote complained about a tweet that called him a Pimmel, a German word for the male anatomy. That triggered a police raid and accusations of excessive censorship by the government. As prosecutors explained to us,

(27:24):
in Germany, it's okay to debate politics online, but it can be a crime to call anyone a Pimmel, even a politician. So it sounds like you're saying it's okay to criticize a politician's policy, but not to say I think you're a jerk and an idiot.

Speaker 2 (27:43):
Exactly.

Speaker 9 (27:44):
Yeah, comments like "you son of a bitch," excuse me for the word. These words have nothing to do with a political discussion or a contribution to a discussion.

Speaker 2 (27:56):
That's amazing.

Speaker 4 (27:58):
It's one of the great canards that censors use: here's an example where I'm censoring something reasonable, right? And then you're supposed to extrapolate from there: therefore, you trust me to censor whenever I want. F you, no, and you are a Pimmel. You're a table full of Pimmels, right? I kept thinking that as they used various examples. I thought, yeah,

(28:19):
it'd be nice if you could censor that and not
other stuff. But once you open the door, then you
start making choices and who's making those.

Speaker 2 (28:31):
Choices, and it gets out of hand really really fast.

Speaker 3 (28:34):
We all lived through this in the United States, so
we know how off track this can get so fast.
When it was basically against the rules to say, you know,
I think the virus probably came out of that lab,
you couldn't say it for a couple of years.

Speaker 4 (28:51):
Yeah, or you would be punished, maybe not by law
enforcement in the ways we're used to, but you would
be punished by proxy by the government.

Speaker 2 (28:58):
And CBS, it would seem. And many others.

Speaker 3 (29:02):
A lot of college professors want it to be against the law to post misinformation, and again, "misinformation" like: if you get the inoculation, you can still get COVID and you can still spread it.

Speaker 2 (29:15):
That is dangerous misinformation. Wow, it's scary. One hundred percent true.

Speaker 3 (29:19):
They're so enthusiastic about it there in Germany, and then
we got and then CBS is enthusiastic about it too.

Speaker 2 (29:24):
That is freaking frightening. We got a lot more on
the way now.

Speaker 4 (29:27):
A quick word from our friends and sponsors at SimpliSafe Home Security, who are pointing out that with longer daylight hours you may be spending more time away from the house and giving burglars more opportunities to strike.

Speaker 3 (29:38):
I'll tell you what, every time I drive away from my house and lock the door, I'm happy to see the SimpliSafe sign I've got in the yard in front of the door, to let people know I got the cameras, I got the sensors, I got all the stuff protecting my home. It definitely does give you peace of mind, there's no doubt, and we love

Speaker 4 (29:54):
That, while we're sleeping at night. But FBI crime data shows there are more break-ins during daylight hours than under the cloak of night, so you need SimpliSafe and their Active Outdoor Protection, twenty four seven. It's really an amazing system: AI powered cameras backed by live professional monitoring agents monitor your property

Speaker 2 (30:10):
And detect suspicious activity.

Speaker 4 (30:12):
Then they can call the cops, yell at the people,
turn on your spotlights, whatever's necessary.

Speaker 3 (30:15):
There's a lot of different stuff about SimpliSafe, but one of them is no contracts. About a dollar a day. Visit simplisafe dot com slash Armstrong. Save fifty percent off a new system with a professional monitoring plan and get your first month free. That's simplisafe dot com slash Armstrong. There's no safe like SimpliSafe.

Speaker 1 (30:31):
Jack Armstrong and Joe Getty, The Armstrong and Getty Show.
Jack Armstrong and Joe Getty, The Armstrong and Getty Show.

Speaker 3 (30:44):
Why are all of life's losers hanging out at the DMV? Where are the regular people? There are no regular people at the DMV. It's almost entirely losers, like, in line there and stuff, you...

Speaker 2 (30:56):
Hanging out? Oh yes, sitting in line.

Speaker 3 (30:58):
I never know either. It's got a bit of a Homer Simpson "why do the things that happen to stupid people keep happening to me?" But you don't look around and see anybody... I mean, I thought this when I was twenty five, so this isn't like me now; I've always thought this. It's like, surveying it, well, every car

(31:19):
parked in the parking lot is, like, dented and missing hubcaps and got, you know, a plastic bag over one window. The percentage of people with either a crutch or a sling is way higher than in the regular population. I don't know what to make of this. I'm uncomfortable with it.

Speaker 2 (31:37):
And I don't know why it is. I've always wondered that. Now.

Speaker 3 (31:41):
I tweeted that out, and some people said, well, there is a separate one for those of us who are life's losers.

Speaker 2 (31:46):
It's called Triple A.

Speaker 3 (31:47):
And I did discover in my thirties that if you're a member of Triple A, a lot of that stuff you can do there. The thing I'm doing, bringing a car in from another state, you have to go to the DMV. California tries to make it impossible to bring a car in from another state because their ultimate goal is to have no cars.

Speaker 2 (32:04):
That is the goal of the state of California. They
hate cars and they would like to get rid of them.
So you're always.

Speaker 4 (32:09):
Everybody in EVs, which tear up the highways, which are already bad.

Speaker 3 (32:13):
Yeah, and I'm trying to register an EV and it's still just as hard. But anyway, that aside, I had a point. Oh, somebody did point out, though, because I got into a lot of conversations online, because I had two hours to kill wondering about this question: is that, uh, further down the ladder of life, things aren't working out

(32:33):
for you, or, when you're young, you drive cheap cars that have, you know, more difficulties, and you swap cars more often, and just, with other crappy cars, lots of things happen that require the DMV more often. Yeah, and there's a certain percentage of trips to the DMV which are caused by lack of organization, no doubt.

Speaker 2 (32:56):
I'm speaking for myself.

Speaker 4 (32:57):
I know that if I, for instance, had gotten the form, the 34B, in on time, I wouldn't have to be standing in that damn line. And there's definitely a correlation between the ability to be organized and think ahead, and success in life. You either have it, or, in my case, you marry it and thereby avoid a lot of the trips.

Speaker 3 (33:19):
But anyway, it's insane that we can't have a simpler system. I know a lot of it has gone online, but it all should be easily doable online, shouldn't it?

Speaker 2 (33:31):
Why not? Yes, clearly. I wonder if AI can get a handle on this someday, where there's a DMV AI thing that can tell you, no, you need this form, click here and you'll have the form, and then you fill out the form and it's submitted on the computer.

Speaker 3 (33:47):
And all of this nonsense of waiting in line for hours. And this didn't happen to me, thank god, but I saw it happen to practically everybody around me: waiting in line for hours to be told, no, you need thirty four B, you have thirty four.

Speaker 2 (34:00):
Oh, that's so Soviet Union. That is so evil. Actually, Elon Musk and the Doge boys...

Speaker 4 (34:05):
They're getting all the attention for cutting this and firing them over there, but one of the main priorities they have is updating the ridiculously antiquated and unconnected computer systems of the government, and I would love to see that catch fire. Just as an aside, after the half dozen or so of the Doge leaders made that great appearance on Special Report with Bret Baier, I really

(34:27):
thought they would be mounting a charm offensive where more
of those people would be doing more interviews I.

Speaker 2 (34:32):
Haven't seen it.

Speaker 4 (34:33):
It could be that all of the alphabet networks and the usual suspects, the New York Times, have no interest in it because it undercuts their narrative that it's just a handful of frat boys on meth running around firing nice, innocent people with families, when indeed that's not it at all.

Speaker 3 (34:50):
I actually hadn't been in a DMV in quite a few years, so I was trying to take the multiple hours as an opportunity to just observe, you know, our government system at work. I also committed myself to knowing that I wouldn't get accomplished what I wanted to accomplish in one trip, and I didn't, but to having a cheerful outlook about it as I watched so many people

(35:11):
get angry and thinking, you've made yourself miserable, You've made
the person that works there even a little more hardened
against the public. Nothing good was accomplished by getting upset
about this, no matter what.

Speaker 2 (35:24):
But I thought, how do you do that job?

Speaker 3 (35:27):
How could you do that job for a day and
not end up the way a lot of DMV drone
people are.

Speaker 2 (35:34):
The person I worked with was very cheerful and nice,
but a lot of them aren't, and I could.

Speaker 3 (35:39):
I don't know how you could work that job one week without becoming one. Just the mindless boredom of it, and then everybody being mad at you.

Speaker 2 (35:50):
It's really inhumane to subject someone to doing that. Yeah,
I don't know how you would do that. I agree.

Speaker 4 (35:55):
It's not an excuse exactly for being mean or abusive, but I understand it, having dealt with the public a fair amount, especially in younger jobs.

Speaker 3 (36:02):
I think it's how TSA people end up where they are too, although with the added benefit for both of those jobs that you can't be fired. Because working in retail is a lot of that too, and you have to keep your cheerful outlook because they can fire you and get somebody different. So you have to overcome the whole "the customers pissed me off" thing. Armstrong and Getty.