
May 30, 2025 36 mins

Hour 3 of A&G features...

  • More thoughts on the growing influence of Artificial Intelligence...
  • We turn to AI to craft a new tune about Joe Getty...
  • Who are the cream-of-the-crop American youths protesting at college campuses? 

Stupid Should Hurt: https://www.armstrongandgetty.com/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Broadcasting live from the Abraham Lincoln Radio Studio the George
Washington Broadcast Center, Jack Armstrong and Joe Getty.

Speaker 2 (00:10):
Armstrong and Getty and he's Armstrong and Getty.

Speaker 3 (00:23):
The way I look at artificial intelligence, it's both like
it can be a threat for sure, and it can
be a massive opportunity. What I don't understand is when
I walk around this city or you talk to lawmakers, why.

Speaker 4 (00:34):
People aren't obsessed with this.

Speaker 3 (00:36):
It is going to reorder society over the next decade,
and almost any person who studied it at any level
has come to that exact same conclusion. I sometimes joke
to Mikey and others that I feel like I'm living
in a simulation where you see so clearly where the
world's going over the next five years, and yet Washington
pays very little attention to it.

Speaker 5 (00:55):
Jim VandeHei is getting a lot of attention for
a piece he wrote for Axios about AI. There's a
big one in The New Yorker also, in which they
have two AI experts, one arguing it's going to change
society more than anything ever has and very soon, and
another expert arguing that it's way, way, way overblown
and any of the effects are years off, which

(01:17):
I hope that that guy's right, although I tend to
think the other guy's right, that it's happening sooner and
gonna be a big deal.

Speaker 1 (01:24):
I've been reading Jim VandeHei's piece in which, among
other things, he interviews Dario Amodei. Is that
how you're pronouncing it? He's the CEO of Anthropic, one of
the most powerful AI companies in the world, And it
occurred to me the perfect metaphor. I don't know why
I see the world in metaphors. I do, but it's
as if.

Speaker 5 (01:44):
Doctor Frankenstein. You're still seeing the world in metaphors is
like an ox needing something. It's like if doctor Frankenstein
was designing his monster, but it was going to be
like fifty feet tall. But doctor Frankenstein was perfectly sane

(02:04):
and was saying, now, reanimating a corpse is going to
be an incredible scientific breakthrough, but it could rampage down
to and through the village and kill many, many people,
and we're really not sure which. So Igor, if you
would fire up the lightning positron, please. Let's get this
started. Anyway, thank you. Here's more with Jim VandeHei.

Speaker 3 (02:35):
Just pay attention, get way more familiar with this technology.
If you are about to be a graduate, figure out.

Speaker 4 (02:41):
Are you in the right field. If you are in
the right field.

Speaker 3 (02:44):
How would you utilize this technology for it to be
a force multiplier of the work that you do? How
can you help it ten-x, two-x, whatever the
number is, your productivity, so that you can do
really creative, interesting things. And I think if society prepares
for it, if companies prepare for it, it doesn't have

Speaker 4 (03:01):
To be a massive upheaval.

Speaker 5 (03:04):
Again, I hope he's right. I just don't know.
It's not like, you know, "accept the
fact that the car is coming; learn how to fix
a car because nobody's gonna need horseshoes anymore." Right?
That made sense, makes sense. I'm not sure if
it works that way. The, you know, the people
on the extreme end of all the jobs AI can take,

(03:27):
If they're right, there are gonna be so many millions
of people out of work. There's no, like, "well,
I guess if you're young, pick a career that AI
can enhance." Well, there's gonna be a lot of
guessing involved in that. Let me set the table a
little bit, then we can discuss. So, Mister Amodei,
and again, forgive me if I'm mispronouncing it, the CEO

(03:48):
of Anthropic, talking to Jim VandeHei, has a blunt,
scary warning for the US government and all of us:
AI could wipe out half of all entry-level white-collar
jobs and spike unemployment ten to twenty percent
in the next one to five years. Five percent of
jobs wiped out? Interesting. No, wait, what did you say?

Speaker 1 (04:08):
Half. Ten... half of all entry-level white-collar jobs,
and ten to twenty percent unemployment in general, in the
next one to five years. Amodei said AI companies
and government need to stop sugarcoating what's coming. This is
Doctor Frankenstein, I'm picturing, standing there on the drawbridge of

(04:28):
his moat saying to the villagers with the pitchforks and
the torches: "Brace, guys, you gotta be ready in case
the monster goes on a rampage."

Speaker 5 (04:39):
Well, the first clip we played from Jim VandeHei makes
the most... well, it's not hard to explain... makes the
most sense to me. How is this such a giant
topic everywhere but in Washington, D.C.? I wonder if
it has anything to do with the average person in
any leadership position.

Speaker 2 (04:55):
Being eighty years old.

Speaker 4 (04:57):
I think so.

Speaker 1 (04:58):
And also there are so many unknowns it's difficult to
know what to do legislatively, particularly if, for instance, I
don't know, your legislature is utterly cowardly and afraid of
doing anything lest they do the wrong thing, so they
just let the executive branch do everything. But let me
finish the thought. Amodei said AI companies and government need

(05:19):
to stop sugarcoating what's coming. I'm sorry, what's coming? The
possible mass elimination of jobs across technology, finance, law, consulting,
and other white-collar professions, especially entry-level gigs. I
got a kid who just kicked butt in her first year of
law school. I'm very proud of her. But we're gonna
have a serious talk, and the practical part of me doesn't

(05:41):
want to say this on the air because I want
as few people to come to this realization as possible.
We're gonna have a serious talk about there's a very
good chance you're either going to be the person eliminated
by AI or you're going to be the person who's
become so adept at using it, you're the one out
of ten that is kept on to get good at

(06:03):
this stuff.

Speaker 5 (06:04):
That's exactly what VandeHei was saying. Accept the fact
that it's going to be here, and figure out a
way to.

Speaker 2 (06:12):
Use AI. So you're one of the people that can
continue to make a living.

Speaker 1 (06:17):
Yeah, so, Amodei says he's speaking out in hopes of
jarring government and fellow AI companies into preparing and protecting
the nation. Hardly anyone is paying attention. Lawmakers don't get
it or don't believe it. CEOs are afraid to talk
about it. Many workers won't realize the risks posed by
the possible job apocalypse until after it hits. And by

(06:40):
the way, if you're a newish listener to the show,
thank you for being here. First of all. Secondly, we're
not clickbait hyperbole. We try very hard to stay away
from trying to scare you all the time or making
you angry all the time. There are plenty of reasons
to be scared and or angry. We just try not
to go over the top. I don't think it's hyperbole

(07:01):
at all to, you know, have the weight on the
balls of your feet and your feet at shoulder width,
leaning forward at the waist, getting ready, because this could
be very, very disruptive.

Speaker 5 (07:12):
Well, I've read several of the leading books on
the topic: Life three point zero by Max Tegmark, which
is fantastic, and then a newer one from Nick Bostrom,
who wrote an AI book that got tons of attention
several years ago. But this Deep Utopia book he really
gets into. If AI does what some people claim

(07:34):
it's going to do, what the hell is mankind going
to do? What is society going to be? What happens to humanity?
Even if you come up with a way for
people to have food and shelter, what do human beings
become if they don't have to work? It's
an experiment that's never been run before, really, other than

(07:57):
like the super rich.

Speaker 2 (07:58):
And you see how they often end up.

Speaker 1 (08:01):
Give us the ten-second version of your working on
the song yesterday. After this commercial or right now? No,
right now? Okay. So I'm not ordering you to do anything,
but I think it would fit in well. And then
coming up after the commercial, Jack referenced the "hey, don't worry,
it's gonna be okay" argument.

Speaker 5 (08:18):
So I was working on this song yesterday. I'd come
up with these lyrics in my head, and then I
picked out a key I could sing in, some chord
structure and stuff like that. The working title is Joe's
an A-Hole. But it's very tuneful, very nice.

Speaker 2 (08:31):
I'd like to hear that song. But I thought, you know,
I need a bridge for this song.

Speaker 5 (08:38):
And I almost feel guilty for even having done this,
but I thought it'd be kind of fun to see
what it did. I went to ChatGPT and I said,
what's a good bridge for like a country style song?
And it presented me with, like, five different options immediately,
by the way of if you want something that's kind
of uplifting, go with this. If you want something that

(09:01):
is good for setting up a more poignant ending, go
with this. And it just gave me the chords,
gave me all the chord structures for doing it:
this is the most popular currently in, like, country pop.
And it was... I mean, you wouldn't even know you
didn't know anything about

Speaker 1 (09:18):
music. To craft a song that way... so instead of
experimenting and struggling and listening to it and trying to
figure out how do I get back to the verse
chords and that's a little uncomfortable, And then finally, after
you know, minutes or hours of effort coming up with
something you're proud of. No, the computer just told you
what to do. How does that change songwriting? I've got

(09:38):
to admit I heard that and thought, wow, that's a cool tool.
But I also thought, what the hell's the point? Right?

Speaker 5 (09:43):
Yeah, that's what I felt like too. I haven't used
one of them because I thought, do I want that?
Do I want it to not be something? I came
up with?

Speaker 4 (09:51):
What am I doing here?

Speaker 1 (09:52):
Right right?

Speaker 5 (09:53):
Why don't I just go to the AI: write a
song about this topic, and then, okay, now produce it?

Speaker 2 (09:58):
Okay, now I'll listen to it. Well, that was fun.
What am I doing coming up?

Speaker 1 (10:04):
The "it'll be fine" argument, so stay tuned for that after
a word from our friends at Trust and Will. If you're not
familiar with the need for a trust and will, I
hate to say it, but someday you will not be
here, and your assets will go to the people you
care about. And if you don't have a trust and/or
will, it's going to go through a long, tortuous
government bureaucracy process known as probate. Don't do this, friends,

(10:29):
especially because Trust and Will makes it so easy and
affordable to get your act together.

Speaker 5 (10:33):
Yeah, you die unexpectedly today, like sometimes happens, all kinds
of never ending legal battles in the state deciding what happens.

Speaker 2 (10:42):
Or you can come up with this Trust.

Speaker 5 (10:43):
and Will with this website, which is fantastic. It starts at about
one hundred ninety nine dollars to create and manage a custom
estate plan. You can manage your trust or will
online, and this is all legally based on wherever you
live, with their easy-to-use website. And by the way,
you're not on your own: live customer support via chat, phone,

(11:04):
or email to get this all put together. Save your
family the heartbreak. Sometimes this stuff tears families apart
after you're gone.

Speaker 1 (11:09):
Don't do that. Trust and Will has made estate
planning accessible and affordable. It's really quite amazing. Secure your assets,
protect your loved ones with Trust and Will. Get twenty
percent off on your estate plan documents by visiting trustinwill
dot com slash armstrong. That's Trust and Will dot com
slash armstrong.

Speaker 5 (11:28):
So we'll have to get to the optimistic side of
the AI thing next. But just on the music thing,
as you mentioned it...

Speaker 2 (11:36):
What is the point, art-wise?

Speaker 5 (11:38):
Certainly, Hemingway famously said a writer writes for himself and others.
I mean, but you're doing it a lot for yourself.
And the reason I sat down to craft a song yesterday,
which I hadn't done in a long time, is I
was in emotional turmoil about a certain thing, and all
these thoughts were in my head and I just was
compelled to write a song about it. Well, if I

(12:00):
have AI do it, doesn't that negate the whole artistic
you know, getting your feelings out of you onto paper
or music or painting or whatever.

Speaker 1 (12:11):
Well, right, to summarize again, what are we doing here?

Speaker 2 (12:16):
Exactly?

Speaker 5 (12:17):
So if I was gonna have them write the bridge,
well write the whole damn thing, write the lyrics, produce it,
you know, tell me in five minutes what to listen to.

Speaker 1 (12:24):
And I'll listen to it in five seconds.

Speaker 2 (12:26):
Five seconds.

Speaker 5 (12:27):
Then I'll listen to it and think, well, now I
feel like I've expunged my demons.

Speaker 2 (12:31):
I'll get to go about my life.

Speaker 1 (12:33):
What well?

Speaker 4 (12:34):
Right?

Speaker 1 (12:35):
And so the great conundrum is: okay, AI is going
to free us all up to become poets and songwriters,
which the AI...

Speaker 5 (12:42):
Will do. So, right, excellent point. And then there's also
the what if it's way better than what I would
have come up with, which it probably would be.

Speaker 1 (12:51):
I'd say that's immaterial.

Speaker 2 (12:52):
To me, Oh it is, but who wants that?

Speaker 1 (12:57):
It's very discouraging to worry about it. The positive,
optimistic argument is coming up in moments. Stay with us, friends.

Speaker 5 (13:14):
We're talking about AI. Both Axios and The New Yorker
now have big pieces on: is this going to change
the world in ways we can't even imagine,
maybe for the worse, or is it not going to
be that big a deal? I certainly don't know which.
I'm hoping for the latter.

Speaker 1 (13:32):
Actually, well, and the guys who are designing all this
stuff don't know either, which is, you know, harking back
to my Doctor Frankenstein metaphor. But I will say that
admitting the monster might rampage through the village and kill everybody...
it's impossible that... well, it's not impossible.

Speaker 5 (13:49):
But obviously Google, Elon, all kinds of people think it's
going to be a big deal.

Speaker 2 (13:54):
Bill Gates.

Speaker 5 (13:55):
They're spending hundreds of millions of dollars, billions of dollars,
on this because they think it's going to be that big
a deal.

Speaker 1 (14:03):
Rich Lowry in the National Review has expressed the counterargument
that many of you fine folks have expressed through
the years, and it's certainly worth considering. Rich writes: ChatGPT
is coming for your job. That's the fear about the
rapid advances in artificial intelligence. He mentions the headline
in Axios that warned of a "white-collar bloodbath." The CEO of

(14:26):
Anthropic we've been quoting in the last segment told the
publication AI could destroy up to half, well, destroy half
of all entry-level white-collar jobs in the next
one to five years, that's like right around the corner,
and drive the unemployment rate up to ten to twenty percent,
or roughly Great Depression levels. This sounds dire, but we've
been here before. In the thirties, John Maynard Keynes thought

(14:46):
that labor-saving devices were, quote, "outrunning the pace
at which we can find new uses for labor." And
people thought the same thing in the sixties, when John F.
Kennedy warned that, quote, "the automation problem is as important
as any we face," and in our era too. If
a prediction has been consistently wrong, it doesn't necessarily mean
that it will forever be wrong. Still, we shouldn't have

(15:08):
much confidence in the same alarmism repeated for the same reasons.
Then he gives a few examples to push back.

Speaker 5 (15:16):
A lot of people have, and I hope it absolutely
holds true for AI. It has been true in the past,
but God, AI could reach into so many different areas
of life at the same time.

Speaker 2 (15:29):
A little more?

Speaker 1 (15:30):
Rich's argument, and then perhaps we'll counter it. If technological
advance was really a net killer of jobs, the labor
market should have been in decline since the invention of
the wheel. Instead, we live in a time of technological marvels,
and the unemployment rate is four point two percent. Rob
Atkinson of the Information Technology and Innovation Foundation
out that the average unemployment rate in the US has

(15:50):
not changed much over the last century, despite an increase
in productivity by almost ten times. Technology increases productivity, driving
down costs and making it possible to invest and spend
on other things, creating new jobs that replace the old.
This is the process of society becoming wealthier, and it's
why nations that innovate are better off than those who don't.

(16:13):
Good powerful argument. The rise of personal computers collapsed the
demand for typists and word processors. These positions were often
held by women. Did this decimate the economic prospects of
women in America? No, they got different and frequently better jobs.
It goes into some bookkeeping and accounting parallels. It's a
good solid argument. The only thing I would throw at

(16:36):
Rich is, and we've seen this in a lot of
different aspects of the modern age, there are two questions,
two challenges: the amount of change and the pace of change.
Right? And it could be that, given the rapidity with which,

(16:57):
it'll drown us all or beat us to death like
Frankenstein's monster in my metaphor. It's gonna be so fast
and so huge, the jarring adjustment period is going to
be very ugly and volatile.

Speaker 5 (17:13):
Yeah, we're being pretty loose and fast with
the term AI also. It's all about AGI, artificial
general intelligence, whether that happens or not. And I've read
enough of these books to know there are people that
think it will never happen. There are people that think
we're twenty years away. There are people that think we're
five years away. But if artificial general intelligence happens, where

(17:34):
computers are as smart as human beings, smarter than human beings,
can learn everything instantly, then it's impossible: anything that you
come up with that human beings can do, anything that
comes off of the technological advancement, can also

Speaker 2 (17:51):
be done by artificial general intelligence.

Speaker 1 (17:53):
Yeah, that's too much for a person like me to
even contemplate. I can't. I'm stuck in the halfway-between
period where, you know, eighty percent of lawyers will
be replaced by an app. That's going to be dislocating
enough. Your thing? Please, I don't even want to think.

Speaker 5 (18:11):
About AGI happening? If AGI happens, I can't imagine how
that doesn't doom society.

Speaker 2 (18:15):
But anyway, who knows.

Speaker 1 (18:18):
Armstrong and Getty. So hey, before we get started, you
ought to talk about what we were just talking about
off the air. On the air, to whatever extent you can,
you're good at that. I just... at some point... the
theme being: people are way crazier than you were told
(18:40):
as a kid. As a kid, you think, you know,
grown ups have it more or less buttoned up, and
they can at least perceive reality.

Speaker 5 (18:47):
A lot of people cannot. They live in a world
of delusion. Especially. I never thought when I was younger
that rich, smart people could be this dumb.

Speaker 1 (18:59):
Right. Okay, well... good. You have a particular skill
set that makes you a lot of money. That doesn't
mean you have generalized intelligence or wisdom.

Speaker 2 (19:06):
I will tell that story, uh in a little bit.

Speaker 5 (19:09):
But first, executive producer Mike Hanson, who rarely comes on
the air, why are you here?

Speaker 2 (19:13):
Hanson? I was inspired really by our AI conversation.

Speaker 6 (19:18):
Yeah, I mean, you guys were talking about trying to
form up a song, and, I mean, I've been picking
up my guitar more lately as well. I don't really
concern myself with a lot of, you know, song structure
and the bridge; I just kind of fiddle around.

Speaker 2 (19:29):
But I bang around. I make a lot of a
lot of noise. But I was inspired.

Speaker 6 (19:34):
So I turned to my AI tool, thinking I should craft
a tune this morning. So I quickly jotted down
some lyrics, and I wrote all the lyrics on this.
I didn't ask it to help me. I wrote them
in about twenty seconds.

Speaker 1 (19:50):
Way to put in the time.

Speaker 6 (19:52):
Well, and a lot of great songs inspired. I was inspired.
I mean, come on, don't tell me about my art.

Speaker 4 (19:58):
This was my art.

Speaker 2 (19:59):
So anyway, he came up with a song.

Speaker 7 (20:10):
He gets up every morning, he's dedicated to his craft,
his hallmark is fairness, and everyone loves to hear him.

Speaker 5 (20:26):
Leader.

Speaker 7 (20:28):
He's a master of metaphor and a wonderful wordsmith.

Speaker 2 (20:35):
Tell me why not.

Speaker 1 (20:37):
He's got kind of a but don't ask him.

Speaker 7 (20:39):
to work too hard, 'cause of leisure

Speaker 2 (20:44):
He's the king.

Speaker 4 (20:49):
Jokey Joe Getty. That was awesome.

Speaker 2 (20:56):
This is my kind of music, dude. Joe Getty's got his
phone up with the light on, swaying back and forth.

Speaker 5 (21:10):
This is the sort of song that was on he
Haw or the Glenn Campbell Show every Saturday night when
I was a kid.

Speaker 1 (21:18):
Or if you're an alt-country nineties guy like me, it
was just revved up a little bit with the
electric guitars, and yeah, I love that music. "Of leisure,
the king." I am a man who craves leisure, yes,
as I've made clear through the years.

Speaker 2 (21:35):
Oh my god. The fact that you can do that
and like, how long did it take you total? I'm
telling you it was twenty seconds.

Speaker 1 (21:44):
That is amazing, amusing and horrifying.

Speaker 5 (21:48):
Yeah, there are people that used to make six-figure
salaries, back when that really meant something, doing that for
radio shows back in the day.

Speaker 1 (21:59):
Oh never mind, the just doing that right.

Speaker 2 (22:02):
Yeah, so I gotta put together a...

Speaker 1 (22:04):
Rush Limbaugh, you know, had the famous song parodies and stuff
everybody enjoyed. Yeah, go ahead.

Speaker 6 (22:09):
And I've got to put together a Jack song now too. I've got some ideas.
If something comes to the top of your mind right now,
just spit it out, and I'll try to incorporate
that into my belligerents.

Speaker 1 (22:22):
Maybe we can, maybe we can take suggestions from listeners.

Speaker 2 (22:26):
I like that.

Speaker 5 (22:26):
Let's open the text line. In fact, let's take live calls. Okay,
so you work on that and we'll get to that
maybe in hour four. So back to my story.
I was looking at the Twitter feed and the TV
and everything like that, and I expressed to Joe during
the commercial break: I don't quite get the hatred for

(22:48):
Elon Musk, looking at a friend of ours
and his tweet.

Speaker 2 (22:52):
I don't get the hatred for Elon Musk.

Speaker 5 (22:54):
I can fully understand, like if you think Doge didn't
do enough, or you thought these cuts were.

Speaker 2 (23:00):
Wrong or whatever it is.

Speaker 5 (23:01):
But I don't get the, like, you know, deep hatred
for him or whatever. But I do know a couple
that's leaving the country, the whole family, there are children involved.
They're leaving the country because they don't want to live
in Trump and Elon's America.

Speaker 2 (23:22):
And what.

Speaker 5 (23:24):
The straw that broke the camel's back was Elon's Nazi
salute at that convention, or whatever he did, whatever
he said.

Speaker 1 (23:32):
When the autistic fellow waved at the crowd and his
arm was out like that for a fraction of a second.
Yes. They have believed and internalized that that
was an actual Nazi salute, and Nazism is coming to
America.

Speaker 5 (23:45):
I didn't have the first-hand conversation with him, but
I've talked to the person that did, and yes, they
can't believe that you don't realize that was obviously a
Nazi salute.

Speaker 1 (23:58):
I would like to hear more. And therefore,
Kristallnacht is coming? It's the left that's abusing Jews.
But anyway, back to you.

Speaker 5 (24:08):
Right, I think that's so dumb. I can't even... it's
hard to even engage in conversation. This is a very
wealthy couple, by the way, incredibly successful, both of them,
like really successful, and I don't...

Speaker 4 (24:26):
What do you do with that? How could you possibly think that?

Speaker 1 (24:31):
I would like to spend the rest of my life
studying in between long periods of leisure, obviously, but studying
the relationship and lack of relationship between intelligence, skill at
a particular skill set, and the concept of wisdom, because

(24:53):
they can be completely unrelated. I mean, we have all known...
I mean, the classic example, I guess, would be the brilliant,
brilliant musician who is just unspeakably talented

Speaker 2 (25:05):
But these people are.

Speaker 1 (25:06):
Replaced by AI. But cannot manage their life at all
and dies of drug addiction and is miserable.

Speaker 5 (25:11):
And I understand anybody in the world of art or whatever
being that way, as a musician or whatever.

Speaker 2 (25:16):
It's so different.

Speaker 5 (25:17):
But these are like business people, one scientist adjacent, the
other one just flat out business person, like, like.

Speaker 2 (25:27):
So successful, and I just I can't.

Speaker 5 (25:30):
So you think Elon is secretly a Nazi and slipped,
or you think that was the signal to start something,
or... I can't even imagine what the scenario
would be.

Speaker 1 (25:43):
Yeah, yeah, I find this so interesting.

Speaker 2 (25:48):
And you would pick up your family and move across
the ocean.

Speaker 4 (25:51):
Because of it?

Speaker 1 (25:53):
Yeah, yeah, nuts.

Speaker 4 (25:55):
You're nuts.

Speaker 1 (25:58):
On the other hand, you told me where they're going,
and it's a pleasant place. Yeah, yeah. That's... again, I
have a million comments. I just... I'm so blown away.
And it takes money to indulge that sort of move.
I mean, that's part of it, because Joe Schmoe, you know,

(26:18):
struggling to get by and his wife and two little
kids might want desperately to go to Uruguay, for instance,
where I've been convinced if I ever go somewhere, that's
where I'm going. You're sure? No, I'm... I'm like
sixty percent sure. But I was hearing about how interesting
a country that is, and a little more American in terms

(26:42):
of freedom and that sort of thing than sometimes America.
But you can only indulge that impulse if you've got
a hell of a lot of money or you're willing
to live extremely modestly. So anyway, that's why celebrities are
always doing it or talking about doing it, because they can. Well.

Speaker 5 (26:56):
Well, this family, I assume, is surrounded by people that
say "I understand," or "that's a good idea," or "we're
going to do..."

Speaker 2 (27:04):
That too, which is so crazy.

Speaker 1 (27:08):
Yeah, I am so fascinated by the idea of getting
into, of understanding how other people's psyches work, which, you know,
you never can fully But the sort of person who
processes all of the inputs of life, everything they see,
they hear, they perceive through an emotional lens, and then

(27:28):
they structure it, quote unquote, logically to make it sound
like it makes sense. But it's all emotionalism, and you
can't talk them out of it. The classic example is:
"I don't care what you say, that's what I believe."
You're stating... wait a minute, I have presented rock-solid,

(27:49):
irrefutable evidence of your conclusion being wrong, but you are
shouting at me that you have no interest in it.
That sort of person's psyche operates way differently than mine,
for instance, and I think my way is better, but
they probably think exactly the same thing. I just don't
want them in charge of anything. So I've said a

(28:10):
million times I love songwriters and poets and dreamers. We
would be lost without them. I don't want them in charge.

Speaker 5 (28:19):
I was wondering the other day if anybody has recently
set the record for doing backflips while on fire. Good
news on that front, and maybe that song about me
in hour four, or soon?

Speaker 1 (28:31):
And what else? Plus coming up, let's meet some of
the young Columbia radicals. They were in court the other day.

Speaker 5 (28:38):
Oh boy, I have a guess.

Speaker 2 (28:50):
I was freezing.

Speaker 8 (28:51):
I am wearing a couple of layers that are soaked
in a gel I put in a fridge for twenty-four
hours, so it helps protect me from the fire,
as I'm on fire. So we're seven seconds in. I've
done two backflips, so I'm on a good trot to beat
the previous record, and I need to move quickly. The
oxygen's burning up around me. If I don't keep moving,
I could find it very difficult to breathe or find

(29:11):
the heat a bit too overwhelming. There are parts that
are actually falling off what I'm wearing and burning. My
successful record title was seven backflips in thirty seconds
while engaging in a full-body burn. It's incredible what
humans can achieve.

Speaker 5 (29:24):
All right, So there's a British man who has set
the new record for doing backflips while on fire.

Speaker 1 (29:31):
Sounded Irish to me. But back to you. All right, in other backflipping-on-fire news.
another backflipping on fire news.

Speaker 2 (29:37):
Wait a minute, there isn't any... Well.

Speaker 5 (29:39):
What's the record for doing backflips while on fire if
you have red hair? So, I don't know if
you follow this.

Speaker 1 (29:50):
A group of keffiyeh-clad students descended on
Manhattan Criminal Court on Wednesday to face formal arraignment
for their roles in the violent takeover of Columbia University's
Butler Library. You remember the video at the time,
the damage, the injuries. They're demanding that the charges
be dismissed because they're peace activists and they set up

(30:13):
a mere teach-in in the library. You may also recall
that the mob injured two security officers, damaged bookshelves,
distributed pamphlets praising Hamas, and renamed the library after a terrorist.
They were led by Columbia University Apartheid Divest, a
notorious anti-Semitic group. It wants the Ivy League school
to cut all ties with Israel and chants "We want

(30:34):
divestment now," among other things like "Globalize the Intifada,"
which means kill Jews wherever you find them. So who
are these cream of the crop
of American youth who were in court yesterday?

Speaker 5 (30:46):
"Globalize the Intifada" is exactly what that guy chanted after
he killed those two nice people who worked at the embassy.

Speaker 1 (30:55):
One of whom was a Christian, by the way. Yes, yes. "Globalize
the Intifada," indeed. So who are these brave, angry revolutionaries
rising up from the gutter to challenge the man, Jack?
Let's meet some of them, like Emma Biswus, who grew
up in a life of luxury. She led the robotics
team at the Harker School, which bills itself as one

(31:16):
of the nation's top college prep schools, where tuition can
reach sixty-five thousand dollars a year. Wow, that's right,
for a high school. While there, she interned with a
biotech company that boasts "prospective Nobel Prize winners work here,"
and until she left for tony Barnard College, Biswus lived
in the San Francisco Bay Area in a mansion

Speaker 5 (31:34):
Worth six million dollars. Moving along, That's amazing how often
that's the case.

Speaker 1 (31:41):
Interestingly, her dad is with a tech company that does
a lot of defense work, including with Israel. But let's
not get hung up on that.

Speaker 5 (31:48):
The rich kids that don't have a purpose because everything has been handed to them, they need to achieve something. They fall into these ideologies so easily.

Speaker 1 (31:58):
But Miss Biswus was just one of eight Barnard and Columbia students who attended swanky, prestigious private schools and were charged with criminal trespassing, among other things. Then there was Barnard student Luna Firefly Deerfield Cumming Shaw.

Speaker 2 (32:11):
How many names is enough?

Speaker 1 (32:13):
Sweetheart? Wow. Luna Firefly Deerfield Cumming Shaw, the granddaughter of Mark Shaw, who described himself as John F. Kennedy's unofficial family photographer and snapped many, many celebrities. Her grandmother is an actress best known for a role in some show. Anyway, Cumming Shaw attended the Putney School in Vermont,

(32:35):
which charges day students fifty thousand dollars a year and boarding students over eighty thousand. Despite the price, it advertises itself as a progressive school that considers inclusivity a fundamental principle of the school. It sports two committees dedicated to the cause, allowing it to remain in the forefront

(32:56):
of the drive for racial justice. Her mother is a longtime left-wing activist, Barnard alum, and Bernie Sanders supporter. Before Marisol Rojas Cheetham was arrested, the Columbia pre-law student grew up in a Berkeley, California home valued at two million dollars, which, honestly, in Berkeley isn't that fancy, and attended the Bentley School, where tuition reaches nearly sixty

(33:18):
thousand dollars a year. According to her LinkedIn page, Rojas Cheetham was the captain of the varsity women's soccer team, played lacrosse, and was president of the student government, to name a few of her extracurriculars on Bentley's sprawling twelve-acre upper school campus.

Speaker 5 (33:33):
Similar to Patty Hearst, if you're old enough to remember. Oh yeah.

Speaker 2 (33:38):
There is something to that.

Speaker 5 (33:39):
The whole, your life has been, you know, the pathway has been paved for you.

Speaker 2 (33:45):
It's going to be too easy.

Speaker 5 (33:46):
You've got no feeling of accomplishment or making a difference in the world, because, you know, you could go to the fancy school and then immediately show up at a job making a lot of money. It was handed to you, and you just don't feel like you've done anything, so you fall for these ideologies. I think there's got to be something to that. Osama bin Laden was that.

Speaker 1 (34:05):
Yeah, yeah. A couple more, just to make the point. Columbia grad student Ava ambrose Ta Muscula e Garcia, a seventy-eight-thousand-dollar boarding school; her parents, both professors at the University of Notre Dame, are accomplished in their fields as a novelist and an artist. Sophia Elizabeth Jones spent eleven years living overseas at the American Community School in Abu Dhabi.

(34:29):
An honors student who identifies as American with Indonesian and Canadian roots, she's in Barnard's class of twenty twenty-seven and writes frequently about Palestinian issues for a Columbia student website. Then you've got Barnard student Dima Abat cozm, an honored guest at Mayor Eric Adams' Abate Hate Summit last July, a graduate of the

(34:51):
prestigious MacDuffie School in Massachusetts, where annual boarding tuition exceeds seventy-five thousand dollars; another multimillion-dollar mansion. All of these very rich young women, yeah, and completely radicalized.

Speaker 5 (35:07):
The percentage of these angry, keffiyeh-wearing, hate-spewing nut jobs who are wealthy young women is astounding.

Speaker 1 (35:19):
They have no purpose in their life.

Speaker 5 (35:22):
Well, and I'm sure that, you know, the teachers are teaching them all to hate America and Israel and everything. Also, if it's a boarding school, then you're really surrounded. I mean, you're immersed in it.

Speaker 1 (35:34):
And then you add to that young women's tendency to really crave belonging and approval of their peers, and if approval of their peers is dependent upon adopting this radical ideology, you get it in spades. It really is amazing. Look at the pictures of these very, very rich girls trying to achieve a boho radical look and the rest

(35:56):
of it. Old as time.

Speaker 5 (36:00):
Wow, you spend all that money sending your kid to
school and they end up nut jobs.

Speaker 1 (36:05):
Yeah, yeah. Well, we have a blockbuster hour four coming up. If you don't get hour four, or you've got to
go off and do something, that's fine, just grab it
later via podcast. You should subscribe to Armstrong and Getty
on demand and listen to it whenever you like.

Speaker 5 (36:20):
Yeah, that's pretty cool, at your own leisure. You're a man who craves leisure, as the song says. Yes, we've got a big hour four coming. I hope you can enjoy it.

Speaker 7 (36:30):
Armstrong and Getty