Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Broadcasting live from the Abraham Lincoln Radio Studio, the George
Washington Broadcast Center, Jack Armstrong and Joe Getty.
Speaker 2 (00:10):
Armstrong and Getty, and now, he's Armstrong and Getty.
Speaker 3 (00:23):
Thirty-six air traffic control facilities have reported staffing problems.
That includes the towers at Chicago O'Hare, Dallas Fort Worth,
Phoenix Sky Harbor, huge hubs for American Airlines. You can
also add the towers at San Francisco and LAX to
the list. This is all on the heels of the
worst weekend for air traffic control staffing since the shutdown began.
Speaker 1 (00:43):
Yeah, we're at day forty-two of the shutdown. It's going
to end, but it hasn't officially ended. So we're
in day forty-two and the flight problem is still big.
Three thousand cancellations on Sunday, twenty-five hundred yesterday, and
off to a bad start today. And then how long
this is gonna last, I still don't quite get.
(01:03):
Lots of air traffic controllers retired. Yeah, you had a
life plan and then you altered it because of
the shutdown.
Speaker 2 (01:13):
You retired early? That I don't get. I mean,
even a year early because of this?
Speaker 4 (01:18):
You know, messed up. That'd be odd, but it happened,
I guess, apparently.
Speaker 1 (01:22):
Anyway, I'm looking it up and, oh boy, there are
twelve hundred cancellations so far today. I don't know when
you're gonna be listening to this, but this is gonna
be a problem for a while. Sure hope they get
it straightened out before the biggest traveling week of the year,
Thanksgiving obviously, which is coming up kind of soon. So yeah, yeah,
another big story around sports.
Speaker 2 (01:45):
I didn't realize the Winter Olympics are.
Speaker 1 (01:47):
Twelve weeks away. We're getting close. Yeah. And they're somewhere,
somewhere snowy. Is it France?
Speaker 2 (01:54):
Where are the Winter Olympics?
Speaker 1 (01:55):
It'd be a good idea to look that up for us. Will you, Katie,
where are the dang Winter Olympics? I bet it's Italy, I think,
or was that last time? That doesn't matter.
Speaker 2 (02:03):
Someplace snowy would be a good bet, isn't it?
Speaker 1 (02:05):
Yeah, probably not Jamaica. Milano Cortina? I win, congratulations. So
the Winter Olympics are coming up in Italy. They ain't gonna
do the whole trans thing this time around. Here's a
little bit of that.
Speaker 3 (02:18):
This sounds like a really big change potentially, Nick.
Speaker 5 (02:21):
This would be big, and it would be a huge
policy shift for the IOC, which until now has
let each sport's international governing body set their own policies
centered on transgender inclusion. It would also mark a big
change from the IOC's twenty twenty-one framework, which said
there should not be a presumption of advantage when it
(02:41):
comes to trans athletes.
Speaker 1 (02:43):
So they were running scared last Olympics, like lots of
people were. They didn't want to be on the wrong side
of this, and said some things that they knew were
wrong and crazy, like you just heard, and now they've
changed their mind.
Speaker 2 (02:55):
Here's one of the spokespeople for the Olympics.
Speaker 6 (02:59):
What I would like for the IOC
to do is to bring everyone together to try and
find a consensus amongst all of us that we can
all get behind and that we can implement, and above
anything and everything else, it's fair and protects the female categories.
Speaker 1 (03:15):
They're all about protecting the female category. She's new,
and she got elected on the idea of, I'm going
to protect women's sports, which is code for, we ain't gonna
let dudes
Speaker 2 (03:26):
Participate in our women's sports.
Speaker 1 (03:28):
Well, of course. If you let men into
women's sports, then it's not women's sports, says virtually all
of humanity that got bullied into silence. You know, in
most cases, all these conversations that are going on are
how do we handle this without being in trouble politically,
as opposed to what's the right thing to do. You
might have a couple of true believers, you know, at
(03:49):
your high school in California or something like that, but
mostly it's how do we handle this without causing a firestorm.
And here's the reasoning behind it all.
Speaker 5 (04:00):
The IOC will issue the ban sometime early next year, citing a
new scientific review that found evidence men have a permanent
physical advantage over women athletes even after hormone therapy. However,
the Guardian newspaper says the ban could still be a
year out, and that the IOC is facing pushback to
a possible ban on athletes who were reported female at birth
(04:21):
but have male chromosomes and the same testosterone level as men,
also known as differences in sexual development. That would include
athletes like South Africa's Caster Semenya, who won gold at
the London and Rio Games before track and field's governing
body, World Athletics, banned DSD athletes from competing as women
in twenty twenty.
Speaker 1 (04:42):
Three. DSD, different sexual characteristics or whatever. Yeah, yeah, yeah,
I like this new study that came out that said
men have an advantage over women.
Speaker 2 (04:54):
That's a good scientific study. I wonder who did that.
How much did they spend?
Speaker 4 (04:58):
Tell me about the methodology. It's hilarious that they had to
hide behind a new study, good lord, instead of ancient wisdom.
Speaker 2 (05:05):
You know, it strikes me.
Speaker 1 (05:08):
What you've learned since you were a little kid on the playground
is, going forward, the new scientific study.
Speaker 4 (05:15):
So I was reading about the women's soccer league. They've
had a controversy lately, and it's all centered around this
one player who is much bigger and more muscular than all
of the girls on the field. Looks like a dude,
plays like a dude, built like a dude, et cetera.
It's a similar case to Caster Semenya, who they mentioned,
(05:35):
who has internal testicles.
Speaker 2 (05:37):
It was one of the.
Speaker 1 (05:40):
That'd be awesome. Oh, I like having testicles. I just
don't want them in the way.
Speaker 2 (05:45):
I see, Thanks for clarifying.
Speaker 4 (05:46):
So, but that's one of the extremely rare cases where
sex assigned at birth is a phrase that makes any
sense, because these are people with both sets of genitals.
Meaning... no, of course not, no. So she has both
ovaries and testicles and has much, much higher testosterone,
like that Turkish fella who was whipping up
(06:09):
on the girls in the boxing last time around too.
Speaker 2 (06:12):
But you know, and it's funny.
Speaker 1 (06:14):
How so? So what does the left, what do you think
we should do with people who have testicles and ovaries?
Speaker 4 (06:20):
Oh, it's a shame, but they can't compete in women's sports.
And that's the point I was about to make. The
left, it's funny, they just are crazy about individual rights,
except when it comes to individual rights that conservatives like;
then they have no interest in them.
Speaker 2 (06:32):
It's all about the community.
Speaker 4 (06:34):
But yeah, I feel terrible for those people if they
want to be athletes, but they can either compete as
men or in an open category. You can't beat the crap
out of women because you are, functionally, athletically speaking, a man.
Speaker 2 (06:45):
According to a new study, they're advantaged by being a man.
Speaker 4 (06:49):
Oh, speaking of this sort of thing, I just finished
reading the piece by Colin Wright, who's a terrific writer.
Speaker 2 (06:56):
He writes about this sort of thing.
Speaker 4 (06:58):
He was an academic scientist at Penn State in twenty twenty
when there was that crazy explosion in adolescent transgenderism among
young girls, and he commented two words: social contagion. Within hours,
his colleagues denounced him as a transphobic bigot, and the
online mob came for him and crushed his
(07:18):
academic career. And he talks about how he was referring
to research published by a scientist who had coined the
term rapid onset gender dysphoria in twenty eighteen, the
peer-reviewed paper, blah blah blah, and that the pattern
was clearly explained by social contagion, the spread of ideas
or behaviors through peer influence. And there are other examples,
(07:40):
whether it's cutting or anorexia or whatever. Teen girls are
just incredibly prone to that sort of thing.
Speaker 2 (07:46):
But then he gets to.
Speaker 4 (07:49):
The fact that the left-wing dogma that gender identity
is innate and immutable, that people are born transgender. It's
not that they're convinced by a trend or whatever; it's
that they're born that way. And that claim underpins the medical practice and
the legal strategy: puberty blockers, cross-sex hormones, mutilations of kids, minors,
(08:14):
and the rest of it, and the civil rights argument.
So then he makes the point that the dominant counterargument
to the social contagion theory is
that the sharp rise in transgender identification over the past
decades simply reflects liberation, right? People are more comfortable expressing
their authentic selves. That has been the argument. As transgender
(08:39):
activist and biologist Julia Serano put it in twenty seventeen,
there really wasn't a rise in left-handedness so much
as there was a rise in left-handed acceptance. That's
an interesting premise, isn't it? People were free to be
who the f they were, as John Oliver put it
on his Last Week Tonight. But then Colin points out,
(09:00):
if transgender identity were an innate trait like left-handedness,
we would expect identification rates to rise at first when
it became socially acceptable, then plateau and remain stable at
a fixed level. If the phenomenon were instead driven by
social contagion, we might expect a boom-and-bust pattern,
a spike followed by a rapid decline once the social
(09:22):
forces driving it were weakened. And indeed that has become
incredibly clear. Transgender identification has fallen fifty percent in two
years among college students and adolescents, per a couple of different studies.
It was clearly a social contagion, and people got
(09:45):
their careers ruined for saying it. Good Lord. Were
you a victim of an angry, wrong mob?
Speaker 1 (09:51):
I'm glad left-handedness didn't catch on as a
social contagion. I'd have a hard time pretending I was
left-handed, eating, tying my shoes, whatever I'm doing.
Speaker 4 (10:00):
Yeah, and then he points out that the overwhelming majority
of those driving the trans mania fall into the non-binary
category, adopting identities which are said to be neither,
both, or somewhere in between: demiboy, gender fluid.
Speaker 2 (10:14):
Two spirit.
Speaker 4 (10:15):
These are social identities, not biological ones. Unlike right-handedness
or left-handedness, non-binary identities have no
anatomical or physiological reference. They're conceptual, political, and responsive to
cultural trends, hallmarks of a social contagion.
Speaker 2 (10:31):
Case closed, Bam, Next case.
Speaker 6 (10:35):
Man.
Speaker 1 (10:35):
I'm looking at the dust-up that happened in Berkeley.
I wish I had gone last night. I was thinking
about going. It's only, whatever it is, forty-five miles
from my house, but I had some kid stuff going on.
But it got pretty spicy outside the Turning Point
event in Berkeley last night.
Speaker 2 (10:52):
Oh, they needed you there fighting Antifa.
Speaker 1 (10:54):
And what did you do? You stayed at home. Who
are these numbnuts that show up to fight this stuff?
Just let them, let them gather and speak. What's the
skin off your nose? They're actually convinced that they're fascists.
They believe it, and I have to remind myself of
that semi-regularly. And it's true on the right, but
especially on the left, there is a significant group of
(11:15):
people that believes the most lunatic rantings of activists.
Speaker 2 (11:20):
They believe it.
Speaker 1 (11:21):
They're willing to get into a fistfight over somebody speaking
in an auditorium.
Speaker 2 (11:26):
Yeah, you're you're the fascist.
Speaker 4 (11:27):
I mean clearly. Let's see, you dress up in a uniform,
you go to the opposition's events, and you beat people
up and call them fascists.
Speaker 2 (11:36):
Right, the people inside having the speaking engagement.
Speaker 4 (11:39):
It's the lack of appreciating the irony that offends me
the most.
Speaker 1 (11:44):
I want to stand up for the Armstrong and Getty
pickleball paddle when we talk about the store.
Speaker 2 (11:49):
When we come back, okay, because there's time.
Speaker 1 (11:51):
To buy stuff in the Armstrong and Getty store if
you want to get it in time for Christmas.
Speaker 4 (11:55):
Speaking of rhetoric, the phrase cheap Chinese POS has been
thrown around, Jack.
Speaker 1 (12:01):
Not by me. Having now just had it handed to me, I held it
in my hand, so I'm going to stand up for it.
And lots of other news of the day, so stay tuned.
Speaker 5 (12:13):
When asked during the Fox broadcast of the Washington
Commanders game who his favorite team was, President Trump said, quote,
so I love the Jets and I love the Giants.
Speaker 2 (12:21):
Wow. You know you can just say you don't like football.
Speaker 1 (12:28):
So what's going on with the NFL? We're talking about
Broncos-Raiders Sunday night, was it ten to seven?
And then last night you had the Eagles and Packers.
The Eagles won ten to seven. The first touchdown was
scored in the fourth quarter. Did they change the rules
in the NFL or something? That's not the way the
(12:48):
NFL has been for the last, I don't know, decade.
Speaker 4 (12:52):
Is the field three hundred yards long now and they forgot to announce
it? It's crazy.
Speaker 2 (12:56):
Do you think there's betting going on?
Speaker 1 (12:58):
No, but it's just, it's just weird. It's the
football of my youth. That's what the NFL
Speaker 2 (13:02):
Used to be. Three yards and a cloud of dust.
Speaker 1 (13:05):
Yeah, anyway, first touchdown in the fourth quarter. Yeah, the
networks can't be happy about that.
Speaker 4 (13:13):
I do love punting, that was a great game, said
no one ever. The Armstrong and Getty store is open.
Speaker 1 (13:21):
So some of you buy stuff for your family or
friends for Christmas, if they're fans of this here radio program.
Speaker 2 (13:28):
Or just for yourself. I feel like it's one for me.
Speaker 1 (13:31):
I feel like it's one of those great, excuse me,
easy gifts: checked it off, didn't cost you much, straight
from the heart.
Speaker 2 (13:40):
Seems like you did something nice.
Speaker 4 (13:42):
There will be laughter, tears, hugs. It
will be the belle of the ball.
Speaker 1 (13:47):
You go to the Armstrong and Getty store, and Hanson,
who runs the damn thing, is really pushing people to
do it soon to make sure you get everything in
time for Christmas. There was some talk about the Armstrong
and Getty pickleball paddles. I thought it was pretty funny
that we had them, and then there were some complaints
about them, and Hanson brought one in. I don't
have any idea what a really high quality pickleball paddle is
(14:08):
like. I have them. I bought mine at Big 5,
I think. But they're just like this. There's nothing wrong
with this. This is perfectly fine. Unless you, like, get
serious, serious about pickleball, this is gonna be absolutely fine.
What is wrong with this? This doesn't feel like
cheap Chinese crap. It feels like something quality to me.
I like everything about it. And it comes with this
cool carrying case. Although they're sold out, so I
(14:29):
don't know why I'm even talking about it. We're sold
out of the pickleball paddles at Armstrong and Getty. You could get a
pickleball paddle, but not anymore. They're gonna be back. But anyway,
go to the store and see what we got.
Speaker 4 (14:38):
Yeah, we put the lash to the Chinese slaves and
they made several more of them.
Speaker 1 (14:42):
What if we had adult items, like the sort of
thing people were throwing on WNBA courts at one point,
like marital aids? We'll put on it, for novelty purposes
only. Take A and G to bed.
Speaker 2 (14:52):
Yes, I love that idea.
Speaker 1 (14:53):
Anyway, and you know, we need to branch out, we
need to have more things. They're probably not as popular
as the hoodies, including the Conscience of the Nation hoodie,
the Lazy and Stupid Should Hurt stars, including
the F Yo Lickin Party one, which I wear proudly, or stocking stuffers:
A and G coasters, decals, uh, coffee mugs, a stainless steel water bottle.
Speaker 4 (15:18):
Or the Get Your Words Straight, Jack note. Look, that's
all at Armstrong and Getty dot Com. Anyway, that's enough of that.
Speaker 1 (15:26):
I'm trying to figure out what to do for my
kids this year. They're definitely the age, they're almost fourteen,
almost sixteen, they're definitely past the age of more cheap
Chinese crap. There's just no reason. So either an experience
or, like, accumulate the money into one actually
Speaker 2 (15:42):
worth-something gift, I think, this year.
Speaker 1 (15:45):
God, when they're younger, it's just endless piles of cheap
Chinese crap is what they get when they're little kids.
And they're delighted, and that's fine, and it is cheap.
But it, well, it was always
Speaker 4 (15:58):
Interesting to me that we would give our kids
the same beloved toy that we had, but you'd take
it out of the package and instead of feeling like
it could last, you know, one hundred years.
Speaker 2 (16:10):
Yeah, it's cheap Chinese crap.
Speaker 1 (16:11):
Yeah, definitely China. Definitely the modern version of something you
played with as a kid that is just so flimsy
and light and poorly made, exactly.
Speaker 7 (16:21):
Yeah.
Speaker 4 (16:21):
Oh, speaking of China, we ought to get to that.
China's really rushing to catch up and pass us on AI.
Couple of pretty compelling AI stories in the news.
Speaker 1 (16:29):
Today. There definitely are, and I can't wait to talk about them,
among other things that we've got coming up. And I
do want to get some of the audio from the
Turning Point event in Berkeley. You know, the Charlie Kirk
crowd trying to get together, and the Antifa types outside
not wanting them to get together and speak for some reason,
and fighting cops and setting off smoke bombs, and I
(16:54):
just don't get it. I don't get you people.
Speaker 4 (16:56):
Which side is trying to shut down free speech? The
answer to that question tells you who's the good
guys and
Speaker 2 (17:05):
Who's the bad guys. That's all you need to know.
Speaker 1 (17:09):
That seems so obvious to me. I can't believe it
even needs to be said, but it needs to be
said for a lot of the media.
Speaker 2 (17:14):
Apparently.
Speaker 5 (17:15):
More on the way. Stay here. Armstrong and Getty. Hey.
Speaker 7 (17:20):
Optimus, what are you doing there?
Speaker 2 (17:23):
Just chilling. Ready to help.
Speaker 7 (17:25):
Hey, Optimus, you know where I can get a Coke? Sorry,
I don't
Speaker 1 (17:32):
have real-time info, but I can take
you to the kitchen if you want to check for
a Coke there.
Speaker 7 (17:37):
Oh yeah, that'd be great. Go, yes, let's do that.
Speaker 2 (17:42):
And then it just stands there.
Speaker 7 (17:45):
Let's go. Awesome. Head to the kitchen. Okay, okay, go.
I think, I think we need to give it a
little bit more.
Speaker 2 (17:58):
Okay.
Speaker 1 (17:58):
So that's the voice, obviously Elon Musk right there, who
said we need to give it more room. They were
standing too close, I guess. You can take that down, Michael.
They're standing next to Optimus as they were all going
to go to the kitchen to get a Coke, and
Optimus was just standing there looking at him,
and Elon said, I think we need to back up
a little bit.
Speaker 2 (18:17):
Anyway.
Speaker 1 (18:17):
My takeaway from that video was we ain't even close yet.
We're not even close to robots taking over yet. Now,
it's moving pretty fast. Maybe it'll be exponentially better in
a year. I'm sure it will be. But the fact
that Elon has got a trillion-dollar incentive package
(18:38):
now from Tesla, and he's focusing mostly on Optimus, the
AI robot, more than the electric cars.
Speaker 7 (18:46):
I don't know.
Speaker 1 (18:47):
It seems like we're a long way away. Where do
I get a Coke? I don't know where to get it. It stops,
gets hung up, sort of glitches, has no idea, and then
it just stands there. I thought it would be further along than
Speaker 4 (19:02):
That, didn't you? Wait a minute, I just googled where
do people keep Cokes. It suggested the refrigerator, which is
often in a human kitchen. Let's go to the kitchen.
All right, let's go. You go first.
Speaker 1 (19:18):
I'm not trying to come off as a guy who
mocks technology thinking it'll never happen, because I'm sure it will
be a thing eventually. But it's not as close as
I thought. But didn't you think that Optimus robot would
be more impressive than that?
Speaker 2 (19:32):
Yeah?
Speaker 4 (19:32):
Yeah, certainly before you trotted it out to do what
they just did, right, right, right, right, right. You know,
I'm reminded of Elon trotting out the Cybertruck for
the first time and saying, and the windows cannot be shattered. Boom,
he shatters the window here.
Speaker 2 (19:50):
So, this article. We've got a couple of AI stories
for you.
Speaker 1 (19:53):
This article in the Wall Street Journal today about China's
push to catch up with and surpass the United States is
flipping troubling. For instance, this paragraph: the escalating AI race is
drawing comparisons with the Cold War and the great scientific
and technological clashes that characterized it.
Speaker 2 (20:12):
It is likely to be at least as consequential.
Speaker 1 (20:16):
The AI race between us and China is going to
be at least as consequential as the Cold War between
us and the Soviet Union, if you're
Speaker 2 (20:24):
Old enough to have lived through that. Holy crap.
Speaker 1 (20:29):
China realized that AI was going to
be the next big thing, maybe the only big
thing on planet Earth, and it was way behind OpenAI, Google,
all the American companies that were doing so well, and
then decided, we've got to do something. And they've made
a whole-of-nation effort to try to catch up,
poured a ton of money into it, and relaxed
(20:52):
all kinds of regulations, which is highly troubling, and
Speaker 2 (20:59):
Silenced concerns.
Speaker 4 (21:02):
You know, just hey, quit talking about safety and what's
best for humanity.
Speaker 2 (21:06):
We don't have time. As I mentioned, my favorite quote in
the article is from JD.
Speaker 4 (21:10):
Vance, and he argued this in February: the AI future
is not going to
Speaker 1 (21:15):
Be won by hand ringing about safety. Well, he's right,
I understand what he's saying. What he's wanting to say
is China and Russia, mostly China, because China's got the
money to put into this. China is going to do
whatever the hell they want. And if they beat us
to the punch on this, it ain't gonna make any difference.
(21:36):
That we tried to be ethical and safe about it,
it ain't gonna make any difference.
Speaker 4 (21:40):
And I'm certainly the wrong guy to ask this question,
but I find myself wondering, can their AI essentially crush
our AI if it gets to, you know, whatever critical
stage first? Can it mess with our efforts and our
programs and databases and the rest of it to the
point that it blows ours up? Yeah, there could
(22:01):
be some sort of, like, on-purpose effort like that.
But I take in a ton of AI information, reading
and listening to podcasts with the smartest people in the
world talking about this. The more likely concern is, without
any attempt whatsoever to ethically control it, it just gets
loose on its own and gets into computers and travels
around the world and just kind
Speaker 1 (22:20):
Of does its own thing, and then the genie is
out of the bottle, which is pretty much inevitable. How
do we prevent that though, I mean, even if we're first, oh,
we can't. Okay, never mind forward to having your organs harvested.
Speaker 4 (22:36):
I mean, because even if we beat them to the
punch by five years, when they catch up five years later,
unless our AI can trump their AI, they will unleash it.
Speaker 2 (22:43):
And the hell you're speaking of.
Speaker 1 (22:45):
Something, I suppose. First of all, that paragraph
about the Cold War I find just, like, bone-chilling.
I don't feel like the population is taking this
as the challenge that it is, the way the Cold
War was. I mean, my dad grew up hiding underneath
(23:07):
his desk in rural Iowa in case the Russians dropped
the bomb, but it was on their radar that we
were in a, you know, fight to the death with
a foe that was close enough to our equal to
have to worry about it. I don't feel like people
feel that way about China and AI. The average person
doesn't have any idea any of this is happening. No, no,
(23:30):
which is troubling.
Speaker 2 (23:31):
I think. I guess we're better off that way.
Speaker 4 (23:38):
I mean, because if we spend all of our time
terrified of our AI overlords, that's no way
to live. Instead, you'll be going about your business. One
day you'll turn around, there's a robot behind you. You'll think, wow,
that's weird. Then it'll sever your head. I mean, just
like that, and you won't have suffered the fear.
Speaker 2 (23:54):
Hard to imagine, with no head. But what are you
gonna do? Yeah, I suppose. No use worrying about it, and not
much you can do about it.
Speaker 1 (24:01):
I was gonna say, as a guy who cares about
his money, I worry about the economy and what's going
to happen, and whether or not this is all a
bubble and it's going to completely collapse. And it is
chip companies trading money with AI companies back and forth
and investing in each other, and it could bust if it
doesn't turn out to be what they said. But, well,
that's one of the other lead stories that I've got
(24:21):
for today, the
Speaker 2 (24:24):
Right here.
Speaker 1 (24:26):
Yann LeCun is Meta's chief AI scientist, the top guy
working on AI for Zuckerberg, who has spent tens of
billions of dollars. I think he spent one hundred billion
dollars on this project. His lead scientist is leaving and
starting his own company. All of these people, including the Chinese,
(24:47):
can't all be wrong.
Speaker 2 (24:49):
Can they?
Speaker 1 (24:49):
That it turns into a dot-com bubble where it's like, oh,
I guess AI is not going to be profitable or do anything,
so never mind. Gosh, I wouldn't think so. You wouldn't
think Elon and Zuckerberg and China and everybody else
could be wrong about this. So that's what leads me
to believe that this is going to be a thing
unfolding in front of our eyes at some point.
Speaker 2 (25:08):
And then looming behind us and severing our heads.
Speaker 1 (25:10):
So, so you say that all the time, which is funny.
But do you, do you have a real-world sense
of bad things that AI could do?
Speaker 4 (25:22):
Well, it goes back to the commonly spoken theme of
AI deciding the only thing impeding it is human beings,
or the only thing impeding the planet being at peak health
Speaker 2 (25:34):
Is human beings.
Speaker 4 (25:36):
I mean, those are the two classic reasons why they would
sever our heads.
Speaker 1 (25:40):
And then even if they don't do that, what if
it wipes out seventy five percent of jobs.
Speaker 4 (25:47):
Well, right, right, yeah, and then political turmoil and revolution
in the streets, et cetera, et cetera. Robots, great. Oh yeah,
no kidding. Wow. So, a couple more quick AI notes.
I thought this was really interesting: the Wall Street Journal
reporting that Anthropic, which is the company behind Claude, expects
to break even for the first time in twenty twenty-eight.
(26:11):
By contrast, OpenAI, the ChatGPT folks, they're
forecasting their operating losses that year, twenty twenty-eight, will
be about seventy-four billion dollars. They will lose seventy-four
billion dollars in twenty twenty-eight, or roughly three
quarters of revenue, thanks to ballooning spending on computing costs.
(26:32):
And they don't think they'll... They're gonna burn through roughly
fourteen times as much cash as Anthropic before turning a
profit in twenty thirty. But certainly don't take my word
for it, or the Wall Street Journal's, and invest carefully.
Speaker 1 (26:46):
Yeah, and then you've got this story of Amazon, that
never made any money and was losing money like crazy,
and I remember all the jokes about it never turning
a profit and everything like that, and it obviously came to
dominate the landscape in so many different ways eventually.
Speaker 2 (27:01):
And then, I'm sorry, Michael, what did you say to us?
Speaker 6 (27:04):
Oh?
Speaker 2 (27:05):
PrizePicks? In just a second, coming up.
Speaker 4 (27:07):
And one more AI note, from a website I had
never heard of, sent to us by alert listener Hillbilly:
SavingCountryMusic dot Com. The headline is AI Songs Top
Billboard Chart: Why We Need Transparency Now.
Speaker 2 (27:22):
Okay, I want to hear that. Well, that shits nizzle.
Are you kidding me? I'll tell you what.
Speaker 1 (27:29):
Cutting off my head, taking everybody's jobs and running country music, well,
this is no good.
Speaker 2 (27:34):
Unplug it. So, a word from our friends at PrizePicks.
It is
Speaker 4 (27:38):
the easiest, most fun way to get into fantasy sports
around. You just pick more or less on at least
two player stats. You think your favorite basketball player is
going to go off against the weak defense of whoever they're playing,
pick more, or vice versa. And you can even combine,
like, a football player with a basketball player, or multiple
players on your lineup, mix
Speaker 2 (27:56):
Up the sports.
Speaker 1 (27:57):
I guess for the NFL, now you go with less
on everything with these ten to seven games? What the
heck is going on there?
Speaker 2 (28:03):
Anyway?
Speaker 1 (28:03):
If you've figured out a trend, or a hot player,
or somebody who's past their prime and all that sort
of stuff, you can take your opinion and turn it
into cash with PrizePicks. Download the PrizePicks app
today and use the code ARMSTRONG to get fifty dollars
in lineups after you play your first five-dollar lineup.
That code is ARMSTRONG to get fifty dollars in lineups
after you play your first five-dollar lineup. PrizePicks.
Speaker 2 (28:22):
It is good to be right.
Speaker 4 (28:24):
Yeah. If you want flexibility, you can play the flex
play, where you can get paid even if one of
your picks misses. Once again, the code is ARMSTRONG. The
PrizePicks app, get fifty bucks in lineups after you
play five dollars. PrizePicks. All right, so I just opened
this up again. Thank you, Hillbilly, for sending this along.
There's an alarmingly low sense of urgency, they write, about
(28:45):
a rapidly developing dilemma that threatens to absolutely eviscerate everything
we know and love about music in a matter of months.
We're talking about AI, of course, but it feels almost
embarrassing and trite at this point to even bring it up
in such a breathless context, in part because we all
have an inherent sense of how catastrophic it is going to
be for the human creators and how inevitable its impacts ultimately
Speaker 2 (29:03):
are, you know.
Speaker 4 (29:04):
Hillbilly mentioned that the guy who wrote this is
a terrific writer, and he is. Wow, that's some good writing. Anyway,
what's his name?
Speaker 2 (29:13):
I don't know?
Speaker 4 (29:14):
But there's the picture of the AI-generated artist, Breaking Rust.
It's a little too perfect: country-looking bearded guy in a cowboy
a cowboy.
Speaker 1 (29:24):
Hat and right and all uh handsome, rugged yet sensitive.
Speaker 2 (29:29):
Yes, how do you know?
Speaker 4 (29:30):
You must have seen this picture, Papa? But do we
expect Congress to address this existential crisis facing human creators?
They're saying, we should do something about it, attempt to
install some guardrails and guideposts, and expend at least a
modicum of effort to at least make sure the public
is aware of weight, what is AI and what is not.
Speaker 1 (29:48):
Yeah, there's some effort by lots of people to require that you
have to declare something an AI creation. Do you think
that makes any difference? You dig a song, oh,
it's AI, well, never mind that. I don't know, I either
like it or I don't like it. Do you think
I'd like it more if it turns out it's a human?
Maybe. Maybe I find out it's some thirty-four-year-old
(30:09):
former drug addict who was in jail, you know. I'm
thinking of a, what's the guy, Jelly Roll type of
story or something like that.
Speaker 2 (30:17):
That hooks you.
Speaker 4 (30:18):
Mm. Yeah, because the sentiment in the song seems much
more real. You've asked the key question. That's a super
interesting question: will people still enjoy it? This guy's advocating
that any piece of music made by AI, or even partially
made by AI, must be disclosed as such to the public, period.
So, like, if you use AI to clean up the bassline,
(30:39):
I don't know, because your bass player's drunk or something.
Speaker 2 (30:45):
I don't know.
Speaker 4 (30:46):
Because evidently this Breaking Rust song, Walk My Walk, topped
the Billboard Country Digital Song Sales chart.
Speaker 2 (30:55):
An AI track was the number one song in country. I'm
gonna listen to that during the break. Please do.
Speaker 1 (31:02):
We can't play it for copyright reasons, but I'm
going to listen to it.
Speaker 2 (31:05):
During the break, see what I think. Yeah. Uh, God,
we've headed into a weird world. Yeah, yeah, I don't know.
And everybody's guessing.
Speaker 1 (31:13):
But the people with lots of money, like the
richest people on Earth, are guessing that it's going to
be a big deal.
Speaker 2 (31:18):
And going to be profitable.
Speaker 1 (31:21):
I don't know if there are any super wealthy people
saying, nah, this is overhyped, I'll be in the
woods if you need me, man. Well, good luck. The
AI robot that currently can't find a Coke will be
able to find you in the woods and chop off
your head for whatever reason. Look, honey, look at that squirrel.
Zeros and ones flash in his eyes, and it
(31:42):
descends on me.
Speaker 2 (31:43):
Yeah.
Speaker 1 (31:45):
Likely Any thoughts on this text line four one KFTC.
Speaker 3 (31:54):
Listen to this: Gen Z is posting TikToks of a
new challenge where they do nothing but sit in silence
for as long as they
Speaker 2 (31:59):
Can hand where as your grandparents call that life.
Speaker 1 (32:06):
Speaking of tech. So before we went to break, Joe
mentioned this country song that's at the top of some
chart, and it's AI, and so we both took a
listen to it. We can't play it because it'd be
a violation of something. But before I tell you
what I thought of this AI song that's at the
top of the charts, is it all AI?
Speaker 2 (32:27):
Is that what you're saying? It's entirely an AI song? I
believe so.
Speaker 7 (32:30):
Yeah.
Speaker 1 (32:31):
So the dude on the cover is a made-up picture.
The voice is AI, the instruments, the writing, all of
Speaker 2 (32:37):
It is that.
Speaker 4 (32:39):
Yeah, I don't know if there's... there's probably somebody who
wrote the lyrics, but maybe not.
Speaker 2 (32:45):
I don't know. Having listened to it.
Speaker 4 (32:46):
That is highly troubling. Yeah, and second, that is
way too good. That is why, why would anybody try
at this point? Well, why would anybody try to become
famous and make money at it? If you want to
make music, I do it every day at home. I
play the piano in my bedroom in my underwear.
Speaker 1 (33:06):
I do that all the time, but I leave that
last part out. Hey, AI isn't going to replace that.
But if I'm gonna make it onto the charts, I
don't know if there's any point in that anymore.
Speaker 4 (33:16):
Yeah, I'm, I'm already very, very cynical about pop music.
And it occurred to me that a lot of us
of a certain age had the unbelievable experience, that
we took for granted, that pop music, which was entirely
a commodity... I mean, anybody who was actually a creative
artist was exploited and thrown away by the money guys.
Speaker 2 (33:39):
It was just again a corporate commodity.
Speaker 4 (33:41):
And then there was a brief and wonderful period of,
I don't know, ten to twenty-five years where the
art was dominated by actual creative artists, at least to
a significant extent. It still was corporate, but there was
a hell of a lot of creativity. And now, and
I'm not saying there's no creativity left, but pop
music is so corporate. There's so much money to be made.
(34:04):
The formulaic, they might as well be AI, song
factories are so efficient, the underwear models lip-syncing to
the music are so good-looking, and the rest of it,
it's easy to be very, very cynical about it. Having
said that, the lyrics of this song in particular are
a person who has had some very painful times in
(34:28):
their life pouring out their soul. And the fact that
that is cranked out by a computer because they know
you like that sort of thing makes.
Speaker 2 (34:36):
Me want to vomit. That's a good point. The fact that.
Speaker 1 (34:42):
A chatbot picks up, oh, okay, people have angst
and pain and that sort of thing.
Speaker 2 (34:47):
Oh, write about that.
Speaker 1 (34:49):
Yeah, when it comes from somebody who's had that same feeling,
and we have that, we have that in common as
a human being. Oh you felt that. I'm feeling that
right now. Thanks for writing about it. When it turns
out it's a computer completely you know.
Speaker 2 (35:02):
Yeah.
Speaker 4 (35:03):
One of my, one of my favorite songs by one
of my favorite bands, the writer and singer happens to
be a gal. She's talking about, you know, being, you know,
self-destructive, in love, and drinking way too much. And
the line is, maybe I'll find my maker on
the bedroom floor, which is a hell of a line.
Speaker 2 (35:21):
Maybe I'll meet my maker, I think it is.
Speaker 4 (35:23):
Anyway, uh, to hear that somebody just cranked that out
because the computer algorithm said that, that would be compelling.
Speaker 2 (35:30):
I don't know, just ugh, my skin.
Speaker 4 (35:33):
Is crawling, my guts are churning. Maybe I ate something
bad for dinner last night. But yeah, that's that's awful.
It's not good, it's not funny, it's not amusing.
Speaker 1 (35:42):
So Google hired some AI guru to come over. They
spent two point seven billion dollars to buy Character.AI,
and then this guy had a whole bunch of posts
about how he doesn't believe the whole trans thing is real,
and so Google tried to shut him down, having just
spent three billion dollars on his company, and that became
(36:04):
a thing. So they've got a woke problem within the Google AI stuff.
So that would be something China is not worried about.
Speaker 6 (36:09):
That.
Speaker 1 (36:10):
I guarantee you, China, in its attempt to be the dominant AI
force on Earth, is not worried about the politics of
individual employees.
Speaker 4 (36:17):
Right. Speaking of wars, man, is the Democratic Party at
war with itself over the end of the shutdown. Man,
some strong stuff being said.
Speaker 5 (36:24):
We'll get to that coming up. Armstrong and Getty.