Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty on demand. Hey,
good afternoon. I'm Chris Merrill.
Speaker 2 (00:12):
This is KFI AM six forty and we are on
demand anytime in the iHeartRadio app. When you're on that app,
you can click on that talkback button. If you have questions, comments,
quips, quotes, criticisms, or compliments about the program, feel free to let us know. The question for tonight's talkback:
I had a story about former inmates of San
Quentin going back to play an alumni game on their
(00:32):
Field of Dreams, which is the baseball field inside the prison,
and so it's former inmates versus current inmates. The current inmates won bigly. Incidentally, what is someplace that you would never go back to visit? Because if I got out of San Quentin, I would never even want to drive by the place, and if I did, I'd go, nightmares in there.
Speaker 1 (00:52):
That's it. I just never want to go back.
Speaker 2 (00:55):
Curious about where it is that you've spent time that you would never want to go back to again.
Speaker 3 (01:00):
Good afternoon, Chris. This is Hailey out of Anaheim, California. The one place I would not go back to: the Tahiti Village timeshare presentation. I thought it was cool because they said they were giving away like three days, four nights. But it felt like I was a hostage and it was very hostile. I do not advise that
(01:21):
for anyone. The worst, the one place: a timeshare presentation. Oh my god, thinking about it now is giving me anxiety.
Speaker 1 (01:29):
Yeah, me too.
Speaker 2 (01:31):
My wife dragged me to a couple of those because
she's like, oh, they're gonna pay for you know. We
had two nights at this dump of a hotel in Branson, Missouri. Oh,
we did it there. And then when we first got married, actually, it was here. We sat through one of those timeshare presentations because they gave us tickets to, I think, Universal Studios or something. We were living in northern Arizona at
(01:51):
the time, so we had to sit through that, and they said, oh, it should only take about two hours. You know when the presentation only takes two hours? It's when you sign up for a timeshare right away.
In both cases, we were stuck there for four hours.
We were the last to leave. And he's right, you
(02:13):
feel like a hostage. Oh, it's the worst. I'm with you one hundred percent. And finally, after the last one, my wife is like, okay, we're never going to do that again.
It's not worth it. I go, was that worth the
two hundred dollars that we just saved? She goes, no,
it was not.
Speaker 4 (02:27):
Hey, Chris. Someplace I'd never go back to visit? Probably San Diego Comic-Con. That place, I mean, it's great, but after going for, you know, so many years, the pandemic stopped me, and I think I'm done with that place.
Speaker 1 (02:45):
Too crazy. It is crazy.
Speaker 2 (02:47):
Yeah. You've been to Comic-Con, Kayla? No? It's pretty crazy. You have to be in that mood, the same mood you have to be in when you go, I'm going to go deal with the crazy amount of people at, for instance, Universal Studios. The day that we went, which was part of our honeymoon, it rained like torrents and it was wonderful. Oh, so
(03:11):
few people there. I mean it was buckets. We're riding
roller coasters.
Speaker 1 (03:15):
In the rain.
Speaker 2 (03:15):
It was so great. I loved it so much. But Comic-Con is, it's just a madhouse. It's like going to Disney on a popular day. You're standing outside, you're in line, and, uh, it's hot.
Speaker 1 (03:31):
Yeah, beautiful setting.
Speaker 2 (03:33):
You don't get a better setting than the convention center in San Diego, but yeah, it's a lot. You just
have to be all in for it. If you're not
all in for Comic-Con, it's no good. Uh, all right,
did you want to.
Speaker 1 (03:48):
Do this other one?
Speaker 2 (03:48):
We had one guy that had thoughts on trans athletes.
We were talking about the trans athletes earlier. For those of you just joining us, I just don't like that parents are screaming at kids.
Speaker 1 (03:58):
I don't. I don't care for it at all.
Speaker 2 (04:02):
And then we have people that are justifying parents screaming
at sixteen year olds.
Speaker 1 (04:05):
Well they deserved it. So what are you gonna do?
Speaker 2 (04:09):
I mean, yeah, you have freedom of speech, but you
also have freedom to not be, you know, an adult acting like a child. Process your emotions. Let's, uh, let's find the words. Probably not scream at children. Trump's on again, off... oop, hang on, wrong one, hang on.
Speaker 5 (04:26):
If I had a daughter, I would suggest that she
boycott these competitions where there's transgenders. I mean, it's a
no win for these young women that can't compete with
the genetics of a boy. I mean, you can put
lipstick on the boy, but it's not a girl. So
(04:48):
just saying it's not gonna work, huh. And this is
gonna end up tragically down the road.
Speaker 2 (04:54):
I think it is. I'm afraid it is gonna end
up tragically down the road. And I don't think that's
because you have trans athletes. I think it's because you
have people bullying children. And as far as it being a no win, well, the trans girl that competed did get gold in one event and shared that gold with two other girls and got silver in the long
(05:17):
jump event. So you can't really say it's a no
win when she didn't beat them all.
Speaker 1 (05:24):
So I don't know, man, I wouldn't.
Speaker 2 (05:28):
I wouldn't deprive my daughter of the opportunity to go
win at the state track and field because of my own.
Speaker 1 (05:38):
Political leanings. I just wouldn't do it. I wouldn't. Yeah,
I mean, you do what you want. It's your kid,
But I wouldn't deprive my kid. How about this?
Speaker 2 (05:49):
Did you happen to see that there was a story? I'm shifting gears here. I don't want to talk about it anymore. Did you happen to see the story about Elon Musk? They say that he was doing drugs during the campaign. What? The New York Times, the mind-bending article:
As Elon Musk became one of Donald Trump's closest allies
(06:10):
last year, leading raucous rallies and donating two hundred and
seventy five million to help him win the presidency, he
was also using drugs far more intensely than previously known, according to people familiar with his activities. The consumption went
well beyond occasional use. He told people he was taking
so much ketamine that it was affecting his bladder, which
is a known.
Speaker 1 (06:29):
Effect of chronic use.
Speaker 2 (06:30):
He took ecstasy and psychedelic mushrooms, and he traveled with
a daily medication box that held about twenty pills, including
ones with the markings of the stimulant Adderall, according to
a photo of the box and people who have seen it.
Speaker 1 (06:42):
I don't know about you guys, I'm stunned.
Speaker 2 (06:46):
I'm shocked that the guy who would wear a hat
on top of the hat and get on stage and
jump and scream and carry on the way that he
did would have any history of drug use.
Speaker 1 (06:57):
That seemed like perfectly normal behavior to me. So I
just that is, uh, that is wild that he was
using a lot of drugs. I assume we're all on drugs.
Speaker 2 (07:09):
I mean, isn't there a point where you see people
and you go, okay, well there's some drug use going
on there.
Speaker 1 (07:14):
He's at it.
Speaker 2 (07:15):
Ibuprofen? Always, you can just, oh yeah, right, yeah, I know, right. Like, hey, here's a fun drug I just started this week: saw palmetto. Yeah, yep, it's supposed
to help with prostate function. So I also carry around
a little pill box with about twenty pills a day.
Speaker 1 (07:34):
Yeah, saw palmetto.
Speaker 2 (07:35):
I've got my multivitamin, I've got my blood pressure medication, I got, yeah. Yeah, I guess I'm like Elon Musk minus the ketamine and Adderall and the jumping on stage with a chainsaw. Yeah, but otherwise exactly the same.
The thing with Musk, though, is that when he did
all that stuff, we didn't immediately go what is he on?
(07:56):
Because he's an eccentric person to start with, you know,
so it's like, well, Elon Musk is being crazy, but
that's what makes him a genius, and you go, nah, man, Elon Musk. Elon Musk
Speaker 1 (08:07):
Is an eccentric and he's had some really genius things.
Speaker 2 (08:09):
Don't get me wrong, I don't want to take away
from anything that he's accomplished.
Speaker 1 (08:12):
But the behavior that we saw, that was a little
Speaker 2 (08:16):
Beyond just eccentric. That was just, that was drugs. That was straight-up drugs.
Speaker 1 (08:22):
Do you buy the black eye? Oh?
Speaker 2 (08:24):
That he told his son to hit him. I was
playing with my son. I told him to punch me, and they hit a lot harder than you'd think.
Speaker 5 (08:31):
Uh.
Speaker 2 (08:33):
I don't know who tells their kid to punch him.
That's weird to me. Also, I'm not surprised, though. It wouldn't surprise me if the black eye did come from the kid. And I say that because the kid is a bit undisciplined as well. I mean, he's a four-year-old who Musk brought
(08:54):
to the Oval Office during a press conference. He was running around and telling the president pretty horrible things during live television. You're an fn idiot. A four-year-old is telling the President of the United States, the commander
Speaker 1 (09:08):
Of the free world, you're an fing idiot.
Speaker 2 (09:11):
So yeah, it would not surprise me if the kid
has discipline issues and would punch Dad in the eye.
Speaker 1 (09:16):
That does not surprise me at all. That's what brats do.
You know.
Speaker 2 (09:20):
They misbehave and if they're not corrected, then they'll continue
to misbehave.
Speaker 1 (09:25):
So yeah, that part doesn't surprise me at all.
Speaker 2 (09:28):
So Trump was asked about Elon Musk in this press
conference because, you know, there was kind of the
falling out, and then Musk said, Wow, I've not really
saved the country any money, but I sure have saved
all of my investors a lot of money.
Speaker 1 (09:43):
They just haven't made squat.
Speaker 6 (09:45):
What started as a political romance is now wrapping up
with a little less love. Elon Musk's one hundred and
thirty days as the head of the Department of Government
Efficiency came to an end today, well short of his promises,
and just days after he criticized President Trump's
Speaker 1 (10:00):
Big, beautiful bill.
Speaker 6 (10:02):
Trump and Musk sharing the spotlight today at the White
House as Musk ends his role as a special government employee.
Speaker 2 (10:09):
But, okay, then they get on to more stuff, blah blah blah, and then Trump gave him a gold key, and then he said it's great, and, uh, we're gonna save a bunch of money, and yada, yada, yada.
Speaker 1 (10:18):
Does this strike you as.
Speaker 2 (10:19):
Odd, though? The times that they've had the joint press conferences in the Oval Office, Trump has sat at the Resolute Desk and Musk has stood by, sometimes with his kid crawling all over the president, sometimes just nearby, wearing his T-shirt, because of the respect for the office, I know. And of course Musk is wearing a baseball cap the whole time. I know that Trump is sitting
(10:41):
behind the desk, and I think he's saying, I'm gonna
sit behind the desk and look presidential.
Speaker 1 (10:46):
But there's just something about the tableau of it.
Speaker 2 (10:50):
Where Trump is, I mean, he likes to lean, right? He likes to slouch. We've seen that everywhere he goes, he slouches, which is fine. He can slouch all he wants, I'm slouching as we speak. But the tableau of it, and I know Trump is like, I'm the president.
Speaker 1 (11:03):
I'm gonna sit behind the resolute desk.
Speaker 2 (11:04):
This makes me look presidential, and then I'm gonna have
him standing there and it's gonna look.
Speaker 1 (11:07):
Like he's an aide.
Speaker 2 (11:08):
But somehow Musk upstages him. Part of it has to do with Trump's posture. Part of it, if anything, has to do with just the way that Musk stands and commands an energy in that room that Trump doesn't have when he's sitting there and then looking up at Musk. This is a really odd thing that I noticed. I
(11:28):
don't know, I haven't heard anybody else point that out, but it just seemed really strange to me. Just seemed really odd. All right, the tariffs on, tariffs off, that is next. Chris Merrill, KFI AM six forty, live everywhere in the iHeartRadio app. KFI AM six forty, more stimulating talk, Chris Merrill. This week there was
(11:50):
much ado as a number of code talker stories disappeared from military websites. The history of the code talkers, which used to be on the Department of Defense website, vanished. If you're unfamiliar with the code talkers, especially the Navajo code talkers, who are the most famous: code talkers were used throughout history, Choctaw, Comanche, Navajo, others, but
(12:15):
the Navajo code talkers basically developed a code in World War II using their own language that the Japanese couldn't decipher. They couldn't get to it. Now, if you'll recall, the Germans were very, very good. Listen, I'm getting old, and I realized about the time I hit forty, suddenly all those World War II documentaries on the History Channel became
(12:37):
fascinating to me. You'll recall the Germans of World War Two were very, very good at their codes, at disguising their messages with the Enigma machine. We couldn't break it. In fact, it was only after the development of Alan Turing's Bombe machine that we were able to do
Speaker 1 (12:57):
So.
Speaker 2 (12:57):
It was basically the first computer. We allowed for the machine to, uh, to decipher it, because the Germans had a way of changing their code every day. It seemed to be a
Speaker 1 (13:13):
Completely random pattern. So suppose you're.
Speaker 2 (13:16):
Doing one of those ciphers in the newspaper or online or wherever you do it, and, uh, you have to try to figure out what it says. You're trying to break the code, and you're going, okay, I figured out that this squiggly line equals a D. So then you go through the whole cipher and you find all the squiggly lines and you go, that's a D.
(13:36):
And then from there you try to figure out the rest of it, right? That's it. Well, imagine if every time you got one, the whole thing changed. The whole thing changed.
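To picture what that means in practice, here's a minimal sketch in Python, a hypothetical toy example, not anything from the broadcast (the function names, the scrambled key, and the sample text are all invented for illustration). A fixed substitution cipher falls to exactly the squiggly-line counting described above; an Enigma-style scheme that rotates the mapping after every letter never gives that approach a foothold.

```python
import string

# Toy illustration (hypothetical, not from the broadcast): why a fixed
# substitution cipher is breakable by letter counting, while an
# Enigma-style scheme that shifts the mapping every letter resists it.

ALPHABET = string.ascii_uppercase

def fixed_encrypt(plaintext, key):
    """Classic substitution: a letter always maps to the same symbol,
    so 'every squiggly line equals a D' frequency analysis works."""
    return plaintext.upper().translate(str.maketrans(ALPHABET, key))

def rotating_encrypt(plaintext, key):
    """Enigma-style idea: rotate the whole mapping one step per letter,
    so the same plaintext letter encrypts differently every time."""
    out = []
    for i, ch in enumerate(plaintext.upper()):
        if ch in ALPHABET:
            rotated = key[i % 26:] + key[:i % 26]  # shift the key by i
            out.append(rotated[ALPHABET.index(ch)])
        else:
            out.append(ch)
    return "".join(out)

key = "QWERTYUIOPASDFGHJKLZXCVBNM"  # one scrambled alphabet
print(fixed_encrypt("DADDY DEAREST", key))     # every D becomes R
print(rotating_encrypt("DADDY DEAREST", key))  # each D comes out different
```

Run it and the fixed version turns every D into the same symbol, while the rotating version scatters them, which is roughly why a code that changed constantly needed a machine rather than a puzzle solver.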
Speaker 1 (13:48):
So Alan Turing was the guy.
Speaker 2 (13:52):
He's kind of considered to be the father of modern computers, because he had this notion: rather than trying to get people who are good at crosswords and ciphers in the paper, why don't we get a machine that can figure out these codes, and it'll be able to do it faster than we can? Now, they didn't have the kind of machinery and chips that we do, obviously, because they didn't have computers then, so it was a very mechanical system.
(14:15):
But he made it work, and they were able to use this machine to decipher the Enigma code, and that gave us an advantage over the Germans.
Speaker 1 (14:23):
On the American side, we didn't have any of this.
Speaker 2 (14:27):
We didn't worry about Germany's technological superiority to decipher our codes.
Speaker 1 (14:32):
Because what we did is we took a dead language.
Speaker 2 (14:36):
Well, to the rest of the world anyway; the Navajo were still using it, which is why it was known. We took a language that the rest of the world didn't know. They had no idea, they didn't know what these different sounds meant. They didn't have any interpreters. So they're trying to decipher a language with no key. There was no, well, this sound means this
(14:58):
letter, or this phrase could potentially mean this. They couldn't decipher it for squat. And it was all based on that ancient, well, not ancient, but that Navajo language, which had been around for hundreds of years or more. So the Navajo stepped up. The Navajo soldiers spoke in this language.
(15:19):
The Japanese couldn't decipher it, and unlike the way that we were eventually able to decipher the Germans' codes using our computing expertise, they couldn't do it. So in many ways it was the code talkers who gave us an advantage in the war, including Iwo Jima, where six code talkers sent more than eight hundred messages with zero errors.
(15:42):
Part of the reason that we had the Navajo in
the military was because there was an emphasis on diversity
in the military. They wanted diversity because they knew there
was strength in having people with different backgrounds who could
bring different skills, and in the case of the code talkers,
(16:03):
they brought an entire language that nobody else could speak,
and it was the most effective encryption that anyone had
during World War Two. And again, as I mentioned, this happened in the past: Choctaw in World War One, Comanche also in World War Two. We've seen this happen before.
So President Trump signs the DEI Executive Order. We're gonna
(16:25):
get all diversity, we're gonna get all equality, we're gonna get all inclusion, and anything that promotes those things, out of the military. And so they said, we're gonna scrub all the websites of any of this stuff. And what do they do? They scrubbed the websites. At least ten articles
that mentioned the code talkers disappeared from the US Army
(16:46):
and the Department of Defense websites.
Speaker 1 (16:49):
Well, as one might expect, all hell broke loose.
Speaker 2 (16:51):
People were not very excited about the fact that we've
just decided to erase history because we're afraid it might
be too inclusionary, so we don't learn our history.
Speaker 1 (17:02):
What could possibly go wrong?
Speaker 2 (17:04):
Okay. KFI AM six forty, we're live everywhere on
the iHeartRadio app.
Speaker 1 (17:08):
You're listening to KFI AM six forty on demand.
Speaker 2 (17:14):
Thank you, Amy. Chris Merrill, KFI AM six forty, more stimulating talk. Talkback question
Speaker 1 (17:19):
Today, we had.
Speaker 2 (17:20):
A story about former inmates at San Quentin that went
back to play an alumni baseball game against current inmates,
and I thought, if I got out of San Quentin,
I ain't going back no way.
Speaker 1 (17:33):
So what is someplace that you would never go back
to visit?
Speaker 2 (17:36):
We've had a few on here. One guy said he would never go back to Flint, Michigan.
Speaker 1 (17:41):
I don't blame him.
Speaker 2 (17:42):
Another dude said he would never go back to a timeshare presentation. Amen, amen, my friend. So what is someplace you would never go back to visit? If you're on the iHeartRadio app, just click on that talkback and let us know: what is someplace you would never go back to visit?
Speaker 1 (17:57):
And then obviously, why? Why would you not go back there?
Speaker 2 (18:00):
There was an interesting story that popped up this week: an interview was done with the CEO of Anthropic, which is one of the AI companies that's out there, and he's basically warning that a bunch of white collar jobs are going to go away.
Speaker 1 (18:18):
And now I was reading and he was talking with
an Axios reporter.
Speaker 2 (18:21):
Axios has continued to follow up on a lot of these AI stories this week. They've really been at the forefront of reporting on AI here of late, and they said the US government needs AI expertise and dominance to beat China in the next big technological and geopolitical shift, but they can't pull it off without the help of Microsoft, Google, OpenAI, Nvidia, and others. And so we're seeing a merging
(18:43):
of Washington and Silicon Valley, they say, driven by necessity and fierce urgency. So much so that they use the phrase codependent superstructure. The government and the tech companies have formed a codependent superstructure in the race
(19:05):
to dominate AI, and that strikes me as really odd.
I know that we have government that supports certain businesses,
and it doesn't matter if you're a Republican or a Democrat.
Speaker 1 (19:17):
It happens.
Speaker 2 (19:18):
And then when the Democrats do it, the Republicans say
that we shouldn't be using government to pick and choose winners.
And then when the Republicans do it, then the Democrats say,
I can't believe that the Republicans would do this to pick.
Speaker 1 (19:28):
And choose winners.
Speaker 2 (19:29):
And of course there's always winners that benefit, typically, that party, whether that's labor unions, or whether that's big business and low taxes, blah blah blah. Right? We've seen this play out time and time again, but we are seeing a lot of subsidies going towards some of these companies, or, you could say, investments. Axios points out that the White
(19:54):
House has cultivated a deeper relationship with America's AI giants, championing the five-hundred-billion-dollar Stargate infrastructure project led by OpenAI, Oracle, SoftBank, which is out of Japan, and MGX from the United Arab Emirates, all of these different things to try to give us a leg up.
Speaker 1 (20:14):
On the AI race.
Speaker 2 (20:16):
And yet I think to myself: if we have the government trying to push forward these efforts in AI in a public-private partnership that benefits the private sector, how is that different than
(20:37):
what China does? China is a capitalistic, communistic government, right? I mean, it's communist capitalism. The Chinese Communist Party does this. They invest and they work with the companies, and then the companies make money. But in their case, the Chinese government gets some of the proceeds. That's really the difference,
(20:58):
that is, if the company in China that's working with the government,
Speaker 1 (21:01):
Let's say TikTok.
Speaker 2 (21:02):
If TikTok makes a bunch of money because of some of the investments that the Chinese government has made, then
the Chinese government says, cool, that was our investment.
Speaker 1 (21:10):
We want something out of it.
Speaker 2 (21:11):
Right? In the US, the government makes an investment and then the company takes off, but the government doesn't really recoup that. Right? We give subsidies, or we give tax breaks, we give certain advantages, tax abatements, call them, to these companies because the companies need the help.
Speaker 1 (21:33):
We want to.
Speaker 2 (21:34):
We want to, we want to win this battle. And
then, you know, okay, well, what does the government get out of it? And they go, nothing, but the CEOs will. And then the CEOs go, we built that, right? It has been a big political argument. You didn't build that. Oh yes I did, I built this with my hands, you know, with my bare hands. Like, no, literally, the government,
(21:54):
the government gave you a break so you could build it. Oh,
I built this from scratch all by myself. I played
by the rules. You did play by the rules, but
the rules were definitely in your favor. Because if I
go out there and I say I want to start
a business, I think the government should give me a
tax break, the government goes, well, how big is your business?
The bigger the business, the more likely you are to
(22:15):
have some help from the government. It's just the way
it works. But what do you owe back to the government?
The smaller your business, the more you're going to owe
the government. I don't get that tax break because my
business is too small. Small business is the backbone of America.
It pays all of our taxes so that we can
give those taxes to the big companies that really are
going to do great things, right? That's how our system works.
I find it to be flawed. Not a big fan
(22:37):
of it. I understand how we got here. I'm just not a big fan of that. And moreover, I'm even less of a fan of the dishonesty around it. Like, no, we're not. Yeah, you are.
Speaker 1 (22:47):
Just call it what it is. You are.
Speaker 2 (22:49):
So either we're all on a level playing field and
we're all going to pay the same taxes, or we're
not at a level playing field and just admit it.
Speaker 1 (22:55):
That's what it is.
Speaker 2 (22:57):
So what this CEO of Anthropic said is basically: quit sugarcoating what's coming. Mass elimination of jobs across technology, finance, law, consulting,
and other white collar professions, especially entry level gigs.
Speaker 1 (23:16):
Entry level white collar role. That's what you need. Entry
level white collar.
Speaker 2 (23:23):
That's a cushy job until the cuts come and then
you're the first one out. But let's just call it
lower to middle management. This is from CNN; Anderson Cooper talked with the Anthropic CEO. Dario, you've said that
AI could wipe out half of all entry level white
collar jobs and spike unemployment to ten to twenty percent.
Speaker 1 (23:44):
How soon might that happen?
Speaker 7 (23:47):
Just to back up a little bit, you know, I've
been building AI for over a decade, and I think
maybe the most salient feature of the technology and what
is driving all of this is how fast the technology
is getting better. A couple of years ago, you
could say that AI models were maybe as good as
a smart high school student. I would say that now
they're as good as a smart college student and sort
(24:08):
of reaching past that.
Speaker 1 (24:09):
I really worry, particularly at the entry.
Speaker 7 (24:12):
Level, that the AI models are are very much at
the center of what an entry level human.
Speaker 2 (24:18):
Worker would do. The problem is, what is an entry
level human worker? Because I can tell you this, I'm
not impressed. I'm not impressed by AI yet. I mean,
I'm impressed by where it's going. I'm impressed by the concept.
I'm impressed by the proof of concept. I'm impressed by
some of the early functionings. I'm not impressed with a
(24:39):
lot of it because hallucinations still run wild. Hallucinations are
still a big issue. Well right now AI is hallucinating
case law, which is one of the reasons that people
are worried about it. In fact, there was another poll
that came out that says the public is not so
cool on how quickly things are moving. Yeah, it's great
(25:01):
for punching up my resume. It's really neat for making photos.
But maybe it's moving a little too fast. They might
have a point, they might just be scared. We'll do
a little analysis on the sociology of it next. Chris
Merrill, KFI AM six forty. We're live everywhere in the iHeartRadio app.
Good evening, my friends. Chris Merrill, KFI AM six forty,
(25:24):
on demand anytime in the iHeartRadio app. All right, our question tonight: where is one place that you would never go back? I had a story about former inmates at San Quentin who went back to play an alumni baseball game at their Field of Dreams. And I say, you'd never get me back there. No way. If I got out, I'm not going back. I'm not doing a visit.
(25:46):
So where's someplace you would never go back?
Speaker 5 (25:47):
Pigeon Forge, Tennessee, the gateway to Dollywood and a multitude
of other tourist traps. I've never seen traffic so bad
and so many tourists per square foot. It's horrible and
you can't enjoy it. I don't know what people see
in that place, but it's just absolute congestion, beyond anything
(26:08):
LA traffic has ever seen. Wow. And worth hundreds of miles of driving out of the way. It's not worth it, and I will never return, and I'll avoid it at all costs.
Speaker 1 (26:17):
All right, you guys ever been to Pigeon Forge? No? Okay, good talk. Nope, nope.
Speaker 2 (26:23):
Yeah, all right. So, not a glowing review. I
guess we don't go to Pigeon Forge.
Speaker 1 (26:28):
Ever. After twenty-nine years of living in California and Huntington Beach, I said I'd never move back to Nashville, Tennessee,
where I grew up. Wow.
Speaker 2 (26:37):
A couple of people for Tennessee today. A lot of
people don't like Tennessee.
Speaker 1 (26:41):
All right, well guess where I am? Oh, thank you, NYSA. Oh, you made me break a promise.
Speaker 2 (26:47):
Oh, Gavin Newsom made you break it. Jerry Brown didn't make you break that promise. It took Newsom to do it. Okay, Larry. All right, what else?
Speaker 8 (27:00):
Good evening. I would never go back to Catalina Island. That was the most boringest place I've ever went to.
I have no desire to ever go back. Wow, Thank
you guys, have a great evening.
Speaker 1 (27:12):
Thanks you too.
Speaker 2 (27:14):
She's not going back to Catalina? Well, she wasn't there during the wine mixer. That's the big one. All right.
Speaker 1 (27:20):
Hey Chris, Hey, I thought it was kind of weird.
I totally agree with everything you're saying today. That is weird.
Speaker 8 (27:28):
I guess the show's still got a lot left, so we'll see what happens.
Speaker 1 (27:35):
And look out, Angel Martinez. Here comes Lucy. Hello. Hi Kayla. Okay, that got creepy fast. Hi Steve. He loves, Hi Kayla. Hey Steve. Hi, Kayla. Steve, you got a little thing with him? You have a little something going on there? Hi Kayla. Hi Steve. At the end
(27:55):
of the talkbacks, he always gives me a personal message. Yeah, he's always great. He's a good guy.
Speaker 2 (28:00):
Okay, all right, it sounded a little creepy to me,
but if you're cool with it, then.
Speaker 1 (28:04):
Hi Kayla, thank you. Jesus, that was... I'm gonna have nightmares. Thank you. Yeah, I don't blame you. Hey, it's kind of weird. I agree with everything you said tonight. So, Hi Kayla, thank you. I was talking about the
(28:26):
AI here.
Speaker 2 (28:27):
More than three quarters of Americans now want companies to
create AI slowly.
Speaker 1 (28:32):
They say, slow it down.
Speaker 2 (28:34):
Here to tell us more about it is the story, read by AI. Outstanding.
Speaker 9 (28:38):
Most Americans want AI progress to slow down.
Speaker 1 (28:42):
Now, that sounds natural, doesn't it? All right, but then there's kind of a giveaway when it comes to AI, which
Speaker 2 (28:50):
Is why I think that we are still it's still
not quite ready for prime time.
Speaker 1 (28:53):
All right, here's what he said. Let's try this from
the beginning, from the top.
Speaker 9 (28:57):
Most Americans want AI progress to slow down, poll finds.
Speaker 1 (29:03):
Poll finds. Okay, all right, so we're reading.
Speaker 9 (29:06):
This as tech giants race to develop AI. Seventy seven
percent of Americans say they prefer companies take their time
and get it right, even if it delays breakthroughs. According
to the twenty twenty five Axios Harris Poll one hundred,
only twenty three percent support rapid development at the risk
(29:27):
of mistakes. The cautious sentiment spans generations, from ninety one
percent of boomers to seventy four percent of gen Z.
Speaker 2 (29:36):
Wow. Even gen Z, three quarters say let's pump the brakes. But the concern is that China is going to get ahead of us. The worry is, if we're not
Speaker 1 (29:46):
Ahead of AI, China will get ahead of AI. And
then I then I mean, I mean, then.
Speaker 2 (30:00):
Uh uh, they would, they would, they would, They would
be able to replace their workers first.
Speaker 1 (30:09):
So I mean you got to win.
Speaker 9 (30:15):
Despite pressure from CEOs and investors to lead in a
global AI race, the public remains skeptical.
Speaker 1 (30:22):
Yeah.
Speaker 9 (30:23):
Many fear job loss, misinformation, and irreversible early design flaws.
The poll suggests Americans have learned from past tech missteps
and want slower, more responsible innovation.
Speaker 2 (30:37):
Yeah, and we're also in an exciting period with AI. Part of the reason that we're moving so quickly is that it's exciting, it's new. We want to see just how far this can take us, right? Just how much can we do with AI? It's kind of exciting for the kids these days, like Kayla. Kayla, you probably don't remember not having the internet,
(31:03):
do you? You do?
Speaker 4 (31:05):
Oh?
Speaker 1 (31:05):
Okay, all right, all right, that's fun. Yeah.
Speaker 2 (31:09):
So then you also remember the early days, where the web pages were really rudimentary compared to what we have today, right? Where it was Angelfire and what was the other one?
Speaker 1 (31:20):
Angelfire was one of them. There was, what was the other? What was it?
Speaker 2 (31:28):
Well, EarthLink was a provider, right? I'm just thinking of the web page builders, like Geo, GeoCities or something, and Angelfire, and you could make your own website, right?
So why are we so afraid of AI? Because ultimately
when we say slow down, it's because we're worried about something.
Speaker 1 (31:45):
Right? The things that motivate us are fear and greed.
Speaker 2 (31:47):
And so you've got some people that say, boy, I
want to see how far this can go. I want
to see what AI can do. I just want to
see how prosperous we can be with AI. Right, that's
the greed. But what causes us
to say, slow it down, let's exercise some caution.
Speaker 1 (32:02):
Is the fear side of us.
Speaker 2 (32:04):
And that's a basic emotion that is a survival mechanism, the fear, all right? And we have
a lack of understanding when it comes to AI. We
don't know what it's capable of, good or bad, and
the unknown is scary. And most of us don't understand AI.
We don't understand the limitations, we don't understand the abilities,
we don't even understand.
Speaker 1 (32:21):
How it works.
Speaker 2 (32:23):
And I was reading some more on that today too, and a lot of it is basically just predictive, right? In the same way that you might be sending a text, or you might be writing an email, and all of a sudden your Outlook or your Gmail starts to try to predict what the next word would be. Right? That's kind of how AI works, but just on a grander scale. Like, what's the most likely answer based on what the next word would be? And if this is the most likely next word, what would the next most likely word be? And that kind of thing. And that's kind of how AI builds its responses.
AI builds. It's uh, it's responses. But we have a
lack of understanding and that leads us to worry about
word case scenarios. I mean, it's the same reason that
we don't know. You know, before we had telescopes, we
(33:08):
didn't know what was in the stars, and then we
were worried about what could come and destroy us. We
don't know what is in outer space. We don't know
if there are other life forms, which makes for great
Hollywood movies where you've got.
Speaker 1 (33:24):
The War of the Worlds or some sort of an alien invasion, Independence Day.
Speaker 2 (33:28):
We worry because we don't know, and the fear comes
from that blindness that we have. And then we get
bad news headlines, right. We get the headlines about the
hallucinations of AI. We get the headlines about kids using
AI to cheat all over the place. We get warnings
(33:49):
from insiders like the gentleman from Anthropic who says this
is going to create a bunch of layoffs. We're gonna
see high unemployment, right? And, let's also be honest, we've
had bad experiences with some previous tech. It's like all
the spam mail that you've been getting, and I've just
been getting some more.
Speaker 1 (34:06):
In fact, here.
Speaker 2 (34:06):
I've got a new one: how to spot phishing emails now that EI, AI, excuse me, has cleaned up the typos. Which means AI will be used by bad actors. And nowadays, doesn't it feel like everybody's trying to take advantage of you, right? They're all trying to steal your information or your money. We're getting incessant spam and fraudulent emails, and we see how that tech is
(34:28):
being used for evil, and AI will make that worse. Bad actors will use AI to try to take advantage of you, but AI may also help you identify those threats more quickly.
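As a sketch of what that last point could look like, purely hypothetical and not from the article being quoted (the phrase list, the function, and the scoring are all invented): once the typos are gone, a filter has to lean on other signals, like urgency language and links whose visible text doesn't match where they actually go.

```python
import re

# Illustrative-only phishing heuristic (hypothetical example, not a real
# product): with AI cleaning up the typos, filters lean on other signals,
# such as urgency phrases and look-alike links.

URGENCY_PHRASES = [
    "verify your account", "act now", "password expires",
    "unusual activity", "confirm your identity",
]

def suspicious_score(subject, body, links):
    """Return a crude risk score; higher means more phishing-like.
    `links` is a list of (display text, actual URL) pairs."""
    text = (subject + " " + body).lower()
    score = sum(phrase in text for phrase in URGENCY_PHRASES)
    for shown, actual in links:
        shown_domain = re.sub(r"^https?://", "", shown).split("/")[0]
        actual_domain = re.sub(r"^https?://", "", actual).split("/")[0]
        if shown_domain and shown_domain != actual_domain:
            score += 2  # link text points one place, URL goes another
    return score

email_links = [("https://mybank.com", "https://mybank.example-login.ru")]
print(suspicious_score(
    "Unusual activity on your account",
    "Please verify your account within 24 hours.",
    email_links,
))  # prints 4: two urgency phrases plus one mismatched link
```

Nothing here is a substitute for a real filter; it's just the flavor of check that survives once misspellings stop being the giveaway.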
Chris Merrill, KFI AM six forty. We're live everywhere in the iHeartRadio app. KFI AM six forty, on demand.