Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty on demand.
Speaker 2 (00:05):
Hey, good afternoon. I'm Chris Merrill. This is KFI AM
six forty and we are on demand anytime in the iHeartRadio app.
When you're on that app, you can click on that
talkback button. If you have questions, comments, quips, quotes, criticisms,
or compliments about the program, feel free to let us
know. The question for tonight's talkback: I had a
story about the former inmates of San Quentin going back
(00:25):
to play an alumni game on their Field of Dreams,
which is the baseball field inside the prison, and so
it's former inmates versus current inmates. Current inmates won big. Incidentally,
what is someplace that you would never go back to visit?
Because if I got out of San Quentin, I would
never even want to drive by the place, and if
(00:46):
I did... I'd go, nightmares in there. That's it. I
just never want to go back. Curious about where
it is that you've spent time that you would never
want to go back to again.
Speaker 3 (00:56):
Good afternoon, Chris. Listener here in Anaheim, California.
The one place I would not go back to: the
vacation village timeshare presentation. I thought it was cool.
I said, they were giving away, like, three days, four nights.
But it felt like I was a hostage and it
was very hostile. I do not advise that for anyone.
(01:18):
The worst. The one place: a timeshare presentation. Oh
my god, thinking about it now is giving me anxiety.
Speaker 2 (01:26):
Yeah, me too. My wife's dragged me to a couple
of those because she's like, oh, they're gonna pay for
you know. We had two nights at this dump of
a hotel in Branson, Missouri. Oh, we did it there.
And then when we first got married actually was here.
We sat through one of those timeshare presentations because they
gave us tickets to, I think, Universal Studios or something.
(01:46):
We were living in northern Arizona at the time, and
so we had to sit through that, and they said, oh,
it should only take about two hours. You know when the
presentation only takes two hours? It's when you
sign up for a timeshare right away. In both cases,
we were stuck there for four hours. We were the
(02:08):
last to leave. And he's right, you feel like a hostage.
Oh, it's the worst. I'm with you one hundred percent.
And finally, after the last one, my wife is like, okay,
we're never going to do that again. It's not worth it.
I go, was that worth the two hundred dollars that
we just saved? She goes, no, it was not.
Speaker 4 (02:23):
Hey, Chris. Someplace I'd never go back to visit: probably
San Diego Comic-Con. That place... I mean, it's great,
but after going for, you know, so many years, the
pandemic stopped me and I think I'm done with that place.
Speaker 2 (02:41):
Too crazy. It is crazy. Yeah. You've been to Comic-Con, Kayla? No? No,
it's pretty crazy. You have to be
in that mood, the same mood you have to be
in when you go, I'm going to go deal with
the crazy amount of people at, for instance, Universal Studios,
which we went to as part of our honeymoon.
(03:02):
It rained like torrents and it was wonderful. Oh so
few people there. I mean it was buckets. We're riding
roller coasters in the rain. It was so great. I
loved it so much. But Comic-Con... it's just
a madhouse. It's like going to Disney on a popular day.
It's just a madhouse. And you're standing out,
(03:22):
you're standing outside and you're in line, and, uh, it's hot.
A beautiful setting, you don't get a better setting than
the convention center in San Diego. But yeah, it's a lot.
You just have to be all in for it. If
you're not all in for Comic-Con, it's no good. Uh,
(03:42):
all right, did you want to do this other one?
We had one guy that had thoughts on trans athletes.
We were talking about the trans athletes earlier. For those
of you joining us, I just don't like the parents
are screaming at kids. I don't. I don't care for
it at all. And then we have people that are
justifying parents screaming at sixteen year olds. Well they deserved it.
(04:03):
So what are you gonna do? I mean, yeah, yeah,
freedom of speech, but you also have freedom to not be,
you know, an adult acting like a child. Process your emotions.
Let's uh, let's find the words. Probably not scream at children.
Drums on again... off... oop, hang on, wrong one, hang on.
Speaker 5 (04:23):
If I had a daughter, I would suggest that she
boycott these competitions where there's transgenders. I mean, it's a
no win for these young women that can't compete with
the genetics of a boy. I mean, you can put
lipstick on the boy, but it's not a girl. So
(04:45):
just saying it's not gonna work, huh. And this is
gonna end up tragically down the road.
Speaker 2 (04:50):
I think it is. I'm afraid it is gonna end
up tragically down the road. And I don't think that's
because you have trans athletes. I think it's because you
have people bullying. And as far as it's a no win, well,
the trans girl that competed did get gold in one
event and shared that gold with two other girls and
(05:11):
got silver in the long jump event. So
you can't really say it's a no win when she
didn't beat them all. So I don't know, man, I wouldn't.
I wouldn't deprive my daughter of the opportunity to go
win at the state track and field because of my
(05:33):
own political leanings. I just wouldn't do it. I wouldn't. Yeah,
I mean, you do what you want, it's your kid,
But I wouldn't deprive my kid. How about this: did
you happen to see that there was a story... I'm
shifting gears here. I don't want to talk about it anymore.
Did you happen to see the story about Elon Musk?
(05:54):
They say that he was doing drugs during the campaign.
The New York Times with a mind-bending article: as
Elon Musk became one of Donald Trump's closest allies last year,
leading raucous rallies and donating two hundred and seventy-five
million dollars to help him win the presidency, he was also
using drugs far more intensely than previously known, according to
(06:16):
people familiar with his activities. The consumption went well beyond
occasional use. He told people he was taking so much
ketamine that it was affecting his bladder, which is a
known effect of chronic use. He took ecstasy and psychedelic mushrooms,
and he traveled with a daily medication box that held
about twenty pills, including ones with the markings of the
stimulant Adderall, according to a photo of the box and
(06:37):
people who have seen it. I don't know about you guys;
I am stunned. I'm shocked that the guy who would
wear a hat on top of the hat and get
on stage and jump and scream and carry on the
way that he did would have any history of drug use.
That seemed like perfectly normal behavior to me. So I
(06:57):
just that is uh, that is wild that he was
using a lot of drugs. I assume we're all on drugs.
I mean, isn't there a point where you see people
and you go, okay, well there's some drug use going
on there, and he's admitted to it before. Ibuprofen?
Always. You can just... oh yeah, right, yeah, I know, right. Like, hey,
(07:18):
here's a fun drug I just started this week: saw palmetto. Yeah, yep,
it's supposed to help with prostate function. So I also
carry around a little pill box with about twenty pills
a day. Yeah, saw palmetto. I've got my multivitamin,
I've got my blood pressure medication, I've got... yeah. Yeah,
I guess I'm like Elon Musk minus the ketamine and
(07:40):
Adderall and the jumping on stage with a chainsaw. Yeah,
but otherwise exactly the same. The thing with Musk, though,
is that when he did all that stuff, we didn't
immediately go what is he on? Because he's an eccentric
person to start with, you know, so it's like, well,
Elon Musk is being crazy, but that's what makes him
a genius, and you go, nah... Musk, Elon Musk is
(08:03):
an eccentric, and he's had some really genius things. Don't
get me wrong, I don't want to take away from
anything that he's accomplished. But the behavior that we saw
was that was a little beyond just eccentric. That was
just that was drugs. That was straight up drugs. Do
you buy the black eye?
Speaker 6 (08:20):
Oh?
Speaker 2 (08:21):
That he told his son to hit him. I was
playing with my son. I told him to punch me,
and they hit a lot harder than you think. Uh.
I don't know who tells their kid to punch him.
That's weird to me. Also, I'm not surprised, though, about
the black eye. It wouldn't surprise me if the
black eye did come from the kid. And I say
(08:43):
that because the kid is a bit undisciplined as well.
He's a four-year-old who Musk brought to the
Oval Office during a press conference and was running around
and telling the president pretty horrible things during live television.
You're an effing idiot. A four-year-old is
telling the President of the United States, the commander of
(09:04):
the free world, you're an effing idiot. So yeah, it
would not surprise me if the kid has discipline issues
and would punch dad in the eye. That does not
surprise me at all. That's what brats do. You know,
they misbehave and if they're not corrected, then they'll continue
to misbehave. So yeah, that part doesn't surprise me at all.
(09:25):
So Trump was asked about Elon Musk in this press
conference this week, because you know, there was the kind
of the falling out, and then Musk said, Wow, I've
not really saved the country any money, but I sure
have saved all of my investors a lot of money.
They just haven't made squat.
Speaker 7 (09:42):
What started as a political bromance is now wrapping up
with a little less love. Elon Musk's one hundred and
thirty days as the head of the Department of Government
Efficiency came to an end today, well short of his promises,
and just days after he criticized President Trump's Big Beautiful Bill.
Trump and Musk sharing the spotlight today at the White
(10:02):
House as Musk ends his role as a special government employee.
Speaker 2 (10:06):
But while okay, then they get on more stuff blah
blah blah, and then Trump gave him a gold key
and then he said it's great, and uh, we're gonna
save a bunch of money, and YadA YadA, YadA. Does
this strike you as odd though the times that they've
had the joint press conferences in the Oval Office? Trump
is seated at the Resolute Desk, and Musk has stood by,
sometimes with his kid crawling all over the president, sometimes
(10:28):
just nearby, wearing his T-shirt, because of the respect
for the office. I know Trump is sitting, and of
course Musk is wearing a baseball cap all the time. I know that
Trump is sitting behind the desk, and I think he's saying,
I'm gonna sit behind the desk and look presidential. But
there's just something about the tableau of it where Trump is.
(10:48):
I mean, he likes to lean right, he likes to slouch.
We've seen that everywhere he goes: he slouches, which
is fine. He can slouch however he wants. I'm slouching
as we speak. It just seemed really odd. All right, tariffs on,
tariffs off, and, uh, a check on traffic. That is next.
Chris Merrill, KFI AM six forty, live everywhere in
(11:09):
the iHeartRadio app.
Speaker 1 (11:10):
You're listening to KFI AM six forty on demand.
Speaker 2 (11:15):
KFI AM six forty, more stimulating talk. Hey, Raul, is
Lucy there in the traffic center?
Speaker 6 (11:21):
Yeah, I'm here.
Speaker 2 (11:22):
Hey, Lucy. Hey, yes, from the show. I'm sorry that you
had to walk into this circus.
Speaker 6 (11:27):
Oh no, no, no, it's fantastic. You're you're so great.
Speaker 2 (11:31):
Oh you're You're wonderful to say that, but I I
hate to say the obvious here, Lucy, but you don't
sound like you don't sound like you're from here. You
sound like you're from somewhere east, like Vegas or Omaha,
or maybe Saint...
Speaker 6 (11:41):
Louis. Or North Carolina. Yeah, yeah, yeah, I know, I
get that a lot. Yeah, I'm from England originally,
but I have been in this wonderful country for thirty years,
I think. So I came here on my own. It
was just on a whim. I grew up watching American shows,
you know, like I don't know, like The Rockford Files.
Speaker 2 (12:04):
And Matlock.
Speaker 6 (12:07):
And stuff. And I was like, wow, that's really cool.
So you know, it's like one of those things where
you know, it was like the soft power of Hollywood.
So it lured me, and the old movies, you know,
with Cary Grant. And I mean, I know he was
from my neck of the woods as well, but you know,
I grew up, so I grew up with the mystique
(12:29):
of Hollywood, with with America's greatness and with its energy
and dynamism, and it still has it. So here I am.
I came to the East Coast and then came over
to California and did several things on the way.
Speaker 2 (12:44):
How long were you on the East Coast?
Speaker 6 (12:46):
Twelve years?
Speaker 2 (12:47):
Oh, okay. And then to LA? Yeah.
Speaker 6 (12:51):
Well, I lived in Boston and I love Boston still
and I have so many good friends. But you know,
the Bostonians are tough. They're a tough bunch. So but
but you know what, you get them on your side,
and they're on your side forever. That's not yes.
Speaker 2 (13:10):
So when I was interviewing for my first job in
California in San Diego, I had a station in Boston
reach out at the same time, and so then I
was thinking, Boston, of course, is a larger radio
market than San Diego. So I was thinking, well, this
is a bigger market. But I thought, do I want
to have the bigger market where everybody hates me, or
do I want to go to San Diego where the
(13:30):
weather is amazing and everybody's from somewhere else, you know,
So I chose San Diego.
Speaker 6 (13:36):
Yeah, and you're right, because Bostonians hate everybody. They
hated me at first. Oh yeah, I got death threats.
I did. Oh you're kidding, No, no I did, I did.
I mean, it didn't help, the English-Irish issue, you know,
in the nineties and so, but that was resolved and
(13:58):
then everybody got used to me. It's like, oh, you're okay, You're.
Speaker 2 (14:02):
Yeah, you just... you've got to do your time, and
then... yeah. And then, so, when
you came to California, what were the hardest names?
you I mean I was listening to your traffic and
and like you said, you've been here forever, so I
mean obviously you know all the places. What were the
hardest ones when you came here and had to learn,
you know, because every area has its own unique pronunciation.
What are some of them that were difficult for you?
Speaker 6 (14:24):
It does. Cahuenga, that's crazy, that word, Cahuenga Boulevard. I mean,
I've heard all sorts of versions of that. I've heard,
you know, let's see. You know, I'm just I'm looking
at my map right now. I mean, just the Spanish
names in general. I mean I even had a problem
(14:44):
saying San Diego. It didn't roll off the tongue.
Speaker 2 (14:48):
Did you say it like Ron Burgundy? San Diago?
Speaker 6 (14:51):
Yes, something like that. But, you know, everyone
was so kind when I came here, they were so kind.
They helped me and they coached me. They're like, no, no,
don't say that, don't say that. But then you have
other words that people say differently. Nogales...
(15:11):
Nogales Street, on the 605. So some people say
Ni-GO-les and some people say No-GAH-les. I don't know
which one it is.
Speaker 2 (15:19):
Some of those places, you can get away with either way, right?
You can say it either way. Uh, it's
sort of like Colo-RAD-o, Colo-RAH-do, you know, or...
Speaker 6 (15:30):
Oh, Ne-VAD-a or Ne-VAH-da.
Speaker 2 (15:32):
I was just gonna say that, yeah, because I've got...
President Trump said that when he was
there, and I've got this audio. This is great, and
he started telling people they were saying it wrong.
Speaker 8 (15:43):
Ne-VAD-a. Ne-VAH-da, right? And you know what I said?
Speaker 1 (15:48):
You know what I said.
Speaker 5 (15:50):
I said when I came out here, I said, nobody
says it the other way.
Speaker 2 (15:54):
It has to be Ne-VAH-da.
Speaker 9 (15:56):
No, no, if you don't say it correctly.
Speaker 2 (16:00):
And it didn't happen to me, But it happened to
a friend of mine who was killed. That's true, though
they actually found his body in Lake Mead when the
levels went down.
Speaker 6 (16:08):
That doesn't surprise me at all. And it was...
Speaker 2 (16:11):
It was funny, because they found, like, four bodies
out there, and three of them were people who actually
said Ne-VAD-a instead of Ne-VAH-da.
Speaker 6 (16:17):
But here's the thing: a lot of people won't
allow you to say Ne-VAD-a. I've had really disapproving
looks over Nevada.
Speaker 2 (16:26):
My father likes to say, uh, Ore-GONE. Oh yes, yeah, yeah,
you don't say Ore-GONE.
Speaker 6 (16:33):
Yes, the Oregonians don't like that at all, exactly.
Speaker 2 (16:36):
I've corrected him on that. And, who cares? I said,
people from Oregon care. Yeah. Yeah. So that's funny. Well,
I'm glad you're on the show, Lucy. This is great.
It's good to hear from you.
Speaker 6 (16:44):
Oh it's wonderful. I love listening and I'm chuckling away
and just I just want you to know that.
Speaker 2 (16:50):
Oh, you're brilliant. Thank you so much. All right, God
bless. All right, very good. We'll check my schedule and
see if there's time that Lucy and I can get together
for some bangers and mash. Yes, yes, love that. All right.
So I don't know. Let me think here, Kayla, do
(17:15):
we have... I don't want to get into the tariff stuff.
Do we care about the tariffs right now? You want
me to just go to break early and then spend
more time talking about AI? I got a lot of
stuff on AI this week, so many AI things. So, Kayla,
let's just do that, all right? Let's just go to...
Are you guys cool with that? Can I get a yes?
Speaker 10 (17:32):
No?
Speaker 2 (17:32):
Yes?
Speaker 8 (17:32):
No?
Speaker 2 (17:33):
Right? All right, good, thank you. That's all I needed,
just one because I have nothing else to say, and
I don't want to talk about tariffs because they bore me.
It's AI, AI, AI, AI next. Chris Merrill,
KFI AM six forty. We're live everywhere in the
iHeartRadio app.
Speaker 1 (17:45):
You're listening to KFI AM six forty on demand.
Speaker 2 (17:51):
Chris Merrill, KFI AM six forty, more stimulating talk. Talkback question today:
we had a story about former inmates at San Quentin that
went back to play an alumni baseball game against current inmates,
and I thought, if I got out of San Quentin,
I ain't going back, no way. So what is someplace
that you would never go back to visit. We've had
(18:13):
a few on here. One guy said he would
never go back to Flint, Michigan. I don't blame him. Another
dude said he would never go back to a timeshare presentation. Amen,
amen, my friend. So what is someplace you would
never go back to visit? If you're on the iHeartRadio app,
just click on that talkback and let us know
what is someplace you would never go back to visit.
(18:34):
And then obviously why why would you not go back there?
There was an interesting story that popped this week: an
interview was done with the CEO of Anthropic,
which is one of the AI companies that's out there,
and he's basically warning that a bunch of white-collar jobs
are going to go away. And I was reading, and
(18:56):
he was talking with an Axios reporter. Axios has continued
to follow up on a lot of these AI stories
this week. They've really been at the forefront of reporting on
AI here of late. And they said the US
government needs AI expertise and dominance to beat China in
the next big technological and geopolitical shift, but they can't
pull it off without the help of Microsoft, Google, OpenAI, Nvidia,
(19:18):
and others. And so we're seeing a merging of Washington and
Silicon Valley, they say, driven by necessity and fierce urgency,
so much so that they use the phrase
codependent superstructure. The government and these tech companies have formed
(19:38):
a codependent superstructure in the race to dominate AI, and
that strikes me as really odd. I know that we
have government that supports certain businesses, and it doesn't matter
if you're a Republican or a Democrat. It happens. And
then when the Democrats do it, the Republicans say that
we shouldn't be using government to pick and choose winners.
(20:00):
And then when the Republicans do it, then the Democrats say,
I can't believe that the Republicans would do this to
pick and choose winners. And of course there's always winners
that benefit typically that party, whether that's labor unions or
whether that's big business and low taxes, blah blah blah. Right,
We've seen this play out time and time again, but
(20:22):
we are seeing a lot of subsidies going towards some
of these companies, or you could say investments. Axios points
out that the White House has cultivated a deeper relationship
with America's AI giants, championing a five-hundred-billion-dollar
Stargate infrastructure project led by OpenAI, Oracle, and SoftBank, which
(20:42):
is out of Japan, and MGX from the United Arab Emirates,
all of these different things to try to give us
a leg up on the AI race. And yet I
think to myself, if we have a
government that is trying to push forward these efforts
(21:06):
in AI in a public-private partnership that benefits the
private sector, how is that different from what China does?
China is a capitalistic, communistic government, right? I mean, it's
communist capitalism. The Communist Party does this. They invest and they
(21:27):
work with the companies and then the companies make money.
But in this case, the Chinese government gets some of
the proceeds. That's really the difference is that if the
company in China that's working with the government, let's say TikTok.
If TikTok makes a bunch of money because of some
of the investments that the Chinese government has made, then
the Chinese government says, cool, that was our investment. We
want something out of it. Right, in the US, the
(21:51):
government makes an investment and then the company takes off,
but the government doesn't really recoup that. You have subsidies
or we give tax breaks, we give certain advantages,
tax abatements, call them, right, to these companies, because the
companies need the help. We want to, we want to,
(22:11):
we want to win this battle. And then you go, okay, well,
what does the government get out of it? And the answer is
nothing. But the CEOs win, and then the CEOs go,
we built that, right, It's been a big political argument.
You didn't build that. Oh, yes I did. I built
this from my hands, you know, with my bare hands.
Like no, literally, the government, the government gave you a
(22:32):
break so you could build it. Oh, I built this
from scratch all by myself. I played by the rules.
You did play by the rules. But the rules were
definitely in your favor. Because if I go out there
and I say I want to start a business, I
think the government should give me a tax break. The
government goes, well, how big is your business. The bigger
the business, the more likely you are to have some
(22:52):
help from the government. It's just the way it works.
But what do you owe back to the government. The
smaller your business, the more you're gonna owe the government.
I don't get that tax break because my business is
too small. Small business is the backbone of America. It
pays all of our taxes so that we can give
those taxes to the big companies that really are going
to do great things. Right, that's how our system works.
I find it to be flawed. Not a big fan
(23:14):
of it. I understand how we got here. I'm just
not a big fan of that. And moreover, I'm even
less of a fan of the dishonesty around it. Like, no,
we're not, Yeah you are. Just call it what it is.
You are. So either we're all on a level playing
field and we're all gonna pay the same taxes, or
we're not at a level playing field. Uh, and just
admit it. That's what it is. So what this CEO
(23:35):
of Anthropic said is basically, quit sugarcoating what's coming: mass
elimination of jobs across technology, finance, law, consulting, and other
white-collar professions, especially entry-level gigs. Entry-level
white-collar roles. That's what you need. Entry-level white collar,
(24:00):
that's a cush job until the cuts come and then
you're the first one out. But let's just call it
lower to middle management. This is from Anderson Cooper, who talked
with the Anthropic CEO, Dario Amodei, on CNN.
Speaker 11 (24:12):
You've said that AI could wipe out half of all
entry level white collar jobs and spike unemployment to ten
to twenty percent.
Speaker 1 (24:21):
How soon might that happen?
Speaker 12 (24:24):
Just to back up a little bit, you know, I've
been building AI for over a decade, and I think
maybe the most salient feature of the technology and what
is driving all of this is how fast the technology
is getting better. A couple of years ago, you could
say that AI models were maybe as good as a
smart high school student. I would say that now they're
as good as a smart college student and sort of
(24:45):
reaching past that. I really worry, particularly at the entry
level, that the AI models are, you know, very
Speaker 11 (24:53):
much at the center of what an entry-level human
worker would do. The problem is, what is an entry-level
human worker? Because I can tell you this, I'm not
I'm not impressed by AI yet. I mean, I'm impressed
by where it's going. I'm impressed by the concept. I'm
impressed by the proof of concept. I'm impressed by some
(25:13):
of the early functioning. I'm not impressed with a lot
of it, because hallucinations still run wild. Hallucinations are still
a big issue. Not only that, but in fact, I
was just seeing this today. I Googled something today, so
mad, they just couldn't find it.
Speaker 2 (25:34):
What I was looking for: my son was talking about free-range
chicken. This is just completely random here. My son
was talking about free-range chickens versus cage-free versus
industrial farm poultry, because he's buying eggs, and he bought
the cheapest ones. And he says, oh, this says that they
are cage free. What does that mean? They have one
(25:56):
foot by two feet instead of... I don't know,
the disclosure, something about how they could roam around the barn.
And he said, that's still not enough room for them, right?
So he was saying this, and I thought, I've seen
the stat before where chickens in cages at these egg farms,
they have, like, one half square foot;
(26:19):
that's where they live their entire lives. Very small. It's
about the size of an iPad. They live their entire
lives in this area, about the size of an iPad.
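A quick back-of-the-envelope check of that iPad comparison, as a minimal sketch in Python; the iPad dimensions here are an assumption (a standard 9.7-inch model), not a figure from the show.

```python
# Sanity check: is one half square foot really about the size of an iPad?
# Assumed footprint: a 9.7-inch iPad measures roughly 9.4 by 6.6 inches.
cage_area_sq_in = 0.5 * 12 * 12   # half a square foot = 72 square inches
ipad_area_sq_in = 9.4 * 6.6       # roughly 62 square inches

print(f"Cage allowance: {cage_area_sq_in:.0f} sq in")
print(f"iPad footprint: {ipad_area_sq_in:.0f} sq in")
```

The two areas land within roughly fifteen percent of each other, so the on-air comparison holds up as a ballpark figure.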
And so I was looking that up. I was
trying to find the exact area, and I kept... well,
first of all, I Binged it, because, Kayla, you know, Bing.
And then Bing wasn't giving me the answer I wanted
(26:41):
because I said, how big? I put in the question
something like how large a space do hens have in
an industrial poultry farm? And it said, if you're building
your chicken coop, you should offer this much. And I go,
that's not what I wanted. And then
it gave me a bunch of different instructions on how
to build a chicken coop. So I said, forget this,
and I went over to Google, you know, like
what the kids use. And, uh, I went over
(27:03):
to Google and Google gave me the same stuff. And
then, you know how they all respond now with
the AI response? For Bing, it's Copilot; for Google,
it's Gemini. The Gemini response told me the same stuff.
It was wrong. And then, it was interesting, it comes
with a disclaimer and it said AI responses may include mistakes.
(27:27):
They had to add a disclaimer. So when
you hear... uh, is it Amodei? Is that how you
say his name? A-M-O-D-E-I. Dario Amodei
is his name, the guy from Anthropic. When he says
that they're at about entry-level job positions, I
think to myself, what are the expectations of a human
(27:48):
entry level job position? Because if the expectation is it's
going to be filled with mistakes, hallucinations and false attributions, yeah,
it's there. But if the expectation is that the person
is going to do their job right, then the large
language models are not there. They're just not there yet.
(28:09):
And I know I'm only talking about the large language
models and not the other AI, but it's not there yet.
All right, he continues on. And so it's hard
Speaker 12 (28:19):
to estimate, you know, exactly what the impact would be,
and you know that there's always this question of adaptation,
and you know, these these technology changes have happened before.
Speaker 9 (28:30):
But I think what is striking to me about
this AI boom is that it's bigger, and it's broader,
and it's moving faster than anything has before. And so,
compared to previous technology changes, I'm a little bit more
worried about the labor impact, simply because it's happening so
fast that, yes, people will adapt, but they may
not adapt fast enough, and so, you know,
(28:51):
there may be an adjustment period.
Speaker 2 (28:53):
Okay, I think that's a very reasonable thing that he says,
and I like that he's clear that people will adapt.
And this goes back to the Luddites, right? Like, you
can't bring in these automatic weaving machines; how are we
gonna make any money as textile workers? This goes back
to the early eighteen hundreds, right? This is where the
Luddites got their name, people who fear technology. And what
(29:15):
did we learn when they brought in the looms? They
were able to make a lot more clothes, a lot faster,
and it brought the price down on clothing. You were
able to crank out supply a lot faster, and then
more people could wear your clothes, and sales went
up, and it actually created more jobs. Right now, AI
is doing the same thing. People that are using it
(29:36):
effectively are enhancing their job. They're becoming more efficient, which
a lot of bosses like, oh, look at how much
more work we can put on them, And people are like, Ah,
you can't make me work harder, how dare you? But
it's the same argument that they had in the early eighteen hundreds. Oh,
this technology, now you're gonna want me to do more stuff. Yeah,
but it's gonna be easier to do that stuff. You're
working the same number of hours, you're just getting more done, right,
(29:57):
So that's cool. And anytime that we've had an advancement
in technology... I used to hear this in the
eighties all the time. Raul, you remember that. Kayla, you
don't, because you were just a twinkle
in your father's eye in the eighties. But Raul and
I remember this. You remember, back in the eighties, it
was like, robots are going to replace us all. It's
really bad. I grew up in Michigan and so it
was all about the robots. They're going to destroy the
auto industry. And, uh, they were talking about how they're
(30:21):
going to replace all these jobs and all these people
are going to be out of work and it's going
to be terrible. And what we found out is you
could crank out cars faster, with more consistent quality, and
the price of vehicles came down, right? That's what we saw.
The other problems with the auto industry in Michigan are
kind of self-imposed. But as far as the industrialization
(30:45):
of the robotics, that wasn't it; that made things better.
AI is going to make things better. But, as
he points out, it's happening so quickly that there's going
to be an adjustment period. And companies are
looking to save money, especially if we're headed towards some
sort of a recession, which, you know, people keep talking
about, recession here, recession there. It hasn't really played out in
(31:06):
the numbers so far, but we'll see how that goes.
As the summer goes on, companies are gonna be looking
at how can we be more efficient. If I can
bring AI in and I can have my one worker
do twice as much work, then I don't need two workers.
So that's when they say we're gonna be able to
lay people off. Right, We're gonna be able to replace people,
and especially in things like entry level, white collar. So
(31:28):
how many paralegals does a lawyer need if
AI can do some of that? Well, right now, AI
is hallucinating case law, which is one of the reasons
that people are worried about it. In fact, there was
another poll that came out that says the public is
not so cool on how quickly things are moving. Yeah,
it's great for punching up my resume. It's really neat
(31:49):
for making photos, but maybe it's moving a little too fast.
They might have a point, they might just be scared.
We'll do a little analysis on the sociology of it next.
Chris Merrill, KFI AM six forty. We're live everywhere in
iHeartRadio App.
Speaker 1 (32:04):
You're listening to KFI AM six forty on demand.
Speaker 2 (32:08):
Good evening, my friends. Chris Merrill, KFI AM six forty,
on demand anytime in the iHeartRadio app. All right, our question
tonight: where is one place that you would never
go back? I got a story about former inmates at
San Quentin that went back to play an alumni baseball
game at their Field of Dreams, and I say, you'd
(32:28):
never get me back there. No way. If I got out,
I'm not going back. I'm not doing a visit. So
where's someplace you would never go back?
Speaker 5 (32:36):
Pigeon Forge, Tennessee, the gateway to Dollywood, and a multitude
of other tourist traps. I've never seen traffic so bad
and so many tourists per square foot. It's horrible and
you can't enjoy it. I don't know what people see
in that place, but it's just absolute congestion beyond anything
(32:57):
LA traffic has ever seen. Wow. Hundreds of miles of
driving out of the way. It's not worth it, and
I will never return; avoid it at all costs.
Speaker 2 (33:06):
All right, you guys ever been to Pigeon Forge?
Speaker 6 (33:10):
No?
Speaker 2 (33:10):
Okay, good talk?
Speaker 4 (33:11):
Nope?
Speaker 2 (33:12):
No, yeah, all right, so now, uh not a glowing review.
I guess we don't go to Pigeon Forge ever. After
twenty nine years of living in California and Huntington Beach,
I said I'd never move back to Nashville, Tennessee, where
I grew up. Wow, a couple of people for Tennessee today.
A lot of people don't like Tennessee. Right, Well, guess
(33:32):
where I am. Oh, thank you, Newsom. Oh, you
made me break a promise. Oh, Gavin Newsom made you.
Jerry Brown didn't make you break that promise. It took
Newsom to do it. Okay, Larry. All right, what else?
Speaker 6 (33:49):
Good evening.
Speaker 2 (33:50):
I would never go back to Catalina Island. That was
the most boringest place I've ever went to. Oh, I
have no desire to ever go back. Wow.
Speaker 3 (34:00):
Thank you guys.
Speaker 2 (34:00):
Have a great evening. Thanks, you too. She's not going
back to Catalina. Well, she wasn't there during the
wine mixer. That's the big one.
Speaker 10 (34:08):
All right, Hey, Chris, Hey, I thought it was kind
of weird. I totally agree with everything you're saying today.
Speaker 2 (34:17):
That is weird.
Speaker 10 (34:17):
I guess the show has still got a lot left,
so we'll see what happens. And look out, Angel Martinez,
here comes Lucy Heal. Hi, Kayla.
Speaker 2 (34:30):
Okay, that got creepy fast.
Speaker 6 (34:31):
Hi Steve, he loves Hi Kayla. Hey Steve, Hi Kayla, Steve.
Speaker 2 (34:39):
You had a little thing with him. You have a
little something going on there, Hi Kayla, Hi Steve. At
the end of the talk backs, he always gives me
a personal message.
Speaker 6 (34:47):
Yeah.
Speaker 2 (34:48):
Always, always, He's great.
Speaker 6 (34:48):
He's a good guy.
Speaker 2 (34:49):
Okay, all right. It sounded a little creepy to me,
but if you're cool with it...
Speaker 10 (34:53):
Then Hi Kayla, Kayla.
Speaker 6 (34:58):
Oh, Jesus, that was creepy. I'm gonna
have nightmares.
Speaker 2 (35:04):
Thank you. Yeah, I don't know, man. Hey, it's kind of weird,
I agree with everything you said tonight. So, hi, Kayla.
Thank you. I was talking about the AI here: more
than three quarters of Americans now want companies to create
AI slowly. They say, slow it down. Here to tell
us more about it is the story, read by AI. Outstanding.
Speaker 8 (35:27):
Most Americans want AI progress to slow down.
Speaker 2 (35:31):
Now, that sounds natural, doesn't it? All right, but then
there's kind of a giveaway when it comes to AI,
which is why I think that it's
still not quite ready for prime time. All right, here's
what he said. Let's try this from the beginning.
Speaker 1 (35:45):
From the top.
Speaker 8 (35:46):
Most Americans want AI progress to slow down, poll finds.
Speaker 2 (35:52):
Poll finds. Okay, all right, so we're reading this.
Speaker 8 (35:55):
As tech giants race to develop AI, seventy-seven
percent of Americans say they prefer companies take their time
and get it right, even if it delays breakthroughs. According
to the twenty twenty five Axios Harris Poll one hundred,
only twenty three percent support rapid development at the risk
(36:16):
of mistakes. The cautious sentiment spans generations, from ninety-one
percent of boomers to seventy-four percent of Gen Z.
Speaker 2 (36:25):
Wow, even Gen Z: three quarters say let's pump
the brakes. But the concern is that China is going
to get ahead of us. Don't worry: if we're
not ahead of AI, China will get ahead of AI.
And then, I mean, then they
(36:52):
would, they would be able to
replace their workers first. So, I mean, you've got to win.
Speaker 8 (37:04):
Despite pressure from CEOs and investors to lead in a
global AI race, the public remains skeptical.
Speaker 2 (37:11):
Yeah.
Speaker 8 (37:12):
Many fear job loss, misinformation, and irreversible early design flaws.
The poll suggests Americans have learned from past tech missteps
and want slower, more responsible innovation.
Speaker 2 (37:26):
Yeah, and we're also at an exciting period with AI.
Part of the reason that we're moving
so quickly is that it's exciting, it's new. We want
to see just how far this can take us, right?
Just how much can we do with AI? It's
kind of exciting for the kids these days. Like,
Kayla... Kayla, you probably don't remember not having the internet,
(37:52):
do you?
Speaker 5 (37:52):
I do you do?
Speaker 6 (37:54):
Oh?
Speaker 2 (37:54):
Okay, all right, all right, that's fun. Yeah. So then
you also remember, like, the days where the web
pages were really rudimentary compared to what we have today, right?
Where it was Angelfire and... what was the other one?
Angelfire was one of them. There was... what was
the other one, man?
Speaker 8 (38:15):
What was it?
Speaker 2 (38:17):
Well, EarthLink was a provider, right? I'm just thinking of
the web page builders, like GeoCities or something, and
Angelfire, and you could make your own website, right?
MySpace made it easy for anybody to make their
own website. And that was, you know, that was fifteen
years after we were all sort of getting onto
the Internet in the first place. And if you look
back at MySpace from the mid-to-late two thousands,
(38:41):
you see just how rudimentary that looked. So I think
we're at this really exciting time like we were in
the nineties with the explosion of the Internet. We just
want to see how much we can get out of it,
and we're all worried about what if China gets it? Okay,
guess... hello? Hello, is there anyone there? Hello? Oh, that's freedom.
(39:06):
That's the sound of freedom. Oh, nobody... and then, right,
nobody complained when we got rid of that noise? No,
not at all. And then, once, uh, you'd be
on the internet, you'd be in your chat room with
your AOL, right, and then Mom would pick up the phone.
Speaker 6 (39:25):
Mom, I was using that.
Speaker 2 (39:31):
It was the worst. Faxes would come in. It's terrible.
So why are we so afraid of AI? Because, ultimately,
when we say slow down, it's because we're worried about something.
Speaker 8 (39:41):
Right.
Speaker 2 (39:41):
Two of the things that motivate us: fear and greed.
And so you've got some people that say, boy, I
want to see how far this can go. I want
to see what AI can do. I just want to
see how prosperous we can be with AI. Right, that's
the greed. But what causes us
to say, slow it down, let's exercise some caution, is
the fear side of us. And that's
a basic emotion. That is a survival mechanism, the fear,
(40:05):
all right. And we have a lack of understanding when
it comes to AI. We don't know what it's capable of,
good or bad, and the unknown is scary. And most
of us don't understand AI. We don't understand the limitations,
we don't understand the abilities, we don't even understand
how it works.
And I was reading some more on that today too,
and a lot of it is basically just predictive, right?
In the same way that you might be
sending a text, or you might
be writing an email, and all of a sudden,
your Outlook or your Gmail starts to try to
predict what the next word would be, right? That's kind
of how AI works, but just on a grander scale. Like,
what's the most likely answer based on what the next
word would be? And if this is the most likely
word would be? And if this is the most likely
next word, what would the next most likely word be?
And that kind of thing. And that's kind of how
AI builds its responses.
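To make that predict-the-next-word loop concrete, here is a minimal sketch in Python. It uses the small, publicly available GPT-2 model through the Hugging Face transformers library purely as a stand-in (an assumption for illustration; the show doesn't name any particular model), greedily taking the single most likely next token at each step:

```python
# A minimal next-word prediction loop: score every possible next token,
# keep the most likely one, append it to the text, and repeat.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tokenizer("The weather in San Diego is", return_tensors="pt").input_ids

for _ in range(10):  # generate ten tokens, one at a time
    with torch.no_grad():
        logits = model(ids).logits        # a score for every token in the vocabulary
    next_id = logits[0, -1].argmax()      # greedy: the single most likely next token
    ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))
```

Production chat systems usually sample from the probability distribution rather than always taking the top token, which is part of why answers vary from run to run, and it is also where hallucinations come from: the model is optimizing for a plausible next word, not a verified fact.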
But we have a lack
of understanding, and that leads us to worry about worst
case scenarios. I mean, it's the same reason that we
don't know. You know, before we had telescopes, we didn't
(41:05):
know what was in the stars, and then we were
worried about what could come and destroy us. We don't
know what is in outer space. We don't know if
there are other life forms, which makes for great Hollywood
movies, where you've got The War of the Worlds or
some sort of an alien invasion, Independence Day. We worry
because we don't know, and the fear comes from that
(41:32):
blindness that we have. And then we get bad news headlines, right,
we get the headlines about the hallucinations of AI. We
get the headlines about kids using AI to cheat all
over the place. We get warnings from insiders like the
gentleman from Anthropic who says this is going to create
a bunch of layoffs. We're going to see high unemployment, right?
(41:54):
and we also, let's be honest, we've had bad experiences
with some previous tech. It's like all this spam mail
that you've been getting, and I've just been getting some more.
In fact, here I've got a new one how to
spot phishing emails. Now that EI is cleaned, AI excuse me,
has cleaned up the typos, which means AI will be
used by bad actors. And nowadays, doesn't it feel like
everybody's trying to take advantage of you? They're all
(42:16):
trying to steal your information or your money. We're getting
incessant spam and fraudulent emails, and we see how that
tech is being used for evil, and AI will make
that worse. Bad actors will use AI to try to
take advantage of you, but AI may also help you
identify those threats more quickly. We don't really consider those advantages.
(42:39):
We worry about the dangers. That is a normal survival mechanism.
All right, enough of me sounding like an old man.
Not really, I can't get away from it, but I
will stop talking about old man topics, like, the
kids these days and their fancy electronics are going to
ruin everything. There's no business like show business. Every Sunday,
(43:01):
six o'clock. That is next. Chris Merrill, KFI AM six
forty. We're live everywhere in the iHeartRadio app.
Speaker 1 (43:06):
KFI AM six forty on demand.