
September 30, 2024 • 26 mins
Gov. Gavin Newsom vetoes AI safety bill opposed by Silicon Valley. Los Angeles unveils ‘real time’ crime centers, aimed at helping officers rushing to scenes. Remember that DNA you gave 23andMe? How to save outdoor recess.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty, the Bill Handel Show
on demand on the iHeartRadio app. And this is KFI
Bill Handel here. It is a Monday morning, September thirtieth.
We started another week. The debate, the vice presidential debate
is going to occur tomorrow night. Fascinated by what's going
to happen. It's not even a question of policy anymore.

(00:23):
Policy is way way down the list. This is the personalities.
This is going to be great. JD Vance, Tim Walz.
I mean, it's just going to be fascinating.

Speaker 2 (00:34):
Okay. Now, oh, the other thing is on a negative note.

Speaker 1 (00:38):
Unfortunately, Hurricane Helene was as bad as was anticipated, if
not even more so. I mean, the destruction is just insane. And then, of course, what's going on in the Middle East? A regional war, is it about to
break out? It's going to be a rough week. Now,
let me tell you what's going on up in Sacramento. Yesterday,

(01:00):
Governor Newsom vetoed SB ten forty seven, an artificial intelligence safety bill that would have established requirements for developers of advanced AI models to create protocols aimed at preventing catastrophes. Now, this is the throwing out the baby with the bathwater

(01:22):
concept is what's happening here, and that's because.

Speaker 2 (01:26):
The AI industry has a lot of pull.

Speaker 1 (01:29):
There are certain industries that have enormous influence in Sacramento
and nationally. The doctors do a great job. Certainly, the
lawyers have a great deal of influence. Big Pharma has
a great deal of influence. Silicon Valley has a great
deal of influence, particularly here in California. And so in

(01:51):
his veto message, he said why we don't want this bill and why he's vetoing it: because it's going to give the public a false sense of security about controlling this fast-moving technology, because it targets only large-scale and expensive AI models, not smaller-scale specialized systems.

(02:13):
So this is...

Speaker 2 (02:14):
So this is where the baby with the bathwater comes in.

Speaker 1 (02:17):
And that is saying that we're going to stop major
companies involved in AI from screwing over the public and
developing, or not developing, safeguards for catastrophes that may happen, or AI running amok, because no one knows what
AI is going to do. And he is saying, well,

(02:39):
because it's only the big guys; the small guys are able to do what they want to do. Wait a minute, that's why you're vetoing the bill? Yeah, that's why he's vetoing the bill. Okay, let's talk about that for a moment. Let's get into that. Because you're talking about major players,

(03:00):
they're going to be restricted, the guardrails have to kick in.
Smaller players are wide open. So therefore it's in the
public interest to have no guardrails established at all.

Speaker 2 (03:14):
But we're going to talk about it.

Speaker 1 (03:16):
He has said that his administration wants to talk
to everybody. We want to analyze capabilities and risks. We
want to create workable protection, and we're going to keep
working with the legislature to do exactly that. But in
the meantime, I'm vetoing this bill because I want to

(03:38):
protect the public from being protected. It's very strange. Bottom line, that is the way I view it.

Speaker 2 (03:48):
He caved in to the AI.

Speaker 1 (03:51):
The major players, and of course those who were fighting this bill, are pushing for the.

Speaker 2 (03:59):
Veto: Meta.

Speaker 1 (04:01):
Of course, that's Facebook. ChatGPT, the OpenAI company. Democrats, Democratic Congress people, including Representative Nancy Pelosi.

Speaker 2 (04:10):
Who is in that area. And so here we go.

Speaker 1 (04:13):
Well, thank you, Governor Newsom, because you're protecting me by
basically not protecting me. Because... well, for example, here, let me give you an analogy: the laws in terms of pasteurization of milk.

Speaker 2 (04:26):
This is the analogy I make. It's the first thing that came to mind.

Speaker 1 (04:29):
It would only deal with the big, big dairies, not small ones. So the big ones, Alta Dena and Kroger, those would be restricted to some very high-end safety factors. The smaller ones would not. Therefore, this bill is terrible. Therefore,

(04:52):
we want no pasteurization protection at all because only the
big guys are in fact affected. That's what this is
basically saying. You know, please, come on, guys. At some point I had a bill vetoed, early on in the world of surrogacy, and I was in there very, very early, and I had gotten pretty close

(05:13):
to some assembly people. And by the way, for those
of you that are close and would like some influence
with Assembly members, take them out to lunch and bring
a paper bag full of cash.

Speaker 2 (05:26):
It helps enormously, by the way, share your lunch with them.

Speaker 1 (05:30):
So assuming you can sit down with an assembly member,
which we were able to do through various sources.

Speaker 2 (05:36):
And it's a question of influence.

Speaker 1 (05:38):
We wrote a bill, a surrogacy bill, for the protection
of surrogates, and we got it through the Assembly, and
we got it through the Senate. Governor Deukmejian at that point vetoed it. Vetoed it. I go, why? Well, because
we really don't need those protections. We'll develop more protections

(06:02):
when we need them. How about now? Nope, we don't need it. And then of course I found out why: because one of his biggest, biggest donors happened to be a very religious guy who was not in favor of any of this third-party reproduction stuff. But that's the same situation here: if you can't get all protections, we're going to give you no protection. And

(06:25):
he says, but let's sit down and talk about all protection. Let's sit down and talk about making all of this work. But
we're not going to do it right now.

Speaker 2 (06:33):
Okay.

Speaker 1 (06:34):
And by the way, do you think we need guardrails
in terms of AI? And this bill sets up stringent standards for these systems.

Speaker 2 (06:45):
There has to be a.

Speaker 1 (06:46):
Report made to the AG, the Attorney General, every year.
There have to be systems so that the state can shut down any one of these AI companies if there is, quote, a natural disaster or a catastrophe.

Speaker 2 (07:00):
I don't know what that means either. By the way,
what is.

Speaker 1 (07:02):
A catastrophe? And where do you define it? How do you define it? You know what a catastrophe is? You waking up and listening to the show every morning.

Speaker 2 (07:10):
That is a catastrophe.

Speaker 1 (07:13):
You know.

Speaker 2 (07:13):
What's the way?

Speaker 1 (07:14):
Neil is nodding, Amy is nodding, Kono is nodding.

Speaker 2 (07:19):
Thank you, guys, greatly appreciated.

Speaker 3 (07:21):
Okay, the entire audience is nodding, all of it.

Speaker 2 (07:25):
Okay.

Speaker 1 (07:25):
Now, let me tell you what's going on in LA, LA County, and the Sheriff's Department specifically. It has opened its first real time watch center.

Speaker 2 (07:36):
It just opened.

Speaker 1 (07:37):
The Police Department of LA, the LAPD, is three months away from opening its real time crime center.

Speaker 2 (07:44):
What does that mean?

Speaker 1 (07:46):
Well, these are war rooms, effectively, that allow the authorities to monitor private surveillance camera footage. They can take your Ring cameras or your security cameras, if you're a business person or a homeowner or have a system at home,
and they can use those in real time.

Speaker 2 (08:06):
Not just here's the video of what happened. It's real time.

Speaker 1 (08:09):
Now it has to be with the permission of the
homeowner or the business owner. But in reality, I don't
know who's going to say no to that. When you
think about it, it's not videoing the inside of your house.
It's not anybody looking at you, shtupping your wife or your husband, or your boyfriend, or in many cases unfortunately,

(08:31):
the family dog. Depending on where you are in that world.
I know, there's Ann giving me.

Speaker 3 (08:36):
The look. And then it took a turn. That took a turn for, like, no reason.

Speaker 2 (08:43):
It just goes that way. I don't control this stuff.

Speaker 3 (08:46):
Oh yeah, your own brain.

Speaker 2 (08:48):
Yeah, I have no... I don't know what the hell I'm going to say. So this is a system.

Speaker 1 (08:55):
And this is where, of course the civil libertarians are going,
oh my god, there's a right to privacy.

Speaker 2 (08:59):
It shouldn't happen.

Speaker 1 (09:00):
Well, of course it should, because you've now got license plate readers. If you're looking at video, for example, let's just say you've got someone who's committed a crime.

Speaker 2 (09:13):
Let's go really far.

Speaker 1 (09:15):
A kidnapping of a child outside of your home, kids
walking and someone just grabs the kid and throws him
into a van or a car and then drives off. Well,
as soon as the police are called that there is
an issue, and let's say a description, because they now
have possibly a description, or someone calls and here's the

(09:38):
video and they're following the guy in real time. And you have a license plate reader, so you
can use that video or the police can use that video.
Or there's something going on right now, a group of
people have committed a crime, are leaving, there's been an
assault that happened in front of a business in the

(10:00):
middle of the night. Well, they have this in real time; they can click into it anytime. Now, they're not going to be monitoring it in the sense of all the cameras all over, because that's impossible, right? Well, let me
tell you what actually does happen. You go to London,
the city of London. The city of London is about

(10:20):
one square mile. London, of course, is this enormous, enormous metropolis. But the actual City of London... we think of Westminster, which is a separate city, as the City of London.

Speaker 2 (10:30):
That's London because so much is in Westminster.

Speaker 1 (10:33):
So the financial part of London is this central part of London.

Speaker 2 (10:38):
There is a video camera.

Speaker 1 (10:42):
On every street corner, on the corner of every building. The entire... I mean, everybody walking, driving through, coming up to a doorway.

Speaker 2 (10:55):
Walking in and out of a business.

Speaker 1 (10:57):
It's all being videoed, every bit, and it's real time video.

Speaker 2 (11:02):
And they have the war room where you can see the.

Speaker 1 (11:06):
Police looking at these dozens, hundreds of screens that are
available and have the ability to instantly switch to another camera.
It's almost like a football game with the technical director saying,
let's pick up that lineman, let's pick up that quarterback

(11:26):
throwing a ball. You know how they switch cameras, because there are, I don't know, maybe a dozen cameras.

Speaker 2 (11:30):
Well.

Speaker 1 (11:30):
You take that concept and you not only square it,
you cube it where anytime a phone call is made
or the possibility of a crime occurs, London in real
time has the ability to follow that person or see
what's going on. That's where we're moving, and that's where we should be, given that we have the capability to.

Speaker 2 (11:54):
Look at people. Now keep in mind this is out
in public.

Speaker 1 (11:56):
So for you people that are oh my god, the
right to privacy, there is no privacy out in public.

Speaker 2 (12:01):
It doesn't exist. Anybody can video.

Speaker 1 (12:04):
You, anybody can snap pictures of you, anybody can hear
you talk. You have no right to privacy outside your
front door.

Speaker 2 (12:12):
It does not exist.

Speaker 1 (12:16):
So now, most crime is committed... well, at least people are leaving, outside. So let's say there's a burglary. Police
are called. I've just been burgled. They've run outside and
this is my address. Camera goes on because I've given
permission for the police to tap into my camera right

(12:39):
there and my video, and now they're following the perpetrator.

Speaker 2 (12:45):
In real time. There he is. It's like a helicopter.

Speaker 1 (12:48):
It's like those chases that we see, those low speed
chases where we're following people in real time. That's what
this capability is. And there are cities that are
jumping on this. I mean the fact that the LA Sheriff's Department just opened up, and LAPD is going to

(13:11):
open up three months from now.

Speaker 2 (13:15):
So here's an example.

Speaker 1 (13:16):
A nine one one call comes into the sheriff's station, the registered cameras in the vicinity.

Speaker 2 (13:21):
Immediately pop up and start

Speaker 1 (13:23):
Looking on the streets, and you're seeing this and the
cops are seeing this. They're just getting the general area.
Is there a description of the vehicle? Is there a
description of the guys who are running down the street? And what an advantage the police have, that we have. And so far there are several hundred cities

(13:46):
around the country. I haven't heard much yet in opposition. But tell me how this does not help our fight
against crime. Connect this with AI, and now we're talking
about some very sophisticated stuff.

Speaker 2 (14:05):
I don't know where the connection is.

Speaker 1 (14:08):
I guarantee you that there are people working in the
world of AI and crime prevention and surveillance, and the
surveillance issue we have to be a little careful about.
We really do, and I understand that. But the basic
premise of real time, let's see what everybody's doing out there,
I'm fine with.

Speaker 2 (14:28):
I don't believe in privacy. I don't... I don't care.

Speaker 1 (14:32):
Relative to the bad guys, I personally would rather have a privacy issue where I'm dealing with the authorities.
Helicopters flying overhead, drones watching my backyard, and what I do.
If I want to bathe naked, hey, fine, go ahead and take a camera and video me.

Speaker 2 (14:51):
The joke's on you.

Speaker 1 (14:53):
I don't know if you've ever seen me naked, but
it's not pleasant.

Speaker 3 (14:57):
I have, and I can tell you it is not pleasant.

Speaker 2 (15:01):
That's true.

Speaker 1 (15:02):
Neil and I have actually roomed together at many events
in which, well, I have been naked, going back and
forth to the shower, and to see Neil's face, the
grimace and the revulsion that.

Speaker 2 (15:19):
You showed was really a joy. I wish I could video that. But those were the days.

Speaker 3 (15:24):
Oh no, no, no, no.

Speaker 1 (15:26):
Yeah, yeah, yeah. I remember... what were you? What did you say at one point?

Speaker 2 (15:33):
I remember you screaming, oh the size of that thing?

Speaker 3 (15:39):
No, I don't remember that.

Speaker 2 (15:41):
Oh yeah I do, I do. All right, let's take
a break.

Speaker 3 (15:44):
I think I said, hey, you have a little little
lint or something.

Speaker 2 (15:48):
Yeah, we're gonna take a break. Okay, yeah, okay.

Speaker 1 (15:53):
I had asked Ann, and she's running around someplace, about when she went to 23andMe for genetic testing, to find out basically, I'm assuming, where you've come from.

Speaker 2 (16:04):
Where you are. Amy, did you do any of this background testing? Are you one of the people that are in the database? I have not done that. Okay, Kono, you? No, sir, I haven't either. And I don't know where Neil is. He's running around someplace too. I didn't do it.

Speaker 1 (16:19):
I am not interested in doing it at all, finding
out what genes I have, because frankly, it's a waste
of time.

Speaker 2 (16:26):
That's right, and I won't spend the money.

Speaker 1 (16:27):
I don't even know how much it is ninety nine bucks,
one hundred and fifty dollars, or whatever it is. And inevitably, do you ever notice that everybody who has gone back to find out their origins, they're all Egyptian princes or princesses? You ever notice that? Like if I were to find out, I would find out that I was really a fire hydrant, you know, previous reincarnations

(16:50):
that I'm at it?

Speaker 2 (16:50):
Just what is it? What?

Speaker 1 (16:53):
And is it interesting to find out that you're X percent... and I'm not talking about just 23andMe, because you have all kinds of different kinds of DNA tests... that you're part Eskimo or you're part Norwegian? Eh, who cares?

Speaker 2 (17:07):
That's for me.

Speaker 1 (17:08):
But on a serious note, the company 23andMe has fifteen million people. It's about to go bankrupt, it's about to go under. And here is the problem with that. 23andMe, they're pretty careful about the information that they have about people. This

(17:30):
is data that goes way beyond where you shop, what pizza parlor you're at, what I buy at Costco. That, with the pop-up ads, everybody knows, or you can sell that data, mine or yours, to advertisers or people that do all kinds of data mining.

(17:51):
With your genetics, that's a very different animal. That is
pretty private stuff.

Speaker 2 (17:57):
So let me.

Speaker 1 (17:58):
Start with the company 23andMe and these other DNA companies. They're not held to HIPAA standards, they're not considered medical offices. They can turn around and sell your information all day long to whoever: your DNA information, your genetic makeup. How's that for some information? Like selling

(18:19):
them to insurance companies to determine your risk, or their risk in insuring you, or any number of things.

Speaker 2 (18:29):
So that's one problem.

Speaker 1 (18:31):
By the way, if you look at the fine print,
it's there. It says if we are acquired or we merge, your information can be sold.

Speaker 2 (18:42):
That's a little scary.

Speaker 1 (18:43):
The other thing we're finding out is that it doesn't
give much information if you're looking at DNA, if you're
looking to find out if you have a predisposition to
X disease. Now what? Well, it turns out it isn't
a gene. It is a whole series of genes which
are not particularly well understood. And let's say you are

(19:05):
predisposed to diabetes or heart disease.

Speaker 2 (19:09):
What is the intervention?

Speaker 1 (19:11):
Exercise, eat well, moderation, don't smoke.

Speaker 2 (19:16):
It's the same crap.

Speaker 1 (19:17):
So what good does it do? It tells you you have to, or you should, live a moderate lifestyle anyway. Now, California does give

Speaker 2 (19:28):
Some additional genetic privacy, but not enough. And we're going
to see what we're going to get.

Speaker 1 (19:35):
By the way, consumers have assumed this risk without really
thinking about it and not getting much in return. When
the first draft of the human genome was unveiled, it
was billed as a panacea.

Speaker 2 (19:47):
Okay, it's not. Now it's kind of fun to know.

Speaker 1 (19:52):
Oh, Ann's come back. And you went to 23andMe, right? What did you find out?

Speaker 2 (20:01):
I found out what things I'm likely to be ill with,
things that I'm likely to be allergic to, those types
of things.

Speaker 1 (20:11):
Yeah, well, it turns out that it's not going to
do you that much good. Maybe the allergy part. But there's a lot of genetic information out there that is so complicated, the information is spread across so many genes, that, you know, it's not worth my... So let me ask you this. You've got... they know about your genes,

(20:34):
they know what you're made of, basically your predispositions.

Speaker 2 (20:40):
Which way to go?

Speaker 1 (20:41):
How do you feel about it being sold to advertisers,
medical groups, insurance companies.

Speaker 2 (20:47):
Yeah, not so good?

Speaker 1 (20:48):
Okay, well, yours is up for grabs. Yours is absolutely up for grabs. By the way, do you know how much Norwegian or Inuit or African you have in you?

Speaker 2 (21:04):
Yeah, I'm like ninety percent European, like white European. Yeah, Scandinavian.

Speaker 1 (21:10):
So you are Hitler's favorite group of people. Congratulations, she is an Aryan, a perfect Aryan.

Speaker 2 (21:19):
I do have a cousin out there that, I realized, well, has been ignoring my emails for about a
year and a half.

Speaker 1 (21:26):
Oh yeah, then you get to find out people in
your family who don't care about you, and then you
find out that one of them really is some serial killer.

Speaker 2 (21:34):
Although, what do they call it?

Speaker 1 (21:37):
Maternal DNA? And they catch bad people with that. That is worthwhile. That one I'll buy. Now, I'm not a privacy nut at all, and frankly, I
don't care. I mean, okay, take my DNA.

Speaker 2 (21:52):
I don't give a damn.

Speaker 1 (21:53):
But I'm in the minority. People are upset about businesses, insurance companies, vendors knowing everything about you. For example, if
you have Italian blood, are you going to eat more
pizza and pasta?

Speaker 2 (22:09):
Of course you are.

Speaker 1 (22:13):
If you are Haitian, are you more likely to eat
dogs and cats?

Speaker 2 (22:17):
You are. Recess. We've all gone out for recess.

Speaker 1 (22:26):
And guess what, outdoor recess is going to be a
thing of the past. Why? Well, let's talk about climate change, right? Chicago's heat index hit one hundred and fourteen degrees. Two hundred thousand kids couldn't go outside. Outdoor activities were canceled around Washington, D.C., and here in Southern California. Why? Because

(22:47):
kids going outside are particularly susceptible to extreme heat. And we're finding out that the schools and society and the planners, the city, have done it absolutely backward. They have taken out trees for fear of kids climbing and falling out of trees and swings. They have taken out grass, they

(23:12):
have put in those rubber surfaces, you know, to play on, because they're a little bouncy.

Speaker 2 (23:22):
And what happens, all of.

Speaker 1 (23:24):
It simply adds to heat to the point where kids
can't even walk because.

Speaker 2 (23:30):
There's so much heat that's generated and reflected.

Speaker 1 (23:34):
So do we bring trees back? We do; that is a big answer. And of course, the poorer the school, the poorer the area, the more asphalt there is. Well, you put in trees? Oh boy, there's an answer. It takes a generation, or ten or fifteen years, before any meaningful shade is created. And you know what, and this

(23:57):
was a stunner: UCLA said the difference between being out in the sun and being in the shade, it can feel up to seventy degrees cooler. Now, that seems to be ridiculous, but certainly twenty, thirty degrees cooler when you go outside. Look at the difference between standing on a hot day, you're in the shade,
and then you go out and the sun beats down

(24:18):
on you.

Speaker 2 (24:19):
It is degrees upon degrees higher. So what is the answer?

Speaker 1 (24:23):
Well, the answer is relatively cheap, and it happens instantly.

Speaker 2 (24:27):
In a matter of days. Put up, put.

Speaker 1 (24:31):
Up shade, put up tents, put up structures, you know, build them or buy them prefab, tarps, wherever kids play,
just get shade there and do it right now, and
you can do it fairly quickly, well fairly quickly, how
about a matter of days? And then that means the

(24:52):
kids can go outside. And there's such a huge difference.
Inside recess is a pain. You know, when kids go outside, what happens? Well, they socialize more, they can hang out. They group in their little cliques and make the kids that are out of the cliques feel terrible.

Speaker 2 (25:13):
It's easier to bully people when they're outside.

Speaker 1 (25:16):
All the good things that happen when you have outdoor recess,
when you have indoor recess, none of that happens. And it does take away from a
lot of socialization and just going out and playing. It's
hard to play in a classroom, and it's hard to
play in the gym when you got ten classes going
in there.

Speaker 2 (25:35):
And why do they call it recess? Any idea? I
have no idea why they call it recess.

Speaker 1 (25:44):
All right, just throwing that out there, and someone's going
to email me and say, hey, you moron, here's why
they call it recess.

Speaker 2 (25:52):
All right? Fair enough?

Speaker 1 (25:53):
This is KFI AM six forty, live everywhere on the
iHeartRadio app.

Speaker 2 (25:58):
You've been listening to the Bill Handel Show.

Speaker 1 (26:00):
Catch my show Monday through Friday, six am to nine am,
and anytime on demand on the iHeartRadio app.
