Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:22):
KFI AM six forty. It's Later with Mo Kelly,
live on YouTube, Instagram, Facebook and the iHeartRadio app. Have
a huge show for you tonight. It's a beautiful day outside.
It was a nice drive on the way in, wasn't
too much traffic. Let's get to it. How about you.
Do you have a favorite grocery store?
Speaker 2 (00:42):
I do.
Speaker 1 (00:43):
In fact, I will drive an extra five or six
miles to get one of my preferred grocery locations. I'm sorry,
but I'm not stopping at Food for Less. If that
offends you, that's too bad. I'll go to Albertsons. There's
not a Pavilions around me. And I'll definitely stop at Ralphs,
except for the one that's right by my house because
that one has too many gunshots fired and according to
(01:07):
Mark Ronner, might even have Ladies of the Evening in
front of them.
Speaker 2 (01:09):
Depending on the location, that might just be my neighborhood.
I don't know. Good evening, Mark, Hello, Mo, good evening, Stephan,
How you doing, sir? Hello? How are you sir? All right?
Speaker 1 (01:20):
Just want to check in, make sure everyone was here.
We got Carnacia here. She's going to be managing the
chat on YouTube, on Facebook, on Instagram. So if you
have an issue, be careful because Carnacia blocks more people
than I do. Be on your best behavior, and she
blocks them quicker than I would. She's like, you
(01:40):
have like the angel and the devil on your shoulder.
Speaker 2 (01:43):
She's the devil one. That's not true. No, but
I will say this, I watch it. She gives a
quick warning right away.
Speaker 1 (01:52):
Yeah, I should call a foul next, and there's an
ejection. One chance.
Speaker 2 (01:57):
That's it.
Speaker 1 (01:58):
After that you're in time out. She is a person
after my heart because she just understands. There will be
no foolishness allowed in the chat. But some of the
subjects we'll be discussing tonight: Kroger is going to close
sixty locations, and if you're like me, that might impact
you depending on where you like to shop, where you
(02:18):
have to shop, because with me, there's certain foods I
know that are only available at my Ralphs, the one
that I want to go to, the one that has
the least amount of bullet holes in the side of
the building. No, seriously, I actually shop at the one
on Redondo Beach Boulevard in Venice. So don't be surprised
if you see me there. That's where I do all
(02:39):
my grocery shopping. Well, what kind of foods are we
talking about here that you must go there for? They
don't have a Sonic out here. For example, Sonic has
these tater tots which are so damn good, so good, you.
Speaker 2 (02:54):
Know what I'm talking about.
Speaker 1 (02:55):
They are so good, and they're not in a lot
of grocery stores. But this particular Ralphs has them frozen. Yes,
it's such a bummer that we don't have a Sonic
out here.
Speaker 2 (03:08):
We don't. I think what's the nearest one, maybe Fullerton
or something like that.
Speaker 1 (03:11):
Yeah, it's not close, So I will get my son
tater tots from there. They are head, head and shoulders
above all the other tater tots.
Speaker 2 (03:20):
And then what, do you air fry them or put them
in the oven?
Speaker 3 (03:22):
Oh?
Speaker 2 (03:23):
No, no, air fryer. Okay, all right, air fryer, ten minutes
at three hundred and fifty degrees.
Speaker 3 (03:27):
This is important to know. Everybody loves tots and
this isn't even a Napoleon dynamite thing either.
Speaker 2 (03:32):
But not everyone can make them well. Now, is it seasoned differently? No,
because you don't want soggy tots. I'll tell you that
right now. No soggy tots.
Speaker 1 (03:41):
Look, no disrespect to Ore-Ida, but they're just not as good.
There's too much salt. It's seasoned differently. But that's one
of the foods that I have to get from my
particular Ralphs. And also there's a fake egg product, it's
called eggs from plants, that I'll buy.
Speaker 2 (03:59):
They usually have them in stock.
Speaker 1 (04:03):
Long story short, I've found out that I have somewhat
of an egg allergy and it makes it difficult on
my arthritis, true story, if I eat too many eggs,
and so I've been eating some egg alternative products. But
I was eating egg whites and
that product Egg Beaters, and Egg Beaters has egg whites
(04:24):
in them, and so you know, it was wreaking havoc
on my body. So I had to find particular products
and they always have it at that Ralphs.
Speaker 3 (04:35):
You know, I feel like egg Beaters really missed a
golden opportunity by not shooting any commercials with Ike Turner.
Speaker 2 (04:41):
Hello, Hello, come on food.
Speaker 1 (04:46):
That would have worked five years before the B two movie.
Oh please, we can't cancel you, but we can cut
off your mic. Mark said that. Mark? Yes, send all
hate mail to at Mark Ronner.
Speaker 2 (05:00):
The real Mark Ronner. Oh my God, it's going to
be one of those nights.
Speaker 1 (05:07):
We're also going to talk about how the LAPD has
been allowed to use drones as first responders under a
new program. I don't know if I predicted this, I
don't know if I called it, but I definitely set
off the conversation surrounding the use of drones and what
I think is going to be an uncomfortable conversation with
the Constitution. That's how I like to think of it,
because when it comes to privacy, search and seizure and
(05:30):
you have drones out there basically mapping your house, I
don't know if I'm comfortable with that, And we'll revisit
that conversation.
Speaker 2 (05:38):
We'll talk about that a little bit later. So much
to do tonight.
Speaker 1 (05:41):
Of course, you can tune in on YouTube, you can
tune in on Facebook and Instagram. It's Later with Mo Kelly.
We're live everywhere. In addition to that on the iHeartRadio app.
We'll talk about Kroger when we come back.
Speaker 4 (05:52):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 1 (06:00):
Later with Mo Kelly, and as I said last segment,
I have my place where I go to shop as
far as grocery shopping goes.
Speaker 2 (06:06):
And I don't think I'm the only one.
Speaker 1 (06:08):
I think that you should recognize there's a difference in
quality between grocery stores. And I'm pretty much brand loyal.
Now there's an occasional Vons I'll end up at if
there's just nothing in my particular path, you know,
on my way home and I need to stop somewhere.
Vons will do. But I'm not stopping at Food for Less,
(06:29):
not doing it. And if you don't know, Food for
Less is a subsidiary of Kroger, as is Ralphs. Like,
Ralphs is their Lexus and Food for Less is their
Toyota. Think of it that way, except Food for
Less is not even a Toyota. It's really, really bad.
It's their Oldsmobile Delta Eighty-Eight.
Speaker 2 (06:50):
There we go. Do they have Food for Less in Washington?
I don't believe so.
Speaker 3 (06:55):
No.
Speaker 1 (06:55):
Yeah, I haven't seen them outside of California. It may
be just a California brand. But Food for Less is really,
really bad. And unfortunately, Food for Less is always near
where I live, and the Ralphs near where I live
is really bad as well. The reason I mention that
is because Kroger, the parent company of Food for Less
and Ralphs, has said that it's going to close sixty
(07:19):
US stores in the next eighteen months as a point
of consolidation and hopefully improving products. Now, they
don't list the prospective stores which are going to be closed.
But if you were just to take the eyeball test and
go through some of these stores. So the one
that's close to my house that I can't stand and
(07:41):
refuse to go to, because, well, put it this way,
if you walk into your Ralphs or Food for Less
and they have armed security, that's probably not a place
where you want to shop. And I mean I'm not
talking about the old security guard. I'm talking about armed
security, because shots may ring out at any moment.
(08:03):
That's my Ralphs, and it's at, I want to say,
Vermont and One Hundred and Twentieth Street.
Speaker 3 (08:09):
So what is the caliber of the security guards at yours?
Because mine, they were a little iffy. You weren't, you
weren't confident that if stuff started going down that they'd
be able to handle it.
Speaker 1 (08:19):
No, these were decent-sized, decent-aged, youngish men.
Speaker 2 (08:23):
I want to say, maybe early thirties. Are we talking
like gravy SEALs here? No, no, no, no, no,
serious guys. No, these seemed
Speaker 1 (08:31):
Like guys who are probably trying to get into the
police academy.
Speaker 2 (08:34):
I don't think that's how you do that.
Speaker 1 (08:36):
I'm just saying it looked like they're maybe law
enforcement adjacent. All I know is they got their, you know,
they got their gun. And it's not a revolver. Okay,
it's probably, like, it's a Glock, from what I could tell.
I wasn't, like, you know, putting my hand on it
to see, but it seemed like it.
Speaker 2 (08:53):
Was a Glock. Yeah, that's discouraging. Yeah, so I'm not
going to shop there.
Speaker 1 (08:56):
And I say that to say, uh, the Ralphs by
my house, please, please close it. Please just take it away,
because when you walk in, and I've walked in it before,
not only is there a likelihood of violence breaking out,
but it smells rancid in there. It's like, how can people
shop here? But then you remember, yeah, where I live
it's pretty much a food desert. You don't have a
(09:19):
lot of grocery items. That's a serious point. There are
not a lot of places to shop
as far as grocery options, and so you get what
you get, and you're stuck with what you're stuck with.
And I'm quite sure that's one of the lowest performing stores.
If you ever go by One Twentieth and Vermont, you'll
see it's really, really bad. But for many people it's
(09:39):
the only option. And if Kroger's going to be closing stores,
I earnestly beg of them, please close that one, but
put up something better.
Speaker 2 (09:50):
Oh, raze it to the ground. Yeah, there's no good
coming out of that store.
Speaker 3 (09:54):
Now, you're not the same kind of degenerate vampire late
night person I am. And I go to the store
after work a lot of the time. And I don't
know if you know how hairy it gets at Ralphs when
they're trying to shut the doors at one and people
still want in. They think they're gonna get some booze
or something, and the security guards like, nope, you got
to come back tomorrow, and people they don't take no
for an answer all the time.
Speaker 1 (10:14):
They do not, and the Ralphs by me, I think
they close at eleven just because it gets too dangerous
after that. I don't stop at that one. Like I say,
I'll stop at the occasional Vons. Going home from work,
I'll stop at that Vons on Pass, I think it
is, it's Pass, that's where I go.
Speaker 2 (10:29):
Yeah, that's, that's the one. And I say I go to Vons,
not choosing Vons.
Speaker 1 (10:33):
It's just the nearest grocery store because I'd rather stop
on this side of town than mine, especially if you
have like something last minute.
Speaker 2 (10:39):
It's literally a four minute drive from the studio.
Speaker 1 (10:42):
Yeah, yeah, yeah. And I don't know where there's a
Ralphs around here, so I just go to Vons. Well,
I'll say the Ralphs that's nearest us is... oh, you know
the one I'm talking about. Oh yeah. No, it's not
a little sketchy. It's a lot of sketchy. Yeah, it's, yeah,
there's a lot of illegal things happening in that parking lot.
And I didn't expect it in Burbank. And I was like, oh,
(11:02):
I guess hood areas exist out here too, Oh.
Speaker 2 (11:05):
There are there are.
Speaker 1 (11:06):
It's just you you know when you when you're in
a hood, you know that you're in a hood.
Speaker 3 (11:10):
Also, it's a long standing joke. I don't even know
if you remember when we were kids, Johnny Carson would
never fail to get a laugh when he mentioned shopping
at Ralphs because it was so unthinkable that Johnny Carson
would set foot in the Ralph's.
Speaker 1 (11:24):
Well, it's interesting you say that because in like Studio City,
they have some very good Ralphs markets.
Speaker 2 (11:33):
I think it's at Vineland.
Speaker 1 (11:34):
And when I used to live in Studio City, I
would go in that Ralph's and you would see quote
unquote celebrities, Hollywood figures, all the time, all the time.
I mean, it would be character actors, it would be movie stars.
I can't remember all the people. Like, I walked in
one day and there was, like, Wayne Knight walking in.
(11:56):
I'm trying to think of all the people. It's various people
from TV shows. You recognize them by face. Oh, Adam Sandler
one night. You will see people and then you realize, oh, yeah,
they do go grocery shopping too.
Speaker 3 (12:08):
I had no idea. I wouldn't think Ralphs is for
the A-listers. Wayne Knight, I believe.
Speaker 1 (12:12):
Well, you think of it, it's not who the A-listers
are, it's the location. And on Ventura Boulevard, you
see like two or three different Ralphs. Now, I think
they have a Whole Foods. Yeah, they do, but that's
going towards Sherman Oaks. So you have one of two choices.
It's either Whole Foods or Ralphs on Ventura Boulevard.
Speaker 3 (12:34):
You know, in six or seven years living here, I've
still never run into a celebrity at a grocery store.
Speaker 1 (12:39):
You see me every single day. Time for the news.
I knew you'd appreciate that. But Kroger is going to
be closing sixty of their stores, which means a lot
of Ralphs. Please close the one nearest to my house
for Stephan's sake and mine as well. It's Later with
Mo Kelly, KFI AM six forty. We are
live everywhere on the iHeartRadio app, and when we come back,
we have to talk about LAPD and their use of
(13:03):
drones going forward as first responders. That is probably going
to end up as a constitutional issue, but we're going
to talk about it when we come back.
Speaker 4 (13:14):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 1 (13:39):
KFI, Later with Mo Kelly, live on YouTube, Instagram, Facebook,
and the iHeartRadio app.
Speaker 2 (13:44):
And this is an ongoing conversation.
Speaker 1 (13:46):
I'm trying to be very transparent with you and open
about how I think about these issues. And I think
about not how they're presented, but not only the
unintended consequences, but how they are likely to evolve. And
when we talk about drone technology and law enforcement, of
(14:07):
course everyone wants to be safe. Of course everyone wants
law enforcement to be safe. Let's make that acknowledgment that
we agree on. We agree on all that, hopefully we do,
but then let's see where it goes from there. And
I am uncomfortable with the idea of the use of
drones in certain situations. We've talked about how drones will
(14:31):
be used as far as finding out who is responsible
for the use of illegal fireworks, and I complain about
fireworks all the time.
Speaker 2 (14:40):
I would like the people who are.
Speaker 1 (14:42):
Using them, selling them, purchasing them to be found and arrested.
Speaker 2 (14:49):
Accordingly, it is a crime.
Speaker 1 (14:51):
But I also know that a drone flying over people's
houses and over backyards to discern who is doing something
illegal makes me feel uncomfortable.
Speaker 2 (15:02):
That was just fireworks. Now the LAPD is.
Speaker 1 (15:07):
They have been approved by a civilian oversight body for
an update to their policy which would allow drones to
be used in more situations, including calls for service, like
when you call nine one one. Depending on the nature
of the call, they'll send out a drone in advance,
like an advance scout, to know what police will be
(15:29):
happening upon.
Speaker 2 (15:30):
And I get it. You want to keep law enforcement
officers safe. I get it.
Speaker 1 (15:35):
And according to the new guidelines, they'll be used for
high-risk incidents, investigative purposes, large-scale events, natural disasters,
and so forth. But here's the thing.
Speaker 2 (15:48):
If you talk about how they're going to.
Speaker 1 (15:50):
Be used, you can also discern who most likely they'll
be used in conjunction with. For example, they'll be sending
out drones ahead of officers to help with dangerous standoffs,
crowd control monitoring of mass protests for safety reasons wink wink,
(16:12):
But department officials stress that they will not be used
to track or monitor demonstrators who aren't engaged in criminal activities. Well,
if you're monitoring a demonstration, you're monitoring everyone. That's inclusive
of the people who might be an agitator. That's inclusive
of the people who are genuinely protesting and it's not
(16:33):
about me trying to say that there is no good
use for drones and surveillance and law enforcement. I'm not
saying that at all. I'm just very very careful and
particular when you talk about the use of drones for
mass surveillance, because when you're sending out the drones, it's
(16:53):
not to protect people, it's to surveil people.
Speaker 2 (16:56):
Let's be clear.
Speaker 1 (16:58):
When we're using it for first responders, you're sending out
the drones, and there's nothing wrong with this, but let's
be clear on what it is and what it isn't.
You're sending out the drones as a look ahead for police,
not as a response to help you, the lay person,
the civilian. If you call nine one one, and let's
(17:18):
say it's a domestic call, and this is all hypothetical,
but I try to reason this all the way through
and see it to its logical conclusion. You call nine
one one for a domestic dispute and the drone goes out,
The drone is not going to get there and help you.
The drone is going to get there and give the
lay of the land for law enforcement so they know
what they're walking into. And I'm okay with that, because
(17:39):
police should not have to walk in and get broadsided
or blindsided by something, because everybody should be able to
go home at the end of the night. But I
am careful in my support because I don't want drones
just hovering in my neighborhood because they are looking
(18:00):
for quote unquote, say it with me, criminals. I know
what it's like to fit the description quote unquote. I
know what it's like when you are profiled. I know
what that's like to walk through a store, for example,
(18:24):
and the assumption is that you're going to steal something.
I know how that then translates to treatment. It may
be a drone, it may be an actual law enforcement officer,
but I know, even under the best circumstances and intentions,
(18:44):
it can and historically has disproportionately negatively impacted certain communities.
I know what it's like to be driving through Venice
and be stopped and asked, how did you get this car?
How did you afford this car? And my answer is
always respectful and the same: is my license and/or
(19:09):
registration not in order? I remember I was in Venice
by Venice Beach driving a convertible, a two thousand, relatively
new, got pulled over.
Speaker 2 (19:22):
Officer just went straight forward, how did you get this car?
So when I
Speaker 1 (19:26):
think about law enforcement techniques, and even though they may
be designed to protect the community, I do think about
how they may be misused because humans are fallible, Humans
have biases. Humans under the best circumstances don't always have
great judgment. And this concerns me when you have these
(19:49):
first responders going out and monitoring, because we just had
an array of protests here in Los Angeles, and if
you have crowd control, crowd monitoring, and mass monitoring
of people, and you say, well, what do you have to
worry about if you're not doing anything criminal? That's not
the point. That's not the point. And I've made this
statement before. It's like, look, I don't have anything stuffed
(20:11):
up my behind, but it doesn't mean that I want
to submit to a cavity search. Some things are about
the principle of it all and if you're okay with it,
and well, let me rephrase, I find that the people
who have absolutely no complaint or concern are the people
who it does not impact and probably won't impact. If
(20:33):
you've never had to worry about fitting the description, then
this probably won't bother you at all. If you've never
been assumed to be doing something criminal, then this won't
have any reference point for you. If you've never worried
about walking somewhere and being followed and monitored, then this
(20:56):
probably won't resonate with you. We all have a different
prism through which we look at life, and we are
the sum total of our experiences. And I can tell
you things and you probably say, nah, that didn't happen, MO,
And yes, my experiences are real, and I can recount
them to you each and every time. And I can
(21:16):
tell you specifically, line item by line item, the times
that I know that I was profiled and pulled over,
that I was stopped and questioned, told to get out
and put my hands on the hood of a police car.
If you don't know what hot hands are, then
you don't know what I'm talking about. If you've never
been sat down on the curb, asked where you're going,
(21:39):
you won't know what I'm talking about. So I'm always
always suspicious, I think that's the right word suspicious when
we're talking about another layer of surveillance being used for
not even criminal purposes. That's what it is. When you're
telling me that you're going to be using these drones
(22:00):
to monitor crowds. You're, you're not talking about a
criminal call. You're just talking about social mass monitoring.
And that makes me uncomfortable because I know how it
can be misused and abused, and we don't know. And
in the story in the LA Times, the
question is articulated: you know, what happens to all this
(22:23):
video later on?
Speaker 2 (22:25):
Is it? Is it?
Speaker 1 (22:26):
You know, the facial recognition, And I know Mark, you
agree with some of this. The facial recognition, the mass
surveillance of all these cameras everywhere is used in ways
that most of us don't even know.
Speaker 3 (22:41):
And it's just going to keep moving in that direction.
If you've read anything about a company called Palantir, run
by Peter Thiel, uh, and if you haven't, look it up,
you're going to be shocked. This is some dystopian stuff
and it will affect everybody, no matter what color you are.
Speaker 1 (22:56):
We're talking about a singular database of everyone, everyone, I mean,
Gary Oldman, everyone, except it's not funny. No, it's not funny,
but it is actually real.
Speaker 3 (23:11):
Now, I don't know if you saw this because
it was viral yesterday and last night I saw some
video of a guy in a field who was kind
of being stalked by a drone and he took it
out with a broom handle. No, I wanted to stand
up and clap in the room I was in, and
I wouldn't be mad. No, you know, I wouldn't be mad.
Speaker 1 (23:32):
And now there's a constitutional issue of where your right
to privacy begins. You know, there's a constitutional issue of
the reasonable expectation of privacy when you are supposedly in
the public. And we are not in public, in the
way that you would not allow someone to just get
(23:53):
up on your fence and look over into your backyard.
I would liken this to that, because we're talking about
monitoring in some instances, we're not actually talking about crime
stopping or crime addressing. Now this story is talking about
allowing the drones to be used as first responders. I
would say it's like the away team, where they go
(24:15):
ahead and see in advance what law enforcement may be encountering.
Speaker 2 (24:19):
And I absolutely agree that there is a use for it.
Speaker 1 (24:22):
I'm just saying we should always consider the possible misuse
of it, in the way that Mark Ronner says: cui
bono.
Speaker 2 (24:30):
Who benefits.
Speaker 1 (24:31):
I always think about the other end of the spectrum,
who might be victimized because of it, who might be
penalized because of it? And if we don't have that discussion,
then I think we're not having a full discussion. It's
Later with Mo Kelly, KFI AM six forty, live everywhere
on the iHeartRadio app, YouTube, Instagram, and Facebook.
Speaker 4 (24:49):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 1 (25:00):
Later with Mo Kelly, live everywhere on the iHeartRadio app.
And we're having a good, respectful, thoughtful conversation in the
motown chat by the Momigos talking about the use of drones,
talking about how they can be used, how can they
be misused. We all have our biases when it comes
to law enforcement and the tactics which are used.
Speaker 2 (25:24):
I think we should be honest in that regard. We
all have our biases.
Speaker 1 (25:28):
And it's usually an accumulation of our experiences. There's things
that I've experienced that Tawala hasn't experienced, and vice versa.
And I think about the unintended consequences, not what you
tell me is going to be. It's like, well, it
could be this, and if it does that, Like, for example,
(25:48):
I may mention facial recognition, that's kind of a
neutral topic, facial recognition. There are a lot of people
who will think that, well, it's good, we'll know where everyone is.
And there's some people, and I'm just paraphrasing people's arguments.
Some people will say, like, hey, I don't feel comfortable
with that, you know, because facial recognition is not super
duper accurate. I mean, there have been a number of
(26:10):
studies where it has shown that it has been inaccurate.
There's I think it's the city of San Francisco, Mark,
please correct me. I think it was San Francisco that
opted out of facial recognition. I don't remember. I'm going
off my memory. But not everyone, in a municipal sense,
has bought in on facial recognition, partially because of the
unintended consequences of getting it wrong. This is why I
(26:34):
worry when it comes to drones about what happens if
and when they get it wrong, because we know that
it's only going to be used in certain places under
certain circumstances. That's not my read; I mean, that's what
the story says: it's going to be used for certain
types of crimes. So that means it's going to be
in a higher crime area that's gonna be used. It's
gonna be used for demonstrations. So what does that mean? Okay,
(26:56):
it's going to be used in mostly urban areas. I
wonder what that leads to, you know, because when you
have the drones going out first looking for fireworks and
you can't see me unless you're watching the show Fireworks
air Quotes, or they're going out on a nine to
one one domestic violence call, I have questions about what
(27:18):
happens to that video, you know if if they show me.
Speaker 2 (27:21):
I'll give you an example.
Speaker 1 (27:23):
Let's say there's a nine one one call and
someone is saying, uh, Tawala Sharp, this guy who lives
next to me is using illegal fireworks, and by law,
they're allowed to send out the drone for that.
Speaker 2 (27:33):
But along the way they.
Speaker 1 (27:35):
Notice that Stefan is doing something else in his backyard
which may or may not be illegal. How does that work?
How does that work? Or they find out something about
me in my backyard, or just me going about, you know,
like Google Earth, you see how people's lives end up
on Google Earth. I wonder about the unintended consequences. And
(27:58):
it's not that I'm worried about doing something illegal. No,
I just want to believe, or at least think, that there's
some level of privacy having nothing to do with a
nine one one call for someone else.
Speaker 5 (28:13):
I just marvel at how, within a week's time, we
went from talking about police departments using drones to detect
illegal use of fireworks to drones being rolled out
altogether for all types of different reasons. This is what
(28:35):
they're telling us as the public story, but they can
alter the reasons why. This is the forward-facing
press release. Yes, we don't know what is inside, the details,
the minutia of why they can and will use them. Next,
you know, there's a drone going out, I don't know,
catching me jaywalking, and it's like, jaywalking? It's, well, you know,
(28:59):
I mean, it was in the area and they were looking for
protesters, or they were looking for, you know, a nine
one one call. They were just waiting and they
saw you jaywalking and we got your face, so we're
sending you this ticket.
Speaker 1 (29:11):
It's funny, but it's not funny because we do have
the not necessarily drones, but the cameras on the buses
that just read your license plate. I mean, I have
to believe that all this works together and it's
in the same database.
Speaker 2 (29:23):
It just makes me uncomfortable.
Speaker 1 (29:24):
I can't say legally, constitutionally, where it runs afoul.
But as sure as I'm sitting here, I promise you,
within the next three hundred and sixty five days, and I
don't like making predictions, but I am on this one.
There will be a constitutional challenge on this on Fourth
Amendment grounds as far as illegal search and seizure, right
(29:50):
to privacy, and the expectations because.
Speaker 2 (29:55):
It's invariably going to happen.
Speaker 5 (29:57):
But that's only going to happen when one of
the drones happens to drift into a neighborhood where the
intended use of the drone has not been permitted if
you read what I'm saying.
Speaker 1 (30:13):
I do, I do, or if there's some sort of
unintended consequences where someone else gets caught up in something,
because, in other words, like, if the drone is out
looking for fireworks, I don't know if the drone
can claim exigent circumstances for something that's going on three
yards over, three backyards over, right. I say, in
(30:35):
the next three hundred and sixty five days, I expect
there to be a constitutional challenge to this because you
can't control all of the data which it's amassing,
and all of people's lives, which it is amassing in
real time. We have not had that discussion yet here
in America. North Korea, China, even South Korea,
Speaker 2 (30:58):
Yes, they've had that conversation. We haven't had it yet, and
I don't think we're ready for it. I also don't
remember voting on this. No, we didn't.
Speaker 1 (31:05):
It's Later with Mo Kelly, KFI AM six forty.
We're live everywhere on the iHeartRadio
Speaker 4 (31:09):
app. KFI and KOST HD two Los Angeles, Orange
County, more stimulating talk.