Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Bill Handle on demand from KFI AM
six forty.
Speaker 2 (00:06):
And this is KFI, Handle here. It is a Thursday morning,
October third. A lot of stories we're looking at: still
waiting for Israel and the retaliation that is absolutely going
to happen against Iran. And a new court filing yesterday. Jack Smith,
the special prosecutor, came up with more allegations against Donald
(00:27):
Trump and the attempt to undo the election. And because
of the Supreme Court ruling that the president
has immunity if he's doing a presidential act, Jack Smith
is arguing these were private acts, a candidate trying
to overturn an election, not a president.
Speaker 1 (00:47):
And that's where it's going to go. Whether the court
will buy that, who the hell knows. Now.
Speaker 2 (00:50):
A couple of days ago, not good news: massive port
strike along the East and Gulf Coasts. Fifty thousand members
of the International Longshoremen's Association, boom, out the door, walking
picket lines. And here's what they want.
Speaker 1 (01:06):
Well, first of all, let me give you a couple
of quotes.
Speaker 2 (01:09):
The president of the union, Harold Daggett, said, and I quote:
if we have to be out there a month or
two months, this world will collapse.
Speaker 1 (01:17):
And he said, go blame them.
Speaker 2 (01:19):
And he's pointing to management, that is, the port operators,
the owners, the terminal owners.
Speaker 1 (01:26):
Et cetera.
Speaker 2 (01:27):
And here is the real problem. They're asking for wage
increases, and Amy, you had said the longshoremen average:
they start at eighty-one thousand dollars a year and they
go up to two hundred thousand dollars a year. Not
bad for a union job. Though, by the way, what
kind of marketable skills do you need to move cargo
(01:48):
around and be a longshoreman? That's not to say it isn't
valuable work, but it's not quite the same as getting
a degree or getting training as a paralegal or someone
in coding or a car mechanic.
Speaker 3 (02:03):
Is it particularly dangerous?
Speaker 1 (02:05):
Not anymore? Not anymore. I mean it can be.
Speaker 2 (02:09):
You're under those, you know, you're under those big containers.
I don't know how many people are hurt or killed
every year. I'm assuming a minimal number because they're pretty
careful about safety.
Speaker 1 (02:18):
But here is the problem. Two issues. Number one, they
want more money, a pile more money.
Speaker 2 (02:23):
They want a five-dollar-an-hour raise every year
for the next six years. So basically it's a seventy-seven
percent raise over six years on top of this
enormous salary, which, by the way, I don't particularly begrudge
them, because there aren't that many, and I think spread out. Yeah,
(02:44):
if I were a terminal owner,
I'd complain. But that's management, and that is the employees,
and that's the union.
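A quick check of that seventy-seven percent figure, a back-of-the-envelope sketch assuming the widely reported ILA top base rate of about $39 an hour, a number not stated on air:

\[
\$5/\text{hr} \times 6\ \text{years} = \$30/\text{hr added}, \qquad \frac{\$30/\text{hr}}{\$39/\text{hr}} \approx 0.77 \approx 77\%.
\]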
Speaker 1 (02:54):
That's sort of a given. Now.
Speaker 2 (02:56):
The part that I'm having a much harder time
with is that part of the negotiations is that automation has to stop.
Technology has to either slow down or stop, because what
technology does is you need fewer workers, and therefore they
(03:17):
want to stop technology.
Speaker 1 (03:21):
That is scary stuff.
Speaker 2 (03:24):
Can you imagine the United Auto Workers saying,
no more robots, we want to do it ourselves, because
robots eliminate a lot of the workforce? That's the part
that's scary. These are Luddites, and they're willing to simply
(03:45):
stop science, stop advancement, to keep people working. You know,
at some point, I mean, you don't even talk about that.
You say, that is what the future is about. It
is to automate. Same thing happened, and here's one,
with the rail workers. You know, one of the most powerful
(04:06):
unions that ever existed, and still powerful, are the rail
workers, right, the employees that work for the rail lines,
the railroads, the people who do what the engineers
and the rest do around the railroads. There used to be cabooses, remember cabooses,
which don't exist anymore. Well, cabooses were retained for
(04:29):
decades and decades after they were not needed anymore. And
the caboosemen, you know, the cabooses were about getting the mail,
you know, swinging it aboard when it's there, and about helping
water go into the boilers. And so for decades after
they were not needed, there were caboosemen. Then they got
(04:51):
rid of the cabooses. The caboosemen were still there. With no cabooses,
it was a caboose-less job.
Speaker 1 (05:06):
But you still had a caboose operator there.
Speaker 2 (05:09):
I don't even know where they sat, frankly, because there
were no cabooses to sit in.
Speaker 3 (05:13):
They sat on their caboose.
Speaker 1 (05:14):
That's a good point.
Speaker 2 (05:15):
So we're talking about much the same thing, probably not
to that extent, but to say that automation has to
slow down? You go, okay, man. Now.
Speaker 1 (05:26):
We've reached the point where it's scary.
Speaker 2 (05:29):
I get the money part. I get the wages, I
get the benefits, I get...
Speaker 1 (05:34):
Oh, by the way, I'd love to know what the
benefits are for longshoremen.
Speaker 2 (05:37):
I mean, that is a job. Do you know the
most highly paid? Here's one for you. The most highly
paid employees are the stagehands in New York. Amy,
look this one up real quickly: stagehand unions in
New York, on Broadway. And what is the average salary
of a guy who moves sets? Okay, hey, you're up, right,
(06:03):
points to it, microphone headsets to go, hey, you're up.
Speaker 1 (06:07):
This is what they do. Amy, what is it?
Speaker 3 (06:09):
It's not that high.
Speaker 1 (06:11):
Oh, I thought it was enormous.
Speaker 4 (06:13):
Well, no, it says, as of September, for a union
stagehand in New York...
Speaker 1 (06:20):
It's part of the union. Now keep in mind we're talking.
Speaker 4 (06:22):
Twenty bucks an hour.
Speaker 1 (06:23):
That's twenty bucks an hour.
Speaker 4 (06:25):
Salaries range from twenty-four to forty-seven dollars an hour.
Speaker 2 (06:28):
Okay, how about the union elevator manufacturers, or elevator mechanics? Okay,
I blew it on that one. Let's go for the
other one, elevator mechanics. Hey, Handle, the union...
Speaker 5 (06:40):
Do you remember one of your anniversary shows we were
doing in Vegas, and all of a sudden there was
some need we had, oh yeah, and they wouldn't allow
Speaker 3 (06:51):
us to do it. We had to get a union person.
Speaker 2 (06:54):
We're talking about changing a light bulb with lights
or something, and the way it was, it was a crazy
thousand dollars or something. It's insanity. I mean, this is
the power that the unions have.
Speaker 4 (07:05):
Union elevator mechanic in Los Angeles makes one hundred and
twenty-nine thousand, four hundred and eighty dollars
Speaker 3 (07:11):
Per year, okay on average.
Speaker 2 (07:15):
Right. And by the way, I think, I think they're
pushing also for getting rid of buttons and putting elevator
operators back in, you know, with those little pillbox hats.
The pillbox hat, though.
Speaker 1 (07:24):
I like that.
Speaker 5 (07:25):
My dad was the elevator operator at Bullocks Wilshire.
Speaker 3 (07:28):
That's how he met my mom. He would go
to her floor.
Speaker 5 (07:31):
She was a receptionist, and the doors would open, he'd
see her, and then he'd follow her with his eyes
as they shut.
Speaker 3 (07:39):
And then they made like seven babies. That's so sweet.
Mixed race babies.
Speaker 1 (07:45):
Yeah, that's, I mean, that's sweet, that is. I don't care.
Speaker 5 (07:51):
Hey, you know what, how many longshoremen are on
strike right now, Bill? Fifty thousand. Okay. So fifty thousand
is going to cost us five billion dollars per day.
If we eradicated fifty thousand lawyers, everything would still
stay the same.
Speaker 2 (08:05):
That's true, you'd get paid more. You don't need lawyers.
You do need longshoremen, or women, or whatever,
or trans, or whatever they call the longshore people. Longshore people?
Is that because you can't use the word men anymore?
All right, okay. Now we move on, and this one
has to do with a story that I always share
(08:26):
with you. I'm constantly talking about how screwed up and
how antiquated, defective, and just
Speaker 1 (08:35):
Not workable.
Speaker 2 (08:35):
And this is our medical system in the United States.
It's a disaster. And so I want to tell you
about what's going on recently, what's happening. I don't know
if you've ever heard of PBMs, okay. And I've always wanted
to figure out: is it pharmacy benefit managers, or is
it public bowel movements?
Speaker 1 (08:52):
And then I.
Speaker 2 (08:52):
Realize it's interchangeable. You can use either one to describe
what's happening. And there are companies, these PBMs, that are
effectively middlemen, and they were created to make it cheaper and
easier for us to get drugs. So they sit between
the health plans that buy drugs for the patients who
(09:17):
buy drugs from the health plans, or the pharmacies, and
the manufacturers. They're sort of the middle people in the
whole thing. They sort of control prices, and what drugs
are on formularies, and what drugs are covered under plans.
They make the decisions, and they're making money coming
and going, because what they do is, for
(09:41):
a given insurance company, they decide this is the formulary,
this is covered, you have to use the generic. And
the more expensive the drugs are, the more
money they make, because they get, quote, rebates.
Speaker 1 (09:57):
Well, Blue Shield is saying, that's enough. Here's what we're
gonna do.
Speaker 2 (10:01):
We're gonna go directly to the manufacturers. Forget about PBMs.
You're not gonna PBM on us anymore. And that's gonna
change everything, because it was established to save money, and
it turns out they're making money on the backs of
(10:21):
insurance companies, which means on the backs of us consumers.
And, oh, by the way, just to let you
know how much money they make: you know who owns
the PBMs now? The big drug companies. Big Pharma has
bought all the big PBMs. What does that tell you?
Speaker 1 (10:40):
So the system is completely broken.
Speaker 2 (10:42):
And here's just another chapter in this, another
page in the book of we're getting screwed.
Speaker 1 (10:49):
And then we go back to how do you undo
all this?
Speaker 2 (10:52):
You go to what's called Medicare for All, you know,
basically national health, where, you know who negotiates? The government negotiates.
And let me tell you how powerful that is:
when you're negotiating with Big Pharma and you're representing eighty
million people, you can cut a pretty good deal. Not
bad. So, and by the way, Congress
(11:16):
just allowed Medicare, the government, which
is national health, to start negotiating with Big Pharma. And
the first thing they did is, instead of three, four,
eight hundred dollars, a thousand dollars for insulin, which of course,
if you don't have insulin you die, it's going to
be thirty-five dollars a month. Let me tell you
what Big Pharma did on that one. So we're moving
(11:41):
in the right direction, and I think we're moving, we're
going to get there, maybe not in my lifetime: national
health, where it just becomes part of what we do.
Like you get Medicare, you're going to get Medicare for All,
which is the way it should be done. All right, enough,
excuse me, enough of that. Hey there's a case, and
(12:02):
this is of course right up my alley. Someone said,
oh yeah, you've got to talk about this. And this
has to do with suing people, which I happen to love.
There is something called an arbitration agreement, and it's in virtually
every contract you have; certainly you can't go to a
doctor without signing an arbitration agreement.
Speaker 1 (12:22):
Doctors just won't see you. And what it says is, in
Speaker 2 (12:24):
The event of a dispute, you're not going to court,
We're going to file for binding arbitration. Binding arbitration means
exactly that arbitrator makes their decision.
Speaker 1 (12:34):
Ciao, baby, it's over.
Speaker 2 (12:36):
And the reason the folks, the medical people, the car dealerships,
whatever force you to sign binding arbitration is because arbitrator
awards are far less than jury awards. And when you
get huge, big cases, a good attorney is going to
have that jury crying.
Speaker 1 (12:54):
And here's what happened.
Speaker 2 (12:55):
There's a New Jersey couple, severe accident during an Uber ride,
and I mean really severe. The mom sustained several fractures
throughout her whole body: cervical and lumbar spine, rib fractures, surgeries,
I mean, just a mess. He has diminished use of his
left wrist and a fractured sternum.
Speaker 1 (13:15):
I mean, this is really big stuff.
Speaker 2 (13:18):
So they're in this Uber, and they try taking Uber
to court for a jury trial. The lower court said, yeah,
you can, but the appellate court turned that around and
said they can't, because they had agreed to Uber's updated
terms and conditions requiring arbitration. Look at the small print; there
(13:44):
it is, you agreed to arbitration. Now, the couple said
it was her minor daughter that used mom's phone to
agree to Uber's terms of service, clicking on a
button that verified she was eighteen years old, so therefore
that shouldn't apply. The appellate court said, yes, it should.
(14:04):
Those terms are valid and enforceable and include the acknowledgment
that disputes concerning auto accidents or personal injuries will be
resolved through binding arbitration and not in a court of law,
And again, well, I'm a Kaiser member, and I
(14:24):
have been, you know, ever since I've been five years old.
And you know what it is: every year in the membership,
you know, or when you apply, right there, binding arbitration.
If you have a dispute, it's going to be heard
by an arbitrator, not in a court of law. Malpractice, whatever,
it doesn't matter.
Speaker 1 (14:40):
And this is every doctor you go to, every dentist
you go to, you know.
Speaker 2 (14:45):
You sign the consent form, and the form
says you might die, and here are the possible reactions.
I love those, where you go in and
you have a problem with your toe, or you have
a cavity: well, you could have this, this, this, and this,
and then you might die. Okay, you sign that. And it
also says, in the event that there is an issue
(15:06):
of personal injury, or you're complaining about the kind of
tooth I'm putting in or
Speaker 1 (15:10):
Fixing, we go to arbitration. Now.
Speaker 2 (15:14):
The lower court said that Uber's arbitration clause wasn't enforceable,
because that pop-up with the terms and conditions regarding
arbitration wasn't clear. It failed to clearly and unambiguously
inform plaintiff of her waiver of the right to pursue
(15:37):
her claims in a judicial forum. They said, you know what:
how about someone who speaks a foreign language? How about
someone who's a kid and doesn't understand? How about if,
let's say, you have special needs and you don't understand,
you know, you're mentally disabled or mentally backward, or whatever
(15:58):
the hell the politically correct term is these days for
people that are disabled that way? It doesn't matter. The
appellate court said it doesn't matter. You're going to be
held to that arbitration agreement. Whoa. I mean, seriously, when
you park a car, you know, they give you that
(16:19):
little receipt, and if you look at the
fine print on that receipt, it says the parking lot
is not responsible for theft, and on and on and on.
Speaker 1 (16:34):
And there was a guy whose car was stolen.
Speaker 2 (16:37):
Sued the parking lot people and went to court, and.
Speaker 1 (16:42):
The defense was he agreed not to sue.
Speaker 2 (16:45):
There it is, right there on the ticket that you
take when you get out: by accepting this ticket, you
agree not to sue us, and we're not responsible for...
and then it goes on and on and on. And
the judge picked up the ticket and he said, I
can't read this without a magnifying glass. This is not
going to happen. Insurance: get your homeowners or
(17:10):
car owners insurance and start reading it. Its font is big
enough for people that have virtually no eyesight, for starters.
And it explains the terms: we, parentheses, the insurance company,
close parentheses; you, parentheses, the insured. We cover this, this, this, and
(17:31):
this, parentheses, which means we don't
Speaker 1 (17:34):
cover this, this, this, and this, close parentheses.
Speaker 2 (17:37):
Why? Because there were lawsuits aplenty saying, we really
didn't understand. Lawyers are taught to convolute, to mix up,
to use Latin phrases. Now I'm trying to remember...
some quote, "panissum," "I've got a big schwantz," so I
(17:58):
have no idea what that is, what I was taught in
Speaker 1 (18:00):
school. Quid pro quo, whatever.
Speaker 2 (18:05):
The point is, we don't understand it, or I'm supposed
to understand it, but you don't. And the appellate
court said it doesn't matter. This is New Jersey: it doesn't matter,
you signed it, you bought it, you agreed to those terms.
Now, remember the case back in August? There was binding arbitration.
(18:26):
There was a family that got hurt at Disneyland. And
it was a pretty severe injury, and they sued, and Disney said,
you have to go to arbitration. And why is that? Well,
because years ago you had signed up for a trial, one
of those, you know, one-month-only programs with Disney Plus,
(18:49):
and right there you had accepted that it was binding
arbitration in the agreement to try Disney Plus, and therefore
anything relating to Disney is covered by the arbitration clause.
Speaker 1 (19:01):
I thought it was pretty tenuous. I thought it was
pretty flaky.
Speaker 2 (19:03):
But the publicity, the optics of what Disney was doing,
were insane.
Speaker 1 (19:11):
And they came back with, and this is
Speaker 2 (19:14):
Why these copywriters and these pr people get so much money.
Here is how they replied when they caved, because the
public pressure was so insane.
Speaker 1 (19:25):
Quote: At Disney, we strive
Speaker 2 (19:28):
to put humanity above all other considerations. With such
unique circumstances as the ones in
Speaker 1 (19:35):
This case, we believe this situation warrants.
Speaker 2 (19:38):
A sensitive approach to expedite a resolution for the family
who have experienced such a painful loss. They fought it,
they wanted to invoke the arbitration clause.
Speaker 1 (19:51):
They were prepared to go to court, but.
Speaker 2 (19:55):
They strive to put humanity above all other considerations. Hmm. Okay. Anyway,
you sign it: in New Jersey, you sign an arbitration agreement,
and it doesn't matter whether you understand it, don't understand it,
or you're a kid. It doesn't matter. You're held to it,
which means you go to arbitration. And if you look
(20:15):
at the awards, nothing, nothing like the jury awards where
you really get big, big money.
Speaker 1 (20:21):
All right, all right, ChatGPT. Oh, there's a story
I want to take on.
Speaker 2 (20:26):
So here's what happens with ChatGPT: you
have conversations with ChatGPT, and they can go on
for hours. It's not just searching something. Let's say
you're searching something. Normally you would Google it, and here's
the information. With ChatGPT, you can ask a question.
You know, I go on there, what is, you know,
(20:48):
what was the date that Buchanan was president? And
I would just get it from
Google, or from ChatGPT.
Speaker 1 (20:57):
It's, why do you want to know, Bill?
Speaker 2 (21:00):
What part of that presidency do you want to talk about?
The early part, the latter part? Okay, that is a
history issue. That's what I would do. How about this:
I'd like some advice on breaking up with a girlfriend.
I would like to know, should I cook on a
grill or should I cook in a frying pan?
Speaker 1 (21:24):
For example.
Speaker 2 (21:24):
You ask that of Neil, and then Neil asks you
questions when you go on the show.
Speaker 1 (21:28):
Well, which way do you want to go? What are
you going to do? How many people?
Speaker 2 (21:34):
That's what ChatGPT does. It gets you involved. It
gets involved with you, and people get personally involved with
ChatGPT because, number one, it gets personally involved with you,
and two, it's nonjudgmental. You get into it with
people and you get personal, and there's judgment in there.
Speaker 1 (21:56):
No matter what: I want to split up with my
girlfriend or
Speaker 2 (22:00):
my wife, give me some advice, give me some pointers.
And now you get into a whole discussion, and with
someone, if someone is giving you advice, there's all kinds
of value judgments in there, even subtly. You know, there's
no way around it. That's human. With ChatGPT,
Speaker 1 (22:15):
There is no judgment, which makes it easier. And this
is science.
Speaker 2 (22:19):
By the way, I'm not just making this up, because
while I really like to make things up, I look
at studies, and I look at science, and I look
at research
Speaker 1 (22:28):
when I make these statements to you. And so, no judgment.
Speaker 2 (22:33):
And you now get into these chats and talk about
personal information.
Speaker 1 (22:37):
What happens with your personal information? Now, when I go to
Speaker 2 (22:43):
Costco, they know I like to buy those frittatas. Costco
knows that, okay, and so for the ads that I get.
Or I buy pizza, and all of a sudden, I'm
driving past the pizza parlor, boom, an ad pops up.
Speaker 1 (22:59):
Okay, fine, I'm okay with that. Most of us are.
Speaker 2 (23:02):
But how about all of our intimate information: how we
feel about, you know, how we feel about people, what our
thinking is?
Speaker 1 (23:13):
That's all available because you've laid it all out.
Speaker 2 (23:16):
Because what you have done is you've done Facebook posts,
and you've done chats, and you've gone on Instagram and
you've taken pictures and shared them.
Speaker 1 (23:27):
And what AI does is put
Speaker 2 (23:30):
All of that together, all of your history, plus your
conversations that you've had with chat GPT, and they're going
to know more about you than any family member, than
any intimate friend because they have your entire history, and
(23:51):
they have all the pictures, and they have all of
your texts, and they have all of what you have
ever bought and what you are interested in, and what
you have googled, what you have searched for. I can't
even begin to tell you how many pops up for
marital aids I get per day. They know all about me.
(24:17):
They know I enjoy dressing in women's underwear. Okay, maybe,
see, that's something I never shared with you, but if
you look that up...
Speaker 5 (24:25):
Unfortunately, we all know that you eat frittatas and
the underwear stuff, and all that we all know. Okay,
we are your ChatGPT, unfortunately.
Speaker 1 (24:35):
Yes, that's true.
Speaker 2 (24:37):
But on a lighter note: it's not only that they
have that information, but the way ChatGPT
works, chatbots are uniquely good at getting us
to reveal details about ourselves.
Speaker 1 (24:53):
Far more than you would with anybody else.
Speaker 2 (24:56):
They become our best friends, asking the questions in a
nonjudgmental way, and then of course remembering everything you've
ever said and connecting it to every picture you've ever taken,
anything you've ever posted, everything you've ever texted. You put
all of that together, and what do you have? You
(25:18):
have no privacy at all. Oh, by the way, your
medical stuff too. Oh, I've got a sore foot, I'm
really kind of upset about that. Well, I went to
the doctor, I have a doctor's appointment this afternoon for
my foot. Okay, and this is what it feels like,
and I hurt like hell, and here's a picture of
me in my cast. And without violating any HIPAA rules,
(25:41):
guess what, I can get all the information about exactly
what happens to you. And that's just the medical part
of it. The financial part of it, I would even
guess things that you say to your therapist, because people
share that.
Speaker 1 (26:00):
What a world. It really is. And should we be
frightened of it?
Speaker 2 (26:03):
Well, yeah, the civil libertarians are. I'm not frightened of it,
because it's just a given now. It's a given, that's
all. You have to accept it. That's our world.
No privacy whatsoever.
Speaker 1 (26:14):
Don't even try. Which is
Speaker 2 (26:16):
why, what do I not share here on the show? Well,
I share everything on the show. Talking to you sometimes, Neil,
I get excited.
Speaker 1 (26:26):
I get... would I share that?
Speaker 2 (26:29):
You know?
Speaker 3 (26:29):
There are no secrets here. I'm just sharing with you today.
Speaker 1 (26:33):
I'm sharing something that's... here, here's what just happened.
I'll tell you what just happened.
Speaker 2 (26:38):
I shared something with you that went on the air
that is now being recorded somewhere, someplace, and you do
a search and you will see my connection, or my
feelings, towards you, Neil. It's all there, it's all available. Okay,
I'm making an example, that's all, making an example of what
(27:00):
these things can do and what can happen.
Speaker 1 (27:03):
Quit looking at me that way, Neil. You are perplexed.
Speaker 2 (27:06):
You're giving the eyebrows.
Speaker 3 (27:10):
What is the weird one?
Speaker 5 (27:12):
Because I'm responding. You're getting aroused by me?
Speaker 2 (27:16):
Well, only... no, I didn't say that. But no, I
said, in very limited terms, very limited times,
Speaker 1 (27:23):
I do that.
Speaker 3 (27:23):
You always speak in limited terms.
Speaker 1 (27:25):
Okay, we're coming back, all right. It's just, it's, you know,
you almost think it's
Speaker 2 (27:29):
Friday, don't you? You always say it's Friday. Every day
this week.
Speaker 1 (27:33):
That's you know why.
Speaker 3 (27:34):
I'm gonna... yes, because you want to be gone.
Speaker 1 (27:36):
I guess, yeah, yeah, all right.
Speaker 2 (27:37):
KFI AM six forty, live everywhere on the iHeartRadio app.
Speaker 1 (27:41):
You've been listening to the Bill Handle Show.
Speaker 2 (27:43):
Catch my show Monday through Friday, six a.m. to nine a.m.,
and anytime on demand on the iHeartRadio app.