Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty, the Bill Handel
show on demand on the iHeartRadio app. KFI AM
six forty, Bill Handel here on a Taco Tuesday, February tenth.
Quick reminder that at eight thirty on Friday, it's Ask
Handel Anything. I need you for that one. And so
(00:21):
you ask a question, Neil chooses the question and I
get to answer it.
Speaker 2 (00:26):
And the whole point of it is making an ass
out of me.
Speaker 1 (00:28):
So during the show, you go to the iHeartRadio app
and you'll hear this show being broadcast. Click on the
microphone in the upper right hand corner and record a
question ten to fifteen seconds and I get to answer them.
Speaker 2 (00:42):
Every Friday at eight thirty.
Speaker 1 (00:45):
And that's the whole point of this show, isn't it,
Neil to make an ass out of me?
Speaker 2 (00:49):
Yes, sir, that's yeah.
Speaker 1 (00:50):
Matter of fact, somehow we should title that on the show. Okay,
Rich tomorrow... it is Tuesday. Rich on Tech, every Saturday
right here, eleven am to two pm. He's on KTLA TV
every morning, Instagram at richontech, website richontech dot
tv. And Happy Tuesday, Rich.
Speaker 3 (01:11):
Good morning, happy Tuesday to you. I'm surprised you didn't say
Taco Tuesday.
Speaker 2 (01:16):
I generally do say taco Tuesday. By the way, I
just missed out.
Speaker 1 (01:21):
Yeah. Today. It's only eighty percent of the time that
I've been saying Taco Tuesday, so I humbly apologize that
I didn't say it. Starting with this segment, okay, we
got a lot of stuff to cover, and I hope we
get to as many of them as we can.
Speaker 2 (01:37):
Let's start. The most controversial gadget you have ever tested.
Speaker 3 (01:42):
This one's wild. This is called Bee. It's a fifty-dollar
wristband, and it basically listens to your day and
captures that audio, transcribes it, and uses AI to come
up with insights, to come up with patterns, to come
up with facts about you, to-do lists, I mean,
(02:03):
all kinds of stuff. It's kind of like if
you had a personal assistant that walked around with you
all day and just took notes, and the app,
it's all there, it analyzes it all. So it's, wow.
Speaker 2 (02:16):
Yeah.
Speaker 1 (02:16):
So if you, so if you get religion during
the day or in the evening and you're screaming oh God,
oh God at the top of your lungs, uh, that
will be analyzed, correct?
Speaker 3 (02:28):
Yes, it will be. Yeah, you should maybe take this
thing off if you're practicing your religion. It's
called Bee, by the way, and this is a company...
Speaker 2 (02:38):
It was a startup.
Speaker 3 (02:39):
Amazon bought them shortly after they were created, so now
it's owned by Amazon. They call it an ambient AI
computing band. So it looks like a Fitbit, basically; it's
really small. I wore it for a couple of weeks,
and, uh, you know, it's kind of, and you're the
lawyer here, but you know it's listening. So a lot
of people have emailed me and said, Rich, this isn't
(02:59):
legal in California. You can't do that. It's a two-party consent state.
Amazon says that this is actually not recording. It's
capturing, but the audio is not saved. So
I just found it really fascinating. I don't know
if I want to wear this thing long term, because
it does capture everything.
Speaker 2 (03:17):
Yeah, that is an interesting legal issue. I love that.
Is it for sale?
Speaker 1 (03:22):
Now?
Speaker 2 (03:22):
Is it out there?
Speaker 3 (03:24):
Yeah, it's for sale. Fifty bucks, the battery lasts two...
all right?
Speaker 2 (03:27):
Yeah, that's... I mean, that's not very much money.
Speaker 1 (03:30):
Okay, let's keep on going. There's a trial,
and we talked about this before, but it's going on right
now in Los Angeles, and this is huge. I mean,
it's not being covered enough. And that is the addiction
trial, where all the major platforms basically are being sued
(03:51):
because kids are addicted and they're planning it. I mean,
they're programming their platforms. So what's going on with that?
Speaker 3 (04:00):
Yeah, right. Well, that's the question they're trying to figure
out in a court of law here in Los Angeles.
So the defendants right now are Meta, which owns Facebook
and Instagram, and then Google's YouTube, where a lot of
kids watch what's called YouTube Shorts; that's their version of TikTok.
TikTok and Snap actually already settled right before the trial started,
so we don't know what they settled for, how much
(04:22):
they settled for. But the plaintiff here is a twenty-year-old
woman who says she became addicted to these platforms
as a teenager. She developed anxiety, depression, body dysmorphia, and
her lawyers say that it was all because these apps
are designed to keep you scrolling. They use all kinds
of hacks and tricks to just capture attention and
never let it go. Meanwhile, YouTube and Meta deny all
(04:45):
of this, of course, as they would, as they are
big companies. They say that, no, it's not their apps
that caused all these things for her, it's her family
trauma that caused it all. And so now this is
the first big test case. And if this woman wins,
basically, as I understand it, it would open
the floodgates for thousands of other similar cases across the US,
(05:07):
which might trigger, you know, some sort of class action
or something like that.
Speaker 1 (05:10):
I can't imagine this wouldn't be a class action lawsuit.
I couldn't even begin to think that it would not be.
And, as you said, the settlement of those two
big ones, and they never talk about the amount of
money it settled for, or admitting liability, or any of it.
It has to be in the billions and billions of
(05:31):
dollars, the settlement that has already happened. It almost reminds
me of big tobacco. That settlement was, I think,
over one hundred billion dollars. And the pharma industry,
with Purdue leading the list of defendants,
and that went for... I don't even know how much.
Speaker 2 (05:54):
It went for one hundred and something billion dollars.
Speaker 1 (05:56):
I mean, and the number of plaintiffs is astronomical.
As you said, this is going to just explode in
terms of these lawsuits, to the point where these companies
are going to take a really good hit, a big hit,
and they...
Speaker 3 (06:10):
Not just a hit, but yeah, they've got
to think about, you know, their actions in the future.
But, you know, how they design these apps. So, how,
you know, are they addictive? Are they a
little mini slot machine in your pocket? You know, are
they targeting kids? And I think, Bill, you
can comment on this more. I think it really comes
down to the jury here. I assume
there's a jury in this case, but it comes down
(06:32):
to, you know, if anyone has kids and they've seen
the effect that these apps and services have on their children,
or their grandkids or any kids, or they're a teacher, they
know that they do hook kids in, and these kids
cannot escape it. Some adults can't escape it. So I
think that that is the big question here: you know,
who in the world is going to say that these
(06:53):
apps are not designed to keep you going forever until
your eyeballs fall out of your head.
Speaker 1 (06:58):
Like I say... by the way, wait, just a personal note,
and I don't know if you can make the prediction,
you being a news person. Do you believe that
those apps are designed to grab kids?
Speaker 3 (07:12):
There's not even a doubt in my mind. Now, is
it written on paper, Bill, like in the emails between
Zuckerberg and, you know, his staff, saying hey, make this
more addictive?
Speaker 2 (07:20):
Probably not.
Speaker 3 (07:22):
But if you don't think that these companies are sitting
there trying to figure out a way to make these
apps stickier than the other ones, it's just... Number one,
from a business standpoint, that's what you want.
They're not sitting here saying, how can we protect kids?
They only protect kids when the law tells them they
have to, or they're worried about the law telling them
they have to.
Speaker 1 (07:41):
Right. Rich, ChatGPT, where you go to, is now...
well, you may start seeing ads. Here is the question. Normally,
don't they make enough money simply by data mining, selling
information about people that come in to ChatGPT?
Speaker 2 (08:02):
Uh?
Speaker 1 (08:02):
Is it ever enough money? Or does everything have to
go to ads?
Speaker 2 (08:06):
Also?
Speaker 3 (08:08):
Uh, everything has to go to ads, because ChatGPT at
this point is not data mining. They're not selling the
information they gather, so far. They primarily make their money
right now through their API, which means big companies license
their, their information, you know, and their
AI in general. And then, of course, the users that
(08:29):
pay. A lot of people pay big bucks to access
ChatGPT, and so that's been the primary way. But here's
the thing. They know that this is expensive. This is
not a... I mean, technically OpenAI, I think, is
still a nonprofit. But, you know, they're in it
for the investors, which have put a lot of money
into OpenAI. So they need to turn a profit,
and the way to do that is to squeeze the
(08:51):
people that are using this for free. So if you
are on ChatGPT for free, at this point, you
may start to see ads underneath your, I call them
search results, but I guess it's just your answers on
ChatGPT. So this is rolling out starting today. If
you are on a free plan, or if you're on the
plan where you're paying eight dollars a month, which
(09:12):
they just introduced, you're also going to
see ads. So, bottom line, you're going
to start to see ads. ChatGPT is doing it in
a way, Bill, where they say it's going to be private.
They're not sharing your data with the advertisers, they're not
sharing your chats with the advertisers, and they're keeping everything
in a separate kind of section of the app. But
you'll see, it'll say sponsored.
Speaker 2 (09:31):
At what time?
Speaker 1 (09:32):
Or, in your opinion, do you think that ChatGPT
is going to not only sell ads, have ads, but
also sell to the data mining organizations, and
license itself to the majors?
Speaker 2 (09:48):
I'd hit all three of them.
Speaker 3 (09:50):
I mean, yeah, I mean, here's the deal. We know
that people are using these AI chatbots for everything. There
is a treasure trove of information about trends: what people
are interested in, what people are looking at, what are
they researching. And that includes products, services, ailments, whatever you
can imagine. So the data that they are sitting on
(10:13):
is huge. It is, it is the new kind of,
you know, the new thought system of America. Like, everything
that we're thinking is inside ChatGPT. It used to
be Google, and it still is to a certain extent.
Obviously people are still using that. But, you know, they
have the ability to monetize that in various ways. I
think it depends on this company and the way that
(10:33):
they want to respect the people that are using it,
to decide how they want to proceed, because it
only takes a small misstep for them to go down
the lines of a Facebook or a Meta, where people
don't trust them anymore. So you have to be very,
very careful. And right now they're still building their business,
so they're going to tread carefully, I think.
Speaker 2 (10:52):
Okay.
Speaker 1 (10:52):
The last one: Waymo is admitting that humans sometimes take over.
Speaker 2 (10:56):
You go, wait a minute, those Waymo cars, there's nobody
in them to drive. How does that work? Okay.
Speaker 3 (11:02):
This I thought was wild, because we know that Waymo
says their software, they call it the most experienced driver
in the world. They've logged many, many millions of miles autonomously.
But they just admitted in a Senate hearing that some
of its robotaxis actually phone a friend when they need help,
and the friend is in the Philippines. These are workers
(11:23):
that are not necessarily driving the cars, but they basically
help the cars figure out when they run into an
odd situation, like a construction zone or a traffic pattern,
and that's what happens. So the operators look through the
car's sensors and video feeds and tell it what to
do. So it's kind of interesting, because we always
(11:43):
thought that Waymos were fully autonomous. Now we're learning through
this Senate hearing that they may not be.
Speaker 2 (11:49):
Wow. And the Philippines: do those operators in the Philippines
sing karaoke while they're doing this?
Speaker 3 (11:57):
I don't know what they're doing, but the reality is,
the senators were wondering about the distance between the car
and the operator, because clearly there's not much latency if
they can operate remotely from a distance that long.
By the way, there is a car service in Las
Vegas called Vay, and what they do is they
(12:20):
make no bones about it: they're driving their cars remotely.
And so, Bill, instead of getting an Uber, let's
say you need a rental car or a ride somewhere. Okay,
they will drive this car to your house, pick
you up, drive you to wherever you need to go,
and then pick up the next person. And the person
that's driving this car is driving remotely. So imagine your
(12:41):
Uber driver is sitting in an office building with a
bunch of computer monitors and a steering wheel in front
of them.
Speaker 2 (12:48):
Okay, and that's the company called Oy Vey. Do I have
that right?
Speaker 3 (12:53):
I think it's just Vay, but I think it was
inspired by "oy vey." So you might be saying that when
you're in the car.
Speaker 1 (13:00):
Yeah, all right, rich we'll talk again next Tuesday.
Speaker 2 (13:04):
This weekend eleven am to two pm, rich On.
Speaker 1 (13:08):
Tech. The segment, KTLA every morning, Instagram at richontech,
website richontech dot TV.
Speaker 2 (13:15):
And we'll catch you next week.
Speaker 3 (13:18):
Thanks Bill.
Speaker 2 (13:18):
All right, Waymo.
Speaker 1 (13:22):
Drivers... remote drivers in the Philippines.
Speaker 2 (13:26):
Boy, that's news to me. And I'll bet you that
they sing.
Speaker 1 (13:32):
Plings nothing more than plings.
Speaker 2 (13:37):
Okay, we're done, guys. Now.
Speaker 1 (13:39):
The big story, or one of the big stories that
we're following, is the Mark Kelly trial that is going on.
Mark Kelly, of course, the US Senator from Arizona, former
fighter pilot, former astronaut, now a US Senator, got on
the wrong side of Pete.
Speaker 2 (13:58):
Hegseth, our Secretary of Defense.
Speaker 1 (14:01):
He calls himself Secretary of War because that's what the President,
of course has asked for.
Speaker 2 (14:07):
And he is, needless to say, an acolyte
of the first order.
Speaker 1 (14:12):
If the President says it, it will happen no matter
what. Jump off a bridge? It happens.
Speaker 2 (14:18):
A matter of fact, he's changing his middle name to Lemming.
Speaker 1 (14:21):
Because that's the way to describe a lot of the
president's followers. Anyway, you know the story of Kelly and
five members of Congress who were all in the military,
and they came out with that video saying to the
(14:41):
military people, saying to soldiers, not only should you
not follow illegal orders, and of course referring to Donald
Trump and his various orders, but you cannot follow illegal orders.
Speaker 2 (14:55):
Well, Hegseth immediately.
Speaker 1 (14:56):
Went nuts, of course, and is trying to bring Kelly
to trial. Now, originally it was going to be a
military trial, a court martial, which, by the way, the
Secretary of Defense, er, War, can do. Simply argue that this
(15:17):
is conduct unbecoming and come in with a military trial
to demote him. And this is the punishment: to
demote Kelly, which means not only is his rank demoted,
but also his retirement pay gets reduced. That's the big one.
That's where you get hit. And originally it was going
to be a tribunal, but it turned out that there
(15:38):
was no chance he was going to win the tribunal,
because this is a First Amendment issue. Kelly is retired,
he's no longer in the military. There's never been a
case where a military official or a military person was
brought up to a court martial under these circumstances, and
the general consensus was that Hegseth was not going to
(16:01):
win this one.
Speaker 2 (16:02):
So instead he.
Speaker 1 (16:03):
Moves into an administrative move, which he is allowed to do,
and administratively he is saying, I now can move Kelly
into that position and reduce his pay it's retirement pay,
and reduce his rank. Well, there's a lawsuit.
Speaker 2 (16:23):
Of course.
Speaker 1 (16:24):
Kelly filed a lawsuit saying, wait a minute, this is
only for political retribution.
Speaker 2 (16:30):
That's all it is.
Speaker 1 (16:33):
I was simply exercising my First Amendment rights as an
ex soldier.
Speaker 2 (16:38):
Why is he liable?
Speaker 1 (16:39):
And the other ones are not because he is still
a reservist and so therefore he is under the control
of the Secretary of Defense. The other five are they're
done no retirement, No, they're done, or if they do
have a retirement, they're finished. Not reservists, no longer under
the auspices of the Defense Department. So it's turns out
(17:00):
that Kelly is suing. I hope he wins, because
Hegseth has said that the reason they're going after
Kelly is because he's a seditionist, because he spoke against
the United States. Donald Trump called him a straight-out traitor
and said he should be executed for what he said, simply saying,
(17:27):
by the way, this came right out of the Uniform
Military Code. It's right there. The law that controls the
military says specifically that you cannot follow an illegal order.
And these five congresspeople and Kelly the Senator
simply parroted what the Code of Military Conduct says. And if, however,
(17:54):
as part of that, you're making a statement that somehow
is anti-Donald Trump, and, this is no shock, you're a traitor.
You're a seditionist.
Now, I'll tell you who is not a seditionist, who
is not a traitor: the January sixth people that overran the Capitol.
They were patriots.
Speaker 2 (18:14):
It was a peaceful demonstration. Exactly.
Speaker 1 (18:17):
However, if you say you cannot follow a military order
that is illegal, you are a traitor to the United States.
You are a seditionist. And Hegseth just wants to put
him in prison. Well, actually, at this point, Hegseth just
wants to remove his retirement pay, or reduce it, and
knock him down in rank. It's the president that's going
to the extent of saying he should be executed.
Speaker 2 (18:42):
I mean, it's just it's crazy. So is he moving forward? Yeah?
Speaker 1 (18:46):
And here is the issue that the court faces, and
I think there's two sides to this, or there's two issues.
Issue number one: it's clearly retribution. They've said it,
literally put it on the table:
this is retribution because you were anti-President Trump. At the
(19:07):
same time, it's an administrative decision by the Defense Department,
and the court gives wide latitude to the decisions of administrative agencies,
of which the Defense Department is one, under
the purview of the president. So is the court going
(19:27):
to say, we are not going to get in the
way of Hegseth's decision? Could be. And if it goes
up to the Supreme Court, incidentally, which it very well might,
I think the Supreme Court is going to give it
to the administration, because they've given the administration ninety percent
of what they've asked for, expanding the powers of the
executive. So it's pretty depressing stuff, as far as
(19:51):
I'm concerned.
Speaker 2 (19:52):
I just I'm having a hard time with all of
that for sure.
Speaker 1 (19:59):
Okay. Now, California, as you know, we have a super
legislature that is all Democrats. I don't know if you've
ever heard of Will Rogers, who was the great comedian
and commentator. When Will Rogers died, on his tombstone you
will read: I never met a man I didn't like.
California legislators, to a person, their tombstone will read, I
(20:23):
never met a.
Speaker 2 (20:24):
Tax I didn't like.
Speaker 1 (20:26):
And that's exactly what's happening now, because we're not taxed enough.
I just want to let you know that we
don't have enough money for the homeless situation, for medical,
and especially now, because of what the President has done
in terms of yanking a lot of social programs. California,
which provides huge amounts of social programs, we don't have
(20:49):
the money. So Senator Bernie Sanders and California legislators have
come up with a great idea, and that is, let's
charge a one time five percent tax on the assets
of all California billionaires. And that's essential to prevent millions
of the state's most vulnerable residents from losing access to
(21:12):
healthcare, for the most part because of federal funding. And
the opponents are saying, wait a minute, that means wealthy
entrepreneurs are going to flee the state, and we're done.
Speaker 2 (21:25):
Gavin Newsom, mister liberal.
Speaker 1 (21:27):
Is on the side of no. He does not want
to charge billionaires as five percent or tax them. I think, well, okay,
I mean it makes a lot of sense because, well,
you can only tax people to a certain extent. Is
there a limit that you can tax people? Bernie Sanders,
(21:47):
whose nickname in high school was Karl
Marx, was asked: is there a limit to
the amount of taxes that can be charged, that can
be levied? And, asked what that limit is, he said,
(22:08):
I don't know what it is, but I'll tell you
when I see it. It's like Justice Potter Stewart of the
Supreme Court, who, when asked to define pornography in one of
his cases, said, I can't define pornography, but I'll tell
you when I see it. And so the legislature, as
(22:29):
a matter of fact, is not doing this, because this
is so crazy. So they're bringing it to the
November election with a proposition, and they need six hundred
and seventy-five thousand signatures, which means they have to
actually get seven, eight hundred thousand people to actually sign, because
a huge number are ineligible, because they put a wrong address,
(22:50):
or a wrong initial, or a wrong period.
It has to be exact. And we'll see what the
people of California are going to say. If I had
to guess, I would say the people of California are
going to vote yes to this because it's billionaires, because
billionaires shouldn't have that much money. They just shouldn't. You
(23:11):
got to share it with the rest of us. It's
a really redistribution of wealth more than anything else, and
it's everything the United States is not about.
Speaker 2 (23:21):
Right it is.
Speaker 1 (23:22):
It's basically an anti-capitalistic system. Now, it's a one-time,
right, five percent levy. Does anybody believe it's a
Speaker 2 (23:32):
One time levee?
Speaker 1 (23:34):
Not a chance, because every law we have ever passed
that has a sunset clause, where it sunsets, I mean,
it ends at a certain given time, and it involves taxation...
Speaker 2 (23:47):
It somehow becomes permanent. It's just permanent.
Speaker 1 (23:54):
So we'll see. That's probably gonna make the ballot, and
we'll see what happens in November.
Speaker 2 (24:00):
It's gonna be an interesting mid term, real interesting. We're done, guys.
That's it.
Speaker 1 (24:04):
Gary and Shannon are up next this afternoon at
four o'clock. Major announcement here at KFI: our Grand Poobah,
the Grand Vizier, Imperial Wizard of KFI, Paul Corvino, is
going to make that announcement on, I think, the Conway Show,
and John is going to be there to help with
(24:25):
the announcement. I won't... God, why is it we're always
left out of everything?
Speaker 2 (24:30):
Guys?
Speaker 1 (24:33):
You know, we just, we're not, we're not involved. They
just don't care about us. Well, we wake up, we
go to work. Here's the philosophy of life:
you wake up, you go to work,
you retire, then you die. Okay, that's why.
Speaker 2 (24:50):
You say you're never retiring, huh? Yeah, pretty much. Yeah.
Speaker 1 (24:54):
And also, what am I gonna do? I have no friends,
I hate my family, I don't want to be any place. Hey,
and I'll die on the air, and I wake up early.
Speaker 2 (25:04):
What, die on the air? I mean, not like you
do every day? Oh man.
Speaker 1 (25:08):
No, I know, I know. If I'm ever going to
take a swan dive off a parking structure, man, I'm
going to have a mic, a little lavalier mic attached
to my collar, and I will be announcing it in advance.
Speaker 2 (25:19):
You tell me what the ratings are going to be like, Okay,
We're done.
Speaker 1 (25:29):
Tomorrow morning, five am, Wake Up Call with Amy and
Will, and then Neil and I jump aboard, and Ann
and Cono, always here to make the show, to
the point where no one in management ever pays attention
to us.
Speaker 2 (25:45):
This is KFI. Oh, oh, my dogs. You've been listening
to the Bill Handel Show.
Speaker 1 (25:50):
Catch my show Monday through Friday, six am to nine am,
and anytime on demand on the iHeartRadio app.