
September 24, 2024 23 mins
KFI's own Tech Reporter Rich DeMuro joins The Bill Handel Show for 'Tech Tuesday'! Rich talks about California's new law restricting cell phones in public schools, a Russian anti-virus program switching overnight, and AI deep learning. Is there anywhere safe to complain about work online? Trump calls for 200% tariffs on John Deere. Mark Cuban says that's insane.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty, The Bill Handel Show
on demand on the iHeartRadio app. This is KFI AM
six forty, Bill Handel here on a Tech Tuesday, September
twenty fourth, twenty twenty four. Okay, it is time for
Tech Tuesday with Rich DeMuro, host of Rich on

(00:21):
Tech here on KFI, Saturdays eleven am to two pm.
He's KFI's tech guy, in addition to KTLA's tech guy. Instagram
at Rich on Tech, website rich on tech dot TV.

Speaker 2 (00:34):
Good morning, Rich, Good morning to you. Bill.

Speaker 3 (00:38):
Happy Tuesday.

Speaker 2 (00:39):
Oh, happy Tuesday, although, you know, is Tuesday ever happy for me?
But what the hell?

Speaker 1 (00:44):
Sam Altman, who is the founder of ChatGPT, shared
some thoughts on AI. He was just coming back from
an auction to see if he wants to buy Pakistan,
because he has so much money,
and he talked about deep learning and how it's going

(01:08):
to change our world and how jobs are going to
be affected.

Speaker 2 (01:11):
So deep learning, jobs being affected? What did he mean?

Speaker 3 (01:16):
Yeah? I think this is interesting because he's calling it
the intelligence age. And in fact, he put a kind
of a painting on his little blog post, kind of
like looks like the Renaissance. So I guess he's putting
this in the same category as a major world shift
that we are in right now. And so he talks

(01:36):
about why he believes this is the case, and he
thinks that this idea of deep learning where computers can
actually understand things and draw conclusions has actually worked after
all these years of trying, and he believes it's going
to make previously magical capabilities a reality within decades. I

(01:58):
think it might be sooner than that. Obviously, he thinks
it will be very quick. He sees a world where
we will have personal AI teams to help us accomplish things,
virtual tutors for students, so if you have a problem
in class, you will have the most personal virtual tutor
you can ever imagine. Healthcare, software development, significant improvements there,

(02:20):
increased prosperity because of all of this, and I think
that's already happened for him, like you alluded to. But
he did say that there will be labor market disruptions,
there will be jobs lost, but he believes in the
end people will find a way and there will be
new jobs created.

Speaker 1 (02:37):
Yeah, I mean that makes sense because the computer age
came in and jobs would be lost, and manufacturing
jobs were lost because of robotics, and we've been able
to survive.

Speaker 2 (02:50):
Is he one of those people in the world.

Speaker 1 (02:54):
Of AI that is frightened of where AI can go
or is going.

Speaker 3 (03:00):
I don't think so. I think he understands that it
does have potential downsides, and I think he's talked about that,
and he talks about, you know, minimizing the harms of
AI and things like that. But I think he is
one of these people that is just full throttle, wants
to go full speed ahead, and you know, more so
than someone like an Elon Musk, who is a little

(03:22):
I think he wants to go full speed ahead as well.
But I think he is much more vocal about the
dangers of AI, especially when someone starts to put AI
into a robot body. And I know that sounds silly,
but that is the goal of all this is to
literally recreate the human smarter, more capable than ever, and

(03:44):
that is the goal, the ultimate goal of all this
AI stuff.

Speaker 2 (03:47):
It really is. Yeah. I mean, we think of and
everybody knows this.

Speaker 1 (03:52):
The first rule of robotics: robots will do no harm
to humans.

Speaker 2 (03:57):
Rule number two. Rule number one is a crock. Of course,
robots are gonna kill us all.

Speaker 1 (04:02):
Are you frightened at all about where AI is going?
Are you one of those naysayers? Not not necessarily a naysayer,
but scared of the possibility of the negative aspects of AI?

Speaker 3 (04:15):
I think I am aware of how much it's going
to change things, So of course I think about the
possibilities of where this can go, especially when it takes
on almost a human shape. We're creating things that are superhuman,
and it's it's we're equipping people with tools and things
that they've never had before, even you know, something as

(04:38):
simple as a self driving car. I think there's there's
implications of that when it comes to hacking, when it
comes to how this stuff is used, because yeah, the
main companies like Google said, you know, they used to
say do no evil. I think they wiped that from
their blog at this point, or their manifesto, whatever it is.
But at the same time, there's always going to be

(04:59):
people that use this for evil, And that is the
scary part is that you know the average person, Yeah,
they just want to get a better grade on their test,
or help them write a presentation, or their car drives
them to the grocery store or whatever. That's going to
be amazing. But the people that are using this for
evil things, I mean, imagine you just program every car
to run into a wall. I mean, that's the problem,

(05:20):
you know, and there will be people who try to
do that, and everything's getting highly automated and highly AI
and that is a real possibility of where things go all.

Speaker 1 (05:29):
Right, Rich, you tried on Snapchat's new augmented reality glasses,
how real and how augmented?

Speaker 3 (05:40):
Very well, well, it's very real and very augmented. So
these look like oversized sunglasses, so they're big, they have
thick stems, kind of thick glasses. But what's neat about
them is that they're all in one, so there's no
cables and basically inside the lenses there is a see
through screen that augments information on top of the world

(06:03):
around you. So you know, for many years us in
the tech world or any you know, anyone thinking in
the future has imagined that you'd walk around and see
information while you're walking, right? Like someone's name, or,
just, you know, if you're following a map, so you
can take a left, they'll show you that arrow
right on the world in front of you. So this
is the closest I've ever seen to that. The problem

(06:24):
is they're big, they're bulky, they're very simple for now.
I mean, full color inside, video and pictures. The battery
life doesn't last very long. But if they can get
these things smaller, the battery life longer and the price down,
we might finally have glasses where you just literally look
through the screen and see information.

Speaker 1 (06:45):
All right, So a couple questions about that when they
first came out, and I forget the name of the
first augmented reality glasses, to now, how has the
technology evolved?

Speaker 3 (07:00):
Well, I think it's gotten... the video has gotten clearer,
so you can see the video better, and I think
it's just they've gotten smaller, like less bulky. But the
thing is most of the technology out there is focused
on virtual reality goggles. So these big, giant goggles that
envelop your entire head. I think the difference here is

(07:20):
that they're working on something that looks more like sunglasses,
more like what Meta is doing with their Ray-Ban
Stories sunglasses. But those don't have augmented reality. Those just
capture pictures and videos and have speakers on them.

Speaker 2 (07:34):
Okay, So I would think that the future.

Speaker 1 (07:36):
To put all of that together would be sunglass-
sized goggles that just do a wraparound, and effectively
they would be a Ray-Ban that wraps around and
hits your ear. And you get all of that,
And I want to ask you how expensive they are
right now? And what else?

Speaker 2 (07:57):
Did you say? Yeah?

Speaker 3 (07:59):
No, nailed it. That's it. The idea is to get
these as small as a standard size of Ray-Bans,
and they're close. I mean, the battery is still big,
the battery is still very you know, short lived. But
right now, these that they're coming out with from
Snap are mostly aimed towards developers and creators, and you
have to rent them, so it's ninety nine dollars a month,
one year minimum, and I don't think you get to

(08:21):
keep them after that. But yeah, this is mainly for
developers to play with these and come up with like
a killer app. I can think of one. You know,
if you're walking around, you know, it just gives you information,
you know, where you're walking. It gives you notifications on
your you know, basically in your line of sight. I mean,
that's the idea here.

Speaker 2 (08:40):
Okay, so I'm a little confused.

Speaker 1 (08:41):
So let's say you're walking around and you're walking through
you know, whatever scenario, and at the same time, you're
now looking ahead of you and you see what is
effectively a heads up display giving all the information.

Speaker 2 (08:54):
How do you not fall down?

Speaker 3 (08:57):
Well, because it's, it's actually, unlike the Apple Vision Pro,
you're actually seeing the world and you're actually seeing it
through your own eyes.

Speaker 2 (09:05):
No, I understand that.

Speaker 3 (09:07):
Yeah, I guess you know.

Speaker 1 (09:09):
It's like I'm walking from here to the bathroom and
all of a sudden, I'm staring straight ahead and all
this information is in front of me. How do I not
bump into the wall or fall into the toilet? I
don't quite get that.

Speaker 3 (09:21):
Well, when I put them on, you're very much aware
of your surroundings. You can walk around. I played basketball
in them. I mean it's it's very much just glasses
that have information in the you know, in your field
of vision. So it's quite natural actually to be able
to still see while you have these on. And I
think you know number one, like if you're walking down

(09:42):
the street and you're following maps right now, you're bringing
your phone up, you're looking at your watch, this would
just overlay an arrow to say turn left up ahead,
and it would be placed up ahead. It wouldn't be
placed right in front of your eyes.

Speaker 2 (09:54):
Okay.

Speaker 1 (09:54):
And by the way, when I am walking ahead and
I am looking at a Google map or whatever, I
do fall in the toilet simply because multitasking like
that is difficult for me.

Speaker 2 (10:05):
I'm just wondering if other people are the same.

Speaker 1 (10:07):
Okay, let's finish up with the Russian anti-virus program
which, and I'm looking at the headline, switched overnight. Is
that right?

Speaker 3 (10:20):
Yeah, this was banned by US regulators as of September thirtieth,
so people that had this Kaspersky anti-virus on their computers.

Speaker 2 (10:30):
All right, hold on, I'm sorry about that. My dogs are
around here, I have to get rid of them. See
this is live radio.

Speaker 3 (10:35):
Hold on, all right, you got dogs in the studio.

Speaker 2 (10:39):
This is today.

Speaker 3 (10:40):
Am I still on air? Should I just continue telling
the story?

Speaker 4 (10:43):
Yeah, Rich, just go ahead. And why don't you plug
your newsletter, which is fantastic.

Speaker 3 (10:49):
Oh thank you. Newsletter is at rich on tech dot tv.
Sign up every Saturday morning. You get a free newsletter
that talks about all the great tech stuff you should
know about, and helpful hints and tips.

Speaker 2 (11:01):
And tricks.

Speaker 3 (11:02):
That's what I'm all about.

Speaker 2 (11:03):
Got it.

Speaker 1 (11:03):
By the way, I missed all of that because I
was kicking my dogs out the door.

Speaker 2 (11:06):
But whatever you said, I'm greatly appreciative of. Thank you, Bill.
All right, Rich, We'll catch you again next Tuesday. Take care. Okay.

Speaker 1 (11:15):
I've done this story before, and I love this story
because you are caught up in this, and this
happens to be the question: is there any safe
place to complain about work online? Oh, the quick answer
is no, and the slow answer is no. There is

(11:38):
no safe place. If you are crazy enough to use
your company's computers and/or servers, they're looking at you,
and even worse, AI can now get involved, and
will get involved. Now, are there legitimate reasons for a

(11:58):
company to look at what you're saying in your chats?

Speaker 2 (12:03):
Well?

Speaker 1 (12:04):
Yeah, if you've been, for example, accused of some criminal act
or malfeasance. Yeah. But if you're unhappy at work, is
that legitimate? Let's say they want to look at how
you're treating customers? Is that legitimate?

Speaker 2 (12:20):
Now?

Speaker 1 (12:20):
The easy stuff is, like here, the policy that
we have here at KFI, for example: whoever's crazy
enough to look at porno during the day, even during
breaks or lunch, on the company servers, is completely out
of his or her mind. We've had those issues, haven't

(12:41):
we, Neil? When you were in management, that came up occasionally,
did it not.

Speaker 2 (12:47):
I can neither confirm nor deny. Oh, you stop that, you
do that all the time. Okay, I will say.

Speaker 4 (12:54):
I'm not supposed to point out Kono. Okay, it's really...
I'm kidding.

Speaker 2 (12:59):
Yes, of course, there's stuff like that that goes, yeah,
and those people, those people do not do well.

Speaker 1 (13:04):
For example, I had a little bit of an issue
when management came to me and said, Bill, we noticed
that you went on to the website lesbians from Venus dot.

Speaker 2 (13:16):
Com and why were you doing that?

Speaker 1 (13:18):
And I got away with it because it was for
research for a topic that I was doing.

Speaker 2 (13:28):
That's legitimate.

Speaker 1 (13:29):
Now, when Kono goes on the website lesbians from
Venus dot com, it is a different animal.

Speaker 3 (13:38):
It's also for research.

Speaker 1 (13:39):
It's also for research because you need that research to
run the board.

Speaker 2 (13:44):
You have to know that. I understand. Now.

Speaker 1 (13:48):
The issue is funny at first glance, but it really
has to do with how involved a business is with
your business. You know, the bottom line is, yeah, if
you are using a company computer or server, you know
you can't be stupid enough to do what you're doing,

(14:09):
and you know that they're looking at you, and know
that they're looking at you.

Speaker 2 (14:12):
For legitimate reasons.

Speaker 1 (14:14):
Now, a lot of these chat groups are saying, we
will not share any of that because it is on
their programs, on their platform, and so okay. Sometimes they'll
just say, and there are a few that will only do
it under subpoena, when they

(14:34):
keep what's going on fairly secret.

Speaker 2 (14:37):
But let me ask you this.

Speaker 1 (14:38):
And I'm not a tech maven, and certainly Neil, you're
much more of a tech maven. How far away are
we from a company being able to effectively hack without
being a hacker, and go into, based on the information
the company has, much more private information

(14:59):
about you, the ability to dive into your stuff outside
of work.

Speaker 4 (15:05):
If they pay for your phone, and
you were working on their computers, they have access to it.

Speaker 1 (15:11):
No, I understand they have access. But let's take it
to a next level. Is the technology on its way?
For example, AI, and I have no idea. This
is me speculating, someone who has very little knowledge of technology.
How far away are we from a company being able
to grab your private information with just

(15:34):
the information they have about you, because they have your
Social Security number, they know where you live, they probably
have your driver's license. You had to give them
that information. We have to
show our passports for example, to show citizenship. Certainly I
did when I started working. I mean, it's that crazy.

Speaker 4 (15:58):
With a social? I've worked with private investigators many times,
professionally, for the station and beyond, and I will
tell you, you would be freaked out by what
they can do with just a Social Security number.

Speaker 1 (16:13):
Lesbians from Venus dot com is the website to go to,
By the way, I have no idea if that is
a legitimate website.

Speaker 2 (16:23):
I bet, I bet they all are. Oh, you're gonna
go someplace. Little fingers got to look that up.

Speaker 1 (16:32):
Yeah, we should look that up and report on it
when we come back. Because I know, and you know,
I've said, you come up with the craziest ass topic
statement you could ever make, and all of a sudden,
there is a world of chats and clubs.

Speaker 2 (16:54):
Okay, I was just about.

Speaker 1 (16:57):
To share a few of them, and you know, I
really would like to be here tomorrow.

Speaker 2 (17:02):
Oh look at the.

Speaker 1 (17:03):
Time, buddy, all right, before we go, and I just
heard the promo, by the way, before we get into
the race between Donald Trump and Kamala Harris as to
how crazy, financially and economically, each one is. There's hopscotching in
terms of figuring out who is worse. A quick one

(17:26):
about, one more time, Amy and Neil, your charity
coming up on Friday for the Union Rescue Mission. Very quickly.

Speaker 5 (17:36):
We're going over the edge rappelling twenty five stories down
the Universal Hilton. We're raising money for the Union Rescue Mission.
Eighty six percent of the money donated goes directly to
the people who need it the most, and we need
your help to make it a successful rappelling.

Speaker 4 (17:51):
Okay. And the easy website, just go to just help
and the number one just help one dot org, Just
help one dot org and you'll see the iHeartMedia team
and you just click on that and you'll see Amy
and I there and you can donate if you feel

(18:11):
moved to.

Speaker 2 (18:12):
But it's a great cause. Yeah.

Speaker 1 (18:14):
Eighty six percent, by the way, goes directly. The other
fourteen percent is for drugs and alcohol. From what I understand.

Speaker 2 (18:21):
Oh no, they are a dry house. Oh okay, which
is why.

Speaker 5 (18:26):
They need our help. They get no government funding because
they don't take in people who.

Speaker 2 (18:32):
Do it out, so they do it out on the sidewalk.

Speaker 4 (18:36):
Okay, all right, moving on, you're a bad person, Bill Handel.
For everyone who thinks Bill Handel is a bad person, just
donate one dollar at just help one dot org.

Speaker 5 (18:49):
Our goal of five million dollars.

Speaker 1 (18:51):
Yeah, and I know you're a little bit behind, so
today I'm jumping in because... to everybody.

Speaker 4 (18:57):
They've been very... Our listeners are great, and we appreciate
every little bit they do.

Speaker 2 (19:02):
And this is where you're gonna tell me I'm a
good guy.

Speaker 1 (19:06):
Is if I am part of asking people to donate
money to whatever cause, I have to donate. I can't
ask you to donate without me jumping into it.

Speaker 2 (19:18):
That doesn't mean you're a good guy. No, you're right. Hey,
no, I am. That's a good point. It's still a good thing.

Speaker 1 (19:24):
That's okay, that's fair, that's fair. And then the only
issue is how much do I donate? Because the response
usually is no, not a thank you, not a hey,
that was a.

Speaker 2 (19:39):
Very nice thing to do.

Speaker 1 (19:40):
It's come on, Bill, really, I mean, what do you
think two bucks gets you? You know, today it
gets you half a cup of coffee at Starbucks. No,
not even a third of a cup of coffee at
Starbucks if I'm not mistaken. Now, what I would be
doing is doing this segment, which I ran out of time,

(20:03):
so I'll probably do this tomorrow. And this has to
do with Trump's call for a two hundred percent tariff
on John Deere. He was in Dubuque, Iowa, where John
Deere is, and they're moving some manufacturing to Mexico.
And he's responding to that in front of the workers,

(20:25):
and he said, this is going to save your jobs.
And okay, possibly Mark Cuban, by the way, who happens
to be on Shark Tank and he happens to own the
Mavericks and is pretty close to a billionaire, says this
is crazy. This is completely nuts, all right? So Trump
is nuts? How about Kamala Harris?

Speaker 2 (20:46):
Is she nuts?

Speaker 1 (20:47):
Economically speaking? Absolutely. Both of them are insane. And I
would use the word pandering as opposed to legitimate economic
thinking and programs. Okay, so we're done, and I'll do
that tomorrow. Because there's a lot to be said. Also,
it is a Tuesday, which means I am taking phone

(21:10):
calls off the air starting in just a few moments
for handle on the law.

Speaker 2 (21:16):
I'm going to be.

Speaker 1 (21:16):
Giving you marginal legal advice where I tell you you have
absolutely no case and it is a show of humiliation
and degradation.

Speaker 2 (21:27):
And that's not you to me, that's me to you.

Speaker 1 (21:31):
Do you find it astounding that people still call, Neil?
Uh, just every Saturday, right?

Speaker 2 (21:38):
No, I don't get it.

Speaker 1 (21:38):
You know, I'll have someone who is... We have a
few elderly people that want to ask me about a
trust or a will. I go, hey, we have to
do this quickly because you're going to die by the
end of this phone call.

Speaker 2 (21:49):
Do you understand? Let's speed this up.

Speaker 4 (21:53):
If they were doing it to get through all that
garbage to get good advice, that would be one thing, but
to be barraged with, you know, a barrage of indecency
and then get bad advice.

Speaker 2 (22:07):
Yeah.

Speaker 1 (22:08):
So it's eight seven seven, five two zero, eleven
fifty is the phone number to call.

Speaker 2 (22:13):
And there are no breaks, there are no commercials, there is.

Speaker 1 (22:17):
No weather, there's no news, and there certainly is no
patience from me. So it's eight seven seven, five two
zero eleven fifty starting in just a few moments off
the air.

Speaker 2 (22:28):
Well, we come back tomorrow all over again.

Speaker 1 (22:30):
We do this starting with Wake Up Call, and that's
with Amy five to six, Neil and I jump aboard
at six, we go six to nine, and then
of course Ann and Kono are always there somewhere in
the building. This is KFI AM six forty live everywhere
on the iHeartRadio app.

Speaker 2 (22:49):
You've been listening to The Bill Handel Show.

Speaker 1 (22:51):
Catch my show Monday through Friday, six am to nine am,
and anytime on demand on the iHeartRadio app.
