Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Bill Handel on demand from KFI AM
six forty. You are listening to The Bill Handel Show, KFI
Speaker 2 (00:17):
AM six forty. Bill Handel here, Wednesday morning, August thirteenth.
Yesterday we were supposed to have Rich DeMuro on with
Tech Tuesday, so instead, because of scheduling issues, we have
Rich on, on Hump Day Wednesday, Tech Hump Day Wednesday, and
(00:39):
Rich of course our tech guy, live every Saturday eleven
am to two pm.
Speaker 3 (00:45):
Here you can see him on KTLA, Instagram at
Speaker 2 (00:49):
Rich on Tech, his website, which has loads of information,
richontech.tv.
Speaker 4 (00:55):
Morning, Rich. Good morning, Bill.
Speaker 3 (01:00):
Oh you sound so excited. This is good.
Speaker 2 (01:02):
But then again, we have to tell our studio audience:
that's your contract.
Speaker 3 (01:07):
You have to get excited when you come on the show.
Speaker 2 (01:10):
Now there is a sale, or an offer, for Chrome
going on. One of the things before we jump into
this is, when there was a billion or a
two billion dollar sale of one company to another, those
were such astronomical numbers it was almost impossible to believe.
(01:33):
Today a billion or two billion dollar offer means virtually nothing.
It's a change-in-your-couch kind of offer. And
this one is Perplexity offering, what, thirty five billion dollars
for Chrome.
Speaker 3 (01:50):
How does Chrome become that valuable?
Speaker 4 (01:53):
Well, it's your entree to everything on the web, and
we know that's changing because of AI. But Perplexity is
one of these startups that sort of has a disadvantage.
You know, right now, you've got ChatGPT, which has become
the Kleenex of AIs. Everybody in the world knows that
brand name. You've got Gemini to a lesser extent, which
is Google's, and they're building that into the Android phones
(02:15):
and all that stuff. But then you have Perplexity, which
is also very popular and it's a great service, but
they have a disadvantage because they don't have phones and
they don't have a web browser, and so how do
you get access to them? Well, you have to download
an app or you have to go to their website.
So if they can get something like Chrome, which is
already installed on a gazillion computers worldwide, and get themselves
(02:39):
in front of more people, it might be worth this
purchase price. But Bill, I have to tell you that
number one, this is unsolicited. Google is not selling Chrome
as of right now, and that's the main thing here.
We heard from the judge that was looking at the
antitrust case involving Google that he said, maybe we might
have to split up the Chrome browser, maybe sell that off.
(03:01):
You know, it's got a sixty percent market share, but
it's not a done deal yet and we don't even
know if that's going to happen. So this company is
just trying to get ahead of the game, saying, look,
we will give you thirty five billion dollars for Chrome.
By the way, this company's worth eighteen billion dollars, so
that's double the valuation. I'm not a finance guy, but
those are some wild numbers to work out.
Speaker 2 (03:21):
Well, that's why those are completely wild numbers. Usually you
have a ten percent, twenty percent premium when buying a company.
Speaker 3 (03:30):
One hundred percent premium, that's that's pretty high.
Speaker 2 (03:33):
Now, you mentioned something about phones, that they don't have
a phone. But does ChatGPT have a phone? It doesn't,
does it?
Speaker 4 (03:43):
Well, okay, well, two things with them. Number one,
they have a deal with Apple, which, you know,
they're pre-installed through Siri on pretty much every iPhone
that has Apple Intelligence. So that's number one. Number two,
they hired Apple's former lead designer of many years, Jony
Ive. He's been doing stuff with Apple for many, many years.
(04:04):
Recently left that company. ChatGPT, OpenAI, I was gonna
say bought him, but they did a deal with him,
and so they are designing something. We don't know what
it is, but it's going to be some way to
access OpenAI ChatGPT on the go.
Speaker 5 (04:20):
Is that a phone?
Speaker 4 (04:21):
Is that some sort of wearable? Is it a necklace?
Is it a clip? Is it a pen? We are
not sure, but ChatGPT is in the leading position here,
for no matter what they make, people are going to
be interested because it's ChatGPT.
Speaker 2 (04:35):
Now when I go to Siri, for example, and I
just ask Siri a question, am I using AI?
Speaker 4 (04:43):
Uh, well, not really. I mean, it's a
blend. I mean, obviously there's a little bit of
machine learning built into what Siri does. I wouldn't call
it overt AI. But if you want to use their
blend of AI, which now includes OpenAI and ChatGPT,
you can say to Siri, ask ChatGPT, and she
(05:04):
will now ping ChatGPT in the background and bring
you your answer.
Speaker 5 (05:08):
So that's kind of a way to shortcut it.
Speaker 2 (05:10):
Okay, so could I use Siri all the time, clearly,
even on the show? So let's say I want to
ask a question. I have Siri right in front of me,
so I have to say, Siri, bring up... Well, I'm
trying to think of something with you, and I'm trying
to make it clean and I can't figure it out.
Speaker 4 (05:31):
Yeah, that's impossible. But you know, if you wanted to
say something like, you know, hey Siri, ask ChatGPT for
the top ten countries to visit that are most popular
in the world, it will go out and, instead of
trying to find that information, you know, usually Siri would
just give you, like, here's what I found on the web,
it will ping OpenAI or ChatGPT and give you
that answer fully.
Speaker 2 (05:52):
So all I have to do is, hey Siri, ask
ChatGPT, and then it comes back on my phone. Does it
talk to me, or am I reading
Speaker 5 (06:01):
it? Both. So I just got the answer.
Speaker 4 (06:04):
It just said asking ChatGPT, and here it is, top
ten most popular countries to visit worldwide, and of course
it just went away on my screen. So anyway, it
is working. But the key is you have to say, Siri,
ask ChatGPT, and that will bring you to that search.
Speaker 5 (06:20):
Now that's a shortcut.
Speaker 4 (06:21):
You can ask Siri a complex question, and if it
doesn't have the answer itself, it may ask you, hey,
do you want me to ask ChatGPT for that answer?
And that's, you know, Apple is riding a very fine
line here, because they want to build the beauty of
ChatGPT into the iPhone, but there's a lot of privacy concerns.
People are worried, like, hold on, I'm asking Siri to
(06:43):
call my mom, or I'm telling it personal information or
a calendar invite or something. I don't want you to
give that information to ChatGPT. So they've built this kind
of secondary system that lets you know every time when
it's going to ask ChatGPT or share your data
with ChatGPT.
Speaker 3 (06:59):
All right, that.
Speaker 2 (07:00):
Works. Rich DeMuro. It is Tech Wednesday, because yesterday Rich
couldn't make it. By the way, Rich, why couldn't you
make it yesterday? Were you rearranging your socks or something?
Speaker 5 (07:13):
Well, that was part of it.
Speaker 4 (07:14):
But you know, I do have a day job at
KTLA and sometimes they ask me to do things that
are a little bit different, and whatever they asked me
to do yesterday broke into my schedule here, So that's
the answer.
Speaker 3 (07:25):
Okay, so it's a food chain issue.
Speaker 2 (07:27):
I get it, no problem, Bill. Believe me, I've been
well aware of food chains most of my life.
Speaker 3 (07:35):
Okay, this is what I talk about a lot.
Speaker 2 (07:38):
Because it is depressing when people get scammed, especially elderly people,
and they see their life savings wiped out and it
breaks your heart. So there is a new scam out there,
passport renewal.
Speaker 3 (07:54):
Would you like to share with us what's going on?
Speaker 4 (07:57):
Yeah, so you know you can now renew your passport
online through the government. I did this earlier this year.
Very easy process. It doesn't take very long at all.
You do it all online as long as you have
a current passport that's not lost or stolen or whatever,
or expired, I think. But anyway, so of course scammers
try to prey upon this, and they know that people
(08:18):
are going to be searching online for passport renewal, and
so what do they do? They make websites that are
bogus and phony and charge you to fill out the
same exact government forms that you would get for free.
I mean, that is the kicker in this whole thing.
You're paying a processing fee to fill out forms that
are typically free. I'm laughing because I'm actually crying inside
(08:40):
that people are already falling for this. If you go
through the Better Business Bureau Scam Tracker website, people are
losing, looks like, an average of one hundred dollars on
this one, probably more for some people. But yeah, the
websites are out there. The main thing to know is
that it is state dot gov. State dot gov is
the website to renew your passport. Some people are getting
(09:00):
their Social Security numbers stolen because they're putting them into
this fake website. And the other thing is, and I
tested this myself, if you go on like a search
engine like Google and just say renew passport online, half
the ads up at the top are websites that are
a third party that are trying to lead you astray.
So just be very careful, be very aware. When I
(09:21):
was doing this, Bill, I double checked the website I
was entering my info into about twenty seven times because
I was so worried that I was like, wait, am
I on the right website?
Speaker 5 (09:30):
Let me make sure, let me make sure, okay.
Speaker 2 (09:32):
And that's not the only one. It's not so much a scam,
because you're actually getting what you're trying to get, you're
just paying for it and you shouldn't. And that's like
where it used to be you could homestead your house
for a couple hundred dollars, which is a waste
of money. There's also one, and there was an account
(09:52):
that I had through my former business that somehow fell
through the cracks, and it went to the state. And
of course I'm getting letters like crazy saying we can
help you, we can help you, we'll take ten percent.
It is a form that you fill out and you
send it to the state, and if there's enough money there,
(10:13):
you get to notarize it and you are done, and
they charge ten percent. And it could be substantial money,
so at least it doesn't clean out your life savings.
Speaker 3 (10:24):
But still it's scam city, and it's.
Speaker 2 (10:27):
Just, by the way, are we as Americans more susceptible?
Do we jump into scams more often than people in
other parts of the world.
Speaker 4 (10:37):
You know, I don't have the numbers on that, but
I would imagine because we are highly connected, there's a
lot of money in the US. There's a lot of people,
and you know, our privacy rules are not very good here,
so there's just something.
Speaker 5 (10:52):
New every day.
Speaker 4 (10:53):
I mean, really, if you want to laugh, Bill, the
Better Business Bureau Scam Tracker, I mean, it's sad, it's
not funny, but they've actually built a pretty good tool
where what happens is people go on there and they
report the scams that they are either a victim of
or that they see, and a lot of people are victims.
So right now you can see there's like you know,
(11:13):
popular scams, or pet scams, pyramid schemes, consumer fraud, lawyers,
Cash App, tax fraud. And because money is so easy
to move nowadays, you know, with these apps, the Zelles
of the world, the Cash Apps, the Venmos, I mean,
that's just opened up a whole new world for these
crooks because you know, you get the money out of
(11:33):
these apps, and you're not getting it back most of
the time. I mean, you can fight for it, but
you know a lot of the times you're not getting
it back, especially the bitcoin stuff.
Speaker 2 (11:41):
And then there is the scam where you get a call about
your kid or your grandkid being held at a prison
in the Philippines and has to make bail, and you
have to send over five or ten thousand dollars immediately.
Is that still
Speaker 6 (11:56):
Going on?
Speaker 4 (11:58):
Even better, now they are texting. Here's the twist on
that one. And Bill, the reason I know these things
is because everyone that follows me, they know I post
this stuff on Instagram, so they are now sending me
every single scam they get on a daily basis. I mean,
it's really wild how many I see, and like, I
try to post the ones to my Instagram, at RichOnTech
by the way, that I find that are new or unique,
(12:20):
so I don't post them all. But the one you're
just talking about, the twist on it is that they
text you and they say, hey, mom, I lost my
phone or I broke my phone, can you text me
at this number or call me at this number real quick?
Because, you know, it sounds realistic, right? I mean, how
many times has someone texted you and you say, who's this?
(12:41):
Oh, I got a new phone. Sorry, I don't have
you in my address book or whatever. So the way
that they scam us is they take what we know
and love and they just twist it a little bit
and put a high tech spin on it.
Speaker 2 (12:51):
So what happens? You call that number, and what kind
of information goes to them if you're simply calling that number?
Speaker 4 (13:01):
I'm guessing whatever they can get out of you. So, hey,
I need to... Now here's the thing. If you're calling
that number and it's not your kid, either they're using
an AI voice, you know, right? That's like another new
scam that's happening, the voice cloning. It may sound familiar
or it may sound like your child. And so what
are they trying to do? Trying to get you to
transfer money instantly. And that's the name of the game
(13:22):
with all of these. With the online scams, a lot
of it is taking control of your computer. Like if
you're falling for one of these, like, you know, your
phone is infected or your computer's infected, you know, they
literally take control of your computer. And once they get
that malware on there, it's like it's game over.
Speaker 5 (13:39):
Yeah, it's tough.
Speaker 2 (13:40):
I remember once I actually got one of these calls
that my kid had been arrested and I have to
send money, and I just said let them keep her,
and it was fine.
Speaker 3 (13:51):
They just dropped it right there.
Speaker 2 (13:53):
Rich, thank you as always. Eleven am to two pm
on Saturday here on KFI, and KTLA, which is much more
important to him than being on the show.
Speaker 3 (14:03):
Instagram at Rich on Tech, website richontech.tv. Rich,
thank you. We'll catch you next week.
Speaker 5 (14:10):
Thanks, Bill, appreciate it.
Speaker 3 (14:11):
Okay, you got it.
Speaker 2 (14:13):
Rich is always a big part of the show. Love
having him on. Before we get to Jim Keeney, a quick
word about Ask Handel Anything. We do that at eight
thirty on Friday after the Foodie Friday segment. And if
you want to ask a personal question, great fun when we
do it during the course of the show. Just click
(14:33):
onto The Bill Handel Show and then click onto the
microphone in the upper right-hand corner and then just
record your question.
Speaker 3 (14:38):
And then Neil.
Speaker 2 (14:40):
Chooses the ones that are most embarrassing for me, and
we play them and I answer. Jim Keeney, who is
with us every Wednesday, and Jim, of course, chief medical
officer for Dignity Saint Mary Medical Center in Long Beach.
Speaker 3 (14:56):
Okay, asthma pill connected with allergies. Asthma, I
guess, is an allergy kind of thing. Jim,
Speaker 6 (15:05):
Explain that. Yeah, so here's what this is about. And I
apologize, it sounds like we might not have a great
connection, so hopefully we'll get through this.
Speaker 5 (15:13):
But good.
Speaker 6 (15:16):
So this is a... so there's a drug that's been out there
for asthma, and it's not to stop an acute
asthma attack, it's to prevent the attack from occurring. So we...
Speaker 5 (15:29):
Do we... Yeah, we have
Speaker 2 (15:30):
a crap line. You were right. Okay, do you want to
call it? Do we call in, or does he call,
or do we just keep on going?
Speaker 6 (15:37):
Yeah, yeah, we tried a couple of times.
Speaker 3 (15:40):
All right, all right, let's just do it. And let's
just do it.
Speaker 2 (15:43):
And so far, you were talking about a drug and,
uh, it can... all right, moving
Speaker 6 (15:51):
on. Yeah, and so, if you can hear me, the thing is that
this drug is made for asthma, but researchers have looked
at it for food allergy and anaphylaxis, the kind of severe
allergic reaction you get when you, you know,
have like a peanut allergy and your blood pressure drops. I mean, it
(16:11):
could be fatal. So right now, what we have are,
basically, we can give people oral, you know, allergens
and kind of slowly ramp up being able to tolerate peanuts,
or expensive antibodies we can give, and they don't really
work in everyone. This has been amazing because it reduces
(16:32):
severe allergy symptoms by ninety five percent. And so this
is a medication, say you're going to go out and
you're worried about, you know, cross contamination of your
food, or you're going to go to a birthday party,
your kids are going to a birthday party, you could
just take this pill before you go. It doesn't have
to be something you take every day. It's an interesting
pathway, if we're going to be able to get... well, see,
(16:54):
but it's an interesting path. Look, this was started in mouse
research. I mean, we found out that the gut in humans
is like that of a mouse, and it
stops the leakiness, you know, to allergens that are
then getting absorbed in the bloodstream. And what happened
(17:15):
is, researchers took that and found a drug, and it's called
Speaker 5 (17:22):
zileuton.
Speaker 2 (17:23):
Okay, a couple of things about animal studies. I know
you wanted to talk about that, and whether this is legitimate, and
if I'm right, let me start with two kinds of
people out there. Those that look at animal studies, mice studies,
and somehow it connects to human beings: look, therefore, if you
get this disease, you're going to be cured, whatever.
And then those people that look at animal studies and
(17:44):
say, how dare you use animals, these innocent animals, for
testing when we now have computer models that can do
the same thing.
Speaker 3 (17:52):
So would you comment, please? Sure.
Speaker 6 (17:56):
I mean, so the story we did last segment,
that was a perfect example of a mouse model
that they used that showed how certain of these enzymes
work in the body and that they can block an
allergic reaction, and that we could translate that directly to humans.
(18:16):
After learning about that, and with a drug that has a
similar effect that's already on the market, that's being
used for asthma, all of a sudden these
researchers realized, wait a second, if this works in a mouse,
it really should translate to humans. And it turns out
it does, and it's pretty dramatic how it does. So
suddenly, by doing mouse research, they found a very low cost,
(18:40):
high quality solution. I mean, this medicine is not super
low cost, but compared to bringing a new drug to
market and going through all of that, we've got a
drug on the market that we can repurpose for a
pretty good effect here.
Speaker 3 (18:54):
And now mouse studies.
Speaker 2 (18:56):
You know, for example, I drink Diet Coke, and I
think it has aspartame in it, and people say I'm going
to die because mice studies show how dangerous it is.
And then I always reply with, you know, how much
aspartame would I actually have to take into my body? It's
equivalent to one thousand cans a day. So you said
with the drug there is a direct connection. Yet
(19:21):
the amount that mice take to either have or cure
a disease is just insane.
Speaker 3 (19:29):
Talk about the difference.
Speaker 5 (19:30):
Please, Yeah.
Speaker 6 (19:32):
So, I mean, there's different ways, right? We
can use bacteria or a mouse or a lot of
different things to determine whether something has a toxic effect,
or whether it has, you know, this kind of effect
where it causes cancer, especially when, you know,
you want to see something that replicates very quickly over
and over again. That's why we use a lot of
(19:53):
bacteria models, to see if it's damaging DNA
and causing some type of mutations. But when you're
talking about, so that's what you're talking about with the
aspartame, is when we're exposing these models to certain chemicals
or anything, we're looking to see, does it cause any damage. Now,
on the other side, when we're trying to find novel
(20:13):
ways to treat a disease, what you're looking for is
some type of model that can prove a theory that says, look,
when we block this, something happens, or when we enhance this,
something happens. And that's where a lot of times these
animal studies can translate directly to humans.
Speaker 2 (20:33):
Do they still do animal studies with Thumper rabbits, for example,
or even the chimpanzees? I know there was a period
of time... which, of course, chimpanzees are just insanely expensive.
How far up the food chain do we go with
animal studies, or has that been way curtailed?
Speaker 6 (20:55):
It's been curtailed because, like you said, we've gotten better
at using certain, you know, smaller rodent types of things
like mice to do studies. It's easier, cheaper. But no,
definitely we're still using, as far as I know,
we're still using primates and things like that for research,
because there's some things that you can't model in a
(21:16):
mouse, or some things that just don't translate or don't
cross over. But absolutely, a lot of that's been curtailed.
I mean, you know, I remember when UCLA
had a pig lab many years ago,
and that was being attacked by animal rights people.
So you know, definitely... Okay, what's that? I was attacked.
Speaker 2 (21:40):
By the beef industry. That was a bad joke, by
the way. I just wanted to point that out. Okay,
I think we'll end it there, Jim, and how we
always end the program, the segment with you, is:
go ahead and kill somebody today, Jim. We'll catch you
next week. Okay, take care. All right, all right, coming
(22:02):
up in just a moment, it's Gary and Shannon, and
we're back again tomorrow. Wake Up Call, Will and Amy,
Neil and I from six o'clock until right now, and
then of course a big part of the show, a
very big part of the show... marginal.
Speaker 3 (22:21):
That's true, but it's Kono. I know you're pointing.
Speaker 2 (22:25):
I was gonna say your name. And this is
KFI AM six forty.
Speaker 3 (22:31):
You've been listening to The Bill Handel Show.
Speaker 2 (22:33):
Catch my show Monday through Friday, six am to nine am,
and anytime on demand on the iHeartRadio app.