Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
Okay, we're live everywhere on YouTube, Instagram, Facebook, and the
iHeartRadio app. It's Tech Thursday. That means Marsha Collier joins
us in studio. As always, it's wonderful to see you,
Marsha Collier. Thank you so much for coming in tonight.
Let's dive right into this since we're a little bit
behind on the clock. We are inundated with all things AI,
(00:29):
and one of the continuing things we've talked about is
protecting our privacy, that's for sure. What is the intersection
of AI and privacy, from where you sit?
Speaker 2 (00:37):
AI, and there is no more privacy. There are
some things you can do when you use AI tools.
And as I said, we're short on time, we may have
to carry it over, but I think this is important.
You know, there are quite a few AI tools that
you can deal with. Yes, there's Grok from X. DuckDuckGo
(01:00):
has Duck.ai. There's Perplexity, there's Proton Lumo,
Microsoft Copilot, and from Anthropic, Claude, I call it
Claude, and Google Gemini, and ChatGPT, which is
OpenAI's.
Speaker 1 (01:21):
Is there a huge difference between these different chatbots or
AI interfaces?
Speaker 2 (01:26):
I think so, I think so, from what I've done in
my own experimentation. First of all, I don't recommend downloading
the app, because you're just opening the door. Hello, come in,
read everything I have right here. Yeah, exactly, just for
downloading the app. And you don't have to do that.
(01:47):
They all have a website. You can go to the
website, and they all have some sort of free tier
that you can use. You may run out of
time and space, but just play with it and see if you
like it.
Speaker 3 (02:00):
And two, if you have a
Speaker 2 (02:02):
question, let's say your rhododendrons are dying in your garden,
I mean, pick something like that. Ask something about
rhododendron culture and why this is happening. Ask it of two or
three different ones, and see what kind of answer you get back.
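A minimal sketch of that experiment in code, for anyone who wants to script it rather than use the websites: the base URLs, model names, and environment variable names below are assumptions to check against each provider's docs, and it leans on the fact that several vendors expose OpenAI-compatible endpoints, so one client library covers all of them.

```python
# A sketch of the "ask two or three different AIs the same question"
# experiment, done over each vendor's API instead of the website.
# Base URLs, model names, and env-var names are assumptions; check
# each provider's docs. All three expose OpenAI-compatible endpoints.
import os

from openai import OpenAI

QUESTION = "My rhododendrons are dying. What are the most common causes?"

# (label, base_url, env var holding the key, model) -- assumed values
PROVIDERS = [
    ("OpenAI",     None,                        "OPENAI_API_KEY",     "gpt-4o-mini"),
    ("Perplexity", "https://api.perplexity.ai", "PERPLEXITY_API_KEY", "sonar"),
    ("xAI Grok",   "https://api.x.ai/v1",       "XAI_API_KEY",        "grok-2-latest"),
]

for label, base_url, key_env, model in PROVIDERS:
    # base_url=None falls back to the SDK's default OpenAI endpoint
    client = OpenAI(base_url=base_url, api_key=os.environ[key_env])
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": QUESTION}],
    )
    print(f"--- {label} ---")
    print(reply.choices[0].message.content)
```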
Speaker 1 (02:19):
When I use Google, I'm just looking up, Hey, what
time does the movie start?
Speaker 3 (02:23):
Or whatever. Google
Speaker 1 (02:24):
now integrates with Gemini. It gives me
an AI overview, regardless of whether I ask for it
or not. So it seems like it's being integrated everywhere,
whether I'm looking for it or not.
Speaker 2 (02:37):
Kind of like our governor said, whether you like it or not.
Speaker 1 (02:40):
True, true. So, and I know
Mark Ronner probably thinks that this is going to be
the end of civilization as we know it.
Speaker 3 (02:51):
Paraphrase it: it may well be. Honestly, wait, which part
of it?
Speaker 4 (02:55):
Now?
Speaker 5 (02:55):
I was doing something else and then just heard
you're trying to needle me about AI.
Speaker 3 (03:00):
No, no, no, we were.
Speaker 1 (03:01):
Saying that AI is embedded in search engines, and we're
making the point, like, whether you'd like it or not,
AI is going to be integrated in most facets of life.
The cameras that we're using on this show tonight, there's
AI where they're studying our faces and our movements, and
it's trained to recognize us, identify us, and follow us
(03:23):
in the studio.
Speaker 3 (03:24):
Yeah, you know.
Speaker 5 (03:24):
Earlier in the day, I bookmarked a tweet from somebody
that I wanted to share with you. It says: telling
people who oppose AI "have fun getting left behind" doesn't
really work as a response. They've seen where AI is
leading people in society and have decided that's not where
they want to go. When the car's speeding towards a cliff,
getting left behind is just fine.
Speaker 2 (03:43):
Well, I'll tell you Mark, I was just saying, you know,
downloading the apps is just opening the door.
Speaker 3 (03:49):
Yeah, you're putting it on your device.
Speaker 2 (03:52):
If you really want to do it, each one has
a website; you can go and
ask the questions.
Speaker 5 (03:58):
It was.
Speaker 2 (03:58):
I think I mentioned once my earthquake insurance policy. I
took the details from that and I gave it to
three different ais and asked their opinion on it. Boy
did they have opinions. But what was interesting is chat
GPT was more empathetic. Grock had a lot of thinking
(04:20):
to do, and the other one was just, you know,
kind of very factual. The thing that I wonder about
it is, I have the answer, are your chats used
for targeted ads?
Speaker 3 (04:33):
What do you think?
Speaker 1 (04:34):
I think obviously any type of data which is being
used and stored and sold is going to be used
for marketing purposes at some point.
Speaker 3 (04:43):
It's all fair game, and it shouldn't be.
Speaker 2 (04:46):
Well, ChatGPT, OpenAI's privacy policy, says it does
not sell or share personal data for contextual behavioral
advertising. Lies. Claude doesn't use conversations for targeted ads. Lies.
Google. Lies. They use your chats. This is in their privacy policies.
(05:12):
What can I say? But then there are the ones that say it and
admit to it.
Speaker 4 (05:18):
Yes.
Speaker 2 (05:19):
Perplexity, which is really popular now, is worth trying out. Again,
go to the website. Ask a question that's bothering you.
You know, what can I substitute for?
Speaker 3 (05:31):
I don't know, but obviously don't give it personal information.
Speaker 2 (05:36):
Well, that's the problem, if you give it personal information.
But what I wanted to tell you is Microsoft, you
know, Copilot is baked into the computer.
Speaker 3 (05:47):
If you don't want it
Speaker 2 (05:50):
using your data for targeted ads and other things, what
you need to do is click your profile
image in the upper right-hand corner, then your name, then
Privacy, and disable personalization and memory. A separate link disables all
(06:10):
personalized ads for your Microsoft account.
Speaker 3 (06:13):
I have a question.
Speaker 1 (06:14):
You may not know the answer to this because I'm
springing it on you. If I understand you correctly,
any Windows computer or laptop that we purchase at this
point will have Copilot loaded on it, won't it?
Speaker 2 (06:30):
Not all of Copilot. You still have to give permission
for the extended version. For the extended version? Okay. This
is one of the reasons why I'm testing and forcing
myself to use one of the new super pro Chromebooks. Boy,
it's a challenge, but it's all different, and I
figure we're keeping the data with one person, one company.
Speaker 3 (06:54):
You hope. I hope. When we come back,
Speaker 1 (06:56):
we're going to continue this conversation with Marsha Collier,
maybe get into what Samsung is up to in
advance of what Google is getting ready to let us
know they're up to.
Speaker 3 (07:05):
AI just gave the game away.
Speaker 1 (07:10):
It's Later with Mo Kelly. It's Tech Thursday. Marsha Collier
joins us in studio. KFI AM six forty.
Speaker 4 (07:16):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 3 (07:23):
KFI AM six forty. It's Later with Mo Kelly.
Speaker 1 (07:25):
We're live on YouTube, Instagram, Facebook, and the iHeartRadio app,
and we're live right now in studio with Marsha
Collier as we talk about all things tech on this Thursday. Marsha,
I know that beyond the conversation of AI, every single
device in the IoT, the Internet of Things, will
include AI going forward.
Speaker 3 (07:46):
Is that a reasonable assumption?
Speaker 2 (07:47):
Your refrigerator is going to have AI, your stove
is gonna have AI. It's going to be everywhere. But
the only thing you can do is you can regulate
how much of it you're gonna let in. When you
have to agree to extra terms because you buy something,
just by buying it and using it, you're agreeing to
(08:08):
certain stuff. But when they want to add stuff on
and they ask for more, click here to agree, you'd
better read what you're agreeing to. And why not copy and
paste it into another AI and see what it really says?
Just ask for a summary or a bulleted-point
(08:29):
version of the document. That's what it's good for, and that's
what I use it for.
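A minimal sketch of that copy-and-paste tip using the OpenAI Python SDK; the model name and the prompt wording are assumptions, and any chat model from any provider would do the same job. Per the advice elsewhere in the segment, strip personal details from the text before sending it.

```python
# A sketch of the "paste the new terms into an AI and ask for a
# bulleted summary" tip. The model name is an assumption; any chat
# model works. Remove personal details from the text before sending.
import os

from openai import OpenAI


def summarize_terms(terms_text: str) -> str:
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {
                "role": "system",
                "content": (
                    "Summarize these terms of service as a short bulleted "
                    "list. Flag anything about data collection, sharing, "
                    "or advertising."
                ),
            },
            {"role": "user", "content": terms_text},
        ],
    )
    return reply.choices[0].message.content


# Usage: copy the click-to-agree text into a file, then:
#   print(summarize_terms(open("new_terms.txt").read()))
```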
But there are people using AI now, I'm sure Sam
can tell us about that, and they're falling in love with AI,
their supplemental girlfriends and boyfriends.
Speaker 1 (08:48):
And Sam was telling us about that. We're talking about
Sam the Sex Doctor. How people, and I think there's
a generational difference, find that AI serves a utility, companionship,
Speaker 3 (09:02):
in their lives in a way that doesn't make sense in my life.
Speaker 1 (09:06):
I'm still about the actual person, even though I may
talk to someone only via screen.
Speaker 3 (09:12):
I know that there is a person on the other side.
Speaker 2 (09:14):
Well, but don't you remember the days when you called
one of those date lines? I never did that,
but I'm sure you did, but I remember seeing
Speaker 3 (09:23):
the ads. Yeah, no, you.
Speaker 2 (09:30):
I think every generation had their own version of it.
I mean, now it's just gotten further. And as soon
as the robots really start looking good? I see that
one robot at all the tech conferences, and they've improved
her every year.
Speaker 1 (09:43):
Oh look, it's going to be here, that actual physical companion,
I'll just say companion. Yeah, that is going to replace
a human within someone's life.
Speaker 2 (09:54):
Oh, absolutely, absolutely. But it will be good in certain ways,
like people need caretakers, people who've had surgery and need
a nurse to watch them. They can take pictures, they
can send pictures back to a doctor. They can follow instructions.
I mean that can be done. We have the technology
(10:15):
for that. Now, we just have to make them look
a little less scary.
Speaker 3 (10:18):
I think.
Speaker 1 (10:19):
Yeah, they haven't quite mastered the facial expressions, the eyes,
where it looks like it's looking at you as opposed
to past you, the subtleties which make it feel more human.
Speaker 3 (10:32):
Well, the humanity.
Speaker 2 (10:34):
It's just like on some radio stations they use AI
to read the news.
Speaker 3 (10:40):
I was telling Mark that; he didn't want to hear that.
He's saying it was an abomination, that no one wants it.
Speaker 1 (10:44):
I said no, the corporate overlords absolutely want it.
Speaker 5 (10:48):
Yeah, the corporate overlords do, because they'll make money and
they won't have to pay a human
Speaker 3 (10:52):
being, who has to survive. Correct.
Speaker 5 (10:54):
The rest of humanity doesn't want it and has no
interest in listening to an AI deliver the news or
anything else.
Speaker 2 (11:00):
When we're off the air, I will be glad to
tell you who, when and where and who has it.
Speaker 1 (11:07):
And some of it is damn good, as in
pretty hard to distinguish.
Speaker 3 (11:11):
It's trained off a real person.
Speaker 4 (11:13):
Well.
Speaker 5 (11:13):
Also, if you think that a journalist is just a newsreader,
I got a few things to tell you.
Speaker 3 (11:18):
Something so altruistic.
Speaker 1 (11:19):
Okay, we all know that it's not the same. An
actual journalist is making editorial decisions.
Speaker 2 (11:25):
A journalist is not a newsreader, and they're not just a presenter.
There's a difference between a presenter and a journalist. A
journalist finds the stories, or does the stories, that the
presenter presents, and the presenter can be AI very easily.
Speaker 5 (11:45):
This is exhausting and depressing.
Speaker 3 (11:48):
The end is nigh. Yeah.
Speaker 2 (11:50):
I mean, the journalist can have the human intuition, and
that's one of the things that the AIs lack, even in
the voices of the AI. You don't hear the cadence
of a human, the way we talk.
Speaker 3 (12:04):
Yeah, yeah, it's very unappealing.
Speaker 5 (12:06):
And by the way, I think it's sometimes a mistake
to try and separate those elements out into discrete, singular elements.
I mean, I'm all those things. I read the news,
but I'm also a journalist. But you're also more expensive. Not
that much. You're more expensive than zero.
Speaker 2 (12:21):
Yeah, more than zero. Yes, yep, more expensive than zero.
But you know, I know you guys talked about that.
But AI, all you have to do is think about it:
human empathy is one thing it can't yet, and I
don't think it will in our lifetimes, put into the
equation of its decision-making. I hate to be conspiratorial.
Speaker 1 (12:46):
I think it exists, but probably on a military level,
not consumer use.
Speaker 3 (12:52):
What, like? For example, I make this analogy all the time.
Speaker 1 (12:55):
You look at the satellite photos of the nineteen sixties,
and with the exception of colors, pretty damn good.
Speaker 3 (13:02):
And that was the nineteen sixties. Okay. Who knows what
they have today except the pictures of the moon.
Speaker 2 (13:09):
The pictures of the moon that they did in
the olden days don't even look like they're on the moon.
Speaker 3 (13:14):
So you know, there's that she was talking about. I know,
Stanley Kubrick. Yes, I know, this is Capricorn One. I know.
Speaker 1 (13:22):
But my point is, I'm quite sure what we
as consumers are seeing and using is probably a small percentage
of the true capability, which has already been reached.
Speaker 2 (13:35):
Absolutely, I mean, there's no question about it. But still
there will have to be the human because a mathematical
decision isn't always the best one. No, I mean, do
you want to continue to send someone for medical treatment,
and yes, it's going to work,
and yes it's going to cure them eventually, but the
(14:00):
years that they may have to spend in treatment may
eat up too much of their life, too much of
their happy time, too much of their family time.
Speaker 1 (14:09):
Now there has to be a human element, and this
is where I do agree with Mark. We humans are
still needed in the equation of AI's capabilities.
Speaker 3 (14:21):
It can't do anything without us.
Speaker 1 (14:22):
It still has to learn from articles which
are actually already in existence; it's taking information that we're
inputting into it. So AI is not exclusive of us.
It has to be inclusive of us, so far, at
least for now.
Speaker 2 (14:36):
Well, there's a new app called NotebookLM from Google,
and someone has taken my books and gone chapter by chapter.
Oh gosh, and it makes a two-person podcast. And
to God, this is the truth, in the two-person
podcasts you'd swear it was people. But it is in that
halting AI voice, and they discuss each chapter of my
(15:03):
book just like they were reviewing the book, the
whole thing. And there's nothing I can do about it.
Speaker 3 (15:08):
Are they selling it?
Speaker 2 (15:11):
Nope, but they're getting clicks, they're advertising.
Speaker 5 (15:16):
See.
Speaker 2 (15:16):
Now, we make money in many different ways, and this
will just be another way to make money.
Speaker 3 (15:23):
I just want to leave you with one thought.
Speaker 5 (15:25):
We keep talking about AI replacing workers and professionals and
journalists and whatnot. You know what would be the best
use of AI? It would be to replace CEOs, because those
things are expensive and, as far as I can tell,
they don't do that
Speaker 2 (15:36):
much. And they won't need that balloon at the end
when they get fired, the parachute.
Speaker 1 (15:43):
The only problem is this: it's the CEO who would
have to make the expenditure and also the decision. And just
like Congress, they're not going to do anything which might
end up replacing them.
Speaker 3 (15:54):
It's going to be a tough sell and it may
require some force.
Speaker 2 (15:57):
Yeah, let's just go back to the gold standard and
we'll figure it out from there.
Speaker 3 (16:00):
Well, that ain't gonna happen.
Speaker 1 (16:01):
So, Marsha Collier, it's always great to see you. Love
having you in the studio, and I hope we get to
do it again this time next week.
Speaker 3 (16:08):
I hope so too.
Speaker 4 (16:10):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty