Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
We're live everywhere on YouTube, Instagram, Facebook, and the iHeartRadio
app at Mister Mo Kelly. It's Tech Thursday, so you
know what that means. That means Marsha Collier joins us in studio.
Always love to see Marsha Collier. Marsha, it's great to
see you. How are you doing tonight?
Speaker 2 (00:26):
I'm afraid I have bad news this week, tech news,
but it's something we have to talk about.
Speaker 3 (00:32):
It's the rebellion against AI.
Speaker 1 (00:36):
Oh, Mark Ronner needs to pay attention to this.
Speaker 3 (00:43):
Mark never listens to me. That's great.
Speaker 2 (00:46):
They have found that people are starting to rebel. Like, the language
learning app Duolingo has started using AI voices
for their classes, and people don't like that. They
like the sound and the intonation of a human voice,
and that makes sense. We've all heard a computer voice, and
(01:09):
it ain't the same, right? We can still distinguish,
we can still distinguish that. And then the other thing
is reading books. Even on Reddit and all that,
people have threatened to cancel subscriptions to Duolingo
and Audible. They voiced concern for human translators and narrators,
(01:35):
and AI creates inferior experiences, which I think it does.
And especially, this article that I found brought up a
great point: what about if it's reading a romance novel? Oh,
AI can't pull off romance.
Speaker 1 (01:51):
But you just said something that piqued my interest: Audible.
In other words, books on tape, quote unquote "books on tape."
They're having AI voices read them instead of having someone
like you or me give it a genuine treatment.
Speaker 3 (02:07):
It's a whole lot cheaper, and they admit it.
Speaker 4 (02:10):
Well, yes, that doesn't surprise me.
Speaker 2 (02:13):
I mean, you know, current AI can make it sound good,
they said, but it would make a sex scene hilarious.
Speaker 4 (02:23):
Quote.
Speaker 2 (02:23):
Even if they were able to program an AI to
be breathier, to ramp up the rhythm, right, it still
isn't going to be the same as a human narrator
who has experienced a sexual encounter.
Speaker 1 (02:36):
So there, AI. Are you saying that we should train
up AI?
Speaker 4 (02:42):
What's the recommendation here?
Speaker 2 (02:44):
It would have to understand feelings, and it doesn't.
Speaker 3 (02:49):
We're not there yet.
Speaker 2 (02:51):
I mean, it's enough that it's turned off potential customers
and people are just walking away.
Speaker 1 (02:57):
You don't know this, but Mark has made the argument that.
Speaker 4 (03:00):
We, society, do not want AI. It is derivative.
Speaker 1 (03:05):
It cannot create. It can only steal from creators.
Speaker 4 (03:10):
Mark. You can disagree with me or enhance what I'm saying.
Speaker 5 (03:12):
I'm actually kind of in the middle of something right now,
but I think you've got it covered. And I emailed myself,
I believe, one of the articles you guys were talking
about earlier today, just because I wanted to rub your nose.
Speaker 4 (03:24):
In it after last night's spirited fantasy debate.
Speaker 5 (03:29):
Oh, let me tell you. It was the article
about how relying on AI causes your brain to atrophy.
Speaker 3 (03:35):
We were just about to talk about that, right? Thank.
Speaker 4 (03:37):
You for opening that door, Mark. Marsha, take it away.
Speaker 2 (03:41):
Now, I have the study in front of me. It
was from MIT. Now, MIT, those are really smart people. Hey,
I'm pretty smart. Mo is really smart. We're all pretty
smart around here. And if you take all of our
collective brains, this is Tony Stark smart, exactly. This is
so, so I'm going to have to be careful as
(04:01):
I give you the information. So they did a study
on the cognitive cost of using a large language model,
which is AI. They used ChatGPT from OpenAI
as their example. They assigned participants to three groups: the
LLM group, the search engine group,
Speaker 3 (04:21):
And the brain-only group. Remember that one.
Speaker 1 (04:24):
See, no, I'm still in the brain-only group. I'm
not actively using AI for anything yet.
Speaker 2 (04:30):
They had three sessions, the same task three times,
and that task was: each participant used a designated tool,
or no tool if you were in the brain-only group, to
write an essay. Then they used electroencephalography, EEG,
(04:53):
to record the participants' brains.
Speaker 3 (04:56):
This is where it gets really weird.
Speaker 2 (04:59):
So they recorded the activity in their brains to assess the cognitive
engagement and cognitive load as they were performing these tasks. So, okay,
they performed a lot of technical stuff, which I'll skip,
but what they discovered is: brain only, LLM, and
(05:19):
search engine, those were the three groups, the tools they
were allowed to use, and they all had significantly different
neural connectivity patterns. Brain connectivity systematically
scaled down with the amount of external support. So the
more stuff they took from AI, the less their brain participated.
Speaker 4 (05:44):
Don't we understand that?
Speaker 1 (05:46):
Know that on a certain level? The brain is a
muscle; if you're not exercising it, it actually atrophies, to
use the word that Mark.
Speaker 2 (05:52):
Used. Well, absolutely, but we've got kids graduating high school.
Speaker 4 (05:57):
They can't read or write an essay, or.
Speaker 3 (06:00):
Write an essay.
Speaker 2 (06:01):
But I mean, here it was proven: brain connectivity systematically
scaled down with the amount of support. The brain-only
group exhibited the strongest and widest-ranging networks, the search
engine group showed intermediate engagement, and LLM assistance elicited
(06:21):
the weakest overall coupling.
Speaker 1 (06:24):
Let me be a contrarian very quickly, because I remember,
growing up, my family said, you know, you shouldn't use a
calculator because you're not working your brain as far as
doing the math. You shouldn't use a computer because you're
not working your brain.
Speaker 4 (06:37):
This is that, to a degree, it seems. This.
Speaker 2 (06:40):
Is the future, and this is what's scary, because people
can't do math anymore either.
Speaker 3 (06:45):
I'm not... math is not my thing.
Speaker 4 (06:47):
I couldn't do math anyhow, so I'm not losing anything.
Speaker 3 (06:50):
So I mean.
Speaker 2 (06:51):
In the end, the reported ownership of the LLM group's essays
in the interviews was low. The search engine group had
strong ownership, but less than the brain-only group. The
LLM group also fell behind in their ability to quote
from the essays they had just written minutes ago.
Speaker 4 (07:14):
Because they had no firsthand participation.
Speaker 3 (07:16):
They didn't do it themselves.
Speaker 2 (07:18):
And that is what your teachers have been saying all
these years.
Speaker 1 (07:22):
Remember when our teachers would get angry at us for
just using CliffsNotes?
Speaker 2 (07:26):
How about "I want to see your work" in math?
Oh my goodness. Show your work, show your work, show
your work.
Speaker 3 (07:32):
And then they came out with that new math, which
still, to me, makes zero sense.
Speaker 4 (07:36):
It seems like it takes more steps.
Speaker 2 (07:38):
It seems to, but that faded out because it did. So anyway,
bottom line: use as little AI as necessary. Like I
told you, I went over my insurance policy with AI.
That was helpful, because I know nothing about insurance.
Speaker 1 (07:56):
Great, great, great recommendations and tips. When we come back,
we have to get to the other part of the
bad news: another major breach. And Marsha, you are
knee-deep in this.
Speaker 4 (08:08):
As far as trying to get to the bottom of it.
Speaker 2 (08:10):
I've been trying to contact Google, so we'll talk about that.
Speaker 4 (08:15):
It's Later with Mo Kelly.
Speaker 1 (08:16):
Marsha Collier joins us in studio, Tech Thursday, KFI AM
six forty, YouTube, Instagram, and Facebook at.
Speaker 4 (08:22):
Mister Mo Kelly. We're live everywhere in the iHeartRadio app.
Speaker 5 (08:25):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 4 (08:31):
KFI AM six forty is Later with Mo Kelly.
Speaker 1 (08:33):
We're live on YouTube, join the party, join the Momigos
in Motown. We're live on Instagram, and now we're live
on Facebook at Mister Mo Kelly, and we're live everywhere in
the iHeartRadio app. Let's continue Tech Thursday with Marsha Collier,
who's still joining us in studio. Marsha, seems like you've
got more bad news to tell us.
Speaker 3 (08:52):
Oh, this is a humdinger. If you're gonna use that phrase,
this is it.
Speaker 2 (09:01):
Not only was there a breach recently of one hundred
and eighty-four million logins on social media, and we thought
that was big. Now we have a new one. It's only
sixteen billion names and passwords. And what companies are they
Speaker 3 (09:25):
From? Apple? You've got an Apple password?
Speaker 4 (09:30):
Google? I got some of those. Facebook? Definitely got some
of those.
Speaker 3 (09:34):
Telegram? Don't have that.
Speaker 2 (09:36):
And a bunch of VPNs, and I have a feeling
there are going to be more, because no official word has
come out.
Speaker 3 (09:46):
So before I came to the show, I messaged on.
Speaker 2 (09:53):
X to Google. That was at, what, two thirty-five today? "Google,
do you suggest everyone change their
Google account password now, to protect their Gmail, YouTube, Password
Manager, and more?"
Speaker 3 (10:09):
No response. That's what I'm hearing: crickets.
Speaker 1 (10:15):
Well, let me ask you this. Let's tie this back
to what you were saying, I think it was last week
or the week before: two-factor authentication. Let's say our
information is out there, someone gets our password, and it's.
Speaker 4 (10:25):
The correct password, correct username, and they log in. Won't
two-factor authentication protect us?
Speaker 3 (10:31):
Yes?
Speaker 2 (10:32):
But what will protect us even more is, well, you've almost
pulled me over to the YubiKey side. Okay, a
YubiKey, so you know what it is:
it's a little hardware device, looks like a little key,
plugs into one of your USB ports, or it plugs
in at the bottom of your phone, and that locks
the data in your phone. You can't use the data,
(10:56):
the passwords, unless you have it plugged in.
Speaker 3 (11:00):
And I'm just too afraid I'm going to lose it.
Speaker 4 (11:03):
I know I would.
Speaker 1 (11:04):
And also, for me, and I thought about getting one,
it would get in the way of my charging. It
might get in the way of my headphones, just the
normal day-to-day use.
Speaker 2 (11:12):
And if I put it on a keychain, I'd figure
the clanking of the keys was.
Speaker 3 (11:15):
Going to break it. Yeah, something bad was going to happen.
Speaker 2 (11:18):
But anyway, this is, it's a lot of credentials. Think
about your crypto. Huh, you didn't think of that, did you?
Because that's stored in there too. Think about your email,
your messaging, your banking information. What can these
(11:39):
people do? Account takeovers, real simple. When I messaged, that
wasn't a message, that was a public tweet. When I tweeted,
or X'd, to Google, I saw they were getting lots
of... You have to click on the replies. You go
to their page and you click on the replies, because
they don't show that in their regular stream.
Speaker 3 (12:01):
The regular stream is.
Speaker 2 (12:02):
Nice PR and bravo for you, but at least they
do address some of the people. But they were getting
one after another from people whose accounts were taken over
and who could not get back into their accounts. So
what I'm telling you is, you need to do this. I don't
want to do this myself. I've not done it because
(12:23):
I don't want to do it. But for your Google or
your Apple account, change the password today. Or do you
want to gamble that you're special, that you're not one
of the sixteen billion?
Speaker 4 (12:39):
I know I'm not special.
Speaker 2 (12:42):
With my luck, you know. Definitely, you've got to do it.
Have two-factor authentication on everything from now on.
Speaker 4 (12:50):
I got that, that's it.
Speaker 2 (12:52):
I do too. But I mean, the havoc that this
can wreak on your life... And Mo, we've got a
minute here. What are we going to do about this?
Speaker 3 (13:01):
What?
Speaker 4 (13:02):
What can we do?
Speaker 3 (13:03):
What can we do? Not reuse passwords. Well, yeah, we
reuse passwords. We're human beings, brain only.
Speaker 2 (13:14):
Yes, I let Google assign, you know, Google Password
Manager assign one of those wacky passwords that I would
never know.
Speaker 1 (13:20):
Look, that's why I usually have to do "lost
password," because I use one of those arbitrary, self-generated
passwords. I've never even looked at one,
much less tried to remember it.
Speaker 2 (13:32):
Well, you know, I've found, and I've only used Google,
so I'm sure that Apple and others do a good
job at it too, self-generated passwords, but they haven't missed
a trick. If I am on one device and it's
set up with my Google Password Manager, it always works.
And I'm always a little paranoid about using my biometrics,
(13:55):
with my thumb or whatever finger, because what if,
if I've cut myself? What if I picked up a
pan and burned off my fingerprint? It can happen, right?
I mean.
Speaker 4 (14:07):
It can. There are days where it doesn't recognize your.
Speaker 2 (14:10):
Finger. So we just have to be super careful, and please,
please, please, please change your passwords ASAP.
Speaker 4 (14:20):
But they're not going to. Are you gonna?
Speaker 1 (14:23):
Yes, I am. Yes, I am, probably tonight when I get home,
when I have a chance to do it. That's when
I try to do all my stuff, and I try
to do it all at once. That's something else that
I would recommend: just do them all
at once while you're thinking about it.
Speaker 2 (14:36):
If you change your main one... there are so many I
don't care about. Please do! You can hack into my gas,
SoCalGas, right? Help pay the bill. Pay my bill! Same
thing with DWP. Knock yourself out.
Speaker 1 (14:54):
You know, I would say, social media, email, banking information.
Speaker 3 (14:57):
Now, you don't want social media.
Speaker 2 (14:59):
Somebody could... you could be sleeping and some idiot could
be in there wreaking havoc.
Speaker 1 (15:05):
Oh, that's what I'm saying. That's why we change those passwords,
because they are important, and someone can really wreak havoc in.
Speaker 2 (15:10):
Your life. And banking, stuff like that, obviously you've got
to do it. I don't know how else we
can protect ourselves, because the government isn't going to protect
us, and the big businesses are not going to back
us up and pay us for any damage they caused.
Speaker 4 (15:26):
I just liken it to, look.
Speaker 1 (15:28):
Okay, some fool out there stole the master key, so
you've got to change your locks at home. The
alternative is getting robbed, so just go ahead and change
your locks.
Speaker 3 (15:38):
Oh lord, lord. What do you know? You've got to use
your brain.
Speaker 2 (15:41):
Yep, fake people are reading Audible books for us. I mean,
the world is changing, folks, and we have to take
responsibility for some things and keep learning. And that's
why I'm here every week, because, hey, we've got to
keep learning.
Speaker 4 (15:57):
And very quickly.
Speaker 1 (15:58):
Of course, for those who might be hearing and seeing
you for the first time, how can they reach you?
Speaker 4 (16:02):
Marsha Collier?
Speaker 2 (16:03):
I am Marsha Collier. I'm the author of over forty-seven
books in the For Dummies book series. You can
find my books on Amazon dot com. You can find
my website at Marsha Collier dot com. And I'm on
X at Marsha Collier.
Speaker 1 (16:19):
And I always have to throw this in here, because
people assume that it's spelled one way when it's spelled
the other way.
Speaker 4 (16:23):
M A R S H A, C O L L
I E R.
Speaker 3 (16:27):
Thank you. Okay.
Speaker 1 (16:29):
It's Later with Mo Kelly. Marsha Collier, great to see you,
and I hope to see you again next week.
Speaker 3 (16:32):
Next week. It's a date.
Speaker 5 (16:35):
You're listening to Later with Mo Kelly on demand from KFI AM
six forty.