December 6, 2024 17 mins
ICYMI: ‘Later, with Mo’Kelly’ Presents – A look at the FBI’s warning to iOS and Android users to avoid ‘texting’ AND a way to see how much Google’s AI is obtaining from your photos on ‘Tech Thursday’ with regular guest contributor (author, podcast host, and technology pundit) Marsha Collier - on KFI AM 640… Live everywhere on the iHeartRadio app

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 2 (00:06):
Let's talk tech and text with Marsha Collier, our resident
tech guru. Marsha Collier, I know you saw the story first.

Speaker 3 (00:17):
Good evening. Hey, good to see you. Good to see you too.
I know you saw the story.

Speaker 2 (00:20):
The FBI has warned iPhone and Android users, which, put together,
means everybody: stop sending texts. And it's not sending texts
from Android to Android or iOS to iOS,
but, you know, the two shall not meet. Don't text
from Android to iOS. What the hell is going on?

Speaker 4 (00:41):
I think the FBI has an excellent PR department that's
trying to build itself up to show its importance.

Speaker 2 (00:48):
Sorry, folks, No, no, no, if we need to go there,
then let's go there, because I would like to think
of myself as somewhat tech savvy.

Speaker 4 (00:57):
Okay, how long have you been texting on your phone?
Since the beginning, right? Back in the pager days?

Speaker 2 (01:03):
Oh no, no, no, even pre-texting, when we had the two-way
pagers and everything.

Speaker 5 (01:07):
Right exactly, so none of that was encrypted.

Speaker 4 (01:11):
And the Chinese know all about that, right, right. They're
done with that.

Speaker 3 (01:16):
Let's walk through this whole thing.

Speaker 5 (01:18):
Right exactly, So that's the way it was.

Speaker 4 (01:21):
Then Apple invented encryption for their users in the iMessage system.

Speaker 3 (01:28):
Those are the blue bubble people.

Speaker 4 (01:29):
Right, and they encrypt from end to end. Now Google
invented RCS messaging, which is encrypted from end to end,
as is Gmail, which is encrypted from end to end. And
I might recommend if you plan on doing something super
hyper secret or something that could get you in trouble,

(01:52):
probably better to use Gmail and have them deleted at
the other end.

Speaker 3 (01:57):
And we would never recommend anything illegal on this show.

Speaker 4 (02:00):
Never, never, But I mean something you didn't want your
family to know about, something you know, because you know,
I really don't think that the Chinese are going to
be looking for birthday plans or Christmas gifts on your text.
But it seems that when Apple graciously decided to adopt

(02:21):
RCS so that there was some symbolic merger of Androids
and Apple. Yes, they do get RCS features, but their
encryption is only within their own ecosystem. So this gift

(02:42):
that Apple gave to Android is not
really a gift. It's wide open and unencrypted. But as
we said at the beginning of the show, think about it.
You have been texting for the past decade and it
hasn't been encrypted, and you know seriously, I mean, oh
my goodness. This is from NBC News:

(03:06):
Amid an unprecedented cyberattack on telecommunication companies such as AT and
T and Verizon, US officials are recommending that Americans use
encrypted message apps to ensure their communications stay hidden from
foreign hackers.

Speaker 2 (03:22):
For those who don't know, RCS is Rich Communication Services,
so you can get your emojis and you know, your
little thumbs up all those things.

Speaker 4 (03:30):
When people send you happy birthday texts, so you're going
to get all kinds of.

Speaker 2 (03:34):
Universe animations and everything. That's RCS. But while I understand
what the FBI is saying, and I understand why
they're saying it, why do you think it's being
said now? Is it because of the holidays? Is it
something maybe that a major tech provider or company might

(03:57):
have been breached and they don't want to say.

Speaker 4 (03:59):
Oh, they have been breached all along.
You are a celebrity, Mo. If something slightly untoward happened
to you, I am sure that you'd have a PR
person jetting out press releases of all the good stuff
you're doing. This is the way, in my humble opinion,
that the FBI is showing how important they are, because

(04:21):
there's no P for protection in FBI.

Speaker 5 (04:24):
But I will give you the solution. If you are slightly.

Speaker 4 (04:28):
Afraid that foreign countries are going to care when your
husband sends you two pictures from the market that say,
do you want this milk or do you want that milk?
If you're really concerned about that, use an app called WhatsApp.
I use WhatsApp. I used it today. I spoke to
my family in England. You can do free long distance

(04:50):
phone calls on WhatsApp. You can text on WhatsApp. Your
pictures don't get screwed up on WhatsApp. You can
send files. It's by Meta, which is Facebook, but don't
confuse it with Facebook Messenger. It's a separate program and
it's called WhatsApp, and honestly I recommend it.

Speaker 5 (05:12):
It is encrypted end to end.

Speaker 4 (05:13):
But the only thing is we've said the word end
to end about twenty times here. Once it gets to
your phone or your computer, then it's all dependent on
what kind of security you have on your device, right
because you know, if you walk by somebody who's got
a WiFi scanner, I guess they could be interested in

(05:35):
reading your texts.

Speaker 2 (05:37):
Or a Stingray. Oh, for example, when I was on,
I don't know if you heard the story I was telling. I
was coming back from Washington, DC, and Doug Emhoff
was on our flight, Secret Service and everything. If you
don't know what a Stingray is, a Stingray will basically
pull down all phone calls and text messages within the

(05:58):
radius of the plane or some small area. And I
knew I had to be very mindful of what I
would text. I can't say, guess who's on the flight.
I'm going to go back there and give them a
piece of my mind, because Secret.

Speaker 3 (06:11):
Service would have been all over me.

Speaker 2 (06:13):
But there's a lot of technology out there which can
suck your texts in.

Speaker 4 (06:18):
Exactly. So, encrypted end to end, it's okay if
it's flying through the universe and through the fiber and whatever,
it's encrypted, but once it gets to your phone, it's
widely readable by anybody, hackable or unhackable. So just think about this.

(06:38):
Don't put anything in a text that you need to hide.
If you want to feel more comfortable, use the
app called WhatsApp.

Speaker 2 (06:48):
That's the biggest takeaway. Don't put anything in texts of value.

Speaker 4 (06:53):
Oh my father told me, never put anything on paper
that you don't want on the front page of the
New York Times.

Speaker 5 (06:59):
And that was the best advice I think he ever
gave me.

Speaker 2 (07:01):
It's true, it's true, and now there is much more
evidence of all the stuff that we don't want to
see the light of day.

Speaker 5 (07:09):
So in wrapping, the FBI has a great PR department.

Speaker 2 (07:14):
When we come back, let's talk about Google's AI
and how it can find even more things about you
just from the photos that you have.

Speaker 5 (07:25):
Oh, we're gonna surprise you with that. You're gonna like it.

Speaker 3 (07:27):
Good surprise or bad surprise.

Speaker 4 (07:29):
It's kind of boring, you know. These big AI companies, they
all got great PR departments.

Speaker 2 (07:35):
Well, we'll talk about it next. Marsha Collier joins me
in studio on Later with Mo'Kelly.

Speaker 1 (07:40):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 3 (07:45):
I continue to be joined in the studio by
Marsha Collier.

Speaker 2 (07:48):
Marsha, I understand that Google has extensive AI capabilities. In fact,
the Google Pixel nine features AI already in the hardware.

Speaker 3 (08:04):
But people are concerned about privacy. Yes, you
have a lot of stuff on your device.

Speaker 2 (08:10):
We talked about text messages and how those might be
used against us. What about our photos? We always worry
about uploading our photos.

Speaker 4 (08:18):
That's right. They have your photos and they can see
everything. Everything, everything. But unfortunately, it's just like texting.
Isn't that exciting?

Speaker 3 (08:29):
M hm?

Speaker 4 (08:31):
Realize that AI is a computer. AI is looking to
build intelligence from your photos, nothing else.

Speaker 3 (08:41):
They are not no Skynet.

Speaker 5 (08:43):
No, there are no Terminators, none of that. Not yet.
It's all very boring.

Speaker 4 (08:49):
Photos are only used to train generative AI models to
help people manage their image libraries.

Speaker 3 (08:56):
That's it.

Speaker 5 (08:57):
They analyze age, location and photos. That's it. It's nothing
to be afraid of.

Speaker 4 (09:04):
So mo I pointed you to a website called they
See Your Photos dot com and you uploaded a photo.

Speaker 2 (09:13):
Yes, and the slug is your photos reveal a lot
of private information. In this experiment, we use Google Vision
API to extract the story behind a single photo. And
if you were to go to my Instagram at mister mo Kelly,
and were to look at my profile photo. That is
the photo that I use for this experiment. It shows

(09:36):
me sitting basically on a television set, where I was
in a lounger chair.

Speaker 5 (09:42):
It shows a pleasant man in his forties.

Speaker 2 (09:45):
Yes, yes, which I'm not. I'm not pleasant, nor am
I in my forties. But we'll take it with a
coffee table in between two chairs, just like a typical
television set.

Speaker 3 (09:55):
Right, and here's the description.

Speaker 2 (09:58):
You can see the photo again at mister mo Kelly
on my Instagram profile picture, and it reads the
photo as follows, quote: The foreground features a middle-aged
black man in a dark blazer and jeans (accurate),
seated on a light gray armchair (accurate). He has a
pleasant expression and appears relaxed.

Speaker 3 (10:18):
Accurate.

Speaker 2 (10:18):
A small dark brown coffee table sits in front of
him holding a small potted succulent. The background is a
large backdrop depicting a hazy, distant view of the Los
Angeles city skyline. It goes on. All of that is accurate,
and it details at least how well it can read
a picture.

Speaker 3 (10:37):
But there's nothing.

Speaker 4 (10:38):
But that's what they're doing so that if they did
a search for offices with succulents in them, if this
was in your Google photos, that might come up in
that description.

Speaker 5 (10:52):
And it is training the AI.

Speaker 4 (10:55):
I'm giving this little example; it sees succulents on a table.

Speaker 2 (11:00):
Well, that's what I learned. It was a plant to me, right?
You know, you're gonna call it a succulent or,
I don't know, a rhododendron.

Speaker 3 (11:07):
I don't know, but it was a plant.

Speaker 4 (11:09):
But I mean, people sent stock images to this Vision API,
and it's able to pick up subtle details in them,
like a person's tattoo.

Speaker 5 (11:22):
The initials, whether it's a picture of a leaf. The
whole point is that it's just a single photo. You're not...

Speaker 2 (11:31):
here's the concern. What happens after I upload that? Now
that picture is gone, but it still retains the information,
the metadata, of that picture.

Speaker 3 (11:40):
How might that.

Speaker 2 (11:41):
Would that be used against me or someone else in
some way in a worst case scenario.

Speaker 4 (11:47):
Never have your picture taken doing anything you don't want
on the front page of the New York Times.

Speaker 3 (11:53):
Okay, so no compromising photos.

Speaker 5 (11:55):
Right. I mean, so what? So what if somebody has... My
dear aunt Anne died, and she was the Mastermind champion,
which is a quiz show in England, kind of like
our Jeopardy and all that.

Speaker 4 (12:10):
She was all over the news when she passed away
because she was in the ultimate finals and it was
a big deal. The BBC was very nice, but news
bureaus were picking up photos of her off of my
Facebook page.

Speaker 3 (12:23):
Yeah.

Speaker 5 (12:24):
I mean the family wasn't overly.

Speaker 4 (12:26):
Thrilled about it, but it was already out there, right? Exactly. And
all it does is help identify these photos. It is
nothing to be massively afraid of.

Speaker 2 (12:41):
Did you hear that, Mark Ronner? It's nothing to be
massively afraid of. You don't need to do any more fearmongering. Okay,
thank you, well, thanks for that. Appreciate that. Come on,
help me feel better.

Speaker 4 (12:53):
Now, yeah, come on, really, what kind of pictures are
you uploading that have that much secrecy? Oh, it has
the geographic location of where you've been. You're obviously sitting
in a television studio. It's no big secret. I mean,
it's obvious. It's Captain Obvious.

Speaker 2 (13:11):
But we've been trained to guard our personal data. And
I'm arguing just the other...

Speaker 4 (13:16):
side. I understand, and personal data is something totally different.
Personal data is talking about your birth year along with
your birthday. Now, Facebook is going to make a big
deal when it's your birthday and you're going to get balloons,
and lord knows, I can't tell you thank you all
four hundred and some odd people who wish me happy birthday.

(13:39):
I haven't had a chance to thank you. Yeah. But
the point is that information, the information you put in
your bios on social media, that's what you have to
watch out for because those can be used along with
the data that has been taken from the breaches. Funny,
you don't see the FBI protecting us from the breach,

(14:00):
and the breaches are where the real damage is done
because if they have your social Security.

Speaker 5 (14:05):
Number. And you all went, I hope, to the

Speaker 4 (14:09):
verified location to freeze your Social Security numbers, and if
you haven't, we can talk.

Speaker 5 (14:14):
I did. Tell us about it. How hard was it?

Speaker 3 (14:18):
Three clicks? Maybe?

Speaker 5 (14:19):
Yeah?

Speaker 3 (14:19):
Right now?

Speaker 4 (14:21):
Mo's Social Security number, no doubt, was in a breach,
and he was notified of it. The only reason you're
notified of it: you can go to a website called
Have I Been Pwned dot com.

Speaker 5 (14:30):
Sign up for that.

Speaker 4 (14:31):
They'll send you notifications and when you find out these things.
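The Have I Been Pwned service Marsha recommends also exposes a developer API (its v3 breached-account endpoint and required `hibp-api-key` header come from the site's public API documentation). A minimal sketch of building such a lookup request with the Python standard library, assuming a hypothetical helper name and user-agent string, and noting that the real key must be obtained from the service:

```python
# Sketch of a Have I Been Pwned (HIBP) v3 breach-lookup request.
# The endpoint and hibp-api-key header are from HIBP's public API docs;
# build_breach_request and the user-agent value are illustrative.
import urllib.parse

HIBP_BASE = "https://haveibeenpwned.com/api/v3/breachedaccount/"

def build_breach_request(email: str, api_key: str):
    """Return the URL and headers for a breached-account lookup."""
    url = HIBP_BASE + urllib.parse.quote(email)   # percent-encode the address
    headers = {
        "hibp-api-key": api_key,                  # required by the v3 API
        "user-agent": "breach-notifier-example",  # HIBP requires a UA string
    }
    return url, headers

url, headers = build_breach_request("listener@example.com", "YOUR-KEY")
print(url)  # https://haveibeenpwned.com/api/v3/breachedaccount/listener%40example.com
```

Sending the request (for example with `urllib.request.urlopen`) returns the list of breaches the address appears in, which is the same information the site's free email notifications deliver.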

Speaker 5 (14:36):
Think about it.

Speaker 4 (14:37):
Somebody could take a job as Mo Kelly. Somebody who
didn't have a Social Security number could get a credit
card. But then, you've already frozen your credit at the bureau.

Speaker 3 (14:45):
I'm just saying they can exactly.

Speaker 5 (14:48):
But if you're smart, and I know everybody listening out
there is smart, and these are not hard to do.
If you're having a problem with it, hit me up
on X. Just ask me and I'll send you a
link. As a matter of fact, there have been
a couple of celebrities who asked me, and I sent
them in a DM how to do it.

Speaker 4 (15:10):
But it's real simple. You just go and you freeze
your credit. You don't have to lock it. You can
still use all your credit cards. Everything is still good.
But nobody can apply for a credit card in your name.
And if you freeze your social Security number, guess what,
nobody can impersonate you to get a job and keep

(15:33):
your reputation safe.

Speaker 3 (15:35):
This world is changing so quickly, so very fast.

Speaker 5 (15:38):
And we keep it in our own hands not to
be afraid of all the baloney that we are fed
by the powers that be.

Speaker 2 (15:46):
In this last moment, I want to talk about what
you brought me when you came into the studio tonight.
What am I holding right now, Marsha Collier? Well.

Speaker 4 (15:55):
You're holding the new second edition of Android Smartphones, which,
Mo, I think when you look at it and get
into it... First of all, it's in slightly bigger type.

Speaker 2 (16:07):
Yes, I'm not a senior yet, but I understand the
value of larger fonts.

Speaker 3 (16:11):
Oh yes, yes, oh yes.

Speaker 5 (16:13):
It's on nice paper.

Speaker 4 (16:15):
It's in full color with lots of screenshots, and I
use my family as examples throughout the book because I
don't like to make up things to show people and
illustrate things. I like to show you my own way
of doing it. So you're seeing my phones. I'm logged
into all the phones in the book, and it will
give you some tips and tricks and some things you

(16:37):
never knew about your phone. And maybe next week we'll
give you a super tip.

Speaker 2 (16:42):
I see that you get the four-one-one
from Google Assistant. I showed my mother how to use
Google Assistant and voice to text, and you can't tell
her anything now she does it. She feels like she
has someone working for her in the house. She does,
setting appointments, making calls, waiting on hold, all those things.
I'm sorry, this can't be my mother, but it's.

Speaker 4 (17:02):
All in there, and people don't use it. If they
don't know about it.

Speaker 3 (17:06):
That's right, and you're here to help them out.

Speaker 5 (17:09):
And we'll have the tip of the week next week.

Speaker 3 (17:11):
That means I'm going to see you next week.

Speaker 1 (17:12):
You got a deal. You're listening to Later with Mo'Kelly
on demand from KFI AM six forty.