
December 6, 2024 33 mins
ICYMI: Hour Two of ‘Later, with Mo’Kelly’ Presents – Thoughts on the FBI’s warning to iOS and Android users to avoid ‘texting’ AND a way to see how much Google’s AI is obtaining from your photos on ‘Tech Thursday’ with regular guest contributor (author, podcast host, and technology pundit) Marsha Collier…PLUS – A look at Fullerton College’s new “Drone Flying” bachelor’s degree program with one of the Hornets’ most famous alums, and KFI’s very own ‘Producer Extraordinaire,’ Michelle Kube Kelly - on KFI AM 640…Live everywhere on the iHeartRadio app
Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 2 (00:06):
Let's talk tech and text with Marsha Collier, our resident
tech guru. Marsha Collier, I know you saw the story first.

Speaker 3 (00:17):
Good evening. It's good to see you.

Speaker 4 (00:18):
Good to see you too.

Speaker 3 (00:19):
I know you saw the story.

Speaker 2 (00:20):
The FBI has warned iPhone and Android users, put together,
that means everybody: stop sending texts. And it's not sending
texts from Android to Android or iOS to iOS,
but the two shall not meet. Don't text
from Android to iOS. What the hell is going on?

Speaker 4 (00:41):
I think the FBI has an excellent PR department that's
trying to build itself up to show its importance.

Speaker 2 (00:48):
Sorry, folks, no, no, no, if we need to go there,
then let's go there, because I would like to think
of myself as somewhat tech savvy.

Speaker 4 (00:57):
Okay, how long have you been texting on your phone
since the beginnings?

Speaker 2 (01:01):
Right?

Speaker 3 (01:02):
Pregate?

Speaker 2 (01:03):
Oh no, no, no, even pretexting when we had the two
way pagers and everything.

Speaker 4 (01:07):
Right, exactly. So none of that was encrypted. The
Chinese know all about that, right? Right. They're done with that.

Speaker 3 (01:16):
Let's walk through this whole thing.

Speaker 4 (01:18):
Right, exactly. So that's the way it was. Then Apple
invented encryption for their users in the iMessage system.

Speaker 3 (01:28):
Those are the blue bubble people.

Speaker 4 (01:29):
Right, and they encrypt from end to end. Now Google
invented RCS messaging, which is encrypted from end to end,
as is Gmail, which is encrypted from end to end.

Speaker 1 (01:44):
End.

Speaker 4 (01:45):
I might recommend if you plan on doing something super
hyper secret or something that could get you in trouble,
probably better to use Gmail and have them deleted at
the other end.

Speaker 3 (01:57):
And we would never recommend anything illegal on this.

Speaker 4 (01:59):
Show, never, never, But I mean something you didn't want
your family to know about, something you know, because you know,
I really don't think that the Chinese are going to
be looking for birthday plans or Christmas gifts on your text.
But it seems that when Apple graciously decided to adopt

(02:21):
RCS so that there was some symbolic merger of Androids
and Apple. Yes, they do get RCS features, but their
encryption is only within their own ecosystem. So this gift

(02:42):
that Apple gave to Android is not
really a gift. It's wide open and unencrypted. But as
we said at the beginning of the show, think about it.
You have been texting for the past decade and it
hasn't been encrypted. And, you know, seriously, I mean, oh
my goodness. This is from NBC News:

(03:06):
Amid an unprecedented cyber attack on telecommunication companies such as AT&T
and Verizon, US officials are recommending that Americans use
encrypted messaging apps to ensure their communications stay hidden from
foreign hackers.

Speaker 2 (03:22):
For those who don't know, RCS is Rich Communication Services,
so you can get your emojis and, you know, your
little thumbs up, all those things.

Speaker 4 (03:30):
When people send you happy birthday texts, you're going
to get all kinds of

Speaker 2 (03:34):
Universe animations and everything, that's RCS. But, while I understand
what the FBI is saying, and I understand
why they're saying it: why do you think it's being
said now? Is it because of the holidays? Is it
something maybe that a major tech provider or company might

(03:57):
have been breached and they don't want to say.

Speaker 4 (03:59):
Oh, they have been breached, all along and all along.
You are a celebrity, Mo. If something slightly untoward happened
to you, I am sure that you'd have a PR
person sending out press releases of all the good stuff
you're doing. This is the way, in my humble opinion,
that the FBI is showing how important they are, because

(04:21):
there's no P for protection in FBI. But I will
give you the solution. If you are slightly afraid that
foreign countries are going to care when your husband sends
you two pictures from the market that say, do you
want this milk or do you want that milk? If
you're really concerned about that, use an app called WhatsApp.

(04:43):
I use WhatsApp. I used it today. I spoke to
my family in England. You can do free long distance
phone calls on WhatsApp. You can text on WhatsApp. Your
pictures don't get screwed up on WhatsApp. You can
send files. It's by Meta, which is Facebook, but don't

(05:04):
confuse it with Facebook Messenger. It's a separate program and
it's called WhatsApp, and honestly I recommend it. It is
encrypted end to end. But the only thing is we've
said the word end to end about twenty times here.
Once it gets to your phone or your computer, then

(05:24):
it's all dependent on what kind of security you have
on your device, right because you know, if you walk
by somebody who's got a WiFi scanner, I guess they
could be interested in reading your texts.

Speaker 2 (05:37):
Or a Stingray, or if they had, like a... Oh,
for example, when I was on... I don't know if
you heard the story, I was telling the story. I
was coming back from Washington, DC, and Doug Emhoff
was on our flight, Secret Service and everything. If you
don't know what a Stingray is, a Stingray will basically
pull down all phone calls and text messages within the

(05:58):
radius of the plane or some small area. And I
knew I had to be very mindful of what I
would text. I can't say, guess who's on the flight.
I'm going to go back there and give them a
piece of my mind, because Secret Service would have been
all over me. But there's a lot of technology out
there which can suck your texts in.

Speaker 4 (06:18):
Exactly. So, encrypted end to end, okay: if
it's flying through the universe and through the fiber and whatever,
it's encrypted, but once it gets to your phone, it's
readable by anybody, hackable or not. So just think about this.

(06:38):
Don't put anything in a text that you need to hide.
If you want to feel more comfortable, use the
app called WhatsApp.
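
Since "end to end" is doing so much work in this exchange, here is a deliberately toy Python sketch of the idea, purely for illustration: WhatsApp actually uses the Signal protocol, and this XOR scheme is not real cryptography. The point is only that when a key is held by just the two endpoints, whatever a carrier or interceptor captures in transit is unreadable.

```python
import hashlib
from itertools import cycle

def toy_encrypt(key: bytes, message: bytes) -> bytes:
    # TOY illustration only -- NOT real cryptography. WhatsApp actually
    # uses the Signal protocol. This just shows the end-to-end idea:
    # XOR the message with a keystream derived from a key that only
    # the two endpoints hold. Applying it twice decrypts.
    stream = hashlib.sha256(key).digest()
    return bytes(m ^ s for m, s in zip(message, cycle(stream)))

key = b"shared-only-by-the-two-phones"     # never given to the carrier
plaintext = b"guess who is on my flight"
wire = toy_encrypt(key, plaintext)          # what a relay or interceptor sees
print(wire != plaintext)                    # gibberish in transit
print(toy_encrypt(key, wire) == plaintext)  # readable only at the endpoints
```

The second point in the segment still applies: once the message is decrypted on the receiving device, its safety depends entirely on that device's own security.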

Speaker 2 (06:48):
That's the biggest takeaway. Don't put anything in texts of value.

Speaker 4 (06:53):
Oh my father told me, never put anything on paper
that you don't want on the front page of the
New York Times. And that was the best advice I
think he ever gave me.

Speaker 2 (07:01):
It's true, it's true, and now there is much more
evidence of all the stuff that we don't want to
see the light of day.

Speaker 4 (07:09):
So in wrapping, the FBI has a great PR department.

Speaker 2 (07:14):
When we come back, let's talk about Google's AI
and how it can find even more things about you
just by the photos that you have.

Speaker 4 (07:25):
Oh, we're gonna surprise you with that. You're gonna like it.

Speaker 3 (07:27):
Good surprise or bad surprise.

Speaker 4 (07:29):
It's kind of boring, you know. These big AI, they
all got great PR departments.

Speaker 2 (07:35):
Well, we'll talk about it next. Marsha Collier joins me
in studio on Later with Mo'Kelly.

Speaker 1 (07:40):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 3 (07:45):
And continue to be joined in the studio by Marsha Collier.

Speaker 2 (07:48):
Marsha, I understand that Google has extensive AI capabilities. In fact,
the Google Pixel nine, they feature it with AI
already in the hardware.

Speaker 3 (08:04):
But people are concerned about privacy. Yes, you have
a lot of stuff on your device.

Speaker 2 (08:10):
We talked about text messages and how those might be
used against us. What about our photos? We always worry
about uploading our photos.

Speaker 5 (08:18):
That's right.

Speaker 4 (08:19):
They have your photos and they can see everything. Everything, everything.
But unfortunately, it's everything, just like texting. Isn't that exciting?
Realize that AI is a computer. AI is looking to
build intelligence from your photos, nothing else.

Speaker 3 (08:41):
They are not Skynet.

Speaker 4 (08:43):
No, there are no Terminators, none of that. Not yet.
It's all very boring. Photos are only used to
train generative AI models to help people manage their image libraries.

Speaker 3 (08:56):
That's it.

Speaker 4 (08:57):
They analyze age, location, and photo subjects. That's it. It's nothing
to be afraid of. So, Mo, I pointed you to
a website called They See Your Photos dot com and
you uploaded a photo.

Speaker 2 (09:13):
Yes, and the slug is your photos reveal a lot
of private information. In this experiment, we use Google Vision
API to extract the story behind a single photo. And
if you were to go to my Instagram at Mister
Mo'Kelly and were to look at my profile photo. That
is the photo that I use for this experiment. It

(09:36):
shows me sitting basically on a television set, where I
was in a lounger chair.

Speaker 4 (09:42):
It shows a pleasant man in his forties.

Speaker 2 (09:45):
Yes, yes, which I'm not. I'm not pleasant, nor am
I in my forties. But we'll take it with a
coffee table in between two chairs, just like a typical
television set.

Speaker 3 (09:55):
Right, and here's the description.

Speaker 2 (09:58):
You can see the photo again at Mister Mo'Kelly,
my Instagram profile picture, and it reads the
photo as follows, quote: The foreground features a middle-aged
black man in a dark blazer and jeans (accurate),
seated on a light gray armchair (accurate). He has
a pleasant expression and appears relaxed (accurate). A

(10:18):
small dark brown coffee table sits in front of him
holding a small potted succulent. The background is a large
backdrop depicting a hazy, distant view of the Los Angeles
City skyline. It goes on, all of that is accurate,
and it details at least how well it can read
a picture, but there's nothing.

Speaker 4 (10:38):
But that's what they're doing, so that if they did
a search for offices with succulents in them, if this
was in your Google Photos, that might come up in
that description. And it is training the AI. I'm giving
this little example: it can see succulents on a table.
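
For the curious, here is a hedged sketch of what a request to the Google Cloud Vision REST API (the Vision API named in the experiment, via its `images:annotate` endpoint) looks like. Only the request body is built here; the image bytes are an illustrative placeholder and the actual network call with an API key is omitted.

```python
import base64
import json

def build_vision_request(image_bytes: bytes) -> dict:
    # Sketch of the JSON body that Google Cloud Vision's REST endpoint
    # (POST https://vision.googleapis.com/v1/images:annotate) expects.
    # The image travels as base64 text; "features" lists which analyses
    # you want back -- label detection here, the kind of analysis that
    # yields a "succulent on a coffee table" style description.
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "LABEL_DETECTION", "maxResults": 10}],
        }]
    }

fake_photo = b"not-really-a-photo"       # stand-in bytes, not a real image
body = build_vision_request(fake_photo)
print(json.dumps(body, indent=2)[:80])   # preview of what would be sent
```

The response comes back as a list of labels with confidence scores, which is exactly the kind of structured description quoted on air.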

Speaker 2 (11:00):
Well, that's what I learned. It was a plant to me, right?
You know, you're gonna call it a succulent or, I
don't know, a rhododendron.

Speaker 3 (11:07):
I don't know, but it was a plant.

Speaker 4 (11:09):
But I mean, people sent stock images to this Vision
API, and it's able to pick up subtle details in them,
like a person's tattoo, the initials, whether it's a picture
of a leaf. The whole point is that it's just
a single photo.

Speaker 3 (11:31):
Here's the concern: what happens after I uploaded that?

Speaker 2 (11:34):
Now that picture is gone, but it still retains the information,
the metadata, of that picture.

Speaker 3 (11:40):
How might that.

Speaker 2 (11:41):
Would that be used against me or someone else in
some way in a worst case scenario.

Speaker 4 (11:47):
Never have your picture taken doing anything you don't want
on the front page of the New York Times.

Speaker 3 (11:53):
Okay, so no compromising photos.

Speaker 4 (11:55):
Right. I mean, so what? So what if somebody has...
My dear aunt Anne died, and she was a champion
on Mastermind, which is a quiz show in England, kind of like
our Jeopardy and all that. She was all over the
news when she passed away, because she was in the
ultimate finals and it was a big deal. The BBC

(12:16):
was very nice, but news bureaus were picking up photos
of her off of my Facebook page.

Speaker 3 (12:23):
Yeah.

Speaker 4 (12:24):
I mean, the family wasn't overly thrilled about it, but
it was already out there, right? Exactly. And all it does is
help identify these photos. It is nothing to be massively
afraid of.

Speaker 2 (12:41):
Did you hear that, Mark Ronner? It's nothing to be
massively afraid of. You don't need to do any more fearmongering. Okay,
thank you, well, thanks for that. Appreciate that. Come on,
help me feel better.

Speaker 4 (12:53):
Now, yeah, come on, really. What kind of pictures are
you uploading that are that secret? Oh, it has
the geographic location of where you've been. You're obviously sitting
in a television studio. It's no big secret. I mean,
it's obvious, it's Captain Obvious.

Speaker 2 (13:11):
But we've been trained to guard our personal data. And
I'm arguing just the other

Speaker 4 (13:16):
side. I understand, and personal data is something totally different.
Personal data is talking about your birth year along with
your birthday. Now, Facebook is going to make a big
deal when it's your birthday and you're going to get balloons,
and lord knows, I can't tell you thank you all
four hundred and some odd people who wish me happy birthday.

(13:39):
I haven't had a chance.

Speaker 3 (13:40):
To thank you.

Speaker 4 (13:40):
Yeah. But the point is that information, the information you
put in your bios on social media, that's what you
have to watch out for because those can be used
along with the data that has been taken from the breaches. Funny,
you don't see the FBI protecting us from the breaches,
and the breaches are where the real damage is done

(14:04):
because if they have your Social Security number. And you
all went, I hope, to E-Verify to freeze
your Social Security numbers, and if you haven't, we can talk.
You did. Tell us about it. How hard was it?

Speaker 3 (14:18):
Three clicks? Maybe?

Speaker 6 (14:19):
Yeah?

Speaker 3 (14:19):
Right now?

Speaker 4 (14:21):
Mo's Social Security number, no doubt, was in a breach
and he was notified of it. The only reason you're
notified of it: you can go to a website called
Have I Been Pwned dot com. Sign up for that. They'll
send you notifications, and when you find out these things,
think about it. Somebody could take a job as Mo Kelly,
somebody who didn't have a Social Security number could get a credit

(14:41):
card, but then you've already frozen your credit bureau.
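
The breach-notification site mentioned here, Have I Been Pwned, also runs a Pwned Passwords service that is a nice illustration of privacy-preserving design: it uses k-anonymity, so your password (and even its full hash) never leaves your machine. A minimal Python sketch of the client-side step, with the network call itself omitted:

```python
import hashlib

def pwned_range_query(password: str) -> tuple[str, str]:
    # Pwned Passwords k-anonymity: hash the password with SHA-1, send
    # ONLY the first five hex characters to
    # https://api.pwnedpasswords.com/range/<prefix>, then compare the
    # remaining 35 characters locally against the suffixes returned.
    # The full hash, and the password, never go over the wire.
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = pwned_range_query("password")
print(prefix)  # the five characters that would actually be sent
```

Any suffix in the server's response that matches locally means the password has appeared in a known breach.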

Speaker 3 (14:45):
I'm just saying they can exactly.

Speaker 4 (14:48):
But if you're smart, and I know everybody listening out
there is smart, and these are not hard to do.
If you're having a problem with it, hit me up
on X just ask me and I'll send you a
link. As a matter of fact, there have been
a couple of celebrities who asked me, and I sent
them in a DM how to do it. But it's real simple.

(15:12):
You just go and you freeze your credit You don't
have to lock it. You can still use all your
credit cards. Everything is still good. But nobody can apply
for a credit card in your name. And if you
freeze your social Security number, guess what, nobody can impersonate
you to get a job and keep your reputation safe.

Speaker 3 (15:35):
This world is changing so quickly, so very fast.

Speaker 4 (15:38):
And we keep it in our own hands, not to
be afraid of all the baloney that we are fed
by the powers that be.

Speaker 2 (15:46):
In this last moment, I want to talk about what
you brought me when you came to the studio tonight.
What am I holding right now, Marsha Collier?

Speaker 4 (15:55):
Well, you're holding the new second edition of Android Smartphones
For Dummies, which, Mo, I think when you look at it
and get into it... First of all, it's in slightly bigger type.

Speaker 2 (16:07):
Yes, I'm not a senior yet, but I understand the
value of larger fonts.

Speaker 4 (16:11):
Oh yes, yes, oh yes, It's on nice paper. It's
in full color with lots of screenshots, and I use
my family as examples throughout the book because I don't
like to make up things to show people and illustrate things.
I like to show you my own way of doing it.
So you're seeing my phones. I'm logged into all the

(16:31):
phones in the book, and it will give you some
tips and tricks and some things you never knew about
your phone. And maybe next week we'll give you a
super tip.

Speaker 2 (16:42):
I see that you get the four-one-one
from Google Assistant. I showed my mother how to use
Google Assistant and voice to text, and you can't tell
her anything now she does it. She feels like she
has someone working for her in the house. She does
setting appointments, making calls waiting on hold all those things.
I'm like, this can't be my mother.

Speaker 4 (17:02):
But it's all in there, and people don't use it
if they don't know about it.

Speaker 3 (17:06):
That's right, and you're here to help them out.

Speaker 4 (17:09):
And we'll have the Tip of the week next week.

Speaker 3 (17:11):
That means I'm going to see you next week.

Speaker 4 (17:12):
You got a deal.

Speaker 1 (17:13):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 2 (17:22):
We're live everywhere on the iHeartRadio app. And I guess
this is more of a technology Thursday. There's a lot
of tech and advancements and communication and it's kind of
perfect that all these stories are happening on a Thursday. Now,
I want to take you to Fullerton College and get this.

(17:42):
Fullerton College is getting ready to launch its first bachelor's
degree program in school history, but for flying drones. Fullerton College,
you heard that right, is launching a Bachelor of Science
degree in drone and Autonomous Systems beginning in the fall

(18:04):
of twenty twenty six. The school just announced it two
days ago at a special event on campus, fittingly replete
with a drone light show. And I have been to
Fullerton college. I've spoken there on a couple of occasions.
I've had a fraternity brother friend who taught there for
many years, so I feel somewhat connected to the campus.

(18:27):
But there's someone here at KFI who knows the campus
far better than me, and it's producer Michelle Kube, and
she joins us on the show right now. Michelle, first,
let me congratulate you on Pastathon, and thank you for
coming on tonight on short notice.

Speaker 6 (18:44):
Oh, thank you very much for having me. I appreciate that.

Speaker 2 (18:48):
Let's get Pastathon out of the way, and I mean
that in a good way. I know we crossed the
one million dollar threshold earlier today, and I think we
have more than seven hundred seventy thousand pounds of pasta.

Speaker 3 (19:01):
Is that correct?

Speaker 6 (19:02):
Yes, seventy nine thousand pounds of pasta and sauce. But
I do know. I'll have another update tomorrow, but I
do know that there was a drop off today of
pasta and sauce that was I'm going to say more
than fifteen thousand pounds.

Speaker 2 (19:19):
Is it fair to say that we are still trending
ahead of last year at this time, Yes.

Speaker 6 (19:25):
We are trending ahead of last year. I think we're
about if I remember correctly, I think we're about thirty
to forty thousand dollars ahead of where we were last year.
And I'm really confident that the next couple of days
are going to be really good for us because we're
you know, the donations are coming in through Sunday.

Speaker 2 (19:44):
Donations through Sunday at Wendy's and Smart and Final locations correct.

Speaker 6 (19:50):
Yes, and also at KFI AM six forty dot com
slash pastathon. You can donate anytime there too.

Speaker 2 (19:56):
I started this segment talking about Fullerton College, how it's
going to launch its first bachelor's degree program in school
history in flying drones. You are an alum of
Fullerton College. Tell me about the school and what it
meant for you.

Speaker 6 (20:13):
I loved Fullerton College. I when I graduated high school,
I you know, wanted to go to a four year school,
but I you know, I wasn't quite ready, you know,
to do that, I don't think. And so when I
was looking for a community college, I you know, had
looked at a bunch of them, and when I settled
on Fullerton College, I had gone out there and just

(20:36):
kind of looked at, you know, the kinds of programs
that they had in the campus and it was just
a really nice kind of you know campus. It just
had a really nice feel to it. And so I
enrolled there and I wasn't exactly sure what I wanted
to do, but you know, I took the basic classes
that you take, you know, your first semester when you're
in you know, JC, all the basic stuff. But they

(20:58):
had a really interesting radio TV program. And what I
found interesting is that Fullerton College at the time had
a live radio station and you know, a live kind
of not a TV station, but a TV studio set
up to the nines, and you know, cal State Fullerton
did not have that at the time. And so I

(21:20):
started taking these radio and TV classes because originally I
wanted to work in TV and film. I wanted to
be a film editor. My uncle was a film editor,
and so I had to take radio and TV classes.
You couldn't just take one or the other. It was
a combination. So it was my first intro into radio,
and some of the teachers that worked on the radio

(21:43):
side really inspired me to kind of follow a path
into radio. When I took an internship class, I think
it was my third semester there, I was again still
trying to be film TV editor and I couldn't get
an internship at a TV station or a studio or

(22:04):
anything like that. And before dropping the class, my teacher
just said, why don't you go get a radio internship.
They'll take anybody. So that's how I ended up reaching
out to KFI because my dad listened to it, and
they hired me. Well, they took me on as an
intern and I was interning at KFI for almost a
year and I fell in love with radio. And it

(22:27):
was because of Fullerton College that I fell in love
with radio, and it turned out to be my career.
I've been with the station now, as you know, for
thirty years, and I would never have done that if
it weren't for Fullerton College and the tireless dedication of
the teachers there that really focused on the love of
radio and the love of television on both sides. But

(22:50):
at the same time, it was an incredible experience.

Speaker 2 (22:53):
Doesn't Fullerton College have extensive science

Speaker 3 (23:00):
coursework which is offered there?

Speaker 6 (23:02):
Yes, they do, and in fact, you know, I had
taken you know, several astronomy courses, you know, while I
was there too, and I really loved that too. But
I just kind of I fell in love with radio,
so I just kind of, you know, went that direction.
But yeah, they do have extensive, you know, science related classes,
and it's really amazing the amount of classes that they had,

(23:26):
you know, out there. And I, you know, ended up,
you know, like I said, I ended up at KFI,
ended up getting my, you know, AA degree there. I
never went on to a four year school because I
got the experience in radio so early and I learned
everything kind of on the job. But I don't regret
it because I just think, you know, Fullerton College was

(23:49):
such a full experience to me, it almost felt like
a four year school.

Speaker 3 (23:52):
Well it's a really really good school.

Speaker 2 (23:55):
And I'd say that as someone who knew some of
the professors there and had a chance to speak there.

Speaker 3 (24:00):
I think you're one of the most famous alums. Is
that correct?

Speaker 6 (24:05):
I don't know.

Speaker 2 (24:06):
I guess maybe, but I couldn't help but think of you
when I saw that Fullerton College is going to launch this.
It's not often that you see a community college
offering bachelor's degrees. And when Fullerton College is launching
its first bachelor's degree program in school history in flying drones,

(24:26):
I couldn't help but think of you, even though you
didn't come out of the sciences specifically. But Fullerton College
is again making a lot of noise in the world today.

Speaker 6 (24:36):
Yeah, it was amazing. When I saw the story this morning,
I was really proud. I just thought, Wow, this is
a really amazing thing for the school to do, and
you know, to give these students an opportunity who maybe
can't go to a four year school to get a
bachelor's degree while attending a community college. I think it's fantastic.

Speaker 2 (24:53):
Well, anytime we can shout out Fullerton College, I think
is a good thing to do.

Speaker 3 (24:59):
I want to salute you again, Michelle. You are... I call
you producer Michelle.

Speaker 2 (25:03):
You've long been known as producer Michelle, but you're
actually executive producer Michelle. You're executive producer for all the
shows here at KFI, which is indicative of your importance
to the station, and also the success that you've had
in growing the station over the thirty years that
you've been here. Your work with Pastathon speaks for itself.

(25:25):
But I don't think you get enough shine and enough
credit for all that you do behind the scenes.

Speaker 6 (25:30):
Well, I appreciate that so much, mo. I couldn't do
it without you guys. And I couldn't do it, you know,
without the support of all of you guys there in
allowing us to kind of, you know, grow something like
the Pastathon year after year. I couldn't do it without you.

Speaker 2 (25:46):
You're being very nice, you're being very humble, but you
deserve all the credit. Okay, just want to be clear,
none of this happens without you.

Speaker 3 (25:52):
Okay.

Speaker 6 (25:53):
Well, I adore you.

Speaker 3 (25:54):
I just showed up.

Speaker 2 (25:55):
at, you know, five thirty and did my show from
the Anaheim White House. I don't want to put that
anywhere near all the work that you put in.

Speaker 6 (26:03):
Well, thank you. I appreciate that very much.

Speaker 2 (26:05):
Well, thank you for coming on on short notice and
shouting out Fullerton College.

Speaker 3 (26:08):
And I'll be seeing you soon anytime.

Speaker 6 (26:11):
Thank you.

Speaker 1 (26:12):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 2 (26:18):
Some time ago, we told you about the controversy, how
there were some Wicked dolls which unfortunately included a link
to a porn website on their packaging. I was of
the opinion that it wasn't a mistake. I mean, the
odds are one in a million, right, you're going to
get to the wrong website and it just happened to

(26:39):
be porn. Well, the story has an update. According to
court documents, a South Carolina resident is launching a class
action lawsuit after purchasing the toy for her young daughter
who visited the X rated website that quote unquote had
nothing to do with the Wicked Doll.

Speaker 3 (26:58):
Of course it didn't.

Speaker 2 (26:59):
The toy company mistakenly listed a similarly titled website for
the adult entertainment site Wicked Pictures, rather than the official
page for the Universal Pictures film.

Speaker 3 (27:12):
The plaintiff, and this is key.

Speaker 2 (27:15):
The plaintiff alleges that Mattel didn't offer a refund and
believes that she and her child suffered emotional distress. Excuse me,
distress from the misprint. This is a quote from mom quote.
These scenes were hardcore, full on nude pornographic images depicting

(27:38):
actual intercourse.

Speaker 3 (27:40):
That is a lie.

Speaker 2 (27:42):
Okay, we're gonna conduct an experiment right now
in the KFI studio, Tawala.

Speaker 3 (27:47):
I need you to verify this.

Speaker 2 (27:48):
We're gonna go to the Wicked Pictures website live on air.
We're gonna do it live. Okay, we're going to the
Wicked Pictures website.

Speaker 3 (27:59):
Now, you didn't see this because I've already been to
the website.

Speaker 2 (28:02):
But the first time I went earlier today, it asked
whether I was actually eighteen years old.

Speaker 3 (28:07):
That was the first thing.

Speaker 2 (28:08):
Okay, so it had to verify age to get in.

Speaker 7 (28:13):
I see, I see this, this thing that's popped up
all right.

Speaker 2 (28:18):
Now, what I see there are pictures of titles.

Speaker 3 (28:24):
There is no nudity. Am I lying? No, there's no nudity.
there's no nudity.

Speaker 7 (28:28):
These are titles for films that you can peruse, that
you can delve into, and, I don't know, find your...

Speaker 3 (28:38):
Your choice.

Speaker 2 (28:38):
It's clear that this is not Wicked Universal Pictures.

Speaker 3 (28:43):
Very clear. It's very clear.

Speaker 2 (28:44):
You can't mistakenly think that, oh, okay, Wicked the Hunger
starring Emma Hicks, well, Wicked Loves Girls.

Speaker 3 (28:53):
You can't think of.

Speaker 2 (28:54):
No, you cannot, if you're in your right mind,
think that this is the right website.

Speaker 5 (28:59):
There is.

Speaker 3 (29:00):
There's no way you can just mistakenly.

Speaker 2 (29:03):
Happen upon hardcore scenes or actual sex taking place, not
on this website.

Speaker 7 (29:10):
Now, if this is what the mother is claiming, then
what has to have happened is she had to click
tasty Tina and her misadventures over here, and she would
have had to have opened up the video to see

(29:32):
it to say, oh my god, this is hardcore porn.

Speaker 2 (29:36):
The quote let me go back to the point. The
quote said these scenes.

Speaker 3 (29:40):
Plural plural were hardcore.

Speaker 7 (29:44):
That means she sent her daughter away put out some
r I don't know if he was watching wicked in
her tass.

Speaker 2 (29:51):
Full on nude pornographic images depicting actual intercourse, not on
the homepage. Not if you're just scrolling around, you clearly
had to go. Let's click on the scenes tab. Let
me see if there's anything there. Okay, clicked on.

Speaker 3 (30:04):
The scenes tab. You have to create your account.

Speaker 2 (30:07):
You can't arbitrarily just happen upon nude or sexual behavior.
She's lying, and she's she's gonna get paid because I'm
quite sure Universal Pictures just wants us to go away.

Speaker 3 (30:20):
Shame on you, but she's lying.

Speaker 7 (30:22):
I think that Universal Pictures should use this to prove
their point that this was a misprint, that there really
and truly is no harm, no foul. Because you're not
opening up a page and actually able to see porn.
You not only have to create an account, but you
have to upload a credit card,
because right there, I'm seeing a paywall. There's a paywall.

(30:45):
That's the next thing. You have to create
an account. There's a paywall. So you're saying that your
daughter went onto this website, proved she was eighteen, created
an account, uploaded one of your credit cards, and then
watched this hardcore porn? Oops. Case closed, your honor.

Speaker 3 (31:05):
I don't know what to say. Look, dismissed, dismissed, defense
rests. You know, this is... it was...

Speaker 2 (31:13):
It's literally impossible to accidentally stumble upon any pornographic content
on this website. You have to age-verify, and you
have to log in, create an account, and
put up a credit card before you see any type
of sexual activity or nudity content.

Speaker 3 (31:32):
How can you quantify the trauma that these poor people endured?
How much do they want? Now see,

Speaker 7 (31:38):
I can understand if, after the mother did all this,
to say, I need to see what you're talking about,
and then invited her young daughter over to say, now you see,
I don't want you watching any of this, and just
kept it going, then yeah, then I think what we
need to do is we need to call child Protective
Services and say there is a woman who is lying

(31:59):
about her porn addiction and trying to blame it on
Mattel, and has a Wicked account of her own.

Speaker 2 (32:06):
You cannot accidentally stumble across hardcore scenes, nudity or sexual
content by just going to the website. You have to
be an active participant, seek it out, sign up.

Speaker 7 (32:22):
Let me also just say real quick that this has
got to be some amazing porn for anyone to have
to sign up, create an account, give you all my information,
and then I'm going to pay to watch.

Speaker 3 (32:35):
These have to be... You don't get to see anything.

Speaker 7 (32:40):
All you get is a title and, you know, uh,
a scantily clad... you know, I mean scantily

Speaker 2 (32:47):
clad, à la maybe a Target ad. I mean, this isn't
like, whoa, they have lingerie. It's not like they're
even partially nude.

Speaker 7 (32:55):
There are more tantalizing pictures on Instagram.

Speaker 2 (33:02):
Oh, by a lot. This is, by a lot, ridiculous.
It looks like the Hallmark Channel. The way they're positioning,
that's the wrong word, presenting this stuff.

Speaker 3 (33:12):
The way they're laying it out. Yeah, yeah, it's it's
an incredible spread. I'll say that. The way they're building
this up.

Speaker 5 (33:20):
We're all gonna take a break, okay? All right. KFI
AM six forty, live everywhere on the iHeartRadio app.
What you need to know and when you need to
know it. KFI AM, KOS

Speaker 7 (33:33):
T HD two, Los Angeles, Orange County, live everywhere on
the iHeartRadio app
