
August 22, 2024 31 mins
ICYMI: Hour Three of ‘Later, with Mo’Kelly’ Presents – An in-depth analysis of the most viral stories of the week in “The Viral Load” with regular guest contributor Tiffany Hobbs weighing in on everything from the "very mindful, very demure" viral TikTok user Jools Lebron, to the octogenarian owners of a laundry shop who have become Instagram stars for posing in garments left behind…PLUS – A look at why researchers are calling for AI to receive Personhood Credentials - on KFI AM 640…Live everywhere on the iHeartRadio app
Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo'Kelly on demand from KFI
AM six forty.

Speaker 2 (00:13):
Now show.

Speaker 3 (00:22):
Thanks, social media. Facebook, X, TikTok.

Speaker 1 (00:26):
Viral load, viral load for viral load, ny.

Speaker 4 (00:36):
KFI AM six forty. It's Later with Mo'Kelly,
live everywhere on the iHeartRadio app. It's now
time for the Viral Load with Tiffany Hobbs.

Speaker 3 (00:44):
Let's do it.

Speaker 2 (00:45):
And you guys have been talking a lot tonight about
viral stories.

Speaker 3 (00:49):
I was listening.

Speaker 2 (00:49):
I am a fan of the Mo'Kelly Show, and
so I heard you talking about all things viral. These
stories are a bit of a diversion from the current
political juggernaut that is the Democratic National Convention.

Speaker 3 (01:03):
So let's get into it.

Speaker 2 (01:04):
The first one is a social media post that is
going viral, and here's the thing. It actually happens to
have originated a couple of months ago. We have a
few stories tonight that aren't necessarily current but are seeing
kind of relevance now because of how social media works.

(01:24):
This story specifically claims sadly that a deceased Pennsylvania father
is missing months after his death. Here's the deal. There's
a man, his name is Brian Posch. He was thirty six,
and he was found dead inside of his submerged truck

(01:44):
back in April of this year. This happened in the
Brighton Township area and the police department did all of
the reconnaissance, found out what happened, and on April fifth,
they were able to determine what actually caused Brian
Posch's death. After that unfortunate incident, that picture of Brian

(02:08):
Posch that went viral at the time that he had
died because of the unfortunate circumstances started to take off again.
The picture has been posted numerous times, thousands of times
across social media. And what's happening is the current captions
are saying that there's a man and they use different names,

(02:29):
and they're saying that this man by this name with
this same picture is currently missing, and they're assigning different
causes to his apparent missing status or the conditions of
what's going on with him. What's also happening is that
there are crowdfunding sites that are being used to capitalize

(02:52):
off of this scam, and people are asking for money
to support this family, this phantom family. All these things
are being assigned to this picture that originated back in
April due to a very real, unfortunate incident,
and it really shows how damaging and dangerous social media

(03:12):
can be, because when you see a picture, you see
a story. If you don't have that media literacy, you
don't really know just how real and true the story is.

Speaker 3 (03:23):
People don't care. They're not trying to be accurate, and the.

Speaker 2 (03:26):
Problem with accuracy is that it can again open people
up to scams and money laundering and fraud. And people
have been putting money into these various crowdfunding websites in
hopes of supporting these people or this person they think
is still missing or whatever the unfortunate circumstance is attached
to the picture. His family is also being re-victimized

(03:49):
over and over again because they're seeing their loved ones
picture plastered all over social media, so they've spoken out.
Various social media users have spoken out about it, but unfortunately,
as virality goes, these sorts of posts don't go away.
The next story is a lot more positive. It deals

(04:10):
with a Louisville teenager who was caught doing something very ordinary,
very mundane. This young man, Thaddeus Mason, a Louisville native,
and a student of the Jefferson County Public School System
was just walking out of a library one day. He
was wearing an orange outfit. He was walking down the

(04:31):
street on the side of cars, and he was.

Speaker 3 (04:34):
Holding a book. Unbeknownst to him.

Speaker 2 (04:38):
Someone snapped a picture of Thaddeus and posted the picture
with the caption that read, you definitely don't see this
every day. A very rare sighting of a young'n
just walking and reading a book. That photo was shared
all over social media.

Speaker 4 (04:58):
A person reading a book went viral? What is this,
like the movie Demolition Man, where they don't know what
toilet paper is or something?

Speaker 3 (05:06):
That's exactly what it's like.

Speaker 2 (05:08):
The fact that this again very ordinary, very mundane, very
usual act of reading a book, would go viral just
really shows you how skewed reality is. And I imagine
that the poster, the original poster, feels that teenagers don't
necessarily read books like that, or he doesn't necessarily see

(05:31):
teenagers walking down the street engaging in just normal reading activities.
Maybe they're always on their cell phones or some sort
of technology. And he felt the need to share it
on social media, and because so many people agreed with
its caption, the photo itself went viral, shared thousands of times
over. Thaddeus himself, and this is where it became a
little concerning to me, said, I did not

(05:54):
know who took the picture of me. I was walking
down the street. I was just walking down the street
reading a book. And then someone showed him the picture.
And now he knows as well that he is a
viral sensation. But I wonder about his privacy. There are
lots of implications being made about his orange outfit, his race,

(06:15):
all sorts of things.

Speaker 3 (06:16):
So it's problematic on many ends. And also I.

Speaker 2 (06:19):
Guess somewhat refreshing to see a young person reading a book.

Speaker 4 (06:22):
One thing I would pull out of this is I
tell young people all the time there's somebody always watching.
When I say somebody, it could be a camera, it
could be a person just viewing you and observing you,
watching you thinking that either you're going to get into
trouble or something like that. There's somebody who's always watching.
This is a perfect example of that. Now, the young

(06:43):
man was just reading a book, but I'm quite sure
any other time he was being watched as well, just
it might've had a different result or a different assumption
was being made.

Speaker 2 (06:52):
Absolutely and again, you never know what people are going
to do with what they observe of you. And in
this case, a picture was taken and now he is
all around the world. The next story we're gonna skip to.
It's a shorter story because that story when we come back,
is so meaty. So this next story is a gag.
It's a joke, and it's hilarious. There's the Guggenheim Museum

(07:16):
in New York. It's a museum of art, abstract art, installations.

Speaker 3 (07:22):
All sorts of things.

Speaker 2 (07:23):
And last week a visitor at the Guggenheim Museum pulled
a stunt that really exposes just how naive audience members
can possibly be, or attendees to the museum might be.
The visitor discreetly removed one of their shoes. The shoe
was a white, kind of dirty white Chuck Taylor all

(07:46):
Star sneaker. Familiar, Mo? Familiar.

Speaker 3 (07:50):
I had a bunch of pairs of them Chucks. Who
hasn't? Well.

Speaker 2 (07:53):
The visitor took the shoe off and put it just
kind of against a wall, a blank wall, and just
left the shoe there. Then the person stood back, watched
people walk by and watched them observe this shoe, this
isolated shoe from a secret area of this particular room

(08:16):
in the museum. While watching people, the person observed that
the audience was kind of, you know, chattering about the shoe,
making observations about the.

Speaker 3 (08:27):
Color and the laces.

Speaker 2 (08:29):
Basically, the audience thought that the shoe itself was an
art installation or some sort of piece that belonged in
the Guggenheim.

Speaker 3 (08:40):
Yeah, I know.

Speaker 2 (08:41):
And then the person with the shoeless gag. Was it
roped off? No, it wasn't even roped off. It
was just out in the open. And so people, again
are that naive that they thought it was something legit.
Then the person took a picture, of course, took video,
uploaded it to various social media platforms, where it went viral,
because again it showed just how naive we can be

(09:03):
when we're at these museums.

Speaker 4 (09:05):
When we come back, the second portion of the Viral Load
with Tiffany Hobbs, and there's a movie tie-in to it.

Speaker 3 (09:13):
She doesn't know it, but I'm quite sure Mark Ronner
knows it.

Speaker 1 (09:16):
We'll tell you about it in just a moment. You're
listening to Later with Mo'Kelly on demand from KFI
AM six forty.

Speaker 3 (09:27):
Now it's sucking my room moment Tiffany live on camp
Ladies WI mo o Kelly. She'll talk about the time this.
I'm social media room alone with Tiffany.

Speaker 4 (09:43):
Hobbs. KFI AM six forty, Later with Mo'Kelly. Tiffany,
take it away. Mo.

Speaker 3 (09:49):
Do you know what demure means?

Speaker 4 (09:52):
I got one better for you, Mark Ronner? If I
said demure to you, what likely comes to mind?

Speaker 3 (10:02):
There are a couple of meanings. Pardon me, I'm eating?
Oh okay, well, let me let me take this.

Speaker 4 (10:09):
Whenever I hear demure, I think about the only reference
point there is to demure. And there's a cinematic tie in.
And so when I saw demure was breaking out on
social media, I just assumed it had to do with
Total Recall.

Speaker 5 (10:25):
I'll be asking you some questions, Doug, so we can
find you the right Ego program.

Speaker 3 (10:29):
You answer honestly, you'll enjoy yourself a whole lot more.

Speaker 5 (10:32):
What's your sexual orientation? Hetero. Huh, how do you like your women? Blonde, brunette, redhead? Brunette. Slim, athletic, voluptuous, demure, aggressive, sleazy?

(10:53):
To be honest, sleazy.

Speaker 3 (11:02):
I've never heard the word used in any other context
in life until now.

Speaker 2 (11:07):
Because the word demure has gone completely viral. It is probably,
besides these national conventions, one of the biggest viral stories
across all of social media.

Speaker 3 (11:22):
And what am I talking about.

Speaker 2 (11:24):
I'm talking about a social media user by the name
of Jools Lebron, who has popularized the phrase in a
series of videos.

Speaker 6 (11:34):
Time out for What to Watch. We're apparently going to
go through the meaning of the word demure.

Speaker 7 (11:39):
Is that what's happening?

Speaker 3 (11:40):
This is what's happening. You guys have been asking me
about this all morning long.

Speaker 4 (11:44):
In the stories that we share in What to Watch,
we try to be very demure, very mindful, okay, and
very knowledgeable.

Speaker 3 (11:52):
Okay.

Speaker 6 (11:52):
We try to make sure our audience understands what's happening
on TikTok on social media.

Speaker 3 (11:56):
So this is what is happening right now.

Speaker 6 (11:58):
Watch this, folks. Okay, you see how I do my makeup
for work? Very demure, very mindful. I don't do
too much. I'm very mindful while I'm at work. See how
I look very presentable. A lot of you girls go
to the interview looking like Marge Simpson and go to
the job looking like Patty and Selma. Not demure.

Speaker 3 (12:17):
I'm very demure. Yes, so that is the origin of this
viral sensation.

Speaker 2 (12:26):
Yes, not Total Recall, not Total Recall, not Arnold, not Arnold,
but I'm sure he would appreciate how the word has
taken on new meanings. So Jules Lebron is a very
popular kind of lifestyle uh social media user. She gives
tips on makeup. She is a motivational speaker online. She

(12:48):
helps anyone navigate their personal struggles because she's very transparent
about her own. And actually, producer Lindsay is very knowledgeable.

Speaker 7 (13:00):
Yeah, even before, like, she blew up,
like when she was just doing small stuff.

Speaker 3 (13:06):
Really? TikTok? Who is this? I thought it was like
Lebron James's cousin or something. Cousin? Not quite. Twenty thousand
leagues under the sea relation.

Speaker 6 (13:16):
Not no, well, not that we know of.

Speaker 2 (13:18):
But I'm not sure because at this point everyone knows
who she is, from Jennifer Lopez to different beauty brands.
I don't know what that means. Everyone, your favorite Jennifer Lopez
and everything in between. So the TikToker, who blew up
on TikTok, uses terms to describe the level of appropriateness
of makeup and fashion choices in a variety of settings.

(13:40):
And like you heard in that clip, Jools Lebron is
talking about how people dress when they go to work
or go to interviews, and she uses very mindful, very demure,
very knowledgeable. That word demure was taken out and now
applied to many different situations across social media where people
are using the word wrong, and the story goes viral because

(14:03):
it originates here.

Speaker 3 (14:04):
But now demure is being.

Speaker 2 (14:06):
Used to describe your water bottle, or to describe your car,
or anything that you want to give some sort of
personification to. And as of this time, that TikTok video
has over thirty five million views.

Speaker 3 (14:25):
Thirty five million.

Speaker 2 (14:27):
So you guys were talking about Beyonce and Taylor Swift
showing up tomorrow at the Democratic National Convention, don't be
surprised if Jools Lebron somehow makes some sort of appearance
or there's some reference to very demure. I'm calling my
shot, very demure.

Speaker 3 (14:44):
I think she lives in Chicago.

Speaker 2 (14:45):
Actually, oh yeah, even better, so we might see this
take on political legs because she's been tapped to be everywhere,
and as of this time, despite the major interest that she's sparked,
we don't know who or what she's
partnering with, but expect to see a lot more of

(15:06):
Jools Lebron in the very near future.

Speaker 3 (15:08):
She's very demure, she's very cute, she's very knowledgeable.

Speaker 4 (15:11):
Mo, how does knowledgeable get in there? Is that like
an additional modifier or is that supposed to be part
of the definition of demure?

Speaker 3 (15:21):
Mo, don't question Jools Lebron. That's not very demure. It
doesn't make sense. Where's that meaning? We're moving along.
We're going to be demure and very knowledgeable. Thank you, Arnold.

Speaker 2 (15:35):
I have one last kind of fourth quarter story to
throw in and it does have to do with the
Democratic National Convention. If you noticed on Monday that people were
wearing cowboy hats in the crowd, there was a photo
that went viral of a big section of convention
attendees wearing cowboy hats. They were bedazzled, they
were glittery, they were all different colors. And that is

(15:58):
a direct correlation to Beyonce and her latest album. Yes,
because the representative from Texas who was there is a
huge fan of Beyonce and said that she wanted all
of her supporters to wear cowboy hats in homage to Beyonce.
So when you see these cowboy hats, that's what that is.

Speaker 4 (16:21):
I don't see a world where the DNC would miss
this moment, where they would not have Beyonce and Taylor Swift.
I would be disappointed, from a political strategist's standpoint, if
they left that on the table and didn't seize that moment.

Speaker 3 (16:39):
It would be very unknowledgeable and very not demure. Very demure.
It's Later with Mo'Kelly, KFI. Very demure.

Speaker 4 (16:53):
AM six forty, live everywhere on the iHeartRadio app.

Speaker 1 (16:57):
You're listening to Later with Mo'Kelly on demand from KFI
AM six forty.

Speaker 4 (17:12):
Technology is moving so so fast, so so fast. It's
Mo'Kelly, KFI AM six forty. Have you ever
been online? Of course you've been online, but have you
ever like gotten into a chat and you had to
pause for a moment.

Speaker 3 (17:24):
You're asking for help and the chat bot will come
up and it's like, am I talking to a person?
Is there an actual person helping me?

Speaker 4 (17:32):
Or is it a chat bot with limited responses and
it's scripted where it can only say you know one
of five things in answering questions, and you realize, oh,
this is a chat bot.

Speaker 3 (17:42):
Well, they're getting a lot better.

Speaker 4 (17:44):
So much so that it's almost indistinguishable in an online sense.
And a group of thirty two researchers from OpenAI, Microsoft, Harvard,
and other institutions are calling on technologists and policymakers to
develop new ways to verify humans without sacrificing people's privacy

(18:05):
or anonymity. I'm sorry, but whenever I see news stories,
I either think of a corresponding commercial or a corresponding movie.
And the moment I saw this story, I thought of
this commercial.

Speaker 3 (18:21):
Discover customer Service.

Speaker 2 (18:22):
This is Maya.

Speaker 3 (18:23):
Oh hi, Maya. You robots are sounding more human every day.

Speaker 8 (18:27):
Oh I am human?

Speaker 3 (18:29):
Like I'm talking to a human.

Speaker 9 (18:31):
At Discover, everyone can talk to a human representative.

Speaker 3 (18:33):
All right, prove it?

Speaker 9 (18:34):
Wait?

Speaker 3 (18:35):
Are you a robot?

Speaker 5 (18:37):
Oh?

Speaker 3 (18:40):
How would I prove that I'm not? Twenty-four-seven,
US-based customer service. Do you feel like a robot?
It pays to Discover.

Speaker 4 (18:47):
Everybody's seen that commercial. It was a big Super Bowl commercial.
But it relates back to this story of how researchers
want some sort of protocols set in place so we
know if and when we're dealing with actual robots.
Now there's also a cinematic tie in, and I know
Mark Ronner knows this one.

Speaker 3 (19:07):
If you think of maybe Philip K.

Speaker 4 (19:08):
Dick and the short story of Do Robots Dream of
Electric Sheep? Androids? That's right, Do Androids Dream of Electric Sheep?
Which was the source material for Blade Runner. This is
it's like, you know, is it life imitating art
or art imitating life? Where now we're having these serious
discussions of making sure that we can identify, you know,

(19:33):
science from the fiction.

Speaker 2 (19:34):
So I've been apartment hunting and one of the things
that I do in the meantime is I read reviews
about the apartment complex. And at one of the apartments
I was looking at, I emailed back and forth to schedule
tours and so on. Well, I found out that who
I was emailing with was a virtual assistant. And the
virtual assistant is so lifelike that I didn't know until

(19:55):
I happened to scroll all the way down to the
bottom of the email and get to the signature,
where there was a very tiny, tiny print that
said that it was a virtual assistant. Now I went
online and I'm reading reviews. People online were just upset
with this virtual assistant and arguing about how the virtual
assistant was rude because they thought it was a real person. Oh,

(20:18):
this person is so rude and they're not responding, or
that they don't you know this, that and the third.
But it was indistinguishable whether or not it was the
virtual assistant or a real person. I happened to know
because I did the investigative work, which I didn't have
to do. I happened upon it. But for so many
they thought they were talking to a live person.

Speaker 4 (20:38):
The technology is way beyond what any of us
could even imagine six years ago.

Speaker 3 (20:44):
And I know I talked about this before.

Speaker 4 (20:45):
Six years ago, Google was debuting a version of their
AI Assistant, which could call businesses and make reservations on
your behalf. Listen

Speaker 8 (20:56):
To the assistant. As I said earlier, our vision for
our assistant is to help you get things done. It
turns out a big part of getting things done is
making a phone call. You may want to get an
oil change scheduled, maybe call a plumber in the middle
of the week, or even schedule a haircut appointment. You know,

(21:17):
we're working hard to help users through those moments. We
want to connect users to businesses in a good way.
Businesses actually rely a lot on this. But even in
the US, sixty percent of small businesses don't have an online
booking system set up. So if you ask the Google Assistant to make you
a haircut appointment on Tuesday between ten and noon, what
happens is the Google assistant makes the call seamlessly in

(21:41):
the background for you. So what you're going to hear
is the Google assistant actually calling a real salon to
schedule the appointment for you.

Speaker 3 (21:51):
Let's listen.

Speaker 9 (21:57):
How can I help you? Hi, I'm calling to book a
woman's haircut for a client. I'm looking for something on
May third. Sure, give me one second. Mm hmm, sure.
What time are you looking for? Around twelve pm?
We do not have a twelve pm available. The closest

(22:20):
we have to that is a one fifteen. Do you
have anything between ten am and twelve pm depending on
what service she would like?

Speaker 3 (22:30):
What service is she looking for?

Speaker 9 (22:32):
Just a woman's haircut for now. Okay, we have a
ten o'clock. Ten am is fine. Okay, what's your first name?
The first name is Lisa. Okay, perfect, So I will
see Lisa at ten o'clock on May third.

Speaker 3 (22:47):
Okay, great, thanks, great, have a great day.

Speaker 4 (22:50):
Bye. That demonstration was in twenty eighteen. There's no telling
where we are now in twenty twenty four. Yeah, that
always blows my mind. And who showed me that clip
when it came out was my dad and he was like,
where are we headed?

Speaker 10 (23:09):
You know, like this is insane. And I think there's
another one. I thought you were going to play the
other one where I think they call a Chinese restaurant. Yes,
we could barely understand, but the AI was able to
pick up everything, switch the appointment.

Speaker 4 (23:21):
But I love that, right, just the little inflections of
natural language that we have. So we probably have been
fooled dozens of times.

Speaker 3 (23:32):
We just don't know.

Speaker 4 (23:34):
We weren't expecting it, we weren't savvy enough to identify it.
But yeah, look, all we need now is the full body.
We can have sex bots. Oh.

Speaker 10 (23:46):
I want you to take a Voight-Kampff test at the end
of the show to prove you're human. What's the difference
between that and a Turing test? Well, one's more
entertaining and you get blown to Kingdom Come at the
end of it, based on the movie that we all know.

Speaker 4 (24:02):
Okay, well, there are plenty of movies which we could
insert into this discussion. And if you want to see
a great movie on this particular discussion, you have to
see Ex Machina.

Speaker 3 (24:13):
Oh, that's really good. That's an Alex Garland movie. He's
the guy who made that Civil War movie we both liked. Yep.

Speaker 4 (24:18):
Ex Machina deals with the idea of whether AI in
the physical form is self aware and what tests we
would administer to be able to discern whether the robot
is sentient, has a consciousness all its own.

Speaker 3 (24:39):
It's a great movie.

Speaker 4 (24:39):
If you haven't seen it, I'm not going to spoil
it for you, because there's a great ending on that movie.

Speaker 10 (24:45):
Yeah, it's definitely worth seeing. It's a little lower
key than maybe what we're used to from him, but
it's brilliant, and it stars Oscar Isaac and, what was
the guy from Star Wars, Domhnall?

Speaker 9 (24:57):
Yeah?

Speaker 10 (24:57):
Yeah, I'm never sure how to pronounce his first name. Yeah,
I just say Domhnall. Okay, it's either right or wrong,
just one of the two.

Speaker 4 (25:05):
But Ex Machina, highly recommend. And yeah, we're going to
have to get to that point where we do get
the personhood credentials, where people prove offline that they
physically exist as humans and
receive an encrypted credential that they can use to log
in for a wide range of online services.
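[Editor's note: the credential flow described here, prove offline that you physically exist as a human, then present an encrypted credential to online services, can be sketched in miniature. Everything below is an illustrative assumption, not any real scheme: the function names are invented, and a simple HMAC tag stands in for the real cryptography, whereas actual personhood-credential proposals typically use blind signatures or zero-knowledge proofs so that logins cannot be linked back to the holder.]

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch of the "personhood credential" idea:
# an issuer verifies a person offline, then hands them a signed token
# that online services can check without learning who the person is.
ISSUER_KEY = secrets.token_bytes(32)  # secret held by the hypothetical issuer

def issue_credential(issuer_key: bytes) -> str:
    """After an offline humanity check, mint an anonymous signed credential."""
    nonce = secrets.token_hex(16)  # random ID, deliberately not tied to identity
    sig = hmac.new(issuer_key, nonce.encode(), hashlib.sha256).hexdigest()
    return f"{nonce}.{sig}"

def verify_credential(issuer_key: bytes, credential: str) -> bool:
    """A service checks the signature; it learns nothing about the holder."""
    try:
        nonce, sig = credential.split(".")
    except ValueError:
        return False  # malformed token
    expected = hmac.new(issuer_key, nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

cred = issue_credential(ISSUER_KEY)
print(verify_credential(ISSUER_KEY, cred))         # → True
print(verify_credential(ISSUER_KEY, "fake.cred"))  # → False
```

[The point the researchers stress is captured even in this toy version: the service only learns "a verified human holds a valid credential," never which human.]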

Speaker 3 (25:24):
Because imagine this.

Speaker 4 (25:26):
If they can mimic people's voices, and they can, and
they can do it well enough where it sounds fluid
and natural, and they can, what's to stop them from taking
your voice and your information that they can gather in
the ether because they have data breaches everywhere, and then
call your credit card company or call your bank as

(25:48):
you and provide the information that they've stolen about you.

Speaker 3 (25:54):
That's the future, I should say, that's the now.

Speaker 10 (25:56):
The rule is that whatever the worst possible thing you
can imagine being done with technology, it's going to get done,
and it will be done. There's no question about that.
You can call it Mark's Law if you want. Mark's Law, okay, sure, okay,
all right. Later with Mo'Kelly. How long did it
take you to come up with that?

Speaker 3 (26:15):
That was a throwaway. You can have it.

Speaker 1 (26:17):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 4 (26:22):
Okay, before we get out here, got to remind you
Later with Mo'Kelly is available anywhere. Of course, you
can get it on the iHeartRadio app, but we also
now are on YouTube. We're also on Spotify, iTunes, Apple podcasts,
anywhere you want to hear this great show. Thanks to
Tawala Sharp, because he puts in a lot of work

(26:45):
actually during the show and after the show to
make it bite-sized and easily listenable, no commercials, and
you can enjoy just about anywhere you get podcasts. So YouTube, Spotify,
the iHeartRadio app, Apple Podcasts, wherever you get your podcasts.
Where do you get your podcasts, Mark?

Speaker 10 (27:06):
I have a podcast app on my phone and on
my iPad, and I listen to them pretty frequently.

Speaker 4 (27:12):
Well, it's there as well, thank you. Tiffany, where do
you get your podcasts? On the iHeartRadio app? Oh well,
it's definitely there. Stefan, where do you listen to podcasts?
Not where in the house, but on what app?

Speaker 7 (27:24):
Podcast app on my phone? And I do use the iHeart
Radio app as well. Okay, what's the podcast app? We
need specifics. It's just, it comes on the phone standard,
because podcasts became such a big thing a long time ago.
So it's bloatware. It's actually, it's literally called Podcasts. Oh,
you iPhoners, I forget.

Speaker 3 (27:42):
I forgot.

Speaker 4 (27:43):
Okay, they have an app that's built into the
software called Podcasts.

Speaker 9 (27:47):
Yep.

Speaker 3 (27:48):
It's like we're living in the future.

Speaker 8 (27:49):
Mo.

Speaker 10 (27:50):
That's actually one of the reasons, straight up. That's one
of the reasons I couldn't stay with Android when it
first came out, because I was into podcasts, you know,
before it was cool, and.

Speaker 4 (27:58):
They had no way of listening to it. I just
had to go online. They were difficult. Yeah, you had
to listen like from the website.

Speaker 10 (28:04):
Yeah, and I was like, nah, nope. Yeah, soon our
iPhones will have Voight-Kampff tests, just so you know,
right around the corner. In one of those next big
announcement meetings, they're gonna announce that.

Speaker 3 (28:15):
Okay, all right, all right, just want to make sure. Lindsay,
how do you listen to Later with Mo'Kelly? On
the iHeartRadio app, of course?

Speaker 10 (28:23):
Woman?

Speaker 3 (28:26):
Okay, but I actually listen on Spotify.

Speaker 4 (28:30):
You're the one person who listens on Spotify. Yeah, okay,
all right. Well, you can listen to it anywhere and everywhere.
And I was thinking, since Tawala isn't in the room,
I'll ask all of you. We're getting ready to go
on a cruise, like ten of my family members and
Tawala and his son. We all know he has reservations

(28:51):
about the ocean. I was thinking about maybe chronicling the voyage,
checking in with him, chronicling when he first walks the
gangway onto the ship, like a video journal, Tawala on
the open seas.

Speaker 3 (29:07):
I would love that, especially since you have YouTube, you
can utilize that part of it.

Speaker 9 (29:10):
Yeah.

Speaker 3 (29:11):
Yeah, I think we're going to do that. He hasn't
run in here yet, so I guess he can't hear
us talking about it.

Speaker 10 (29:16):
Would this be in any way similar to just theoretically
recording him if he fell asleep while you were in
someplace together.

Speaker 4 (29:22):
No, no, I wouldn't do that. This is actually, he
would be interacting with me. I'm getting his thoughts. Okay,
all right, clarification, It's not like the Truman Show. It's
not going to be voyeuristic. Okay, It's like, so what
do you think?

Speaker 5 (29:33):
You know?

Speaker 4 (29:34):
Are you feeling okay? Is it what you thought? What
you expected? And for those who don't know, Tawala is
not big on water, not at all. I mean, I
assume he showers, but he doesn't like going out.

Speaker 3 (29:44):
On the water.

Speaker 2 (29:45):
I vote that you do a portion of it like
the Real World TV series did when they had the
confessional rooms where you just put Tawala in a room
and have him just kind of share how he's feeling
and then do some voiceovers.

Speaker 3 (29:59):
You got to at all this. Yes, it's a lot
of damn ironically give it to Yes, you want like four.

Speaker 2 (30:08):
Cameras, make it happen, special effects, make it happen.

Speaker 3 (30:12):
Do we have a key grip? Do you have a fluffer?
What do you want us to do with this fluffing?
Can we say that?

Speaker 8 (30:19):
Foosh?

Speaker 3 (30:19):
We can? Yeah, we can say it. What kind of cruise
is this? There are sex cruises, and it sounds like
one of those. That's crazy.

Speaker 4 (30:27):
Yeah, there are sex cruises. Don't ask me how I
know that, but I'm just saying, they exist. I
haven't been on one.

Speaker 10 (30:35):
For all the stuff that you can catch on just
a regular cruise, multiply that by what on a sex cruise.

Speaker 4 (30:41):
Well, from what I understand, most of the surfaces
are covered.

Speaker 3 (30:45):
It's not like you can just walk around and start
bumping uglies. Anywhere and everywhere.

Speaker 10 (30:51):
Services? Like what, penicillin? What are you talking about? Surfaces? Oh, surfaces,
not services. Well, services is a different matter. Now that, okay,
I had to make it weird. You put this on me.

Speaker 3 (31:02):
Don't blame the victim. Very demure. Well done, well done.
KFI AM six forty, live everywhere on the iHeartRadio app.

Speaker 1 (31:14):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.
