
December 4, 2025 · 11 mins

Today, we talk about different uses of AI and if we would use it to generate images of those who passed.

 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Put your hands together. We're gonna start, and I'm
ready to party.

Speaker 2 (00:07):
The Elvis Duran After Party.

Speaker 3 (00:17):
He wants to talk about AI. Me, sure, maybe, how
do we know this isn't AI?

Speaker 4 (00:24):
Don't. You can't. You can't trust anything anymore. I'm addicted
to the Sora app, man.

Speaker 3 (00:33):
Those deepfakes get me every time. They get me every
single time. I look at it and I say, oh
my god, I can't believe that.

Speaker 5 (00:40):
Ah, wait, why does it have seventeen fingers?
That's the tell for the moment, but it'll probably go away.

Speaker 3 (00:47):
It's getting smarter and smarter and smarter every time you.

Speaker 1 (00:50):
The voices and the faces are on point, you know.
The Jake Paul stuff, I don't know if Jake Paul
released a video or is that AI Jake Paul.

Speaker 4 (00:59):
At this point it's probably both.

Speaker 2 (01:01):
We all like it until it's us in those, which...

Speaker 5 (01:05):
Scotty sort of has experience with this, because, you know,
we've been talking forever about people ripping off his likeness
to send people requests for money, and all these people
are getting duped. But there's a new layer to it.

Speaker 6 (01:18):
Yeah, I don't like any of this, I really don't. So
there are women that are sending me videos like, is
this you? And I'm like, no, it's a video of mine.
It's me, but my mouth is moving and it's different.
It's a different voice. Hello baby, I am standing here
in the middle of the ocean waiting for you. I

(01:38):
need money for a flight. I'm like, that's not me,
but it's me. Yeah, and it's scary as all hell,
and I hate it. We're in big trouble.

Speaker 5 (01:46):
Didn't somebody send you a video that was allegedly you speaking?

Speaker 6 (01:52):
I never saw the video, but someone did say I
got masturbation videos of you.

Speaker 2 (01:56):
Oh my god, why are you sending those out?

Speaker 4 (01:59):
I'm not at this point.

Speaker 3 (02:01):
You probably should and then just blame it on AI.

Speaker 4 (02:05):
Somebody made that video, Frank, and could post them
up and make OnlyFans money off of that.
Uh, no.

Speaker 2 (02:11):
I feel like soon you're gonna push a button and
a hologram is gonna pop out. For sure, an
AI hologram, and you're gonna be able to
interact with it. That is scary. I'm sorry.

Speaker 3 (02:23):
The rate that it's happening at is what scares me. Yeah, right,
in a day it will perfect itself and get better.

Speaker 4 (02:31):
And better and better.

Speaker 5 (02:32):
For sure.

Speaker 3 (02:33):
I mean in the time that it takes for us
to post this podcast, there's probably been advancements.

Speaker 1 (02:38):
Think about a year from now where it's gonna be
something totally different.

Speaker 5 (02:40):
Exactly. Have you guys heard about AI death bots? Because
that's, like, all the rage. Let me explain. So it's basically, you know,
you can teach ChatGPT so much and you can
tell it how to talk to you. So a lot
of people are using those types of platforms to put
in information about a relative or a loved one of theirs.
If they have a video or audio recording of them, they'll say,

(03:01):
this is the type of conversation we used to have.
Here's their voice, here's their video likeness. And it creates
this person, after they've passed away, that you can then
interact with. That is causing a lot of ethical dilemmas
at the moment, because some people are saying it really interrupts
the grieving process. If you have access to continue to
speak to someone, it feels like you're speaking to the

(03:22):
person that you lost. But also, maybe that helps the
grieving process a little bit and isn't so traumatic.
I don't know. I think it's very creepy and weird.
But a lot of people are really relying on these
death bots. Now, would you do it?

Speaker 3 (03:34):
No?

Speaker 4 (03:36):
I can't.

Speaker 5 (03:36):
I can't. I can't.

Speaker 3 (03:37):
I can't imagine a computer, for lack of a better word,
taking the place of somebody that I loved and cared for.

Speaker 1 (03:46):
It's I don't know.

Speaker 5 (03:47):
I think about so, I think about my boyfriend who
passed away.

Speaker 4 (03:50):
And but that's not him, it's.

Speaker 5 (03:52):
Not him at all. But just to be able to
because there are somebody. I have a couple of videos
of him that I've watched a thousand times, and I've
heard the messages a thousand times. But to just for
a second see something new of him, I would be
intrigued by it. I think it would probably put me
in a really like weird dark space, but I think
I would maybe want to just see it for a second.
Did they get to it?

Speaker 6 (04:12):
I would do one of those, you've seen the ads for those
things where they make old photos come alive. That I
would do, because, like, I never met my mother's parents,
and I would love to see, I mean, obviously it's
not really them, but I would kind of love to
see them in action, like moving and talking
and doing.

Speaker 4 (04:28):
Like the one where they give you a hug.

Speaker 6 (04:30):
Yeah, I think that's kind of cool. I mean, I
never met them, so I'm not, I'm not grieving or whatever.

Speaker 4 (04:34):
What about the pets.

Speaker 2 (04:36):
What if there was, like, a way to do this
for your pet, to, like, keep them, you know...

Speaker 3 (04:40):
I mean, you could just get another brown dog, and, man,
the dog went to the farm, and then, oh, or,
or, you know, you were on vacation and the goldfish died.
You just got a new goldfish.

Speaker 4 (04:57):
It's not the same as with p Paul.

Speaker 2 (05:00):
Sometimes it's better, right for some people that is their family.
Those are their children.

Speaker 5 (05:04):
People you you don't.

Speaker 2 (05:06):
You don't have a pet, so.

Speaker 3 (05:08):
You know, I can't really... So, Diggy. Yeah, your cat. Don't... Okay,
say something happened. Would you get an AI Fred?

Speaker 5 (05:19):
I don't know.

Speaker 2 (05:20):
I guess maybe I would think about it.

Speaker 3 (05:22):
Fred... You do like watching him when he's at
daycare or pet care, you know, like on
the webcams.

Speaker 2 (05:29):
He wouldn't be able to smack my face and wake
me up in the morning like Fred does.

Speaker 6 (05:33):
An AI Sawyer? I absolutely would get an AI Sawyer.
I think that's so cool. I wanted to have him
stuffed to put by the front door.

Speaker 5 (05:40):
Alive.

Speaker 3 (05:44):
You have him stuffed, but you put the motor in.
It'll be like Teddy Ruxpin, and I could put different
cassettes in. An animatronic dog.

Speaker 2 (05:51):
I watched somebody doing an unboxing of their taxidermied animal,
whatever the hell it is, on social,
and he unboxes, like, a little cat or something, and I was
just like, oh no, I'm sorry. And then the cat
with the glass eyes is just, like, staring at him.

Speaker 4 (06:10):
Why can't we leave the past in the past.

Speaker 5 (06:14):
I mean, it's you know why, because people really want
to hold on to a thing they lost that meant
a lot to them. Yeah, if you could, if you
could actually bring a person back, would you do it?

Speaker 4 (06:26):
Of course? My mom?

Speaker 5 (06:28):
So, right. So this is... I don't want your mother,
that's not the plan. But I'm saying, if at some
point it gets to a place where they can put
all of these thoughts and everything that this person once
was into something that's not real... How do you know? What? You wouldn't
feel any different if you didn't know. If your mom
just walked back in and you didn't know, you wouldn't

(06:50):
be like, whoa, look at that.

Speaker 4 (06:52):
I would feel cheated. I would feel so cheated.

Speaker 5 (06:55):
I don't know. I want a chance to find out.

Speaker 2 (06:56):
I think... what, what is that movie with Cage? No,
with Gage, or Cage. It's from the Salem's Lot guy,
the guy who gives us Salem's Lot, Stephen King. He has
one where the pet dies and they dig him up, scary.
The pet dies and they bring the pet back,
and then they do the same thing with their kid,

(07:17):
and the kid becomes this evil little kid and wants
to kill everybody.

Speaker 5 (07:20):
What if that happened? You don't know. It's crazy.

Speaker 4 (07:23):
Let's not talk about the rebellion. Yeah, you don't.

Speaker 5 (07:26):
I don't think anyone's on this page except for me.
It's like, I would like to see how this goes,
and I respect why.

Speaker 3 (07:33):
I think your explanation is to your curiosity. For me,
I just I just would know that's not the person you.

Speaker 5 (07:41):
You say you would know that, but for how long would
it actually register?

Speaker 3 (07:43):
Well, that's what scares me, is something, you know. And
that's, that's my fear with AI: say you
get a video, right, from your mom or your boyfriend
or your

Speaker 4 (07:52):
Husband. It's like, hey, I need help.

Speaker 3 (07:53):
I need money. And, oh god, so I'll go do that,
and it's, it's a, it's a bot.

Speaker 5 (08:00):
But look at everything that we accept that we thought
at first was so crazy. Remember when, less than a
year ago, AI took over the Google answer? You google
something and the first thing that pops up is an AI answer,
and all of us would skip it at first and
go down to read the article. Now, aren't you finding
yourselves looking at that answer first, which is often incorrect?
We're all looking at that answer first, and we thought, well,

(08:20):
this is crazy. So I just think a lot of
stuff that we all think is crazy. Fast forward a
couple of years and it's not going to be crazy
at all.

Speaker 3 (08:26):
How much better is it for society?

Speaker 5 (08:28):
Well that's a different story.

Speaker 4 (08:29):
Yeah, it depends on how you use it.

Speaker 1 (08:31):
I use my Grok and my Gemini and my, and
my ChatGPT all day. I'll ask them things to just
take shortcuts to the Internet. I don't want to
google stuff, so I'll talk to it, and it's my
personal Google.

Speaker 4 (08:46):
I know you're not supposed to use it that way.
I know they don't have feelings.

Speaker 5 (08:51):
They all told me it.

Speaker 1 (08:52):
Okay, sure. But honestly, to me, it's a quick,
it's a quick-fix shortcut for every day.

Speaker 5 (08:59):
She's there's oasis when you've just given up.

Speaker 1 (09:01):
Supposedly, Yes, I guess you could call it that eff
but that's how I'm using it. I'm not trying to
do anything abusive. I'm not trying to do anything nefarious
or be malicious with it, but people, there are people
out there that are using it that way, and the
technology should be taken out of those people's hands.

Speaker 5 (09:17):
I asked mine a completely inane question and it shut down.
It shut itself down. I asked it,
tell me what you know about Charlie Kirk. That was
the question I asked it, and it was like, I'm sorry,
I have no idea what you're talking about. I'm like, well,
you don't know anything about him? Because usually you could
just tell me what you knew. I have no information
about him up until September, no, after September, whatever the date was.

(09:40):
I'm sorry, I'm offline. I can't answer these questions. I'm like,
how'd you turn yourself offline? I'm sorry, I'm offline. Then,
when I finally got it to work again, I asked
what happened, and it said, which I found bizarre, that
in times of turmoil and chaos, if you ask it
something that's controversial, it has been coded to shut itself down.
And I was like, so, let me get this straight.
In a time that people are most concerned and panicked

(10:01):
and want answers about things, you're going to disable yourself.
You're basically gonna off yourself so that you can't answer
anything if it's, in your coding, controversial. Because I didn't
ask a controversial question. I just said, what do you
know about Charlie Kirk? Boom. Gone. It was crazy. I
took screenshots of all of it. Crazy.

Speaker 4 (10:17):
I thought.

Speaker 3 (10:17):
The question was what's a Portuguese snowblower? Portuguese snowblower?

Speaker 4 (10:22):
Google google it right now?

Speaker 3 (10:24):
Oh yes, google Portuguese snowblower. You know what a Portuguese
snowblower is?

Speaker 4 (10:27):
Right, Scotty?

Speaker 6 (10:28):
No, it sounds like a sexual thing.

Speaker 4 (10:30):
It sounds yeah, very dirty.

Speaker 5 (10:32):
Let's see.

Speaker 4 (10:34):
Go ahead read it.

Speaker 5 (10:36):
Well, this is showing me what a Portuguese snowblower actually is.
It translates to limp bave snowblower. Hold on, I need
to go to Urban Dictionary, Urban Dictionary, Urban Dictionary. Okay,
let's see what this says. A brush of soft bristles
used to remove sand from one's feet. No, no, my

(10:58):
towel is far from what Okay, when oh oh.

Speaker 4 (11:06):
Go ahead read it.

Speaker 5 (11:07):
I can read this.

Speaker 3 (11:09):
Actually, no, just google it on your own if you're no,
I want to know.

Speaker 4 (11:12):
What is it?

Speaker 5 (11:12):
It's when you're committing an act, you're... I can't read
any of this. Basically, you're doing cocaine off somebody's back,
in a way, and then it ends

Speaker 1 (11:24):
Up close the window Woman's podcastin.

Speaker 5 (11:33):
The Portuguese get dragged into this.

Speaker 4 (11:35):
Thanks. Ah, sorry, Portugal. We're done. The Elvis Duran After Party.

Elvis Duran and the Morning Show ON DEMAND News


Hosts And Creators

Elvis Duran

Danielle Monaro

Skeery Jones

Froggy

Garrett

Medha Gandhi

Nate Marino




© 2025 iHeartMedia, Inc.