
August 30, 2024 • 29 mins


Can AI revolutionize our daily lives and potentially pose unexpected threats? Get ready for an eye-opening journey into the realm of artificial intelligence! From recalling iconic movie scenes from "The Terminator" and "2001: A Space Odyssey" to recounting our own experiences using AI for budget-friendly logo designs, this episode is packed with insights. We'll unravel the various ways AI is already a part of our everyday lives, like GPS navigation and smart devices, and address how it's sneaking into education, sometimes helping students cheat. We'll even ponder the mind-boggling idea of AI reaching a point of self-improvement where it could create its own advanced programs.

Fast forward 20 years and imagine the ethical and societal impacts that AI might bring. We dissect current influences on media and politics, such as deepfake videos, and the scary potential of AI-generated scams. As we grapple with these serious issues, we also sprinkle in some humor, considering AI's role in personalized eulogies and whether AI might make even our tech interactions more polite. And just for fun, we share a quirky tidbit about the surprising value of Star Wars Legos. Tune in for a balanced blend of thought-provoking discussion and light-hearted moments that underscore the unpredictable future AI holds for us all!

La Bandera BTX in Brownsville, Texas.

Follow us on TikTok. Thank you for watching!


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Hey.

Speaker 2 (00:06):
Welcome back to Paranormal 956.
My name is David.
As always, I am here with Bianca.

Speaker 1 (00:13):
Hi guys.

Speaker 2 (00:15):
What are we talking about today?

Speaker 1 (00:21):
Um, someone intelligent?
I don't know.

Speaker 2 (00:24):
Artificial intelligence.
We're gonna go over what it is, what people are scared of
happening, okay, and then how it's being used.
How it's being used, including by us, almost every day now,
right, and so I guess the first thing that we can think of if

(00:49):
we're going to define what artificial intelligence is, just
broadly speaking, it is computers being able to think
for themselves.
And so that's what the artificial intelligence is.
And then we've seen movies like the Terminator movies and 2001:

(01:10):
A Space Odyssey, like in a satanic panic type of thing, I don't
know.
I guess it could be right, because I think.
I think it's like it's the devil, the one coming up with all
this technology, because we do think, like when we talk
about the Mayan civilization or the Egyptian civilization, it's
like, where did they get this intelligence?

(01:30):
It doesn't.
It's not human.

Speaker 1 (01:32):
No.

Speaker 2 (01:32):
We didn't just come up with this stuff, and so I do
think there's probably some people that think the devil's
got to be involved with this. Yeah, for sure.
And so when you talk about artificial intelligence, it
really is changing everything, and I do think it's gonna, like,

(01:56):
cost some jobs, I'm gonna be honest. Yeah, it's gonna throw us
a curve.

Speaker 1 (02:03):
Let's just put it at that.

Speaker 2 (02:05):
There are things that I'm doing.
I'll take responsibility for this, but we're a small podcast,
right?

Speaker 1 (02:15):
We haven't made it big yet.
We barely got Amy and Ivan.

Speaker 2 (02:18):
Right, small podcast, small podcast, and we don't have
the funds to be buying logos and our thumbnail, $40 cups. Well,
well, we got money for some stuff, right, right.

(02:38):
But when we first started, you know, our logo was designed by
Artificial Intelligence.

Speaker 1 (02:45):
Right, we got some help.

Speaker 2 (02:48):
We got some help.
We were not able to afford at the time a designer, a designer
to do it, and we don't have the skills ourselves to do it.
No, maybe we could have come up with it, but it would have
taken a long time. Whole process, and we wanted to do other
things.
Right, yeah, we're recording, we're coming up with ideas,

(03:09):
we're doing research. TikToks, TikToks.
We do.
We do watch a lot of TikTok,
and so it was just easier to
very quickly input some stuff in a computer, whatever catches
the eye, and we just like went through like 20 logos really

(03:29):
Which one do you like?

Speaker 1 (03:31):
I like this one. I like the style on this one.

Speaker 2 (03:33):
I like the color of this one, yeah, and then we just
kind of mix and match a little bit.
We did a little bit of work,but there's no way we came up
with all of that on our own, etc.
Etc.
Right?
Yep I think I myself often usegps to get around the valley.
I do travel yeah that isartificial intelligence, right?

(03:54):
Um, I know.
For instance, if I was to say, hey, Siri, subscribe to
Paranormal 956 on Apple Podcasts, maybe some of y'all's devices
did something right now, right, and so you know those are

(04:15):
artificial intelligence. I know, I mean, even Facebook had it,
has it on now, right? Right, that's right. It's a, what do you
call it, like an add-on?
Yeah, Meta AI. Meta, yeah. Right.

Speaker 1 (04:29):
And so there have I mean even some.
Which one is it?
It's a pair of sunglasses that they threw Meta in.
They're like, hey, Meta.

Speaker 2 (04:37):
Oh, right, right, right.

Speaker 1 (04:38):
Translate, Google Translate or whatever. Yeah, the
Meta Ray-Ban glasses.

Speaker 2 (04:41):
Ray-Ban.
Yeah, yeah, yeah, yeah.

Speaker 1 (04:43):
I mean, we're already up there guys.

Speaker 2 (04:49):
Yeah, and so I do think this is going to cost
some jobs.
I know that, being in the world that I'm in, I'm friends with a
lot of teachers.
You know, teachers are dealing with this, with kids using it to
cheat, et cetera, et cetera.
Right, because it can write an essay for you pretty easily.
I mean, that's like the easiest thing for AI to do right now.

Speaker 1 (05:07):
I mean, Amy does that for us.

Speaker 2 (05:09):
Right and Ivan helps her out.
Yeah, Somewhat.
Yeah, and so AI is here and I don't see it going anywhere.
There is this weird thing that I'm going to kind of squeeze
into this story: the ChatGPT.
There's been 1, 2, 3, 4, right? 4.0 is the newest version.

(05:34):
Somewhere there was a ChatGPT 2.0 B, I guess.
I don't know how to name it, but people are speculating that
AI made its own ChatGPT.
Oh shit, because ChatGPT 2, the one that I'm talking about, is

(05:57):
better than the number four, the most recent one.

Speaker 1 (06:01):
Wow.

Speaker 2 (06:02):
And so there is a chance once we get this really
going and AI starts learning on a faster level and improving
itself, that it's going to be writing its own programs, shit,
right, and I mean it's going to be coming up with its own ideas
and ways to improve things on its own, and so a lot of people

(06:24):
are worried.
Now we're getting to the part of what people are worried about.
Is that, whether it's because we are lower life forms compared
to them as far as intelligence-wise, or maybe they don't see us
as good people, right, not always acting in our own best

(06:46):
interest, et cetera, that AI would try to exterminate humans
and take over the world, possibly even enslaving us.
Right?
These are the extreme views of all of them.
Right, right, and so is there any situation, like if you were

(07:08):
to speculate.
AI is here today, right, we know where it's at.
What does it look like in 20 years?
What do you think?

Speaker 1 (07:18):
20 years, I would be what, 44?

Speaker 2 (07:27):
Yeah, you still wouldn't be my age.

Speaker 1 (07:29):
Jeez.
Well, whatever, I don't care.
If AI gets to the point of taking over, I'm out.
I'm not struggling with it, I'm out.

Speaker 2 (07:47):
Count me out, guys.

Speaker 1 (07:48):
You're not going to fight AI?
I'm not gonna fight AI.
But what made it like that over 20 years?

Speaker 2 (07:53):
that's pretty quick.
So this is what I.
I don't see a scenario, or I don't see a reason why AI would
try to take over, right. It would take too much, I guess. Well, I
think we're putting in emotions, I mean, if anything.

(08:16):
Yeah, we're putting emotions on something that it's not,
right? Feeling. And so like, with us, with humans, we do feel, we
do.

Speaker 1 (08:25):
That's why we're fucked.

Speaker 2 (08:26):
Guys, like, we want more power, we want more money,
we want more influence.
We want more, all of it, like we're just more, more, more,
more, more. I don't know that machines would be like that. True,
I don't know why they would be. I mean, but it's improving
itself, yes, in a way, for the better, or like faster.

(08:48):
I guess you would say, yeah, or more accurate, I guess. I think
all of the above, right, I think it would be everything.
I think it would be faster, I think it would be better, I
think it would be more creative.
Um, right now we're using it for a lot of creative things.

Speaker 1 (09:08):
What if AI starts its own podcast?

Speaker 2 (09:10):
Yes, I wouldn't be surprised if there are already
AI podcasts.
There are plenty of AI Facebooks already. What, yeah?
And there's plenty of AI YouTubes already, and TikToks,

(09:33):
and so the thing that is scaryand I don't think this is the
machine's fault, but there havebeen.
For example, right now we're inthe middle of an election
season, right Right With Trumpand Biden and all of this but
somebody made a fake video ofBiden saying things he never

(09:55):
said.
Video of Biden saying things henever said.
And so right now, when we watchthat there's little hiccups, the
lips don't move quite right,the eyes don't match.

Speaker 1 (10:05):
It's not out there, but it's there.

Speaker 2 (10:06):
It's close and if you wanted to believe, sure.
Or if you're really gullible, sure.
I mean, there's a lot of old people watching that that would
never even think that a machine made this up.

Speaker 1 (10:20):
Yeah right, they just see.
I think that's what's, that's like, what's messing people up.
But some, like some of the young ones, actually catch it on
pretty quick.
They're like, yeah, he wouldn't say that.
Or like the Drake this and stuff like that. They're like, no,
Drake wouldn't say that, he wouldn't answer back to Kendrick,
or you know, stuff like that.

Speaker 2 (10:38):
So when I saw the Drake one, right, it looked pretty,
it looked legit, yeah, but it wasn't.
It was like, because I mean, I don't even follow Drake that
closely, right, but I was kind of like, why would he say that?
You know what I mean?
I don't remember what the thing was. Oh, he made a, a slavery

(11:02):
joke or a slavery diss or something like that. Yeah, like
you're trying to free the slaves and all this, and I was like,
why would Drake do that?
Like, that didn't make sense.
And so I went online because, I mean, I'm a skeptic, yeah,
skeptic, skeptic, skeptic on everything. Yeah, and so when I
saw that, I was like, let me see what the internet says about

(11:23):
this, right? Yeah, and the first four videos I saw, the people
thought it was real and they were like, oh, I'm done with
Drake and all this stuff.
And I was like, this is even worse than I thought, like,
because I was missing some of the subtleties in the lyrics,
right, and so when they're explaining it, yeah, yeah, and I
mean, I don't understand, I don't hear.

(11:44):
Well, I'm old, right, I don't even get, you know, because they
do those double entendres and things, so there's like double
meanings here and there.
I missed a lot of it.
And then when they're mad and they're telling me what they
heard, and I'm like, oh, I missed half of that, that makes it
even more doubtful for me.
Yeah, and then I found out where it came from, and I was
like, oh, no wonder, right.

(12:05):
So this is what I'm scared of,though, because people are
already believing that. Like I said, the first four videos I
saw were all people that bought it.
Yeah, with the Biden video that came out not too long ago,
there were so many people that thought it was real.
That's because AI is not as good as it's gonna get.

Speaker 1 (12:25):
It's gonna get better, yeah, and it is gonna get. I
mean, some people are using it to scam and stuff, like through
video calls.
It's like crazy, yeah.

Speaker 2 (12:36):
So that's where I do think, if you're not only not
smart, right, because it's kind of easy pickings for not
smart people, but even if you're just not on your game, yeah,
right, because you will get scammed, you will get. There's,
let's say, like for me, I think I'm a smart guy, I know about
AI, I'm not going to fall for this, but if they tickle me,

(12:59):
just right, you know what I mean.
It's like, maybe I need money or I'm sleepy, or I don't know.
Even with me, like I said, I'm a smart guy, throw me the right
curveball and I might still fall for it, like, I'm not saying I
can't, right.

(13:19):
That's that margin of where people are gonna fall for it is
just gonna get bigger and bigger, because how are you gonna be
able to track down all of it?
Yeah, right now it's one Drake video or one Drake song.
I mean, one Biden video.
We're going to be inundated with stuff soon, for sure.

(13:41):
And it's going to be really hard to tell.

Speaker 1 (13:44):
What and what not?

Speaker 2 (13:45):
happened.
Which is real, which isn't real?
That's what I think is thebiggest issue.

Speaker 1 (13:51):
Because, don't you think they're fucking over the
elections as well? Yeah.

Speaker 2 (13:55):
They already have.
They did last time and they're doing it again, and it's just
going to get worse.

Speaker 1 (14:03):
Man.

Speaker 2 (14:05):
Can we just vote?

Speaker 1 (14:06):
For Sexyy Red.

Speaker 2 (14:09):
Imagine Sexyy Red for president.
She would fix a lot of things.

Speaker 1 (14:14):
I think For sure.

Speaker 2 (14:15):
I mean, the girl knows how to figure things out,
yep, and she's young, so she could be president for the next
40 years.

Speaker 1 (14:23):
Redhead.

Speaker 2 (14:24):
That'd be fine.
I think it's a good idea.

Speaker 1 (14:28):
It's the best one out there.

Speaker 2 (14:30):
You need to send her an Insta.
Tell her she needs to run.

Speaker 1 (14:34):
What is her management team doing?
It's true, they're losing the best opportunity they got, right?
This is it.
Her window is open right now, this is her year, this is it for
sure. Even if she goes in late, I mean, we're gonna pencil her in.
There's nothing to debate with her. Yeah, we're gonna say she
got two kids.
Yeah, she does. Yeah, we already know that, we already know that

(14:56):
she doesn't lie, not like Trump. Bring some other stuff, right? You
need better than that, you need better than that. So I think
that's the real risk with AI.

Speaker 2 (15:07):
What is it? People coming up with fake stuff and it
getting harder?
I don't think AI is going to come up with that on its own. Oh,
for sure, no, they need, um.

Speaker 1 (15:19):
What's it called? The orders, guidance, I guess?

Speaker 2 (15:21):
Yeah, call it. So, uh, when we get to a point, to where
AI is like, I thought you would like these things for the
morning to get started.
Now we're talking. Wait, wait, wait, wait, wait. Back up.

Speaker 1 (15:39):
I never said that.

Speaker 2 (15:40):
Yeah, you're being a little presumptuous here.

Speaker 1 (15:45):
Next thing you know, they're calling you beautiful on
a coffee table.

Speaker 2 (15:50):
You know, I probably shouldn't tell you all of my
secrets.
He could have taken that one to the grave. It was just too good.
And, you know, part of me was like, I don't know if this is
real and I don't know if this is normal.

Speaker 1 (16:05):
Am I allowed to talk about it whenever you pass?

Speaker 2 (16:08):
sure it's.

Speaker 1 (16:09):
We're putting it on the podcast. I'm just saying, I
don't know if you're gonna feel comfortable with it. Are you
saying that at my funeral?

Speaker 2 (16:17):
You want to tell that story?
That's the story you want to go with?

Speaker 1 (16:20):
Yeah, I think it's the best one.

Speaker 2 (16:23):
Permission granted.

Speaker 1 (16:25):
You heard it, guys, so I don't want you to be saying
that you're embarrassed.

Speaker 2 (16:30):
When I'm dead.

Speaker 1 (16:31):
Right, I don't want you getting all red in the
coffin, right.

Speaker 2 (16:33):
Yeah, that's not the worst story that you could tell, so
I'm fine with that.
What's the worst story that I can tell? I'm not going to tell
you here.

Speaker 1 (16:47):
I'll ask AI, trust me.
I'll ask AI to go through your phone and find the most
embarrassing story.
Oh, then there's going to be plenty of stuff.
Yeah, this is the other thing that I think is interesting. At
the very least, I mean, AI could write a whole fucking, what's it
called, whenever you do the speech?

Speaker 2 (17:08):
a speech, the analogy, or what's it called? Oh, a
eulogy. Eulogy, yeah. You know what we should do, that we should
have AI write a eulogy.
Let's see, because I'm on Facebook.
I wonder if it accesses my Facebook information to write me
a eulogy, from the day you got shipwrecked to now. Yeah,

(17:32):
let's see what it does, because I do play with it quite a bit.

Speaker 1 (17:36):
Yeah, you do a lot.

Speaker 2 (17:38):
Actually, I love to play with AI in general.

Speaker 1 (17:43):
I just like.
I think AI will love to play with you, sir.

Speaker 2 (17:46):
Yeah, that's crazy, let's see.

Speaker 1 (17:54):
Do you watch Star Wars?
No, not too much.
Then why the fuck do you have the Legos for it?

Speaker 2 (17:59):
Because they're worth a lot of money. Oh, the Star
Wars ones? Yeah, they're worth a lot of money. I didn't know that.
Yeah. Where is this?
Yeah, I have a whole bunch.
We're in my Lego room.

Speaker 1 (18:17):
Yeah, we always are.

Speaker 2 (18:19):
And for some reason, Bianca's now all of a sudden
wondering why.

Speaker 1 (18:26):
Because I don't like Star Wars.
I'm pretty sure I kind of remember he did it.

Speaker 2 (18:31):
I mean I'm not super into it.
Here we go.
I'm not super into it.

Speaker 1 (18:40):
Here we go.
Anyways, he has like a whole troop of Star Wars.
What do you call them?

Speaker 2 (18:47):
Yeah, I have a whole long thing. All right, so here
it goes.
This is my eulogy.

Speaker 1 (18:52):
Okay.

Speaker 2 (18:53):
So I said, write my eulogy, and it wrote back, I'd be
honored to write your eulogy for you. Right?

Speaker 1 (19:01):
Okay, I love that. AI is so polite, right?

Speaker 2 (19:05):
I hope I don't get choked up by my own fake eulogy.
That's gonna be so embarrassing.
You're gonna read it.
Yeah, I'll read it.
All right, so.

Speaker 1 (19:19):
I love it.
Today we gather to celebrate the life of your name.

Speaker 2 (19:26):
So generic.

Speaker 1 (19:27):
The life of David. A shining star. That's so funny.
A shining star who left an indelible mark on the lives of
everyone they touched. Hey, wait, wait, wait.
I didn't touch that many people, just for the record. Okay, okay,

(19:49):
okay, wait. With a heart full of kindness, compassion,
compassion and love.
Your name brightened the world around them?
I doubt it.
Their infectious smile, their laughter and their unwavering

(20:10):
optimism inspired us all to be better versions of ourselves.
I highly doubt that. Your name was more than just a friend,
family member or colleague.
They were a beacon of hope, a source of strength and a
constant reminder of the beauty and wonder of life.

Speaker 2 (20:27):
I feel like that part's true.

Speaker 1 (20:29):
Sure.
Through their journey, name touched countless hearts, minds
and souls.
I touched a few things. I would edit that, right? Yeah, yeah.
Well, after the Me Too thing, right, gotta watch it.

(20:49):
They leave behind a legacy of love, a treasure trove of
memories and a wealth of wisdom that will continue to guide and
inspire us in the days ahead.
As we say goodbye to the incredible person, we take
comfort in the knowledge that their spirit will live on
through us.
I wouldn't want that.

(21:11):
Once you die, your spirit is gone. Yeah, yeah. I mean, do you
agree with that?
So I can take that out.

Speaker 2 (21:18):
It depends.
I feel there's some people that are going to still need some
supervision.

Speaker 1 (21:25):
Right Hella guidance, not me.

Speaker 2 (21:30):
No, not necessarily you.

Speaker 1 (21:30):
We will miss them dearly, but we will also
continue to celebrate their life, their love and their
unwavering spirit.
Rest in peace, dear David.
Your memory will be a blessing to us.

Speaker 2 (21:47):
Nice.

Speaker 1 (21:47):
That's it.

Speaker 2 (21:50):
And then it says please note that this is just a
draft.

Speaker 1 (21:58):
So that's not too bad, right? It takes you, you, I
mean, it's a starting point.

Speaker 2 (22:03):
Yeah, it's not very specific, it's in general, but, but
for AI to be out, how long has this been out?
And they already wrote that.
Yeah, that's not bad.
I mean, if somebody said that.

Speaker 1 (22:18):
You'd be crying, you'd be bawling, it's good.
Yeah.

Speaker 2 (22:23):
I mean it hits all the right notes, it's not
specific enough.

Speaker 1 (22:26):
I think at the end I would add like ski-yi-yi.

Speaker 2 (22:30):
Right.

Speaker 1 (22:31):
And that's it.

Speaker 2 (22:32):
Yeah.

Speaker 1 (22:33):
Close the books, dump them in the hole.
We're good to go.

Speaker 2 (22:37):
Yeah, and as my casket is going down, right, yeah, you need
that music playing, right.
I think this is going to be good music to play at my funeral,
for sure.
By the way, I wrote this music.

Speaker 1 (22:56):
With AI.

Speaker 2 (22:57):
With loops and putting it together.

Speaker 1 (23:00):
So that's kind of AI, I guess, because it had to mend
it, and so that voice. I have seen TikToks where moms are like,
I would have to tell my kids that I'm older than AI.
It's crazy.

Speaker 2 (23:15):
I never thought about it like that. That is
interesting. Yeah, so we do have a rumor to clear up now that
we're playing this song.
I had forgotten about this, but there are.
There's a rumor going around, yeah, that you are the voice on
this theme song. Okay, let me clarify that.

Speaker 1 (23:36):
Are you guys ready?
I'm not the singer.
Yeah, David just went out running through the door.
We got the cameraman behind him.

Speaker 2 (23:55):
Amy and Ivan are on their way.

Speaker 1 (23:58):
Yeah.

Speaker 2 (23:58):
By the way, I don't know if this should be public
information, but I wrote Amy up.

Speaker 1 (24:05):
Uh, I did not appreciate the little dig that
you put on the description of the chapters. Spotify? Yes, on
Spotify. Yeah, we did find that out later, later, after it was
posted, and we had a serious talk. It wasn't.
It wasn't just a slap on the wrist, it was like a slap on the

(24:27):
face.

Speaker 2 (24:28):
Consider this your official warning number one.
Yeah, but that's it for today.
You all be safe.
Oh, and do we have updates?

Speaker 1 (24:41):
Well, it's been a few episodes since we updated.

Speaker 2 (24:45):
Okay, let's see. All right.

Speaker 1 (24:56):
You're not going to put this on.

Speaker 2 (24:57):
Oh, is it this one?

Speaker 1 (24:58):
I think so.
No, it's the other, no. Oh, is it this

Speaker 2 (25:01):
one? I think so.

Speaker 1 (25:02):
No, it's the other, no, no.
Is it that one?
I don't know where it is.
I think it's the other set.
It's that one.

Speaker 2 (25:09):
All right, we got some updates.
There was a question if the girlfriend was cheating, because
she kept breaking up and getting back. What do you say, Bianca?
Was she cheating?
Yep.

Speaker 1 (25:32):
I'm getting a clear message.

Speaker 2 (25:33):
Clear.
Yes, that sounds right.
All right, here's another question.
Jessica wanted to know if she should get married. You had said
no.
It was a good call.

Speaker 1 (25:51):
What happened?
She didn't get married.

Speaker 2 (25:54):
She didn't get married.
She took our advice.
As anybody who writes in, they need to take this advice.
Yeah, turns out he got another chick pregnant.

Speaker 1 (26:05):
So you know how that is.

Speaker 2 (26:08):
And then James had applied for a job, or if he
wanted to know if he should apply for a job, and you said
yes, and that he would be getting a better job. And again,
right? Yeah, that's three for three, Bianca, that's pretty

(26:28):
good.
And as far as on the Keith front.
There's no news to report on my side. Same here. What about you,
Bianca, have you?
Nothing? You're still holding out?
Yep. We need to find out when a year has been, because I don't
remember our first, because we were going to follow it for a

(26:49):
year. Uh, well, I was still pregnant, wasn't it?
Like a few months, and yeah, and so it's been a little while.
Not a year, not a year, not a year.
But we do need to keep track of that, because otherwise we're
going to go to our grave trying not to sleep with this guy.
You're over it.

(27:10):
I don't know how much longer I can hold out.

Speaker 1 (27:15):
Just in case we already wrote your eulogy, or
whatever the fuck it's called.

Speaker 2 (27:18):
Right, my eulogy is written.

Speaker 1 (27:20):
Eulogy yeah.

Speaker 2 (27:21):
We got.

Speaker 1 (27:23):
We're just going to figure out this Keith situation.
I will say this, so I'm pretty sure you're his favorite. Me? Yeah,
why do?

Speaker 2 (27:32):
You say that? I don't know, because I'm everybody's
favorite.

Speaker 1 (27:37):
Not your mom's.

Speaker 2 (27:38):
No, that's true.
Youngest stole my mom's.
This is what I will say.
We had a caller, or a Facebook message, I should say,
that gave us a few ideas for stories that we've been doing,
and he suggested that since Sable Palms has an Airbnb that
we could rent, that me and you and Keith spend the night, and you

(28:01):
and Keith.

Speaker 1 (28:04):
Spend the night and just end this.
Okay, you see, I'm not going to be putting myself out there in
the war.

Speaker 2 (28:08):
Yeah, I don't know that Sable Palms would be the
best place for all of that.
Don't know how romantic

Speaker 1 (28:23):
It can be with all the, I really doubt it, stuff, but
the palm trees, the jungle.

Speaker 2 (28:27):
Yeah, yeah, no, so that's it for today.
Thank you for joining us.
We'll talk to you next week.
Follow us on all of our socials and wherever you get your
podcasts. Thank you, guys.