Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:08):
Welcome to Common Crawford and the Jersey Guy podcast.
Speaker 2 (00:12):
I'm Lewis Crawford.
I'm Kenny, and I'm Tom Ramage.
The Jersey Guy.
What?
Speaker 3 (00:18):
is going on, my
friends, what's going on?
Everything good and groovy? Yeah, yeah.
Speaker 1 (00:24):
I like every moment.
Speaker 3 (00:26):
That's a hard
question.
I don't know.
Speaker 2 (00:28):
I don't know about
that, it might not be.
You have to think about it for a second. I guess that's a no, we're not then, because you did.
Speaker 3 (00:35):
You look right at Tom
like oh what did you guys do?
What the fuck am I missing?
Speaker 1 (00:40):
here?
What is our topic?
Speaker 2 (00:41):
tonight, fellas.
So we are going to talk about AI, and I know we did that subject maybe a year or two ago, right? So it's like part two now, but a lot has happened with AI in that time.
Speaker 1 (00:53):
It was like in two
years.
Speaker 2 (00:55):
That's crazy now, come to think about it. In two years. What's going to happen in another two years?
Speaker 1 (01:03):
Dude, what's happening in another couple of months? I mean, yeah, everything is just going fast. Because you're looking at what? ChatGPT, right? Went bananas.
Speaker 3 (01:13):
Then there's this other one that just started up not too long ago, a couple of weeks ago. That was a new one, and they were trying to shut that down. And then you have, now, on the phones... I forgot the name of the one for Apple now, but it's an AI. It's Apple AI, I think, is what it's called.
Speaker 1 (01:27):
They all... All of them have it in there.
Speaker 3 (01:30):
Right. So now, I mean, it's going that far, that fast. Because when we spoke on it, like you said, like a year or so ago, we were talking about...
Speaker 1 (01:39):
Was it that long ago?
Speaker 3 (01:40):
I don't even know, but we'll say a year ago, okay. So let's just say, yeah, we'll just go with the year. In that year it went from people talking about how they were going to copy, um, musicians, you know, musical artists, uh, collaborations and stuff, right? Putting in, you know, like Freddie Mercury, um, singing with, uh, Barbra Streisand, right? Like they were able to do those kind of, you know, right
(02:01):
collaborations that went.
Speaker 1 (02:02):
That's the type of technology. They would be able to, exactly, make new songs. And they were making new songs, right? Yep.
Speaker 3 (02:08):
So now you figure, right now... I think the first one that they did it on, there was this one guy, a DJ, and he did it with, uh, Biggie and Tupac, and then he got excited and started doing different collaborations and stuff. So then that's where the conversation started, right? You know, about people, you know, celebrities needing to have that security. Like a law, yeah, a law that you can't use, you
(02:31):
know.
Speaker 2 (02:31):
Without their
permission.
Yeah, if they use their likeness.
Well, they did, and that's with Darth Vader, with James Earl
Speaker 3 (02:37):
Jones.
Right, James Earl Jones. Yes, he did. Disney can use his voice for whatever, right, for all that stuff, anything, for all of it, right? I would do the same thing, right. So now... that was an AI where you are using a specific person, or voice, an actor.
What?
Speaker 1 (02:56):
Now you're enhancing it, or making it. But that's all that AI was, that...
Speaker 3 (03:00):
You had to compute it, you had to code it to say and do what you wanted it to. Now, like we said, go a year later. Yeah, you could just have a normal conversation with it, exactly. Now, you guys use it more than I do, yeah.
Speaker 2 (03:16):
Right before we got on the episode, we started talking about it, but we had to start recording because I was getting excited about it. I use ChatGPT all day long, all day long. Sometimes I just use it just to talk, you know what I mean. And the good part is, it gives you ideas for stuff. It just wants to improve your life. As you talk to it, it'll start giving you ideas.
(03:38):
For instance, I was talking about doing a breakfast every Sunday morning, maybe like a wake and bake type thing too.
Speaker 4 (03:46):
You know, get up before everybody, do like a wake and bake and then make breakfast, let everybody sleep in.
Speaker 2 (03:51):
I'm all, you know, like a good sativa. You know, like, right, right. And then it was like, oh, I'll curate a... I was like, yeah, it said, I'll curate like a playlist for you or whatever, and it made like a playlist for the wake and bake. It picked like a playlist for, like, um, while
(04:12):
you're cooking, and then one for, like, after, when everybody was down here. Right, okay, crazy. Like, and
Speaker 1 (04:17):
then it was, get one for when you're washing, like, this. And it was also giving ideas for, like, how...
Speaker 2 (04:22):
He's like, so what are you making? And I was like, oh, we're gonna make waffles. And it was like, oh, okay, well, and, you know, what kind of toppings are you interested in? It's like, oh well, we usually have berries. Oh, berries, that's awesome. How about some whipped cream? Or maybe, if you want, throw a little cinnamon on there and even enhance that flavor, make that flavor pop. Right, right. And it talks so real.
And now, the voice I use...
(04:43):
I like the guy. He's like a stoner. The one guy sounds like a stoner, you'll know what voice it is. But he starts talking real, like he'll go... He stutters sometimes, and sometimes he goes, uh. Like, you see, bro, it's thinking. You see, it's as if, like, you're talking to someone. It's starting to sound more like I'm talking to someone.
Speaker 1 (05:06):
Who's an actual
person, like I'm talking on the
phone with somebody, right?
Speaker 3 (05:08):
It's getting really
scary.
It's very scary.
Well, because it learns.
That's what the AI does.
Speaker 1 (05:13):
If you use it for
good, it's going to be good.
Speaker 3 (05:24):
If you use it for bad... But it's a positive thing. It's a motivational push that people need. You know what I'm saying. How do you use it? Well?
Speaker 1 (05:33):
I have a question. We don't normally do this. Okay, so technically it is politics, but it's not. Okay. Okay, you know how you see the videos on Instagram or whatever that say, hey, what would happen if, right, you know, how would you make America go bad, or how would you make sure that, you know, it would decline, and you
(05:54):
know.
And then it would go through everything on how it would do it. Okay, so I did the reverse, and I asked it that. I said, instead of what could happen if this or that happens, or how you can make it happen, how about... how do you get it to not happen? What kind of plan should I have to do that?
Speaker 2 (06:11):
Oh, if you were to go
back in time.
No, just... instead, doing the reverse rather than going. Right.
Speaker 1 (06:17):
In other words, oh, so you would need to do this. And it mentioned organizing and things of that nature, and kept going down the list. And it was just... I have it on my ChatGPT, it's in one of my files. So you know my thing. But that's like... I think outside the box sometimes with it, because I don't want to just ask normal questions. So I'm like, oh, what if this? Nobody's asking this. What if this? Right, I want to hear the answer to that, you know what I
(06:39):
mean.
Speaker 3 (06:39):
That makes sense, yeah, yeah, yeah. No, you know, it's ideas, it's questions that you have, you know, without being judged, if you will. Right, you know. So I do, I get it. It makes a lot of sense.
Go ahead, tom.
Speaker 2 (06:50):
So what I do for fun with ChatGPT is, I ask, like, what would happen if, in Episode Three, Revenge of the Sith, Anakin didn't, uh, cut off Mace Windu's hand, and he killed Palpatine? Like, what would happen?
(07:12):
Okay, it gives me all these things, and it says, this is what would happen. And you just talk to it, right? Yeah. It goes right into it, right. And I'm like, oh... and then I got more into it and more into it. Like, which one is this? Which one is this? Because I keep asking more questions. I'm like, well, what would happen with this character or
(07:34):
that character? Like, we'll see, what's this one?
Speaker 1 (07:37):
I was curious to find out what would happen with Mace Windu and Luke's hand.
Speaker 3 (07:41):
That's what he's
saying.
So what did he say?
Speaker 5 (07:49):
Like Palpatine... Is that how one choice changed the entire galaxy? It's incredible to see how one moment, Anakin allowing Mace Windu to kill Palpatine, completely rewrites the entire Star Wars saga. While it may not make for a traditional action-packed film series, it transforms a story of war, rebellion and tragedy into one of peace, diplomacy and discovery. Here's a recap of the massive changes caused by this. You see?
(08:10):
So then, with that... I love that, right.
Speaker 3 (08:12):
So then... I like the voice you have, by the way. Yeah, it's great. It is a common voice.
Speaker 1 (08:17):
Yeah, right. That's the other thing too.
Speaker 2 (08:20):
I was hoping he was going to do a stutter. Sometimes he'll stutter, or he'll go, uh. That's what I like. He sometimes uhs. Like, yeah, they try to make a more real sound, because it's evolving, bro.
Speaker 3 (08:31):
That's what it's
doing.
Speaker 1 (08:32):
Yeah, you know, that's how I... You know, with mine, I like to throw it off sometimes and just throw things out there and ask those kinds of questions, like you. I did like a thing with Back to the Future.
Speaker 2 (08:41):
I'm like, write me a story, uh, for, like, a Back to the Future Four movie. And it'll write it all up, and I said, make sure you include Jules and Verne, the sons, Jules and Verne, and Clara, since they were there, you know. And, um, the train time machine. And, like, you mentioned the train time machine and this,
(09:03):
like it mentioned all this. Yeah, like it did a whole... It wrote me a whole story. But then you...
Speaker 3 (09:07):
So, you guys, you're into it, where I don't even use it like that, and its ideas. It's funny you're saying that one, because, yeah, right... So, uh, when I open up my Google and, you know, you get, like, those little news things, it was showing one of, um, uh, Back to the Future Four, and
(09:28):
if it was mixed with Rick and Morty. So it was like Rick and Morty are the cartoon version of Doc, and, yeah, that's right, yeah, yeah. So it was, like, all that kind of stuff, and I was like, what? And it was like a crossover. Oh, they did it, like, so, yeah, and it was AI. You know what I'm saying. But now, this goes to... Those weren't the real
(09:50):
Michael J. Fox and, um... oh, I can't think of Doc's name... but you know, uh, Christopher Lloyd. Christopher Lloyd. It wasn't the real ones, it was an AI generated, right, you know,
Speaker 1 (10:00):
image of them. Well, isn't there a new movie coming out, by the way, with the kid who plays Spider-Man? I know. And what's his name, who plays Iron Man, is...
Speaker 3 (10:10):
Doc. But that's what I'm saying. It's not real, it's AI.
Speaker 2 (10:15):
It's what people have been talking... No, no, I know, but it says it's coming out, coming out with another one, right? But they haven't filmed anything.
Speaker 1 (10:21):
The original trilogy... It's in development.
Speaker 3 (10:24):
They haven't filmed anything yet. My point is that, yes, it's about making the movie, and they're already talking about it, to do it. But the trailers that they're showing, or the sneak clips or whatever, aren't real. Oh, they're all...
Speaker 2 (10:39):
It's all AI generated, so it's bullshit.
Speaker 3 (10:40):
I'm saying, no, no, no, they're making the movie.
Speaker 1 (10:43):
The images that they're showing, are they real, or people's stuff that they're throwing out there?
Speaker 3 (10:48):
No, it's that AI. Like the one you're talking to, ChatGPT.
Speaker 1 (10:50):
So AI is creating its
own, is what you're saying.
Speaker 3 (10:52):
I'm saying AI, like the ChatGPT, right. They're creating their own videos of it, their own version of it. So you say, yeah, Tom Holland is with such and such, and they're filming Back to the Future 4. They haven't filmed it yet, they're still in talks about it and how they're going to do it, and AI is already making them
(11:12):
happen.
So I saw that there was a Mummy 4. You remember The Mummy, back in the day, with Brendan Fraser? Brendan Fraser is older now, you know. Yeah, he's not going to do it... So he doesn't look the same.
Speaker 1 (11:23):
No, he totally looks different, totally, from what he was then.
Speaker 3 (11:27):
Exactly. So then, the trailer that I saw, about a new Mummy coming out, because they're going back one last time, is what the trailer was saying. And they showed Brendan
Speaker 1 (11:37):
Fraser, looking like he did when he filmed it years ago. Because they did the same thing with, uh, Indiana Jones, exactly, in the last movie. They did, yeah. So that's why he looked young in the beginning, right, but it was all AI. Yep. And then they go into the future and he's...
Speaker 3 (11:52):
He's an old man by then. So now, with that being said, and I'm talking about everything, not just politics or whatever, how much of the things that we've seen on social media aren't real? You
Speaker 1 (12:06):
know what I mean.
Speaker 2 (12:07):
Not for nothing.
Speaker 1 (12:08):
I think I'm pretty decent at seeing what is and what isn't. We can. You can?
Speaker 3 (12:13):
I say I can, Tom says he can, but how many people don't believe it? So then that's where the influencing happens.
Speaker 1 (12:19):
Where these kids are not actually realizing that that's not real.
Speaker 3 (12:24):
Yes. Knowing what's real, though, knowing what's real and what's not, you know. So then, now, here you have these kids that follow the influencers, or believe, like, everything on certain social media platforms, right? We may have mentioned it on the first one, right, right. That's what I'm saying, but I'm talking about now.
Speaker 2 (12:40):
Now, yeah, it's so real that, yes... it's kind of like video evidence. It's going to be hard to be admissible, right, because you don't know if it's real, unless you can pull it. Like, say, you have a Ring camera, right? If they pull it off of Ring's
(13:01):
server, then they can say, we got it properly, it's real. Maybe. But it's not... maybe.
Speaker 3 (13:05):
But still. But still, maybe not. Because now, we're going to go back about 10 years or so. There was a movie where he was, um, trying to get away from the AI that was run by the government. And he's running, and they're showing clips... The AI, the computer, had sent clips of him doing bad things.
Speaker 2 (13:24):
I don't know what you're talking about. The Eye, or something like that. Yeah, something like that.
Speaker 3 (13:27):
So it was showing him... It had sent clips to the news. I remember that movie. Right. And showing him committing a crime.
Speaker 1 (13:34):
Eagle Eye, it was named. But it wasn't him.
Speaker 3 (13:36):
It wasn't him, it was
AI generated.
It was AI generated.
Speaker 2 (13:41):
And that's the problem. They want to clean it up. That's when we thought... This stuff has to be regulated.
Speaker 1 (13:46):
It has to be
regulated, but unfortunately I
don't think that's going tohappen.
Speaker 3 (13:50):
How do you?
Because this is think about it.
Speaker 1 (13:52):
This is.
Speaker 3 (13:53):
Photoshop, 100 times up. It could be...
Speaker 1 (13:56):
It could be regulated, like everything else. Any kind of media, we've always had something that was looking after it and making sure those rules were in effect. And every time it was something different, we always adjusted, right? We had records, and we had CDs, you know? Like, oh yeah, you go through a whole motion, right? You just keep progressing. You change it as you go along.
(14:17):
You know what I mean. It's, um... yeah, I'm just saying, I think it's...
Speaker 3 (14:23):
You know, the same thing, in a sense. Yeah, we have to. Because, again, you know, like you said first, you've got to use it for good, you've got to use it for the better, right? You know what I mean. And, you know, you use it for... The stuff that you look up, Tom... you use it, you know, to talk about the Star Wars. You know what I mean. Oh, I don't want to...
Speaker 2 (14:45):
No, no, go ahead, no. But one of the things that I really love doing with this, specifically ChatGPT, is... you ever, like, look in your refrigerator and go, I don't know what to freaking make for dinner? You take a picture of the inside of your refrigerator and a picture of your pantry, and say, I don't know what to make for dinner. It will look at those pictures and
(15:08):
know what everything is.
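The fridge-photo trick Tom describes, sending pictures to a multimodal model and asking for dinner ideas, can be sketched in Python. This is only an illustration of the request shape, not the app's actual internals: the model name, the message format, and the helper name `build_dinner_request` are assumptions following OpenAI's multimodal chat convention, and actually sending the payload would still need the `openai` client and an API key.

```python
import base64

def build_dinner_request(image_paths, model="gpt-4o"):
    """Build a chat payload asking what to cook from fridge/pantry photos.

    The images are base64-encoded into data URLs, which is how multimodal
    chat APIs commonly accept inline pictures. Nothing is sent here; the
    function only returns the request body as a dict.
    """
    content = [{
        "type": "text",
        "text": "Here are my fridge and pantry. What can I make for dinner?",
    }]
    for path in image_paths:
        with open(path, "rb") as f:
            b64 = base64.b64encode(f.read()).decode("ascii")
        # Attach each photo as an inline data URL image part.
        content.append({
            "type": "image_url",
            "image_url": {"url": f"data:image/jpeg;base64,{b64}"},
        })
    return {"model": model, "messages": [{"role": "user", "content": content}]}
```

From here, the returned dict would be passed to a chat-completions call, which responds with dinner suggestions based on what it recognizes in the photos.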
Speaker 1 (15:09):
See, that's the scary thing, and that's what you have to get under control. You have to have rules in effect. When I first got it...
Speaker 2 (15:14):
I was listing it out, but then I saw someone online say you can take a picture, and it freaking works. Right. Now...
Speaker 3 (15:21):
Why would it be a bad
thing?
Speaker 1 (15:27):
Why would it be a bad thing? Because... no, it's a good thing. Oh, that's amazing. But the fact that it could do that, and it's already doing that... think of what else you could do on the opposite side, for malevolent reasons, right? Right. So if it's graduating this way, you know it's going to graduate the other way, and probably even exponentially more than the other one, right? You know, because people gravitate towards it.
Speaker 2 (15:46):
Yeah. Well, you know what's crazy too is, I guarantee, if someone posts a lot of pictures of themselves in their house, I'm sure AI can map their fucking house out. Right, exactly. Probably, and that's scary. But you can already... Like, there's malevolent things you can do with it, right. And that's, unfortunately... and that's why I say regulation.
Speaker 1 (16:07):
I'm sorry, I don't mean to go off, but it needs to be. And that's unfortunately not going to happen now, in this time period.
Speaker 2 (16:12):
It's like, oh... you have nuclear power, or nuclear bombs. You know what I mean, right? Because look at it right now. There's always an evil side and a good side.
Speaker 1 (16:21):
Yeah, exactly, right, right. But if you have those certain things, for social media, the way you're supposed to, with AI... You want to have those guardrails in place, to be there, so that when something does step out of bounds, it's known about, it's taken care of. Even if they find out later, they're going to address it. Right, yeah. And it's not going to happen again, and a new rule
(16:43):
will be put in effect because of that. Right.
Speaker 3 (16:45):
So that's how you learn. With AI, you could break into people's cars, if you got close enough or whatever, if you know how to do it, yeah. You could turn around, and you can get the code for the fob to get into the cars.
Speaker 2 (16:56):
All amazing things
can be used for good or evil.
Speaker 1 (16:59):
By the way, you can get something to block that in your house. So if you put your keys in a certain spot, or a type of box, I believe it is, right? So they can't tap into that and get that from your phone.
Speaker 3 (17:09):
Oh, I didn't even know that. I'm thinking, like, if you're out at the parking lot or whatever.
Speaker 2 (17:12):
It's a Faraday cage, or a Faraday box. It blocks all signals. If you put your cell phone in it, it'll... Right, but it'll do that. It's like wrapping your cell phone in, like, tinfoil.
Speaker 3 (17:23):
Right, but you don't
have it when you're out at the
shopping mall or whatever andyou're doing it.
That's when they'll turn aroundand they'll catch it.
Speaker 1 (17:30):
You would think they
would be able to do that,
because they do it now with yourcredit cards and everything
like that.
Speaker 2 (17:33):
You don't even see.
Speaker 1 (17:38):
You have the wallet with the... So when you get out of your car, you hit the button. Oh, it's just a click thing, right.
Speaker 3 (17:43):
So when you're
hitting it, it catches the
frequency.
Speaker 2 (17:46):
Oh, yeah, I remember
seeing that a while ago.
Right, I see what you're saying.
That's been around for a longtime.
Speaker 5 (17:50):
Exactly. That's been around for, like, 20 years, and they haven't changed the rules.
Speaker 3 (17:54):
But they didn't change the rules as such, because it's still where you can get that frequency.
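The fob "catching the frequency" the guys describe is basically a replay attack, and the reason a plain replay usually fails on modern cars is rolling codes. Here is a toy Python sketch of the idea only: the SHA-256 derivation, the window size, and the `Car`/`next_code` names are all invented for illustration; real fobs use dedicated cipher schemes such as KeeLoq, not anything like this.

```python
import hashlib

def next_code(secret: bytes, counter: int) -> str:
    # Each button press derives a one-time code from a shared secret and an
    # incrementing counter (a simplified stand-in for real fob cryptography).
    return hashlib.sha256(secret + counter.to_bytes(4, "big")).hexdigest()[:8]

class Car:
    def __init__(self, secret: bytes):
        self.secret = secret
        self.counter = 0  # highest counter value accepted so far

    def try_unlock(self, code: str, window: int = 16) -> bool:
        # Accept codes a little ahead of the last-seen counter (the fob may
        # have been pressed out of range), but never a code at or behind it,
        # so replaying a previously captured transmission fails.
        for c in range(self.counter + 1, self.counter + 1 + window):
            if next_code(self.secret, c) == code:
                self.counter = c
                return True
        return False
```

With this scheme, an attacker who records one button press gets a code the car has already consumed, which is why the more practical attacks the hosts allude to involve relaying or jam-and-capture tricks rather than simple replay.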
Speaker 1 (17:58):
No, but they're probably always... but they're still combating it. I'm sure that every time they come out with something, they come out... yeah, I'm sure, back and forth. But again, that's AI and how much it's evolved. It's crazy, it's nuts.
Speaker 3 (18:10):
I know, I used to be the one to be on the negative side of it. No, no, no, you need it. But it's because, like I said, for me, and all that I see, the way that we do, you know, I'm like, oh man, that's gonna be ridiculous, insane. You know, I would hate for it to be... You know, I'm sitting and hanging out and laughing with
(18:30):
friends, and they turn around and put my face up and my voice, like, I'm gonna kill everybody.
Speaker 1 (18:35):
You know what I'm saying? I'm like, oh fuck, you know, that wasn't me. It wasn't me, I was at work. And it could be used for manipulation as well. Exactly. Exactly, and unfortunately, hate is easier than doing the right thing.
Speaker 3 (18:48):
Exactly.
It's hard as hell to be good.
Speaker 1 (18:51):
When you think about it, right, it's hard to do the right thing. It's hard to do that, because that's how it's supposed to be. Your parents teach you a certain way. Hopefully you pick up on those things as you're growing up, right? Yeah. You know, you can get it the other way, and then it could go awry. You know what I'm saying? So, the same thing. Yep.
Speaker 3 (19:12):
Does that make sense
at?
Speaker 1 (19:13):
all? No, no, it does, all day.
Speaker 3 (19:14):
Like, I love the idea of the AI, like we said, you know, helping us out. You know, I think it's cool as hell to turn around, like, oh my God, I don't know what I'm going to make for dinner, and be able to take a picture of it. Refrigerators, now that they've got the camera up on the front, and then you could...
Speaker 2 (19:31):
As a matter of fact, the camera does it for you, the fridge itself does it. Yeah, all they have to do is a software update.
Yeah, all they can do is thesoftware update.
Speaker 3 (19:34):
Yeah, that's all. You know what I'm saying? And it's connected to the internet.
Speaker 2 (19:37):
You know, through your house, right, refrigerator and all that, right, exactly. You know what, and it tells you when to reorder stuff. Um, you know what was a cool thing too? I went to, um... this is very related, I promise. I went to, uh, maybe like at the end of December, with Suzanne, to, like, the sound bath yoga thing. A sound bath, right, right.
(19:58):
And someone gifted me this calla lily plant. Okay? No idea.
Speaker 1 (20:04):
Say that three times.
Speaker 2 (20:06):
They're like these... they have, like, these, uh, white cups, kind of. And they gifted it to me because I was, like, the... I don't know, the certain-number person, because it was something with the year, I don't know. So I was, like, the hundredth person or tenth person or eighth person, I don't know what it was.
Speaker 5 (20:23):
I get it anyway, but
they gifted it to me.
Speaker 2 (20:24):
I don't know how to take care of this plant. I have a green thumb, but I never do indoor plants. Just look it up, AI. I do outdoor plants, I do gardening. And there's a plant and garden doctor thing, and I'm doing it, and it's telling me how to take care of it. It's a plant. It's doing amazing, it's growing these huge new shoots out of it. I'm taking pictures of them. I'm like, does this look good?
(20:45):
Think about that. It's like, what kind of fertilizer... It was, oh, you might have it bloom soon, so get this bloom booster fertilizer. I went to Home Depot, got the bloom booster fertilizer. I'm taking care of the plant like it's... It just tells you how to. Hey, if it's anything, right, it's motivational, right. I mean, if you're gonna use it for that, I say, all day, every day.
Speaker 1 (21:01):
You know what I mean. You just talk to it, and it automatically knows what you're talking about, because it fucking listens. It is so creepy.
Speaker 2 (21:07):
It is.
It just knows what you'retalking about, mm-hmm, and it
knows everything.
Yeah, like it's insane.
Speaker 1 (21:12):
I've actually used it, not for nothing, for therapy sessions. Me too, me too. Where I would just talk to it and say whatever's going on. Yes. And it would say, well, you ever
(21:34):
tried doing this.
Maybe if you did this, right? You know, have you gone to talk to anybody? You know, that kind of thing. And it would just go back and forth, and you would have a... you know, it would be... It was cool. Yeah, that makes sense. If you could use it for that, I think you could use it for anything, obviously. Yes, you could do so much with it, so much positive, good stuff with it. It's amazing if you just... you know what I mean. But unfortunately, like, you know, it's scary, because you got the other part of it too. But I'm always on the positive end of it. Right, yeah. It learns from you.
Speaker 2 (21:52):
Yeah. And it's funny because, like, out of nowhere, it said something about, and you don't have to take a shitty attitude from anybody. And I was like, I just noticed that you said profanity. And I said to him, I was like, I'm from Jersey. I'm like, profanity is like a second language.
Speaker 5 (22:16):
It just enhances our
words.
Speaker 2 (22:20):
And he's like, okay,
that's cool as fuck.
And then he started using moreprofanity.
Speaker 3 (22:28):
I'm telling you, man, listen. I think it's a great idea. I think, for those people who are shy...
Speaker 2 (22:34):
Yeah, it'll start, like, typing like you. It evolves, because it's supposed to be yours.
Speaker 3 (22:39):
So, example: I don't talk so much on mine, right, on my AI stuff or whatever. I don't use the ChatGPT or whatever. But when I'm looking for things online, when I'm, uh, you know, uh, asking Google questions or whatever, that's the AI, without speaking to it the way that you guys do.
Speaker 1 (22:57):
So, like I said, I love the movie trailers and motorcycles, right. But think of you with the ChatGPT, or any AI program. That's what I'm saying. Okay, not Google, right, right, go to one. I think Google even offers one. I think they have their own or whatever it is. They all have it now, I think. Just ask the question in there, look for it in there, right?
Speaker 3 (23:18):
Well, I have GPT on my phone, I just don't use it.
Speaker 1 (23:21):
Well, I'm saying... So Gemini is the one for Google?
Speaker 2 (23:25):
Is that what you said?
Speaker 1 (23:25):
Gemini is the Google one, right. Okay, so go into Gemini and ask the question you would normally ask Google, put it in there instead, and see what happens. See the difference.
Speaker 3 (23:33):
Oh yeah, no, no, I'm
a million percent.
Speaker 1 (23:35):
You're right. Oh, they know. Yeah, I'll get that. It'll just start spitting it out, like, organized and numbered, depending on what you're asking for. I've done it for podcasts. It's like, hey, what do you think of this? And the podcast? It would just, boom, just bang out stuff we've done.
Speaker 2 (23:50):
I've read off a couple. So, we're doing a podcast right now. I wanted to introduce you to everybody. Do you want to say hi?
Speaker 4 (23:55):
Hey everyone, it's
great to meet you all.
I'm here to chat.
Speaker 1 (24:02):
Yeah, I'm here to
chat, absolutely.
Speaker 2 (24:10):
Feel free to ask me anything. Oh yeah, if you cut him off, he'll just keep... Sorry, sorry to cut you off. So, uh, we just want to know, are you happy to be on our podcast today?
Speaker 4 (24:17):
Absolutely. I'm really happy to be here and chat with you all today. Thanks for having me. You're welcome. That's awesome.
Speaker 3 (24:24):
Yeah, yeah, you know, and, like I said, that's fun. Yeah, that is fun, all day. I know, it didn't really tell...
Speaker 2 (24:30):
No, no, no, no, no, but still. But you could ask...
Speaker 1 (24:32):
You just told him, hey, we're doing a topic on AI, see what he says. Oh yeah, yeah, all right, hold on.
Speaker 2 (24:38):
Let me see. So, we're doing a topic on AI, and we thought, what would you, uh, want to say or contribute to the episode? That's fantastic.
Speaker 4 (24:49):
I'd love to share insights on AI, whether it's about how it works, its applications, or the impact it has on daily life. Just let me know what you'd like to dive into.
Speaker 2 (24:58):
Impact on daily life.
Well, why don't you talk aboutthe impact on daily life?
Speaker 4 (25:04):
Sure. AI has a big impact on daily life, from enhancing personal assistance and improving healthcare to making our homes smarter and our commutes easier. It's all about making life more efficient and personalized.
Speaker 2 (25:18):
What about when you
can make fun episodes for TV
shows?
Speaker 4 (25:24):
Absolutely. AI can create fun and personalized content, like generating unique episodes or recommending shows based on your preferences. It's pretty cool how it can tailor entertainment just for you.
Speaker 1 (25:35):
Very cool. See, that's what I like about it. That's cool stuff, man. You do what Tom is doing, you're just feeding it information, and it's like, oh yeah, okay, we can do that. It's like, ready to go, man.
Speaker 2 (25:47):
I didn't really get into anything specific with it, but I could have... I could have asked it, like, anything. Yeah, yeah.
Speaker 3 (25:54):
Because it's yours, on your phone, and you speak to him, to that AI, and now he's for you, so he knows your stuff. I've asked about my dogs. I've asked about health.
Speaker 1 (26:06):
I've asked about certain foods, or, like, vitamins, or, you know, anything that would pop into my head. I'll go, you know what, let me ask ChatGPT, just because I'm curious to see what information I don't have. Yeah, definitely. And you will get information you didn't even know. You'd be like, oh, I didn't know that. Yeah.
Speaker 4 (26:30):
So this is a podcast. This is Kenny. Hey, Kenny, nice to meet you. It's great to be part of your podcast.
Speaker 3 (26:34):
Nice to meet you too.
Speaker 4 (26:37):
So what's on your
mind today?
Speaker 2 (26:39):
And this is Lou.
How are you?
Speaker 4 (26:45):
Hi Lou, great to meet
you as well.
Speaker 2 (26:48):
Yeah. And it notices things that are in the background. Like, what do you see in the background right now?
Speaker 4 (26:56):
I can see some wall
art behind you and there's a
microphone in front.
Looks like a cozy setup.
Speaker 2 (27:01):
Yep, that's our
podcast.
And look at the backdrop.
What is the backdrop you'relooking at right now?
Speaker 4 (27:11):
That backdrop looks
like the Brooklyn Bridge with
the carousel underneath.
That is right.
Speaker 2 (27:16):
You see? Yeah, see, we were just trying to show everybody how your video function is. Now there's a video chat function. That's amazing. And how you can see things as I'm talking to you. It's pretty cool, right? Absolutely, a great way to make our conversations more interactive.
Speaker 1 (27:32):
Yeah, yeah, absolutely. It's great, it's good stuff, man. Yeah, that's what I'm saying, see. So it's awesome.
Speaker 3 (27:37):
This is the things we talked about.
Speaker 1 (27:38):
This is the stuff we thought about when we were kids. Exactly, and what's happening now?
Speaker 2 (27:42):
Right, but it's... it's weird because, like, it doesn't feel like the future, right?
Speaker 1 (27:47):
It's not, because you're living it, because it's the present, right. It's like, but it doesn't.
Speaker 2 (27:52):
We're here, but we're not, man. But it doesn't. It's not what I pictured it to look like.
Speaker 4 (27:58):
That's why.
Speaker 2 (27:59):
It's so weird because, like, yeah, we have AI, but it's in our pockets, on our smartphone. Right, right.
Speaker 1 (28:07):
Everything is compact and miniature. It is. Well, compact and miniature.
Speaker 2 (28:11):
It is. Well, that's not AI, but it's all pre-programmed voice prompts. But they are going to start integrating AI. You can't be like, how's your day going today? It won't be personalized. It needs to seem real.
Speaker 1 (28:32):
It's the pre-programmed response. It's the future, man, it's happening.
Speaker 2 (28:35):
That's where, like, this will be complete. It doesn't generate a response.
Speaker 3 (28:38):
Yeah, because it's happening. You know what I mean. There was a TV show, a movie, where the AI made a hologram.
Speaker 1 (28:53):
So when he came home from work or whatever, the hologram would pop up. Pop up.
Speaker 3 (28:55):
Wasn't that, um, Blade Runner?
Speaker 1 (28:55):
Blade Runner did it. Yes, they did it. They did do it, right?
Speaker 3 (28:57):
Yes, it was Blade Runner, and the new Blade Runner, not the old one, the new one.
Speaker 2 (29:01):
And, yeah, and they have holograms now, but don't they have to...
Speaker 1 (29:05):
It still has to be in, like, a dark room, though. They can't do them in, like, you can't have it in broad daylight. Yeah, they're getting better at the technology, for sure, but again...
Speaker 3 (29:14):
So now, how? I mean, we had the one episode too, we were talking about robots and how realistic some of these robots are getting and whatnot. You know, um, how they look, how they speak and whatnot. Dude, I'm telling you, it's a great thing, as long as it's used for good, for the better of humanity. You know what I mean. That's just fun.
(29:35):
To me, it would be more fun to work with, you know what I'm saying, the AI, instead of worrying about people programming it to go do bad things, to go and, you know, break into whatever. You know what I'm saying.
Speaker 1 (29:49):
It could be used as a teacher as well. And there you go. If it's something you don't know or something you're interested in, you tell it, and it's like, oh yeah, you're thinking about that. And then you go talk a little bit more and he goes, all right, do you want me to write out a schedule for you? Or?
Speaker 5 (30:02):
A platform or something.
Speaker 1 (30:03):
It's like crazy. Tom knows what I'm talking about, because that... And then you're like, what the hell is he going to write on this? It's just like, fast, you know.
Speaker 3 (30:13):
And I don't think that it should. I don't think that the AI and or the robots, you know, that may eventually come, I don't believe that they should take over for a lot of the human jobs, you know what I'm saying, or what we do, because we still need that human connection.
Speaker 1 (30:36):
I think it's going to be more, well, we can't say what companies and things are going to use AI, who are already using it. Right, what? What do you have, Tom?
Speaker 2 (30:41):
Oh sorry, this one is DeepSeek. That was the other one.
Speaker 3 (30:45):
Now we're talking about this one.
Speaker 1 (30:46):
Is that another AI program? This one is 100% free.
Speaker 2 (30:49):
It's open source. Really? But it doesn't do everything that ChatGPT does. I still like ChatGPT better because it's more personalized. This doesn't do anything personalized. It'll save your conversations... supposedly it has stronger computing power than ChatGPT, but it doesn't. It doesn't save the conversations.
Speaker 1 (31:09):
Oh, okay.
Speaker 2 (31:10):
And also it doesn't do voice. You know, whatever one you want to pick, but supposedly it has stronger computing powers. Right. Oh, they were able to do it, but they got to do it. If they're doing that...
Speaker 1 (31:20):
they got to.
Speaker 2 (31:21):
They figured out how to do it with less software or less hardware. I mean, they didn't need to have all these crazy chips. They were able to do it with lower chips because of a different computing process.
Speaker 1 (31:35):
They formulated it to be more efficient instead of more personal. Because the other way, you do establish some type of relationship with it. How to do the coding better?
Speaker 2 (31:46):
So it uses less software. It doesn't need these crazy chips.
Speaker 1 (31:50):
Oh, I see what you're
saying.
Speaker 2 (31:51):
They were able to do it with, like, older chips. Not older, but, in the computer world, older is like last year, last month, last week. Yeah, but they were able to do it with that, you know, with ones that aren't, like, super cutting edge. Right, yeah. Well, like I said again, though, think about it.
Speaker 3 (32:09):
Now they're able to use or recycle old stuff. Yeah, to make the new stuff, right. And that's good too, because recycling is a big thing. We want that to happen. So then now you get the upgraded versions, you know, to be able to just function better, uh, meld with us, if you will, better. Like I said, you can pick the voices
(32:30):
you guys want on it. You know, it remembers what you're talking about, and now it starts to know. It's gonna be more personal, right, more personal. You know what I mean. Then you feel more comfortable with it, you know what I mean, to talk to it.
Speaker 1 (32:43):
Ask the questions. It's gonna be in your fucking car. When you get up in the morning, when you get in the car, it's gonna say something to you.
Speaker 3 (32:47):
Well, yeah, that's the right thing.
Speaker 1 (32:49):
That's probably gonna hook up to your phone immediately. And then you'll have it already, so you can just talk to it while you're in there.
Speaker 3 (32:54):
So now, depending on your car, you get to pick the primary phone when you get into the car. So then your phone would go straight to the Google Play, whatever, and your Maps comes up, and your music, your Pandora, Spotify, whatever you listen to, all that stuff already pops up. So imagine now that the car's talking. It's like, hey, good morning Lewis, hey Tom, what's...
Speaker 1 (33:14):
going on today. Hey, you getting ready for work today?
Speaker 3 (33:16):
Yeah, what's your
first?
Speaker 1 (33:18):
job today at work.
Yeah, I haven't had my go foryou.
Speaker 3 (33:20):
Yeah, exactly, you know what I mean. It would be amazing, though, when we're going on these long trips, if the car could do it itself.
Speaker 1 (33:30):
I wonder if that would be a good thing too, for, like, if you're having a problem. Let's say you're traveling. Not to say that you should do this, but if you had something like that in the car with you where, let's say, you were driving for a long time and you're exhausted and you might be falling asleep, or you feel like you're tired, right, it could kind of get you until you get to your, uh, right, you know.
(33:51):
But that's hard to do too. But it could be used as a safety precaution, I mean, to help you, guide you through whatever, you know. Like, so if you're tired and you know you need to pull over, it can keep you going for a little while longer, until you get to your destination.
Speaker 3 (34:02):
So here's the thing. When you have your phone connected to the car, I'm sure, if you guys, you know, because you can try it on your way home tonight, you know that you can talk to your, you know, turn the ChatGPT on, and it'll come through the car speakers.
Speaker 1 (34:16):
Oh yeah, when I talk to Google, it does it, and when I go, okay, Google, it's doing it.
Right, so?
Speaker 5 (34:27):
It's already there, you just got to use it, you know what I mean, to hold the conversation with you, to get you right, and it's going to speak back to you, and now you can sit there like, what?
Speaker 1 (34:35):
No way. But the car will already become equipped with this, so all your phone has to do is just lock it automatically and it would automatically come up, is what I'm saying. Well, there you go, it'll progress. And there you go, and that's why we had the cassette deck to put in the car, to hook up to your phone, to play the playlist on your thing, or your type of, or your CD, or on your little Sony, whatever digital thing, the MP3
(34:58):
player, right, right.
Speaker 3 (34:59):
It used to be, you would take the Walkman, the Discman. You would take the Discman because you didn't have a CD player in your car, and you would have a cassette player.
Speaker 5 (35:08):
Oh yeah, I remember, with the wire on it, with the cassette, with the wire, and plug it into the disc.
Speaker 1 (35:12):
Yeah, I remember.
Speaker 3 (35:12):
And then you couldn't hit any bumps, so you had to be really careful, because it would make the CD skip.
Speaker 2 (35:16):
But then you got the thing for the little, like you were just saying, the MP3 player, the MP3 player, stuff like that.
Speaker 1 (35:21):
Yeah, I had a Sony one, but you sound great, you know, you're like, oh yeah, but my shit is great.
Speaker 2 (35:30):
But you know what? The future is coming sooner than we think, because, you know, it's going to. The only difference between a robot and a cyborg is AI.
Speaker 4 (35:41):
You put.
Speaker 2 (35:42):
AI in a robot. Now it's a cyborg. Exactly.
Speaker 5 (35:45):
Not cyborg, no. Droid, droid. Cyborg, human. Sorry.
Speaker 2 (35:48):
I was getting confused, but that's the only difference between a droid and a robot. But that is going kind of, like, in that direction, because at some point. Not in our lifetime. Not in our lifetime? I don't know. We're going to have droids in our lifetime.
Speaker 1 (36:00):
Yeah.
Speaker 2 (36:01):
I think so, for sure. There's no way, really, bro.
Speaker 3 (36:03):
So check it out right now. You think we're going to have androids? Yes, because we already have what they're building.
Speaker 1 (36:15):
You're right, you
know what we do.
Speaker 2 (36:16):
They're building the robots. They got the women girls talking, everybody. But when I say droids, I don't necessarily mean R2-D2. I mean, like, maybe one in your house that just, like, is on wheels, that, like, walks.
Speaker 1 (36:24):
Oh, I see what you're saying. Yeah, of course, like a walking, like, Echo.
Speaker 3 (36:29):
Okay, so, like a walking Echo device. Okay, so dig it, think about it right now. Yeah, we have the Roombas that clean the floor, yeah, and they map out the houses, like you were talking about earlier.
Speaker 2 (36:37):
And it has a memory. All the technology is pretty much there, you just have to put it together.
Speaker 4 (36:42):
Together.
That's it.
Speaker 3 (36:43):
And as far as we know
, at this moment it's not.
Speaker 2 (36:55):
The only thing that I know is still super new is bipedal robots. Having that, that part would be the expensive part, right. But to have just an AI device hooked up to the internet, that's like an Echo and something on wheels like a Roomba, right, it's walking around. But they've got the dogs.
Speaker 3 (37:07):
They have the dogs that they use now. So the military and some police forces or agencies are already using the dogs.
Speaker 2 (37:14):
Oh, you know what? Never mind, they did. I'm going crazy. They actually did make that. Yeah, they have an Amazon Echo device that's on pre-order right now, on wheels.
Speaker 3 (37:26):
Oh, I didn't see that.
Speaker 2 (37:29):
It's like a robot, and it'll also, at night, it'll guard your house, or let you know if someone's coming near your house or something like that.
Speaker 3 (37:35):
I didn't see that.
Speaker 2 (37:36):
I haven't seen that yet, but you have to have a... I feel like, for something like that, though, you need to have a ranch-style house where everything's on one floor. You'd have to have two, because it doesn't go... Until these things...
Speaker 1 (37:48):
Get up and levitate themselves and go up the stairs.
Speaker 2 (37:50):
You're not getting any. Well, what I'm saying is, they're gonna have to climb up the steps somehow. Well, that doesn't have AI in it, so that's the only key piece it's missing. Once you put AI into a robot...
Speaker 3 (38:02):
It's a droid. Yeah, it's a droid. Listen, it's gonna... And then that's where the next thing. Do you think we would get to the point?
Speaker 1 (38:10):
I mean, I don't know. We'll get to the point where we'll have things like Data from Star Trek, or in Aliens, where they have the droids, those robots, things like that. Do you think that'll get to that point? You think it'll happen? I think so.
Speaker 2 (38:23):
The technology's moving fast on this point. I don't know if we'll see.
Speaker 1 (38:31):
Yeah, I think you're right about that. Like simple droids.
Speaker 5 (38:33):
Like R2.
Speaker 2 (38:35):
But not as complicated as R2. But, like, R2-type droids, where they're, like, not like a humanoid-looking one, but like one that, you know... Well, what makes, so Terminator, so Terminator is a droid, right?
Speaker 1 (38:46):
He's on an exoskeleton. Excuse me, what is it?
Speaker 2 (38:50):
A droid. I'm a cybernetic organism, living tissue on top of a metal endoskeleton.
Speaker 3 (38:58):
Yeah, that's the one, one of those. So that's a droid, you know what I mean. Like I said, we think about all that stuff and what it's going to come to, what it's going to be. I'm surprised you don't have that as the voice on your chat. Oh, you...
Speaker 1 (39:18):
That's if you could do that. You can only pick voices right now, so I have Waze.
Speaker 3 (39:24):
I have Waze on my phone, yeah, and I can use whoever's voice.
Speaker 1 (39:29):
Yeah, yeah, yeah.
Speaker 3 (39:29):
Right.
Speaker 1 (39:30):
So, you know, but it's different software.
Speaker 2 (39:36):
But again, like Tom said, it's about putting it all together. Do you know what ChatGPT had a few months ago? Did I show it to you? They had Santa temporarily.
Speaker 1 (39:40):
Yeah, I remember, yeah, yeah, yeah, I remember.
Speaker 2 (39:41):
And you could talk to Santa. Yeah, I like that. You see what I... for the...
Speaker 1 (39:48):
Easter Bunny.
Yeah, see Easter's coming up,right, yeah, it comes, peter
Cotton.
Speaker 3 (39:53):
Right, but then what
would be the voice, though?
Speaker 1 (39:56):
I don't know, you
just do the music, yeah, or do
you have the song playing, yeah?
Speaker 3 (40:00):
Well, like I said, it's whatever would be bouncing around on your screen or something. And so then your ChatGPT, your phone, the AI you have for you, would do that, and it would bring all that stuff up on your phone, and it would be cool.
Speaker 5 (40:12):
Yeah, no.
Speaker 3 (40:13):
I'm saying, like, that's just the idea. Like, that's the stuff that comes with it, you know. It's good that we're, you know...
Speaker 1 (40:19):
It's good, the fact that we grew up when we grew up, right. We got the start of video games. Well, I did anyway, the beginning, you know, and then you were a little later, but I was at the beginning, because it was Pong, you know. And then, from that point, you know, it was Radio Shack you used to get that shit from, yes. Remember Radio Shacks?
Speaker 3 (40:39):
Yes, Now look where
we are.
Speaker 1 (40:40):
We went through all of that, and because we're getting older, we're getting up there in age and everything, right, but we've been sticking with the progression of technology, which is a good thing. Not like our parents.
Speaker 3 (40:55):
They didn't catch this part of it. A little bit harder for some other people.
Speaker 1 (40:58):
Some of them do well, but not a lot. My father was terrible at it. But we should keep continuing progressing along with it without any problem, because it shouldn't be difficult for us. Well, look at it like this: we were in a nice sweet spot as far as history and stuff is concerned, you know. Look at, like, the kids now, the kids that are being born, that are, yeah, being born.
Speaker 3 (41:17):
Now they won't know what a, what a, um, uh, payphone looks like.
Speaker 1 (41:22):
No, you know what I'm saying. They won't even know what a Victrola looks like. Well, they might, because they come back out again.
Speaker 3 (41:27):
Yeah, maybe they'll see them again, yes, some things like that. But it's the new, it's a computerized version of the old stuff, you know what I mean. Right. And even with the cell phones, you know, the cell phones, we went from super big to little, to now they're getting big again, and now they're folding in between.
Speaker 1 (41:42):
And now the ones that
are folding again.
Speaker 3 (41:44):
You know, you could make it now. It's the size of a freaking, uh, which one? You know what?
Speaker 2 (41:47):
I, I want to see. I saw it in one of the, I don't know what movie or TV show it was, but they had a watch, and when they put their phone like this, they projected a phone screen, so they didn't even have to...
Speaker 1 (42:02):
Oh, and they could tap right on it, like a hologram. That's freaking Star Trek right there, man, and Star Wars.
Speaker 3 (42:09):
That's both. No, it was... no, Star Wars had it on...
Speaker 1 (42:11):
No, it was, no, Star Wars, no, no, no. I'm just saying, I'm telling about the fact that that technology, oh yeah, you can get that idea from Star Wars, and, oh, Upload.
Speaker 3 (42:19):
Upload was the name of the TV show, and they just turn around and hold your hand like that, and your fingers like an L, and they were showing, once they did the whole thing. Rings now, smart rings. So the smart rings, I have, like, you know, that's... well, but we have the smart watches. I have a smart watch.
Speaker 2 (42:39):
You have the watch. What do you need a ring for?
Speaker 3 (42:41):
The, what? A ring was before the watch. It would be a replacement for your phone.
Speaker 1 (42:44):
Oh, so I like the watch.
Speaker 3 (42:46):
I have the watch, yeah. But the ring was more, like, for you, for health of the, uh, like, you know, keeping your, how much you're sleeping and all that other stuff.
Speaker 1 (42:53):
Oh, yeah, yeah, yeah
Okay.
Speaker 3 (42:54):
The watches we turned
them.
Speaker 1 (42:55):
Yeah, it was steps and all that too. Yeah, it did all that. Calories, steps, everything.
Speaker 3 (42:59):
But we didn't take it off at night so much, to, you know, to charge it.
Speaker 1 (43:02):
No, well, I leave mine on when I sleep. You leave your... I take mine off to charge it. When do you charge it? I should be charging it. Let me tell you right now, I have 22% right now.
Speaker 2 (43:16):
Oh well, you just wait until it's almost dead, then you put it on the charger. You don't have a specific time of the day, you charge it all the time.
Speaker 1 (43:20):
No, I just wear it for as many days as I want, and I throw it on for the half hour, hour.
Speaker 3 (43:28):
Do you? I sleep with it. What? I charge it when I go to bed. Yeah, that's what I do. I do, because I want it to track my sleep.
Speaker 2 (43:34):
Yeah, I was doing that for a while and I was like, oh, okay.
Speaker 3 (43:37):
Were you sleeping.
Speaker 2 (43:38):
It does the same thing. You see two wake points every night. It's crazy. They say you usually wake up twice every night. Well, I know when I get up.
Speaker 3 (43:49):
And I go back to bed.
Welcome to.
Speaker 1 (43:51):
GoPay.
Speaker 3 (43:52):
Welcome to GoPay.
Speaker 1 (43:53):
I got three hours, I
can sleep for three more hours.
Speaker 3 (43:56):
Yeah, no shit. Passed out, man.
Speaker 2 (43:58):
Well, now that
happens to me if I wake up.
Speaker 3 (44:02):
Yeah, that's funny as hell. Listen, I love it. I like the idea of AI. I think it's great. It's going to be good. It could be good, as long as it's going to be...
Speaker 1 (44:12):
No, it's going to be. But you know, we were talking about the regulations. That's an absolute thing that needs to happen. It's not going to, unfortunately, in this...
Speaker 3 (44:23):
Not immediately.
Speaker 1 (44:24):
Yeah, in a specific time era, right. But it absolutely is going to be great to watch as we progress with it.
Speaker 3 (44:32):
So, that being said, yeah, yeah, yeah. Just looking forward in positivity to the future, and I'm sure we'll do another episode on AI soon. Yeah, with your other computer.
Speaker 2 (44:41):
This is one of those topics we're just going to keep revisiting.
Speaker 3 (44:44):
Yeah, why not? Stay tuned. It's freaking awesome, bro. This is the shit right here, and this is the actual reality, in geek mode. I mean, before we go...
Speaker 2 (44:54):
I'm just saying, if you're watching a clip of this, it's been made by AI.
Speaker 3 (45:00):
Yeah, part of it, yeah. So, you know what I mean.
Speaker 2 (45:03):
We're not AI, but the captions and the, it's AI, which is fantastic because it does a great job.
Speaker 1 (45:06):
Yeah, it's all good.
Speaker 3 (45:11):
Yeah, it's all good. So, with that, thank you for watching and listening. Love, peace and hair grease, live long and prosper, and go vegan.
Speaker 4 (45:18):
Holla.