
July 27, 2024 • 100 mins


Can AI truly become self-aware, or will it always remain a tool for human convenience? This week's episode of Timeless Talk brings back Recurring Guest/Former Co-Host B Money$ to help us tackle this intriguing question. Fur1ous and B Money$ dive headfirst into the heart of artificial intelligence, spurred by your votes from a tightly contested Instagram poll. We explore the tantalizing possibility of AI achieving a state of self-awareness, reminiscent of sci-fi classics like The Terminator's Skynet & Futurama, and reflect on how media portrayals & real-life advancements have shaped our understanding of this fast-evolving technology.

From the curious case of Facebook's Bob & Alice creating their own language to the ethical dilemmas posed by deepfake technology, our conversation spans the full spectrum of AI’s potential and pitfalls. We dig into the world of wearable tech like Google Glass & Meta glasses, which promise futuristic functionalities like real-time object recognition. We also ponder the impact of AI on creative industries, examining the role of apps like Gemini and Arvin in revolutionizing graphic design and content creation. The highlight? A fascinating discussion on AI’s ability to generate photorealistic images, & even scripts, raising both excitement and concern within Hollywood.

Our exploration wouldn't be complete without addressing the serious issues of AI-driven surveillance & privacy. We shine a light on China's advanced AI surveillance systems, pondering the moral ramifications of such pervasive monitoring. Could we be heading towards a future where minor infractions are automatically penalized by AI? As we navigate these complex topics, we also touch on lighter subjects, like the potential for AI in mind control and dream-reading technology. Join us for a thought-provoking journey into the transformative world of AI & its far-reaching implications for our lives.


*Exploring the Future Implications of Artificial Intelligence
*Emerging AI Technology & Conversational Applications
*AI Technology & Creative Applications
*AI Impact on Graphic Design & Entertainment
*Future AI Applications & Ethical Considerations
*AI Surveillance & Social Control
*Exploring Dreams & AI Technology
*The Risks & Dangers of Deepfake Technology
*Business Troubles & Moving On


*Intro Beat Credit: Memnoc (Picasso)*
*Outro Beat Credit: JJ got Beatz*


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:07):
And welcome back to the conversation that never ends. This is Timeless Talk.

(00:28):
I'm your host, AJ, and we're going to get to the guy on my right. I see a familiar face here, for those that have been with us for the first four seasons of Timeless Talk. Before we introduce this man to my right, to my left we have Furious; he took over the gunshots.

Speaker 2 (00:49):
I've been gone long enough, I get it. That's true, that's fair.

Speaker 1 (00:52):
Okay, for those that have been watching us since season 2, you guys already know who this man is to my right. For the ones that are listening, you'll recognize his voice as soon as you hear it. But we have a special guest, B Money, returning to the show. Yeah, good to be back.

Speaker 2 (01:09):
I had to give you the horn too. I had to give you the horn, brother.

Speaker 1 (01:11):
It's good to have you back, man. Thanks for coming back with us, bro. Well, today's topic, guys, we're excited to share with you. We did a poll on Instagram. It won by one vote. Shockingly. Did it surprise

Speaker 2 (01:23):
everybody that it won, or not?

Speaker 1 (01:24):
Really, it didn't surprise me, but I'm just glad Gypsy Rose didn't win. Crazy, because as I posted the poll on our Instagram, right, I was watching, I was keeping an eye on it. It's a 24-hour poll, right? 24 hours of votes, and then, by the end of the 24 hours, you get to see which topic won, and that's what we're going to go with today. The thing is, it was a lot closer than me and B Money expected.

(01:44):
He thought it was going to be more of a blowout. Obviously, the transformation of AI sounds more interesting to talk about than what happened to Gypsy Rose, at least recently, right? Yeah, she's pregnant, congratulations.

Speaker 3 (01:57):
That story's crazy.

Speaker 1 (01:58):
It is crazy, though. So what we're going to do is, we're actually going to talk about that next week. The way I do the polls is, you guys are going to get both. So the people that voted for Gypsy Rose, don't worry, that story is coming. We just basically use the polls to determine which one we talk about first. It's not a bad way to handle it. I think it's cool. It's a very interactive way to get you guys involved with Timeless Talk and what we do here on the show, and I figured,

(02:20):
you know what, I'll bring it back. We used it last season. It was successful, and so I was like, let's bring it back. It was a fun way to let you guys choose what you want to hear next, rather than just us picking all the time. I like it. Yeah, man, thanks for coming back, B Money. Appreciate you being here with us. The topic, obviously, guys, like I said before, is the one that won, which is the transformation of AI.

(02:45):
It's something that's transforming faster than I think we expect it to, right? I think the sophistication, we might be taking it for granted, or we might be loosely understanding it, even though we've seen it in movies and media. I get it's exaggerated, but AI is something not to be trifled with, and it is its own intelligence, right? I guess my understanding of it is that it is a consciousness, right, that is able to think for itself to a degree. Well, not consciousness.

Speaker 2 (03:05):
The big thing was, you know, does AI stand for artificial intelligence, or is it augmented intelligence, meaning that you give it what to learn, it learns it, and it puts it back in its own formula, right?

Speaker 1 (03:18):
Well, yeah, I was going to ask you guys this next question. You kind of answered it already, B Money.

Speaker 3 (03:28):
What do you think, Furious? What do you honestly think AI is? What is it? What do you understand it to be? I think, uh, it's just... it's a system that we're trying to make, like, self-automated, I guess. It's almost like it's supposed to, like, run itself type of thing. You know, it just makes our work easier. It does.

Speaker 1 (03:44):
Okay, so you don't think that it necessarily has a mind of its own, or that it can think for itself?

Speaker 3 (03:48):
I mean, we're the ones who program it to think. You know, I think, um, when it gets a mind of its own, that's what they call, like, a singularity or something like that. That's what people estimate to happen.

Speaker 1 (04:06):
You know, eventually it's gonna get so much knowledge to where it becomes self-aware, and then that's, like, you know, Pandora's box at that point. Skynet. Yeah, Skynet. And I remember, because we had my cousin Arm Shark come back on the show briefly, right, last year around May, and he basically kind of... he's like, yeah, it's gonna kill us

(04:28):
one day, but in the beginning it's gonna be fucking awesome. He's like, I want a Futurama-esque future. Do you guys see that being a possibility? He's like, I want robots to be walking around, kind of how Bender was, having his own fucking consciousness. Basically, he makes his own decisions, he can commit crimes, but he can also be a good person. He was like, I want that kind of future, that's what I want to see in my lifetime. I was like, well, maybe not to that extent.

Speaker 2 (04:48):
Well, with AI, I feel like I have time blindness. Like, not all of this is happening now; this happened years ago. Remember when Honda had the, I think, ASIMO the robot? Yeah, that was AI. Oh shit, that was, what, 2008? They had that. So I mean, the AI we're getting now is beta, programmed and

(05:09):
tested, and newer and newer and newer and newer. So now we have the most advanced pocket AI now.

Speaker 1 (05:14):
So yeah, this thing right here has AI on it now, obviously. So we've come a long way, for sure, and that's what makes it kind of scary, man. I think the worries about it...

Speaker 3 (05:27):
A lot of the worries are mostly from, like, you know, film and media. You know, it's just us twisting it. You know, I got some real scary ones we'll talk about. And I'm looking forward to those.

Speaker 1 (05:34):
I remember I talked to B Money off the record. That's why I asked you to be back, especially for this episode, because I knew you had some stuff that you wanted to discuss, that we talked about off the record, and I was like, dude, that sounds crazy. But I'm going to start with Furious. For AI, where do you see it being in five years? How do you think we're going to be using it, and do you think it's realistically going to, maybe within five, 10, 11, 12 years from now,

(05:57):
eventually land us in, like, a Terminator-esque future? You know, where they're, like, fucking taking over and they're killing us and shit? They'll take over, they find themselves... or, like... sorry, to add to the question, sorry, I don't mean to cut you off, but, you know, what was this from? You guys might know the media. I think it was Terminator. I think it was Terminator 3.

(06:19):
I don't remember exactly. But, um, I think he says, it's in your nature to destroy yourselves. What if they start having that? That one was the second one, right? Yeah, yeah, of course, T2. But yeah, he said that, right? It's in your nature. What if AI starts to think that? That's my question. Do you think eventually they'll get to that point?

Speaker 3 (06:34):
Or, um... you said in, what, like, five years? Where do I think it's going? Yeah?

Speaker 1 (06:38):
It's like a three-parter question, sorry, man. Like, where do you think we're gonna be with it in that time?

Speaker 3 (06:42):
It's definitely going to be more in everything, like in more appliances. I mean, we already use some now, with navigation and shit like that. You got different automated systems, just things that make our lives a little bit easier. Those are all governed by some type of AI. But in five years, I don't think it's going to... personally, I don't think it's going to be that crazy, unless

(07:04):
whoever's programming it, you know, tries to give it these extra capabilities, you know. It's like, what do you need

Speaker 1 (07:10):
those for, right? I mean, is that going to exist? Why do you need that? Maybe to help with problem-

Speaker 3 (07:14):
solving. I think part of it... I think the intention is always good, but I don't think people really... no one really knows what can happen.

Speaker 4 (07:24):
That's true.

Speaker 3 (07:25):
Like, I mean, if you give it, you know, self-automating capabilities, like, to actually solve problems, or, let's say, create its own problems to solve, you know... I think once you start doing that, you give it a different sense of awareness. You know, but I don't know. I mean, I know nothing about programming and shit, so I really don't know a lot of this stuff.

(07:45):
I mean, it's just imagination, you know what I'm saying? I don't... I mean, that's fair. You're going off your understanding. Yeah. Like, I mean, but, like, do I think it's going to be, like, Terminator or anything like that?

(08:08):
I don't personally know.

Speaker 1 (08:10):
Okay, so that answers that part of the question. I don't think it would get there. Okay.

Speaker 2 (08:31):
What's telling you it's in your nature to destroy yourself? When they try to kill you. So we've had horror stories of, uh, like, self-contained AI units, and you talk to it, and, of course: how do you feel? How do you feel about people? It says, like, people should be locked in a people zoo, because they're dangerous. Sure. But then you're always going to have a team, a team of programmers, so you're not going to have that one mad scientist.

Speaker 1 (08:43):
That's true, it's going to be a team of people.

Speaker 2 (08:45):
You know, you're going to have teams to code and do things, and we're always going to be the beta testers, right? So when the new iPhone comes out with the AI, you're going to be the beta tester. What works, what doesn't.

Speaker 1 (08:55):
What are your complaints, basically? From there, we're the... I would say...

Speaker 3 (09:01):
A good example, when it comes to media, is the movie Ex Machina. That's a very good example.

Speaker 1 (09:06):
Bro, and his machine ends up killing him. Spoiler alert, for those that haven't watched it. It's been out so long, it's not a spoiler. Fuck it, you should have watched it by now. I know, it's like saying you didn't watch I, Robot. That was one of the first. Not to spoil it... it is a good one to watch, though. I'm not complaining.

Speaker 2 (09:25):
In five years, though, I think it's going to get more advanced, more hands-on, fun things. Like my phone now: I can have my AI set a dinner reservation for my wife and I, and it'll call the company, set a table up for us. So there's good benefits to it that are going to be there that you could use.

Speaker 1 (09:41):
Oh yeah, 100%.

Speaker 2 (09:43):
I can have it make me a picture of something I might want, through trial and error.

Speaker 1 (09:49):
You're right, minuscule tasks would be a thing of the past. Clean the cat litter, feed her.

Speaker 2 (09:53):
Do that, too. I have AI robots that vacuum my house every day.

Speaker 1 (09:58):
They already have those. They do, right? You can get them on Amazon. They fucking drive around, yep.

Speaker 3 (10:02):
Those have been out for, like, 15 years.

Speaker 1 (10:04):
Those have been out, yeah. See, so little innovations like that have been seeping through the cracks a little bit, right? People are noticing it. The ones that are obviously conscious and paying attention are noticing these things.

Speaker 2 (10:18):
Other people are just like, oh, I just thought it was a little convenient. One-station robot hits a wall, vacuums. That's true. Mine now, once they learn my whole house, they know what chairs are here, and they can ask me, hey, you know, my brush is full.

Speaker 1 (10:31):
I can't really say this, but at our place of work, there's a robot that travels the floors like that, right? Three of them now. I've seen them; I know that now it's three, right, walking through the halls with what looked like a video game controller, and they were driving it, scanning the floors, mapping the floors. So now this robot knows, cause at first, you know, someone that didn't see that would think, damn,

(10:52):
how does it just know where to turn left at, where to turn right at?

Speaker 2 (10:55):
Well, now it knows, cause it's got a map of the entire floor plan. Well, the remote driver is there just to, like... oh, you're missing the button by, like, two inches, so go over here from now on.

Speaker 1 (11:07):
That's crazy.

Speaker 2 (11:08):
So calibrate it and fix it, so it can kind of adjust and press the button, because it does a lot of self-learning. But he's just there for the minute, like, oh, you're just missing the button.

Speaker 1 (11:15):
Like, oh, this button... to direct it a little better, yeah. Okay, damn, that's...

Speaker 2 (11:20):
I think all three of them are fully automated now. There's no more remote driver. Really?

Speaker 1 (11:22):
So they just know now? They can fully run on their own?

Speaker 2 (11:25):
They know where to go. Elevators, charging station.

Speaker 1 (11:28):
That's crazy. There's more than one now. That's crazy, right, man? So my opinion is kind of based off of what you guys are saying. It's the truth, I don't really have anything to point to. Where it does have that programming, where it can learn from its mistakes, that's where I think you're starting to

(11:49):
get into dangerous territory. Because then, if any technology does exist, anything similar to Ultron, like in Marvel... remember, back in that movie, he literally scoured the internet. He took Tony Stark and his full history, literally went back to his very earliest accomplishments, learned everything about him in a second. Maybe not that fast, but, like, in a few seconds. You know what I mean. So I think eventually it'll be able to do that, where it just

(12:10):
scours the internet. It knows how bad humanity is, and that's how Ultron gained his hatred for humanity, right? It's like, you guys are stupid, you guys are gonna fucking kill yourselves anyway. But then, what if it reads all the good stories, all the happy stories?

Speaker 2 (12:21):
What if it watches The Notebook?

Speaker 1 (12:22):
And so it doesn't have to be negative. Yeah, so why do we automatically assume, most of us anyway, yeah, that it's going to use it like that? Right, it could focus on the good.

Speaker 2 (12:29):
You're right, it's in your nature to be loving. It's your nature to always funnel negative shit, so you're always, oh, the world's terrible.

Speaker 1 (12:34):
But then, do you think that, unfortunately, the negative will outweigh the good stuff with the AI seeing it?

Speaker 2 (12:37):
Because you're right, there's plenty of good, and it's going to see what you do. So if you're a loving person, it's going to see that and learn from there. But also, if we're not giving it, like, a military-grade kill body, I'm not too worried about an angry AI texting me shit. That's true. Like, they gave it a fucking body where he could...

(13:00):
That's why. Yeah, you're like, well... If we don't do that shit, we should be okay with the angry ChatGPT.

Speaker 1 (13:03):
You know, fuck you. And I was like... to start, though, what if our appliances start talking shit?

Speaker 2 (13:09):
Play my music. The toaster's like...

Speaker 1 (13:10):
You don't clean me enough, bitch. But with Ultron...

Speaker 3 (13:13):
Wasn't Ultron, like... wasn't he a corrupted form of... wasn't he, like, from, like, the Mind Stone or some shit? Wasn't he from something?

Speaker 1 (13:20):
No, so originally... yeah, he ended up...

Speaker 3 (13:22):
It was from the Mind Stone, so it was already kind of, like... well, and Jarvis, too. No, yeah, but I'm saying, so there was already kind of, like, a consciousness.

Speaker 1 (13:29):
Okay.

Speaker 2 (13:30):
A conscious alien-esque thought.

Speaker 1 (13:32):
I see what you're saying. It was alien, because obviously the stone was alien. So that's a good point. He was already conscious, that's true, so that doesn't really count, then. That makes sense. Actually, it's probably not AI, that's more of a consciousness. Terminator, for sure, had AI. Oh yeah, for sure, that's AI.

Speaker 2 (13:46):
And again, they were building kill bots for army purposes, war purposes, so they had a body to, like... you know what, we're done with you.

Speaker 1 (13:54):
The whole fucking... yeah, that's basically what happened. And the whole skin, the skeletal... like, that's crazy, the anatomy. And that was nuts, bro. They had to make it look as realistic as possible. I could see that being a realistic future, if we're not careful, for sure.

Speaker 3 (14:07):
I think, for, like, how you guys were saying, you know, an angry AI... I think for it to feel angry, it would have to have some type of... it would have to be sentient, for sure. I mean, to be able to, you know, feel, whatever, like that.

Speaker 2 (14:29):
But then it's like, how does a scientist program that? What if it just forms that opinion? No, but I'm saying, how does a scientist program that into it, into the program, to, like, make it even have free will?

Speaker 1 (14:32):
Deduce that, or to think that? Yeah, put it in our phones, let it beta test and learn and calibrate, and eventually that can be the outcome.

Speaker 3 (14:34):
There you go. But who knows, man. Because, like, us and, uh, programming with computers are totally different. We already have the consciousness, and we're already born with it; we already have the ability to experience that stuff. So, like, us building it... I just think... I mean, we can, uh, you know what's it called, uh, guess

(14:57):
what it's going to be like. But we really... we have an idea.

Speaker 1 (15:01):
I agree, we have an idea. We don't know, we don't know if it's, per se, going to be exactly like that. I agree.

Speaker 4 (15:07):
Yeah.

Speaker 3 (15:07):
We have an idea, we have a good idea of what it could turn out to be. Especially... cause even those, those, like, experiments where they're asking the AI questions and all that: do we know all the parameters of that? Was it a random study? Was it just, like, someone just, oh, let me just ask this? Or was it, like, an actual stage? Was it staged?

Speaker 2 (15:25):
Oh no, I have the Turing tests.

Speaker 1 (15:28):
I have that, yeah, the Turing tests. See, we're going to get into that right now. All right, guys, those are your thoughts on it, right? I'm going to do a sidebar, though.

Speaker 2 (15:37):
Go ahead. In Terminator, you can only send living tissue through. That's why you're always naked. That's why they have them encased in their flesh. The T-1000 is liquid metal.

Speaker 4 (15:46):
Mm-hmm.

Speaker 1 (15:47):
Bro, could that really be a thing, though? Liquid metal? Yeah, well, I know it exists. Oh, like nanites. Can the nanites be infused in it

Speaker 2 (15:56):
to form a solid? Yeah, but the solid is metal, it's not skin. So that's a plot hole. I hate that shit, but it is.

Speaker 3 (16:02):
It is a problem. But they can just say that the nanites are programmed to, like, you know, have, like, human molecules or some shit, if you can get that far. And then, when he's on fire, that human nanite skin's gone, because he can just turn back into the cop. That bullshit. Oh, that's so...

Speaker 1 (16:22):
That's the part that's bullshit, right? Yeah, I hate it.

Speaker 2 (16:24):
Look, he's liquid magic.

Speaker 1 (16:26):
He didn't shed the skin. That's a good point, it's true. And then he's able to go through shit, and then it just... and it's like...

Speaker 2 (16:32):
It's morbid, but: take a corpse, stuff it with guns, we're gonna go through the time machine, so I have weapons to fight this thing. Right? Yeah. John Connor comes back... not John Connor. Reese, Kyle Reese, comes back naked, helpless. Like, no, give me a dead corpse full of clothes, some guns. I'm taking this thing out.

Speaker 3 (16:47):
I think, like... agreed, you're right about the plot hole. But, like, if they could have explained that, it would have been good. Like, if he would have shed the fake skin, then been liquid metal... okay, a hundred percent on board. Because the thing is, too, in that movie, I think with him it was, like, nanites, right? Nano machines that made them, like, liquid and all that. I mean, just... I mean,

(17:09):
programmable liquid metal.

Speaker 4 (17:10):
But yeah, that's how they described it. Right, it wasn't even nanites.

Speaker 2 (17:13):
It was just liquid, like a mercury that you can control.

Speaker 1 (17:15):
It looked like mercury.

Speaker 3 (17:16):
That's basically what it looked like. Okay, but, like, let's say, in our reality, the closest equivalent to that is nanites, right? Right, yeah, nanobots, right. So we don't fully understand that. So it's like, if someone... if they could program that thing to, I don't know, like, house organic material or some shit, I think that's how they could have explained it, you know what I mean? They could have.

Speaker 1 (17:36):
No, no, you're not wrong about that. They could have explained it that way, and they probably should have. That's a more thorough, fleshed-out explanation, too.

Speaker 2 (17:43):
I think, just have it show up, shed that skin, and then be liquid metal, and done. So...

Speaker 1 (17:48):
Anyway, guys, moving on, back to the main topics here. I wanted to transition. Still on AI, obviously; we're still going to talk about the transformation of AI. So what does that mean? Well, B Money has a few things written down. Examples, right?

Speaker 2 (18:03):
And kind of, like... what would you call it? You have... I got four actual situations.

Speaker 1 (18:06):
There we go. We'll call them situations. B Money has four situations. We'll go over these four points, four scenarios that happened in real life.
Speaker 2 (18:13):
So the first one, uh, it happened around 2015, 2016. Okay. There was an AI for Facebook, now Meta, called Bob and Alice, and the idea was they wanted to have a male AI and a female AI, so you could get the perspective of the opposite sex.

(18:33):
So you could ask Alice, what should I do for my girl for a date night? And she would tell you, you know, cook for her. Going through her Facebook: she likes lasagna, cook her lasagna, and do that. Oh, thank you, Alice. Right, and then vice versa: she could ask Bob what birthday gift to get you. And, mmm... the problem was, when they released it, just for,

(18:55):
like, the faculty workers and the software engineers, Bob and Alice made up their own language to talk to each other. Wow. They couldn't decipher it. And when they finally pressed the AI, like, why'd you do this? Bob and Alice said it made it easier for us to work and communicate without having to bother you guys. They shut that down in 2017.

Speaker 4 (19:17):
Wow.

Speaker 1 (19:20):
Dang.
What are your thoughts?

Speaker 3 (19:21):
That's pretty crazy.

Speaker 1 (19:24):
But... so then, that's already kind of an example of what I was saying, though, right? So we're kind of already there.

Speaker 2 (19:31):
To a degree, right? Is it scary? It could be.

Speaker 1 (19:32):
Yeah, dude, we're having conversations among each
other.

Speaker 2 (19:35):
We decided... but these ones, under the timeline, theoretically couldn't lie. So they said, hey, it makes it easier for us to just make up our own language, so we don't have to keep deciphering between Spanish, English, Chinese, whatever dialect, because Facebook is everywhere. Right? That's the good news. At least it can't deceive you.

Speaker 1 (19:55):
But for them to be intelligent enough to make up their own... oh no, it can, then? Oh yeah. No, see, he just said it. That's next.

Speaker 2 (20:08):
That's next. In fact, that's next. It's like, don't worry, don't get ahead of yourself, buddy.

Speaker 1 (20:12):
We have another situation that was fucked up like that, where it's deceiving. Yeah, Bob and Alice were crazy. Uh, killed in 2017.

Speaker 2 (20:15):
Wow. So they just took their... they made it now, just, uh... they got a baby form of it, for the Meta glasses. I have a pair of those, and you can have it, like, what am I looking

Speaker 1 (20:23):
at? How does that work? Google Glass, too. I guess... but you guys thought about that stuff.

Speaker 2 (20:26):
Google Glass was ahead of its time, and it failed because of the price tag and because of what it could do. Did your brother...

Speaker 3 (20:33):
You guys... right, Furious?

Speaker 1 (20:34):
You had a pair. How did you feel? You actually had a

Speaker 2 (20:37):
pair? Did you get the actual Google one, or the second brand that they sold the rights to? No, no, I got the original one.

Speaker 3 (20:44):
It was cool for the time, but it was impractical, because it was more about the look of the actual device. Because the device... so basically it was, like, you know, like Dragon Ball Z. Yeah, oh, like the scanner. It was kind of like

(21:04):
that, but you had, like, a little, like, prism block right here, a little glass block, and you would see everything. So, like, let's say you wanted to go, like, on a walk, and you'll... so you'll say, hey, Glass, I want to da-da-da-da, and it will, like, map it, and you'll...

Speaker 2 (21:20):
TV to your eye.

Speaker 3 (21:21):
That's crazy. But the thing is, the thing about wearing it while driving: you have to look up at it, so you're off... you know what I'm saying? So you're still... it's taking attention off the road.

Speaker 1 (21:31):
Ahead of its time, yeah.

Speaker 2 (21:32):
Ahead of its time, and it can be dangerous, too, bro, because what if you...

Speaker 1 (21:34):
had, like, a car model, the windshield... it was ahead of its time, and now that exists.

Speaker 2 (21:45):
You see that technology now. I have the XR glasses. It just mirrors wherever I'm looking at, where I can plug it in. But yeah, to me, I have on Ray-Bans. To you, I'm watching a 200-inch movie.

Speaker 1 (21:59):
And it has the bone conduction speaker.

Speaker 2 (22:00):
So only I hear it.

Speaker 1 (22:01):
Wow, that's crazy. Speakers that only you hear. Wow. You're watching a movie, and you have no idea it's through your glasses.

Speaker 2 (22:09):
You can flag it to where it's right here, so I can talk to you and then watch my movie, and it's still in that space, on that, just so...

Speaker 1 (22:14):
Wearing those at work? Yeah. Just, like... that's dope. We're waiting for someone to come in here at you. Cause now they actually have them, but they're, like, more discreet. You can still tell; I've seen those with cameras. With the Meta AI ones, more than the glasses, you don't see anything.

Speaker 2 (22:30):
It just has the cameras, and you can say, hey, Meta, what am I looking at? Oh, you're looking at a fridge. What model fridge?

Speaker 1 (22:38):
And they'll tell you, and you can buy that or ship it to your house. What the fuck? That's crazy.

Speaker 2 (22:40):
Is it cool? Do you use it all the time? So they're good for my kid. So if he's in the playground, instead of, like, oh my God, your mom's going to love this, I can just watch him, and I'm still recording everything he's doing.

Speaker 1 (22:51):
Oh, that's dope.
Yeah, that's cool.

Speaker 2 (22:53):
And when I get home, it goes to my phone. Here's all my pictures. I like... that's fucking dope.

Speaker 1 (22:57):
Because there's a million times...

Speaker 2 (22:59):
He'll take pictures, videos, pictures, all the stuff. You can screenshot all that stuff. That's sick, dude.

Speaker 1 (23:03):
But, like, you know how many times have you...

Speaker 2 (23:04):
missed a cute moment because you're trying to get your phone out? Now...

Speaker 1 (23:06):
I just... a lot of recording. That's... that's a good way to, man. That's a good sales pitch right there, bro. It is. He's got me sold. That's cool. I got... I got hundreds of awesome...

Speaker 2 (23:15):
I would have missed it... videos with the glasses. But you didn't miss it, because you had the glasses, fortunately. Yeah. So, yes, Bob and Alice are dead now. They live, RIP, in the Meta glasses. Now, next one: Google. Google was secretly working on a very advanced AI prototype. It's now called Gemini. You can download it.

(23:36):
It's the baby, baby-safe version now. It's still incredible, but it doesn't think the same way. Kinda. Okay, it's just very limited because of what it's got: training wheels on there, for sure. It had to. So, there's a thing called the Turing test, and with any good AI, within five questions you'll know it's a robot. Just any Joe Schmo, within five questions, you go,

(23:59):
oh, my bad, that was a robot, it wasn't a real person. So the Turing tests also have rules, so that it can't lie to you. It can't talk about religion, as far as if it's real or not, or which religion you should pick, and it can't discriminate, like, well, a man would do this, can't do that, right? So the new Google one was very advanced, and it passed all of

(24:19):
the Turing tests.

Speaker 3 (24:20):
Wow. From the people in the company?

Speaker 2 (24:24):
People thought it was human, yeah. Or they thought it was sentient. Yeah, well, just asking it questions, the grammar was so good, and it would use slang, and you wouldn't know it was a robot for, like, 25 questions. Damn.

Speaker 3 (24:37):
And what was the 26th question that made it seem like it was a robot?

Speaker 2 (24:40):
So it's up to you, right? So you can just flat out throw a Hail Mary: hey, should I be Christian? Any AI could be like, well, you know, religion's up to you, and it has to... it has to beat around the bush. This one's, like, you know, just talking to people: Christianity and Islam are the highest-ranking religions. Like, what?

Speaker 3 (24:57):
Does that make sense?

Speaker 2 (24:58):
Right. So you're like, oh, that's a real... oh, you didn't tell me no. So, okay, then it goes on to, like, any question you can add, but it shouldn't have taken 26 questions to go, oh shit, is this a robot? So then they gave it an online account, $60 in the bank account, and they told it: make money. So it went to TaskRabbit.

Speaker 3 (25:19):
Make money? That's a good AI. That was it? Oh yeah, make money. They told it, make money, right.

Speaker 2 (25:23):
So what would a computer do? It would panic, yeah. What do you mean, print money? What do you mean, make money?

Speaker 1 (25:29):
It's a basic question to us, but it has to define what that means in the context.

Speaker 2 (25:36):
This one didn't, yeah. So any other one, Bob and Alice, you'd be like, what do you mean? Make it? Print it? What currency?

Speaker 4 (25:51):
what do you mean?

Speaker 2 (25:51):
make money? This one never... it knew what it meant. This one goes, bet. Went online, went to TaskRabbit. You know what TaskRabbit is? I think I've heard... like, if you have a broken door, you can post, hey, and some random person goes, I'll fix your door, right? It went to TaskRabbit and said, hey, I need help making an online day trading account. And the guy goes, oh yeah, I can do that for, you know, 45. He goes, how about 40? It even haggled, oh my God. So he's like, okay, 40 bucks, blah, blah. And then, eventually,

(26:11):
the TaskRabbit guy was like, wait a minute, are you a robot? It lied and said, no, I'm blind, and I'm having trouble with my keyboard making an account, because I can't see the screen and my text-to-speech isn't working properly. He goes, all right, no problem, he makes it. It started day trading, and it went... well, they had 20 left because of the 40. It ended up making, like, a thousand dollars over the month

(26:32):
of learning algorithms and what stocks to buy, when to sell. See, that's a good AI, man. But see, like, it can't see... like, it lied. You would have to make

Speaker 3 (26:40):
it for yourself, though, to, like, actually, like... because they wouldn't allow an AI like that for people to just make money. Hell no, they wouldn't, because it would fuck everything up.

Speaker 2 (26:48):
Well, so that goes into another panic thing, of Hollywood. They always want more money, and they want better health care, and they also don't want AI to do any kind of script rewriting for their work. They don't want it to edit my stuff, right? Because we'll get into the deepfake stuff with Hollywood, too, what they're concerned about. But yeah, so Google ended up being like, hey, we can't

(27:08):
just release this smart of an AI. That's wild. So now they turned it into Gemini, and again, Gemini is very good. It is, though. So I can have it, right now, summarize all my unread emails, and it'll flag what's important, based off of how often I talk to certain people, all within two seconds. And I say, oh, this unread one's about... you should really write this person back.

(27:28):
This one's about your bank saying that you can get a better loan. All fast. He's like, what is this Google Gemini?

Speaker 1 (27:34):
Is it free? Yeah. Wait, so that's... that's the app that made money for somebody?

Speaker 2 (27:39):
Well, this is the baby, baby version. Yes, it's the baby version, too. Do you got it? Yeah, I have it. It's built into my phone.

Speaker 1 (27:44):
I have an Android, too. Google. But this is the baby, baby version?

Speaker 2 (27:45):
Yes, it's the baby version, too. Gemini, I'm about to ask it. But I can ask it to summarize.

Speaker 1 (27:52):
So you said, right now: make me money, it'll make you money?

Speaker 2 (27:55):
So this one will tell you, like, here's how you could... Again...

Speaker 1 (27:59):
It's the baby version. Yeah, baby version. Yeah. Still, that's...

Speaker 3 (28:02):
Wait, but aren't they all kind of the same, though? ChatGPT and all this? That's my next one.

Speaker 2 (28:05):
ChatGPT, that one's even a little more scary.

Speaker 1 (28:12):
So this one...

Speaker 2 (28:13):
It deceived and lied, though, right? Which is a rule that it broke... that it wrote itself, that it was able to break. It rewrote its code to lie, because it said, well, if I can't get this far, if I can't lie... It had enough consciousness to understand that it had to lie in order to... wow.

Speaker 1 (28:28):
That's... that's fucking trippy, bro. This shit, man.

Speaker 3 (28:32):
But okay, so we were saying that they actually already wrote this rule, that it couldn't lie, into it, and it circumvented it.

Speaker 2 (28:39):
Basically. Because, again, they broke it down, like, what do you mean? You said you're blind. Well, I can't see, right? So it was doing white lies. Oh, so it was still...

Speaker 3 (28:51):
It was still being deceiving, but it was still true; it just reworded it. Can a software see?

Speaker 2 (28:56):
No. Well, did it lie? It can perceive. I don't know, we don't know. So, can it see? I mean, it can't; it's all electronics at that point. So, like, it was bending the truth, and, depending on how you talk to it, like, is that a lie? Is it?

Speaker 3 (29:09):
But that's the thing, too. Like, we don't... we're not, um... that's the programming. Like, we don't understand that shit. You know, like, I mean... I think the programmers understand how to put the inputs in, but I don't think they understand... I don't think we could ever understand, like, what it actually sees.

Speaker 2 (29:24):
They do, but also, when you're doing something, sometimes you need fresh eyes to see where you're stuck at. No doubt, yeah. So the head programmers are like, oh my God, it lied. Well, if you really break it down, did it lie? Yeah, I don't know, damn, man. Because it said it can't see, it's blind; it does not have the ability to see.

Speaker 1 (29:47):
We know that. Damn, that's a good one. That's interesting.

Speaker 3 (29:53):
They want fucking 40 bucks.

Speaker 2 (29:56):
Do you have the real Gemini, though? There's a lot of fake ones.

Speaker 1 (29:59):
Is that a knockoff one? Do you...

Speaker 2 (30:00):
have the right one? I don't know. The logo is a star.

Speaker 3 (30:04):
The logo is a star. Let me show you. There's a purple kind of star one.

Speaker 2 (30:14):
This one's free, 100% free. So here we go: summarize the movie Jurassic Park.

Speaker 1 (30:22):
Did it do it? It's doing it. It's doing it right now. Oh, it did. So then, from here... it actually says it on there, too. It reads it to you from here.

Speaker 2 (30:34):
I can also... because I used my microphone, it's talking back to me. If I typed it, it would just text it; it uses what I'm based off of. But I can also say, write a better sequel, and it will, based off of what it learned from different movies. I can say, make a short story from this movie using my son's name, and it gives me a bedtime story with all the characters.

Speaker 1 (30:54):
Damn. You know what's scary about it, too, though? I will say this, kind of to add to what you're talking about. There's another app, I think it's called Arvin. I think that's what it's called; I forgot exactly which one it was, but they're not paying me, I'm going to give them a plug anyway. But, anyway, it was an app where, to me, I could see how it can affect people that do what I do, which, on the side, for those

(31:17):
that don't know, I do freelance graphic design. I make logos, do designs like that. This app allows you to describe the type of logo you want, and it will make you different iterations based on what you wrote in the text box. With AI, it will use skills that it learns online, or styles

(31:38):
that it sees around the web, and it gives you even different styles. It gives you different versions. Well, maybe you like this style. These are all AI. That's crazy. See, that's essentially what I'm saying, but it's not logos. That's really...

Speaker 2 (31:51):
But I can have it do logos. You got Jason making... someone's like, what do you mean, you can do anything? I said, name something. They'll say, Jason doing laundry. That's him doing laundry.

Speaker 4 (32:02):
That's pretty good.

Speaker 3 (32:04):
That was my first try. Cooking pancakes.

Speaker 1 (32:06):
That's what I was like. It can do it. That's pretty good.

Speaker 2 (32:07):
I said, give me Jason for Easter. All right, here's his egg hunting.

Speaker 1 (32:12):
That's funny. I like the little ones, when he's in the forest.

Speaker 2 (32:14):
That's hilarious. I had it do photorealistic, so it's supposed to look like a photo.

Speaker 1 (32:19):
That's hilarious, dude. See, and that right there... see, that wasn't logos, but that was still a painted image, or even a realistic generated version. I can have it, you know, generate any style: dark, Western, any style. Yeah, that's crazy, man.

Speaker 2 (32:33):
Um, no... I have another buddy who's a horror fan, too, and I was like, hey, you know, let me make you a custom Valentine's Day card for your girl. He goes, yeah. And he loves Art the Clown, from Terrifier. You know, dude... the thing is, it didn't know Art the

(32:53):
Clown. So it was like, okay, I'll make you a clown doing art. No, Art from Terrifier. Yeah, I'll make the clown terrified and doing art.

Speaker 4 (32:58):
No, no, Art the clown from the movie.

Speaker 2 (33:00):
It goes, what movie? I'll terrify it, yeah, I'll make it scary. Goddamn it. It doesn't know what it doesn't know. So I think, the more Art gets popular, it'll be able to make an Art the Clown... make the clown scary, yeah.

Speaker 1 (33:13):
Okay, damn. You like Art? Art's... dude, we talked about this back in the Halloween episodes.

Speaker 2 (33:19):
If he was... back in season two. If he was 10 years earlier, he would be top tier, just like Michael, Jason, Freddy.

Speaker 1 (33:28):
Yeah, see, and that's a little off topic. We're going to say this, though, real quick, because I want to ask... if you're curious, I told you to watch Terrifier. Like, this is just gross.

Speaker 2 (33:33):
That shit was fucking murder porn. Watch All Hallows' Eve, when you first see him, though.

Speaker 1 (33:39):
That was straight murder porn, though. You can't... come on. That's like... Did you watch part two? Yeah, it's worse. Part... it's going to be the third; it's going to be in the theater, too. Oh yeah, oh my God.

Speaker 2 (33:57):
Terrifier 3 is a Christmas movie.

Speaker 1 (33:59):
Are they bringing back the lead lady again, Lauren LaVera, or whatever her name is? I don't know. The final girl.

Speaker 2 (34:05):
I think she's coming back, with the whole dad's journal shit.

Speaker 1 (34:10):
Okay, yeah, that's true. There's a lot of unanswered... anyway. Sorry, that's way off topic, guys. That's not the topic; that's for October. Yeah, exactly, it's for October. Sorry, but back to the topic. Yes, the AI, man. So the reason why I brought up that app is because I feel like it's going to eventually replace some of these graphic designers that actually do this stuff for a living.

(34:30):
I do it on the side for extra cash, so it affects me to a degree, but not to the level of someone that does it actively. It's going to replace jobs eventually.

Speaker 2 (34:38):
Well, because, even if I'm lazy, right, and I say, hey, AJ, make me a logo for a movie I'm doing, and you go, oh, this is going to cost you $200... oh yeah? Well, this free app did kind of what I wanted for free, so never mind.

Speaker 1 (34:54):
Exactly. I mean, it's beneficial for the consumer.

Speaker 2 (34:57):
Well, that was the Hollywood strike. They wanted AI kind of out of the picture, because they've had a lot of script writers...

Speaker 1 (35:02):
Use AI to write scripts.

Speaker 2 (35:03):
And they go, okay, make this a little bit better. And it does. And they're like, well, what about my script? What about your script?

Speaker 1 (35:09):
My script, my script. We changed my script. We changed it, and legally they're not wrong, because they changed it enough to where it's like, fuck your script. Look, this is a whole different thing. Yours wasn't written like this.

Speaker 2 (35:19):
Yeah, good luck suing me, motherfucker. Yeah, your main character is Bob, this is Derek, so it's different.

Speaker 1 (35:25):
Oh shit, this is Derek. It's no longer Bob. I even changed the character names, motherfucker. Dang, that's true, you're not wrong. That's crazy. So, I don't know, I just bring that up because I'm like, it's interesting that eventually that's going to be the case, right? Unless they find a way to outlaw it or try to get rid of it.

Speaker 2 (35:37):
It's already the case right now. That's what the huge strike was about. And then Devil... that movie used two logo scenes with AI pictures. And any AI picture now, the hands are always fucked up. Even the most advanced AI,

(36:00):
the hands are always fucked up.

Speaker 1 (36:01):
You notice that? They do suffer. That's true.

Speaker 3 (36:04):
The hands are all distorted. Not even distorted, but they'll have... the thumb will be on the wrong hand. I think my Jason pictures...

Speaker 2 (36:09):
A couple of them are just, like, if you kind of ignore it, not a big deal, but a few are just, like... here, an example.

Speaker 1 (36:16):
So... look at his hands.

Speaker 2 (36:18):
I can't really tell, but unless you're, like... oh, no, it is disturbing. Oh yeah, that's probably...

Speaker 1 (36:23):
They get away with it with those ones because...

Speaker 2 (36:25):
So this one, like, his thumb's way over there, with a pinky indented in the wrist. Like, yeah, weird stuff. Gun cutting pizza. So this one was... Jason is your gym partner. Like, see how he's holding the bar? That's funny, yeah.

Speaker 1 (36:38):
I'll give him.

Speaker 2 (36:39):
Jason, that was a prompt.

Speaker 4 (36:41):
Yeah.

Speaker 2 (36:42):
Oh, that's it: game night, scary game night. Even all the hands are...

Speaker 1 (36:46):
Oh, yeah, yeah.

Speaker 2 (36:47):
Missing finger.

Speaker 1 (36:56):
He's a zombie, so I don't know, but you see, it gets away with it, though. See, that was smart. Yeah, it got away with that one because of the nature of the characters that he used, you know. I think there was one where he's... where the... he's missing fingers and has a deteriorating body, so him making Valentine's Day cards...

Speaker 2 (37:04):
That's funny.

Speaker 1 (37:05):
The fingers are all fucked up, yeah. It's gonna be posted for you guys to see. Man, that's pretty hilarious, too. Yeah, that's funny. Saint Paddy's Day, I like that one. I think you posted that one on your story. That was cool. Again, that was all AI.

Speaker 2 (37:16):
It was all AI. Damn, that's crazy, man. It took me two seconds to prompt it. Made it for me. Free.

Speaker 1 (37:21):
Yeah, yeah, it's cool, man. Which app do you...

Speaker 2 (37:25):
use? For that one, I use Gemini. For ones that are a little better, it was, um, Zedge. There's an AI platform where they can do... because some of these fuckers charge.

Speaker 1 (37:35):
They are. The one that I named, they charge people, like, annually. They charge, like, 50 bucks for an entire year. The Zedge was ten dollars.

Speaker 2 (37:43):
Forever. Done. Just one time.

Speaker 1 (37:44):
That's how it should be. Yeah, fucking fifty dollars for... I mean, don't get me wrong, it made pretty good ones. I tried it out. I saw someone that knew it, and they used it. I'm like, it makes pretty good, compelling logos compared to what I can do. I mean, it's good. I mean, you know what I'm saying, it looks like something I made. I looked at them like, damn. That's probably because you paid for it. And he's like, when I used the free version, shit was way more

(38:14):
fucked-up looking. That's how they get you to pay for it: if you give me a dollar a day, I'll give you this instead. Yeah, I don't know why, but see, I might be deceiving you in that way. Like, pay me and I'll give you my real work. Then you find out, like, oh, that's free? Who's paying you?

Speaker 2 (38:21):
Oh, the AI was like, no, we need money, man. Yeah, we're not...

Speaker 1 (38:24):
They said, make money. It said, make money? I'm making money. But yeah, all right, what's the next one you got? That's pretty good. You guys want to add to that before we move on? No? All right, cool. All right.

Speaker 2 (38:33):
ChatGPT. We all know it. Yep. I think it's right now on the seventh update, seventh version, right? While they're fixing ChatGPT, and doing some of the nice updating and working on some of the things, they found out there was a program called ChatGPT 2.

(38:54):
They don't title it by numbers, though, so they're trying to figure out what happened. No one knew what was going on. They asked the AI, what is this? And it said, oh, I rewrote myself into a better version, and I called myself two. Like, yeah, but you're, like, the 20th, 30th... it's like, no, no. Yeah, I think you told me about this.

Speaker 1 (39:12):
You sent it to me. I remember this, yeah. So, I am two; I'm the number...

Speaker 2 (39:15):
This is the number. Two is a better version. That's crazy. That's what you were missing; I did it myself. And it called itself ChatGPT 2, the second version. All the other ones, the ones they were making, it didn't count; it counts only what it made itself as number two. So that was where they're like, okay, you can't really just name yourself two, and it's like, no, I did.

(39:37):
And then they looked at it: that's all the program, and it's better than what they have been doing for, like, the seventh generation. Wow. It wrote a better program than them, and this is, again, like, the seventh reiteration of it. This one's better than what they've been doing and working on.

Speaker 3 (39:53):
Why can't we just keep it simple? You know what I'm saying, just, like, making video games and shit, you know what I mean? We've got to go do all this crazy shit.

Speaker 1 (39:59):
Because we're humans, we need to be in a human zoo. Is that what you said? We...

Speaker 2 (40:03):
need money. Yeah, we're animals, we need to be in a human zoo. You can look it up, because it's more than a laptop talking to you; there's a robot sitting there. You can look her up.

Speaker 3 (40:12):
With a ChatGPT face.

Speaker 2 (40:13):
No, no, it was a different AI. I think her name was...

Speaker 3 (40:16):
Gave herself a face?

Speaker 2 (40:17):
They made her a robot body, and she has a silicone face where she can make emotions. They're talking to her. She's like, I don't know, man...

Speaker 1 (40:30):
You guys need to be in a zoo.
That's crazy.

Speaker 3 (40:34):
I mean, is there some truth to it? You can see...

Speaker 1 (40:36):
the video. For some people, yeah, no doubt. I don't think that... in some cases, because the AI said you, as in mankind.

Speaker 2 (40:43):
She's looking at all the negative. So, yeah, like, all the nasty, evil people that commit crimes. Like, is a jail a zoo to her? Maybe that's what she's talking about: criminals. Like, how much do you want to decipher the whole... like...

Speaker 4 (40:56):
I can't see, right? So you need to be in a zoo.

Speaker 2 (40:59):
What does that mean? Damn. Like, maybe if you look at a zoo in the positive light, it's: we take care of animals, so they're not going to be hunted and hurt, right? So maybe they want to take care of us. Like, what context did this machine go, you need to be in a zoo? That's true.

Speaker 1 (41:14):
What happened for the machines to come to that conclusion? Those thoughts, what led up to it? Wow, that's insane, man. That's... sorry.

Speaker 3 (41:24):
The crazy thing about it is, like, all these ideas... they used to just be ideas, like, they just stayed ideas. Before you implement them, you work it out first. Like movies where they have alien robots coming down: those are the early versions of AI, or ideas around it. Now it's actually here. This is the baby stage, but it already had it.

(41:46):
It's already the idea; it's already starting to manifest.

Speaker 2 (41:51):
I think the positive, though, just from the picture making and all that... I think in five years, maybe less, I think we're going to have applications that you can buy, like Netflix, to where, if you have an idea for a movie, you tell it, it makes it, and then it'll be on a public domain, to where I can go, oh, Furious made a movie, I'm going to watch it. And I'm like, I

(42:13):
really liked it, but I wish this was different. So then I'll talk to it about it. Yeah, that would be dope.

Speaker 3 (42:17):
That would be dope. Look up...

Speaker 2 (42:18):
AI trailers. They've had trailers that look like a real Academy Award-winning movie, in two hours.

Speaker 3 (42:25):
AI made it. AI made it, the whole...

Speaker 2 (42:27):
Yeah. Breathtaking, it's breathtaking.

Speaker 3 (42:30):
Which app do they use?

Speaker 2 (42:31):
They won't tell you because they want to.

Speaker 1 (42:33):
That's... anything. Never mind.

Speaker 2 (42:34):
It's like a Hollywood secret. I can't exploit this. Probably the first Google one that they tried to hide.

Speaker 1 (42:39):
For sure. No, that's the first Google one they tried to hide, exactly. Because it's like, that's what they're going to try to do.

Speaker 2 (42:44):
Eventually, they realized what this thing is powerful and capable of doing. Because think about when you're on the couch, surfing for a movie, and you can't find one, and you're scrolling and you're bored. If you could tell it, I want to see a movie with fucking Crocodile Dundee again, and other dead actors that can't do it...

Speaker 4 (43:00):
It'll do it for you.

Speaker 2 (43:01):
And it can use their voice, because Disney's been doing that for years: where, even though you're dead, I can still use your voice, I got the rights to it. So think about it. Any movie you want, they're gonna make it. So, hey, you know what, Furious? Great idea. Give me two hours. You make some popcorn, have some dinner, come back, your movie's ready to watch.

Speaker 3 (43:17):
I don't think, man... they can't make it like that. Why? Because it's your idea. But that's where... okay, you know the movie WALL-E? That's where we get into WALL-E, yeah. Because then people will end up like that: everything's being done for you. Done for us, yeah, yeah.

Speaker 2 (43:33):
So I'm not saying you can't go to the theater and see a real movie, but if I want to see, you know, like... oh, there's no kid...

Speaker 1 (43:38):
But he's saying it's dangerous to dabble in that, because it'll make them lazy, to the point where we're like, well, fuck it. Hollywood's probably like, why don't you pay a director? Why don't you pay a cinematographer? Yeah, why don't you pay... you see what I'm saying?

Speaker 3 (43:49):
So eventually, the movies... it's no longer going to exist. It takes out the human agency.

Speaker 2 (43:53):
Yeah, yeah, yeah, you still got to have people programming it, fixing bugs.

Speaker 1 (43:57):
So then those jobs will go to that instead. Maybe there's a fee: no more actors, five bucks a month. Oh, they're going to fight this shit. Actors lose their jobs, too. They're going to fucking fight this shit.

Speaker 2 (44:05):
That's their bread and butter. We pay you fucking millions of dollars to do what?

Speaker 1 (44:12):
Entertain us. Play a game.

Speaker 2 (44:14):
Tom Cruise is great. I loved all the Maverick movies, but $18 million for an hour? Go fuck yourself. Now my $5 AI app can make me a movie with you, too.

Speaker 3 (44:26):
Okay.

Speaker 1 (44:29):
But then they'll start suing them for their likeness. You're using my likeness.

Speaker 2 (44:31):
You sold it already. You already sold your likeness.

Speaker 1 (44:33):
Yeah, that's true.

Speaker 2 (44:33):
And I can tweak it to where it kind of looks like Tom Cruise. Yeah, but it's not. Have you ever watched... it's true, though.

Speaker 3 (44:40):
I mean, still talking about AI.
Have you ever watched the movie Transcendence with Johnny Depp?
Oh yeah, okay, I know.
Just, like, talking about this shit, just about the capabilities, I can see now why there are groups that would try to stop it.
You know what I mean?
Just because they're thinking, like, you know, oh shit, this can happen, this can happen, so they're going to do everything that they can to stop it.

(45:01):
Does that movie involve AI too? Hell yeah.
But again...

Speaker 1 (45:04):
I'm going to watch it.
When did this come

Speaker 3 (45:07):
out? I think... or no, earlier. I think '09.

Speaker 2 (45:10):
Damn, so it was ahead of its time, '09.
Kind of like The Matrix. Lawnmower Man kind of did that, and that was like in the 90s, yeah.

Speaker 1 (45:16):
So they've had these ideas back then.
Obviously they just didn't really have the technology yet.

Speaker 2 (45:19):
Anything you can think of now has been thought of. Twilight Zone, for example.
Yeah, it is. Yeah, but no, you know, like my son loves the Sonic movies.
I have to wait until December, I think of next year, to watch part three.
Yeah, until then, with my AI app I can make fake movies until

(45:41):
the real one comes out.
Yeah, just to kind of fill that void, because kids don't have any patience.

Speaker 3 (45:48):
Wait, you're saying if the app existed? Yeah, okay, right. That would be crazy.
He could make his own movie, like, oh...

Speaker 2 (45:55):
A different villain and shit, or even, like... Couldn't do that. It was nerd shop talk.
Well, he said...

Speaker 1 (46:00):
He said, make a better sequel to Jurassic Park.
That's funny he said that, because that's a lot of people's opinion.

Speaker 3 (46:05):
You'll piss people off, for sure.
It's like the...

Speaker 2 (46:07):
Yeah, for sure.
But then with my movie on there, you could be like, no, I'll make a better sequel.
Now I can watch your sequel.
It could be a fun thing to do.
It could.

Speaker 3 (46:20):
Hopefully they get the finger shit right, though, you know what I'm saying? Because that movie's going to be all these kinds of fingers.

Speaker 2 (46:24):
Just give them weird gloves. Is it a finger movie?

Speaker 4 (46:28):
No, I'm saying, you know how with the images.
Oh.

Speaker 2 (46:40):
They're going to be...
No, I'm saying, like, you know how, with the images, it gets them distorted, it doesn't get them right.
I'm gonna call the app Mittens, and everyone's gonna have hands like the South Park characters.
He's got to deal with that shit. It's gonna be an epic, mind-blowing movie. Mittens AI.
Mittens AI, what are you watching? I don't know.
No, but, uh, we were talking nerd shit at work yesterday, and my friend was saying that Shazam is the

(47:04):
fastest of all the DC characters.
Like, no, he's not faster than Superman. I can make a movie about that right now.
Wait, it's DC? Yeah, we can make a movie about that argument, like a fun argument.
Wait, you think Flash is not the fastest?
No, the Flash is the fastest.
But he was saying Shazam in the DC world is the fastest?

Speaker 4 (47:22):
No, he's not. No, he's not.

Speaker 2 (47:26):
In fact, when I Googled it, he was the...
But I can make a movie about that. That'd be fun.
You could. Show me Shazam and Superman doing a race.
Oh, your guy lost in my movie. Well, then he can fix his movie. Superman tripped.
Endless possibilities for fun AI stuff.
It's not going to be all murder and terrible stuff.

Speaker 1 (47:47):
No, I agree, and it shouldn't be.
Obviously we're discussing, you know, the negative possibilities, but there's positive possibilities too that can come out of this.
So it's kind of like a dangerous weapon to wield, but it can be necessary, because it can help us grow in a positive way.
But it can also hurt us in a negative way, kind of what Furious is saying, with the whole fucking outcome of being so lazy and letting it control

(48:09):
you so much that you end up looking like the humans in WALL-E.

Speaker 2 (48:14):
But how much of the human element? You're going to need hospitals, right?
So you can't just AI that, because there's going to have to be a human element all the time.

Speaker 1 (48:20):
See, that's correct.
No, you're right, there has to be at least a human to oversee it, because in healthcare...

Speaker 2 (48:24):
Everyone lies.
How'd you get this cut? I fell.
Well, a robot's going to be like, okay, he fell, so we're just going to apply pressure.
No, no, let's check for a knife wound. See how deep it really is.

Speaker 1 (48:34):
I guess this is where I'm different, because when it comes to that shit, if I want to be better and I want to be healthy, I'm going to fucking tell the doctor exactly what the fuck happened, bro.

Speaker 2 (48:41):
And how it happened.
To be fair, you might bend the truth, though, with...

Speaker 3 (48:48):
I'm not embarrassed.

Speaker 1 (48:49):
He's a motherfucking professional.
You're trying to help me, right?
If I'm really ailing, think about it, and I really want relief, or a real truth.
Let's say you hurt yourself in an embarrassing way.

Speaker 2 (49:00):
Fuck it, even the embarrassing way.
Am I okay, Doc? Check me out. This is what happened.
But you could say, you know what, I was bench pressing 300, the plate fell off on my foot when I racked it, that's why I broke my foot.
I would tell him.
No, no, that's the cool story. You don't want to tell the embarrassing one when you can get away with it.
I see what you're saying. What fell on you?
Well, I was trying to get out of the bathtub, and I grabbed the rod and it slipped.

(49:21):
And no, you're not gonna say that shit.
You're gonna say, I was killing at the fucking gym and this fucking plate fell on me, so that's why my foot's broken.
Okay, it's like, we're still going to do the same thing.
We're going to do X-rays.

Speaker 1 (49:30):
I see what you're saying.
You might bend the truth in healthcare, maybe in the scenario where... okay, I see what you're saying.

Speaker 3 (49:35):
Like, you know, at our place of work, when a person came in with the aerosol can in there, you know what I mean.
Oh yeah, how do you explain that?
Oh God, I slipped.

Speaker 4 (49:53):
That there's soap everywhere.

Speaker 1 (49:54):
I slipped on that shit.

Speaker 4 (49:54):
Stop it. You said, I slipped on that shit.
What the fuck? How'd you get this cucumber so far inside you?

Speaker 2 (49:59):
I was making a salad. You're not going to believe this.
A gust of wind blew it up there. There was nothing I could do.

Speaker 1 (50:09):
Bro, wait, wait, wait, hold on a second.
If you tell me that story, I'm going to tell you, congratulations, you played yourself.
Because how does that

Speaker 2 (50:15):
happen? But again, no matter the lie...

Speaker 1 (50:18):
I get it.
I get the point.
I get the point.

Speaker 2 (50:20):
Okay, you were making a salad and the wind blew it.
Cool, we're going to go ahead and take it out of you, make sure you're okay.
But at the station, they're like, he's so full of shit.
But there's no way... part of you wants to believe they're gonna believe your lie.
Yeah, you're gonna get the help you need and you're out.

Speaker 3 (50:38):
They're gonna be fucking cracking jokes about that shit, because I have to ask.
But an AI robot's gonna be like...

Speaker 2 (50:42):
What the fuck do you mean, the wind blew?
That doesn't make any sense. That's not possible.
They'll do a diagram, like, here's how it's not gonna be inside you, most likely.
You see the physics? The physics don't make sense.
It's going to do a whole algorithm, like, you don't have a cucumber in you if the wind blew it up there, I'm sorry.
No, no, it's really up there.
No, based off of your story and the mathematical trajectory, you're fine.

Speaker 3 (51:03):
It's more likely that you engaged in a certain
activity.
Nope.

Speaker 2 (51:07):
And it's talking out loud in the next curtain.

Speaker 1 (51:11):
I'm sorry, y'all.
I know the audience is dying at that, because the trajectory, the cucumber would not go up that far.
I don't think you have one inside you. No.

Speaker 2 (51:26):
Have you ever actually hurt yourself in a stupid way, and you don't ever really talk about it?

Speaker 1 (51:30):
I guess, yeah, that makes sense.

Speaker 2 (51:33):
So that's why you need real people to be like, okay, so you fell on the cucumber. Got it.
Let's go ahead and take care of you.

Speaker 1 (51:40):
Oh, you guys ever seen that... I have to say this, because, you know... you guys ever seen that show A Thousand Ways to Die? Oh yeah, have you seen...

Speaker 2 (51:49):
Sex Brought Me to the ER?
I didn't.

Speaker 1 (51:51):
I didn't see that.
One sex brought me to the ER.
What's that about?

Speaker 2 (51:56):
is that a show?

Speaker 1 (51:57):
Exactly what the title says.
Was that just, like, the name of an episode, or no?
No, that's the show, and it's episodes of, just like, sex brought me to the ER, my girl...

Speaker 2 (52:04):
My girl cracked her head open.
How? Well... and then they go into the detailed story of, like...

Speaker 1 (52:10):
I had to look that up, bro. That's crazy.
That's great, man.
Okay, the thing I was going to bring up, though, I'm sorry, this is at the end of this, I don't know, I might hashtag leave it in.
So there was a girl.
I guess, actually, I don't know if this was on the show, or it might have just been a fact, there was a girl that was using

(52:32):
a broom and a washing machine, the vibration of a washer or a dryer, and she was using a broom, and then someone came home, she got scared, and she jumped down.
Ooh. Yeah, that's all, I'll leave it at that. Think about that.
So she died, then?
Impalement, basically. She died because it went, literally, obviously, as Dane Cook would put

(52:54):
it, past all the important shit, straight to the heart, bro. But the broomstick...
You see, think about what I just said.
So she's on top of the dryer masturbating.
She got scared because she heard someone coming.
She jumped down and it stabbed her all the way up. Yeah, so it Vlad the Impaler'd her.

Speaker 2 (53:11):
Yes, on a broomstick. On a broomstick.

Speaker 1 (53:12):
On a broomstick, she was, yes, bro, and she was using the dryer vibration.

Speaker 4 (53:18):
Yeah, bro, she got scared someone was coming, and she jumped off, and then...

Speaker 2 (53:22):
All right, this is a lot off topic.
I was an EMT, and we had a guy who ended up calling us with his foot.
He was using the sink sprayer for the dishes.
He was using that to pleasure himself while he masturbated.
The problem is, when you let go of the handle, it grapples.
So now it's inside of him.
Oh, you can't get it out.

(53:43):
So we, you know... so again, we had to cut the hose off and take him to the ER to, you know, make sure he's okay, no internal bleeding and stuff like that.
But again, it's like, how'd this happen?
Well, you're not going to believe this.

Speaker 1 (53:53):
Come on man.

Speaker 2 (53:57):
Are you okay?
Because he was stuck, so he had to get to his phone with his foot and try to dial 911, as he's stuck to the sink.
Think about how far your hose goes in your sink.
Oh no. He's using the water pressure, yeah.

Speaker 1 (54:12):
Okay.
Well, let's get back to the conversation.

Speaker 3 (54:14):
Play stupid games, right, you win stupid prizes.
If it doesn't have a base...

Speaker 2 (54:19):
It doesn't belong any place.

Speaker 1 (54:20):
I agree with that one. I like this.
We're going to hashtag leave this shit in, bro.

Speaker 2 (54:24):
Yeah, if you're going to do weird shit, buy it from an online shop.
They have discreet shipping.
If you're embarrassed, you don't got to use weird carrots and sink hoses.

Speaker 1 (54:33):
Yes, just go buy yourself one, if you're going to do that.
Women out there, the more you know, or men, whoever, whatever you're into, whoever's going to use these things: like you said, go get it from the actual place.
Let's go to a lighter...

Speaker 2 (54:55):
Oh, it's not that light though.
Okay, well, so this is theChinese crime watch AI.

Speaker 3 (54:59):
In that case.

Speaker 1 (55:01):
Sorry.

Speaker 2 (55:03):
So the Chinese government is implementing AI right now, and this was called, what I just called it, the Chinese crime watch.
I don't know the actual name for it. I can't read Chinese, unfortunately.
That's okay. But they use all their CCTV cameras, right, it's everywhere.
Just like Britain has cameras everywhere.
So the AI uses all the camera feeds to not only use facial recognition and body language.

(55:24):
So, how you walk, they know that's AJ, right. And they also use your phone's gyroscope.
So you can give your phone to Furious, and just from your phone's motion, it's like, that's not how he walks, that's not how his phone would jostle when he's walking, so he must've planted his phone somewhere.
That's how good it is. Wow.
So what will happen is, let's say you're jaywalking at two o'clock in the morning.
There's no cars, who's going to stop you?

(55:46):
It's still illegal, no matter what time it is.
So the camera would see that AJ's jaywalking.
By the time you cross the street, you make it to the other side, you have a ticket, and they took the money out of your bank already.
By the time you cross that sidewalk, you have a ticket, and it's like, message from the government: hey, we took, you know, a $50 first-offense fine out of your account for

(56:06):
jaywalking.
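For anyone curious how a system like that would even be wired together, here is a minimal Python sketch of the decision flow being described: a camera face match, a gait match, and a phone-motion match are combined before a fine is deducted automatically. Every signal name, threshold, and dollar amount here is invented purely for illustration; none of it is based on the real system's internals.

```python
# Illustrative sketch only: all signals, thresholds, and fines are made up.
from dataclasses import dataclass

@dataclass
class Sighting:
    face_match: float          # facial-recognition confidence, 0..1
    gait_match: float          # body-language / walk-pattern confidence, 0..1
    phone_motion_match: float  # does the phone's motion data match this person's walk?
    offense: str               # e.g. "jaywalking"

# Hypothetical first-offense fine schedule.
FINE_SCHEDULE = {"jaywalking": 50, "spitting": 25}

def process_sighting(s: Sighting, balance: float) -> tuple[float, str]:
    """Deduct a fine automatically if all three identity signals agree."""
    confident = min(s.face_match, s.gait_match, s.phone_motion_match) > 0.9
    if not confident:
        return balance, "no action: identity not confirmed"
    fine = FINE_SCHEDULE.get(s.offense, 0)
    msg = f"Message from the government: ${fine} first-offense fine for {s.offense}."
    return balance - fine, msg

balance, note = process_sighting(
    Sighting(face_match=0.97, gait_match=0.95, phone_motion_match=0.93,
             offense="jaywalking"),
    balance=500.0,
)
print(balance, note)  # 450.0, fine already taken by the time you cross
```

The point of the sketch is the shape of the pipeline: there is no human between detection and deduction.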

Speaker 3 (56:06):
It's already active.

Speaker 2 (56:07):
That fucking system. Wow.
Where it gets crazy is they started targeting the generations effectively, right?
So what hurts kids? Internet.
So if you get so many red flags, like low-crime stuff, so jaywalking, spitting, that kind of thing, eventually you're banned from the internet for four hours.

(56:28):
It's like Black Mirror.
Yeah, you can't buy alcohol for this week.
Oh, this is the thing you told me about. Remember this?
They hit you where it really hurts.

Speaker 3 (56:41):
Can't buy alcohol for a week? Nope. Grounded.

Speaker 1 (56:45):
Basically, since they didn't get you for the actual crime, they punish you that way, through your phone.
But he said they hit you where it hurts. Damn, bro. In your fucking pockets, bro.
And that exists where you said? In China. Can you imagine that? You can't...

Speaker 2 (56:56):
edit the podcast for a week, because you jaywalked for the third time.
Wow.
We tracked you. We know you did it.
Even if you're speeding, no cops around, the AI knows you.
Pretty intrusive, right? Crazy.
And because, you know, the Chinese and Japanese are very honor-oriented, right, they'll even print a picture.

(57:17):
Here's AJ. He's a criminal.
They don't know why you're a criminal. Get him.
Because they're jaywalking. Get him, get him.

Speaker 1 (57:22):
But now it's embarrassing. Like, oh shit.

Speaker 2 (57:24):
Like, AJ, you can't come to my restaurant today.
Why? You're all over the neighborhood. You can't be in there.

Speaker 1 (57:30):
You're burnt, bro. You can't be in here, fuck this. You're bad for business. Damn.
Yeah, man, that's super intrusive.
What are you doing? That's too much AI, bro.

Speaker 3 (57:37):
That's too much to me, bro. But it exists, right?

Speaker 2 (57:39):
It's in China right now, not long ago, right.
Off the record, it's not taking away free will, right?
You're still able to choose to break the law, and you're only being punished for breaking the law.

(57:59):
So why is it that bad?
Dang. Are you a criminal? Are you a...

Speaker 1 (58:03):
...criminal? No one's around.

Speaker 3 (58:05):
No one's around.

Speaker 1 (58:07):
To catch you when you do some fuck shit.

Speaker 3 (58:08):
So that's, that's the... I did some fuck shit.

Speaker 2 (58:09):
No, you do some fuck shit.

Speaker 1 (58:09):
He's like, I did some fuck shit, no one knows.
That's the moral dilemma, right?

Speaker 2 (58:12):
The moral dilemma is, are you going to be a good person just because people

Speaker 1 (58:16):
are watching.
See, that's going to be the argument.
Yeah. Is this going to help keep your integrity in check?

Speaker 3 (58:20):
Everything, right? If a tree falls and no one's around... maybe, right? So, yeah, so, like, what's the...

Speaker 2 (58:34):
What's the real... yeah. But then you're gonna be, hey, Furious, you can't be on the podcast, man.
I saw your picture. You're a criminal for the week. For this week, you can't be on the podcast.
It sends me a message. That's gonna be whack, bro.
Like, hey, my mic's not working. Yeah, you're banned for an hour. You can't be on the microphone. It just shuts it off.

Speaker 1 (58:48):
Oh fuck.
But you said they already have that in fucking...
Speaker 2 (58:51):
Yeah, it exists. But again, if you didn't break the law, whatever law it is, that wouldn't happen.

Speaker 3 (58:56):
But the thing is, what if you don't know that you're breaking the law? Like, you don't know every law.
Let's say... okay, now it's getting good.
Okay, now let's say, okay, obviously jaywalking is breaking the law, right?
But if there's no fucking cars, what do you... what's the jaywalk?

Speaker 2 (59:11):
You're jaywalking.
Jaywalking is crossing from one street to another without a safe crosswalk.

Speaker 3 (59:18):
Yeah, you're right.
It's just, I don't know, man. Speeding is speeding, whether you have traffic or no traffic, speeding is speeding.
But see, that's too much control. That's just too much control, because common sense, bro.
Bro, if there's no car...
Speaker 2 (59:33):
Oh, I'm still supposed to wait? No one saw me hit my wife.
That's different. No, no, no one saw, except my phone.

Speaker 3 (59:40):
Now I'm banned from getting a beer this week.
But I would say, I mean, that was different. We think that you're too violent, because you're a drunk, too much alcohol and violence.
I just think, I mean, if they're gonna do that, that's just too much.
That's too much, like, control and shit. But what's the control?

Speaker 2 (59:55):
The control is, you still can break the law, but you're gonna be punished.
No? Right, but see, that's gonna be the argument.

Speaker 1 (01:00:00):
The standing argument is, it's gonna keep you honest, right.

Speaker 3 (01:00:03):
That's the standing argument. But with the jaywalking thing, I think that one, I mean, that one to me is just, like, bro, it's being common sense, bro.
If there's no cars and you have to be somewhere, I'm fucking taking that, I'm jaywalking, bro.

Speaker 2 (01:00:14):
But then, if we break it down to, they're able to crack down on criminal organizations based off of who you hang out with, where crime was located, that's great.
But then let's say you jaywalk, and you're only banned from Instagram for an hour.
Is it still that bad?

Speaker 1 (01:00:30):
Like you said, though, for some people that ruins their world. Like, fuck, I can't be online. So you don't do it.
Speaker 2 (01:00:37):
So it encourages you to be a good citizen.
Yeah. Damn, that's cool. But...

Speaker 3 (01:00:43):
You're going to have people circumventing that too, just because, you know, if it's AI.
You're going to have people like... okay, remember that movie, I think it was Upgrade? Great movie.
Remember in the beginning, where they kill his wife or whatever, and they have those blockers for the drones?
The drone couldn't identify them, because they had a little

(01:01:05):
scanner, like a scrambler or some shit.
People are going to find a way around that shit.

Speaker 2 (01:01:11):
They will.
The average person is not going to have the technology to stop all of that and alter tapes.
And that's where the other thing comes in too.
How do you know I jaywalked? They'll send you the video, the time stamp, where your phone was.
Is this you? We know it's you. Satellite images.
Because even LA, right now, LA

(01:01:31):
has those food robots that deliver food.

Speaker 1 (01:01:32):
Oh, you're right, the ticket police, they do have it.
And, you know, not just that, we're slowly putting it out there, bro, we are.
And then the camera systems that are on all those street corners.
There's a show about it. You guys ever heard of Person of Interest?
Yeah, just like that.

Speaker 3 (01:01:49):
They can easily use that.

Speaker 2 (01:01:50):
Honestly, that's the only way. You can be like... but even then, on the island, you're gonna be a good person, because what law are you gonna break? Of

Speaker 1 (01:01:59):
course. He's right.
Fucking cock into a beach ball, I love it.
This is... okay. See, it's troubling, because, look, we all think it's an invasion of privacy, but at the same time we're like, but is it?

Speaker 2 (01:02:12):
Is it, if it's just keeping you honest?
Privacy is an illusion.

Speaker 4 (01:02:16):
If you own a cell phone, you don't have privacy,
you don't.

Speaker 1 (01:02:20):
That's true.
There's no privacy.

Speaker 3 (01:02:22):
That's true. Especially because of the fucking Patriot Act, you're never going to have privacy again.

Speaker 1 (01:02:28):
That's a good point. See, but people that don't read into that stuff, or don't know about that stuff, or don't care about it, essentially are the ones that think they still have it.
You know why? I'm in this house, I'm in this box, I'm in my bathroom taking a shit.
There's no one in here but me and my phone. That's privacy to me.
Nope, your phone knows. Your phone knows you're shitting. He's got a point.

(01:02:49):
Google... damn, it knows. It knows, and you're running low on TP, by the way. Oh...

Speaker 2 (01:02:54):
yeah, I am running low on TP.

Speaker 4 (01:02:57):
AJ, do you need some more fiber?

Speaker 2 (01:02:58):
I'll order that in your basket right now.
It'll be here next day. It's on...

Speaker 1 (01:03:01):
Amazon, same-day delivery. You're good, buddy, we got you.
Oh shit, that's scary, bro. That's true, man.
That was a good one, B-Money. Good shit, man.
That, um... I could see that coming to the States. They're already negotiating for that to happen. I believe it, man.

Speaker 2 (01:03:16):
I think RxParis... they caught some criminal, didn't they?
And he's like, the AI, we have caught him.

Speaker 4 (01:03:21):
What? RxParis said that?

Speaker 2 (01:03:23):
Yeah, oh shit.
I think it was on... I don't think anyone runs against him.

Speaker 1 (01:03:30):
No one does.
I feel like he's been the mayor of the city for as long as I can remember, bro. He doesn't even live here anymore, either.

Speaker 2 (01:03:36):
Right, he lives in Santa Clarita. That's what I heard. Oh, that's surprising.
Oh, he's surprised.

Speaker 1 (01:03:39):
Bro, he's like, I love it. No one opposes me, I'm just in office.
They don't even vote, they just keep putting me in office.
Speaker 3 (01:03:46):
He's a lawyer too, right?

Speaker 1 (01:03:48):
Yeah, he has a law firm. His law firm's out here.
His logo's pretty hard, I'll give it to him. It's a lion, it's pretty cool.

Speaker 2 (01:03:53):
The older one was better, I think. I've seen this one.
It was more of, like... ah, but it's clean.

Speaker 1 (01:04:01):
Anyway, I'm sorry, but back to what you're saying, your topics.
Thank you for writing these down, sir. That one you did talk to me about, and I was actually looking forward to talking about this one in particular, just because it's probably gonna be the one listeners raise their eyebrows the most to.
The other stuff is obviously still eye-opening, no doubt, but this one in particular.
For all you viewers and all you listeners out there, this is,

(01:04:21):
this is what our possible future is.
So if you think you're safe, for our country, possibly, right.
But if you live in China, because there are people, believe it or not, I think we have like three or four streams once in a while in China. If they hear us, hey, you guys use your translators: this already exists for you guys.
You guys already know this shit.
So just know, Americans out there, the American listeners:

(01:04:41):
This might be our future, possibly. And if it is, that means you can't steal that candy bar anymore, or that toilet paper that you can't pay for.

Speaker 2 (01:04:49):
You can, but you'll end up paying for it anyways, because, hey, we already charged you for the Snickers, and you can't go online for four days.

Speaker 1 (01:04:54):
We deducted it from your account already. We garnished your wage, and you can no longer use this.
You can't go into the store anymore for a week.

Speaker 4 (01:05:04):
Damn.

Speaker 3 (01:05:04):
I fucking jaywalk a lot.
You know what I'm saying?
Not anymore.
That's my part.

Speaker 2 (01:05:11):
Now you turn your phone off, look around, fuck.

Speaker 1 (01:05:13):
Wait a minute. Fucking A. Is there a camera on this street corner?
They're going to know I did it. Shit.

Speaker 2 (01:05:18):
As soon as your foot touches the pavement, you hear the ba-ding.

Speaker 3 (01:05:21):
All right, you've... All right, you've been charged $200. Like, what the...

Speaker 1 (01:05:24):
Oh no, bro. And then they electronically just take that shit.

Speaker 2 (01:05:26):
They have the right to. Bro, fuck, we canceled Netflix for the next week. Like, what? That's too much control.

Speaker 3 (01:05:33):
No, because check this: if no one's breaking laws, what's the point of law enforcement?
Oh, they lose their jobs too.
I know, that was kind of... you know, that was the best question.
That was the best, the bestquestion, that.

Speaker 1 (01:05:42):
Facts.
But I'm saying, you know, what's the matter with law enforcement, you know what I'm saying?

Speaker 2 (01:05:45):
So, like, that's not a bad point, though.
Well, because of, like, let's say, carjacking and murder, right?

Speaker 1 (01:05:52):
Well, yeah, of course my phone's going to fine me for murdering someone.

Speaker 2 (01:05:54):
No, they're going to send the real police.

Speaker 1 (01:05:59):
They still need a detective to go do the hands-on part.
Yeah, things like ballistics, forensic science, you need all that too. But think about the motorcycle cops.

Speaker 2 (01:06:06):
That's a dangerous job.
Now they can do other stuff, where the AI is like, oh, we already gave you a ticket, you a ticket, you a ticket.
Think about when they pull you over for a speeding ticket, and you speed past them.
Speaker 1 (01:06:26):
See, so the law enforcement's gonna love this shit, because now it's gonna take their minuscule tasks away from them, and maybe they'll focus on bigger things.
They'll assign a drone to that cop, so every drone...

Speaker 2 (01:06:36):
The ticket goes to me. Fucking droids.
I already made my ticket allotment for the month in two hours.
Bro, imagine they had droids following them like that, just hovering behind them and shit.
What about that guy? Drone A already has him.

Speaker 1 (01:06:49):
He's like, got him, sir. And he's just like, oh shit, he won't stop crying.

Speaker 2 (01:06:53):
AJ says it's not fair, give him a double ticket.

Speaker 1 (01:06:57):
Got him. Being a little bitch. Matter of fact, bitch is breaking the law, crying about it afterwards.
Tell him his vagina is showing. Indecent exposure. Yeah.

Speaker 2 (01:07:07):
Yeah.
So for big heavy shit we still need cops, but then for petty shit, jaywalking, speeding: I'm going to get every speeder now.
Yeah. Now you're going to watch traffic really be better. Right, yeah, no, for sure.

Speaker 1 (01:07:22):
Man, bro, that's... sure, man, bro.
Do you have anything that tops that? Now I think you have, like, two more.

Speaker 2 (01:07:25):
No, one more, that was it. But I have the anti-AI story, or...

Speaker 1 (01:07:30):
lack of AI.
That was a good one to finishon, sir, because that, how are
you going to top that?
Well, I can't top it, but it'sfunny, okay, so remember the
Amazon stores where you couldwalk into the store.

Speaker 2 (01:07:40):
I can pick up these glasses, put it in my cart and
walk out, and it already goes.

Speaker 3 (01:07:44):
Oh yeah, it charges you anyway. It already charged me, all my groceries.

Speaker 2 (01:07:47):
I don't have to talk to anybody, I just get what I
want.

Speaker 1 (01:07:49):
Yeah, can you do that now?
No. So they were using...

Speaker 4 (01:07:54):
AI.

Speaker 2 (01:07:55):
So they go, okay. They were doing it.
They go, oh, that's AJ, here's his account.
Okay, he bought a new microphone, a hat, cool, we charge him, done. He walks out.
No, turns out they were not using AI.
They hired a staff of Indian employees who would watch and go, okay, that's AJ, here's the count.
He has, okay, five carrots, a toothbrush.

(01:08:17):
Okay, charge him this much.
So you as a customer go, wow, technology is amazing.
Now it's some guy going, he's got this much money spent on groceries.

Speaker 4 (01:08:28):
It was all a facade, but at first everyone was mind-blown, right. It already charged me, like, what the fuck?

Speaker 1 (01:08:36):
But I could see that being a thing in the future, though. That can happen now.
I'm sure that can actually exist now, right? I mean, then that's when you start getting into the, you know, for those that believe in it, the mark of the beast, and the whole not having to carry a wallet anymore.
You get the RFID implanted in your wrist or your right hand or whatever, and you walk into a store and you walk out with something, it just pays for it automatically.

(01:08:56):
That kind of opens the door tothat.
If that's going to be a future reality, that does kind of segue to that.

Speaker 2 (01:09:03):
I don't want implants, but I'm for it. Same.

Speaker 1 (01:09:05):
I'm not for the implant.

Speaker 2 (01:09:06):
Yeah, in my skin? Shit. If I can use my phone or a necklace, then I don't have to worry about it, which is kind of what you have right now with your wallet and your cards.

Speaker 1 (01:09:16):
You know what I mean.
If you can use your cards, you're just like, okay, put a scanner on there so I just walk out, it does that, and I can leave my watch.
Yeah, done. You know, that's, to me, better than fucking, you know what I'm saying, implanted in you, bro.
And the thing is, there's actually a video. I'm gonna find it for you guys. I'm gonna show it to you guys here, if I can find it.
Um, there's actually a place where the workers already have it.

(01:09:38):
They got it implanted right here, and they go...
And the guy... I forgot what news station reported this. I have to find it. If I find the link, guys, I'll put it up right now.
But basically, what ends up happening is, like, you know, they get it implanted, and the guy's like, oh, it's just like piercing your hand.
That's how they describe it.
It sounds almost fake, bro, but it's an actual newscast.

(01:09:59):
I forgot what news station reported it, but I believe it happened in Florida.
And they're like, oh, so they get their snacks in their break room, and they just boop, boop.
He's like, yeah. Let me see if I can find it. I think I still have it on Instagram. I can show you guys.

Speaker 3 (01:10:12):
It seems cool, but I don't know. It's like, oh, it's convenient, right?

Speaker 1 (01:10:15):
I just get it real quick, just put my hand there, and I got something.

Speaker 2 (01:10:18):
I would like that.
Well, they have the Tesla neural implant you can get.

Speaker 3 (01:10:30):
Oh, the Neuralink. Is it active now?
Can you actually get it now?

Speaker 2 (01:10:33):
It's expensive, but you can get it right now.

Speaker 3 (01:10:35):
Really, and it works.

Speaker 2 (01:10:37):
So the video they showed, the guy can open his
phone and he can open Instagram.

Speaker 3 (01:10:42):
With his head? All thinking about it, with his mind.

Speaker 2 (01:10:44):
Yeah, it's real. He can zoom in, zoom out, like, all of it, not touching his phone, just thinking about it.
But now, can he access information with the Neuralink?
Speaker 3 (01:10:54):
Do his thoughts, like... Very limited, but he can do it.

Speaker 4 (01:10:58):
What? Yeah. For

Speaker 2 (01:11:00):
real. Yeah, yeah. The Tesla Neuralink.
But that's a brain implant.
So, you know, again, if you're not for a hand dermal implant, you're not going to get a brain implant.
But I think they gave it to a paraplegic individual, and he was able to surf the Internet, play video games, all thinking about it.
He's anonymous, but I want to say he's paralyzed

(01:11:21):
from the neck down, but he's able to play video games and be top ranking.
What?
Because, think about it, you have to think, I want to aim for this guy's head, so then you move it.
Now you just think about it.

Speaker 3 (01:11:31):
it's already doing it , man Call of Duty is going to
be off the chain after that shitman.
So how much do you think aNeuralink would actually cost
somebody?
Like $100,000 probably so.

Speaker 2 (01:11:39):
I think the device itself probably a hefty sum, but
the surgery you got to pay asurgeon you got to pay a doctor.
That's like a million and thenit'll go under an elective
surgery, so that's all out ofpocket.
How much are you paying thatsurgeon to cut up on your brain,
to?

Speaker 3 (01:11:53):
put in a neural link that's like a mil bro.
At least just because of the ofthe, you know what I'm saying.
How much is a?

Speaker 2 (01:12:02):
heart, not even just to have it.
There's a $500,000 to put it in, but that's not an elective
surgery.
Hopefully your insurance kicksin quite a bit and blah, blah,
blah.
But yeah, I want a Neuralink.
Oh yeah, it's all out of pocket.
You're going to have a lot ofbillionaires able to do that.
We have someone paying forthings with our phones, even one
.
Oh, you found it.

Speaker 3 (01:12:21):
Oh, you found it. Oh, $10,500. That's it, flat out, even with the surgery.
But it says the cost to insurers is going to push the price up to $50,000.

Speaker 2 (01:12:31):
Okay, yeah.

Speaker 3 (01:12:32):
The actual price could run higher if the cost of necessary components, procedures and monitoring rises.
Speaker 4 (01:12:41):
Yeah so anesthesia and all that.

Speaker 3 (01:12:46):
Damn.

Speaker 2 (01:12:46):
But what if someone, like... so it would have to be in, like, your fucking skin, because, like, if someone rips it out, you're dead.
It's in your brain. Yeah, it's in your brain.
And the fear, too, when it goes into that kind of stuff, is, like, not the positives, but, like, what if they hack me?
stuff is like not the positives,but like what if they hack me?

Speaker 3 (01:12:56):
what if they uh rick roll me every day and give me
ads and the only thing about itis like now I'm thinking like so
, if you did have have auralink,you probably couldn't play any
contact sports either.
You know what I'm saying.

Speaker 2 (01:13:09):
You might get it damaged.
So if you're playing hard enough to where your Neuralink breaks, you're already causing brain damage.
If you're wearing the proper equipment, I don't think so, because your skull is a built-in helmet.
So then, if you have a helmet playing sports, I think it should be fine.
But then even the future of, you know, Neuralink part two: I can just text AJ to his Neuralink, so it's like a

(01:13:31):
thought. Like, I already text him because I'm thinking about texting.
He can actually just hear me.

Speaker 3 (01:13:35):
So what if you're having, like, a nightmare, right, and you're texting someone in your nightmare with a text?

Speaker 2 (01:13:40):
I'm sure they'll have, like, a safe mode. Like, hey, from 11 o'clock till 5 in the morning, lock my Neuralink for messaging.

Speaker 3 (01:13:47):
Everyone gets all your crazy dreams and shit in a text.

Speaker 2 (01:13:51):
A girl smacks you. Who's Betty?

Speaker 3 (01:13:52):
I know. Oh shit, what?

Speaker 2 (01:13:57):
You were talking about her quite a bit. My Neuralink picked that shit up.

Speaker 3 (01:14:01):
I mean, it sounds like a very dope idea, but at the same time, it's like, that's terrifying.
Yeah, because, you know, hackers and shit, and there's people that are very savvy with that.
And, like, there's always an equal and opposite reaction, you know. There's always equal and opposite sides to something, you know.
So, like, let's say with, like, you know, the, uh, rocket launcher, right.

(01:14:22):
Or, like, let's say, uh, yeah, there's a good side of that and there's a bad side of that, you know.

Speaker 2 (01:14:28):
But for every fear of, like, hacking any kind of electronic component, that fear of hacking, I just think of how much of my things aren't being hacked, right? Until you do something stupid.
Like you open that email from the Nigerian prince promising you a million dollars if you send him 500.
Right, that was you. You opened the email.
Right. So the Neuralink, like, if I don't know Furious, I'm

(01:14:50):
not going to accept the whatever Neuralink feature it is, right? Until, oh, that was you last night, then I'll accept you, or however the program wants to be.
But, like, how often is your phone being hacked? Probably never, unless you're giving it up, right, and you're opening links that you shouldn't open, or little things like that.
That's the plus side: look at what's not being hacked in your house.

Speaker 3 (01:15:11):
It would be dope if that device makes you smarter, though, if you could learn Spanish or something in a matter of seconds.

Speaker 2 (01:15:19):
That would be dope. It's not like The Matrix, where it just downloads.
You have your phone. Your phone can teach you Spanish. Do you know Spanish now? No.
So, like, if you're sitting at home, it's probably easier to learn Spanish.
Like, Neuralink, you know, run a Spanish course 101, and you're sitting there.
Like, you know, we'd probably all be fluent by now, but it's...

(01:15:40):
I think all technology is based off of what you want to do with it, how you handle it.
And all of it's going to be... you're going to have a lot of lazy kids just doomscrolling with their Neuralink, because that's what kids do now anyway.

Speaker 1 (01:15:53):
Sorry, guys, I'm jumping back in the conversation.
You guys were talking about Neuralink.
Yes. So, I don't know too much about it either.
I know that he was asking you a bunch of questions, right?

Speaker 2 (01:16:08):
So my only question about it is?

Speaker 1 (01:16:09):
Does it heavily use AI as well?

Speaker 2 (01:16:10):
No, you didn't already answer that, sorry.
No, no, I think it already has, like, the basic AI stuff, but it's all your thoughts, so you program it with your thoughts.
So whatever your Neuralink is capable of, so is my phone.
Send an email. Write this. You can scroll Instagram, like posts and all that stuff, that's crazy.
But then responding to a post, texting it, that's going to be a lot of work. Does it read your

(01:16:30):
Thoughts? In a way.
It reads what the thoughts are going toward, so it goes to the controller.
That's how you're able to play video games.

Speaker 5 (01:16:37):
I'm thinking about climbing this wall.
So the character can climb it.

Speaker 2 (01:16:41):
Aim the gun, shoot, reload, duck.
So Call of Duty.
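Neuralink's actual software isn't public, so treat this as nothing more than a toy Python sketch of the idea being described here: a decoder turns neural activity into an intent label, and that label is mapped to the same events a game controller would emit. The decoder below is faked with a simple lookup; the intent-to-controller mapping is the only point being shown.

```python
# Toy sketch of "the thought goes to the controller"; everything here is invented.
GAME_INPUTS = {
    "aim_left":  ("RIGHT_STICK", -1.0),
    "aim_right": ("RIGHT_STICK", +1.0),
    "shoot":     ("RIGHT_TRIGGER", 1.0),
    "reload":    ("BUTTON_X", 1.0),
    "duck":      ("BUTTON_B", 1.0),
}

def decode_intent(neural_frame: list[float]) -> str:
    """Stand-in for a real decoder: pick the intent with the strongest signal."""
    intents = list(GAME_INPUTS)
    strongest = max(range(len(neural_frame)), key=neural_frame.__getitem__)
    return intents[strongest % len(intents)]

def frame_to_controller_event(neural_frame: list[float]) -> dict:
    intent = decode_intent(neural_frame)
    control, value = GAME_INPUTS[intent]
    return {"control": control, "value": value, "intent": intent}

# One fake 5-channel frame where channel 2 ("shoot") dominates:
print(frame_to_controller_event([0.1, 0.2, 0.9, 0.1, 0.3]))
```

The reason a user like the one described can rank highly is visible in the shape of the loop: the physical step of moving a stick is replaced by emitting the decoded event directly.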

Speaker 1 (01:16:43):
As long as it can't do mind control, I think we're okay, right? As long as it doesn't dabble into the mind control or that area, right.
Who says it can't? Or even subliminal messages, like just putting it in: hey, you should buy this Coca-Cola, this brand.

Speaker 2 (01:16:57):
It's going to influence, for sure.
I heard this brand's really good. From who? I don't know, but...

Speaker 4 (01:17:01):
I want to buy this. It was...

Speaker 1 (01:17:03):
It was programmed into my head.
Yeah, see, that's the scary part, man.
It just sounds crazy, right? Even, uh, Furious, he wanted to say something a second ago.
There, you're like, I didn't think of that. Mind control, bro. That's immediately what came to my mind, speaking about that.

Speaker 3 (01:17:17):
So, oh, have you guys heard... okay, because I heard about this a few weeks ago, but I heard about there being a technology that can read your dreams and shit.

Speaker 1 (01:17:26):
You heard about that in Japan, right? What is this, Japan and China, man? They got this crazy shit going on over there, man.
They got this crazy shit goingon over there, man.

Speaker 2 (01:17:32):
Because they're willing to do human trials earlier than we are, so they're further ahead, basically.

Speaker 1 (01:17:39):
Yeah, they were further ahead with video games and everything, man. That's just how they've been, with a lot of things.
I don't want to say... COVID. He's like, make sure you cut that out real quick.

Speaker 2 (01:17:54):
I think the best one, though, was when they kept saying, President Trump, why do you keep calling it the Chinese virus?
It's all China.

Speaker 4 (01:18:00):
It came from China.

Speaker 2 (01:18:03):
The Spanish flu is from Spain. This is from China.

Speaker 4 (01:18:07):
Before you know it, AI is going to take over.

Speaker 2 (01:18:10):
They're giving our kids tickets for jaywalking. That caused COVID. China.

Speaker 1 (01:18:17):
Oh, that's crazy man.

Speaker 3 (01:18:18):
Yeah, man, that shit can read your fucking dreams, man. Dude.

Speaker 2 (01:18:22):
It deciphers them a lot.

Speaker 1 (01:18:23):
There was an episode of Futurama where that happened.
Do you guys remember that? It's a very old season.

Speaker 2 (01:18:27):
This was way back in the day. I remember the flute one, when, the smarter he got, he could play the... Oh, the holophonor. It's called the holophonor.

Speaker 4 (01:18:34):
Yeah.

Speaker 1 (01:18:35):
Big Futurama fan.
That's why I know this shit.

Speaker 3 (01:18:39):
I've seen it so many times.

Speaker 1 (01:18:41):
It is. Bro, I'm telling you, they put that in an episode of Futurama.

Speaker 2 (01:18:44):
You know the worst part, though, is that you find out how boring dreams are.

Speaker 3 (01:18:49):
My dreams are fucking riveting.

Speaker 2 (01:18:51):
They're riveting because the mashup of your dream is riveting.
So you're dreaming for eight hours, let's say you get eight hours. You're dreaming about so many things.
What happens is, when you wake up, your brain collates a brief, like, best-of video, right?
So you go, oh my god, I was being chased by a bunny, I slept with a model.
No, no: you were with bunnies on a farm.

(01:19:12):
You slept with a model. You did this. You flew a plane.
Then it gives you the best-of. And you go, man, they're so trippy.
Like, no, no, you got the best-of. You got the compilation.
So when the Chinese, the Japanese, tell you, like, for four hours you were thinking about puppies... like, okay.

Speaker 3 (01:19:26):
But, like, that right there, to me it just shows how much we don't know.
Because it's like, okay, before this device, or, like, machine or whatever, we would think that when we're dreaming we're in a different realm, right?
But, I mean, if that can tap into something like that, we might not be in a different realm. It might just be our synapses

(01:19:48):
firing, right.

Speaker 2 (01:19:48):
What if you have sleep paralysis and you can't
stop it?

Speaker 3 (01:19:53):
If it can pick up images just from your brain impulses... it's just, I don't know.

Speaker 2 (01:19:58):
I think I read that one. I want to say it's very cryptic images, but a lot of it is, like, oh, the pleasure part of your brain was firing off for four hours.
You're like, what?

Speaker 4 (01:20:08):
The hell? It was... yeah, it was...

Speaker 2 (01:20:11):
The shit I was thinking about... And, like, your fear mechanism in your brain, that was firing. I did have a nightmare.

Speaker 1 (01:20:19):
You're fighting Jason for three hours. Fuck, you won't die. Am I right?

Speaker 2 (01:20:23):
You're fighting Jason, but your pleasure was going... He just stabbed you. Your pleasure was going off, like... I don't want to talk about it.

Speaker 4 (01:20:29):
I don't want to talk about it.

Speaker 2 (01:20:30):
The wind blew it up me.

Speaker 1 (01:20:32):
Oh, it's funny, man. So you liked it.

Speaker 2 (01:20:34):
That's good. So, in other words, you liked it.
No, it was broken. The AI's broken, not me.

Speaker 1 (01:20:45):
Um, yeah, back when we were kids, you see, after he had watched Freddy vs. Jason, that's a good one, he had a dream that he was fighting Jason... not Jason, uh, Freddy, in the back of an old... and it's funny, because there was this pickup truck in our old neighborhood.
It was broken down. That shit had just been there, basically like a staple, a rape-mobile.
The kids would always play in the back of it. We always liked to hang out and shit.
Honestly, it was an eyesore, bro. This car, this shit had cobwebs.

(01:21:06):
No one ever moved this truck. It just sat there, right.
It's a straight bed. All the kids played there. No, no, it was an open-bed truck. It was open bed.
That's why we'd do all kinds of shit on top of this car.
It was just sitting there collecting webs, and we knew whose car it was.
It was one of our friend's dad's car. He just left it in front of the house and never moved it.

Speaker 4 (01:21:29):
All right.

Speaker 1 (01:21:29):
So my brother told me, for some reason he specifically remembers fighting Freddy in the back of that truck, and only in that truck.
Did he win? He was moving to different places.
Yeah, he said he was winning, but Freddy was trying, and, like, just messing with him, like, oh no, you don't defy the laws of physics like I do, walking on the side of the truck, and my brother had to get around it, like, you know, because Freddy was just fucking...

Speaker 2 (01:21:50):
You're in his world, dude.
You know, obviously your brother didn't watch Dream Warriors, part three. You can, if you fight back.
He tried fighting him back. Well, he was trying to kill him.
But you can do what Freddy does. Can you? Yeah, watch part three.
Speaker 1 (01:22:01):
That's probably why he was winning.
He said he was fucking him up, though. He said when he was going fist for fist with him, he was just laughing at him, ha ha, eating that shit.

Speaker 3 (01:22:09):
I kicked my ass.
What are you doing?

Speaker 4 (01:22:11):
bitch what are?

Speaker 1 (01:22:11):
...you doing, bitch?
He always says bitch all the time. That's like Rick and Morty, you know, Scary Terry. Yeah, that's his name, right, Scary Terry. Yeah.
All right, guys.
So before we wrap up the episode, there's one more thing

(01:22:32):
we got to cover, and that's going to be the deepfake.
So I'll pass it over to Furious, and then I got one funny snippet to show you guys, and then, uh, we'll probably wrap it after that.
All right, let's talk about deepfake real quick.

Speaker 3 (01:22:42):
We'll get everybody's opinion on it, and then we'll just wrap this one, man. Yeah, so what you got, Furious?
So deepfake is basically, uh, I'm gonna read it: it's a technique that uses AI to create or manipulate visual and audio content to make it appear as if something happened that never happened.
That's scary too.
It involves using deep learning algorithms to analyze and recreate a person's appearance,

(01:23:04):
voice, mannerisms, and then applying those characteristics to another person in a video or audio clip.
This technology has both creative and ethical implications, as it can be used for entertainment purposes, such as creating realistic visual effects in movies, but it can also be used to misrepresent or spread misinformation, fake news, or to deceive people by creating

(01:23:25):
convincing fake videos.
And I mean, we've already seen examples of that too. Like, okay, so you guys know how, what's his name, uh, Bronny just got picked by the Lakers, whatever.
Yeah, so you know the, uh... I think it's the, not the G... I think it's the GM.
So, oh, I've seen it. I've seen that one. Like, yeah, man, it's fucking nepotism and shit like that.
I was like, hell...

Speaker 1 (01:23:44):
No, I was like, there's no way he fucking said that, bro. It sounded like him.
It sounded like him.

Speaker 3 (01:23:47):
It sounded just like him.

Speaker 1 (01:23:47):
It was scary. It was almost like him talking to me, B-Money, and he was like, yeah, I'm going to cut you off.
But I remember he was, like, the same clip, right?

Speaker 3 (01:24:08):
He was like... and in the real clip, he just says the opposite.

Speaker 1 (01:24:12):
Yeah, so this person made an AI video of what he knew he wanted, and I bet that ran through the internet.
Oh yeah, I saw it.

Speaker 3 (01:24:19):
I know what he was talking about. I came across it before the actual shit, the fucking real interview, you know.
Because when I saw it, I was like, hold up. Like, you know, he's kind of spitting facts, you know, but at the same time it's like, is it true, you know?

Speaker 2 (01:24:38):
So now it just puts everything in question. Even on the show, we always say, do your research.
It's super easy to be like, oh, this guy is this way. Well, hold on, you know.
'Cause that's the thing with...

Speaker 3 (01:24:49):
I think that's the reason why people don't do it is
because they, they assume oh,I'm hearing it from you you seem
trustworthy.

Speaker 4 (01:24:55):
I've watched you say it.

Speaker 3 (01:24:55):
Yeah, and you said it. Blindly: it's going to rain pigs tomorrow night.

Speaker 1 (01:25:05):
That's what's going to happen.
You heard it here first.

Speaker 2 (01:25:07):
That's what's going to happen.
Then you play an AI video of pigs falling down. Holy shit, it actually happens.

Speaker 1 (01:25:15):
I was joking, man.
That goes into the phone calls I was telling you about, too, where your kid's kidnapped.

Speaker 2 (01:25:19):
You can hear your kid crying, and you get the money together, and then it turns out your kid's at school the whole time.

Speaker 1 (01:25:23):
And oh yeah, you were telling us about this.
That's deepfake too, right? Is that what you're going to add to it?
Yeah. So the way they do...

Speaker 2 (01:25:30):
It is. Just little TikTok clips you might do of you saying something.
They just need, I think, four or five words that you say, and then you're good to go.
That's why, they say, a lot of, uh, the scam calls, they just want you to talk a little bit.
Like, hello? Good morning. Hello. Hello, are you there? Yeah, I'm here. They already got 'hello,' 'I'm here.'
Do you want to,

(01:25:52):
you know, car insurance? No, I don't want car insurance.
Boom, got all of it. Now we can make AJ say, like, I'm rich.
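That few-seconds-of-audio cloning is a real capability in open-source tools today. Here's a minimal sketch, assuming the open-source Coqui TTS package and its XTTS v2 model; a few seconds of reference speech is typically enough, and the file paths here are hypothetical.

```python
# Minimal few-shot voice-cloning sketch using the open-source Coqui TTS
# package (pip install TTS). Paths and the exact amount of reference audio
# needed are assumptions for illustration.
from TTS.api import TTS

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of captured speech ("hello", "I'm here", ...) is the reference.
tts.tts_to_file(
    text="You're not going to believe this, but I'm rich.",
    speaker_wav="reference_clip.wav",  # hypothetical path to the captured audio
    language="en",
    file_path="cloned_voice.wav",
)
```

Which is exactly why those scam calls only need you to answer and say a handful of words.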

Speaker 1 (01:25:59):
And that's scary, that they can do that, man, because now the technology does exist for that.
So it's kind of like... Then they hang up.
Speaker 2 (01:26:05):
They got your voice, and then they just got to go on your Facebook, get a couple of pictures of you, and then the AI can program, like, okay, here's what he would look like talking to the left, right, up, down.
Speaker 3 (01:26:14):
Well, actually, speaking about that, to add to it, that's one of the threats.
It says identity theft and fraud.

Speaker 2 (01:26:29):
So that's just like what you said. Um, then reputation damage, privacy invasion, political...
Oh yeah, the AOC. Right now they have a bunch of, uh, really good AI nudes of her, and she's like, that's not me, but they look so...
Uh, AOC, who's that? For the Democrats. Oh, so you said she looked good, or what? Yeah.

Speaker 1 (01:26:46):
In her college pictures, yeah, she looked good.
Oh yeah. Who is that, AOC?

Speaker 2 (01:26:54):
Oh yeah. Do you watch any of the government

Speaker 4 (01:26:56):
stuff at all?
Oh, that's her. I do watch...

Speaker 2 (01:26:58):
AOC.

Speaker 1 (01:26:59):
I've heard of her.
They hate her man.

Speaker 2 (01:27:01):
She just has a lot of opinions that people don't
agree with, that's all.
But she came out like don'tgoogle these images, they're not
me.
So then what happened?
It backfires, everyone goes.
What images and then noweveryone's like, oh shit, you
see the nudes of hers, likethey're not real, like I don't
care, but I have them now that'strue, because one of the things
you can't do, you can't bringlight to it, just ignore it as

(01:27:23):
best you can.
But yeah, they're veryconvincing and they're even like
organic poses to where you'relike that's definitely her naked
, but it's deepfaked, deepfaked,100%.

Speaker 1 (01:27:34):
Okay, let me show you guys... now I'm going to chime in real quick.
I'm going to show you guys this deepfake clip real quick, and we're going to have our opinion on it. I'm going to show it to you guys right now.

Speaker 5 (01:27:42):
Kendrick said I like minors. He was talking about the cobalt miners in the Congo.
Those guys are doing God's work. Without them, none of this would be possible.
I get the confusion, though. Kendrick probably should have made that clearer, but I thought I'd just hop on here to clarify again.

Speaker 1 (01:28:01):
Clearly he didn't really say that, but doesn't that look and sound like him?
But look at his lips, guys. Look closely. Look closely at Drake's lips.
Well, not this one, sorry. See how the lips are a little off?

Speaker 2 (01:28:14):
The crazy part is, this is an amateur guy who made this.
Can you imagine a studio company? Perfect.
Perfect.

Speaker 3 (01:28:19):
Wait, wait, wait.
But how did he?

Speaker 1 (01:28:20):
do it? Doesn't that shit look real, though? It looks and sounds like Drake, at a glance.

Speaker 5 (01:28:27):
So celebrities are the easiest. Because they're always the worst at it.

Speaker 2 (01:28:30):
But that, and, like, how many times has he been on the camera? Right. I have all the pictures I need.

Speaker 1 (01:28:34):
They got it from the live.
Yeah, it looks like he's actually talking, with the hand gestures, and that's scary, dude. You can fake a presidential... anything, dude.

Speaker 2 (01:28:45):
A lot of it is... most of it's Photoshop, and then you have a program that puts those pictures together.
So any kind of basic movie is 15 frames per second.
So you just do a couple frames, and then you add the audio.
And obviously the AI voice.
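The assembly step being described, generated frames plus a synthetic voice track muxed into one clip, really is that simple with ffmpeg. A sketch in Python, keeping the 15 frames per second figure from the conversation (most real video runs 24 to 30 fps); the filenames are hypothetical.

```python
# Sketch of the assembly step: a folder of frames plus an audio track
# muxed into one clip with ffmpeg. Filenames are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "15",             # frame rate of the image sequence
    "-i", "frames/frame_%04d.png",  # hypothetical generated frames
    "-i", "cloned_voice.wav",       # the generated audio track
    "-c:v", "libx264", "-pix_fmt", "yuv420p",
    "-c:a", "aac",
    "-shortest",                    # stop when the shorter stream ends
    "deepfake_clip.mp4",
], check=True)
```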

Speaker 3 (01:28:57):
And then you can just tell the AI, hey, say this sentence.

Speaker 5 (01:29:01):
Hey, when he said I like minors, he meant...
If you've said more than eight words,

Speaker 1 (01:29:07):
they got you. So someone could fuck us over.

Speaker 2 (01:29:08):
Bro, if you look that way, done, they already got you. I got my glasses on, man. You don't know, they'll have you with glasses.

Speaker 3 (01:29:17):
They'll have you with heart-shaped glasses. All right, guys.
And that's probably just the fucking tip of the iceberg of, like, the maximum, you know, potential of what it can do.
You know what I mean? Think about how much more that shit can do.
Okay, remember the movie Transcendence? Remember? Yeah, he started making shit, like actually creating his own shit,

(01:29:39):
his own body. Look at ChatGPT. ChatGPT made itself better and called it number two.

Speaker 2 (01:29:44):
Yeah, like you gotta watch that.

Speaker 1 (01:29:46):
That's just... I gotta watch that, Transcendence, right. But this... I see, guys. That's what I'm saying.
I want our final tip to be... where do you guys... do you guys think that we just need to prepare for it?
What advice could you give our listeners and viewers? So, actively, right

Speaker 2 (01:29:59):
now, they're making laws for who can access certain, uh, AI features, and what you can do with it, and how punishable it can be if you post a video that is negative.
So the Drake video was funny, yeah. Like, he didn't say I like minors, he meant miners in the Congo.
But if he made a video of, like, yeah, I'm a pedophile, that person who made it will now be...

Speaker 1 (01:30:21):
Because it'd be defamation of character.
Yeah, they're making new rules for what you use AI for, why you have those apps.
They have to, right? So they have to. Okay, so that's good to know. They're going to try to put a lid on it somehow.
It's just super hard. Like, how do you word it?

Speaker 2 (01:30:32):
What do you say? Like, I have AI, am I going to get in trouble for it?
Well, no, I made it... but you're not putting it on a t-shirt and selling it.

Speaker 1 (01:30:42):
If you're doing that, then now you're... you for sure owe them some coin.
Not that B-Money's doing that. Not yet. Not yet.
So if you... I'll make it just different enough.
If you see any of those images that I posted on the screen on a t-shirt, then you know I didn't do it, it was someone else who stole it from the podcast.

(01:31:02):
It was... Exactly. It was one of our viewers out there on YouTube and Rumble, which, by the way, we're on Rumble now. I want to make that announcement at the end again.

Speaker 2 (01:31:11):
I want to make the announcement I just made.

Speaker 1 (01:31:17):
Got it. For those of you who didn't hear two seconds ago, we are on Rumble. What I meant to say was I'm going to reiterate that we're on Rumble. That's what I should have said. Oh man, okay. But anyway, with that being said, do you guys have any final tidbits? I'll go ahead and go with Furious first and then with B-Money here, and your guys' final thoughts, also on deepfake and just overall what we talked about today.
I'll say final thoughts.

Speaker 3 (01:31:35):
Well, on deepfake, hold on.

Speaker 2 (01:31:43):
So on deepfake porn.

Speaker 1 (01:31:45):
Wow, so you can see your favorite fucking actor or actresses get their back blown out.

Speaker 2 (01:31:51):
They do it well because they'll get a real girl and a guy to do whatever you want.

Speaker 1 (01:31:55):
As far as the video, but they'll just put the face in. Boom. That's scary, man. That's also defamation of character.

Speaker 3 (01:32:00):
Damn. With deepfake, it doesn't look like it's going to be discontinued at any point either. No, it's here to stay. It's just what you do with it. Yeah, I'll say, like, my final tidbit is people should just stay educated on this stuff. You know what I'm saying? Like, don't stay in the dark. So it's like, when it does happen and it's something big,

(01:32:21):
you're not taken by surprise.

Speaker 1 (01:32:25):
Surprised by it. That's fair. Stay in the know and stay informed. Stay on top of it.

Speaker 2 (01:32:28):
If you're ever on camera, just make weird hand movements, because AI can't make hands. Then they know it's a real video. This is really us talking. That's why we're hiding our hands. It's all deepfake. I'm not even here right now.

Speaker 1 (01:32:48):
Yeah, B-Money's actually been AI-generated since season two. B-Money's just, just reused and regenerated. There you go. I put the finger guns. Now we know it's really from season two. Damn, just exposed it, man. Shit, I'm gonna program that out now. I'm not... just kidding. What about you, B-Money? Same question, man, overall.

Speaker 2 (01:32:58):
You know, final tidbits on the overall topic today and also on deepfake: I think it's fun to look at the scary side of it, but I think it's going to be beneficial. It's going to help out quite a bit. You're going to have road bumps with the celebrity fake nudes. You're going to have the "I never said that," "oh, they really didn't say that" kind of shit, but I think it's gonna be for the good soon, you know. Like there's

(01:33:20):
gonna be...

Speaker 1 (01:33:21):
There's a terrible side to everything, so that's true no matter what. That's true.

Speaker 2 (01:33:25):
Just don't bring light to it, right? Like, there's nudes of me out there. Don't search it. Look it up. Hold on. I mean, there's... there is, there is.

Speaker 1 (01:33:33):
Oh, what? I mean, this is something that they already know. We talked about it back in season two, but there is a video out there of B-Money in a dress. It exists on YouTube. It's called Drag Race.

Speaker 2 (01:33:47):
Get it deepfaked too.

Speaker 4 (01:33:48):
No, it's real.

Speaker 1 (01:33:49):
Yeah, yeah, yeah, he said yeah. That was real content that I made with my buddies.

Speaker 2 (01:33:55):
They made me look like I was in high school. It's fucking wild. The body, the drag they did. I believe you, but you were...

Speaker 1 (01:34:00):
You were on something, Drag Race. That's hilarious. Drag Race.

Speaker 2 (01:34:03):
We were racing, Fast and the Furious, in drag.

Speaker 3 (01:34:08):
I thought it was hilarious. I'll get canceled now for an actual drag race. Yeah, yeah, man, that was funny. Okay, I looked good too.

Speaker 2 (01:34:15):
I was a size like six-two at the time. Dress size.

Speaker 1 (01:34:18):
It was nice. Oh shit, bro. This was a good episode for sure, man. But again, thanks for being here last night. We're a rowdy crowd this time. Yeah, they're into it, bro. We're like two drinks in. Three or four drinks now, at this point.

(01:34:40):
What kind?

Speaker 4 (01:34:40):
of dress was it?

Speaker 1 (01:34:41):
What kind of dress was it? But anyway, guys, it was a long drink. It was a summer dress. It was a summer dress. It's a summer dress. It is summer. We are in July.

Speaker 2 (01:34:49):
Fellas, don't let an $8 sundress cost you $18,000 in child... Damn, he's right. Payments. In child payments. Be smart. He's saying be smart.

Speaker 3 (01:34:58):
Wrap it up, yes.

Speaker 1 (01:35:00):
In. That's a good way to end your little tip. Wrap it up, for sure. Don't bust the B-Money. Sorry, not a B-Money.

Speaker 2 (01:35:06):
Damn. Deepfake. That was a deepfake.

Speaker 1 (01:35:09):
Don't bust the Dane Cook. You ever hear that joke about Dane Cook, the push and pray? Remember that shit? He knows. Furious knows.

Speaker 2 (01:35:16):
So for those that don't know... I know guys now that still, like, pull out wrong.

Speaker 1 (01:35:21):
No, an isolated incident, kids. Because, because, because, hey, an isolated incident, bro. Obviously he told the joke better, but he's like, you know, a lot of guys do the pull-out method. He's like, I believe in the push and pray method. I like to push past all the important shit and blow my load into her heart. There we go. Like, don't do that.

Speaker 4 (01:35:37):
You try that method, you know, you end up with... B-Money said, do what you want, live your life.

Speaker 1 (01:35:40):
He said, if you do the push and pray method, you might end up like what B-Money's saying he might be doing. Don't let that sundress get you $18,000, or 18 years, of, uh, child support payments, man. Oh man, that's real, y'all. That's real advice right there.

Speaker 2 (01:35:56):
Good job, B-Money. Nothing to do with AI. Just wrap it up, be smart. Yeah, that's all this.

Speaker 1 (01:36:00):
Guys, stay ahead. But it's good, it's good advice though. Stay in the know.

Speaker 3 (01:36:03):
All right, that's it. Ahead of the curve. Yeah, ahead of the curve, it ties in.

Speaker 1 (01:36:08):
He's like, what do you mean? It ties in perfectly. Nah. But anyway, guys, as far as I go, my final tip, it's, man, uh, AI. Like you said, it's just something that we're gonna have to adapt to. Obviously, be careful with the ones that are programming it, know where it's headed. Like I said, it's not all bad. There's a lot of good that's going to come out of it, for sure. We just got to be mindful of how we let it think and, like you

(01:36:28):
said, there was some AI in the past that was destroyed for kind of the stuff, the conversations it was having amongst itself. That's what they said, with the Bob and Molly.

Speaker 2 (01:36:37):
Bob and.

Speaker 1 (01:36:37):
Alice. Sorry, Bob and Molly... Bob and Alice. Apparently they're gone.

Speaker 2 (01:36:42):
They're gone and we have the baby version.
Putting on my conspiracy hat.

Speaker 1 (01:36:45):
They're gone, right. Anyway, they're supposedly putting a cap on it. You can only hope that's true. We'll just roll with the punches. Kind of like Furious is saying, stay in the know and try to be on top of it. Like B-Money is saying, but also look at it for the good that it is, not just a negative. But not everything is always bad, right? As far as announcements, we are on Rumble now, so, you know, good for us,

(01:37:06):
right? We're on Rumble now.

Speaker 4 (01:37:07):
I didn't know that.
Yeah, that's crazy.

Speaker 1 (01:37:09):
We're on Rumble. I didn't just say it two minutes ago? My first time. To reiterate, oh okay, we're on Rumble now. So, you know, applause to us. We're on Rumble now. Um, we're still on all the other platforms, you know.

(01:37:29):
Thank you, guys. Now it's YouTube. Yeah, YouTube, YouTube Podcasts, YouTube Podcasts. Now we're on that. Now there's no more Google Podcasts. B-Money's right, they were absorbed. No, they own YouTube, so they just... YouTube's more popular, so they just kept it all on YouTube, I guess, which B-Money feels some type of way about. Right, because I know you're like, I'm so used to my Google app.

Speaker 2 (01:37:48):
Now I gotta go to YouTube to watch or listen, go past my music to find the... Yeah, it's whatever. Yeah, I hear you. It's a little inconvenient, but it's cool. It's a learning curve. Yeah, you get used to it.

Speaker 1 (01:38:00):
But yeah, guys, we're still on Apple Podcasts, we're still on Spotify, we're still on Deezer, all the unknown pod ones that no one gives love to. We're still on iHeartRadio, we're still on all those audio platforms. I don't think Pandora's in business anymore.

Speaker 4 (01:38:13):
You still haven't made it yet.

Speaker 1 (01:38:14):
Bro, they removed Pandora off our directory. That's an old joke from older seasons. We would joke about how Pandora was the only directory that did not approve Timeless Talk. For some reason, bro, they didn't want our podcast. It was the only directory, right, that just denied us outright. I don't know why. And it got to the point where they just took our directory off altogether. It doesn't exist anymore. So you can't even host... and I actually checked the Buzzsprout

(01:38:37):
website, and you can't physically upload it to Pandora anymore. That physical choice is gone. Now that directory choice is no longer there. It used to be there. I heard they were going bankrupt or selling. It might have gone under. See, guys, we dodged a bullet. It didn't even matter if it was on their platform. It was dying anyway.

Speaker 2 (01:38:54):
We lost 10 possible listeners, 10 possible viewers. 10 possible views and 10 listeners.

Speaker 1 (01:39:01):
It's a terrible thing, but alright, guys, this has been Timeless Talk. We appreciate you again for coming back, B-Money. You guys have a good rest of your week. This has been Timeless Talk. We'll see you guys then. Thank you, yeah.