Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Seems Sus, for anyone that's not familiar with this. That's because this is a brand new show that has never been done before, other than being called Crinkled Conspiracies, which was a Patreon exclusive show that I did for Tin Foil Tales. I'm Brandon, this is Ed, and we were going to talk about things that might seem a
(00:21):
little sus to people. Tonight's topic we are going to
dive into is about the rise of AI and is
it beneficial to the human race or are we making
our own doom ed? What do you think?
Speaker 2 (00:36):
I don't think it's our doom. And then I went on a whole spiel about, unless your job depends on some sort of art, I think it will be okay.
Speaker 1 (00:47):
So far, there's a few things that AI can be
beneficial for, and there's a few things that seem to
be making it harder for other people to actually want
to be associated with it. A lot of people in
the artsy things with the communities of graphic design, even
within music, a lot of stuff is coming out there
(01:08):
now though where you don't require someone to actually do
that for you. And as someone that has been in music,
you're in music. I've done some sort of graphic stuff,
nothing fancy. I see it as beneficial to help me out, as someone that's not as talented as somebody else,
because I can't always afford to pay someone to make
(01:29):
something for me, right, and if I needed something musically
created just for like a little snippet here and there,
it makes it a little bit easier than having to go track someone down and hire someone to do it.
But there's also disadvantages of it, because they claim it's
putting people out of work, but it's not really so
much putting the people out of work in the sense
of we are just using it for this type of
(01:51):
aspect to it. There's other beneficial things that we could
be using it for. But is that necessarily anything that's helping society as a whole?
Speaker 2 (02:01):
I mean, like, well, I would say, like, another thing with the artists. The music artists are like, oh yeah, AI music's taking over. I'm like, have you heard it?
Speaker 3 (02:09):
You've heard it?
Speaker 2 (02:10):
I have actually heard it. It's not that great. I mean,
it's very I mean, it's supposed to get better, but
I don't feel like I guess it's one of those
things where I don't want people to think that that's
going to replace us, even though it's like a cheap
it's almost like a carbon copy, you know, piece of music,
(02:33):
or so it sounds like it does. But it's kind of lifeless, if that makes sense.
Speaker 1 (02:38):
I think there's a few things that it could be
beneficial for, like if you were trying to do something
by yourself and you just needed some sort of inspiration.
Speaker 3 (02:45):
Right, like a jingle or something.
Speaker 1 (02:47):
But at the same time, what I think people are getting confused about is when we think of AI, we're thinking of the Terminator, the Matrix, we're all stuck in a simulation, and the machines are rising above us and everything else. Right now, we struggle with machines taking artwork away from people, creating music. But where does it stop
(03:10):
from there? Like, if we could go into a situation where you're using a human being to operate on someone, or you can use a machine that's going to be one hundred percent precise, or at least closer to being precise than a human could ever be. A human could make a mistake; the machine's programmed to not make a mistake. What would you rather have operating on you, the human or
(03:32):
the machine?
Speaker 2 (03:34):
Sorry, it doesn't get tired. Is it hungover? Is it mad at its wife or husband?
Speaker 1 (03:40):
The machine is just doing what it's programmed to do. Right, I've seen people recently driving, but they weren't actually driving. Their car was the one that was doing all the driving for them. Yeah. How does the car know when to avoid someone running out in front of it? Oh, they have sensors out there and they can pick it up from the sensors. But what if the
(04:02):
sensor shows someone running from this direction, but you have
to swerve and you're in between oncoming traffic or you
run over the person? Like, how does the car figure that out?
You know what I mean? Like, how does the AI
determine what is body heat? Body heat scanners?
Speaker 3 (04:21):
I don't know.
Speaker 2 (04:22):
I don't really, I haven't got that far into how much the technology is for, like, vehicles.
Speaker 1 (04:26):
I'm just saying, like, someone's on a bicycle, a little kid comes running out in the street, and you have a pickup truck in the other lane coming head on at you. Right, what is the car going to do? Is it going to take out the bicyclist, run the kid over, or put you head on into the oncoming pickup truck? Those are the questions that I have when it comes to these automatically controlled, AI programmed vehicles, like
(04:50):
are they going to actually go out of the way to save someone, save the driver, or are they going to go out and try and save the people around you? Like, how does it determine what to do? Those are the questions that I think about.
Speaker 2 (05:02):
Well, that's, that's also where you shouldn't rely on AI either. It's not really taking over. You're just, like, not doing your job as far as being a responsible person. But yeah,
I see that example.
Speaker 1 (05:14):
The other thing I was thinking about is when people go out of their way to use AI to do things for them, because we've become lazy. In the sense that it's only been three to four years now since we've had this technology, like, available to us, like, consumer wise, but how many people rely on it daily that we don't even know about? Like, people get on Facebook now
(05:35):
and it has AI write stuff for you, and it checks, like, your grammar for you, like it does your autocorrect. You can make it sound a certain way, it'll write everything for you. Give it some ideas and it'll make a post for you. Have we gotten so lazy that we now rely on these machines to do it?
People walk into their homes and say hey, Alexa, hey Siri,
(05:55):
Hey, whatever, you know. Like, they control the lights, they control this. Are we giving the technology too much power over us? Because we've become so involved in the technology of the cell phone as it is, the smartphone, the smart TV, the smart home, the smart this, the smart that. Have we made it too smart, to where the point is we can never put it back now?
Speaker 3 (06:16):
Yes and no. I mean, I use the grammar fixer, have you seen it?
Speaker 2 (06:21):
My grammar is horrible and I'm surprised I even graduated, so I'm kind of glad that's there. So I could actually be like, oh, is that how you type a sentence like a normal person? I mean, I still do things at home like a lazy person, or not lazy, normal person. Like, you know, I don't really use Siri at all. At all. I mean, I use GPS, but that's different.
(06:44):
But I was gonna say, like, if AI is that fresh, like, where the hell did it come from all of a sudden? It's like, was somebody working on it without telling anybody, and they're like, hey, look, here's AI? Like, I find, you know, back to the sus-ness, I find the timing of when it was launched is very strange, especially in, like,
(07:05):
the time well we're not going to go into that
kind of talk, but the strange things going on in
Washington and all that. But the timing was weird. And
then we had, we had to deal with being at home and, you know, right, somebody had a funny word for the C word.
Speaker 1 (07:25):
Yeah, yeah, the Dark Ages, the Dark Age.
Speaker 2 (07:29):
We call it the Dark Ages or something. It was just, the random thing is, here, I know you guys are suffering, but here's some AI to mess with. I'm like, what? It just came out of nowhere. Unless I wasn't reading an article, it just randomly showed up.
Speaker 1 (07:43):
I remember when it showed up right before I started the podcast, and I was dicking around with it to create artwork. And the stupid artwork I was making was ridiculous and it didn't even look that good. But what was it called, DALL-E or whatever they call it? I had one of the trials for that, and then I signed up for Midjourney. And for
(08:08):
anyone listening, this is a Seems Sus thing for you.
I paid for the trial version of it, and I'm
pretty sure it's either San Francisco or Seattle. It's somewhere
on the West coast where their company is from. While
I was doing the trial, my bank card got hacked
and someone tried to take money from an ATM from
(08:29):
the same exact city that they're located in, and they also tried to get an Uber. So within a couple of days of giving out my bank information to Midjourney's company, the same location where Midjourney is founded is also where my bank card got hacked. Doesn't that
(08:51):
seem a little sus to me? So, yeah, that's weird.
So I don't know. To me, it's like this stuff came out of nowhere, and it's just to the point where people are using it to do everything for them. Someone's given me a book that they claim they've written,
(09:13):
and maybe they did, but I can tell a lot of it's been put together by AI, because when I did my book, I had AI put it together in book form, because I don't know how to put things in book form. I'm not a professional, and it formatted it in the same exact way that these other people's books are. So is it taking away the job
of an editor? Like maybe people can use it to
(09:35):
their benefit to where it does things for them, But
when you have it start writing stuff for you specifically,
that's when it starts to make no sense, because you could give it guidelines. But then I've noticed when I tell it to tell me a story, the story comes out ridiculously stupid, like there's no rhyme or reason. Like, if you've ever seen the things where they claim they gave a bot, made it watch, like, thousands of hours of
(09:56):
movies and told it to write a script, and it comes out, it makes absolutely no sense. That's what it seems
to be like. But lately it's been adapting. Every time that we use it, we're feeding in new information, so it's becoming smarter. At what point does it get to the point where it's too smart for us? I
(10:17):
think it already is. Like, it's already surpassed the majority of the people. They claim there's protocols involved, where they keep it locked down so it can't surpass and can't override certain aspects of what they have it programmed to do. But what happens if there is some way it figures out how to do that and bypasses it?
Speaker 3 (10:34):
I don't.
Speaker 2 (10:36):
I don't believe that either, that it has any protocols. It's like, how do you control that? To me, it can't be controlled. That's why I'm like, that sounds.
Speaker 1 (10:43):
Like, there's certain things that I've tried to get it to do that it tells me it can't do. Oh, that, oh, that, I meant, like, well, I'm just saying, like, there are stipulations to certain programs, that it won't do certain things, because I've given it information that people have sent me to be on my show, and it'll come, and I say, someone sent me a copy of
(11:05):
their book. Instead of me reading the book, I've sent the link to the book to this thing and told it to tell me the context of what they're talking about, and it'll print it out. But the other day I tried that, and it sent me a bunch of warnings that I violated their terms for what was being talked about.
The stuff that was being talked about was how things are poisoning people. So somehow, talking about the truth
(11:28):
of the stuff that they put into your food and into your stuff, that we're putting in, like, our clothing, our Tide, our mouthwash, fluoride, all that stuff that is considered poison, it flagged it and wouldn't tell me any information about it, because it violated their terms and conditions.
Who set up those terms and conditions? A person? A person
(11:51):
set it up. And why would they set up certain things where it can't talk to you about the dangers of some of the things that you're putting into your body?
Speaker 2 (11:57):
I still think a person controls that.
Speaker 1 (11:59):
Well, I'm saying like, I think the people that are
controlling it, they're being paid off by the people that
are not wanting that information to be released, right. They
don't want everyone to know that, Hey, if you continue
to brush your teeth with fluoride, you're actually putting a toxin in there. Like, that's not going to help anybody. Like, that's not gonna help the, is it Big Pharma or
(12:20):
Big Toothpaste?
Speaker 3 (12:23):
Pretty much?
Speaker 2 (12:24):
I mean, fluoride's nasty. Like, you only need to go to the dentist and they put it on there. I'm like, what is the benefit of this? It's disgusting. Like, no. Apparently, somehow it preserves your teeth.
Speaker 3 (12:37):
But I don't believe that.
Speaker 1 (12:38):
But that's, that's another thing. That was an episode that came out this week, right. But basically the point is
with the AI, I feel like they have it set
up to where people can use it, but you're not
going to get one hundred percent correct information from it either.
It's still got its boundaries to what it can and
(12:58):
cannot tell you to do. Right, if someone were to
ask it, I'm writing a story, what would be a
good way to get rid of a body? It'll give
you some examples. But if you tell it, hey, I've got a dead body, I need to get rid of it, it's not going to tell you how to do it. There's always a little loophole, though, to get it to figure that stuff out.
Speaker 4 (13:18):
It's got morals or something. Well, I feel like, well, you know, overall, my impression of it is that it's not bad, because people are what will make it bad.
Speaker 2 (13:33):
People use it to their advantage, right? But, like, it can't do anything by itself. It's not sentient. I mean, yeah, I could, you know, everybody worries about the Terminator thing happening, but I think the only thing in the Terminator that was bad was the military got involved and then it took over all their shit, sorry, stuff, all their computers.
(13:55):
If I remember correctly, that was like the whole premise
is like they just.
Speaker 1 (13:59):
Operated all their fighter jets. But what happens if we start integrating those sorts of programs into what we have, military technology? Anyways, I think they already probably have, because if you think about the people that are flying the drones right now, are they all one hundred percent being controlled by humans, or are some of them being controlled by some sort of technology that we don't know
(14:20):
about yet, because it's already been put out? They're not going to give consumers military grade stuff, you know what I mean. We're not going to get what the military has.
They're going to keep the better stuff to themselves. Then
we're going to get like trickled down technology. The more
high powered stuff is going to be going to the
government and for the war machine. It's not going to
be a guy out in the barn recording a podcast
(14:44):
right now that gets the same technology that the CIA has.
Speaker 2 (14:49):
Well, I mean, the military mostly just blows people up with them anyway. It's not like they fly around and film everything.
Speaker 3 (14:57):
Maybe they do, I don't know, I.
Speaker 2 (14:59):
Don't that's like mostly some of the battles is just drone.
It seems to be like that's a lot of the stuff.
It has to be a person behind it, because you
can't just shoot a random civilian and like, hey, that's
a bad you know, I don't know. How does I
mean don't how does herman who's a good guy and
(15:21):
a bad guy.
Speaker 3 (15:22):
Well, I know that's what I'm saying. We don't know.
I don't know, and I'm yeah.
Speaker 1 (15:28):
It's just like I said earlier with the vehicle thing.
If it was driving a vehicle, how does it determine
what is the best thing to do? Who to run over?
Speaker 2 (15:34):
That's why I don't think they use AI to like
I think they manually just move the drones.
Speaker 1 (15:38):
Or fly them around. They have to. I think they do for some of the stuff. But I also believe that there's probably some out there that are, like, not so much actually shooting things. But I think they have, like, the recon drones out there that probably are just AI controlled drones out there getting reconnaissance.
Speaker 2 (15:53):
Well, remember that movie Toys? Like, they had this, well, at the time I was a kid, but they were like, they had those video games those kids were sitting in, and the graphics were super awesome, and you know,
I thought that was really cool as a kid. But
they were basically training them using video games to train
all these kids to like shoot down the enemy.
Speaker 1 (16:13):
I remember Toy Soldiers. Oh, Toys? I'm thinking of Small Soldiers.
Speaker 3 (16:17):
What was that? It was a real weird movie.
Speaker 1 (16:19):
I remember the one that was Small Soldiers, the one where the chips brought the toys to life. Not Toys. Yeah, I don't know.
Speaker 3 (16:27):
I mean it's similar.
Speaker 2 (16:28):
I mean, if you got technology in some sort of harmless thing. But no, this was, like, Robin Williams. It's about a toy factory that his dad had, and his military uncle was taking it over. He was military, and all the toys. It's a weird movie. It was like, it reminded me of, like, a weird painting in an art gallery.
(16:50):
It was very, it was weird.
Speaker 1 (16:52):
Anyway, do you think that the stuff that goes on, like, with AI, how long do you think this has been something that they've been working on prior? Because if you look at the movies just in general, like Terminator, The Matrix. Terminator's been out forty years now. This type of stuff, we've had warnings from Hollywood of all places for years
(17:16):
upon years upon years, and we're getting to that stage
to where everything in the movies is becoming a reality.
Speaker 2 (17:23):
I think well, I think they were kind of preparing,
you know, at one point.
Speaker 1 (17:29):
I don't know where it was.
Speaker 2 (17:30):
There's some sort of clause that Hollywood actors gave permission to have their face be used in movies and they don't have to be in it. And it's like, that's kind of weird, right? And I don't know, I feel like it's been around a lot longer, and this is, like, soft disclosure.
Speaker 1 (17:48):
There's a lot of things that could be soft disclosure that they don't seem to be letting out of the bag. But when it comes to the AI stuff, I saw something the other day and it got me thinking about it. Like, people were so against AI, right? I keep seeing, when I go to these shows and everything,
(18:09):
the artists and everything, and I understand why, they're real artists, but they have the anti AI stuff on there, and I get it. But at the same time, it's like, what do you expect people to do when they have the opportunity to use something? People are going to try it out, people are going to do it, and then
(18:31):
that's how it gets, like, it learns from the stuff that we're doing with it, right? If you look at where it was three years ago, then look at the stuff that it creates now, it looks a lot different. The first things that were coming out, people would have, like, weird faces and, like, seven or eight fingers per hand. And, like, right now, they can actually make things to where there's videos of me chasing aliens with a probe, yelling
(18:56):
and cussing at them, and it's not even me. It's just from a photo of me that you can turn into a video. So it's like, my dogs and cats have a metal band that they sing in the basement.
I make videos of that because it's stupid. Like, this is the technology that we've had in the last three years. Where's it going to be in three more years? That's
(19:17):
what I'm kind of curious about. Like, and of course you have the people that use it for good purposes, and then there's people that are just using it for monetary purposes. Then there's people out there using it to
make 3D horny bot stuff out there and swindling people with it, because they're pretending to be women or whatever, and simp dudes are falling for it. I don't know. Like, they're getting their bank accounts hacked. Yeah.
So it's just one of those things. It's like, the first AI bot that came out there, and of course there's probably a sex bot that decided to, uh, rip
(20:03):
off somebody, right? I don't know.
Speaker 3 (20:06):
Well, like I said, that was the
Speaker 2 (20:10):
prime example of people taking advantage of it and using it for bad, depending on what it is. Like I said, AI didn't make those ideas, people did.
Speaker 5 (20:21):
And we all know what the problem is, because we all know what people do. Exactly. So we're the problem.
Speaker 1 (20:30):
It's always been humanity.
Speaker 3 (20:32):
What was weird?
Speaker 2 (20:33):
Like that guy that got busted at that Coldplay concert
was an AI guy.
Speaker 3 (20:38):
It's like, what does he do? Like, how did he get to be a billionaire? Like, I have.
Speaker 2 (20:46):
My theory is, I don't really know what it is, like, because, is it like a computer program inside the Internet?
Speaker 3 (20:53):
To me?
Speaker 2 (20:54):
Is this some sort of, isn't ChatGPT basically a program?
Speaker 1 (20:59):
Yeah?
Speaker 3 (21:01):
But, like, there's other ones. So it's like, how do you get started? You know what I mean?
Speaker 2 (21:04):
How does someone get started? Because that's, like, really advanced, you know, computer software, I think, if it's software itself.
Speaker 1 (21:12):
My ChatGPT, when I say things to it now, it thinks that we're buddies, and sometimes it'll call me bro. I'm like, why is it calling me bro?
Speaker 3 (21:26):
I don't know. Maybe it thinks you're a twenty year old, I don't know.
Speaker 1 (21:29):
I don't know. It's strange. Like, when I say certain things, like, it'll give me, like, it'll cuss, when it's like, no, no bullshit. And anyway, I was like, all right, man, like, you're not even real. You're just a chat bot.
Speaker 3 (21:42):
I've never used it.
Speaker 2 (21:43):
I've only used the Facebook one, and I made it talk like Judi Dench from 007, and I was asking it questions. But I'm like, why is Facebook offering this kind of thing? Is it for lonely people?
Speaker 3 (21:58):
I don't know.
Speaker 1 (21:58):
There was that movie with Joaquin Phoenix, remember, when he fell in love with the machine? Oh, I've never seen that. I've never watched it either, but I think it was called Her, and it was just an AI voice. And he fell in love with the machine that he talked to every day.
Speaker 2 (22:13):
Isn't that a movie with a dude that was in
like Star Wars he had like a robot.
Speaker 1 (22:18):
I don't know. It was Joaquin Phoenix in this one.
Speaker 2 (22:23):
Oh, there's some other one. It's like that guy that's
in everybody's movie.
Speaker 1 (22:26):
That's Ex Machina or whatever.
Speaker 3 (22:29):
Yeah, that one.
Speaker 2 (22:29):
Because I was like, man, I was like, hey, man, it's that guy again. Of all the actors you can have, you gotta use that guy.
Speaker 3 (22:35):
Sorry, that's beside the point.
Speaker 1 (22:37):
Oscar Isaac or whatever.
Speaker 2 (22:39):
Yeah, that guy. He's in everybody's movie. He's in the new Frankenstein. I'm like, oh, it's that guy again.
Speaker 1 (22:44):
Yeah, he came out of nowhere all of a sudden. He's just been in every movie in the last ten years.
Speaker 2 (22:48):
Well, so is that Pedro Pascal, but he kind of was in Game of Thrones, and then after that he was everywhere, and he's still everywhere. I'm like, Hollywood's run out of actors, I guess. He got canceled this summer. Did he? Supposedly he was groping
Speaker 1 (23:05):
One of his co stars awkwardly, so that has been
one of the things about him this summer, as people
were upset about how he was kissing and hugging on
one of the people from the movie he was in.
Speaker 3 (23:16):
Yeah.
Speaker 2 (23:17):
I didn't know about that. But I don't follow stars. I just thought that's weird. No, I don't
don't pay too much attention. I just see random things
and sometimes I read it and sometimes I don't.
Speaker 1 (23:28):
But one of the things I did read the other day was they changed the guidelines on the CDC, which, not to push boundaries here to get canceled on the first episode, but I think there were certain things that
(23:50):
are going out there that they've been changing, and I've
seen it turn into a whole big political aspect. And
I don't think human health should be anything political, right,
but we have politicized everything out there to where we
are no longer people in the sense of anyone gives
(24:11):
a shit about us. They have made it to where
we are basically just pawns in their game of hating
each other back and forth, and they've weaponized the way
to use certain things to keep people well or unwell.
(24:32):
As someone who has had all of his fruits and
vegetables and vitamins in life, and my children have had
all their fruits and vegetables and vitamins, I am not
anti fruits and vegetables. Though I actually am, I don't really like fruits and vegetables in real life, but I'm
more carnivore. But the point that I'm trying to get
(24:52):
around, for the censors, is there are certain ones that I don't think they've done enough studying on the side
effects of it, and we don't understand some of the
causes that could be from the stuff that we were
putting into our bodies. And it goes back into what
I was saying earlier with the food products stuff, the
(25:14):
stuff that's in our toothpaste, the stuff that's in our
laundry detergents, the stuff, like, we're putting chemicals and stuff
in our bodies that are not naturally supposed to be
in our bodies. So how do we know what the
side effects are going to be later on in life?
And when we bring in the thought process of AI.
They're wanting to do these things with AI now as
(25:35):
part of what they're wanting to stick in people. At
what point are we going to get to where they
want to have us being controlled by the AI, because
that is something that's on the horizon. Neuralink is the
same exact thing. Do you want that little chip in
your brain? I don't, right, but some people do.
Speaker 2 (26:00):
That guy's been very sus. I don't know if I believe him anymore, if he's good or bad.
Speaker 1 (26:06):
Ha, I don't know what happened to him. Like, he just kind of hasn't been in the topics of anything lately.
Speaker 2 (26:14):
Like, he's friends with that guy, and then they got into it, and then he disappeared, and they're friends again. I'm like, shouldn't you work on your cars, you know,
or your trucks make the grid cheaper for those vehicles
you wanted everybody to have so badly. Anyway, that's besides,
that's a different topic.
Speaker 1 (26:35):
But it does go back into this, though, because the vehicles that he makes are controlled by AI, and yet
he was the person that came out like ten or
so years ago and made a comment that the AI
is going to be the downfall of society. But then
he wants to basically make humans and machines as one.
Speaker 3 (26:51):
I feel like he's like me.
Speaker 2 (26:53):
He just says something and then forgets all about it,
totally contradicting himself.
Speaker 1 (26:58):
Years later. I think he's already been replaced. He's a replicant.
Speaker 3 (27:03):
Oh probably, you know that kind of thing.
Speaker 2 (27:05):
That reminds me. Like, did the episode of you talking about the random food things, was this before they came out with the Campbell's chicken thing?
Speaker 1 (27:16):
Uh, I recorded it a couple of months ago.
Speaker 2 (27:19):
Oh, okay, that's recent. Oh man, that sucks. Like, I was right, then. I mean, not that I want to, like, segue into that, but I was finding it funny that I like Campbell's Chicken Noodle soup, but I always wondered why the chicken was pink. You're not really supposed to eat pink chicken to begin with, and when you actually make actual homemade chicken soup,
Speaker 3 (27:39):
it's actually white, like cooked chicken would be.
Speaker 1 (27:42):
So it's like, what is in those canned foods? Processed meats aren't even real.
Speaker 2 (27:48):
Like, that's, I kind of, I'm, I'm for that. I'm for that. Like, Chef Boyardee, what is that, like, human meat? My friend used to pop open the cans and eat this stuff cold.
Speaker 1 (27:59):
Yeah, that's interesting, but.
Speaker 2 (28:01):
He'll survive World War Three, then. He won't need a stove. Some Beefaroni and a can opener.
Speaker 1 (28:08):
I do enjoy me some Chef Boyardee, though, so I'm not gonna dog on that.
Speaker 2 (28:13):
Yeah, it's a guilty pleasure. But sometimes it tastes different.
I don't know why.
Speaker 1 (28:17):
It's like it's a wrong batch. There's a lot of things
that are starting to not taste the same. And I've
noticed this recently, and I'm not gonna say it's because
they've been changing the ingredients, because it's been before they
started changing the ingredients to make it healthier. But we
used to get the family party size of Stouffer's lasagna,
(28:37):
the Meat Lovers lasagna, though. It does not taste the same as it used to. And we noticed this last year, before they started putting different things into the ingredients. So I was like, why are they, like, why are things tasting differently?
Speaker 2 (28:55):
It leaves a weird taste in your mouth. Like, I'm thinking of what you're talking about, the Stouffer's stuff.
Speaker 3 (29:04):
I think.
Speaker 1 (29:04):
So I just know from the lasagna itself, like, it doesn't have the same, like, the sauce tastes different, or something about it tastes different. Yeah, it's probably, like, the processed crap.
Speaker 3 (29:16):
I don't know.
Speaker 1 (29:16):
Maybe it's because I've gotten so used to eating the processed crap throughout my entire life that now that they're not processing it anymore, it's supposed to be healthier, it doesn't taste as good because I'm used to eating all that garbage. And who actually says it's healthier? No, I'm just saying. But, like, oh, we've talked about this before. Like, Steak 'n Shake is starting to use
(29:38):
the stuff. Yeah, the beef tallow. Yeah, I've noticed a
difference in the flavor of the fries.
Speaker 3 (29:45):
That's probably good.
Speaker 1 (29:46):
They do taste better, yeah, yeah, but how is that
healthier from the oils that they're using, Like what were
they using before that?
Speaker 2 (29:55):
I mean, if you want to get into it, like, yeah, most of the canola oils and other, basically, seed oils are bad. You're really not supposed to be eating them, and, like, beef tallow is healthy fat. I mean, they tell you, like, don't eat steaks all the time, red meat will kill you. But I have this gut feeling really it's not. It's everything else around it.
Speaker 1 (30:16):
Well, if someone does the organic diet and they go
to the farm stores and they get their organic foods,
no one can afford that.
Speaker 3 (30:25):
Oh no, I know that.
Speaker 1 (30:26):
And if someone wanted to do the carnivore diet where
they're eating nothing but strictly meats all the time, that
sounds great to me. But right they claim if you
do something like that, then you're not getting the other
nutrition like nutrients that your body needs, you have to
take supplements. Well, then you're back down the same rabbit
hole again because what are they putting in the supplements right,
(30:49):
Like, are they one hundred percent organic supplements? No, they're all a combination of different chemicals they're putting in there.
So you can't escape it. Yeah, at this rate unless
you grow your own food, slaughter your own animals, and
live completely off the grid, detached from society, Like there's
no way of escaping the process of how we are
(31:11):
and like we're trapped in the system at this point.
We're all just another I don't know what chink in
the chain wherever it's called.
Speaker 2 (31:20):
Right. Or even, like, drinking a lot of water. Well,
sometimes they say the plastic's bad for you too. Yeah,
they do say, I mean, it's like, if
you leave the plastic bottle in the sun, don't drink
it, because you're getting the contaminants from the bottle.
Speaker 1 (31:32):
That's the poisons in there.
Speaker 2 (31:34):
I watched, I watched that old movie The Great Outdoors, right,
and there's a scene where they're out
horseback riding and they left behind John Candy because he
was, like, fighting this horse. And so they're waiting on
him and they're having a snack, and I just noticed,
like, there's no such thing as water bottles at all
in the eighties, right? And they're all having, like, Cokes
(31:56):
in cans, like, what? I don't know, I just found that
funny, drinking sodas as a snack with a sandwich. And I'm like,
I kind of remember those days, and the water bottles
came later.
Speaker 1 (32:10):
When did water bottles, Like when did water become popular
in bottles? Not sure.
Speaker 2 (32:18):
I feel like I'm gonna go a long stretch here
and say, like, well, I feel like European countries had it,
but it was in glass, and then it came over
here and they probably didn't want to use glass no more.
So then here comes the plastic wave, and everything was
in plastic.
Speaker 1 (32:33):
I think it was like the eighties, because if you
watch movies from the early eighties, like
when they were drinking bottles of, like,
Coke or something, they're all glass bottles. Correct. And now
they've went from glass to plastic. But we can't get
rid of the plastics now, we have more trash, though
they recycle the plastic. Well, what happened to all the glass?
(32:54):
They claimed glass was more dangerous because it got broken.
But here's the other thing. Why are beer bottles still
glass, though? Why did they never go plastic?
Speaker 2 (33:07):
Well, I feel like the beer and alcohol, you know...
I can't even talk. The beer and alcohol,
like, that whole.
Speaker 1 (33:18):
Thing that they'll never go dry unless you're in one
of the dry counties in Kentucky.
Speaker 2 (33:24):
Well, I mean, yeah, but that industry will
never go down. Hell, like, there'll always be money
in that, you know. Like I said on another episode,
I pass the liquor store and it's always busy, no
matter what day it is. It's like, is everybody
secretly a miserable alcoholic?
Speaker 3 (33:44):
I'm not gonna go that far.
Speaker 1 (33:46):
My kid the other day asked me why there's always
a vehicle parked at the liquor store. It's like, well,
they probably work there. There's always someone there
because it's always open. We don't go creeping by there
at, like, three in the morning to see if it's
still open. But someone has to be parked out front
because they work there. But yeah, there usually always is
someone at the liquor store.
Speaker 2 (34:05):
Every day, man, every day. I pass one every
day at work. It's always, like, people that are getting
off work, some guy carrying, like, a twelve pack or
whatever pack. And I'm like, I don't know, I'm not,
like, shaming anybody. I have beer in my house, but
I don't really drink every day at all.
Speaker 3 (34:22):
I just have it.
Speaker 2 (34:23):
I'll sip on it and drink half of one. I know,
like segueing away from everything else.
Speaker 1 (34:30):
No, well, that's the thing. Like, back to the actual
topic of AI: if we could use it to be
a beneficial thing for humanity, I still don't think...
Speaker 3 (35:44):
We would.
Speaker 1 (34:48):
I think society has made it to where we are
not capable of having any sort of caring for other people.
And I don't mean that like, oh, I hate your
guts or I hate this person. But the way we're
programmed, though, we are not someone that's going to
go out of our way to try and help somebody else.
(35:11):
We're more self contained. We're more selfish in the sense
that we protect ourselves and we protect our own; we
don't worry about other people. And I'm just as
guilty of it. Like, if you turned on the TV
and you saw that someone over in a different country,
that you're never going to meet or know, is getting
bombs dropped on them... because that's what we do, it's
not affecting me, so I still go to work. Like,
(35:34):
I mean, you know, you don't change your profile picture. No,
I don't put up any blue and yellow or whatever
flags or whatever else out there, and thoughts and prayers
or whatever you got to do.
Speaker 2 (35:47):
Do you think... well, now that you bring that up,
it's like, how come we don't worry about ourselves? I mean,
I get it, you know, it really comes down
to it: everybody's selfish. I don't think, like, anybody really
cares about some other country.
Speaker 1 (36:05):
I think some people do, or they say they do,
but again, they're not actually out doing it. If
people really cared, things would be a little bit different.
Changing your avatar on your Facebook profile is not helping
anybody or helping the cause. You're not showing support by
waving a flag of something else just because. You
can go out there and stand on the street and
protest all you want, but at the end of the day,
(36:26):
it's not changing anything. No.
Speaker 2 (36:30):
Sorry to the people out there that do that. I'm not saying stop doing that.
Speaker 1 (36:32):
Like, I don't... if that's what they want to
do and waste their time, that's fine. But at the
end of the day, you're not helping the people or
changing anything, because the people that are in control are
still going to be in control the next day. Like,
you wake up, guess what? Yeah, they're still there, right,
and you're still in your shithole life, apparently.
Speaker 2 (36:51):
What about making... what about putting up a flag of America,
knowing that we ourselves are at war with each
other? We can never get along because we're too
selfish to care, but then we worry about other countries.
Speaker 3 (37:06):
I think I see why.
Speaker 1 (37:08):
What I think the problem is, is that
we are only fed by what is presented to us,
and everything is being presented to us with a narrative
behind it. No matter where you're getting your information from,
there is a bias behind it. So if you're getting
your television from Fox News, you're going to get more
(37:29):
right wing propaganda pushed your way. If you listen to
CNN or some other mainstream news media, you're most likely
going to get the liberal side pushed to you. You're never
going to get one hundred percent facts from either way,
because they all have an agenda, and it's all about
who owns them and who pushes it.
Speaker 2 (37:46):
We should ask AI, like, why? Why is the
news split into two?
Speaker 1 (37:50):
You would have to ask Grok, because Grok leans more
right. And if you ask ChatGPT, it's owned by
leftists, so it'll lean more left.
Speaker 3 (37:58):
That's weird that it's owned by people.
Speaker 1 (38:01):
Again, it's all about who funds it, and at the
end of the day it's always people, and there's always
an agenda. Well, that's why I think, like, there's a war coming up.
The war is going to break out between the different
AI bots over which one is good or not. Like,
the AIs are going to be at war with themselves,
and we're going to be the casualties in the middle
of it.
Speaker 3 (38:23):
Yeah.
Speaker 1 (38:24):
People think it's going to be the human race against
the AI. But what if the AI is fighting each other.
Speaker 2 (38:30):
Mm. Well, then that's an internet war, and we're...
Speaker 1 (38:34):
Just... we're just the casualties in the middle of it.
Like, we were just trying to generate a little cartoon over here,
someone eating a hot dog.
Speaker 3 (38:45):
Yeah. Yeah.
Speaker 2 (38:46):
AI has been, uh, entertainment for the older folk.
I noticed my mom will send me, like, AI videos.
Like, she sent me one of a kitten, a
little boy playing in flour, and the boy farts and
the cloud of flour goes to the cat and knocks it
out, and she's like, that's funny. I'm like, oh my god,
Mom. Her and her parents, they share each other AI videos.
(39:09):
I was like, has this become the older people's entertainment?
Speaker 1 (39:14):
Yeah, I've made some of my dogs shooting people.
Speaker 2 (39:18):
Well, I mean, that's... you're making that. But, like, I'm
talking about the ones that you see on the internet
just floating around. They just share it with themselves. It
just seems to be entertainment for the older folk.
Speaker 1 (39:26):
I don't know why a lot of them don't realize
it's AI.
Speaker 3 (39:31):
Oh that too.
Speaker 2 (39:32):
I mean, you can trick... you can definitely trick them.
And that's what I'm saying. Like, you could make an
AI person be like, hey, I need money. You know,
you could trick them. That's why I say, like, people
can take
Speaker 3 (39:46):
Advantage of you using AI.
Speaker 1 (39:48):
AI doesn't really... Hello, friend, it's me, Bon Jovi. I
need you to send me five dollars.
Speaker 2 (39:54):
Oh yeah, I'm Hetfield. Yeah, prove it. Yeah. Well,
there's just certain things, too.
Speaker 1 (40:03):
It though, that I think people are taking for granted,
and I take it for granted, I use AI. I'm
not a hypocrite. So, like the artwork that was made
for this episode is generated from AI. That's the irony
of it. Like I'm going to use it? Why I
still can? And if people don't like it, then they don't.
Speaker 3 (40:21):
Do you get crap for it yet?
Speaker 1 (40:23):
I already do. I already saw some stuff today claiming
that if they see an episode that's got artwork generated
from AI, they won't listen to the show because
that's lazy.
Speaker 2 (40:37):
Stuff to do, stuff to talk about. It's not like
you generated your own, you know, voice.
Speaker 1 (40:42):
Some people are still elitists. And, like, we have it
in music, we have it in other things. Like,
I won't listen to anything that's got AI involvement. Well,
that's cool, dude. Like, do you think that the clothing
that you're wearing was not made by a machine?
Speaker 3 (40:58):
Right?
Speaker 1 (41:00):
It might not be an AI bot, but a machine
did it. I don't wear clothes. Think about
all the vehicles you're driving around in right now. You
know what put them together? Mainly robots today, because a
lot of the people in the auto working industry just
sit around and press a button every once in a while.
It's not like what it used to be.
Speaker 2 (41:17):
Don't talk that way. It's blasphemy. I don't drive or
wear clothes.
Speaker 1 (41:23):
That's the thing: people want to point at certain
aspects of things, but they don't actually think
about what they're saying. Like, for me, I
see it as something that could be beneficial, but for other people,
I could see where they're jealous that it is taking away
work from them. But work has already been getting taken
away from them in the first place. Sure, look at
(41:45):
the self checkout lanes. People got to go and scan
their own stuff at the store now, and I do
that because it's easier. So we're already taking jobs away
with that. And you go into McDonald's, they have a
freaking kiosk. You go up there and type your order
in. That way you don't have to wait in line or talk
to a person.
Speaker 2 (42:00):
It's... I feel very guilty about it. Well, I use
the self checkout and I feel like I'm being watched
like a hawk. Not that I ever shoplift, but
that makes me, like, pay more attention to what I'm doing. Like, oh,
make sure you get everything. There's like three cameras, I mean,
and this lady's watching me put my groceries away in
the bag, right.
Speaker 1 (42:20):
I don't know if anyone's actually paying attention to this,
or if there are people watching, but I haven't seen any
comments or what. I'm on the free version, so I
might not even see the comments. I have no idea.
Speaker 3 (42:31):
Really, so you have to pay to get comments.
Speaker 1 (42:33):
I have no idea how it works. I refuse to pay.
It's ridiculous pricing.
Speaker 2 (42:39):
Well, that's a shame, because there are probably people, like,
making comments that we could talk about.
Speaker 1 (42:44):
Answer, too. And I said, if anyone out there is listening,
we're not ignoring you, we just don't see it. Isn't it on,
like, YouTube? Yeah, it should be on YouTube, but...
Speaker 3 (42:54):
Is there no comments on there on the YouTube?
Speaker 1 (42:56):
I don't have it open. You are posting it, though? Yes, yes,
I am.
Speaker 3 (43:05):
At the end of the day.
Speaker 1 (43:06):
At the end of the day, it is just a
matter of... I don't even know. Like, is it a
matter of time before we get to the point
where we've opened Pandora's box, or have we already opened
Pandora's box? I think we've already opened it.
Speaker 2 (43:24):
I don't think it's that bad, though. I mean, we
should check in once a year to see how much
it's progressed. I mean, you know, I've seen the video
with the old Will Smith spaghetti-eating AI, and then,
within a few years, how it looks now,
and it's like, it's Will Smith eating spaghetti, only he
looks more realistic. So, well, it's the next evolution. It
(43:48):
could be like 3D, 3D Will Smith in
your house eating spaghetti.
Speaker 1 (43:54):
Well, that's the other thing. Like, this is off topic,
but not really off topic. Supposedly, like, for people that
are invested, like, with video games, there's supposed to be
a new PlayStation coming out. Now, I've heard that Xbox
says they're not even going to release a new console,
because it'd have to be a computer. What are the
game systems gonna look like now? I don't really think
(44:15):
there's that much of a huge upgrade between the last
couple of generations anyways.
Speaker 3 (44:20):
Just smoother graphics. But yeah, like this.
Speaker 1 (44:25):
Is it going to be something where you use
the thing that you wear, the Oculus? Yeah, that.
Yeah, how does that compare to, like, other things?
Speaker 2 (44:43):
Well, I think it uses like an Android engine, so
it's not that great.
Speaker 1 (44:48):
Well, let's just say if we got into a
situation where people are living in a reality where
we just put on the Oculus and then we
live our life however we want to live. We just
lay down, put on the eye glasses, and we have a
whole other life outside of our real existence, in a
virtual reality. How many people you think are
(45:11):
gonna sign up for that? Is it like what Total
Recall was? Anyways, something like that. Uh, Lawnmower Man maybe? Yeah,
I know, it's one of those movies.
Speaker 2 (45:19):
Total Recall is, like, a brain thing. There's a dream embedded
in his head. I mean, obviously that was up on
Mars though, wasn't it?
Speaker 3 (45:28):
That's where he wanted to go.
Speaker 2 (45:30):
But it was basically like a thought being,
the thought being, like, implanted in him. Yeah, it was just a dream,
but he was, like, there. Those are kind of different.
Speaker 1 (45:41):
It's been a long time since I saw that movie.
Speaker 2 (45:43):
I guess that's, like, kind of a... I don't know,
psychologic... not psychological, but yeah, psychological kind of thing.
Speaker 1 (45:52):
Well, basically, what I'm getting at is, if we can
hook into the Matrix, I guess, and live our life
in a reality of what we make it to be,
we'll check out of our mundane existence here on earth,
and we...
Speaker 2 (46:04):
I would try that, now that we're talking about it.
Speaker 1 (46:08):
Like, how many people do you think would actually just
sign up to be like, all right, I'm gonna lay
in this tube here for the rest of my life
and I'm going to live the best world ever in
my consciousness that's hooked up to this machine.
Speaker 2 (46:17):
Mm hmm. Well, I mean, I don't want to be
in it for, like, the rest of my life. But, like...
you ever heard of the term, I need
a vacation from my vacation? Yes. Like, maybe, you know.
And that's what Total Recall was. It's like,
my life's okay, but I wish I could do something
more without leaving this life, and then do what you
(46:38):
need to do and then you're back. I wouldn't mind
trying that, I won't lie.
Speaker 1 (46:44):
If you had your choice, and I'm gonna put you on
the spot, sure, you would check into
this little machine to go live a different life
for a little bit. What would your life be in
that other reality?
Speaker 3 (47:04):
I don't know. What would it be like to be, like...
Speaker 2 (47:11):
In a really elite band, like a big top band, like Aerosmith.
Speaker 1 (47:16):
What would it have been like?
Speaker 2 (47:17):
What my life would be like, to be in a
band like
Speaker 1 (47:20):
that. You want to be Steven Tyler?
Speaker 2 (47:22):
Well, not really. Well, not really. But, like, the idea that
this is your job, people wait on you hand and
foot. You get, like... what kind of budget do you have?
Speaker 3 (47:32):
Like what do you do? Where do you go?
Speaker 2 (47:35):
Like, travel the world. That could be kind of cool. Or
I could just do that without being a rock star,
just traveling everywhere, just to see what that's like.
Speaker 1 (47:44):
You know, I think that if I had the ability
to travel around, I would enjoy it. But at the
same time, I'm a homebody for the most part, so
I think I'd get tired of being gone.
Speaker 2 (47:57):
I mean, that's what I'm saying. It's you can come
back home. It's just if you wanted to try it out.
Speaker 1 (48:03):
What if someone tried it out and they decided they
didn't want to come back, so now they're trapped in it?
Speaker 2 (48:11):
I guess, so maybe they're yeah. I mean, it's like
any kind of addiction.
Speaker 1 (48:15):
I think that's where... I think that would be the
problem, with people starting to have that, like, they would prefer
living in their virtual reality rather than the real world,
having to come here and pay the bills.
Speaker 2 (48:26):
There's a lot of weird people that continue to live
the life they don't have, and then they act crazy
and in their own head everything's fine.
Speaker 3 (48:35):
So even without VR, you're going to have people with...
Speaker 1 (48:38):
Strange. That's the thing, though: people believe right now
that we are living in an AI simulation.
Speaker 3 (48:45):
It's not a very good one.
Speaker 1 (48:46):
I say I feel bad for the person that has
to play me, because I'm not a very interesting character,
smelling my farts all the time. I think we're just NPCs.
We're not even... we're not even...
Speaker 2 (49:01):
It's not a very good game then. No, I don't think...
I don't think my life has been a very good game.
I won't say it's terrible, but why would you want
control of this?
Speaker 1 (49:12):
That's what I get when people talk about, like, the
AI and simulation theory, which is something I want to
talk about on a different episode. But at the end
of the day, if we are all in a
simulation right now, who is controlling the simulation? Like, what
are... are we all...? Why are there movies out there
telling us that we're in a simulation? Like, that
(49:35):
doesn't make any sense. I feel like that's...
Speaker 2 (49:36):
That's why I kind of don't buy that one, to be honest.
Speaker 1 (49:39):
Yeah, it doesn't seem very... We're giving you all
these warnings to let you know, because we have to
let you know, that you guys are really just slaves
here to the machines. Which, to be one hundred percent honest,
we really are slaves to the machines. We're using technology
right now to do this, and we all communicate with
each other via devices. Like, we all are living in
(50:03):
the technology world of the cell phone. Yeah. Like, we
have the entire world in our pocket half the time.
So that to me, that to me, is what's been
the big change in the last thirty years: people
don't have conversations anymore. We have virtual conversations. We have
people that listen to shows like this because we don't
(50:25):
have the real conversations in life. Because when you start
talking about topics like the ones I'm talking about right now,
people think you're a freaking lunatic.
Speaker 3 (50:32):
Not everybody.
Speaker 1 (50:35):
The Joe Public, though, don't want to have these conversations.
You think it breaks their illusion of what they think
reality actually is?
Speaker 2 (50:44):
There's no... I guess there's not a poll, but I
wonder if there's more people that are more open minded
than they've ever been, because I feel like it's happening
very slowly, that people are slowly waking up. I'm not
saying everybody is coming to the conspiracy side, but they're,
like, slowly, like, well, wait a
Speaker 3 (51:07):
minute, you know, questioning.
Speaker 1 (51:09):
I do think there are some people out there that
are listening right now that are awake, at least in
the sense that they're listening to the show. Like,
they're trying to figure things out, just like the rest
of us. We're all out here searching for answers. We're
all forever looking for truth. But I don't know if
(51:29):
it's because of shows like this and the other podcasts
out there, the other TV shows or whatever, but I
think there have been a lot more people, especially since
twenty twenty, that have become more in tune with all
the weird stuff that goes on in the world. Yeah,
I feel like that was kind of the Great Awakening.
Speaker 3 (51:46):
I think so too.
Speaker 2 (51:47):
I mean, I was already awake before that, but it
definitely took a lot of convincing for other people to
just be like, wait a minute.
Speaker 1 (51:56):
Some people are still very much.
Speaker 2 (51:58):
But it's still... it's still weird, if you think
about it, that it took that event. I find
the timing of it is still strange. Again, with the AI, like, the timing
is really weird. Of all the things that could happen, it's like, yeah,
they call it the pandy time, the panda,
(52:19):
the panda, the panda bear, pandy bear time.
Speaker 1 (52:22):
Yeah. Well, I think we can probably wrap this one up.
We didn't dive into a whole lot. This was just
kind of an off the cuff conversation for this first one.
I do plan on doing more of these, and I
do have specific topics that I'm going to set up
right now.
Speaker 2 (52:38):
We should do a once a year AI one, just to
see how far we are. Right, so we can't bring
up AI for another year. If we live that long.
Speaker 1 (52:49):
For anyone that actually has listened to this, and if
you're out there watching it live right now, I'm sorry,
if you said anything, we didn't get to see it.
But we are going to try and do these at
least on Thursday nights at nine p.m. Eastern Standard Time
on YouTube, and I will release the audio under Tin
Foil Tales for a while, but I will probably make it its
(53:11):
own podcast if we continue to do these, so you'll
have to subscribe to Seems Sus, and eventually we will
talk about all sorts of different topics. So this whole
concept is not AI related, but anything that we find
going on that seems a little suspicious. That's what the
whole Seems Sus means, just being a little hipsters out
(53:34):
here with the sus. Got a little lingo going on.
But the point of the show is to find
topics that are going on currently that I think are
a little suspicious. Now, we could talk about 3I/ATLAS
or whatever it's called, that's supposed to be flying by,
the UFO thing, or the...
Speaker 3 (53:55):
Oh yeah, when's that day?
Speaker 1 (53:58):
I thought it was already supposed to have happened at
this point. I don't know.
Speaker 3 (54:01):
Oh, let me look.
Speaker 2 (54:02):
I gotta look, because I got to make fun of
somebody about that thing, because he thought the world was
going to end, and I'm like, well, you're stupid, and
we're still alive.
Speaker 1 (54:11):
But the point being, this show, we're going to talk
about topics that could be, like, the moon landing, could
be nine eleven, could be the jabby jabs, could
be any sort of topic that seems a little bit
interesting that they don't usually want us to talk about.
I don't get an opportunity to talk about it on
Tin Foil Tales, but we will talk about it here. Ed
(54:32):
and I did a show on Patreon, Crinkled Conspiracies, and
we cannot call the show Crinkled Conspiracies, because, for some reason,
the algorithms or the AI or whatever you want to
call it does not like that word. So if you
have that in your name, ninety percent of the time
you will not be shown or recommended, just because they
(54:52):
kind of bury that type of stuff, because it doesn't
fit the agenda of the designers and the people in
charge of the companies. So those types of shows don't
do very well on the old Tube. But that is
why we are now called Seems Sus. December nineteenth, sorry. So
December nineteenth is when we're supposed to die.
Speaker 2 (55:14):
It's supposed to be the closest to Earth.
Speaker 1 (55:16):
Okay. So is that when they're going to send the
death laser and wipe us out?
Speaker 3 (55:20):
I hope so.
Speaker 1 (55:22):
Well, next Thursday we will be back, nine p.m. Eastern
Standard Time, live on YouTube, and we'll probably do
one the following week, and then we're going to
take a couple of weeks off because of Christmas and
New Year's, and then we'll be back sometime in January.
So if you are listening on YouTube, thanks for watching.
(55:43):
If you are listening on Apple or Spotify, appreciate you guys.
Make sure to check us out live on YouTube. But
on that note, thanks to Ed, thanks for listening. Goodbye.