Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
The whole point is to share on here.
Speaker 2 (00:08):
Oh yeah, yeah, that's right. Embarrass the hell out of yourself
keeping the conversation going, you know.
Speaker 1 (00:14):
You're doing your friend job on your room. What's going on, everybody?
Welcome back to JJ's lounge. We're back with another episode
of Stardom. Man, it's been a little while since I've
done an episode. I was just talking about that before we
went live today. But you know what, it don't matter.
(00:34):
We're always going live on here. We're always having a
good time. And uh, you know, we got a few
people with us today. We all know my buddy here,
Sean Shank, adding into the show, uh, and to the network
as well. How you doing, man?
Speaker 2 (00:47):
Doing real well. Just getting prepped for my, uh... I
would love to say I was doing a bunch of
cool entertainment stuff and I was a little bit talking
to some comics earlier. But actually I spent most of
my day getting ready to teach college courses. So
it was kind of a boring day, not bad,
so I am all right.
Speaker 1 (01:05):
Boring is good sometimes though. Man, there's days where I
don't leave the house and these are very rare, and
I'm like, man, I cannot believe I stayed in the
house all day. And it was very good.
Speaker 2 (01:18):
Yeah.
Speaker 3 (01:18):
I did a lot of cleaning and whatnot today because
I work at an Amazon fulfillment center, and uh, you know
it's gonna be busy in the next few days with
sales and stuff like that through Amazon.
Speaker 2 (01:31):
So I did all my housework today.
Speaker 3 (01:33):
I got to go in tomorrow, you know, fill out
you know, Amazon orders all day.
Speaker 1 (01:40):
But yeah, you do it on a Sunday.
Speaker 3 (01:43):
So yeah, so we're not open on Saturdays, but
we are open on Sundays and we do fulfillment,
and then we have Monday off because it's Labor Day,
and then it's gonna be a living nightmare going back in.
Uh so yeah, not looking forward to that. So I
did all my housework today.
Speaker 1 (02:01):
So that's good. Man. I've been doing some housework. We
had a place that was sitting in the garage for like
two years. Finally got it put up. It's been like
one hundred and some degrees outside. So it's hard though, man.
But you know what, man, why don't you give everybody a
little shout out on what you do and who
you are. Like I said, we've kind of been in
(02:21):
touch for years. Now is the first time we've ever
done anything. So why don't you let everybody know who
you are and what you do.
Speaker 2 (02:26):
Man.
Speaker 3 (02:27):
So I run Beer Pirate Radio. It's a college rock
and talk radio station. I also do podcasting. One of
them is the School of Martian podcasts on Justin Nominee TV.
And then I do my podcast, the Bogart
Podcast, as well. But yeah, started that back in twenty sixteen,
(02:49):
had to put a pause on it, and now like
two years ago we started it back up. It's going great,
you know, and I love college rock and uh, you
know everything about it.
Speaker 2 (03:02):
I always wanted to do my own radio station, so
you know.
Speaker 1 (03:06):
So you say college rock. Yeah, I'm guessing that's not
a genre of music I know. So why don't you go into a little
bit more detail?
Speaker 2 (03:15):
What do you mean by college rock?
Speaker 3 (03:17):
Like pop punk, old punk rock? Yeah, just basically, I
don't know, it's hard to explain. I don't even know
how to explain college rock basically, but it's just, uh,
you know, it's you know, just old punk.
Speaker 2 (03:32):
Pop punk rock, you know, music and stuff like that.
Speaker 3 (03:36):
Yeah.
Speaker 2 (03:37):
So how do you define old my friend, because like
you say old, oh, yeah, like the Misfits and G.G.
Allin and like crazy old punk, and you're like, no,
like nineties Everclear?
Speaker 4 (03:49):
Oh yeah. So we play early two thousands to,
uh, now pop punk and punk rock, so you're throwing in Green
Day and stuff like that.
Speaker 3 (04:02):
So yeah, I know, man.
Speaker 1 (04:05):
He's gonna have a heart attack. And Sean,
you're not that much older. I mean, I'm Evan. I'm
guessing the beard hides it. You're probably what nineteen twenty
Just kidding, yeah.
Speaker 2 (04:17):
I wish man maybe a couple more beards than I.
Speaker 1 (04:23):
Sean keeps a clean cut, so he has that baby face,
but he's really like, you know, seventy five.
Speaker 2 (04:29):
I feel it. My back is seventy five, you know,
but my imagination is about nineteen or twenty.
Speaker 1 (04:35):
Right right now, we're all I don't know, are we
all about to say that? Sean? I don't know. We've
never really discussed your age. How old are you?
Speaker 2 (04:45):
Oh?
Speaker 3 (04:45):
Me?
Speaker 1 (04:46):
Yeah?
Speaker 2 (04:47):
Well, in Hollywood, I am a sharp
thirty three years old.
Speaker 1 (04:56):
I don't think I've ever... Just, God, yeah, I'm thirty six, man,
but I feel like I'm forty eight.
Speaker 3 (05:02):
I always tell people that I'm thirty something. I never
tell them my real age.
Speaker 2 (05:08):
I'm just you know, I'm.
Speaker 3 (05:09):
Thirty something, you know, but I'm thirty six as well. Yeah,
I was born the day after Christmas and that sucked.
Speaker 1 (05:19):
So I'm a December baby myself December fifth, so yeah,
okay for you man.
Speaker 3 (05:27):
Yeah, it was always either a birthday where I got presents
or Christmas. It was never that I got both, so I
had to pick one or the other, you know, and it sucks,
you know. But when I was younger, I used to
get both. But now I don't even celebrate my birthday anymore.
I mean, it's just another number on your life.
Speaker 1 (05:48):
Really. I'm usually working on my birthdays, to be honest.
Speaker 3 (05:53):
I worked on my birthday last year, so yeah, probably
gonna work on it this year.
Speaker 1 (05:57):
So yeah, Sean, man, you got to take the weekend off,
so that's kind of nice, man, just kind of hanging out.
Speaker 2 (06:04):
Yeah, well, I mean, you know, my taking off is
not the same as other human beings taking off, man.
Like, I've got four jobs,
and so I had to step away. I mean,
like I did a show, I guess Thursday night down
at Crackers in Indy. But man, I had to take a
(06:24):
few days 'cause next week... I'm teaching seven
classes at two different colleges this upcoming semester, and I've
still got a full tour schedule with the stand
up, and, you know, the podcasting, and now I've
signed on with an agency for TikToks now. So I'm
having to do that every day, which actually has been cool. Yeah, yeah,
(06:50):
and it hasn't been that hard. I was wondering, you know,
I mean, my god, what will I do for content?
But then I rediscovered within myself that, you know, I
can't shut up, so I have all kinds of stories
and shit I can talk about.
Speaker 1 (07:05):
It's weird how TikTok works, man, because, like,
the most random stuff takes off, you know,
and what you've been doing lately has just been
random stuff.
Speaker 2 (07:16):
Yeah. Actually, I told the story of how I
used to work in the adult industry. Before going
all the way in that direction, I was a
lead writer for one of the largest
sex toy distributors in the world. And so if you
(07:36):
would ever read, like, a description for, you know, the
Master Vagina Blaster five... Yeah, no, it was.
It was a cool gig, and I did it for
a couple of years, but it was just, they
(07:57):
started selling some things that were even too far for me,
and I'm not going to say what they were, but
just let your mind rest on that where it's like
I have barely any filter and my line is so
far out there, and when you go too far for me,
it's like you've got some problems. But while I was
(08:20):
doing that, I was also gaining traction online as a
comedic adult noir author, okay, and I had books on
Amazon and everything, and it was all pure comedy, like
it wasn't you know, like oh, she heaved her breast
into his face and like taking myself seriously, like it
(08:41):
was ridiculous. And I was getting a pretty big fan base.
And I had a lady... and you've seen the needlepoint
before, the "wreck my putting trench," right? So she
sent me that needlepoint, and, you know, I just
I saw it today and go, you know what, I'm
going to tell the story of that, And so I did.
And even after like just a hot second when
(09:03):
it was on, I was already up to like one
hundred and fifty views or something, so it's.
Speaker 1 (09:07):
Like, yeah, okay, I see I see posts for like
the TikTok stuff, but I don't know how real any
of that stuff is. Man So as far as like
an agency to work for, that's kind of cool though.
Speaker 2 (09:16):
Yeah. Well, the one that I hooked up with, they
actually reached out to me through somebody that is very
well established, like a friend of mine. He's
got, I mean, three million plus verified followers and everything,
and it was somebody associated with him, so I knew
it was legit, you know. And I don't know what
(09:39):
the formulas are. I am just getting on there and
putting stuff on there and if it works cool, if not, hey,
you know whatever, dude.
Speaker 1 (09:48):
Yep, I hear you.
Speaker 3 (09:50):
I use a lot of
AI-generated clip makers. I mean, I worked in advertising for
eight years, and, uh, AI ruined advertising forever.
Speaker 2 (10:04):
You know.
Speaker 3 (10:05):
I was making big money, you know, with these advertising people,
and when AI came around, it just crashed the market
for advertising.
Speaker 2 (10:16):
You know, I worked with, like, Asian Air,
Blue Cross, Blue Shield for the insurance companies and stuff
like that.
Speaker 3 (10:24):
Yeah, and I just remember the email that they gave
me was like, oh, basically AI is coming around, and uh,
we're gonna do some layoffs, and I was unfortunately one
of the people that got laid off.
Speaker 2 (10:37):
But yeah, no. So I definitely use AI,
Speaker 3 (10:42):
You know for like clipmaking and stuff like that. For
TikTok it's you know, I don't know.
Speaker 1 (10:49):
But yeah. So, would you eat a ghost pepper
wing with no water nearby? A viewer question, there you go.
Speaker 2 (11:00):
A ghost. I'll eat a ghost pepper. I have no
fear with that.
Speaker 1 (11:02):
Done it. We did the ghost pepper,
and we did ghost pepper noodles, and then we topped
it off with the Toe of Satan sucker, and we did
that live with the Hot Sauce Boss, who's got like
three and a half million, four million subscribers on Instagram,
and I met him through Wee Man from Jackass. So
(11:23):
he came on. We bought a few things from a store,
and then we did a show. It was a lot
of fun. But yeah, speaking of that, tomorrow we're
doing a hot dog challenge on here at seven thirty.
Speaker 3 (11:32):
Did you guys see the Joey Chestnut
thing with the other guy on Monday?
Speaker 1 (11:39):
Yeah, I didn't even know. It just popped up for me
the day after.
Speaker 2 (11:43):
Yeah, yeah, it's... I don't know.
Speaker 3 (11:46):
I, uh... they're betting on it, like you can go,
just like those sports betting places, right? And, uh, you know, I
Speaker 2 (11:53):
Was thinking about putting maybe twenty bucks down, uh you
know on Joey Chestnut. You know, I didn't make a look.
Speaker 3 (12:00):
Yeah, I'm pretty sure they bet on everything now, uh,
you know, sporting. And I think they were
betting on the Olympics and stuff like that for
a while, and then they put an end to that
because, you know, I don't know.
Speaker 1 (12:15):
Actually, so Nathan's is the one that runs these competitions.
So, my brother-in-law and I decided,
since we were gonna do the show tomorrow, we're only
gonna do the Nathan's Hot Dog. So I was like,
you know, I'm gonna reach out to them and see
if they're interested in, you know, just kind of helping
us promote it. And they immediately sent back this like
warning email about, oh, you know, don't do this, it's
(12:36):
not safe, you need to have medical help. It's
funny. I'm actually gonna read it tomorrow during the
show, because...
Speaker 2 (12:44):
Mention the fact that for have you ever done in
eating competition? I have, Okay, I have done competitive eating
and everybody gets all excited, like all the food's gonna
be tasty and it's a challenge and they love everything.
They don't think about the day after, okay, not thinking
of your gastrointestinal tract. You're not thinking of
your poor, you know, cinnamon star that is just
(13:08):
going to get ravaged the next day and the day after.
Speaker 1 (13:13):
You know, it's I'm already expecting to lose. I'm doing
it for the fun of it. I'm gonna get three packs,
so it's twenty four hot dogs, and
I'm gonna try to at least eat twelve of them.
I'm sure I can at least get twelve. But I
don't think I'll be the winner. I think that the
world record is like seventy six in ten minutes.
Speaker 3 (13:34):
That's crazy. You just got to think of the
aftermath of that, like, see, all those... Yeah, it's
gonna be bad.
Speaker 2 (13:42):
Bro. Fun aside from your toilet-battering
escapades that are gonna be coming up, one thing
you talked about earlier, the AI thing, I'm curious
about because you use it. I'm not a fan of AI,
all right. I know there are some things that
(14:05):
it's very useful for, but, you know, I
just... I guess I want to ask what your opinion
of it is in kind of the overall sense of
what we're looking at down the road. Because, I mean,
you work at Amazon, right? AI was supposed to do
that stuff. That's what the imaginings were for AI, is
(14:26):
that it would do the jobs to free us up
so we could create music and poetry and art and
everything else. And now AI is instead creating poetry and
music and art, and we're still tasked with doing the
menial stuff. It's like it's upside down world.
Speaker 3 (14:45):
So, like, wasn't it the actors or whatever that were
going on strike because of the AI stuff? And so,
I mean, then they came to like an agreement
and stuff like that. But you know, I think AI
can be useful in some things, you know, and
poopy on the other side. You know, I lost my
(15:07):
job because of AI. But, you know, it
helps with time. Like, I do it because
I don't have a lot of time in the day.
You know, I work a full time job. You know,
I pushed one button and it comes up with one
hundred clips that I can upload instantly.
Speaker 2 (15:25):
Oh geez, my alarm just went off for some other reason.
Speaker 3 (15:30):
So, I mean... yes, I see
the downside of it, you know, the art of
creating something is going to be disappearing and stuff like
that because of AI. But I mean you got to
look at it the other way. Like, we can use AI
for like medical reasons and stuff like that to like
save lives and stuff like that. And let's say someone
(15:51):
had a heart attack and you know, you got to
find the symptoms of the heart attack. You just look
it up on your phone, the symptoms, and AI can
tell you what to do, basically. I mean, I'm
not really good at explaining what AI does or something
like that, because I only want to use it for,
you know, clip stuff.
Speaker 2 (16:09):
But I imagine there's benefits to it and stuff like that, so.
Speaker 1 (16:14):
See, I see both sides. I think
the scary part of it is how attached it
is to every one of us through the internet,
so, like, they can find our information and do
whatever and twist it. I mean, think about all these people
who are trying to sue people for
using AI to generate songs that sound like them even
though it's not them. No, I mean, and I think
(16:36):
AI's gonna get to a point where it's developing itself
so fast that we won't be able to, like, get
a handle on it without pulling the cord kind of thing, if
that makes any sense.
Speaker 3 (16:49):
Yeah, I mean, but, uh, no, it's funny, because
I use AI to make a deepfake voice say welcome to
Beer Pirate Radio, and it sounds so real. Like, I
put in a little sound effect and it
sounds so real, you know, and stuff like that,
(17:12):
you know. I mean, I don't know how
to explain it.
Speaker 1 (17:18):
AI...
Speaker 2 (17:18):
It's a good thing and it's a bad thing.
Speaker 3 (17:20):
Yeah. And, I mean, you're gonna have so many people
that are gonna be like, oh, well, you know,
actors are not gonna be able to act anymore,
because they're just gonna type in a scene for an
actor to reenact and stuff like that, and they're gonna
have like these automatic actors and stuff.
Speaker 1 (17:38):
Right, would you... I agree. What's up, Sean?
Speaker 2 (17:40):
Uh, well, a viewer of ours,
who's thrown some questions and ideas at us, asked
a very good question, to which I would say, no,
absolutely not. I would not, because I have seen the
movie Total Recall, where I saw the Johnny Cab go
batshit crazy and almost kill Arnold Schwarzenegger because he refused
to pay the fare. So to answer your question, no, hell no.
Speaker 1 (18:03):
Never a driverless Uber? Yeah, and I don't think
I could do it. I think the idea of a
car that drives itself is scary to me. There's
a movie called... or, it's not really new. Have
you seen the series Upload on Amazon Prime?
Speaker 2 (18:18):
M No.
Speaker 1 (18:19):
So this guy's in one of those autonomous
ones, and then, like, his girlfriend or something basically
programs it to just dead stop in the middle of
the road and he like dies on impact. And uh,
it's just scary, like the kind of things that if
people can hack into that, people can hack into everything. Man,
if it's if it's a smart car, I mean.
Speaker 3 (18:42):
Well, look at the Kia Boys, the Kia Boys where
I live in Rochester.
Speaker 2 (18:47):
All they did was, you know, use a USB, uh,
you know, a
Speaker 3 (18:51):
Little key and they could steal any KIA and stuff
like that. But yeah, I mean they're geting crazy.
Speaker 1 (19:01):
They are. Sean, have you used anything AI before, or
have you, like, messed around with it at all?
Speaker 2 (19:10):
Okay, So the only thing I've done with AI to
this point, because as I said, I've been very much
against it, and part of that comes from what I
said earlier with it should be doing the menial task
for us, right? But it's also, as you know, I
teach at two colleges and I teach creative writing and English,
(19:31):
and, you know, writing is one of our primary forms
of communication, and these students are trying to take the
easy way out by using ChatGPT and all this
other stuff, completely missing the point of being there, right,
and they're looking for that easy way out. So I
take great umbrage with any of these programs where people
(19:53):
are saying, well, just do this and turn the paper in.
You know, it's like, why waste your money? But the
one thing I have done... there's a comic out
of New York, his name's Davin Rosenblat. You know Davin.
He and I've been working on a script for two
years and we had to create what's called a pitch deck.
(20:13):
I don't know if you guys know what that is,
but it's basically like a small presentation slideshow type deal
that you send off to, you know,
producers and networks and things so they can get like
a visual representation of what your show's about. Right, Right.
Stranger Things had one called The Montauk Project,
(20:37):
and it's considered like the Bible of pitch decks. And
if you look how they put theirs together, it's you
can see why it became a hit show. Right. And
there were some images, because Davin's life is so crazy
that, trying to... unless you got an animator, and we
paid an animator to do some of the pitch deck.
Some of the stuff is just so crazy that I
(20:59):
had to turn to AI to create the images, because
it was just, how else do you do it unless
you get an animator? So... I'm sorry to interrupt,
but what were you gonna say?
Speaker 3 (21:14):
Now, when I was working in advertising, I used to
make, like, flyers for insurance companies, and, you know, I
used to make them. Now these companies are using these
AI things, and literally one click, it will have a
whole pamphlet ready to produce for print. But, uh, yeah,
(21:34):
I mean, it sucks, because, like, you know, a
lot of people lost their jobs because of it.
Speaker 1 (21:43):
And do you remember there used to be
a website you could go to for school, and if
there was, like, homework that you needed done and you
forgot to read the book, you could go to this
website and it would summarize the whole book for you.
Speaker 3 (21:59):
Oh, CliffsNotes or something.
Speaker 1 (22:01):
Uh yeah, do you remember in high school at all?
Speaker 3 (22:05):
Oh yeah, we had CliffsNotes. Uh, oh yeah,
I never read the book back in high school. Like,
it was just CliffsNotes.
Speaker 2 (22:12):
It was. But then the teachers.
Speaker 3 (22:15):
Kind of got great of and they were like, oh,
you know, they would pick it in the middle of
the chapter like a question that you wouldn't think they
would ask, and yeah, and I would always fail, and
you know, it's it sucked, but yeah, cliff notes, Yeah,
it was. Uh it used to be a book that
you can buy. But then like some kid created a
(22:37):
website and put all the CliffsNotes onto the website,
and, uh, yeah, man. Uh, well... One Flew
Over the Cuckoo's Nest?
Speaker 2 (22:47):
No, I never read that.
Speaker 1 (22:48):
I just never I just.
Speaker 2 (22:50):
Just, uh, looked up the CliffsNotes, you know.
Speaker 1 (22:52):
Uh huh. Sean, so, you being a teacher... I'm sorry, I
know you got something. Have you had any experiences
where you've caught, like, a student blatantly just, like, copy-pasting?
Speaker 2 (23:09):
I have had more than one time students turn in
papers where at the top of it it says, you
know something to the effect of put in a more
specific text request if you want the generated whatever it was,
to be more specific about the thing.
Speaker 5 (23:28):
You see what I mean? It's like, it was
basically telling the kid, like, you have to be more
specific if you want a longer paper with more detail,
and they didn't go in and erase that.
Speaker 2 (23:41):
Before they turned the thing in. So, you know, it's
it's disappointing to me because I think people are missing
the point. You know. I know life is hard, it's
hard for people, and they do want to get through
things quickly. But you know, if you've ever seen that
(24:04):
movie with Adam Sandler, I think it was called Click. Yeah, okay,
so you saw... the reason, like, he loved fast forwarding
through all this stuff, right? But then we find out
at the end, you know, in the denouement, that
it was a complete mistake because life is about the
journey and it's about growth, and it's about experiencing those things.
(24:24):
And if you go through this life trying to do
everything on fast forward. And this isn't a knock on
the CliffsNotes, because, brother, I used them too, but
I used them back when we had the books that
had the yellow and white, you know, safety-hazard design
of a CliffsNotes book. But whatever, I used them too,
you know. But I also learned that, like, we've got
(24:46):
to experience this life, right, because that's the point. If
you don't accept the journey and accept the growth and
experiencing these things, then you're just you're on auto pilot
and then what are you doing?
Speaker 1 (25:04):
That's my fear with AI. I think that... so,
the movie WALL-E. You guys familiar with the movie WALL-E?
Speaker 2 (25:13):
Yeah, yeah, my girlfriend made me watch it.
Speaker 1 (25:15):
It's like this, like it's scary because it feels like
that's the direction we're going as humans. You know. I
had a total different topic for the show, but I'm
loving this AI chat and like kind of the comparisons
between high school now, so let's keep talking about it.
But I feel like AI is leading us in a
direction where there's tons of layoffs, because we're not needed.
(25:39):
So everybody's gonna be using their phone for everything. I
mean literally, you can get anything on your phone right now.
I mean in high school, I remember, first off, before
even having any kind of technology... but then they had,
like, the Palm Pilots. I remember when the Palm
Pilots came out, like these little handheld devices that look
(25:59):
like ours now, yep. And then I remember
teachers, like, trying to learn how to Bluetooth or pair
devices to send homework to you. Like, old school
internet.
Speaker 3 (26:13):
Wow. Yeah, I remember the TI-83 calculators.
We would program the answers into those and then use
those in math class.
Speaker 2 (26:24):
Like, yeah, it was just so easy.
Speaker 3 (26:27):
You just like plug it into whatever and then just
you know, put all the answers to the quiz onto
the TI-83, and, yeah, we just
used to cheat that way all the time.
Speaker 2 (26:39):
There was... so there was another... uh, so the
Internet was just getting started right when I
was about in seventh, eighth grade, and they started doing
online quizzes, and the teacher wanted us to print out the results.
So I used to bomb them all the time.
Speaker 3 (26:58):
I would get, like, fifties on them, and
so I used to... I found the font,
and, uh, put a piece
Speaker 2 (27:06):
Of tape over it.
Speaker 3 (27:07):
So it'd say one hundred, and then photocopy it, and
then hand that in, and it would
always say one hundred. And, uh, I used to do
that all the time too. I mean, kids are going
to find ways to, you know, cheat the
system and cheat education, especially, you know, talking about AI.
There's AI answering, where you could just, like, highlight a
(27:29):
question and AI will answer it for you. I mean,
like students are going to colleges to learn and stuff
like that, but they're kind of cheating themselves because they're
using all this AI stuff to answer questions and stuff
like that.
Speaker 2 (27:46):
So I guess it's like the kids are cheating themselves, right,
but it's like stolen academic valor, man. Like, look,
I'm a Hopkins dude, right? That's where I got
my graduate degree. And so my umbrage with Stanford
is not just a competitive one. But Stanford's been at
(28:07):
the forefront of stupidity for the last many years. Like
when they were getting rid of the wrestling program, and
now they're getting rid of their creative writing professors and lecturers.
And you know why, because AI's out there and the
computers can create for the kids. Just like you said,
they can put this stuff in and they can write
(28:28):
the stories for them. It's like, are you kidding me?
Some of the greatest stories that we have have come
from creative writing and the passing down of stories and
legend and everything else throughout human history. But now because
we're just chucking it over, you know, foisting our ability
to do this onto computers. You know, people are just like, oh,
(28:51):
I'll just have the computer do it, okay. So basically
you're letting it steal the soul of humanity. All right,
fair enough. If that's okay with you, by all means, let it
do it. But, you know, I don't think
we need to allow the heart of our society to
be sucked into a machine for convenience's sake, right? It's crazy,
(29:13):
it is.
Speaker 1 (29:14):
And when technology crashes, man, we're all... I mean,
majority of the podcast... How does Sean
feel about using, or somebody using, GarageBand or Pro
Tools to create a song using a digital drum, brass,
or woodwind instrument instead of playing the instrument? I
feel like, I don't know, I got an opinion on that.
(29:36):
But you first; they were asking you.
Speaker 2 (29:38):
I love that question. I love that question, and I'll
tell you why. That's a tough one. Because if you
say you're a master of the guitar, all right, you're
fantastic on the guitar, but you can't find somebody to
drop some beats for you, right, you just can't find percussion.
I don't hate that idea, right,
(29:59):
because you're already creating. But where I take
umbrage is if somebody says, all right, AI, write a
song about, you know, I don't know, somebody ate my
dog and stole my wife, some country ballad, right,
they plug the stuff in with nothing else but just
the ability to type the same thing I could do,
(30:22):
all right. What this basically boils down to
is the same effect as what we got from that
Australian breakdancing lady, Raygun, all right. The reason people
got pissed off was it wasn't just the corruption that
she took part in to get to the Olympics, which is
supposed to be the best of us, right, But it's
the fact that, dude, I can flop around on the
(30:44):
floor and touch my toes and do the snake...
I can do all of that shit. And that's not
what we watched the Olympics for, Okay. So that's the
same thing with the AI. I can type into an
AI music generator, make me a country song about how
they poisoned my dog and kicked my wife, and then
sit back two hours later going, I made that.
(31:04):
No the fuck you didn't, dude. You didn't do shit.
But if you're slapping down the bass, you know, and
you're slapping the bass and you're doing the guitar parts
and you need to have a trumpet thrown in there, dude,
that's cool. By all means, do it, because you're in
the process of creation, all right. It's the same as
if you have a DJ, all right, that is taking
(31:28):
all these pieces and they're doing... you know, I got
a buddy of mine. His name's Kevin. He's an EDM
guy, makes beautiful music, and he writes his own stuff.
But he layers. I mean, you're talking hundreds of layers
this guy has, of different pieces he puts together. But
because he's in the process of creation, dude, I'm all
about that. That's awesome. But he's taking an active
(31:51):
role in it.
Speaker 1 (31:55):
Yeah, well, and I agree. So a good example: one
of my shows I have is In the Lounge, where
I talk to musicians, and I had a guy, Jeremy
from Earth Groans, on. He's a vocal... well, he's actually the
sole guy for the metal band Earth Groans. So
when he goes on tours, he hires the crew,
but he creates all the music himself, and he does
(32:16):
overlaying and all that. So he does blend a little
bit of both to get what he wants, you know,
for sound. But it's impressive to see what they can
do with it. So I do think there are strengths to
utilizing it. But it's like anything, I mean, there's benefits.
I mean, let's say, for work, FMLA, right? It's
(32:41):
the Family Medical Leave Act. If
you need time off for medical purposes and it's covered
under FMLA, nobody can touch you at work. You can't
get in trouble for not being at work. But there's
people that will get it so that they don't have
to go to work, you know what I'm saying. So
you got the people who use something because they need it,
and they got people that are just trying to ride
(33:03):
the social media train and think they're good shit, but
they're really just faking it the whole way. So AI's
such a toss up.
Speaker 3 (33:11):
So going back to the AI music, are we gonna
start giving Grammys to AI companies that make songs?
Speaker 1 (33:18):
I think they will. I don't think that's fair.
Speaker 2 (33:21):
It's a great question, dude. That's a fantastic one. So,
I mean, like, all these companies... like, you can
get AI to make a movie
Speaker 3 (33:30):
script, are we going to give Oscars to the AI
companies that do it? I mean, it's, uh, scary.
It's a scary thing.
Speaker 1 (33:39):
Yeah, what's up, Harold? Thanks for tuning in. Harold is the,
uh, host of The M Bomb Effect, so feel free
to give him a check out.
Speaker 3 (33:48):
Oh yeah, Bob, what's up brother, dude?
Speaker 1 (33:54):
Yeah, I mean we're getting really deep into this. I
don't think it's fair to use
that to get an Oscar or any kind of award.
I mean, if you're going to use that, fine, but
expect to lose that opportunity. I mean, but that's just
my opinion.
Speaker 2 (34:11):
Stolen valor, man. Yeah, stolen valor. And it's just,
you know, and that's why Tim Walz is in so
much deep shit right now, because of, you know, some
of this, you know, stuff that he said. And I'm not
on one side or the other with it. I'm just saying, like,
people take that very seriously, and because if it even
smacks of something where you're like, you know, oh, I was,
(34:33):
I was in combat, you know, and then you find
out that the person's not that, that person's name becomes garbage.
That's that's what the the one TikTok guy, the ginger
what's his face with the missing tooth, he's gotten in
a whole shitload of trouble.
Speaker 1 (34:47):
Ginger Billy, I.
Speaker 2 (34:49):
Think that's his name. Where he was, he said, no, no,
he's, he's, he's thick. But it's,
it's a name like that. But I mean, point being,
it's, you know, people that attach themselves to something
like that, that takes bravery and effort and concentration and time,
(35:10):
you know, just like basically with what we're
talking about now with creating the music, it is, you know.
I was inquiring, my dad was a musician for years.
He played with like Styx and a bunch of bands
back in the day, and he slaved away in
front of a keyboard, you know, for decades, and you
(35:33):
know that it really meant something. And so for somebody
that would step in and get a Grammy because they
typed in, write an awesome, you know, keyboard solo. What
are we doing here? What are we even talking about?
Speaker 3 (35:47):
What, it's like letting, so this whole thing is a little
off topic to the AI.
Speaker 2 (35:53):
But like, so video games.
Speaker 3 (35:57):
Are getting introduced into sports stuff like that they considered
a sport. I don't think so.
Speaker 2 (36:04):
But you know, now these AIs are making like video
games. You guys hear about that?
Speaker 1 (36:09):
No, not obviously, but no I didn't.
Speaker 3 (36:12):
So they're making like these video games that basically, you know,
I don't know, I mean, they're just designing video games
with AI and then uh you know, and then selling
them for you know, fifty dollars a pop and it
probably only costs them.
Speaker 2 (36:27):
I don't know, probably like one hundred bucks.
Speaker 3 (36:31):
Use the AI service and stuff like that. So right,
I mean, but it's, I don't know, it's just
like letting
Speaker 2 (36:41):
Computers and stuff like that, you know, get into sports
and I mean, I don't know.
Speaker 3 (36:46):
It's, you know, technology is now becoming our lives. You know,
everything we use.
Speaker 1 (36:55):
Your glasses now are coming out linked to your
phones, and like, you literally have, I see, I got
a friend who uses the new Ray-Bans, has like the
camera and the video call and the recordings, and he
literally will take phone calls on the glasses, and
he like looks stuff up while he's walking down the
(37:16):
street talking to you on the phone, all through his glasses.
It's like the craziest shit, man.
Speaker 2 (37:20):
Yeah, nowadays with that shit. Used to be when somebody
talked to themselves walking down the street, you're like, yeah,
stay away from that person. Now they just might be
on a call. Yeah.
Speaker 3 (37:32):
What Amazon is doing now is they're having the glasses,
and they actually look at it, scan it, and tell where it's
going and stuff like that. So they're gonna start wearing
glasses in Amazon warehouses and shit, and they do inventory
by just looking at a shelf and shit.
Speaker 1 (37:52):
Yeah, it's pretty, I mean, see, that's clever.
Speaker 3 (37:57):
It is clever, but there's a lot of money, Like
you gotta buy every employee like smart goggles just to
tell that you have ten bags of like cat food. Like,
I don't know, it's kind of I think it's kind
of silly.
Speaker 1 (38:14):
But do you think it's gonna be a situation where
they give all the employees glasses and cut a bunch of
employees, because they don't need the extras now that they
got the glasses, so now they can make them do
larger quantities in their shift?
Speaker 3 (38:27):
I don't know.
Speaker 2 (38:28):
I mean, it's it's so hard.
Speaker 3 (38:31):
I mean, like, it's because the working people now of
America are getting used to technology, and, you know, I don't
know, it's, technology is
Speaker 2 (38:43):
Great, but it is.
Speaker 1 (38:47):
What would you got, Sean.
Speaker 2 (38:49):
Oh no nothing. I just I live out in the
deep country and I heard some screaming in the woods.
So, coyotes? Nope. Bigfoot? I don't know. It's, look,
look up anything on Appalachia, or, I'm sorry, Appalachia. And
(39:11):
at nighttime you hear screaming, whistling in the woods, somebody
calling your name. You just you know, you hear it,
and you're just like, oh that's nice, and you don't
pay attention.
Speaker 1 (39:18):
There's a whole docuseries on Hulu about like the Kentucky Goblins.
Have you heard of that one? Goblins? Yes, this like
group goes to investigate these claims of these goblins in
the mountains in Kentucky.
Speaker 2 (39:40):
I thought that sounds like Kentucky was the strangest thing.
But now they've got goblins.
Speaker 1 (39:45):
Yes, just look at that.
Speaker 3 (39:46):
That sounds like someone having, like, moonshine, too
much moonshine. They're like, oh, did you hear about the goblins?
Speaker 1 (39:55):
Yeah, it was in my backyard yesterday. They won't
leave me alone.
Speaker 3 (39:59):
I'm drinking, like, one hundred and fifty proof booshine
and I'm seeing godlins.
Speaker 2 (40:06):
Reminds me of that old joke about the leprechaun in
the bathroom and the guy says, well, he said he
was a leprechaun.
Speaker 3 (40:14):
Did you guys see.
Speaker 2 (40:16):
The leprechaun in the tree? It was like a news
reporter in Atlanta, Yes, yeah, yeah, that one brilliant.
Speaker 3 (40:27):
So this news reporter created this story about like a
leprechaun in a tree, and all these people like in
the city would migrate to this tree and look for
the leprechaun. But I read the backstory to this, and
it was just the news reporter making up like shit
just to get onto the news, and he did
(40:48):
like a follow up story with it.
Speaker 2 (40:50):
I mean it was crazy. Yeah, and some guy I remember,
some guy in a wife beater picked up some sort
of connecting piece of, it was like conduit, right, and
he's like, man, I found the leprechaun's flute. It's his flute.
Speaker 1 (41:06):
Yeah, that's another, that is one thing that he said.
Speaker 3 (41:12):
He said he was related to people in Ireland
when he was an African American man.
Speaker 1 (41:17):
So that's kind of a prime example with like social
media though, man, people make up the craziest claims. A
while back, there was a guy I saw on TikTok
and he kept going live talking about the lake
on his property boiling, and how he had to,
he was airing it live stream and he kept showing
(41:40):
these bubbles in his lake. It was like a year
or two ago, and he was talking about how the
government was coming after him and he was running and
like you'd go live but nobody would, like he wouldn't
tell anybody where he actually was, but he kept asking
for help, so like he was getting thousands of viewers
built up over a couple of weeks because he was
talking about some kind of cover up on his lakefront property
(42:01):
and like there's something in the ground, and then all
of a sudden he just stopped. He just, you
don't see him on TikTok anymore.
Speaker 2 (42:10):
There's all kinds of weird stuff like that. Did you
see the one where the guy I think they were
hiking out in Colorado or Utah and they saw he
has video and it could be deep fake, I don't know,
but of giants walking across the tops of like a mountain,
and all of a sudden he's, he's like, I think,
(42:33):
I'm not suicidal, I'm good. If I disappear, you know,
it's because I've had people following me. And then he
just stopped posting, like he came on a couple more
times and then just stopped completely. It was that was
that was a strange one.
Speaker 1 (42:53):
It's, it's nuts, the way that layers and stuff work when
you're doing, like, videoing, and, I learned everything myself,
and I'm not great. I'm definitely not advanced by any means,
but being able to, like, I could take a video
clip of a giant, like, I could do that within
ten minutes of getting off here. I'm not saying that
(43:13):
this guy was right or wrong, but I could easily in
ten minutes create like a clip behind me of these
giants moving. It's just because these apps are so advanced. Now.
Speaker 3 (43:27):
Well, you know, then then there's AI, you know, create
a giant going, you know, across the mountain, you know,
click a button AI.
Speaker 1 (43:37):
Yeah, but it almost takes a whole new direction
on the cry wolf thing, because when something happens,
when something crazy happens in the world, and social media
is blowing up about it, how do you know if
it's real or not? You know what I'm saying?
Speaker 3 (43:53):
Oh yeah, you know, well, dude, Jerry Springer,
when he died, like, everyone thought he had a will,
but it was pre-recorded, like a skit
Speaker 2 (44:07):
That he did.
Speaker 3 (44:08):
But everyone thought it was Jerry Springer's actual will and testament,
and like I guess he's he had a kid with
some other girl and stuff like that, but everyone believed it.
It was on TikTok and everything. And it's just, you
got to think about what you see. You gotta be,
like, it's like, you know, you gotta look at it
(44:30):
that, you know, it might be true, it might not
be true. But yeah, it's like TikTok, they trick you
into believing shit, and yeah.
Speaker 1 (44:40):
Yep, and that's, I think, I think that our government
trying to shut down TikTok is because it's preventing them
from keeping things hidden from us. But I also think
that they're utilizing AI to distract us. Like, so,
let's say they get busted for doing something, how easy
(45:00):
for them to just go, yeah, somebody just AI'd that.
Speaker 2 (45:05):
Right, one hundred percent. Well, I mean, the government's been
doing stuff to lie, I mean, to us for decades
and decades and decades, I mean, and there's verifiable proof. I mean,
you know, I always go back to, and you've
heard me say this before, Jukebox, the Gulf of
Tonkin incident. You know, if anybody needs to answer the
(45:25):
question of whether the government lied to us for their own gain,
just check out the second attack in the Gulf of
Tonkin that got us into the Vietnam War. The Secretary
of Defense straight up said yes, that was a false
flag operation. It's like, so, how many thousands of people
died because you guys wanted to lie about that? You know,
And well.
Speaker 1 (45:46):
It's, it's so easy to get everybody behind them
if there was something done against us. Like, oh, well,
you know, they fake a story, like, well, Layton did this,
so we're going in and doing this. Like, it's such
an easy way to get everybody on their side.
Speaker 3 (46:02):
Did you guys hear about, they wanted to put
a Surgeon General warning on TikTok, saying that it's addictive?
Speaker 2 (46:10):
It's gonna be like it's gonna be like a pop
up screen.
Speaker 3 (46:13):
It's like, oh, Surgeon General warning, you know, TikTok is, you know, addictive.
Like you've seen on a cigarette pack, like the
Surgeon General warning, like, you know, if you smoke
you get lung cancer. Now what the government wants to
do is put warnings on social media where it's like,
this is addictive, like, you know, you can
(46:35):
waste an hour looking at cat videos and shit.
Speaker 1 (46:39):
Over the last couple of weeks, I've been hearing on
the car rides on the radio, I'm hearing ads for
lawsuits against social media for causing traumatic problems, suicidal
issues with kids, their teens and preteens. It just
it talks about if you have any family members that
have had any kind of crisis due to social media,
(47:02):
like you can contact them and get involved in these
lawsuits that are going on.
Speaker 3 (47:07):
Well, so with that, you know, your parents let you
use the phone and stuff like that. So I mean,
do you blame the parent or do you blame the
child that's watching TikTok? So, you know, if you,
like, I don't know, I think it's like there's
Speaker 6 (47:25):
A certain age you should use social media, right, and
like I think at an early age, you know, there's bully,
you know, and so.
Speaker 2 (47:36):
I think it's totally true. I think it should be
sixteen and up.
Speaker 3 (47:41):
You know, my dog's drinking water over here, you can probably
hear it, but, uh, yeah, no, I'm thinking like sixteen and up.
Speaker 2 (47:51):
You know.
Speaker 3 (47:51):
And then if we had those Surgeon General warnings like I
was talking about before, I mean, I don't think it
would be a problem. But you got these kids going
on, like, you know, like eight years old.
Speaker 1 (48:02):
You know, you see them, You see them in the streets. Now, man,
you can be driving down the street and see groups
of kids on like literally just doing dances on their
phones or recording stuff. You know, it's strictly for social media, okay.
Speaker 3 (48:15):
And then you got some asshole that was like, oh,
you know, you look like a dork in that shirt,
and it's like, I don't know, they're bullying anywhere you
go, the.
Speaker 2 (48:26):
Right. Yep, absolutely, I do have.
Speaker 1 (48:29):
I do agree with you on that statement, because like
you see kids in these carts with like their tablets
and their headset and their headphones on, just so the
parents can go shop without their kids harassing them. Like
I remember being that annoying shit when I was a
kid in the cart or grabbing the cart and shaking
it and stuff. So like, but and people like parents
(48:49):
are now going, oh well, I can just avoid that
and hand them this tablet, and then they get addicted
to the screen and they're watching these shows of parents
recording their kids doing stuff like it's the craziest shit, man,
It just it blows my mind.
Speaker 2 (49:06):
But the thing is, it's not just the addiction to
the screens, which is I mean, it's psychologically or physiologically,
it's, you know, hitting the same pleasure centers that get
kicked off when people use drugs and things like that.
That this this thing does the same stuff, right, but
(49:28):
it's even more insidious than that because people have started
to discover that the algorithms and things we know are
targeted towards people and all this other stuff. Well, now
even the comments sections are targeted towards people, and a
lot of people didn't know that, but folks were starting
to notice where they'd be on like the same page
(49:49):
and it'd be like a girlfriend and a boyfriend, and she
would be looking at this thing where a couple was
fighting, and all of the comments were like, yeah, the
dude is stupid and he needs to you know, blah
blah blah, and then the guys would be like, yeah,
women are dumb, and everybody agreeing with like his perception
of things, right right, So now it's like we are
(50:12):
being manipulated down to the nth degree with this stuff.
So it's not just the fact that it's capturing our attention,
but it's it's manipulating our brains into thinking that, you know,
how the population is thinking and looking at things, and
(50:33):
it's it's really it's kind of not kind of it
is scary. It's very scary how this this whole thing.
And but that's another example of AI, because if your
comment section is utilizing AI, you know, artificial intelligence to
kind of suss out exactly what's going to impact you
(50:54):
the most. You know, that is complete manipulation of your
like, your thought processes and everything, and it will
manipulate you into how you view the world outside of it.
Speaker 1 (51:08):
And you talk about the comments section too. People are
literally using chat bots to go in and kind of
coax people into sending them, like, money, and, you know,
just, hey, send me the Cash App, send me this.
And it's just random chat bot that you program into
your live streams and it goes into the comments like
it's just another person trying to say, yeah, just go,
(51:29):
you know, show them some love, send them some money.
Speaker 3 (51:32):
And, do you ever, do you guys want to hear
a funny story real quick about, uh, those.
Speaker 1 (51:38):
Worn So.
Speaker 3 (51:41):
Back in the day, there was a text, a text to speech.
Speaker 2 (51:47):
Text.
Speaker 3 (51:47):
Yeah, so what we used to do all the time was
take, uh, my, like, uh, do, like, a bank account,
like, uh, you know, and say that it was withdrawn
from my buddy's bank account, just like some random buddy.
And we used to put it to the phone and
use these text to speech to pretend that someone
(52:09):
like basically wiped out their whole savings, and then they
would be stressed out.
Speaker 2 (52:12):
They would be like, oh my god, you know, my bank.
Speaker 3 (52:15):
Account just got, we would be drunk and we would,
like, just, like, pretend that that happened, you know.
I mean, yeah, it's crazy how these, like,
these AIs can manipulate voices and, uh, you know, and
pretend to be.
Speaker 2 (52:34):
Someone else like elderly.
Speaker 3 (52:35):
People like, uh, you know, it's bad, Like elderly people
get like, you know, tricked all the time with these
AI people. They get robbed of their life savings, right, yeah.
Speaker 1 (52:50):
Oh, all the time. I just watched The Beekeeper.
Have you seen that one yet?
Speaker 2 (52:55):
The Beekeeper? I'm aware of the content of it.
Speaker 1 (53:00):
It is, it's, it's all right. It's Jason Statham.
He's okay, he's not the best. But the
storyline.
Speaker 2 (53:06):
Is, you know, what they do. And so Beekeeper was basically the wet
dream of the people looking at people that get scammed
and what they think should happen, which doesn't happen. That's
not how the world works.
Speaker 1 (53:25):
Well, yeah, it's on Paramount Plus right now. Yeah, I
just watched it the other day. It's not bad. I
literally watched it like three days ago just to put something
on while I was cleaning the house. But basically, like,
this woman like gets scammed, this old lady and then
like it's just this multi billion dollar company of just
scamming old people out of their money. It's literally who
(53:47):
he's going after. Yeah, And what we got here was
to say, yeah, the scammer who took all Felicia Shard's money. Yeah,
but it's happening so crazy, and I think the hardest
part for me is that those younger than us that
didn't get to experience like the advancement to where we
(54:11):
are now, like this is I mean, I'm thirty six,
this all started happening. I remember having a pager in
I think fourth grade as pagers were phasing out, but
like I remember the regular Nintendo, like I think the
earliest memory I have it was about the Nintendo Super
Nintendo time frame and then just said in advancement as
(54:34):
I got older and just the gaming consoles and it
will dial up like experiencing all of that, Like we
know that, but those that are younger than us really don't,
so that it's scary where we are in such a
short time.
Speaker 3 (54:51):
Have you ever seen, like, the kids, when they ask them
how to hold a telephone, and kids go like
this instead of this, because we grew up, you know,
with the cord going to the wall and we would
talk on the you know phone that way. Now kids
they go like this because it's flat. It's a smartphone.
Speaker 2 (55:15):
Yeah, they kind of hold their phone like this. Well, no,
I mean, like, they're pretending it's like a phone. I
see what you're saying. I'm just imagining, like, oh,
I've got a phone call, hello. Yeah. I don't
no kids at work that I work with, because when
(55:40):
I was in high school, typing was just getting in,
like using the keyboard, like, you know, memorizing the keyboard,
and I suck at it. I'm one of those people
that, like, you know, chicken, I call it chicken plucking.
Speaker 3 (55:53):
And there's, yeah, and there's kids at my job that
just, like, fly through when we have to manually enter
addresses. Like, they'll just fly through the address and
I'm like, I'm here just going like this, like okay,
you know, you know, kids, I mean they learn a
lot of technology within the last I would say ten years,
(56:17):
Like ten years has been like a big step up
within technology and the educational system.
Speaker 1 (56:22):
So, right, right. They put it on speaker
and put the phone in front of them like it's
a Star Trek communicator.
Speaker 2 (56:30):
Okay, yeah, I guess you could. I guess you could
see it that way.
Speaker 3 (56:35):
Yeah, well no, I mean, I don't know. It's just
the way it is now, Like kids are gonna. Technology
grows every day, it never rests.
Speaker 1 (56:46):
So doing like doing this, like what we're doing now
with the growth of JG's Lounge and kind of you know,
just podcasting in your radio stations, the advancement that we've
done just to communicate with people. Like, I've had
people from around the world on and just from setting
up a show the day before. We couldn't do that,
you know, in the past. But what happens when this
(57:09):
crazy solar flare gets too close and technology is just gone.
Our money is all tied up in it, and
everything's tied up by technology. And I'm using a natural
disaster as an example, because, I mean, it could happen.
It really could happen.
Speaker 2 (57:25):
Tell you what, I'll tell you what happens. You,
you gotta go work on these, you ready?
Speaker 3 (57:30):
For Yeah, I'm I'm I'll get there soon. I'm I'm
the skinniest white dude in the world. My dad was skinny.
I'm skinny, you know, I'm goofy but not.
Speaker 1 (57:50):
But, so there's nothing wrong with that, man. But
what happens? I mean, the world's gonna, eighty
percent of the population is not going to know what
to do.
Speaker 2 (58:01):
It's, it's actually worse than just not knowing. I told you,
very true, very true, but it's, it's actually much worse
than just people being confused. They've done disaster scenario breakdowns,
and it generally, and I'm saying like ninety percent of
(58:22):
the time, it all falls out the same way,
which is, the first couple of days there's a lot
of confusion and everything. But because supply chain and everything is
going to get completely stopped, people are going to be
going without food and water, and, you know, this whole thing,
especially in the cities, because in the city, you don't
(58:44):
you don't shop like you do in the country, where
you go in you know and buy for the next
two weeks, you know, which is what I grew up doing,
and prepping and canning and all that other shit that
you know, we had to learn how to do, and
so you know, people who can't get their takeout, their
door dash or anything else that have also become conveniences. Right,
(59:05):
It's like, within a week you're going to start to
see a real collapse of the modern society. And you know,
you're you're talking a substantial amount of death and things,
and it'll happen quick. It's not going to be a
slow burn for sure.
Speaker 3 (59:25):
Now, I think about it this way. Now, let's say
every technology device crashed, right? You know, I think it
would be a society of living off the land,
you know, growing crops, uh, you know, stuff like that.
I think, I think we would be more living off
(59:46):
the land.
Speaker 2 (59:47):
Technology's just so crap.
Speaker 1 (59:49):
You're talking about that small percent of people who view
it that way, though. The people who don't even have
the mindset to think for themselves, because they're so addicted
to these things for the answers, are gonna have no clue,
and they're gonna kill each other, and they're gonna riot
in the streets.
Speaker 2 (01:00:05):
We're gonna go, like Siri, grow pop tarts.
Speaker 1 (01:00:09):
Why isn't my phone responding to me? I mean, who
knows how to do those things? Like, I could probably
figure it out. I mean, oh yeah, I've lived
out in the wild, never lived out in the wilderness,
but I did a lot of, you know, living in
the mountains for weekends on end, which is nothing. But
I think I could survive. But I think there's a
(01:00:31):
huge population of people who would have no fucking clue.
Speaker 3 (01:00:34):
Well, the older generation would, you know, teach the younger
generation how to do things, basically like how society is now.
The elderly teach the young. And I think that if
technology crashed, we'd have all these farmers teaching the young
how to do stuff, and, you know, I don't know, man,
(01:00:55):
I was never good at.
Speaker 2 (01:00:56):
Uh you know, you know this, you know, yeah, technology crap.
Speaker 1 (01:01:02):
Yeah, they'd all huddle around Starbucks waiting for
the Wi-Fi to come back on. You want to
know something funny, though? There are companies out there that
are making like sales promos and like the only way
you can get the deals is if you're on their
Wi Fi network. So people have to physically go to
one of their store locations and get on their Wi
(01:01:23):
Fi to get special deals on products.
Speaker 2 (01:01:25):
That's creepy. I mean, it's a way to get them
out there. But if they can get you onto the Wi-Fi
and you agree to certain shit, they can take
all the information that they want to from you. And
this, this is insidious, because you hear about the
thing with the whole Disney Plus.
Speaker 1 (01:01:42):
Disney Plus what happened there? Okay, so do you know
the story? Do you know the story very well?
Speaker 2 (01:01:49):
I pe I was.
Speaker 1 (01:01:51):
So this lady, they were in, they were in
Disney, and the lady made it very clear
she had a peanut allergy. You know, I cannot have
any peanuts, nothing. And they're like, no, it's fine, you're
gonna be fine. Of course, what did they serve her? It had
peanut in it. So she died. And the husband is
(01:02:13):
suing Disney, and only for fifty thousand dollars, fifty grand. That's
all he's asking. And their response is that he cannot sue
them in the, in the, was it, like, the public
court system or the federal court system? Because he signed
up for Disney Plus. There's a there's something in the
(01:02:33):
agreement when you sign up for Disney Plus, you know
the app on TV, to where you can't sue them
in a federal court. You have to you have to
handle any issues like privately. And it's stated in their
agreement on Disney Plus. So they go to Disney, she
dies from eating the food, and they're like, no, you
signed up the subscription two years ago, basically saying you
can't sue us.
Speaker 3 (01:02:54):
Well, I could see the argument where, like, on the
back of every theme park ticket, it's like, you ride
at your own risk, and if you get injured in
the theme park, like, we're not liable for it.
Speaker 2 (01:03:05):
If you read it in any theme park ticket, you'll
see it.
Speaker 3 (01:03:09):
But yeah, that's what that's crazy Disney Plus.
Speaker 2 (01:03:15):
Huh, holy crap. But yeah, Disney Disney isn't you know,
bad enough ruining Star Wars and every other thing that
they get their damn hands on.
Speaker 3 (01:03:25):
You know.
Speaker 2 (01:03:26):
Now, this poor guy. And they specifically, this, this is
what's gross about this. They specifically said, peanuts will kill her,
do not serve this to her, and they went, okay,
and then served her the stuff. You know, It's like,
I don't care how many clauses you've got inside of
your damn you know, Disney Plus or whatever. They specifically said,
(01:03:48):
don't serve this to her, and they still did.
Speaker 1 (01:03:51):
M M.
Speaker 2 (01:03:51):
That's on you, bro.
Speaker 1 (01:03:52):
He's asking for fifty grand. That's nothing.
Speaker 3 (01:03:58):
That's chump change.
Speaker 2 (01:04:01):
That's chump change.
Speaker 1 (01:04:01):
It is like, it's not like an unreasonable amount
that they easily could have just waived off, but it wasn't
even, you know, big news. But instead they're making
a point to let you know, like, hey, we got you,
We got you by the balls. You know, we can
take your children. You know all that, all those scandals
about Disney and and people that come out about it,
and it just gets swept under the rug like it's nothing.
Speaker 2 (01:04:25):
Well, one of the scariest corporate juggernauts that's out there, and the
secrets that are hidden under the pavement and everything of
that empire. It's, guys, gals listening to this tonight:
these things we grew up with that we thought were
so wonderful, they're not. I'm trying not to be
(01:04:47):
cynical here. It's just a fact, they're not. There's a
lot of evil out there.
Speaker 1 (01:04:52):
Yeah, Evan, what do you think about that? Do you,
do you agree that, like, big companies like Disney are
pretty corrupt? And do you think that
they're very evil?
Speaker 3 (01:05:02):
Uh, I mean I could see the pros and cons
of any business. You're gonna piss off some people and
you're gonna make people happy, like putting out a new
Star Wars movie for snot-nosed kids to watch.
Speaker 2 (01:05:16):
I mean, and then the you know, I mean, there's
always pros and cons of companies.
Speaker 3 (01:05:20):
I think the greed part of it is pretty crazy.
Disney's a monopoly, and all it is, it's a media monopoly.
Speaker 2 (01:05:29):
And there they are.
Speaker 3 (01:05:31):
You know, I don't know, they can get away with
a lot of stuff that normal companies can't, right, And
I don't think that's right.
Speaker 2 (01:05:40):
I don't think I think every company should be equal.
Speaker 3 (01:05:45):
It's just because of the money they have, like it's
Speaker 2 (01:05:48):
A media company that's just in the billions.
Speaker 1 (01:05:52):
So, right. But I don't know, I just want to
kind of start wrapping up here. But Sean, what you
got, man?
Speaker 2 (01:05:58):
Well, no, I just, I think it would be funny to
ask, so, what do you think about Amazon? What do
I think about Amazon?
Speaker 3 (01:06:05):
Oh?
Speaker 2 (01:06:05):
I love Amazon. It's the most wonderful corporation in the world.
Speaker 3 (01:06:10):
So Amazon, right? Again, there's pros and cons. Amazon
can help small businesses. They can open up a shop,
like a page on Amazon, and sell their product, which
helps the small businesses. But you know, you got these,
like, you know, I don't know, like Amazon itself, like, yeah,
(01:06:32):
I mean, you got Jeff Bezos or whatever, you know,
I don't know. He just has so much money, it's
so ridiculous.
Speaker 1 (01:06:41):
Yeah, I think.
Speaker 2 (01:06:43):
I think, uh, Jukebox, I think this thing is
going to hit the air, and then when it comes
out tomorrow, he's going to be talking, and all of
a sudden his face will freeze and it's going to
be a completely different voice going, I love Amazon. They
have the best products ever. And I would never say
anything AI generated, you know. Exactly, you're gonna be like,
(01:07:03):
your responsibility. I don't know.
Speaker 3 (01:07:09):
I mean, I you know Amazon, Amazon gave me a job.
Speaker 2 (01:07:13):
I mean that's all I love.
Speaker 1 (01:07:14):
Yeah, No, I agree, dude. We use Amazon like crazy.
I get Amazon packages every day.
Speaker 2 (01:07:21):
Amazon, So I'm not hating.
Speaker 3 (01:07:24):
Yeah, I mean, that's where it comes in, like the
pros and cons. Like, you know, Jeff Bezos, you know,
he's a big douche, and like, you know, it's just,
you know, and then you've got these small businesses that
use Amazon to, for, you know.
Speaker 1 (01:07:40):
Well, yeah, their businesses keep going out there and they're
blowing up.
Speaker 3 (01:07:45):
We just did a podcast about Temu and Wish, when,
Wish, it took, like, Wish took three months to
get your packages. And now Temu has local warehouses
where they're selling the same stuff, but you can get
it three days sooner.
Speaker 1 (01:08:03):
So do you guys remember... I don't think it was...
I don't remember what program it was, but
there was a big thing where one of the marketplaces was
selling, like, kids by labeling them as
armoires and dressers. Yes, it was a couple of years back,
(01:08:25):
and it was a big deal, yes, and like, depending
on the description of the dresser or the armoire,
that was the type of girl that you were wanting. And
they were doing it through just a legit website. Obviously
it's sex trafficking. It's terrible, but that's what they
were doing.
Speaker 2 (01:08:43):
Yeah, and you knew which ones were the ones that
were targeted with, you know... or I'm sorry, not targeted,
but children, because you would have an armoire in there
for four hundred bucks and then you would have one
for fourteen thousand six hundred dollars, and you know, of course something
(01:09:06):
is going on there. Yeah, you know. And that's one
of the things that clued a lot of people in to
what was going on. And it wasn't, uh, it wasn't Wayfair,
but it was like one of those...
Speaker 1 (01:09:18):
It was Wayfair. You know what, it was Wayfair.
Speaker 2 (01:09:20):
And yes, it was Wayfair.
Speaker 1 (01:09:22):
Yes, it was Wayfair. It does sound like Pizzagate. You know, hey,
thanks for tuning in tonight, man. I don't know who
you are, but I really appreciate you being a part
of today's show, you know, and throwing in your comments.
Some of them have been great. Blurred Lines is another great one.
That's Sean's show, Blurred Lines, and he does a
(01:09:43):
lot of conspiracy stuff and things like that, so check that
out too. And Sean, I think maybe we should talk
about stuff like this, Wayfair and stuff like that,
and get into some of it.
Speaker 2 (01:09:53):
I would be a hundred percent down for it, because the
thing is, the proliferation of sex trafficking, child trafficking...
I mean, it's still crazy prevalent around the world,
and you know, it's sad because it's like we were
(01:10:13):
starting to really tackle it, you know, and then we
just went silent on it. And like, man, yeah, that's
a whole other rabbit hole. So yeah, we'll go after it.
Speaker 1 (01:10:30):
This has been a hell of a show, Ivan, man.
I hope you enjoyed it.
Speaker 2 (01:10:35):
I did. Man, I feel like I stumbled a little bit.
Speaker 3 (01:10:38):
But I'm still getting used to this live stream stuff,
because we pre-record our episodes and, uh, you know,
when we stumble, we can edit it out.
Speaker 2 (01:10:45):
So I stumbled a little bit.
Speaker 1 (01:10:48):
Now, this is how we do our live streams, man.
There's been times where we totally lose power, one of
us goes down for five minutes. It happens, no big deal.
Speaker 3 (01:11:00):
Hey, I've been drinking such nude, so.
Speaker 1 (01:11:03):
It's all good, man, It's all good.
Speaker 2 (01:11:05):
You know.
Speaker 1 (01:11:06):
I'd love to have you back on again sometime.
The conversation was a perfect topic for tonight's show. We were
going to talk about animals, so this was a little bit better.
Sean, man, always fun, dude. You...
Speaker 2 (01:11:21):
Know that absolutely, my friend.
Speaker 1 (01:11:23):
And everybody that's been tuning in, we appreciate you guys,
and until next time, you know, we'll see you around.