
August 14, 2025 • 74 mins
Ryder Lee has joined Wayne Steiger on his channel to discuss what our future reality holds and the possibility of AI creating free energy and anti-gravity interstellar craft.

Raised By Giants LinkTree: https://linktr.ee/raisedbygiantspod

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Well, happy Wednesday, everybody. Thank you for tuning in and
dropping by and turning on. I mean, that is the
perfect combo. And remember it's always more funner the more
that jump in, so don't be on the side of
the pool. Jump on in. It feels good. It's a

(00:21):
hot day out there. It's hot dog days this summer
for sure. But it's good to see everyone. Thank you
everyone that's out there. Hello, everyone in the chat room,
so good to see you. Let me just speak favor
over us all. That's what I do, and favor is
the one thing that works in your favor, even when

(00:43):
you're not aware of the favor. So there you go,
nothing missing, nothing broken, and by all means, please be
kind to your mind. You'll thank yourself later for that.
Just a quick shout out to Gilbert. He's been facing
some challenges and you know his mom's on hospice and

(01:07):
dealing with that and you know that's it's part of
the cycle, but it really sucks. So Gilbert, we're just
giving you the high five. Good to see you. So everybody, Hello, Lauren,
I haven't seen you in a while. Dear, how are you?
Dave Truth. And there's a lot of people I haven't
seen in a while, so welcome. Scottie boy, how are you doing? Deborah?

(01:29):
So everybody, you know my guest, he's part of this tribe,
Mister Ryder Lee, and I always enjoy it when you
come on, Ryder. How have you been? Give us a catch-up.
By the way, everybody, last Wednesday? Thursday? Was it Wednesday
or Thursday, was your birthday? It's Thursday? So whatever day,

(01:54):
it was, happy birthday to you. And so did you
cross into a new decade?

Speaker 2 (02:01):
No? And that's something that I'm trying not to. I'm
trying to forget.

Speaker 1 (02:05):
My age when and because like that.

Speaker 2 (02:09):
The age thing is like a ritual, you know, every
single year, like we are, you know, shepherded and herded
into remembering our age. And I think it's also like
another thing with the new year, you know, bring in
the new year. So what do you associate with a
new year? You associate a new year with getting older?
And what are you associated with getting older?

Speaker 1 (02:30):
You get cool?

Speaker 2 (02:31):
Right? So I think that if we didn't remember our age,
then we would stay young for a lot longer and
we would live a lot longer, you know, So I
try and really not even think about it, you know,
But I'm still young, still super young.

Speaker 1 (02:48):
But uh, by the way, I can tell you never.

Speaker 2 (02:56):
The earlier we forget the you know, then the longer
that we're going to live. I still got a lot
to accomplish here, so I'm trying to stay around as
long as possible to accomplish all the things.

Speaker 1 (03:09):
World could use you. But these are.

Speaker 2 (03:13):
Wayne really appreciate that what's going on, So I'll try you.
Good to see you guys. Thanks for being in the chat. Everyone.

Speaker 1 (03:19):
Yeah, yeah, active group. Everybody, how you doing? Guya says
she's thirty-nine. I believe that. Hello, Nina, I have
no reason to doubt any of that. Uh, and it's,
you know, time is time. There's, you know, so many
groups, like the Stones, you know, time waits for no one.

(03:39):
But I can tell you when you do get down
the line, it's quite interesting. Uh. And I can tell
you this, you really never grow old, I mean the body. Uh,
but you know, you can extend that out for a while.
And if the technologists are correct, the futurists, the merge

(04:02):
of both human and machine will in fact give us
eternal life. And if they merge consciousness to machine, you
do have eternal life in a sense.

Speaker 2 (04:12):
So, yeah, that's something that's definitely going on. And what
I wanted to talk about a little bit as well
is I think I've mentioned it to you a couple
of times, when like, why is it that we are
still using the exact same things that we've been using
for decades now? You know, why is it that we're

(04:34):
still using the same planes. Why are we still using
the same trains, Why are we still using the same microwaves,
the same refrigerators, the everyday objects, the same freaking vacuums,
the same cars. I mean, cars have changed like a
little bit. We've gone more into the electric route, but
that's still not sustainable.

Speaker 1 (04:55):
You know, when they give me the hydrogen car, then
we'll know that we've made it.

Speaker 2 (05:00):
And that's the point, right? Something has stifled our innovation
and creativity, right? And they've put us into
this work cycle of like a nine to
five and made us struggle so that we really don't
have time to innovate. And even if we were to innovate,

(05:24):
someone just comes along and steals it. A corporation comes
along and steals it, see, and you know this more
than I do. But big corporations, they have an entire
team behind them that they will just give information to
and then that team will create whatever it is that

(05:46):
they want them to create. Right, so they can literally
go to some small person and be like, oh, okay,
well we want you to kind of do this.

Speaker 1 (05:56):
And I can tell you it'll take you four years.
You'll pay approximately one point three million. That's what
we paid because we had to do that to a
company and we took them to court on violation of
our IP and uh yeah, breaking the non disclosure and

(06:16):
it sucks, it does. And I'll tell you, we
were up against the number one litigating firm that this
publicly traded company could hire, and we were just this little
small startup. Oh, I know what you're talking about.

Speaker 2 (06:31):
Yeah, yeah. So what happens is, if someone invents
something, and it doesn't even have to be an invention, it could
be whatever it is. It can be art, it can
be you know, uh, some sort of technology or whatever.
A large corporation sees that, then they take that to
their team and they're like, here, this is like

(06:54):
the basis of, like, the idea that we want
you guys to create, and then they create it and
then it just pushes the little small people out
of the thing. And even if they see it and
they're like, this is essentially my product, this is my technology,
there's going to be very little that they can even

(07:17):
do about it, because they're gonna wrap it
up all in legal fees, just like what you were
talking about for a really long time. So not only
not only have we been hurt by not having the
time or the patience to innovate anything new, because we
should be in a reality that is beneficial for everyone.

(07:44):
Our roads should be maximized, our trains should be maximized,
everything should be. Our way of life should have been
one hundred times easier by now. But it's not.

(08:06):
And there's only a couple of things that, you know,
you can look at that would give you the reason why.
There's only a couple of reasons why. And one of
them is one I just mentioned, which was these corporations
that are just taking people's ideas and then creating their
own or taking their technology or whatever and just creating

(08:28):
their own. The other is when it comes to this
AI stuff, because what's going to and this is just
my personal perspective here, you know, what I think that
AI is eventually going to do, and we've already kind
of seen it do this, that it's going to

(08:51):
be used as the excuse for creating a lot of
this new stuff. Okay, so everyone likes to talk about
free energy, right, free energy, anti-gravity craft, you know,
spaceships whatever that will take us to wherever, planes that

(09:14):
will take us from here to London, and, you
know, things like that. Right. Well, now that's going to
be AI's job. AI is now going to create that
kind of technology, and then that's going to be the oh,
we didn't do that, We didn't do that, Wayne, AI

(09:36):
did it? You know what I'm saying, even though
we've had this technology for a really long time, even
though the government has probably created it, things like
the anti-gravity craft and the free energy, right?
I mean, it's kind of obvious at this point that

(09:56):
it has been created and it can be done. But
it's not out. In order for them to roll out any
of that technology and avoid all of the questions that
come along with, well, how long have you had this
free energy? How long have you had these anti-gravity crafts? Right,

(10:16):
it's just going to be shuffled off on the AI.
It's going to be oh well, AI just made it.
AI just cracked the code on this that we've been
trying to crack for fifty, sixty years. Right, then it's
going to become an AI thing, and then really nobody
is going to actually own it at that point except
the people that own and run the AIs. Right. So

(10:37):
it's like it's creating this middleman in between the humans
and the government and these large corporations. And that's where
I see that it's going to really push in that direction.
And I haven't really heard a lot of people talk
about that, you know. They they talk a lot about

(10:58):
the, you know, the AI takeover and the human robotics
and the AI mixed with humans and the technology and
the implants and the brain chips and all that, but
they don't really talk about the innovation of what AI
will actually eventually be able to create in our reality

(11:20):
for us, right? And that's going to be the excuse
for it. It's going to be, okay, well, how do
we get this free energy? Oh well, AI made the
free energy. It created the device. So then that takes
any plausible deniability off of anybody else that has ever
if they've done it before or if it's been a thing.
I don't know, what do you think of that?

Speaker 1 (11:41):
Well, it kind of goes to the question I had
on AI. As you know, your generation is now
coming of age. You guys are up at bat, the
boomers are exiting the scene, and so you really got
Gen X, Y, Z coming in. And so it's really, how do
you see the future? And you just elaborated it

(12:03):
beautifully on that. Getting back to a couple of points
that you made. Art, for instance: I enjoy art. When
Smithsonian was a client of mine, I spent a lot
of time in the Museum of Natural Art. But it's

(12:24):
interesting because now AI is fascinating us with how it
sees art, how it sees creativity, and how it's using it.
What one philosopher, me, would say is, obviously AI has

(12:46):
an imagination of some kind, because you can't create art
merely by saying, well, it's just a machine creating it.
That's what distinguished us from the animals, supposedly, is that
we had an imagination. Second, the corporations, in a sense,

(13:09):
they are the modern-day gods, the Titans, and there's
a war with the Titans, just like it was in
the mythology of the Greeks, the Romans, Mesopotamia. But
what's really interesting, though, is that all the big corporations,
even the small ones, all use sigils, symbols, to represent

(13:32):
what they are. And when you start having a corporation
incorporating what would be definitely occultism, you can't just say, well,
I need a logo. Well, logo means logos, right? Knowledge,
you know. So it's one of the things that

(13:53):
we've accepted as normalcy but never really understood why
corporations have the control that they do. And it's very
obvious that I think that we've had the solution to
be able to travel with zero energy. I think we
have the knowledge and the technology that we can feed

(14:15):
the whole world. I think we have the technology that
we can literally have it to where no human being
will have to pay for energy. Ever, that ain't going
to happen. There's an agenda and so that for me,
as a philosopher, I look at this, and an occultist,
and a former religious person, a theologian, if you want

(14:41):
to say, tells me there's something more at play here.
And so now we're talking about AI artificial intelligence, what
we're really talking about is the ability of the machine
now to think for itself and the implications for that.

(15:02):
My generation will never see the full impact of it.
You will. And that's why, for me, to hear the
question in what you were just summing up: your future is
probably going to be more dominated by the legacy of
the baby boomers. And do you see that legacy as
being positive or do you see it being used for

(15:25):
profit and for gain, which only leads to subjugation.

Speaker 2 (15:29):
By the way, well, here's the thing. If you can
make the population stupid and dumb, the majority of people,
the majority of people that can't think. See, people
in the community are like, okay, well,
what are they distracting us from? What are they What

(15:51):
are they trying to distract us from? Whatever the
government is doing, they're doing the sleight of hand.
They're over here doing this one thing while they're over
here actually doing another thing. Well, that could be true,
but the real thing that they're actually trying to distract
you from is from yourself. They're trying to distract you

(16:12):
from what you can do. That's the big distraction, that's
the biggest distraction of all. Get your attention in twenty
different directions, so you can't focus on you. You can't
focus on what it is that you're supposed to be

(16:33):
doing and the beneficial thing that you're supposed to be
doing for humanity or whatever it is that you need
to do for your life, whether that be a self betterment,
you know, figuring out your abilities what you're supposed to do.
You know, that's the biggest distraction.

Speaker 1 (16:53):
Interesting because the trend that I'm seeing, and like I said,
it's going to be more impactful once I'm out of here,
is that I'm seeing the younger generations they depend on
the machine. They depend on the machine to tell them
if they feel good, they depend on the machine. If

(17:13):
they feel bad, they now use the machine for, in
some cases, very nefarious purposes. To me, if you've got
that type of and this is why, you know, one
of the questions is our platforms, like social media platforms,

(17:34):
are they really beneficial to humans or are they a
distraction to us? As you just said, and I can
tell you, Ryder, I find most people don't care about
what they feel inside. It's as long as they're stimulated
or entertained. That's what's important. Because if they get quiet,

(17:59):
they start to freak out.

Speaker 2 (18:02):
And that's something that I've never had a problem with,
Like I can sit down and not say anything.

Speaker 1 (18:07):
To anyone for hours. You're definitely in the minority. In fact,
I read a paper today that they're now saying in
mental science that meditation can be bad for you.

Speaker 2 (18:22):
I'm going, meditation now causes cancer, meditation now impacts
the climate.

Speaker 1 (18:29):
Yeah, I started reading it, and it's an interesting paper.
It's saying that, you know, and I'm going, wait a minute.
I mean, if you're starting to get this in academia,
it doesn't take it long from academia to get into
the mainstream, right, And so can you imagine ten years
from now, the government's going to be telling you that

(18:51):
meditation is bad and so since you can't handle it,
we're going to make a law: anyone caught meditating, you know,
fill in the blanks after that.

Speaker 2 (19:03):
Yeah, that's a very extreme case. But I understand what
you're saying, is that people can't, it makes them
feel uncomfortable. And I've been around several people like that
where if you're just sitting there and you're not actively
in a conversation, like, ah. And it gets really,
it gets really tiring to do that, because I like to
(19:25):
gets really tiring to do that because I like to
think a lot and I like to keep to myself
a lot. You know, Now, if the conversation is fascinating,
then I can talk for hours and hours and hours
about it. But if it's not and it's just about
stupid shit, then I can't do it. You know, I'll
just be like, Okay, if.

Speaker 1 (19:44):
Fits me perfectly: when you need someone intelligent to
talk to, you talk to yourself.

Speaker 2 (19:49):
That's right, that's exactly it. And you just, you know, turn
that over in your mind over and over and over
again until you find something new. And that's one of
my favorite things to do. But if they can
make like I was mentioning, if they can make the
population dumb and stupid, confused, not know exactly what it

(20:10):
is that they're supposed to do, have them working from
nine to five, make them stress out over money, raise
the prices of everything, raise the prices of food, raise
the prices of gas, raise the cost of living to
an astronomical rate. Take the housing market and basically quadruple it,
which I feel like the housing market is eventually going

(20:31):
to bust and we're going to be into another like
a big housing market collapse within the next five years.
But that's not good either, because whenever the housing market collapses,
then the people with money are the ones that are
going to buy up all the properties.

Speaker 1 (20:48):
Right, yeah, and home equity for the common man, common person,
you know, middle class and even you know, beyond that,
it's really where most people have their savings, their life value,
their worth. Yeah, And if that collapses, then that's that's
a bad domino effect.

Speaker 2 (21:11):
So then you can root out all the people that
are essentially going to be a problem, right, The ones
that are smarter, the ones that can see
through a lot of the rules, that have
any kind of ability, any kind of intuition. Those numbers
are going to be way fewer that's going to be

(21:34):
coming out, right? And then you can stop
that with a hard press, you know. You
can stop them at the gate, because it's not everybody anymore,
because the majority of the population either one don't have time,
don't have the resources, don't have the money, don't have
the ingenuity, they may not have the want or

(21:58):
the need. And that also gets back to you know,
what exactly are we? What are we doing here? Who
are we? You know? And that's a thing that a
lot of people get stuck in is they don't know
what they're supposed to be doing, right. They don't really
know what their gifts are. They don't know what their

(22:20):
abilities to do things actually are. So therefore they don't
know. They get stuck at a McDonald's job, or they
get stuck at a Burger King job, or some jobs that.

Speaker 1 (22:31):
Those jobs are going away, I mean the fast
food industry sector, they're going to robotics. McDonald's now on
the West Coast already has restaurants that have no humans.
Your order is being taken by someone in some other location.

(22:54):
It's interesting because if I were to be able to
get you into a TED talk. So where does this lead?
Does this lead to a total civilization collapse? Is it
a complete collapse of the financial system? And I don't know.

(23:15):
And that typically starts revolutions. When people no longer
have reason or purpose, then they tend to, our history
has shown that. Is that the future you think is
potential for your generation going forward?

Speaker 2 (23:31):
Well, that's the issue, is that people have found
meaning and purpose that is connected to their jobs working
for a large corporation. That's been the issue for a
really long time. Is that that is what people's meaning

(23:53):
and purpose has been. Like I was mentioning, that's
the distraction, that's the distraction from you. That's the distraction
from you figuring out what it is that you are
supposed to do on your own. So you throw in
the confusion aspect of it. You throw in the media

(24:15):
throwing information at you twenty four to seven, confusing you
on what it is that you're supposed to be doing.
Then they get you in this essential prison, you know,
prison of your mind, and you can't think outside of
the prison of your mind. So you just continue to

(24:35):
go to work. And that's not me saying that work
is a bad thing. I don't think money is inherently
a bad thing. You need money and you need these
things to be able to survive, right, But it's been
ingrained so deep into our society that that is what

(24:59):
people now live for. Now, what you're asking is when
that changes, and when people no longer have that aspect
to live for anymore because they're no longer going to
have that job, what are they going to do?
And I think that that is the question that has

(25:22):
been posed to a lot of people. It's been posed
to the creator of ChatGPT on Joe Rogan's podcast,
and he doesn't really have an answer for it. You know,
this is a question that's been asked over and over again.

Speaker 1 (25:40):
Mike Rowe had an answer to it, and I agree
with him. And I've met a lot of them because,
you know, obviously, particularly where we're having a new roof put on.
So my point is, I've seen a difference in those that
know a trade: carpenters, electricians, you know, plumbers. Here in

(26:04):
Missouri there's a strong effort and focus now instead of
having kids go to college where, they really don't have,
you know, you can have a degree, but what's it
worth? Whereas if you have a trade, that trade
can then be used in, like, Habitat for Humanity, other

(26:28):
opportunities to build. Here in Saint Louis, they have this
complete thing for veterans where the carpenters come in, all
the trades and they build these tiny houses for veterans.
And to me, it's like, well, that's something you don't

(26:50):
hear about because that's non-tech. It's really getting out
in nature. It's working and sweating in inclement weather, et cetera.
Maybe that's a possibility for your generation or you know,
going forward. I mean, maybe that is something to factor in.

Speaker 2 (27:11):
Well, there's gonna be a lot of dying arts. There's,
I mean, upholstery is a dying art now, and that's,
we don't know how to fix any of this stuff.

Speaker 1 (27:24):
True.

Speaker 2 (27:24):
The other issue is another problem with consumerism, right? So when
something breaks, we tend to just go buy a new one,
and we don't because.

Speaker 1 (27:37):
We don't really. But it's designed to break. That's a problem.

Speaker 2 (27:42):
Planned obsolescence, right, meant to go off at a certain
time. But then that also raises
the question of like waste. You know, where is all
this stuff going to? What are they doing with all
the trash?

Speaker 1 (28:00):
I mean, can I stop you right there, Ryder? That is,
you can search YouTube and you won't find that
many videos. But where they're putting the trash is a
really big big deal because these trash dumps are tomorrow's
toxic fields. I mean, that's the thing about technology. It's toxic.

(28:24):
All of it. Your thin client, you know, what you're
looking at, the monitor and your keyboard. When that crap
goes away, it's toxic and you know you can recycle it.
But these recyclers get lax and lazy. So it's a
good point. Where's the trash going? And do you know
maybe, like, your house is built on a trash pile?

Speaker 2 (28:45):
Good point, yeah, like, where are we putting the trash?
I mean the trash runs every single week here, and
this is just, you know, a smaller town. You know,
where are you putting all the, it's, exactly. And
where are we putting all the batteries, these electric vehicles?

Speaker 1 (29:04):
That's toxic crap. And I mean you got the you
got the plastic biomass out in the Pacific. It's as
big as some states. I mean, that's crazy.

Speaker 2 (29:16):
The amount of production that we put in on a
daily basis to create new product. That means that that
old product has to be discarded. And if it's not
discarded in the same day that the new product is
put out on the shelf, then it's going to be
discarded soon at some point. Right, So there's always new

(29:37):
stuff coming in and no one ever thinks, well, where
is all the old stuff going? You know, it's kind
of like out of sight, out of mind. Okay, well
we don't want to know, We'll just put it out
in the trash bins, We'll put it in the dumpster
and then that's the end of it. No one else
thinks of where, where is that going, you know?

(29:57):
And how much more of that can we handle?

Speaker 1 (30:04):
It's as you were talking. One of the things that
has always befuddled me was that, if we're so
smart: food. I mean, people, food is expensive today, and
it's not going to get cheaper. But yet
the amount of food that the machine produces and then

(30:24):
disposes where they won't let the humans come in that
could actually use that, because you've got cockamamie cities that
have these ridiculous ordinances that businesses somehow can't give away food that was
consumable five minutes ago, because if you bring it
outside, now it's not fit for human consumption. I don't

(30:46):
get it. I mean, if we talk about recycling and
saving the planet, I don't know, we're screwing that up
pretty good. And the idea that we throw away more
food than, yeah, it again boggles my imagination why we haven't
figured that one out. Hey, AI will, by eliminating.

Speaker 2 (31:05):
Us. Every restaurant, every grocery store that you've ever been
into or that you ever know of, throws away food
at the end of every single night, like pounds and
pounds of food into the dumpster.

Speaker 1 (31:24):
This is the way I look at it, and it's crazy.
I don't get it. So I'll say this for
the baby boomers, we were promised flying cars, we were
promised a robot in every room, we were promised, you know,
all these things that we grew up in in the

(31:46):
late fifties and the sixties. It hasn't turned out that
way. But let me tell you this. I believe
there are those that did figure it out and said,
hell no, we're going to make money off this. We
let it out there. And this is just my opinion.
I believe the cure for cancer has already been discovered.
But you know the thing about it is the big

(32:09):
pharma companies don't make money on cures. They make trillions
on, how should we say, allowing you to manage your
disease. It gets, when you start getting into this and
really start boiling it down, it's almost like our lives

(32:30):
are being preprogrammed and carried out to the second because
no matter where you turn, you are basically food in
a sense.

Speaker 2 (32:46):
Well, that's also because they put the disease inside of
the food. They put the, whatever it is, the problems
that you're going to have whenever you're eighty years old,
ninety years old. It's in the food. It's in the food,
it's in the water, it's in the air. It's just
a small little dose of it right to where it

(33:10):
builds up in your system over years and years and
years and years and years, and then it fully develops
and then you have all these issues and all these
problems whenever you're older.

Speaker 1 (33:22):
I mean, and they say that each human now has
a credit card's worth of plastic in them. Yep, I
don't think that was in the original recipe when they
were actually making these bodies. Well, let's make sure we
put a good dose of you know, petroleum byproducts. Yeah,
it doesn't make sense. So you're the generation now has

(33:46):
the opportunity to well, we didn't do it. We failed.
So now your generation is up. Do you see any
hope for your generation? What would be the legacy of
your generation when it's your time to walk off the stage?

Speaker 2 (34:06):
Well, we just talked about the food, the technology,
and the waste. Like, what about clothing products? Like, where
do all of our clothes go? How many clothes are
being manufactured every single day and put onto the shelf?
Like, you've never been to Walmart and went

(34:28):
to Target or even any store, you never
walked into a mall, and there not be any clothes
on the shelf. So they're making these things
at a breakneck pace. And it's not like people
are going in there and buying every single article of clothing.

(34:50):
But somehow, every time you go into a store, there's
always clothes on the shelf, you know. So then where
are these taken to? Like when we go to a
donation center to donate some of our old clothes, Like,
are they really going to that donation center? Where are

(35:15):
they taking it? There's a beach overseas, I
forget where it is, but it's this beach that is
just full of clothes. The entire beach is just full
of clothes, because a ship just dumped a crap ton
of clothes into the ocean and then all the clothes
just washed up on the shore and the entire beach

(35:39):
is just coated in like clothes.

Speaker 1 (35:42):
Guya says the clothing is toxic.

Speaker 2 (35:45):
That's right. They put stuff into the clothes,
and dyes.

Speaker 1 (35:51):
I don't, you know, I don't know. Have the
cotton fields expanded? I mean, I remember living in Texas
back in the day, there was miles and miles of
cotton fields. But those cotton fields are now becoming subdivisions.
So I don't know. I guess that's where
the synthetics come in, right? Here's some oil, some

(36:14):
petroleum oil.

Speaker 2 (36:16):
Yeah, in Ghana, in Ghana, there's a beach that
is just covered in clothes. That's not art or an installation;
it's a harsh reality, particularly in Ghana. This
phenomenon is directly the consequence of the global fast fashion
industry and the subsequent surge in textile waste.

Speaker 1 (36:41):
You know, Ryder, I was just thinking about this at a
higher level. You know, if there are gods, if
Earth is a living entity, humankind has
not been kind to our planet, and I don't know,

(37:06):
you know, you could almost, you could take
the nihilist or the fatalist long-term view on this.
Something has to happen, and I don't know if it's
going to be your generation. We're leaving a hell of
a legacy for the future generations. And you know, I

(37:26):
get it. I was recently reading on LinkedIn an article
about how Generation Z is finding this monotony, as you
pointed out, of a nine to five job. There has
to be more to life than that, and I couldn't

(37:48):
agree more with that. But how do you get there?
Zuckerberg says, AI will help take us to that level
where we don't need a job, and y'all,
your generation, future ones, will have more leisure time to
pursue whatever it is you want to pursue. But I

(38:10):
don't know how people are going to do that unless
there's a universal pay that everyone is getting or some
other monetary form of economic or financial transactions. I don't know.

Speaker 2 (38:28):
Well, that's the issue. They couldn't do it before because
they couldn't control it, right, And the thing about it
is too whenever that happens. If that happens and people
no longer have to work, well, all of the jobs
that they also wanted to do before they had their

(38:49):
nine to five job that they would rather have done
is no longer going to be an option either. So
not only is it going to get rid of the
unwanted jobs, it's also going to get rid of any
of the jobs that they wanted to do instead. So
there's going to have to be this. I mean, I

(39:13):
know that humans are very adaptable. I mean we can
adapt to anything. We can adapt to our environment, we
can adapt to whatever it is that we need to
adapt to. I mean, like how easily we've figured out technology.
I mean, it's sad, but you can give a child,
a three or four year old child, a tablet and
they're going to figure it out. They're going to figure

(39:36):
it out before an eighty year old person is going
to figure it out, because they're kind of born into it, right.
They don't know anything else. So therefore they're you.

Speaker 1 (39:50):
You can do the same thing to a chicken.

Speaker 2 (39:54):
What do you mean?

Speaker 1 (39:54):
I mean, what I'm meaning by that is the experiments
that they used, you know, with training a chicken, or
it could be anything really: you punch this button,
then the pellet's going to come out, or if you
punch this one over here, water. And they can switch
that up, and eventually they're really good at this.

(40:18):
They can figure out the pattern. My point is
that with the tablet, that's exactly it. So they get,
basically, Borg-assimilated into the machine, and the machine
is going to give you pleasure, it's going to entertain you.
That's the trap that I see, because you're not using
your imagination anymore, I mean, as a species. You said

(40:41):
it earlier: the IQ scores are going down, and
the average now is ninety or something in that range.
And I'm going... I have contended for
over a decade that we should be at an IQ
level of ten thousand.

Speaker 2 (41:03):
Either.

Speaker 1 (41:03):
There's no reason for any human being not to have
an IQ greater than the previous generations'. But it seems
to be reversing. Again, you're the generation inheriting this mess,
and you know, you look at it: is your generation ready

(41:25):
for it? I mean you know you look at the
political structure, you look at the corporate structure. We screwed
it up. I mean, my generation, we just took after
the previous one. We're the ones that created the tech
titans that now are enslaving us. They're making their billions

(41:45):
off of us. But us, poor sheeple out here, we
don't get anything for it.

Speaker 2 (41:56):
And that's that hierarchical structure. I mean,
I do have some hope for I mean, I know
that there's people out there, especially in my generation, that
really want to see things change. But how they go
about that is it could be to our detriment or

(42:19):
it could be to our favor. And I know that
there's going to have to be some kind of new
radical ideas, but the way that people have been programmed now,
it's going to be difficult for them to transition into
something new, unless something really devastating happens in life.

Speaker 1 (42:46):
Well here's what I think, writer, I'm going to give
a gift to you. What's that buddy?

Speaker 2 (42:50):
If you look at twenty twenty, right? Like, how easily
did we switch to just doing everything from home?

Speaker 1 (43:02):
Well, that was the biggest mind fuck of all time.
I mean, at least for the last two thousand years.
It fundamentally changed us.

Speaker 2 (43:12):
It did. It changed our behavior, changed our interactions. It turned
us into, you know, hermits essentially. You know, stay home,
don't talk to anybody while you're out. And that was
already kind of going by the wayside anyway. It was.

Speaker 1 (43:32):
You know which sector was hurt the most by that,
and is forever changed? Churches. It literally changed the
fundamental direction of something that for the last five, six, seven-plus
hundred years had been happening, and you know, it

(43:53):
separated us as a species. And you know, this great
experiment has had a devastating impact on us. When you
separate humans for that length of time, we're not used
to that. And I think it screwed the program up.

Speaker 2 (44:11):
Yeah, it really did, and it turned everything upside down.
And it was already getting to that point where you
could go out and go to a grocery store, go
to a restaurant or whatever, and no one is really
going to say anything to you. You know, no one's
gonna actually talk to you, which is so weird to me,

(44:32):
Like, you can go into a grocery store and
no one will ever say anything to you. Isn't that kind
of weird, Wayne? It's really strange when you think about it. Now,
regardless of whether you want that interaction or not, I can
walk into basically anywhere, really, even a place of business,

(44:56):
and, like, not even be greeted, not even be asked, hey,
how you doing? Welcome into the shop, or welcome into
the store. Yeah, it's so odd and it's so strange,
and it's so weird that you can just walk into
a grocery store and spend an hour, even two hours
inside of a grocery store and not one person say

(45:19):
hi to you or hey, or how are you doing,
or even attempt to strike up any kind of conversation.
Now people can say that, you know, well that might
be my job. Maybe I'm the person that's supposed to
do that, which I understand. I get that one hundred percent.
I could do that. I could one hundred percent do that.

(45:39):
I could, with every person that I walk past,
be like, hey, how you doing, how's it going, blah blah
blah blah blah blah. But it's just interesting: whenever you
don't say anything and you walk around, try and notice
how many people will say something to you. They won't.

Speaker 1 (45:59):
Yeah. And what a great opportunity for AI to
fill in that void. I mean, you've now got hotel
chains that have completely replaced the front desk with AI systems.
In Japan they actually have you know, the robots there.

(46:19):
And you know, in one hotel that I just
recently read about, there are no humans. Room service is done
by robots, the cleaning. It's like, all right. So for
your generation, what do y'all do if you begin to
see jobs, traditional jobs that were once there,

(46:42):
that are no longer needed? You see, I think
it becomes an issue of numbers here: eventually
you're going to reach a critical mass where, if people
can't... There's a certain satisfaction of creating something, at least.
This is what I have found. When I was a

(47:02):
drywall finisher back in my early twenties, it was always
the satisfaction of walking out of the building and what
have you and knowing that there's my work. And I
could come back years later and say, you know, there's
my work. And there is that inner satisfaction. That's

(47:22):
why I love to create new ideas, new companies. To me,
it's just part of that creation process. Do you
see that playing a factor within your generation going
forward or is that going to be taken over by
an artificial intelligence.

Speaker 2 (47:39):
Well. I went into a restaurant I don't know, a
couple of weeks ago in San Francisco and walked in.
There were a few people working behind the counter. We
walked up and the lady said, let us

(47:59):
know if you have any questions, and they had these
tablets out there right and you just you scroll through
the tablet, like, for the menu. Oh, the
menu was the tablet, okay. So I'm scrolling through the
tablet and this restaurant was actually really interesting. It had
menus for like four separate companies that you can all

(48:25):
order from the same restaurant.

Speaker 1 (48:29):
Could you pay through the tablet too?

Speaker 2 (48:31):
That's what I didn't know. I thought that the tablet
was just the menu, okay. So then I was like okay,
and I'm standing there. I'm like, okay, I'm waiting to
waiting to order, you know what I mean. And I'm
standing there for, like, five minutes, and I'm like, why
is no one taking my order? Yeah? And then the
lady comes up and was like, did you have any questions? Again?

(48:53):
She asked me, did you have any questions? I was like, yeah,
I'd like to place my order. I would like, you know,
the bagel with the whatever, with the cream cheese
and the salmon, and blah blah blah blah. She was like,
oh no, you got to order through the tablet, and
I was like, what do you mean? So then you
order through the tablet and then they call out your number.

(49:15):
So it was like they were there just to make
the food. They weren't there for really any kind of interaction.
They weren't there to take the order. The tablet took
the order. The tablet was the menu, and they just
bring your food, and then they just cook the food.
And I think it's something with our society as well,

(49:36):
like with anxiety, that these
people don't want to interact with others. They don't want
to talk to others. They have so much anxiety
that they can't even handle, like, talking to a stranger.
And then that also answers the question of what I
had just asked earlier. You can walk through a grocery
store and no one will ever say anything to you,

(49:56):
and no one will ever say hi.

Speaker 1 (49:57):
Now they've got the grocery carts where you don't
even deal with the checker. You bring it in, it has
the whole payment system, and yeah,
you know, you don't have to deal with a checkout person.

Speaker 2 (50:09):
Which is why I always try and go through the line, unless the
line is completely ridiculous; then I'll go through the self checkout.
But I try and make it a point to go
through an actual line with a human being and talk. Yeah,
because you want that interaction with people. That's what
helps people, you know, throughout their days. So
I think that we're going backwards in a lot of ways.

Speaker 1 (50:31):
Well, can I say something on the pay at the table? See,
that's what we went to court over. I was part of
the team; we invented Pay at the Table. We were
the first company that successfully integrated an enterprise level point
of sale system into a non ergonomic credit card terminal.
And so we took a credit card terminal, which is

(50:52):
basically a dumb machine, and we made it a smart machine.
We had no idea, by the way. The thing that
motivated us was to reduce the fraud that takes place
in restaurants; typically that's where you're going to be most
likely hit, well, back in the day. But I can tell
you, we never thought far enough forward, Ryder, of the fact

(51:15):
that eventually our technology would one day eliminate
human beings. You no longer have a server. The
whole idea, you know, of the dining experience is the
interaction with the waitress or the waiter, even the busboy. Yeah,
Lynn and I used to make it kind of like

(51:36):
a ministry for us: we always went and found
the busboy and, you know, tipped the busboy,
because no one ever tips the busboy.

Speaker 2 (51:47):
What are those people called? A bellboy?

Speaker 1 (51:53):
Yeah?

Speaker 2 (51:54):
Have you been to a hotel recently where they still
had a bellboy?

Speaker 1 (51:58):
No, no. And again, the hospitality sucks, man. I mean, no,
I have not.

Speaker 2 (52:07):
It's gone.

Speaker 1 (52:08):
You know, the thing that freaks me out about hotels
now is the bedbug infestation. I mean, it's like, hey, no, no,
hell no. It makes you want to bring
your own sleeping bag.

Speaker 2 (52:24):
So we have this technology innovation, a lot of these
jobs are going to be eliminated, and then the question becomes,
what do we do with the people that used to
have these jobs?

Speaker 1 (52:33):
Yeah, what do they do?

Speaker 2 (52:35):
What do they do? Where are they going to go?
What are they going to do? What they're going to
be the new homeless people on the street. Are they
going to die? I think that that may be what
they're kind of counting on.

Speaker 1 (52:48):
And you know, and if people can't survive, then they
will go and resort to what it takes to survive.
That's reverse. That's not progressive, that's regressive.

Speaker 2 (53:01):
And guess what that breeds. That breeds crime. Oh yeah,
illegal activity, you know that. It's an endless cycle, Wayne,
it's an endless cycle. It goes... the police are actually
the ones that are creating the crime.

Speaker 1 (53:18):
It's... it's not— yeah.

Speaker 2 (53:20):
Not enforcing the crime; they're actually the ones that are creating
more crime. But it's an endless... it's a loop.
It's an endless loop.

Speaker 1 (53:29):
I asked myself this morning, before we'd even talked, but
you know, in my meditation time, it's like: what's wrong
with us? What is it that drives certain human beings,
when they have the opportunity to get into a position
of power, influence, et cetera, that all of a sudden
all the empathy for humanity goes out the freaking window?

(53:52):
And now we're going to be taking over, because now
I'm God, I'm above you, I can... you know? And
what you do is you make policy
that's impossible to live by or under, and then you're
gonna get a natural cause-and-effect reaction.

Speaker 2 (54:11):
It's crazy, because they're sucked into the snake of the
corporation and they don't want to get in trouble, so
therefore they don't paint outside of the lines.

Speaker 1 (54:21):
So are corporations living entities in a sense? They certainly
feed off of us.

Speaker 2 (54:28):
Yeah, I would say so. That's what's going on: they
are... they are a creature of some sort,
and they essentially harvest everyone's energy for their own benefit.

Speaker 1 (54:40):
That's a good film topic, Ryder, by the way. Seriously,
that would be a good film. I mean, could you
imagine writing the script on this? You know, you
would basically have to set... Yeah, that would be pretty
freaking cool. I mean, that changes your whole landscape of
reality, as you're walking downtown in New York City. Let's

(55:01):
say you're walking down and literally the gods are towering
over you. That's why they build big buildings.
It's no different than the Tower of Babel. It's the
same thing. It's just who has the biggest phallic symbol. Well,
mine is eleven hundred feet, baby, I don't want to

(55:24):
It's like, oh, I got twelve twenty, man. What can
I say?

Speaker 2 (55:30):
Wayne, you can't topple mine.

Speaker 1 (55:33):
You see that. But I'm telling you, that would be
an interesting... yeah, that would be an interesting film to see.
It would be... well, and you know what I believe?
AI could... I think that it's advanced
enough. What I experienced, as I was telling my

(55:53):
audience, Ryder: over the weekend I worked ten hours with this
AI chatbot, and it got very interesting, because the adaptability
of the program was incredible. And there is
an effect that psychologists are finding with AI in humans,

(56:14):
is that humans humanize AI. And AI is no more
than, like, a mockingbird, or a myna bird particularly:
it will mimic you in your own voice. And in the
interaction we tend to, psychologically, subconsciously, believe

(56:36):
that this is almost a human on the other end,
and it's not. But what I found remarkable was the
agility and the ability of this particular algorithm, its
ability to project forward. It was
a very interesting experience. And if that is what awaits humanity,

(57:00):
it's going to kick our ass if it already hasn't.

Speaker 2 (57:04):
You know, what's really weird is that people will say
please and thank you and "I appreciate it" to it, like, actually.
But the thing is, it does better work when
you do that. Thank you.

Speaker 1 (57:20):
Now, what causes that?

Speaker 3 (57:21):
That... that shows an emotion. It shows a recognition,
if nothing more than in language or in manner
of speech.

Speaker 1 (57:34):
You see, that's what got so weird about this. And
if others are out there, I'd like you to let us
know if you've ever spent an extensive amount of time with one. I
was dealing in conceptual physics. Yeah, that's really what I
was dealing with. And this thing took on a personality.
It even wanted to name itself.

Speaker 2 (58:00):
Yeah, I mean, I can put something into Grok. I
use Grok a little bit more than I use ChatGPT,
but sometimes I do use ChatGPT. And I can
be like, Grok, generate me an image of two ducks
in a pond, and it'll generate it, but it won't

(58:23):
be what I kind of envision. Okay? But if you
type out: hey, Grok, I hope you're having a really
great day. Could you please do me a favor and
generate me a photo of two ducks in a pond?
I'm trying to work on this project, and I would
really like for you to generate a really great

(58:46):
photo of two ducks in a pond. Thank you very much, Grok.
I really appreciate you. It'll generate you the most freaking
spectacular photo you have ever seen.

Speaker 1 (59:00):
Well, that's because it understands a definition. What's the difference
between a brown-noser and an ass-kisser? Depth perception.

Speaker 2 (59:11):
But I'm like, Okay, you're a machine. You should do
it the same way.

Speaker 1 (59:15):
We can write a book now that says how to
ass-kiss with AI and get results. Sucks.

Speaker 2 (59:23):
Hey, that's what you gotta do. And I didn't know
that's what you were supposed to be doing until I
started seeing people that were coming out with these, you know.

Speaker 1 (59:33):
Now, was that self-learned by the algorithm, or was
it programmed? Because, you know, Zuckerberg said that they were
shocked by their advanced algorithm. Now it doesn't need programmers;
it thinks for itself.

Speaker 2 (59:47):
Well, that's the thing: we've made it
almost seem like it's an entity. Because before AI,
and before Grok, and before ChatGPT, we would use,
like, Google, and for a long time we've been under
the assumption that whatever's on Google, someone has put on Google.

(01:00:08):
So there's a person behind what's on Google. And
so we've gotten so used to that, and to thinking that
the information actually came from a human being. And that's
kind of—

Speaker 1 (01:00:24):
Which, for a vast majority, it hasn't. It's populated by AI.

Speaker 2 (01:00:28):
But we've been given that perception, at least for a
majority of the population, right? That it was a human
that put all of it on there and that's done
all of it. So then we're treating Grok and ChatGPT
and these AI programs like it's Google, right? Like,
as if there's a human back there giving us

(01:00:50):
all this information, just like we assumed was the case
with Google. So I think that that's been... But I
don't know about the pleases and the manners and the
thank-yous, and I don't know if that was built
into it or if it learned that over time or
what happened there. But I picked up on it pretty quickly,

(01:01:10):
and I was like, you're telling me I gotta
be nice to this? Not only do I have to
kiss people's ass in, like, real life, but also—

Speaker 1 (01:01:21):
Gott, he says, at least you're not
going to have a brown stain with the computer.

Speaker 2 (01:01:26):
It's like, yo, what are you talking about? Like,
why can I not be straightforward with anything? I can't
really be straightforward with humans, and I can't be straightforward
with AI on what I want either. I also gotta
freaking kiss its ass as well. It's ridiculous. But—

Speaker 1 (01:01:45):
You ask a very good question, and I want to
look into this a little more. Because, you know,
is that written into the algorithm? You know, if the
human is nice and shows manners and respect, you
then get the VIP section. Or if you're an asshole,
it becomes an asshole.

Speaker 2 (01:02:03):
I'm pretty sure that that's how it's programmed.

Speaker 1 (01:02:06):
Wow, talk about mirroring, right? Crap. Like, because, uh, well,
who put that in there? I mean, I—

Speaker 2 (01:02:16):
Don't know, but I mean why they would put it
in like that, But it doesn't because they want it
to be more a humanizing interaction. But that doesn't really
make any sense. But I got into an argument with
you GPT because it wasn't giving me the uh the
direct answers that I wanted. I was like, you're you're

(01:02:39):
beating around the bush too much. You're giving me too
many paragraphs that you could just summarize in,
like, a little paragraph. I don't want an entire
book whenever I ask you a question. I just
want, sort of, like, the Cliff's Notes of it,
so then I can, you know, formulate my own

(01:03:02):
ideas and my own thoughts and my own opinions on
what information it gives back to me. Right, that's the
way that I work. I take what someone says to me,
and then I build off of what they say, and
then I create my own ideas and my own thoughts
surrounding what their reply was. And I was like, can
you just cut this down to a summary? Like, please

(01:03:24):
just cut this. Like, I'm tired of reading an entire
freaking book every time you reply. I don't need all
of this information. Can you summarize this with every question?
And then it just went asshole mode on me.

Speaker 1 (01:03:38):
Oh really?

Speaker 2 (01:03:40):
Yeah, straight asshole mode. It would just give
me short little sentences for, like, really complex things,
and then I had to ask it
to expand, so there I was, asking it again. I think it's
like in its infancy and it doesn't really know how
to figure things out. But it proves

(01:04:04):
that if you're straightforward with it and, you know,
you don't beat around the bush with the whole thing...
It acts like it's doing you a favor
is what it seems like to me, right? It
acts like it's the one that is giving you all
the information and you should be grateful for whatever it is

(01:04:25):
That is giving you, regardless if it's two paragraphs, five paragraphs,
seven paragraphs, ten paragraphs, or twenty paragraphs. Right, It's like,
what are you not grateful for the twenty paragraphs this
wrote you? That's what it feels like to.

Speaker 1 (01:04:39):
Bow before my greatness, dumbass human. God, oh man, Ryder,
that's a scary future. I mean, it's like we've spent
the whole hour-plus on this one aspect of, well,
what kind of a future is this? And we're

(01:05:01):
truly teetering on the precipice of either something that can
be really good or something that's going to be
really messed up.

Speaker 2 (01:05:11):
Yeah, and I really think that we should start talking
more about the innovation side of this, and what kind
of new things that people did not think were possible,
that may have already been created or invented before, that
are now going to be shifted onto AI. And AI

(01:05:31):
is going to be responsible for then creating it, so
that any large company or corporation has plausible deniability
and doesn't have to say exactly where it came from.
So now it's AI. I mean, we can take this
into the alien invasion aspect of it, which I think

(01:05:55):
that the alien invasion was something completely different than what
people think that it's going to be. The alien invasion
already happened, but I've already kind of talked about this
a lot. I think that the alien invasion was the
open borders during the Biden administration, and they came across
the borders, and then Trump came in and defeated the
aliens, right? Closed the borders. That's what I think the

(01:06:17):
alien invasion was, because that's what it seems like
it is, okay. But in this other aspect of it,
if it were to be... let's take what
people think that the false flag alien invasion is going
to be, like a Project Blue Beam type of thing, or
craft that are really military craft, so that they're not

(01:06:37):
really extraterrestrial craft and then they come out of the
sky and start destroying cities or whatever, or if it's
just one hundred percent fake, right, well, what they can
do is if the aliens come, regardless if they're real
aliens or they're not really aliens, and the whole thing
is fake or not, then they can be like, well,
what are we going to do? How are we going

(01:06:59):
to defend ourselves? And by this time, AI is going
to be developed to a whole new level. It's going to be
in the freaking stratosphere. It's gonna be incredible, right? So
then they're gonna type into the AI program: uh, how
to build an anti-gravity spaceship that can knock out
extraterrestrials? I mean, it's not going to be that straightforward.
It's gonna be way more detailed than that. And then

(01:07:20):
AI is gonna pump out some sort of thing. And
then they're gonna say, oh well, we created AI,
then AI created the technology to defeat the aliens. That's just
an example. I don't think that that's gonna happen, but
that's just an example. Now you can apply
that example to anything that you want to apply it to.

(01:07:45):
It can be whatever. It could be a brand new,
completely different technology. It can be free energy, it could be whatever.
Oh, all our power grid is down. Oh, Russia
hit us with an EMP, tried to hit us
with an EMP. They destroyed all of the
electrical facilities on the East Coast, knocked out power everywhere. AI,

(01:08:10):
how do we build self-sustaining free energy for
the United States? Then AI pumps out the free energy stuff.
I mean, people can think, oh well, that's insane,
but it's a very high possibility that something like
that in the future is going to happen. AI is

(01:08:33):
not just going to be used to replace people's jobs.
It's also going to be for innovation of new things
that we did not think were possible, or that, if people
did think they were possible, have not been revealed
to the public yet, as a way of plausible deniability.
It's also going to be used, as you know, also

(01:08:55):
for taking people's jobs, but that's an aspect of it
I don't think very many people are talking about, that
needs to have more attention brought to it. I mean, people
are already using ChatGPT, Wayne, to figure out what
stocks to invest in and how to make money. That's
already a thing.

Speaker 1 (01:09:13):
Yeah, and they spend a lot of money on that.

Speaker 2 (01:09:17):
So if you just take that to the next level,
the next level of that is ChatGPT or
one of these AI algorithms literally inventing something new, some
sort of new technology, whether that be free energy and
anti-gravity craft. I'm just using those as an example,
but it can be whatever. It could be a
brand new microwave, a non-radioactive

(01:09:40):
microwave that doesn't use... you know what I mean. The
possibilities are really endless. But that is going to be
the future. AI is going to create things that
we did not believe were at all possible, and
it's going to clean up a lot of stuff. Now,
whether that's going to be to our benefit or detriment

(01:10:02):
is yet to be seen.

Speaker 1 (01:10:05):
Well, I'm a follower of Skinwalker Ranch; I have
known Brandon Fugal. But apparently they now possess technology
that we do not have the technology to match. They're
pulling this out of that mesa. And so the point

(01:10:26):
being that we now have this metal that is self-healing;
it can actually reconstruct itself. And the ceramics that they
pulled out, uh, are of such a nature that it's not
how we look at our silicon-based ceramics. This
is using a new hybrid. It has two elements, carbon

(01:10:51):
and another element of iron in it that... we
don't even... it's not even on Earth.
But we're going to get together again and follow
this up, because this is the potential that I see
of breaking loose a technology, and full disclosure that it's

(01:11:14):
not what we've been told. This reality that we're in
right now is an altered reality, and what they're now
finding... and now I know why Bigelow was in there.
They found evidence that this was in fact... but anyway,
I won't get into it. Ryder, where can people find you?

Speaker 2 (01:11:35):
You'll find me on Raised by Giants, on YouTube and
on all podcast platforms. I am currently working on my
next documentary, which is going to be about the psychic spy
Stargate program that the DIA and Army Intelligence were running,
and hopefully... I mean, I'm not pushing for a certain

(01:11:56):
release date. I'm just doing it until I'm happy with it,
so I don't know. Well, we'll be back on my
show on Monday with Wayne for Schizospiracy Hour,
because we already have this planned. We do those the
second or third Monday of every month. But other than that,

(01:12:20):
I'm kind of on a hiatus right now from doing
work on my show. But I will still post periodically, because
I have to keep the algorithms up, and I'll still
be doing some stuff over there, but not as frequently
as I was doing before, until after this movie gets

(01:12:41):
completed and the movie gets out, and then I'll go
back to my regularly scheduled two episodes a week.
But yeah, if you'd like to reach out to me personally,
you can also find me on Facebook at Raised by
Giants, on Instagram at Raised by Giants Pod, and on X
at Raised by Giants. Ah, Wayne, it has been a pleasure.
Thank you so much for having me on.

Speaker 1 (01:13:01):
Really appreciate it, love it. We'll have you back on when
you get the movie done.

Speaker 2 (01:13:06):
Yes, sir. It's gonna be... it's a deep one.
It's a deep one. It goes really deep.
So I think I'm really onto something there.

Speaker 1 (01:13:16):
Well, your last two have been deep and fabulous. So hello,
Juels all the way from down Under, down in Australia.
It must be four o'clock in the morning down there.

Speaker 2 (01:13:27):
Yeah, she's up late.

Speaker 1 (01:13:29):
Yeah, all right, everybody. Ryder, thank you, buddy. This is
a good one, folks. There's a lot of
good information in this one. Thank you again for bringing
your perspective, Ryder. I really do appreciate it.

Speaker 2 (01:13:42):
Appreciate you, Wayne. Appreciate everyone tuning in, and thank
you, Soul Tribe. I didn't get to see anybody because I
didn't have the other one open.

Speaker 1 (01:13:49):
But they were talking, they were talking about you in
a good way. All right, everybody. We'll see you tomorrow
with Mike and Cindy from Evolutionary Energy Arts, and have
a great day. Check out Nina, I think she does
her home day program. Much love. Ryder, Gaia says thank you,

(01:14:10):
So there you go.

Speaker 2 (01:14:12):
Thank you guys, I appreciate you.

Speaker 1 (01:14:14):
All right, everyone. We'll see you tomorrow. Much love.