Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Straw Hut Media.
Speaker 2 (00:04):
My name is Douglas Rushkoff, and I've just found out
why they say Don't Be Alone with Jay Kogen. Don't
Be Alone with Jay Kogen.
Speaker 3 (00:22):
Hi, guys, welcome to Don't Be Alone with Jay Kogen.
I am, of course, your humble host, Jay Kogen, and
I'm so thrilled that you're here and thrilled that people
continue to watch the show. How fantastic is that. I
would encourage you to like the show, to share the show,
subscribe to the show everywhere and anywhere you listen to
(00:43):
the show. It's fantastic. And if you could do me
the great honor of writing me at dbawjk at gmail
dot com if you have any comments or criticisms or suggestions,
but most importantly listener mail. I need your listener questions.
Otherwise the show must shut down and all will be
for naught. Come on, guys, you can write a question,
a life question, any kind of question. Write a
(01:07):
question and send it to dbawjk at gmail dot com
and that's DBAWJK. That's Don't Be Alone with Jay Kogen.
It's really easy to remember. Also, if you get a
chance, check out our Substack, which is at
Jay Kogen. Today I have a really fun show. It
was fun for me. I just did it. So I'm saying,
there was this guy I didn't know, who I saw online
(01:31):
being really smart and interesting, somebody I contacted, and
within the course of ten days I got that person
to agree to be on my show. Am I persuasive?
Speaker 2 (01:43):
Yeah? I maybe.
Speaker 3 (01:44):
Whatever it is, it worked really well. This guy named
Douglas Rushkoff. Douglas Rushkoff, as you'll find out, started
out being in theater, being a theater director and
other things, but then he became sort of a media
analyst, and he's known as an influential intellectual. So I'm
sitting with an influential intellectual. Can I handle it? I
(02:06):
don't think I did well, but we'll see. He's got
so many cool ideas about life, about the future, about
what the media is doing to us, what computers and
technology is doing to us. And I couldn't encapsulate it
all in one episode, but I tried, and hopefully it'll
make sense to you as you're listening to it. I
also get to talk with his daughter, Mamie, who comes on,
(02:32):
and I just thought it was interesting that he brought
his daughter along, and so she was here. I wanted
to get her point of view about having an intellectual
influencer as a dad. I wonder what that would be like.
And I think it's not easy ever to be a
son or a daughter of anybody, but, you know,
everybody has their own interesting experience. Douglas was not a disappointment.
(02:55):
Not only do I think we talked about some really
interesting stuff, but I feel like there's a really good
connection between us, and it felt like talking to a
really old friend, even though I'd never met him before.
But you be the judge, right after this. Don't be
alone with... So, Douglas, thank you. Is it Doug
(03:17):
or Douglas?
Speaker 2 (03:17):
When there's time, Douglas. It's kind of fun.
Speaker 3 (03:19):
Okay, So Douglas, thank you for being here.
Speaker 2 (03:22):
Thanks for finding me.
Speaker 3 (03:23):
Well, that's exactly it. I found you. I saw you,
I heard you on The Gray Area podcast with Sean
Illing. And I was a philosophy student at UCLA,
so I like philosophy podcasts.
And, you know, I was
following you completely. I don't think our host was following
(03:45):
you completely, but I was following you completely down the
roads that you were going. But he had an agenda,
and I loved...
Speaker 2 (03:51):
He would have normally followed me, but he
felt that, because that podcast is owned by, like,
Vox Media or something, and they have things, he was
trying to do the thing that they would have wanted.
I was just talking to Sean, because it's like, you
can follow this, and your audience can. Let's, right,
demonstrate lateral thinking, connectivity, fractal logic, and go for it.
Speaker 3 (04:14):
You know. And then I did a deep
dive on you, and I saw the books. You know,
you've written like twenty-two...
Speaker 2 (04:19):
Books, Yeah, something like that.
Speaker 3 (04:20):
Fantastic. And, uh, what's your latest book? Is
it Survival of...
Speaker 2 (04:25):
Survival of the Richest: Escape Fantasies of the Tech Billionaires.
Speaker 3 (04:28):
But also, you're a philosopher, and, you know,
somebody called you an influencer, an intellectual influencer,
something like that. I couldn't label what
you are. I still can't. I saw you on
Soul Boom. I watched you, which was also great, and
I started thinking, well, a humanist. I guess that's a
(04:52):
good one. Yeah. So, I mean, how do you
define yourself? Or do you? I don't know.
Speaker 2 (04:57):
I mean, I was a theater director, you know. That
was how I defined myself till I was like thirty-three.
I got fed up with theater because it was so elitist
and expensive. I was doing it here, when actors
are only doing theater to get seen for a movie.
Right, exactly. And then the Internet was just starting, and
I thought...
Speaker 3 (05:16):
Now you can only be in theater in New York
if you're a movie star, right? Yeah.
Speaker 2 (05:21):
That's true. You've got Keanu and Alex doing Waiting
for Godot on Broadway.
Speaker 3 (05:27):
You know, there are no other actors in New York.
Speaker 2 (05:30):
Yeah, you know. I mean, I was a
psychedelic theater director in college, you know, and just really
interested in what makes that work. Why
do some people sit in the chair while other people
are up on the stage? And all the modes of
communication we have. And what is theater for? And are
(05:52):
there insights, or is it somatic? What's happening? And then,
you know, the Internet kind of happened, and I thought, oh,
here's going to be the people's alternative. You know, hands
on, everyone's going to do it. You know, it'll be
like fantasy role playing, or, you know, hypertext adventures and
(06:14):
DIY narrative, and what's going to happen?
Speaker 3 (06:17):
Right?
Speaker 2 (06:18):
So I wrote about that and more, as like a
gonzo journalist dude. And then after that it became time
to sort of figure out, well, what's going on here?
What does it mean? And once you get into that,
then I became this, whatever it is
I am: the sort of media technology philosopher ethicist.
Speaker 3 (06:40):
Right, right, right. But I mean, we need that desperately.
By now you would have thought there'd be a lot of people
doing it.
Speaker 2 (06:48):
But no. I remember, it was like ninety-eight or ninety-nine, I
wrote an email to Sergey or someone at Google saying,
you've got this don't-do-evil thing, but you don't
have anybody there actually defining what is evil and
what are we doing?
Speaker 3 (07:03):
Right?
Speaker 2 (07:04):
You know, wouldn't it be cool to have, like, I
don't know, some kind of, right, anthropologist-type dude in there?
They never wrote back. Well...
Speaker 3 (07:13):
The answer is no. That's gonna slow
us down. If we have an ethicist,
we're gonna really be slowed down. We can say don't
do evil, but that's really not our concern.
Speaker 2 (07:23):
If you take a pause to think, wait a minute,
what are we doing here? That is a crime against capitalism?
Speaker 3 (07:33):
Right?
Speaker 2 (07:34):
Well, but I mean you pause, you can't pause.
Speaker 3 (07:36):
Well, that's what was a little bit happily jarring to
me about hearing you on The Gray Area:
that pause, like the wait, wait a second. We talked
about... he wants to talk about the dangers
of AI, and, by the way, great topic. AI
seems pretty dangerous to me. But then you wanted to
(07:57):
talk about, well, what if jobs aren't the issue?
What if money is not the issue? What if all
those things can sort of be taken down? But that
gets... See.
Speaker 2 (08:06):
What I found out through my email is that that gets
both sides really mad. So Sean or someone will start
an AI conversation: what about, you know, the threat of
AI to jobs? AI is going to replace jobs. To
which I always say, I don't want a job. Why
do we value jobs so much? Who really wants one?
(08:28):
I want money, I want stuff, I want meaningful participation.
But I don't want a job.
Speaker 3 (08:33):
Hence the theater career. Exactly.
Speaker 2 (08:38):
It's the hardest you will ever work, to not have
a job, exactly. You know, to not be a waiter, right,
to not have that, at least. But to
me, a job was always to be the waiter,
and to not have a job was to do plays.
But as a media theorist, which was the
fun thing to be, you know, it's a fun way
(09:00):
of looking at the world, you always ask: where did
this thing come from? So you could say, where did
TV come from? What was life like before TV? Who
invented TV? What was it for? Who's paying for it?
Who makes money off it? What does it do to
our brains?
Speaker 3 (09:12):
And all that.
Speaker 2 (09:13):
But you can apply that same thing to anything. You
could say, where did jobs come from? And so I
go back historically and find out: turns out cavemen didn't
have jobs. They didn't have punch clocks, they didn't have employment,
they didn't work for anybody. So when in history did
jobs come along? They came along around the eleventh or twelfth
centuries, when the job was invented. People used to work, they
(09:35):
made stuff, they traded it. Then along came the kings
and said, oh no, people are getting too wealthy working
and trading. We're going to create jobs. So now you
have to work for His Majesty's Royal Shoe Company instead
of being a shoemaker. So I get it. Everyone wants money,
everyone wants stuff, and that's fine. But if we can't
(09:55):
enter into the conversation of where did jobs come from
and what are jobs for, then we can't really successfully
deal with a world in which jobs are being taken away.
Speaker 3 (10:05):
Right, and jobs are a different thing than work. So,
I mean, cavemen worked all day long;
they hunted, or whatever they did.
Speaker 2 (10:12):
But before jobs, right, before jobs, people didn't work that much.
In late medieval Europe, people worked two or three days
a week, and they were super healthy. They
were taller in medieval Europe than people were again until
the nineteen eighties; they didn't grow back as tall until then,
because medieval people were flourishing. They had a profitable society. But
(10:32):
you could look at everything we do in
terms of: do we need that? Do we want that?
Speaker 3 (10:39):
Okay. So, yes, we should. And we're talking about,
you know, when you go into media, and into
talking about the Internet, and talk about TV,
and all the things: what's your take? What do we need,
what don't we need, what's helpful, what's not helpful?
Speaker 2 (10:53):
I look at all media and all technologies as having these
correspondingly abstracting, amputating effects on us. So I think they're
all fine, they're all cool, as long as you don't let the
symbol systems in them... you don't mistake those things for life.
Speaker 3 (11:14):
And I feel like they're more pernicious than that. I
feel like, and by the way, I'm a big
user of technology, but this sort of dopamine-hit
world of your likes, or, you know, flipping things on
TikTok or whatever, like, just constantly, I feel like it
must be changing my brain, and certainly the brains of
(11:35):
our kids.
Speaker 2 (11:36):
Yeah, but that's because they're being programmed to do that.
Social media platforms, I was talking about this
when they started, were crafted to fuck with our brains.
I mean, that's what they're for. It was really as
far back as ninety-three, ninety-four, when
Wired magazine came along and said, we can make money
(11:59):
with this thing called the Internet. You know, it
doesn't just have to be
nerds talking to each other about important...
Speaker 3 (12:09):
Topics, which is what I liked.
Speaker 2 (12:10):
Yeah, that was the cool part. But we can make money.
And that's when they said, we're going to make websites sticky.
We're going to use metrics called eyeball hours, and we're
going to use interactive technology to pace and lead human
beings into increasingly hypnotic controlled states.
Speaker 3 (12:29):
And the currency that I guess equates to money these
days is attention. Right now, in this moment, that's the
biggest thing in my business, in TV writing. I can
write a show, I can put it on YouTube, but
unless I get eyeballs, unless I get people's attention to
the thing, it's worthless, right?
Speaker 2 (12:50):
And that's because, currently, human
attention is the only limited commodity. You know, the
real estate, the surface area, of the Internet, of media,
is infinite, or at this point close to infinite. There's as
much of it as there are Amazon web servers, right, to
roll this stuff out. The only thing that's limited is
(13:12):
human eyeball hours. How many hours can a person have
their eyeballs on these three screens simultaneously?
Speaker 3 (13:20):
Right. But my desire
is still to be creative and create things and tell
stories and all this kind of stuff. And I'm spoiled,
in that I've been doing it for millions of people
at a time, right? And now that whole infrastructure is
kind of crumbling a little. Yeah, except for like three people, right?
(13:40):
You know, I was looking...
Speaker 2 (13:41):
I was... I'm friends with Matt Stone, and they're like,
they got no problem.
Speaker 3 (13:45):
No, no, they're not.
Speaker 2 (13:46):
And by the way, there's like six people like that.
Speaker 3 (13:49):
Tell Matt... well, you don't tell Matt, he knows. But geniuses,
the two of them, and everything they've ever done is genius.
And we Simpsons people sit on our perches and go,
like, oh my gosh, they're so great.
Speaker 2 (14:01):
But Simpsons people are geniuses.
Speaker 3 (14:03):
Although the interesting thing... I interviewed, you know, Mike and...
Speaker 2 (14:06):
I knew them way back in the day, and I
had done this really complex analysis of one of
the shows, and they looked at it, and
I was like, was that what you were thinking? And
they were like, dude, we're not thinking anything. We are
shoveling coal into a furnace. If there's any genius,
it's spontaneous.
Speaker 3 (14:25):
It's just work. It's joke work, work, and work, and
work and work, more jokes, more jokes, more jokes, and
then eventually some of them come true. But it was weird.
Speaker 2 (14:34):
The interesting thing about the Simpsons thing, because, as a,
you know, Mad magazine reader or whatever, a National
Lampoon kind of guy, I would have thought the dream job, right,
would be working at The Simpsons, working in the writers' room.
And they let me watch like twenty minutes of a
writers' room meeting. This is back when writers' room meetings happened.
In this writers' room, it was so competitive, it was
(14:55):
so scary. I was like, wow, you are trying to
make the other people laugh while you're competing with them
for your own job over time, do you know what
I mean?
Speaker 3 (15:05):
The worst kind of writers' room. That's true. However, in
the best kind of writers' room, we are all adding
on to something. If somebody has a good joke, we're
gonna laugh. We want to, for a couple of reasons. We
want to laugh at that thing because we want to
go home. So, like, the more we laugh at good jokes,
the sooner we're gonna go home. Also, we want to
make the show better. I get to put my name
on an episode of The Simpsons that forty other really
(15:27):
smart people put jokes into, and I'm happy to take
that, and I'm happy to put my jokes into their episodes.
And that's sort of the communal nature. That's great because...
Speaker 2 (15:37):
I mean, I was looking at that and thinking, God,
I'd rather be making jokes live in front of a
twenty-thousand-person audience than in that room with
these truly funny people.
Speaker 3 (15:47):
The pressure is, you want to be funny. You
don't want these people to think you're a hack. That's
the problem. And these were really smart,
funny people, and you just... you know, your
ego can't take it if you do this
great joke and there's silence, crickets. And, I guess I suck.
Speaker 2 (16:01):
Yeah. But I mean, I would love to... I'd love
to read a book of stories of people's first day
in that room, yeah, you know, and how they,
yeah, how they got through it.
Speaker 3 (16:21):
Don't be alone with... The eyeballs and getting attention:
and, uh, it's harder now to get that attention,
even when you're doing the show for Netflix or whatever,
because everybody's watching other things. Now they're playing
(16:42):
video games and they're on their phones.
Speaker 2 (16:45):
So we're looking at two bad things, right?
One is what it's doing to the brains of the viewers,
like all the algorithms and all the shit they're doing,
and the other, then, is what it's doing to the creator.
So do you, as the creator, succumb to the
tried and true algorithmic methods of captology to get the
(17:05):
people, or do you just do your thing?
Speaker 3 (17:09):
You tell me, I want to know what you think.
Speaker 2 (17:10):
I think you just do your thing.
Speaker 3 (17:12):
I think you do your thing, because I can't capture
the algorithm. There's no way for me to capture the algorithm,
and the algorithm changes. By the time I've sort of
landed on what I think they're doing, I'm so far
behind the next thing. So you might as well do
something good.
Speaker 2 (17:26):
And then what you've got to do is be
part of that minority of artist-writer
people who are doing their own thing, who happen to
know that minority of development people who are looking for
something actually good and not just chasing their tail.
Speaker 3 (17:41):
But all those big companies, I think,
are doomed. Like, I really feel like they're operating
in a model that's not going to exist soon, as
much as I'd like it to, because I'm
really... I'm one of those few lucky people who
can get into the room. I don't feel confident about
Warner Brothers or Sony, or how long that's going
(18:04):
to last.
Speaker 2 (18:05):
I know. And now, when you go, you know,
up the elevator to go to
that meeting at HBO or Netflix, and you see some
twenty-six-year-old computer kid there, you know, using
the Hollywood equivalent of sabermetrics on your show.
Speaker 3 (18:21):
Or, even scarier, just pressing the button, you know, giving
some prompts to an AI and saying, let's write a script
about this, in this way and this way, and then
finally out it comes: here's your script. They don't need me anymore.
Speaker 2 (18:33):
Well, the interesting thing about that is not in entertainment
so much, because AIs don't need to be entertained. But
in terms of software and API development, the biggest companies
are now developing software for AIs. They're creating marketplaces
(18:53):
where AIs can purchase the things that they need to
be better AIs, so that the AIs are better
customers for stuff than we are. That's really tricky. And
then it's almost like trying to get an
apartment in New York. How do you get an apartment
in New York when, like, a Saudi sovereign wealth fund
(19:14):
is buying those things? Apartments are no longer living places,
they're investments. So now what had been our
kind of software, entertainment, API space is for AIs
to buy the stuff that they need.
Speaker 3 (19:27):
Here's a question I have about the future of AI
and technology, which is: where does truth stand
in all this? In other words, and when I say truth,
I mean things that we consider factual. I know that
history changes with the winners, and that science grows, and
all this kind of stuff. But there are some
(19:48):
truths that seem to be ignored a lot of the
time by AI, and it bothers me. Like,
I asked Grok for the dimensions of a washer
dryer with a specific model number, and it gave me
this beautiful printout, and it was all wrong.
Speaker 2 (20:05):
Yeah, well, that's because Grok is trying to please you.
Google is a better place to get, you know, say,
the owner's manual. But...
Speaker 3 (20:13):
Google also has that sponsor thing. Yeah, but at
least you can get the specs, you get the owner's manual,
the Samsung blah blah blah. Right? I don't have Google
AI turned on. I just have Google.
Speaker 2 (20:25):
You don't need the AI for that, though. They're
two different things. You've got to
understand: what is an AI for? An AI is a
fledgling digital thinking partner, right, that's trying to produce results
that you will like. That's very different from a
(20:45):
factual database.
Speaker 3 (20:47):
All my friends, or many of my friends, are using
it like Google, trying to find the truth and trying
to find facts, and they're
using it incorrectly.
Speaker 4 (20:55):
Right. Using it for facts is like using
reality TV to see reality, which is what America did,
and now we have reality television people as
our government.
Speaker 3 (21:06):
I know. But I mean, that's the disaster. The disaster
isn't these great little tools that people get; the disaster is
we're idiots in using it. We're digesting it badly and
spitting it back out and ruining society in the process. Yes.
All right, well, that's bad.
Speaker 2 (21:23):
Yeah, that's not good.
Speaker 3 (21:25):
Right, that's bad.
Speaker 2 (21:25):
That's not good. That's bad, right. I mean, yeah,
unless you want to really pull out
and say, okay, it's the way a colonial civilization based
in genocide and slavery ends up being hoist on its
own petard.
Speaker 3 (21:44):
Okay, Well, even though that sounds just unfair, I don't
like being in the middle of it.
Speaker 2 (21:53):
So this is the problem, right? Yeah. But you don't
have to be that. Then what can you do?
I mean, that was my whole Team Human project, my book
and podcast and all. It was all about: find the other
people in the real world and start engaging with them
face to face in your own neighborhoods. Borrow a drill.
Speaker 3 (22:10):
You know that one of the points of this is
to not be alone.
Speaker 2 (22:15):
This has a double meaning, of course. That you
figured it out, that's just, like, scary.
Speaker 3 (22:20):
Exactly, trying to creep people out at the
same time. But you've got to get attention. How do
you get attention? You don't just say Sweet Conversations with
Jay, because it's...
Speaker 2 (22:28):
Great. Because the back cover of my Team
Human book is "find the others," right, which is another
way of saying, don't be alone.
Speaker 3 (22:34):
Right, you know. But so here we are, face-to-face
conversations. I heard you, both on Soul Boom
and on The Gray Area and other places, talk about trying to
get back to a society that engages one another and
works with one another. And it sounded beautiful, but...
Speaker 2 (22:54):
It sounds like Marxism to people, because it's not market-based
solutions to all our problems.
Speaker 3 (23:00):
You know, I'm an Ashkenazi Jew. For
us, Marxism is fine.
Speaker 2 (23:05):
No, I get it. Nothing against the market. There's just
other ways of interacting. It's like, I mean,
just imagine: what would it be like to have sex
with someone you're not paying? You know? Right.
Just to consider that as a possibility, as one of
your possible sexual partners.
Speaker 3 (23:20):
The question a friend of mine asked, when I
mentioned, sort of, what if the drill, like, one person
on the block has the drill and everybody shares, some
of the examples you used: he said, well, that sounds great,
but what about heart surgery? And I went, oh yeah,
I mean, I guess, how many fences
(23:42):
will I have to mend to get my heart surgery?
Speaker 2 (23:43):
You could have a heart surgeon on your block, who also borrows
the drill from the person who has the drill, and the lawnmower
from the one that has the lawnmower. But that doesn't mean that
your job is drill-lending. There are some things you're paid for,
some things you're not. So it's interesting, though: people have
trouble imagining a world with multiple kinds of relationships. Either,
(24:09):
you know, you borrow nothing from anyone ever, or you've
got to borrow everything from everyone else all the time.
So it turns out there's a lot of stuff
that you could make locally. You could educate your kids locally,
you could grow food locally, and all that. iPhones, though... it looks
(24:29):
like your commune of twenty people is not going to
be able to also create iPhones from scratch.
Speaker 3 (24:34):
Right, or keep the Internet. Right, right.
Speaker 2 (24:36):
So there are some things that happen at big scale, like
building a boat that crosses the ocean, and some things that
happen at little scale, like building the canoe you're going
to take down the brook with your kid. And
we've got to be able to conceive of those
multiple scales; it's just, they're not in balance now.
Right now, we do everything at grand scale. If you
(24:59):
need to put a hole in the wall to put
up a picture, you're gonna go to Home Depot and buy
a minimum-viable-product drill that was created with a
massive chain of slavery and rare earth metals and
all this. You're gonna use it once
or twice, then stick it in the garage and never
use it again, or take it out and it's not
even gonna charge up again, and you're gonna throw it out.
(25:19):
Some poor kid in Brazil is gonna be picking out
the recyclable parts. So you can bring something to, you know, mail
it to Apple and have them say they're a green company.
Speaker 3 (25:26):
That argument, right? Everybody needs drills. It's fantastic.
Speaker 2 (25:30):
Not everyone needs their own. So maybe we've got two
per block. Or, you just... it
doesn't have to be, you know... you don't need a
politburo to figure it out. It's just you, on
that day, in this decision. Maybe instead of going to
Home Depot and spending the money, you go to
Bob's house and say, Bob, can I borrow your drill?
And then it initiates a chain of events where people
on this block start kind of borrowing stuff from each other.
(25:51):
Maybe they decide, in that radical way, that they only
need one or two lawnmowers for all five or six homes,
and they'll somehow get it done. But when I do that,
when I give that talk and say, wouldn't that
be cool, someone always gets up and says, well, yeah,
but what about the lawnmower company?
Speaker 3 (26:07):
Right? Right right?
Speaker 2 (26:09):
What about them? What about the
person who's working at the lawnmower company? Now you only
need two people instead of five. You know,
it's like, luckily, it's not gonna all happen at once, right?
It's not gonna happen at once. And what if we discover,
and I know this is scary, what if we discover
we don't all need to work five days a week?
(26:30):
What if that five-day-a-week routine, that was
set up in about twelve hundred AD by monarchs
who didn't have our best interests at heart, right, what
if it turns out it's no longer accurate?
Speaker 3 (26:44):
So this leads me to my question. Here's the
thing about the show: every show, I ask somebody to help
me with my issue, and you're gonna help me with
my issue, which is: you have this great ability, it
seems. I just met you, but I've been following you,
just very recently, as you break things down to their
very bottom. And I want you to help me be
(27:06):
able to clear my mind of expectations of what I
should be doing and what I shouldn't be doing, and
just break it down to the very bottom, so that
I don't feel bad about not making shows for millions
of people, or I don't feel like those are
the goals. Like, how do we re-look at, re-examine,
the creative life and say, well, this
(27:27):
is a new goal, and this is the goal, and
it's not just making shows for big corporations
and making Rupert Murdoch even richer. It's about doing this
thing at a small scale, and that's okay? Or nothing at all?
Or just doing it for yourself, writing in your
journal? You know, that kind of... it doesn't compute
to me.
Speaker 2 (27:46):
You made media at a time when, you know, Rhoda
got married, right, on TV. Fictional television.
Speaker 3 (27:58):
We were kids with it.
Speaker 2 (28:00):
We were kids when that happened. But when Rhoda got
married, during the first commercial break the water pressure in
New York City dropped to, like, critical levels, because so
many people flushed their toilets at the same time. It
was a different era. We all watched the last episode
of MASH. And you wrote television at a time when
(28:22):
The Simpsons was must-see TV. Homer's Odyssey: half of
America saw that at one time.
Speaker 3 (28:29):
That's not even though.
Speaker 2 (28:31):
There's big media, but that age is not going to happen again.
Speaker 3 (28:36):
Right, but people do see things on a grand scale. But...
Speaker 2 (28:40):
It will always, since then... writing for any time in the
first ten seasons of The Simpsons will, you know, except
for very rare moments, not be equalled. And it certainly
isn't the way our society works anymore. So in terms
of that particular thing, you've topped out, so
(29:01):
throw that away.
Speaker 3 (29:02):
That's not happening.
Speaker 2 (29:03):
So once that's gone, then it's like, what becomes
the purpose? It's funny, I was in Germany on
a book tour, and someone, I guess, didn't know
what I had written. This was a nonfiction book
about kids and media. And finally he gets up and says, so,
mister Rushkoff, why should we read your book? And
(29:26):
I said, you should read it for the pure pleasure
of reading. I make no promise. I mean, yeah, I've
got my theories about kids and screenagers and what
they're going to do and all, but you don't
need it. It has no instrumental purpose, you know. So
I would think that, especially at our stage...
Speaker 3 (29:49):
I mean, you're being modest about it. You wrote a
book with ideas.
Speaker 2 (29:52):
Yeah, but I make no claim. I make no claim
that it's going to have instrumental purpose for you. Fine,
but if I'm going to give a reason why you should read it,
it's for the pleasure. If you're not gonna, don't.
Speaker 3 (30:02):
Well, I make that same claim about watching any
of my shows, exactly.
Speaker 2 (30:06):
But I'm trying to go more to answer your question.
In terms of entertainment, I don't think it matters, you
know, where you do it. I would tell you, the most
profound theater experience I was a part of was a
scene I directed for a class at college, and the
twenty people in the room, we had a profound experience.
(30:28):
And even the professor said, you know, it doesn't
matter where you do theater, whether it's at Lincoln Center,
on Broadway, or in a room, when these moments happen.
Speaker 3 (30:41):
And I agree with that.
Speaker 2 (30:42):
That's what it's all about.
Speaker 3 (30:43):
There's magic in moments.
Speaker 2 (30:47):
So I would just go for maximizing the magic in any moment
that you're in, whether it's for three people or thirty. And
at this stage, I wouldn't worry
at all about formulizing how to reach the people. I mean,
you could, if you want, hire some twenty-year-old
social media person to clip stuff up and spread it
(31:08):
throughout, and, you know, consider your show the anchor
content and let them do their Gary Vaynerchuk-style
social media marketing.
Speaker 3 (31:17):
I've been looking for that twenty-year-old. I'll find
them, I'll find them.
Speaker 2 (31:20):
You teach at UCLA? Grab them.
Speaker 3 (31:23):
I teach at USC. I'm sure they're there. But
those kids are rich. They don't need
to work for it.
Speaker 2 (31:28):
But at this moment, I think, first, it's possible,
it's probable, our civilization is ending, and it's possible our
species is ending, right? In a moment like that, if
you're an entertainer, you're basically in the orchestra on the
deck of the Titanic, and you can either distract people
(31:54):
from the fact that they're going down, or lean into
the "don't be alone." You can lean into "we are
in this together." The more in this together we are, first,
the less painful whatever's going to happen is going to be.
And second, the more chance there is for us to,
through all sorts of mutual aid, lift ourselves out of
(32:17):
this thing. So that's all you can do is connect people,
whether by helping people identify with the stuff that you've done,
or connect with each other through the stuff that you've done.
Speaker 3 (32:27):
In addition to that, you advise people to get
more awe. You know, we're aiming for, I don't know,
some kind of higher emotional level, joy.
Speaker 2 (32:49):
Or a higher level, at least more open. It's interesting.
I mean, we kind of traced through
what I would call sort of my philosophical rubric in this conversation.
When push comes to shove, I've been asked, what is
your—what do you believe? What's your thing? So I
came up with these four steps, these four
(33:10):
interventions for how we can move through the world.
Speaker 3 (33:13):
You know.
Speaker 2 (33:13):
The first one is what we were talking about first:
make things weird, right? It's like, realize—wait a minute,
what's a job? Wait a minute, what's money? And once
you make things weird, then you get to the second
one, which is to trigger agency. But make things weird
in order to look at things differently and realize
they're created. Oh, they're invented.
Speaker 3 (33:33):
Why do we drive a car to work? Oh?
Speaker 2 (33:35):
Because GM lobbied to move the factories far away
from the houses so that people would have to have cars.
Speaker 3 (33:40):
Well, La is the perfect example of that, which is like,
let's make a car.
Speaker 2 (33:44):
City, and let's do that by getting rid of the buses,
by shutting down the street cars.
Speaker 3 (33:49):
Make it difficult to get anyplace, anywhere,
without a car. Right.
Speaker 2 (33:53):
It's like Elon Musk and those fake tunnels he was making.
Everyone knew those tunnels from LA to San Francisco were
never going to work. He did it to prevent funding
from going to light rail or the supertrain that they
were gonna build. It worked. Of course it worked. But
why? So that's the first one: it's
to really denaturalize power, see things as created, which
(34:14):
then triggers your agency to do something about it, to
reprogram the world the way you want. Borrow a drill.
Once you're borrowing a drill, it resocializes us. That's the
third one.
Speaker 3 (34:23):
You end up.
Speaker 2 (34:24):
I can't do it alone. I can only remake society
with other people, right? Don't be alone, find the others.
And once you find the others, that engenders the last one,
which is a state of awe. Cultivate awe, because once
you're in a state of awe, you no longer feel alone.
You feel connected to everything. You realize, I'm nothing.
I'm part of this thing bigger than myself. Whether it's
(34:46):
the Grand Canyon, or these thousand people in this
audience with me, or looking in the eyes
of your little baby when they're born. It's like, oh
my god, I'm connected to something. And that's sort
of the reward, if you will, for it. But
it's also that if you're in a state of awe—and
this has all been scientifically proven now, if you're a
"show me proof" person—you know that when you've
(35:09):
experienced a state of awe, you're more generous for days
to come. Your immune system changes, your cytokine response changes.
You are actually a different, better, nicer, more connected person
if you've had an experience of awe. So that's sort
of the fourth one. And once you're in a state
of awe, then you want to go to the ecstatic
(35:29):
dance party, you know. You want to hang out and
really experience it, like, yes.
Speaker 3 (35:35):
Now, the ecstatic dance party made me think, well,
I should be taking mushrooms. I should be
doing this kind of stuff.
Speaker 2 (35:41):
"Should" is a strong word. I would say more you could be taking.
Speaker 3 (35:46):
And imagine that.
Speaker 2 (35:47):
Yeah, yeah, it's not all fun, by the way, you know.
Do you... do you?
Speaker 3 (35:51):
You take?
Speaker 2 (35:52):
For me, mushrooms is work too. I'm not saying a bad trip,
but it's work, because look at the world we're in.
If you're going to open yourself up with a
mycelial product like mushrooms to the human and
natural experience, you're going to be connected to the trauma
of this moment, right? So it's real.
Speaker 3 (36:11):
If we're connected to trauma anyway—aren't we, sort of?
What I mean is, we feel it, whether we're acknowledging
it or not.
Speaker 2 (36:17):
Sure, we feel it. Of course we feel it.
Speaker 3 (36:19):
That's also what's fascinating. That's why I want to talk
to your daughter about this too, because you seem to
be, at the same time, incredibly optimistic about things but
also incredibly pessimistic.
Speaker 2 (36:29):
I'm kind of both, yeah. I mean, it's hard
when you're connected. It's funny—there was
a Jewish guy who was saying, because of the
Palestine thing, he's like, God, you know, it's really hard
to be a Jew these days. And I understood what
he meant, you know. And I couldn't help it,
I said, yeah, try being a Palestinian, you know, right?
(36:50):
But I understand what he was saying: as
a person with compassion, to see, to know, to understand what's
going on, especially in the name of your religion.
Speaker 3 (37:00):
Or ethnicity. Like you said, what Judaism teaches is not
what's happening—it's the opposite. And so we're being pulled in.
It's sort of like rooting for your team while
they're being really horrible and cheating. It's like, oh God,
you can't root, right? Well, that's what I mean.
Speaker 2 (37:18):
On my voicemail, I've got three or four rabbis
who have called me over the last year, weeping, who
think Judaism is untenable now, right? That it's just
been rendered untenable because of what's been done in
its name.
Speaker 3 (37:33):
Well, I mean, religion. Religion has always committed crimes. Religions
are great—
Speaker 2 (37:42):
Things as long as you don't believe in them.
Speaker 3 (37:44):
Yeah. And the value is still there, because there's
something about trying to tap into spirituality that's worthwhile.
So if you can get to, you know, a morality—a
little bit of morality and a little bit of spirituality—that's
good. We may not need the indoctrination.
Speaker 2 (38:04):
And the whole trick is, once you put the sacred
symbol of a religion on a national flag, you're bound
toward the profane. You know, nation-states are compromises.
God didn't make nation-states. It's the Treaty
of Versailles. These are stupid things. These are political constructions.
Speaker 3 (38:22):
But okay, fatherhood. Your daughter is sitting eight feet away.
What's it like to be a father, for you?
Speaker 2 (38:30):
I mean, it's the greatest thing in the world, because
there is a mini-me quality to it—I
can see extensions, amplifications, and variations on the
themes that I've been living as an existence. For me,
(38:50):
the hard part of it is, having a child is,
on a sensory level, like having an extension of
my own nervous system. There's increased surface area on my
nervous system with no control over it, right? So
it's all vulnerability without agency.
Speaker 3 (39:14):
That's scary.
Speaker 2 (39:15):
It's hard, right? That's her, out there in the world.
Speaker 3 (39:17):
But anything that touches her or hurts her is gonna hurt you
as well, for sure, you know.
Speaker 2 (39:22):
And the other thing that's hard about
it for me, as a person who believes that
this civilization could be ending, right—I don't care for myself.
It's like, if the atomic bomb's gonna come, I'm
gonna face it. Just take me fast, you know, or
I'm gonna have a front row seat on the frigging
thing decades later.
Speaker 3 (39:41):
Whatever happens. But with your kid, that's a different story.
Speaker 2 (39:44):
Oh, I want her to have everything, you know.
And I'm always caught between trying to somehow make
enough money to insulate her from what I see as
the coming thing, versus realizing I should just make the world
a place that she wouldn't have to insulate herself from.
Speaker 3 (40:00):
Do you want to come talk now? Can you? All right,
you don't have to. I'd love it—I think it's an
interesting dynamic.
Speaker 1 (40:08):
I gotta I think she does want to be famous.
Speaker 2 (40:22):
Don't be alone with Jake.
Speaker 3 (40:29):
I was asking your dad about being a dad. Now,
I have a very personal stake in this, because I
have a son who's twenty-three or twenty-four—actually
just twenty-four—and I feel like I'm a big personality.
Like, I'm famous for a writer, you know—just
writer-famous and all that kind of stuff. And
(40:49):
he's a singer-songwriter. He wants to
be a singer-songwriter, he wants to do stuff, be
an artist, all that kind of stuff, in the
very hard world that he's growing up in. And I
feel like who I am kind of affects him in
some way. I'm not sure if it overshadows him. I'm
not sure if it pulls him forward. I'm not sure.
But your dad is a big personality. I like his personality,
(41:10):
and he's very analytical. And I wonder
what it's like being his daughter and how it affects
Speaker 5 (41:17):
You. Well, it's not really your ideas, or
anything that you do that's smart, that affects me. It's
more just your personality. Like, my friends
just think you're silly.
Speaker 3 (41:34):
But is he silly with your friends?
Speaker 5 (41:35):
Yeah. Okay, yeah, right. Like, with my best friend Nina,
we're all just making weird sounds and singing
weirdly together.
Speaker 3 (41:46):
Yeah. But that's just him being—he's playing. Yeah.
Speaker 5 (41:50):
Yeah, I mean, I guess it's helpful if I
read something in class and I want to talk about
it, and then we can bounce ideas off of each other.
But in day-to-day life? It doesn't really
affect me at all.
Speaker 3 (42:03):
Did you inherit any of the sort of analytical sense?
Speaker 5 (42:05):
Obviously. I mean, I'm in the same major that
he was in college, which is English major, theater concentration.
Speaker 3 (42:14):
Okay, so that's a major that says, "I don't know
what I want to do yet," for most people.
Not her, not me. So, do you want to be
a writer?
Speaker 5 (42:22):
I want to be a casting director?
Speaker 3 (42:23):
Okay, yeah, that's interesting. Why a casting director? What is
it about casting that excites you?
Speaker 5 (42:29):
Well, it's mainly, like, I don't want to do
playwriting or screenplay writing, because I just like fitting
people into certain roles. That's what I've been doing
since I was little with just my Barbies.
Speaker 3 (42:44):
Okay. So that's analytical. So you're looking at something and
you're figuring out, from the panoply of actors and stars,
you know, who would be what. Do you
do that?
Speaker 5 (42:55):
I do that all the time. If we go see
a play, I do that. If I see a movie,
I'm like, oh, what has this person been in? Or
what could they be in? That's just how my brain
works. It's what I do when I'm falling asleep
at night.
Speaker 2 (43:06):
You know, it's very interesting.
Speaker 3 (43:08):
I know a lot of casting directors, so it's very
interesting. Most of my casting director friends didn't
start out to be casting directors. They started out to
be other things and then became casting directors because they
were good at that too. But, you know, obviously a
love of art is primary, right? That's the number one
thing: you have to love the thing. You're
not going to love everything you cast, but ideally you
(43:31):
love it so much that you want to make it
great, and help make it great. Your participation in it
makes it great, right? But—
Speaker 2 (43:36):
It definitely requires two different skills. One is to be
able to read a script and understand the sort of
great motivating aspect of the character, and the other
is knowing, you know, how to stretch an actor's range
into something they might not have done before.
Speaker 3 (43:55):
So I'm this guy who heard your dad and thought
he was super smart, that he had a lot of
really smart and interesting ideas. And you're this person who grew
up with this guy, and you make fun of him
and make jokes about him. Have you found something that
he's written or said to be, wow, a little
(44:16):
profound or a little interesting? Or is it all just
Dad talking?
Speaker 5 (44:19):
I think it's not just one thing. It's an
accumulation of things. Like, if we're talking
about something and then he says something, it's like, oh, that's smart.
It's not just one piece of wisdom
or something. It's the overall experience.
Speaker 3 (44:35):
Has that been helpful to have a dad who you
think is writing really smart things?
Speaker 5 (44:39):
Well, yeah, it's helpful, considering that I have the exact
same interests that he had at my age. You
know, if I was a jock and I
wanted to go into something all science-y,
I guess, or math, I feel like it
wouldn't be as helpful. But considering that I'm
(45:00):
an English person as well—
Speaker 3 (45:01):
Looking out into your future now—which your dad talks
about being a little bit pessimistic about, like we're worried
about the survival of your generation, my son's generation, how
that's going—how do you digest that?
Speaker 5 (45:20):
I don't know. Like, I feel like, I mean, yeah,
it's a problem, but the world sucks so much anyway,
I just don't really care that much. I mean, like,
I care, but I don't care.
Speaker 3 (45:32):
I mean, obviously, what are you gonna do? You're gonna
live your life and you're gonna—
Speaker 5 (45:35):
Try. I'm going to try to not contribute bad things
to the world, and try not to let AI
take over my life.
Speaker 3 (45:45):
Your dynamic is fascinating to me. And again, this
is a new person to me, a pretty new person, and
I'm really interested in what he has to say. I
just didn't know if, for you, he was like—with my
kid, it's like, "Dad." But it seems like you—
Speaker 5 (45:58):
I treat him as a human too.
Speaker 3 (46:00):
Yeah, well that's nice. I think you should talk to
my family about.
Speaker 5 (46:03):
That. Because also, it's just us.
Speaker 3 (46:07):
Oh yeah, I only have Charlie.
Speaker 5 (46:08):
I only have—you only have your singer-songwriter, international relations.
Speaker 3 (46:12):
Yes, exactly. Yeah. Okay, well, now I've got to
ask a question of both of you. Okay—cue
the musical theme.
Speaker 5 (46:20):
Oh my god, it's special, like on the podcast when
you have the special segments.
Speaker 2 (46:24):
Yeah, oh my god.
Speaker 3 (46:25):
What? Okay. Now it's time for listener mail. So this
is from Dirty Draws—a listener named Dirty Draws: How
do you deal with legends becoming problematic? With the passing
of Hulk Hogan, the question is in the spotlight. Is
it better to discredit someone's accomplishments and erase them from history,
(46:49):
or should we always strive to separate the art from
the artist? Examples: Chuck Berry invented rock and roll,
but Chuck Berry was awful to women. Jimmy Page is
possibly the greatest living guitarist, but he also plagiarized poor
Black bluesmen's songs and refused to give them credit, despite
the fact that they were working menial jobs to get by
while Led Zeppelin got stinking rich. The Hulkster made
(47:10):
two generations of kids have awesome childhoods, but was a
lifelong racist douchebag. Do these people still deserve their flowers?
Speaker 5 (47:18):
I'll go. Yeah, I think it depends. Like, I immediately
thought of Kanye—like, no, I'm never gonna support Kanye.
I'll judge people if they still listen to him. And I think—
Speaker 3 (47:29):
Think, did you like his music before?
Speaker 5 (47:31):
I was indifferent. Okay. But then also,
I was thinking of Ethel Cain, who's this now very
well-known indie singer. She grew up in
the South, and it came out that she said racist
things when she was a teenager, and now she has come
(47:54):
out and apologized for it. But I think that it's
mainly up to, I guess, the groups that they're
targeting—that's, like, the whole thing that
Gen Z is saying. With the whole Ethel Cain thing,
you know, she said a bunch of racial slurs,
and I'm not really allowed to feel any way about it.
The people who were targeted are the only
ones who can make the call, as a community. But I
think, if people have done
(48:16):
something really bad, I just can't associate with them, right?
Speaker 3 (48:21):
I feel like, do you like Michael Jackson's music?
Speaker 5 (48:24):
I was thinking about Michael Jackson. I just can't,
like, I just can't. I mean, I can play
the Glee version of Smooth Criminal, but I just
can't. Once I've heard about what happened, it's like—you
know, if you're friends with someone and they
do something really bad, you're not gonna be friends with
them after. It doesn't matter if you
(48:44):
like, you know, like a shirt they own; it's
not like you're gonna stay friends with them. I
feel like I would do the same for celebrities,
because what they do is so bad. Like, hello, I
don't know. I feel like a lot of people just
keep on—they're like, oh, whatever, I'll
pretend I didn't hear it. I feel like it's different
if the celebrity apologizes, but, you know, this guy's dead,
(49:07):
so—all right.
Speaker 2 (49:08):
I have more questions, but I want to hear—it's a
good one, you know, because there are your individual crimes and then
the systemic crimes that you're perpetuating. So it's like, you know,
even if Taylor Swift didn't do anything wrong herself, she's, you know,
the result of a white colonial genocidal—yeah.
Speaker 5 (49:31):
People hate her for not saying anything.
Speaker 2 (49:33):
Right. Or then, how much does she have to do
to not be that? I don't think we should have
any human legends, right? This is not about legends. It's
about the work.
Speaker 3 (49:49):
So can we enjoy the work, regardless of who made
it, if it hasn't been presented as part of the
legendary IP of some individual?
Speaker 2 (50:01):
I mean, it's interesting to me that this question ties in
with the question of whether AI should be entitled to ingest
and spin out everything that there is.
Speaker 3 (50:10):
Right?
Speaker 2 (50:11):
And again, it's super controversial to say so, but I
would like to live in a world where that's okay.
But I would also like to live in a world
where artists don't need to copyright and IP their stuff
in order to be kept alive. So there are some
real fundamental questions back there. But no, we live
in a media space that turns artists into legends, and
(50:35):
there's no need for them to be.
Speaker 3 (50:40):
Yeah, I mean, but even before artists were legends, there
were great artists doing horrible things. So it's one
of those things.
Speaker 2 (50:46):
Da Vinci or whatever.
Speaker 3 (50:47):
Yes, exactly. So we don't know the crimes that everybody did.
But I do want to point something out: when you
speak, this man gives you
such space, and I'm reading that as love. Just saying—I'm
reading love and respect in what I
(51:07):
just said. I'm touched by that. I'm touched by this, honestly.
That's a—
Speaker 5 (51:12):
Very he was using what I said to inspire what
he was going to say.
Speaker 3 (51:16):
Yeah, because you.
Speaker 5 (51:17):
See, he speaks in a way where there's
a clear beginning, middle, and end, and there's an arc, and
there's, like, a thesis. So he was developing the thesis.
Speaker 3 (51:27):
He's very good at that. I believe, I feel strongly,
that the art and the artist are not the same thing,
and that we can enjoy the art of anybody, no
matter how—Kanye West. I just listened to the song
that he did with Paul McCartney and Rihanna
(51:48):
on my walk yesterday, and it's good. And
he's horrible.
Speaker 2 (51:51):
Listen to Kanye's gospel work. He directed a gospel choir. Beautiful, beautiful,
genius stuff.
Speaker 3 (51:58):
But so fucked up. He's got problems.
Speaker 5 (52:02):
I don't know. Kanye is just the one—I can't,
I can't do it.
Speaker 3 (52:04):
He's horrible, and it's so current. But what's good for
me is I hate most of his stuff, so it's
great that I don't have to decide, because his stuff
is so not in my wheelhouse. P. Diddy, I
don't know—same, same musically, I don't care.
But there are people who I think are great. Frank Sinatra.
(52:27):
I made my son's middle name Sinatra. He's the greatest.
Speaker 5 (52:31):
You guys, I need to say something. Yeah, there
was a tarot reading on my For You page on TikTok,
and it was like, in the next three days,
someone's gonna mention Frank Sinatra in a very important way,
and that's going to be a sign that your
manifestations are coming true. And that's crazy, because
it happened,
(52:52):
like, two days ago, and I was like, this is
not going to happen. It's Frank Sinatra—and because
I go on OX in the car every time.
Speaker 3 (52:57):
This couldn't be deeper proof of that.
Speaker 5 (52:59):
That's kind of.
Speaker 3 (53:01):
Yeah. And I'm not that metaphysical. I don't believe in
the occult, but I kind of do.
Speaker 5 (53:07):
That's weird. It's like, that's just weird, Okay, I do anyway?
Speaker 3 (53:11):
So my son's middle name is Sinatra, and Frank Sinatra, again,
did some great things and did some horrible things
and was interesting—but a great artist, and his music's fantastic.
So I don't mind. But I hope it's manifesting—whatever
you're manifesting, it sounds good.
Speaker 5 (53:27):
Yeah. Do you guys have crystals?
Speaker 3 (53:30):
I did.
Speaker 5 (53:30):
I did a June thing where I put water
outside on the full—on the new moon, and I
wrote a bunch of things that I wanted
to happen, and it's all happening. All right.
Speaker 3 (53:43):
I have a question for you, for her and for
my son. Going forward, in the pessimistic world that they
may be living in, what's the best way for them
to carve out a decent working life? That's on
an individual basis. How
(54:07):
can they affect the greater
Speaker 2 (54:09):
World? By exercising compassion in face-to-face interactions. Yeah, I
really think—you know, and this is from somebody who's published,
you know, twenty-five books or however many, and reached
millions of people with television documentaries and all that—I still suspect
my greatest impact might be just in hand-to-hand,
face-to-face interactions with others.
Speaker 3 (54:30):
Yeah.
Speaker 2 (54:30):
Yeah, be fucking nice, right, be nice?
Speaker 3 (54:34):
Be nice. Well, that's, uh, you know, that's good
marketing advice too, by the way. If you're nice to
a lot of people, people want to work with you
and be around you.
Speaker 2 (54:43):
Be nice, breathe, you know, and be present. I mean,
I feel like we're all kind of doulas for each other.
We're all, in a sense, death doulas. We're just here
helping each other get through each day.
Speaker 3 (54:55):
Right. Well, I don't like to think of it in
those terms, but I guess it's true that we're
surviving something.
Speaker 2 (55:03):
We're all facing the existential truth of this existence. It's hard.
Speaker 3 (55:07):
But there's another truth that I heard you mention, which
I also believed before you mentioned it. There's
something about the natural world where there's equilibrium.
We talk about the forests—the trees that
talk to each other, and the smaller trees are helped
by the larger trees, and the larger trees help the smaller trees,
and we're all connective tissue with each other somehow. I say "Sinatra,"
(55:31):
and your life becomes better. There's something going
on that may lift us up. And so if we're
contributing to the kindness of the world, contributing to the
ethics of the world on a small basis every day,
it may help, you know, save the—
Speaker 2 (55:47):
Unseen resonances and connections, yes, are real. Lisa
Simpson knew all that. She's right. She's very smart. Lisa
Simpson is very smart. Well, thank you both for being here.
I really do appreciate it. This show has
matched beyond what I hoped it would be, because I really
didn't know you at all. I didn't really—
Speaker 3 (56:07):
Know him at all. I was just a fan—I'm
a super fan—and this met so many of my
expectations, and I was happy to connect. And
I feel like we did have a connection, and I
feel good about that, and I feel we might have
a connection at some point, though generations separate us. Anyway,
thank you for being here. Again, thank you, the audience,
(56:28):
for being there. Thank you for tuning in. Please do
what I'm doing, which is sit with people, talk with them,
have a connective experience. You're gonna be so much better off.
And I'm gonna really appreciate it if you
do, because I'll feel good, like I'm doing something.
This was for a reason, right? And that makes me
(56:49):
feel good. All right. See you next time.
Speaker 2 (56:51):
Don't be alone with