
June 22, 2023 49 mins

An unexpected visitor returns this week. It's Professor Scientist and he's in a mood. His anxiety is rising because we're all aging and well... lately, everyone's in a state of panic about AI wiping out humanity. Is it just a marketing tool? Or are there real dangers? Discussed: storytelling, Leon Trotsky, mortality, ponies. 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:19):
Hey, and welcome to What Future. I'm your host, Josh Topolsky,
And before we get into the show, I want to
talk about traveling. I have a very particular feeling about traveling,
particularly traveling by plane, and having just done some traveling,
I have certain points of dread, I would say, in

(00:40):
the experience, and I don't know if everybody has these.
I don't know if everyone experiences travel quite the way
that I do. Like, I feel like when I'm in an
airport and I look at people, people seem... generally, now,
this could be a projection or whatever, me reading into things,
but I feel like people at the airport are having a
better time than I am. Like, when I look around,
I feel like everybody's enjoying what they're doing or enjoying

(01:04):
their trip more than I am. For me, a trip
is just a series of hurdles that I've got to
vault over, a series of anxiety portals that I must
pass through until I get to my destination. And I
think some of that is driven by my... well, I
think I have a problem with flying, because it takes

(01:25):
me completely out of control of the situation and I'm
not a good passenger anywhere, Like in a car or whatever.
I don't like to take trains that much. But on
a plane you feel especially powerless and out of control.
And you know, of course, typically when
a plane crashes, which doesn't happen that often, but it
does happen, it's not like, oh, there were some injuries.
You know, a car crash is like, oh, the
guy got his arm broken, or, well, he had to

(01:46):
be... he was in the hospital for six months, but
you know, then he recovered, or well, he'll never walk again,
but he's still alive. I mean, of course, people die
in car accidents. In fact, you know,
you're more likely to die in a car accident than
you are on a plane. But there's that kind
of feeling, like, well, if the plane does crash, that's
it for me. I'm toast. And then, you know,
there's the whole thing with seats because I'm very tall,

(02:09):
and, uh, it can be very uncomfortable. It can
be very... hold on, I'm getting a call, which I
need to take. I understand, this is very bad timing.

(02:30):
But one second, hello.

Speaker 2 (02:35):
Josh. Yes, sir. You know who this is, right? Well,
I mean, there's only one person who can cut right
in like this. Who is this?

Speaker 1 (02:46):
It's the professor. It's always the professor. It's only the professor.

Speaker 2 (02:50):
Professor Scientist, sir. Professor Scientist, sir. Professor Scientist, sir. Yes,

(03:13):
I like to take control immediately.

Speaker 1 (03:15):
That's good.

Speaker 2 (03:16):
You know you hear my laugh right now, right? And
I do so enjoy our conversations, both on air and
off air. Yes, but I will tell you this. I
come into this conversation today with a shit ton of anxiety. And
imagine a ton of shit. Okay, any type of excrement

(03:39):
I've got, I've got it. Do you really? I've got it
in my mind's eye right now. I've got it in
my mind's eye, and it's very detailed. It's on your front doorstep.
It's more than the doorstep. It's in your yard. Okay,
your château is about...

Speaker 1 (03:57):
Could be good. I mean you have to stop.

Speaker 2 (04:00):
I held it from going out, inquiring, what is that
thing? Don't go anywhere near that. Get away from that pile
of shit.

Speaker 1 (04:07):
Yeah.

Speaker 2 (04:07):
It's a ton, yeah, of anxiety, of anxiety, and I'm
not enjoying it. And even though I'm giggling about it, yeah,
I'm very serious.

Speaker 1 (04:17):
Well, giggling is just a defense mechanism against anxiety. Yeah,
I feel like I'm doing a bad job at everything,
and that makes me feel anxious. But I'm not so
anxious that I'm going to do a better job. If
that makes you feel any better, So.

Speaker 2 (04:34):
I should write that down. That would be like a
perfect little piece of wisdom in your in your book.

Speaker 1 (04:39):
Maybe I should write a book of little little witticisms
like that.

Speaker 2 (04:42):
Yeah, exactly. All right, my anxiety levels are for shit.

Speaker 1 (04:45):
What's causing you anxiety? You should be relaxed, man.
I should be. Yeah, well, I know you've got a
lot of stress, because you've got big projects,
you've got to deal with big people, big personalities.

Speaker 2 (04:55):
It's not just stress. It's just I don't know. I
think stress is more of an external force and anxiety
is more of an internal force.

Speaker 1 (05:06):
That sounds like something that a very smart therapist would say.

Speaker 2 (05:09):
No, I just I just said it. I just blurted
that out.

Speaker 1 (05:12):
You should write a book of witticisms.

Speaker 2 (05:15):
Oh you want to hear some of my witticisms. Yes,
I said this to somebody today. I can be accused
of being a little self-righteous, which I'm okay with. Yeah,
because I can be. I have a little bit of
a bar.

Speaker 1 (05:27):
So yeah, I'll say yeah.

Speaker 2 (05:29):
And I came up with this thought years ago that
when I feel like I have a leg to stand on,
which to me indicates I'm right, I'm just right about it.
I'm not always right, but in this particular moment,
it's irrefutable. I am right. Somebody's wrong, somebody's lying, somebody's
doing something.

Speaker 1 (05:50):
Yeah, if I.

Speaker 2 (05:51):
Have a leg to stand on, I take the leg
I'm not standing on and I'll beat somebody over the
head with it. I should write that one down. Figuratively? Literally? Yeah,
figuratively. Both your legs are attached permanently. Yeah, it's
not like I'm gonna go beat anybody.

Speaker 1 (06:06):
Up. Without naming names, can you give me a basic
structure of what's causing the anxiety?

Speaker 2 (06:12):
Well, scaffolding. It's, uh... mortality has always been an issue
for me.

Speaker 1 (06:23):
Aging? Mortality is an issue for everybody.

Speaker 2 (06:26):
Yeah, but I have an acute sense of my mortality,
as my shrink says, which I do.

Speaker 1 (06:33):
That's interesting, acute.

Speaker 2 (06:35):
But what I've never really been conscious of, not
in a silly way, or... is the
word blithe? No... not in a blithe way,
is, uh, aging. But I've got aging on the brain.

Speaker 1 (06:54):
Now, this is an interesting topic, because I've
had a similar thing on my mind.

Speaker 2 (07:00):
And I'm not feeling my age, but I'm conscious of
my age. And that didn't happen until a handful of
years ago. And through the handful of years I had
a couple dear friends pass away, and we were all
around the same age. And I've been thinking about
my age, which was not part of my internal dialogue

(07:21):
right, and now it's there every day. Why? What are
you thinking about your age?

Speaker 1 (07:26):
Well, I will say that, for about as long
as I can remember, I've always felt that I
sort of didn't have enough time to accomplish all
the things that I wanted to accomplish, even though I'm
a master procrastinator and will definitely sit on shit indefinitely.
But what's interesting is I don't think at all about

(07:48):
at least not consciously. I don't spend a lot of
time thinking about mortality, nor do I think about my age.
And if I have any thought about my age, it
is that I don't feel dramatically different or older than
I did twenty years ago. By the way, I don't either. Right, okay.
So now, Laura, my wife, likes to point out

(08:09):
that we are old all the time, like on a
regular basis. She will say stuff like we're old now.

Speaker 2 (08:14):
Yeah, that's a slippery slope.

Speaker 1 (08:16):
It makes me feel kind of shitty, because I
don't feel old, and I don't identify
as, like, an old person, and I don't feel like
I have somehow done all the
things I'm going to do or lived all of the
life I'm going to live. But it makes me think
about it. I'm like, well, am I just deluding myself?
How deluded am I about who I am and where

(08:36):
I sit on the spectrum
of time? So, you know, it pops up occasionally. I mean,
it's funny, because every time I talk about my age
on the show, I say, oh, I'm very close to dying.
But it's a joke, because I'm obviously not.
I like to think that I'm not. But you know,
I might have to talk to Laura, I might have to
sit Laura down. You should talk to her about it,
because I think she's creating a real bad... it creates
a really bad vibe.

Speaker 2 (08:57):
It becomes a slippery slope, self fulfilling prophecy.

Speaker 1 (09:01):
Well, and I think it's like, why think about
yourself like that? Why think about it in these terms
of, like, old or young? I don't feel any age.

Speaker 2 (09:08):
I never did until about five years ago.

Speaker 1 (09:10):
Yeah, and I'm not. Nobody's dying in my world. I mean,
I'm just hanging out, just livin'.

Speaker 2 (09:16):
You know, all right, I'm going to jump around a
little bit, but I'm in a mood today. Here we go,
here we go. I saw a documentary, it's about the
No No people, which I had never heard of before. Okay,
but the No No people were the West Coast Japanese,
primarily, Japanese people who were interned during World War Two, right?

(09:40):
And there was a twenty-eight-question questionnaire that was
given to all these people as they were about
to be interned. As it was called, like, number
twenty-seven is, will you take up arms on behalf of
the United States, right? Basically. And then number twenty-eight
was some version of, do you sympathize or side with

(10:07):
Japan, right? So these people wouldn't answer those questions. Hence
they answered twenty-six, but on twenty-seven and
twenty-eight they wouldn't answer, and they became known as
the No No people, right. And they were sent
to a harsher internment camp called Tule Lake, right, right,

(10:29):
So all of it was harsh, but that camp was a little
bit more harsh, and these people were seen as real troublemakers.
But in fact, so many of the Japanese people, as
it turns out, were, you know... God love them,
this is a beautiful race
of people. And they weren't rioting. They
weren't, like, pulling a January sixth on anybody, right. They

(10:52):
certainly weren't happy.

Speaker 1 (10:53):
Only the only white people do stuff like that.

Speaker 2 (10:55):
Fact. By the way, only white people do that. Because
here's my question. If we were to think, okay,
because of geography and proximity... okay, so Asians, and they
were often referred to as the West Coast Asians, right?
So it makes you think probably not too many Asians
in America at that time on the East Coast in
relationship to the West Coast. But if we imprisoned all

(11:18):
the Japanese people, did we try and imprison any of
the German people on the East coast?

Speaker 1 (11:25):
I mean, this is a whole different...

Speaker 2 (11:28):
And why did we not? Oh? Because they looked like us,
because they were a bunch of white people.

Speaker 1 (11:35):
This is, uh... but also there were a huge amount of Nazi
sympathizers in America.

Speaker 2 (11:40):
Oh yeah, I made that point to Amanda
earlier in the day.

Speaker 1 (11:43):
A huge amount, and still to this day,
and historically, for sure. Just imagine thinking you're
the center of the known universe for your entire existence,
and the existence of all of your ancestors, just thinking
that you control and rule all of the earth, and
discovering that that's not the case, and how little it

(12:08):
must make you feel deep down. I mean I can
only imagine. I mean, I'm a white man, I'm Jewish,
so I'm a minority, basically. We're a certain kind of minority.
I look like a white guy,
but I definitely will be killed as soon as the
Nazis get to power.

Speaker 2 (12:24):
Oh I shouldn't laugh at that one.

Speaker 1 (12:26):
If the guys in New York who are the Hasidic
Jews can identify me and ask me if I want
to pray with them on the street, they'll just pull
me over and say do you want to? That happens,
Oh yeah, it's a very common thing that happens in
New York. They try to get something called a minyan together,
which is a group of men praying. I've heard of that.
and at any rate, but if they can identify me,

(12:46):
I guarantee you whoever, like Donald Trump Junior, can identify
that I'm Jewish as well. But yeah, no, I mean, like,
I don't know what it's like... I don't know what
it's like to feel so good about myself.
I don't know, because I don't think any Jewish person
knows this. To feel really good about, like, who you
are and what you've done? I'm not sure that's
an emotion I can get in touch with. But

(13:08):
But these Europeans, these white people, I mean, they've
just been riding this wave of success, you know, seeming success,
and I think it's very difficult
to imagine a world where you are not the top dog.
And of course, there's only one
solution for them, because that world's imminent, that world is

(13:29):
here. The only solution for them is to kill,
is to do, just, like, a genocide, because,
like, there's no way... well, they can't stop it, they can't.
They know they can't stop it. I mean, the shit
that's going on in America with these Nazis and stuff,
like actual Nazis, and this Ron DeSantis shit, is just
a pure expression of, like, the last gasp of this

(13:51):
kind of European, uh, Eurocentric white culture that they've built
up, and that they thought was forever,
was going to be forever. And unfortunately, it's not. Very
sad for them. So, I mean, but you know, they
might just, like, atom bomb us, you know,
they've done it before.

Speaker 2 (14:08):
You're giving me segues and you don't even know it.

Speaker 1 (14:11):
I don't I don't know it. I have no idea
what we're talking about or why we're talking about it.

Speaker 2 (14:14):
But we're in lockstep today on lots of stuff. Often,
often, you and I are combative.

Speaker 1 (14:20):
I was gonna say, often we are arguing. That's interesting.

Speaker 2 (14:26):
We like to we like to rib each other a lot,
But today I'm not in a ribbing mood.

Speaker 1 (14:31):
No, you're too anxiety-ridden to rib. I'm too anxious,
too much thinking about mortality.

Speaker 2 (14:36):
Did you read the latest report out of the BBC on
AI? As if everybody in the world isn't talking about AI.

Speaker 1 (14:43):
But no... no, but I'm happy, let's engage on this.
I'm ready. What's the report? Oh, the one about how
it makes us extinct or whatever?

Speaker 2 (14:49):
Yes, even the... I think even the guy who created
it, or is funding it.

Speaker 1 (14:55):
And the guy who runs OpenAI is very worried.
Interesting marketing tactic, I would say, to make your product
seem extremely powerful and valuable. A couple of things on that.
First off, like... I would love to see it. I'd
love to be wiped out by an AI, personally
speaking, on just a straight-up basic level.

Speaker 2 (15:14):
It just lasers you from outer space.

Speaker 1 (15:17):
We'd be so lucky to have the AI turn on us
and wipe us out.

Speaker 2 (15:21):
Man, How does it wipe you out? Does a laser
just come from outer space?

Speaker 1 (15:25):
Yeah, lasers, sure, fine. Listen, you do
have to game it out a little bit, right? Because
today the AI, uh... it doesn't do a lot.
And also, it's not really AI, it's a language
model. Which... because humans are very, very dumb,
I think we should just... first, I want to lay
this out there: we're dumb. We're dumb because, I mean,

(15:46):
storytelling is what drives humanity. And by the way,
my career in journalism and in other forms of
storytelling has been very much driven, personally, by this
idea that telling a story the right way, or finding
the story to tell and telling it, has huge,
huge value for people. It can change someone's life, it
can change someone's mind, it can

(16:07):
reorient you in the world, and it's, like, very powerful, right?
So storytelling is extremely important, but it is also, like,
the basis of all humanity, essentially, right? This
is the only way that we ever get anywhere:
by convincing each other of a narrative, right? Like,
gay marriage is a good example. Right, gay marriage
happened in America largely because there was

(16:29):
a narrative that started to make sense to people, that
you could tell them a story about what marriage was
and what it meant that was different than the one
they had been told previously. And they could
understand it, and they'd buy into it. And
I think so much of that had to do with,
like, changing the narrative, like literally changing the conversation and
the story that we tell.

Speaker 2 (17:00):
What do you think the dude is doing?

Speaker 1 (17:01):
Who?

Speaker 2 (17:02):
Who? What's his name? Stuart?

Speaker 1 (17:04):
Who, Sam Altman? Sam runs OpenAI.

Speaker 2 (17:06):
What's he doing when he says, hey, this thing
is really dangerous?

Speaker 1 (17:09):
Basically, I think he's saying, my company is so valuable
and so powerful, we've created something that even I don't
understand the power of. And I hope... I
hope you, the government of America, I hope your
military will invest heavily, billions of dollars, in my technology,
to use to learn how to, you know, bomb

(17:30):
people better or whatever. And the NSA would want it
to help them learn about what people's behaviors are. And,
uh, you know, other companies will want to buy it. And he's like, well,
it's so powerful, I don't know. I mean, maybe you
can throw me a few bucks, we could figure it
out together. I think he's a master marketer. And
here's what it is. I mean, this model tells
us stories really well, and we're so fucking stupid. We're

(17:53):
so dumb. And I gotta say, just monkey-level stupidity
here with humanity, that the better the story is,
the more believable it is to us, regardless of
whether it is actually believable or has any facts
in it. Like, there's a great story that's been going
around about this lawyer who presented this brief
or something in a case, and it cited

(18:14):
all of these different cases that he had used ChatGPT
to get, and all of the cases were made
up, with, like, extremely detailed citations. But every one of
the cases was invented by ChatGPT, because its job
is to model what it believes we want it to tell us, right?
It models what it thinks the next

(18:35):
part of the conversation is. So we're in this state
of total panic, because, like, what else could it do?
But I think... a couple of things there.

Speaker 2 (18:44):
Let me interrupt you, because when you talk about
predictiveness meets AI meets telling a story meets marketing... you know,
have I ever told you to read this book called
If Then by Jill Lepore?

Speaker 1 (18:59):
I have been reading it, actually. I started reading it because
you told me, that's right. I
fell off of it. I'm in the middle of it.
You never finished it? It's great, it's great. It's
a great book.

Speaker 2 (19:08):
It's really a great book. And she's a great writer,
terrific, one of the best. What she points out,
and I won't ruin it for people, is the way
that this marketer, you know, Madison Avenue guy, got hold
of a computer, so to speak, and guys who knew
how to operate computers in the earliest, earliest days, and
how their initial interest was... how do we get...

(19:31):
I think it was, how do we get a person
to switch from one cigarette to another? And it went through
multiple hands, political hands, and eventually, years later, started to predict
where we were supposed to bomb certain villages in Vietnam.

Speaker 1 (19:51):
Right, Kissinger got his hands on it, and it was
off to the races. Yeah, I assume. I don't know,
but he's one of the greatest war criminals we've ever known.

Speaker 2 (19:59):
I know you have a passion for him.

Speaker 1 (20:01):
No, I don't, really, actually. It's just... anyhow, he's been
in the news because he just turned one hundred. But God.

Speaker 2 (20:07):
Where was I? So the guy, he's a great marketer,
and he just...

Speaker 1 (20:11):
Talking about the stories, the stories. It's very good.

Speaker 2 (20:13):
And then the other guy, the attorney, creates all those things.

Speaker 1 (20:16):
But it's just excellent at shaping something that
feels so alive, feels so real, that we start to
ascribe all these qualities to it, give it all
this power, like money, in a way, actually,
you know. And so we're all in a panic now
about AI wiping out humanity. And again, I have to say,

(20:37):
I strongly advocate for the wiping out of humanity at
the hands of AI, if it can. But it cannot.
I think it's hard for us to imagine the mind
of a person who comes to this not where
we have come to it, midlife, already having
learned a bunch of other habits, but somebody who's coming

(20:57):
to it brand new. Will they be more addicted? Will
they be less addicted? Will they find the things that
we find, uh, so fascinating less interesting? I think there's
a really, like, wild kind of set of
possibilities that have nothing to do with anything that we
already think that we know about how we use these
devices. Because I don't think that we have, as
a species, actually begun to even understand what they

(21:21):
do at all. So, you know, the counterargument
could be, well, that's why we're going to blow
ourselves up with AI. I think right now, the danger
that AI poses is that it makes
faking things very easy, and misinformation, more than anything. And
by the way, going back to the narrative of storytelling,
misinformation is far more dangerous than the atom bomb in

(21:43):
a lot of ways, right? Like the... oh, fucking snore.
Snore if you want, professor.

Speaker 2 (21:48):
You're stating the obvious. I can't believe that
you're... here, am I? Do you think the most
dangerous part is faking, creating falsehoods? So...

Speaker 1 (21:56):
It is now, I think, at this moment. In this moment.

Speaker 2 (21:59):
And do you think that that's actually going to get
better? Because these fake, false moments almost just created
a civil war.

Speaker 1 (22:06):
And you think... when? Where?

Speaker 2 (22:09):
January sixth? It could have, maybe, but...

Speaker 1 (22:11):
But AI had nothing to do with that. That was
just regular old people doing their thing.

Speaker 2 (22:16):
Well, AI didn't have anything to do with it, but
false news had something to do with it, right?

Speaker 1 (22:21):
But that's not even a thing. That's a
fake idea too. Fake news is a fake concept.

Speaker 2 (22:26):
But you don't think... you just said we're going to
be able to create more falsehoods. Yes, through
information. Just information.

Speaker 1 (22:34):
Call it what you will, but that's what the fucking
Protocols of the Elders of Zion is. And it's been
around forever. It's Behold a Pale Horse. It's all this
fucking Illuminati shit. It's all the same thing. It's
just a huge pile of secrets and misinformation that
a certain segment of the population will fucking buy into,
just like trickle-down economics and all that bullshit. I

(22:56):
hate to do it. I hate to cite this, but
here it is.

Speaker 2 (22:59):
But go ahead and say it.

Speaker 1 (23:00):
Well, there's a great piece of Trotsky writing.

Speaker 2 (23:05):
You slip... when you slip into Leon Trotsky. Well,
I know. I'm not... I'm not...

Speaker 1 (23:12):
A hardliner or anything, just saying. He wrote a
piece in nineteen oh one called On Optimism and Pessimism,
and I think about it all the time. I'm just
going to read you the last lines, because the
last two lines are the ones that are important. I'm just gonna read
them to you. Surrender, you pathetic dreamer. Here I am,
your long-awaited twentieth century, your future. By the way,

(23:33):
this piece just details how horrible society is in the
twentieth century. This is when he wrote it, in nineteen oh one, right,
at the start of the twentieth century. Surrender, you
pathetic dreamer. Here I am, your long-awaited twentieth century,
your future. And the last line is: No, replies the
unhumbled optimist, you are only the present. I think about
this all the time, every day, that we are in

(23:54):
this mode of envisioning that we are in the end state.
But we're not in the end state. We're, like, in
the opening innings. We're in the opening innings. And I've
said this to you before, and you'll disagree. Humanity
is not going to be destroyed by a global pandemic.
Let me, let me, it's not gonna be destroyed. And

(24:16):
I'm agreeing with you, by a nuclear conflagration, I agree.
It's going to be destroyed when we invent a pair
of shoes that let you jump very high, and then
it turns out one day they go haywire and everybody's
legs start flying off because of the shoes. And that's how
we're gonna... it's gonna be something so fucking stupid and unexpected.
You know, it's like... it's actually like the pandemic. I

(24:36):
think a lot of it is, like,
you know, we thought it'd be zombies and fucking buildings
on fire and nuclear missiles and whatever. The robots, right,
the fucking guys from The Matrix, the robots of The Matrix.
And what it actually is, is like, you've got to
sit in your house and work. You're not allowed to
go out. You can't go to the grocery store. That's
the apocalypse. That's our apocalypse. It's like, you've got to

(24:57):
be on slack with your coworkers while you know everybody's
getting sick around you. Anyhow, Listen, I don't know how
we got into this. I have no idea what we're
talking about.

Speaker 2 (25:06):
Let me take something back. No, let me take something
back, because I said I agree, but I don't agree.

Speaker 1 (25:12):
I don't even know. I'm not even sure what our
topic is.

Speaker 2 (25:15):
No, but you had said it's going to be some,
you know, some fucking tennis shoe... that the tennis shoe
is gonna wipe us.

Speaker 1 (25:22):
Out. As your legs fly off when you put
them on.

Speaker 2 (25:25):
I have heard you say that before. Yeah, No, I
think it's going to be bigger than that.

Speaker 1 (25:29):
Maybe. But the thing about AI is this: it
appears very scary because it does things that
seem like they are beyond understanding. I think that,
you know, if you look at what the real,
the interesting critics have said, and I think, to Amanda's point,
where its true danger lies at this point is in
the misuse of AI by human beings. But it's kind

(25:52):
of a people don't kill people, guns do or whatever argument,
like yeah, like ultimately a person has to pull the trigger,
but the gun is the thing that lets them kill.
And I think that, like, you know, we can have
the debate about, you know, what the true danger of
AI is, and it's both things, right? It's both

(26:13):
the technology and the people. But at this point,
we're so early in this game, and
what it's doing is such a parlor trick, and there's
no evidence that the parlor trick becomes... it can become
a more elaborate parlor trick, it'll become a very sophisticated
parlor trick. Is the AI sentient? Does it have a desire?

(26:33):
Does the AI want something? No, it doesn't, and we
don't know that it ever could. We have no idea.
There's no possibility that we could
know that you could make a computer system that has
a desire for something. It can do things we tell
it to do. It can do things that it thinks
we want it to do, or it thinks it should do

(26:54):
on its own. But that's not the same thing as,
like, a motivating factor, like a dream, right? A
dream is not just a random processing of information in
our brain. It's not just random. It's some combination of
the pieces of information, right, and it's some part of
us that is putting them together in a certain way.

(27:15):
People and machines aren't, like, one and the same if
you just make a machine that's complicated enough. So this
idea that, like, someday it will be fucking Skynet from
the Terminator movies is, like, kind of a weird, bad
human fantasy. That has been... I don't know, to me,
it feels like a little bit of a childish view
of the technology. Because, like, James Cameron wrote a

(27:37):
movie about a machine that becomes sentient and wants to
kill humans, we have basically decided that that's what the
machine's going to do the smarter it becomes. Like, I
don't know, the machine's probably going to be able to
see The Terminator, and it'll probably be like, huh, maybe
I shouldn't do that. That seems like it ends badly.
Like, it's not a good ending for the machines.

Speaker 2 (28:00):
You got a sequel first, and now... it ended badly.

Speaker 1 (28:03):
The best Terminator movie is Terminator three, starring Claire Danes,
And uh, I'll take that one to my grave.

Speaker 2 (28:10):
Wait, my favorite Alien movie? Alien 3.

Speaker 1 (28:14):
Well, that's interesting. Fincher. David Fincher's first feature film.

Speaker 2 (28:18):
I get shouted down by more people over that, but.

Speaker 1 (28:21):
You'll have to agree. You will agree with me. David
Fincher's best movie is Zodiac. Uh, probably, probably. No,
there can't be an argument there.

Speaker 2 (28:29):
I don't know. Alien 3.

Speaker 1 (28:31):
Man, you're saying Alien 3 is better than Zodiac? That's crazy.
That's just... that.

Speaker 2 (28:35):
Go back and revisit Alien 3 out there, all right?
And Charles Dutton, Charles Dutton staring down the alien...
come on, does it get better than that?

Speaker 1 (28:46):
All right, I do... it does make you want to
revisit it, I have to tell you. That scene.

Speaker 2 (28:50):
Charles Dutton, by the way, one of the great great actors,
never gets talked about.

Speaker 1 (28:54):
Charles Dutton. Roc! Roc. Whatever happened to that show? You
don't hear anything about it.

Speaker 2 (28:58):
Like a world class actor?

Speaker 1 (28:59):
On the stage, incredible. Okay, hold on, have we
gone through all your questions? Uh...

Speaker 2 (29:04):
Well, we didn't really cover as much of the... uh,
what are you laughing at me about?

Speaker 1 (29:09):
No, just... I just want to know what we
did cover. I just think it's funny.

Speaker 2 (29:12):
Thanks, exactly. Have you ever actually sat down
and read The Fountainhead or Atlas Shrugged? No?

Speaker 1 (29:17):
I wouldn't read that pornography. The Fountainhead? Come on. She sucks.

Speaker 2 (29:24):
I think I might have said this to you. There's
a fascinating early Mike Wallace interview, black and white, with
her. Oh yeah, it's great. Chilling, so chilling.

Speaker 1 (29:36):
He really puts her... you know, gives her
some tough questions. He...

Speaker 2 (29:39):
Puts her through her paces. But one of the most
remarkable things about that interview, and anybody who's listening. Really,
go go watch that interview. It will show you unless
you're an objectivist, it'll show you what a fraud this
person was. A big thinker, right, a great brain whatever,
basically says yeah, I just thought it up. Well, I

(29:59):
mean, I just imagined this thing one day. When he
starts asking her about, like, well, where do your inspirations
come from, what are the references, she's like, I just
thought it up.

Speaker 3 (30:10):
Well?

Speaker 1 (30:11):
Isn't that all ideas? Aren't they all, good and bad,
just somebody thinking of an idea?

Speaker 2 (30:15):
I suppose. But boy, that that many people caught on
then, and continue to catch on, is just remarkable to me.

Speaker 1 (30:22):
Well, people like to hear things that make them feel
good about the way that they behave. So, you know,
the thing about somebody like Ann Rand or Ayn Rand,
depending on who you talk to, is, you know, she
condones a lot of behavior that's basically selfish and shitty
and bad. And that's what the Republicans do often, right,
Like, it's about protecting your interests versus other people's, versus
thinking about a kind of space where other people should

(30:46):
be considered, which is, you know, in essence, the behavior
of a child. Right, the behavior of a person with
a very limited range of understanding.

Speaker 2 (30:56):
Which Wallace kind of gets into a little bit there. Well,
to get at it... it is.

Speaker 1 (31:00):
I'll tell you, you should create it. You should create
an app: The Professor Recommends. Because you've
recommended several pieces of content here during this conversation, and
they all sound very interesting.

Speaker 2 (31:11):
Sound by the way, I shot my wad.

Speaker 1 (31:12):
That was it.

Speaker 2 (31:13):
I got three.

Speaker 1 (31:14):
I get it. So those were your... those were your
three recommendations.

Speaker 2 (31:19):
Like I said, that's all I had.

Speaker 1 (31:21):
Well, soon you'll be shuffling off this mortal coil and
you won't have to worry about recommending things.

Speaker 2 (31:25):
That was brutal.

Speaker 1 (31:28):
Well, you know, just thinking about it, since you brought
it up at the beginning of the conversation.

Speaker 2 (31:33):
Were you going to speak at my funeral?

Speaker 1 (31:35):
Am I being asked? I would love to speak at your funeral.
I've got some big ideas.

Speaker 2 (31:38):
I would love for you to get up and pontificate
to the point where all you hear in
the audience is... ah.

Speaker 1 (31:45):
I was thinking about doing something a little more like
a Carrot Top type of routine, something with, like, props.

Speaker 2 (31:51):
That really would be.

Speaker 1 (31:52):
Funny, you know, just pulling some shit out of a bag.

Speaker 2 (31:55):
You know what I'm getting a lot of comfort from
in the midst of my... well, is it existential angst?
I'm not sure it's so existential.

Speaker 1 (32:03):
It's more angst or dread. It sounds to me more
like dread.

Speaker 2 (32:06):
It's more dread. You're right, it's more dread.

Speaker 1 (32:08):
Yeah, be careful there, you know. They're close, but they're
not exactly the same.

Speaker 2 (32:13):
What I'm getting some comfort from, and I mean it,
is images from the Webb telescope.

Speaker 1 (32:18):
Oh yeah, contemplating the vastness of reality.

Speaker 2 (32:22):
Talk about a speck.

Speaker 1 (32:25):
Well, that's an interesting one.

Speaker 2 (32:26):
Beyond the colors and the figures. Yeah, it's like, oh yeah,
there's so much out there.

Speaker 1 (32:32):
Well, you got to be careful though, because then you
really start to feel bad about yourself, about your insignificance
and the meaninglessness of all of your toils.

Speaker 2 (32:39):
No, that hasn't been the reaction. No,
I look at it and go, oh, could there be
another space-time continuum, matrix of it all?

Speaker 1 (32:51):
Yeah?

Speaker 2 (32:51):
Like, I want to be out there floating in the...

Speaker 1 (32:54):
Well, who knows what happens when you leave your,
you know, your physical body. Have you thought
about getting into video games, though?
Maybe if you really feel... I don't believe in it... despondent.

Speaker 2 (33:05):
I don't believe in it. You don't believe in it?
We talked about this before, I think. I recommend gaming
to everybody. You don't believe in blueberries. I don't believe
in video...

Speaker 1 (33:13):
I think? I don't believe in blueberries? Is this something
that came up?

Speaker 2 (33:16):
I believe you said this before. Last time we talked,
you one day chastised me in a pleasant enough way.

Speaker 1 (33:23):
You said this before, I recollect. And I...

Speaker 2 (33:27):
Was being foolish for believing in the antioxidants.

Speaker 1 (33:29):
You say I chastised you for eating blueberries?

Speaker 2 (33:32):
Believing in the antioxidant qualities of blueberries.

Speaker 1 (33:35):
Oh, yes, yes, I think that's probably some kind of scam.
It's that, to me... whenever
I hear... whenever somebody...

Speaker 2 (33:40):
Says... That is a scam to you, but AI is a positive thing?
a positive thing.

Speaker 1 (33:45):
Whenever I hear somebody say, no, I didn't say that.
I'm just saying that. I think we.

Speaker 2 (33:48):
I think we're going to quote the great mystic Mister
T: I pity the fool.

Speaker 1 (33:53):
No, whenever anybody says anything about a food, no matter
what it is, any quality the food is supposed to have,
I immediately think there's a complex system of bullshit that led
to this moment, whatever it is. I'm sure
the blueberries are healthy. I have no doubt broccoli is
very healthy. I'm sure there's all sorts of shit that's

(34:13):
really good for you. But I just feel like putting
too much...

Speaker 2 (34:16):
What's too much?

Speaker 1 (34:17):
Too much faith in the ability of a single item,
a food item.

Speaker 2 (34:22):
You'd rather not put any kind of faith behind a kind of
meaningful, healthful reaction?

Speaker 1 (34:30):
I think it's just, like, misguided. I think it's misguided.
You know, I don't believe in the Beatles,
I just believe in me, I guess is what I'm saying.

Speaker 2 (34:38):
God, that's another one. Please write that down.

Speaker 1 (34:40):
No, that's fucking John Lennon said that.

Speaker 2 (34:42):
Oh, oh that's right.

Speaker 1 (34:43):
I think it's actually in a song.

Speaker 2 (34:44):
I don't know enough about the Beatles.

Speaker 1 (34:46):
Excuse me? Oh really? No, I think it's... I want
to say it's in a song.

Speaker 2 (34:50):
I listen to the Beatles, but I don't, like... I
don't quote the Beatles, clearly.

Speaker 1 (34:53):
Some of Lennon's solo stuff is really pretty fucking amazing.

Speaker 2 (35:07):
Do you remember when the Paul McCartney death hoax happened?

Speaker 1 (35:10):
Oh he's still alive. Interesting, But do you.

Speaker 2 (35:12):
Remember the DJ in Detroit who started that rumor about
Paul McCartney?

Speaker 1 (35:17):
Yeah, yeah. Think about AI doing that, just widespread. Yeah,
what are we going to do? What are we going
to do?

Speaker 2 (35:23):
Buy into it?

Speaker 1 (35:25):
Brace for impact?

Speaker 2 (35:26):
I don't know. You're vacillating now. I don't understand. I'm
not quite clear.

Speaker 1 (35:29):
Are you joking? I'm joking. What's gonna happen?

Speaker 2 (35:31):
Oh wait, I have a question?

Speaker 1 (35:32):
Shoot, then. Shoot.

Speaker 2 (35:36):
Please shut up. Please just... Do you...

Speaker 1 (35:40):
I love listening to you speaking like, shut the fuck up?

Speaker 2 (35:42):
I didn't say the F word. Why is it, Professor Positive,
that when it comes to AI, it just seems to
be in the hands of diabolical people?

Speaker 1 (35:56):
Yeah?

Speaker 2 (35:56):
And we on the left are always, you know, under
the sword of Damocles, and the guys on the right
seem to use it to their advantage.

Speaker 1 (36:10):
Yeah, because this is the classic bringing a knife to
a gunfight situation. Like, Democrats are...

Speaker 2 (36:16):
Mean, but do you really think they're going to stand
down at some point and say, all right, your turn?
Who the guys on they're only going to get better
at it, They're going to get more efficient and brutal.

Speaker 1 (36:26):
Well, the fucking... but no. But the other side is,
like, the Joker. You know, like, they're like... what is it? No,
you know, the Joker from...

Speaker 2 (36:33):
Batman? The Joker... the guy they call the Joker, the
center for the Denver Nuggets.

Speaker 1 (36:37):
The Joker. Keep going. Well, I wouldn't know anything about
that, because I hate sports. You know, he's like an
agent of chaos, right? He just does whatever
the fuck. I think that's... uh, you can't have,
like, a rational... this is like the Nazi thing, right?
This is like, you can't be tolerant of
the intolerant, right? You can practice tolerance, right, that's a
good thing. But then when you go, well, but I

(36:59):
have to be tolerant of people who think that
I should not exist, then you reach a kind of,
like, a threshold.

Speaker 2 (37:05):
And I think that... I don't subscribe to that at all.
That's where I go back to taking the leg you're
not standing on and beating somebody over the head with it.

Speaker 1 (37:12):
Yeah, but there's a whole set of
people that are, like, we're trying to participate in this
thing called reality and in truth. And remember,
we're not perfect all the time. There's a bunch of
shitty Democrats and people on the left who are
just as stupid and bad as people on the right.
But there are limits to, and ways of being,

(37:34):
that they will never go into. They will never be, like,
let's exterminate an entire set of people. They're just
not going to say that. On the other hand,
on the flip side, there are people who are, like,
we should have an only white nation, a totally white nation,
like, just a crazy, unhinged, fucking anti-humanity statement.

Speaker 2 (37:53):
Well, that does border on extermination, right? Or just,
like, sending people off in boats.

Speaker 1 (38:00):
You can't have, like, a healthy debate with those people. Yes,
because they're going to take every possibility,
everything they can use to create an environment
of shit. They're going to do it, because they don't care.
It's kind of in sync with a lot of the
religious thinking of like this world doesn't matter, this life
doesn't matter, and that there's something better waiting for you.

Speaker 2 (38:22):
Yes, absolutely, Mike Pence believes in... yeah.

Speaker 1 (38:25):
You know, it's like, why do the evangelicals follow
a guy like Donald Trump? Some of the worst terrorists
in America are evangelical Christians. Like, why? We should interrogate that. Yeah,
it's just like any fundamentalist terrorism. But as a
matter of fact, they want it to come, the Rapture. Yeah, right, exactly.
There's this weird, uh... it's a death cult, right? Anyhow, God,

(38:47):
we're way... we're so far afield. I don't even know
what we're talking about anymore, but...

Speaker 2 (38:51):
I think it's been fascinating. Oh well, I mean, we
certainly like to talk, like, top-shelf stuff.

Speaker 1 (38:57):
I mean, I don't know.

Speaker 2 (38:58):
I can't tell anymore in our business. You're... you're the Courvoisier
to me. You're the Courvoisier, okay, of podcasts.

Speaker 1 (39:05):
Is that good? I'd rather be the top shelf. What's
that very expensive whiskey? It's like... Pappy Van Winkle.

Speaker 2 (39:14):
Oh, there is a thing called Pappy Van Winkle.

Speaker 1 (39:16):
Yeah, Pappy Van Winkle is, like, the most expensive whiskey
you can buy. It's very rare, special. It's,
like, five thousand dollars a bottle or something. Wow. Yeah,
I'm the Pappy Van Winkle of podcasting.

Speaker 2 (39:28):
If this podcast clicks, like, in a big way... it's
not going to... we're just rolling in the dough. Yeah, oh,
I think I'm going to buy you a bottle.

Speaker 1 (39:38):
I guess they haven't given you the numbers. I don't
think you're in any danger of buying me a bottle
of Pappy Van Winkle. Let's put it that way.

Speaker 2 (39:45):
Maybe a shot glass of Pappy.

Speaker 1 (39:47):
Van Winkle, maybe. Yeah, I think that's possible.

Speaker 2 (39:50):
Do we have anything else? Do you have anything for me?

Speaker 1 (39:53):
For you? Yeah, well... you know, I'm scared. I'm scared, scared.
And yeah... why? I don't understand. Don't be such a rube. Listen,
you're a successful man. You've more than proven your value.
You've more than proven your worth. You've made
incredible things. You continue to influence to this day. You

(40:17):
have a circle of friends that love and adore you
and think the world of you.

Speaker 2 (40:21):
Get a little choked up, get a little misty.

Speaker 1 (40:23):
And employees that fear you, that cower in fear. I've
seen them in person. They absolutely don't even want to
walk a little bit in front of you, because they're
afraid they'll get knocked down.

Speaker 2 (40:33):
Don't look at me, don't eyeball me. I think, often,
I say to them, don't eyeball me.

Speaker 1 (40:37):
What I would ask you is what's missing? What don't
you have? What did you want that you haven't gotten?
You know, where is it? What is it?

Speaker 2 (40:43):
Oh, God, I wish I could handle that on the air.

Speaker 1 (40:45):
I mean, name one thing that you wanted that you
haven't gotten.

Speaker 2 (40:49):
Ugh, I can't... a pony in my backyard. I often use
that expression.

Speaker 1 (40:53):
You could have it, though. Nothing stopping you. Nothing stopping you.

Speaker 2 (40:56):
By the way, I did say that to a friend
of mine one time, whom I will not name.

Speaker 1 (41:00):
But the next day, there's a pony in your backyard.

Speaker 2 (41:03):
He sent a pony into my office. Right, exactly, this
is what I'm talking about. He did have a woman
bring a pony by and parade it up and down my
hallway for about an hour.

Speaker 1 (41:11):
You live the life of a king. What is it?

Speaker 2 (41:14):
Just?

Speaker 1 (41:14):
I want you to say, I want to know. I
really want to know one thing you wanted or that
you've wanted really badly, truly, that you have not been
able to.

Speaker 2 (41:22):
Get. Uh, let's end it on this one. Yeah, having
a dad would have been good. Okay, that's... you asked
for it, I gave it to you. I mean, that's good.

Speaker 1 (41:39):
As a guy with a father, I got to say,
it's not that great. I mean he's fine. He's fine.

Speaker 2 (41:44):
By the way, wherever your dad is right now, he
winced, and he said to your mom, he's like, ugh...

Speaker 1 (41:54):
The more criticism in my family, the better,
to be honest with you. He's embracing it.

Speaker 2 (42:00):
Well, maybe I should spend more time with your dad.

Speaker 1 (42:02):
Well, you didn't know your... you didn't know your dad
at all.

Speaker 2 (42:04):
Oh, let's not go there now, let's.

Speaker 1 (42:06):
Get into it now. Let's do it. No? Another two hours,
we'll solve it all.

Speaker 2 (42:10):
A father would have been nice to have had. Yeah. Like, okay,
I was raised in a single-parent, fantastic-mother household.

Speaker 1 (42:19):
Yeah, and look at what it made you.
You wouldn't be... if you'd grown up with a dad, you
might not be you, to be honest. You know, think about it.

Speaker 2 (42:27):
Ah, Mister Butterfly Effect. Look at... the thing, you know,
they call it the butterfly effect, right? Yeah, there
we go.

Speaker 1 (42:33):
Well, no, but it's true. I mean, I'm sure it
drove you in all sorts of different ways, just
like all of my failings and my needs and wants
have driven me in different ways.

Speaker 2 (42:42):
Oh, I got one final question for you. It just popped
into my head earlier in the day, early
in the conversation.

Speaker 1 (42:47):
I'm ready.

Speaker 2 (42:48):
You are king of the world. Okay, you're omniscient, You're omnipotent.

Speaker 1 (42:55):
All right?

Speaker 2 (42:56):
You are borderline godlike?

Speaker 1 (43:00):
Am I immortal?

Speaker 2 (43:01):
No, that has nothing to do with... well...

Speaker 1 (43:04):
I mean a god would be immortal, in my opinion.

Speaker 2 (43:05):
I said borderline.

Speaker 1 (43:07):
Okay, I'm omnipotent. I'm omniscient, but not immortal.

Speaker 2 (43:11):
Okay, all right, not immortal, borderline god?

Speaker 1 (43:14):
Like I have an expiry date.

Speaker 2 (43:16):
You have one thing to fix in the world. What
would you fix? One thing. I mean, like... can I agonize?

Speaker 1 (43:24):
What do you mean, a thing? What do you mean, thing?

Speaker 3 (43:26):
Though?

Speaker 1 (43:26):
Like?

Speaker 2 (43:27):
Do you cure cancer? Do you get rid of guns?
Do you stop war? Do you... Is it
only physical things? Or can I, like, remove a component
of humanity?

Speaker 3 (43:35):
Oh?

Speaker 2 (43:35):
Yeah, you can do that too. You're near godlike,
you can do pretty much whatever you want. One thing
you alter, change, eradicate. One.

Speaker 1 (43:46):
Here's what I would change, I think. And, you know,
who knows if it was only one thing,
who knows if it would work. I would make it
so that every person could understand or sympathize with another person.
I would make it so that everybody was able to
feel empathetic towards another person. That's the thing I would change:

(44:06):
that there was a sense of empathy for another person
whenever they interacted with them, and a desire to empathize
with them. I think that thing would probably fix a
lot of our problems in society. I don't think, like,
removing guns... somebody would just make a laser, you
know what I mean? They'd use bombs,

(44:27):
they'd use bows and arrows or whatever. I'm not saying
it's the same thing. But removing guns would fix
a problem temporarily, but not permanently. A permanent fix would
be like if you could empathize with another person, If
everybody could empathize with the people around them, even the
people that seem to suck, I think it would go
a long way, you know. Or, you know, I
don't know, erase the emotion of hate. But I don't

(44:48):
think that's... I'm not sure that would have the result
we want, to be honest.

Speaker 2 (44:52):
Okay, anyhow...

Speaker 1 (44:53):
Don't know what would you do if you were omniscient
and omnipotent, soft and cuddly, what would you do? What
would you your one act?

Speaker 2 (45:01):
My one act?

Speaker 1 (45:01):
Big man?

Speaker 2 (45:03):
Yeah, big man. You sound like Dennis Hopper in Apocalypse
Now. What do you want me to say? He was
a wise man?

Speaker 1 (45:11):
What?

Speaker 2 (45:12):
Yeah, I didn't think about you turning the tables on me.

Speaker 1 (45:15):
That's right, baby, check it out. You've got to answer
this fucking question now. Look at you, you're in the spotlight.

Speaker 2 (45:19):
Well, there's so many things. You know, there's the obvious:
you'd cure diseases. But, uh, I do think that, as
difficult as it is to say, there's a Darwinian
nature to life that you maybe shouldn't fuck with. No,
I would say, honestly, I think it's to rein in
the internet. Yes, I do.

Speaker 1 (45:41):
I mean, like, in comparison, mine's way, way better. I think
it's, like... I can't even calculate how much
cooler and better my answer was than this. You're holier
than... I mean, I love people.

Speaker 2 (45:55):
Look at TikTok... if that's the one thing. God,
now you're just... now you're denigrating. This is not
a nice way.

Speaker 1 (46:02):
They looked at their phone. There was a timer saying,
enough internet.

Speaker 2 (46:07):
I think the Internet is the end of the world.

Speaker 1 (46:10):
Now you just think that because you're aging.

Speaker 2 (46:13):
I think an unregulated Internet is the end of the world.
That's what I think.

Speaker 1 (46:18):
Regulated, unregulated, that's not our problem. Our problem is us.
It's not that thing, we think. We always want to
put it in the object. We always want to say
it's the thing that makes it so.

Speaker 2 (46:29):
You think people are gonna get smarter, more educated.

Speaker 1 (46:32):
People thought when they put radios in cars that people
were gonna drive off the fucking road because they were
so mesmerized by the magic box producing music that they
couldn't steer the car. And you know, it probably did
cause a few accidents. But over time, we learned to
change the channel without plowing into a family.

Speaker 2 (46:49):
I'm duly chastised. I'm gonna take mine back. I'm gonna
take mine back.

Speaker 1 (46:53):
All right.

Speaker 2 (46:55):
Here is what I wish for the future.

Speaker 1 (46:57):
What I would change, What you would change as a
godlike creature.

Speaker 2 (47:01):
As a godlike creature, we would eat lollipops every day.
We would all have ponies in our backyard.

Speaker 1 (47:09):
Okay, that's, like, several things.

Speaker 2 (47:12):
I'm condescending right now to you. We would have... you'd
wish wealth, great wealth, on everybody, great emotional and physical
and monetary wealth.

Speaker 1 (47:22):
Yeah, you're mocking me.

Speaker 2 (47:24):
And we'd all get eight hours of sleep a night.

Speaker 1 (47:26):
You know. It's funny, you're mocking me, but we agree.
I am mocking you, but we ultimately agree.

Speaker 2 (47:31):
By the way, I'm going to go back, I think
there's far more empathy in the world than you might think.

Speaker 1 (47:35):
I'm not saying... there's plenty of empathy, just not enough.
Just not enough. There isn't. If people
were empathetic, they wouldn't act the way they act. It's hard,
it's hard. It's hard to imagine what it's like for
somebody else. But I think that the more we can
do that, the better off humanity is.

Speaker 2 (47:51):
Actually, you sold me on it. I agree with you.

Speaker 1 (47:53):
I hate to be like all lovey dovey fucking hippie.

Speaker 2 (47:56):
It was a little hippie, hippie-ish for you.

Speaker 1 (47:57):
But no, I know. I'm disgusted with myself for even
saying that. It's just that I'm, like, everybody should be horny
all the time, you know. No, I'm on board now.
You sold it to me. I sounded like a rube.
I sounded like mister ten sixty is what I sounded like.
And, uh, and then it's just like, just like the
idea of that. You're like, if we could just limit
the internet, it'll be like, if we can just,

(48:20):
like, you know, cancel America Online, nobody can get an
account on AOL.

Speaker 2 (48:25):
We're all set. All right, okay, no more. I'm shamed, almighty.

Speaker 1 (48:31):
It's very lovely... it's kind of lovely. It's funny.
You're like, just put the phone down.

Speaker 2 (48:38):
I'm off.

Speaker 1 (48:39):
Put the fucking phone down.

Speaker 2 (48:40):
I'm often described as lovely. All right, anything else? When
are we next talking?

Speaker 1 (48:45):
I don't know. I hope soon, because I can't
go for too long without a little bit of this
in my life.

Speaker 2 (48:50):
All right, professor says, how do you well?

Speaker 1 (48:54):
Once again, your interruption has been my pleasure.

Speaker 3 (48:58):
Okay. Well, that... that, I think, is our show.

Speaker 1 (49:09):
I think that we've done far more than anyone was
expecting us to do. I had a whole show planned. In fact,
I was going to spend several hours talking about my
travel anxiety. But, you know, obviously, I
think what we've learned is there are more important things to
focus on in this world and in this life,
and we focused on some of those things just now.

(49:33):
I'm not sure that... I'm not sure why, I'm not
sure how. But as usual, the Professor made it happen.
All right, I've got to get out of here, I've
got a lot to think about. But we'll
be back next week with more What Future. And as always,
I wish you and your family the very best.