Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is me, Craig Ferguson. I'm inviting you to come
and see my brand new comedy hour. Well, it's actually
about an hour and a half, and I don't
have an opener because these guys cost money. But what
I'm saying is I'll be on stage for a while. Anyway,
come and see me live on the Pants on Fire
Tour in your region. Tickets are on sale now, and
we'll be adding more as the tour continues throughout twenty
(00:23):
twenty five and beyond. For a full list of dates,
go to the Craig Ferguson show dot com. See you
on the road, my dears. My name is Craig Ferguson.
The name of this podcast is Joy. I talk to
interesting people about what brings them happiness. My guest in
(00:46):
the podcast today I've been aware of for a long time.
His band were on my old late night show when
it was so... well, you'll hear this story anyway.
Speaker 2 (00:55):
Please welcome Damian Kulash from OK Go. Damian.
Speaker 1 (01:07):
First of all, let me apologize. Don't get too close
to your microphone, because I'm feeling a little poorly.
I've got... I think I've got a cold, but it
could be leprosy.
Speaker 2 (01:18):
It could be anything. I don't know.
Speaker 1 (01:19):
I've got many symptoms, starting with a cold.
Speaker 2 (01:22):
How are you? You okay?
Speaker 3 (01:24):
I'm well. I'm better than... better than that. I
don't think I have leprosy.
Speaker 1 (01:29):
Okay, well, that's good. Are you a hypochondriac? No,
not particularly. Many artists are, you know. I think
it's something to do with a creative mind
getting bored and you go...
Speaker 3 (01:43):
It could be we're.
Speaker 2 (01:44):
Just the delicacy of our sensitive little souls.
Speaker 1 (01:49):
Now I see we're going to embark on an
existential crisis. Hey, I know that you did the old
late night show like years ago, right? Yeah, I was
doing Late Night, and it was right at... it
was before This Too Shall Pass.
Speaker 2 (02:06):
It was before I mean it.
Speaker 1 (02:07):
Was really... it was quite early, right? Long time ago. Yes,
it was when the show was so small. The band
had to play and then you guys had to go away,
and then we put the late night thing in and
then we shot it out of order so that
the band played first. That was right, right, That was
you were doing that? That sounds right to me.
Speaker 3 (02:27):
Yeah, I've had kids since, so I don't remember anything.
Speaker 1 (02:30):
Yeah, that's interesting, so have I. And I remember patchy bits,
who were a great band.
Speaker 2 (02:37):
Patchy Bits they were they were really lovely.
Speaker 1 (02:39):
But listen, I think that was interesting because I mean,
obviously I've known of the band, and even right at
the very beginning of Late Night, I always kind of...
like, I would say yes or no to
the bands, and so I was very familiar with OK
Go very early, or I thought early on. I
(03:00):
imagine it was not early on for you, but since
then the kind of performance nature of what you do,
because you're not like a normal rock star. First of all,
I can see you're in your kitchen, and you
have a throw blanket on your sofa. This is true,
This is true. This is not my own home, but
(03:21):
it's not no.
Speaker 3 (03:23):
Wow, there's been fires in Los Angeles. Oh my god.
Speaker 2 (03:27):
Did you get evacuated? Did you get into trouble?
Speaker 3 (03:30):
I did. I, you know... my home is
still there and is safe. But my kids' school is
very close to where the fires were. And while I
am not a hypochondriac, I'm very cautious with their
exposure to heavy metals and flying asbestos and so forth.
Speaker 2 (03:51):
That's exactly how I feel about it.
Speaker 1 (03:53):
Because I lived in LA for years and years
and years and years, and I still have a lot of
friends there who have moved out. I would be
terrified now to just be there breathing
that stuff.
Speaker 3 (04:02):
And it is nasty stuff. And there's a big push
towards normalcy, which I completely understand. Like everybody wants life
to keep going, and so the faster you can
open the schools, the
faster you can get everybody back into their sort of
like routine, the better. But we are opting out of
(04:23):
that for a little while. We're in Santa Barbara because
it feels too toxic.
Speaker 1 (04:29):
Yeah, I would be totally doing that as well. I'm
in London and I've just been on the Tube, the
London Underground.
Speaker 2 (04:36):
That's pretty toxic too, actually. I think that's where I
got my
Speaker 3 (04:41):
cold. A much more romantic type of toxic it is.
Speaker 1 (04:45):
It's a little bit like... it's toxic, but it's also
a Richard Curtis movie. You can see lots of little
Richard Curtis movies going on all over the subway,
I think. By the way, that would be a good
place for an OK Go video: the subway,
jumping, leaping across trains instead of...
Speaker 3 (05:04):
Yeah, I don't know. I don't know why we haven't.
Speaker 2 (05:06):
Fun of it?
Speaker 1 (05:07):
Are you still doing the super kind of complicated
and, like, performance art videos?
Speaker 3 (05:14):
Yes. We just released a new video two weeks ago...
last week, the week before... yeah, two weeks ago, which
features sixty-four clips on sixty-four phones,
all laid out on the floor as a mosaic.
Speaker 1 (05:29):
I have to say, I haven't seen it yet. What's
what's the name of the song that goes with it?
Speaker 2 (05:33):
It's called A Stone Only Rolls Downhill.
Speaker 3 (05:35):
Okay. And the opening lines are "I wish
I could say it would all be all right."
Speaker 1 (05:41):
So you're concerned about the state of the... I mean,
I think everybody's concerned about the state of the world.
Speaker 3 (05:45):
Are you not concerned?
Speaker 2 (05:46):
Well, you know, I haven't.
Speaker 1 (05:48):
No, I'll tell you, I've been boring the ears off
of everyone who listens to this podcast, but
I've been reading a lot of historical novels, particularly the
work of Gore Vidal, and it's been bad for a
long time. I feel like this is not... it's always crazy.
I feel like I'm getting a sense,
a little bit of a sense of perspective. Of course,
(06:10):
I'm worried, but if you look at it, it's
always been pretty scary for humans on this planet.
Speaker 3 (06:16):
Yeah. You think it's worse? I do, really. I think
it is. I think it is more. I think that...
I think we're at some tipping
points, into some places we may not be able
to return from as easily. I think it is.
Speaker 2 (06:32):
I think you're right, that has always been bad, and
it has always been crazy.
Speaker 3 (06:35):
Uh, but I think that... I am
very worried about the governmental changes happening in the US
right now. That having a, you know, a president
Speaker 2 (06:45):
That acts more like a führer is not great.
Speaker 3 (06:50):
And I do think that climate stuff
is passing tipping points that really change the way humans live.
I don't necessarily think... like, I'm not running from a
specific fire as much as I am the toxicity that
comes after that and the social changes that come after that.
You know, I think that a lot of what's happening,
(07:12):
you know, I think... let's see, what's the right...
I think the war in Syria is really a climate war.
I think that the migration problems at the southern border
are really climate problems. The
reason you have MS-13 taking over cities
in South America is because there's an influx of people
from agricultural areas that are no longer used for agriculture
(07:36):
because they've been in drought for so long. So it
seems like a social problem at the border of the US,
but it's actually a climate problem globally. So that's what
I'm worried about.
Speaker 2 (07:47):
Well, what would you do?
Speaker 1 (07:49):
Do you have any ideas about how you think a
solution may be achieved?
Speaker 3 (07:53):
You just write pop songs. That's the way to do it.
Speaker 1 (07:57):
But I think it's kind of interesting, because I think...
I mean, any sentient being, I
think, has to be worried about climate change, because
it's the planet on which we live. But the
idea... I've had this discussion with my kids,
who are probably a bit older than yours,
I think. But I feel at this point, you know,
(08:18):
all of our frothy entreaties to, you know, get your
grandma to separate your, you know, plastic bag from the,
you know, the
Speaker 2 (08:27):
eggs, is not... it's not gonna work.
Speaker 1 (08:29):
And I think at this point we've now reached...
it's an engineering problem. We have
to actually intervene in some way. And I think that's
kind of fascinating to me, because I
feel that nothing's going to change until it becomes profitable.
I feel like climate change has to become profitable for
someone to say, well, we'll pay you five hundred billion
(08:51):
dollars to do some cloud seeding and bring down the
temperature of the North Atlantic by three degrees, and then
someone will do it because it can be done.
Speaker 3 (09:01):
I hope you're right. What do you think, then?
Speaker 4 (09:04):
I think that solar is already cheaper than fossil fuels,
and yet we keep opening new wells and trying to dig more.
Speaker 2 (09:15):
You know, I think that the... I agree, it's systemic.
Speaker 3 (09:21):
I do not think that... that recycling, or, you
know... I don't think this choice
should be on the consumer. I don't think it is
a consumer choice. You can't expect everybody in the world
to make the best decision for the planet. What you
need to do is actually make good structural decisions
for the planet. I am not super bullish on geoengineering stuff.
(09:46):
But I'm also not a scientist. I just sort
of feel like those geoengineering solutions are often used
as a sort of salve, for, like, "I know it's
really bad, but we can fix this," and it's sort of like, no,
the thing we should do is stop drilling, like, stop
burning oil.
Speaker 2 (10:03):
That is the obvious thing to do, and yeah, I
know that.
Speaker 1 (10:08):
Well, we could easily find, you know, someone else
who could say, here's why we mustn't stop drilling oil,
and why we must have to... Yeah, I mean, there
is such a... it's an argument which seems
like one of those arguments where you can't persuade anyone,
do you know what I mean? It's like people dig
in, exactly, and then it's like... and I feel like
(10:28):
the nature of argument has become a
Speaker 2 (10:30):
Little a little difficult as well.
Speaker 1 (10:32):
It's like... it's not just about climate change, it's about
societal change, in the sense that the difficulty in argument
now is that you don't really discuss
to move forward. It's not Socratic. It's more about, well,
what's your position? Well, here's my position, and here's why
your position is wrong, and here's why my position is right.
Speaker 2 (10:53):
And it doesn't seem to move. I think it's a
real gridlock.
Speaker 1 (10:55):
It's... I think that's a systemic problem with humans.
Speaker 2 (11:01):
As well, I think, isn't it.
Speaker 3 (11:03):
I believe that it is.
Speaker 1 (11:05):
So maybe what I'm trying to get to is maybe
writing pop songs is the right way to go, because
if you can in some way through art, maybe not
only pop songs, but pop songs and literature and art
and entertainment and all of the things that artists do,
(11:25):
perhaps that can help people kind of break the logjam
of "this is my position and I will never change
from it." Because, you know, one of the great things...
one of the great movies, I think, to
watch for this is Dirty Dancing. You ever seen the
movie... of course you've seen the movie Dirty Dancing.
And you watch that movie... if you watch that movie
when you're fifteen, or you watch that movie when you're
(11:49):
fifty and you have children, it's.
Speaker 2 (11:51):
A completely it's completely different movie. You're like, wait a minute,
that dad is right?
Speaker 1 (11:56):
You know, you know, and she should be put in
the corner, and who is Patrick Swayze with that...
Speaker 2 (12:01):
This is what it's all about, what's going on here?
Speaker 1 (12:05):
But I think perspective is an artist's job, right? Yeah,
it is. That's what I kind of loved about
the first giant performance piece that I saw of
yours. I saw the very famous treadmill video. It
was very fun, thank you. But the Rube Goldberg machine
(12:27):
for This Too Shall Pass. I feel like that must
have taken about four or five years to put together. I mean,
it was enormously complex. Did you actually get it
in one take?
Speaker 2 (12:39):
Eventually we did. What you actually see...
Speaker 3 (12:43):
It goes from Take A to Take B back to
Take A, because there's a section of that where we're
going down an elevator shaft following a bunch of fluids
that sort of rise and fall on different, you know...
as the machine trips itself. And the incredible, incredible
cameraman Mick Waugh, who was doing this amazing
(13:06):
job filming that, did, in the one otherwise perfect take,
miss the level of the liquid in there for
a second, so we had to sort
of stitch in Take B.
Speaker 1 (13:16):
But you did get the machine to do what
it set out to do.
Speaker 2 (13:21):
Who designed the machine? Were you part of that?
Speaker 3 (13:24):
Yes. Well, there was... we put... I mean, this
is now, you know, ten, fifteen years ago, but we
put up a job posting on a
sort of, like, nerdy artists' board saying we're looking for
an engineer who can help us build a Rube Goldberg machine.
And twelve engineers together wrote us a proposal, and we're like, guys,
(13:45):
you're not very good engineers if, you know...
we asked for one. This is too many.
Speaker 2 (13:50):
We can't... we cannot afford you.
Speaker 3 (13:52):
And they said, no, no, just pay us like
we're one person, and we'll just split it up. And
it wound up being this sort of group art
project where, by the end, I think people logged hours
and you only got paid if you did more than
ten hours a week, or something like that. But
people would just come into this giant warehouse and everybody
would sort of work on it together and play and
have fun. And my dad built some of that machine.
(14:14):
I built a bunch of that machine. We broke it
up into a whole lot of different six-second chunks
of the song, so that there
were discrete problems to try to solve,
and we could then review things while we were on
tour and come back. And, you know, like, the last
three weeks of that build, I was there every day,
(14:35):
but there was another like three months.
Speaker 1 (14:37):
So, in the case of,
like, these big videos, will you amend the music to
fit the video?
Speaker 2 (14:45):
Will you change?
Speaker 1 (14:46):
Like, you know, if you're in, you know, that
one-shot looking thing in the airplane, do you change
the music?
Speaker 3 (14:51):
Generally, no. The one-shot one in the airplane,
that's the album version of that song.
That one... you know, we're in zero gravity and
you can get twenty-eight seconds of... sorry, you can
get, yeah, twenty-eight and a half seconds of zero
gravity each time the plane dives. So what
(15:14):
we did for that was we broke the song up
into twenty-one-second chunks that fit the,
you know, tempo of the music nicely, so that every
four bars there would be a pause, basically.
Speaker 2 (15:29):
So, how do you rehearse something like that?
Speaker 3 (15:31):
Do you do?
Speaker 2 (15:32):
You have to rehearse it?
Speaker 1 (15:33):
Like, are you watching a clock while the plane is
in the sine wave?
Speaker 2 (15:36):
Is that what was happening? Or were you?
Speaker 3 (15:38):
We spent six flights just bouncing off the walls, doing
every, like... you know, six flights of fifteen each. So
what is that, three hundred? Do I have that?
Six... fifteen...
Speaker 2 (15:51):
No, I'm way off. Sixty... ninety.
Speaker 3 (15:55):
So we had ninety little, like, test segments, and then...
and then, with the footage from that... we
had GoPros everywhere just filming everything. With the footage
from that, we put together what we thought a routine
would be, you know, just in a dance studio, going,
kind of like, we could do this thing and then
that thing. And then we went back to
(16:15):
the plane and spent another week, another six flights, trying
out that routine, and then finally a week of shooting.
And during the week of shooting, each flight you
have fifteen periods of weightlessness. The video requires eight of them,
so we do seven as a rehearsal, then reset everything
and do the full video as the final eight. And
(16:35):
in between each... like, it's thirty seconds of weightlessness, more
or less, then five minutes as the plane climbs back
up to where it can basically drop you again. And
during that we just had to sit as still as we could
so that we would later be able to, like, morph
over that period.
Speaker 2 (16:53):
It sounds fantastically expensive. Was it? Yeah. Who paid?
The record company paid for this?
Speaker 3 (17:00):
No, we are our own label. So, a Russian airline...
this was back before the Russia and the West fell apart.
You'll see on the back of
that plane, I mean, like, in the back,
you can see it says S7.
Speaker 2 (17:15):
That's a Russian airline, and this was their
ad campaign.
Speaker 1 (17:19):
Wow, so they donated that for an ad campaign. That's great. Hello,
this is Craig Ferguson and I want to let you
know I have a brand new stand up comedy special
out now on YouTube. It's called I'm So Happy, and
I would be so happy if you checked it out.
(17:41):
To watch the special, just go to my YouTube channel
at the Craig Ferguson Show, and it's right there.
Speaker 2 (17:47):
Just click it and play it, and it's free. I
can't... Look.
Speaker 1 (17:50):
I'm not going to come around your house and show
you how to do it. If you can't do it,
then you can't have it. But if you can figure
it out, it's yours. You must have some very clever
producing skills or some very clever producers. I mean,
how the hell do you get in touch with a Russian airline?
"So, do you want to do a music video?" And it
will get... you know, because I
Speaker 2 (18:11):
Don't think I don't think I even voice that video go.
Speaker 1 (18:14):
You know what airline I should be taking is the
one that...
Speaker 3 (18:19):
I do remember thinking, like, how ambitious are you?
And how risk-averse are you? Because, like,
that's not the type of airplane you want to be in.
But, you know, our Goldberg machine was underwritten by
State Farm Insurance, which... it's like, I feel like everything
that happens in that video looks like something you should
be insuring against, you know, like...
Speaker 1 (18:39):
But that's kind of interesting, because, I mean,
presumably they were the good guys when they were underwriting
the video. But I don't know if everyone's very
happy with State Farm Insurance right now, particularly in Los Angeles.
Speaker 3 (18:53):
No, no, certainly not. I mean, again, this is
many years ago, but, you know, like, we had
to make this choice fifteen, twenty years ago about
whether or not we were going to do the standard
music industry dance or try to do it ourselves, and
(19:14):
that State Farm video was the first one where
we had any brand money helping to pay for stuff,
and it was a difficult balance, because back
then the idea of selling out was a very real thing,
and it was like, people are going to hate this
if it feels like we are just, you know,
sort of dancing monkeys. Now, as best I understand it,
(19:35):
like, you know, younger millennials, definitely gen Z and beyond,
see those brand collaborations as a badge of
honor. It's... there's...
Speaker 2 (19:48):
I think it's the opposite of that.
Speaker 1 (19:50):
I think they look at it a different way as well,
because they also use it as their power.
I think the young people, when they say we will
withdraw our business from Target or Starbucks or whoever
they're angry at at the time, until you fall
in line with, you know, whatever we're arguing about... which
is a different... it is a kind of change which
I think is valid. I
(20:13):
mean, there is a kind of, like, well, I'm not
buying your crap until I feel like you're a decent
human being, or your position aligns with mine, which I
think is fair. That seems democratic to me.
Speaker 3 (20:24):
Absolutely, yeah. I think... I mean, voting with your
dollar is more effective than voting. Yeah, I
think everybody's voting with dollars. It just depends on how many
dollars you have and what you're using those votes to do.
What is the situation, then, with your record company now?
Is it only your band, or do you
bring in other bands?
Speaker 1 (20:43):
And do you function like a... We have
occasionally released other bands.
Speaker 3 (20:47):
For the most part... it's more that the record company stuff
isn't very fun, so it's like we don't particularly want
to do it unless we're doing it for ourselves.
But it's mostly just that it's not like... the
functions of an old-fashioned record label
used to be a lot about physical distribution, and
now that's not part of it anymore.
(21:10):
There's still a lot of business... I mean, there's still a
lot of stuff that has to happen, and most
of it is sort of just, you know, promotional and
logistical. But, you know... it's
not stuff I relish doing. But if you're kind of
a detail freak, you're gonna wind up wanting to oversee
it all anyways.
Speaker 2 (21:29):
You might as well. Just... are you a detail freak,
you think? Unfortunately, yeah. I think that's... I think that...
Speaker 1 (21:35):
You know, it's not unusual for musicians or artists to
be that way. I mean, you want your...
you want your shit to be the way you want
it to be.
Speaker 2 (21:45):
That's the whole gig, right? You want it to be... Well,
Speaker 1 (21:48):
I always say, when I'm doing anything,
I don't want it to be produced unless it is produced
by me.
Speaker 2 (21:54):
I don't.
Speaker 1 (21:55):
I don't... you know, if you say, well, this is
what happens here, I go, I don't know what happens here,
I'm doing it. So I feel like it's a... That's
why, through all of it, through doing the late
night show, and through all the television I've done, I eventually
went back to doing only stand-up and doing podcasts,
because I can talk to who I want to on
a podcast, because there's no one here to tell me
(22:17):
I can't, you know. There's, like... you can't talk to, like...
I remember one of the early podcasts I did was
with one of the senior undertakers in New York City,
a lovely man, and I wanted to know about undertaking,
and I had no idea. It is a fascinating world, and
it's like, you have to have a college
(22:38):
degree to do it, and it's like a.
Speaker 2 (22:40):
Real, a real interesting thing.
Speaker 1 (22:42):
And I wouldn't have been able to get that guy
a three-minute interview as second guest on a Wednesday night
on late night. There's no way I would have been
able to do that. And I could talk to him
for an hour. And that's what I think is the
great gift of this world now. I
know that you have talked in front
of Congress or something about net neutrality. I did. Yeah. Can
(23:03):
you describe to me what net neutrality actually means?
Speaker 3 (23:08):
Uh, net neutrality? Boy. It has been a
long time since that was the specific argument, so I
hope I'm not too out of date. But the
traffic on the Internet is going across lines that are...
right? Like, there are people there...
(23:29):
The actual infrastructure of the Internet is owned by the
same people... generally, by the same companies
who are selling us that access, and so
they can preference what information travels and at what speeds.
And so there was actual preferencing of corporate
information over private information back then, and there was...
(23:53):
there were... basically, it was essentially Internet payola.
And that was
Speaker 2 (23:57):
A nasty, nasty, nasty thing.
Speaker 3 (24:00):
Now we have a much worse issue, which is, sort
of... we have wound up asking the content networks to
be the police of the content on them. We expect
Facebook to decide what should be on Facebook,
and the misinformation that flourishes is their problem.
Speaker 1 (24:20):
Yeah. But it's an interesting thing, because people got mad
at Mark Zuckerberg recently, because he said, all right, we're
not gonna police it. We're just gonna...
people can flag it. Is that neutrality?
Speaker 3 (24:32):
Is that... is that... I don't... This is why
I was scared of being out of date, because I
honestly don't know where. Like, we've now gotten past
the point where we're worried about Comcast and AT&T
throttling things so that they can
serve the interests of their business partners, and into
the world of misinformation. And I honestly am
(24:56):
so overwhelmed by it myself that I have
not developed a position that I...
Speaker 1 (25:00):
First of all, that's a very refreshing thing to
hear any human being say in this day and age.
But what I think is fascinating, though, is that in
this age, as we wander into every discussion I've
had with everybody, at some point we end up talking
about AI. And I think that's because it's an enormous change
in how we're living and it's coming fast. But what
(25:23):
I'm fascinated by with it is that it really is,
as far as the Internet goes, as far as
digital information goes, we're in a post-truth environment where
you can't actually trust anything, even the video which you're watching.
I mean, I'm sure there's an AI program that could
make an OK Go video that would convince most people
(25:46):
that you had done it.
Speaker 2 (25:47):
Yes, for sure, and it's mind-boggling.
Speaker 3 (25:51):
And it also, luckily for my little band, it does
put a premium on stuff where people try just
for the sake of trying, right? I mean, there is some
trust involved, that people on the other end
of the screen... like, the thing we make winds up
on the other end of the pipeline going into your
(26:12):
eyes or going into your ears, and you need to
trust us in some way that this is the real thing. Luckily,
we've been doing this for twenty, twenty-five years, so
we're a fairly trustworthy source for you.
Speaker 1 (26:22):
You are trustworthy, but the machine is no longer trustworthy.
That's what I think is the frightening thing: that
I trust you to make the product the way that
Speaker 2 (26:33):
You make and the way you said you made it.
Speaker 1 (26:35):
But I don't know if I trust my phone to
be giving me something that wasn't put together in a
bot farm somewhere, and I don't know where...
wherever the bot farms are.
Speaker 2 (26:45):
Do you ever use? Are you ever?
Speaker 1 (26:47):
Because on one side, everybody likes to I certainly like
to talk about how scary AI is and how bad.
Speaker 2 (26:55):
It is, but it's also you know, it could be great.
You know, I've talked to I've talked.
Speaker 1 (27:00):
To a surgeon on this podcast who was like, no,
this is fantastic. It's going to save so many lives
in surgery, because this thing can do stuff that I
just can't. You know, it can do it faster,
it can make decisions,
it can analyze data that would normally take me a
week, a day, two operations, three operations.
Speaker 2 (27:21):
I don't have to do it.
Speaker 1 (27:22):
I can get it instantly, as soon as the
Speaker 2 (27:25):
patient is... Sometimes I don't even have to open them up.
Speaker 1 (27:28):
And I think that that is the other side of
the technological balance, isn't it?
Speaker 2 (27:35):
Yeah, I know. It's incredibly, incredibly powerful, and I think
people mostly sort of jump to endpoints.
Speaker 3 (27:43):
Is this going to be a big monster that's going
to come after us and try to kill us, or
is it going to make a believable version of Craig
Ferguson that's not actually Craig Ferguson. I think the more realistic,
in my estimation, sort of slide into the future has
more to do with how many intermediary steps it cuts out,
(28:04):
and whose jobs it's replacing, and what that does to
the eventual product. Like, when you think about it
with respect to music, it's not just
that you can have an AI replication of Taylor Swift's voice.
It's that I would be astonished if our recording programs,
within the next year or two, didn't have a button
that you could push to get a better guitar track
(28:25):
than the one that you just recorded right.
Speaker 1 (28:27):
All right. So it's almost... it's like extrapolating that,
so that it gives you... it realizes what you're trying
Speaker 2 (28:35):
To do and makes it better.
Speaker 3 (28:37):
Yeah. So, like, for instance, my wife and I directed
a film for Apple the year before last, and in
the mixing process, I remember, you know, we're
there in a giant mixing studio to do this very
high-end professional cinema mix, and the tool that was
used to clean up the audio was all AI-based.
Speaker 2 (28:58):
And it's unbelievable what you can do.
Speaker 3 (29:00):
You can take a track from, you know, a musical
track from the nineteen sixties, and pull out each individual
instrument and remix it on the fly if you want to,
and it's unbelievably powerful. Crazy. And, you
know, from a workflow perspective, why would you
not do that? But when you think about what that
is, what that is capable of, then... like, it's
(29:21):
that tool... anybody in that path is dumb not
to use that tool. Right.
Speaker 2 (29:27):
But now, like, pivot that to logistics. Let's say
you're running a port right now. You would be
dumb not to run your logistics through an AI that
can make better routing decisions
Speaker 3 (29:39):
than you can, right? And if you are a small
local government, you would be dumb not to interface with
that AI with your own AI, you know. And,
like, at some point it's not a question
of, like, this big evil robot coming after people. It's
just that they're all black-box decisions. We don't
know why it's a better decision. We know that it
(30:04):
came up with an answer that works, right? And that's
all we know, right? You usually can't go back into
an AI and actually figure out what the reasoning was,
because there is no reason. It's pattern recognition, and
it's recursive, and it feeds on itself, and it's
like it can't explain to you why it has done it.
So what happens when you have misalignment with the
(30:25):
actual human goals that have gone into it? Right.
Speaker 1 (30:28):
So when the human goals and the synthetic intelligence goals
don't match up, the artificial intelligence makes a decision
in favor of who? Well, or in
Speaker 3 (30:40):
Favor of what it believes the goals to be, which
you may have set but like you know, like the
Aladdin's lamp or whatever. It's sort of like there's always
a way to achieve those goals, which actually may reveal
to you that you should have had better calls, you know.
Speaker 1 (30:53):
But it actually goes back to, I think,
the point you were making earlier on, about climate change
effects being what's effecting the societal change, because it's
technological. It's like when the printing press is invented,
then very quickly you lead on to the Protestant Reformation,
because people are starting to look at Bibles
(31:13):
that they weren't allowed to look at before, and then
you start thinking about we'd like to look at this
in our own language. And suddenly there's a and suddenly
everything changes because of a piece of technology. But I
think that the AI to me, it seems to be
on a par I think with the printing press and
the sense of the changes that will bring. Some of
(31:35):
them are outstandingly good, but it's probably not all going
to be good.
Speaker 3 (31:42):
No. It's hard for me to imagine an industry
that won't be completely, top down, revolutionized. By that
time you've talked about a revolutionized planet, because I feel like mostly what AI
is currently good at is pattern recognition, and most of what humans
do is pattern reaction. And we're
(32:04):
in the early days. We're, you know,
in the Internet of nineteen seventy eight, not even
nineteen ninety, you know what I mean. This
is very, very early days, and so I have
a hard time... In fact, the first song on our
new album is written to this idea, because I
just realized writing a rock and roll album right now
(32:25):
feels almost silly, like a bunch of guys standing
there with guitars and drums making music. It feels
like releasing a brand new flip book the day before
cinema is invented, you know what I mean?
Speaker 1 (32:37):
Yeah, I get it. But I think that's why,
I mean, I feel like my reaction to it
has been that all of the work now has to
be live. Everything has to be live. I have to
do shows which are live. When I started
out in my life, when I was fifteen, sixteen
years old, my job until I was about
twenty-one was I was a drummer. I was a
(33:00):
drummer in rock bands and punk rock bands, and I
remember saying that nothing's going to replace drummers. But
it was the first fucking thing to go, I mean,
and I was...
Speaker 2 (33:11):
I should have known better.
Speaker 1 (33:12):
I was listening to Kraftwerk when I was twelve,
but I didn't see the signs, you know what I mean.
It's like I didn't put the... I didn't make any
pattern recognition.
Speaker 2 (33:20):
But I think you're right.
Speaker 1 (33:21):
There's also, you go into any town in the United
States and you will see a bunch of groovy people
looking for vinyl in some store, probably near the railway station.
Speaker 2 (33:31):
Oh yeah, I listen to vinyl. You know, I still
do it.
Speaker 1 (33:34):
I imagine you probably have a vinyl collection as well. And
I think that there is... I don't think it's just nostalgia.
Speaker 2 (33:43):
I think it's also it's a different experience.
Speaker 3 (33:46):
Well, I think it all points to the same slippery slope,
which is that when anything is possible, nothing is special, right?
So everybody wants to have more and more access to
the stuff they love, but the more access you have,
the less special any individual part of it becomes.
And so you and I grew up thumbing through
(34:06):
punk rock record store seven-inch boxes to find that
one seven-inch from that one band that nobody else
had a copy of, because, I mean, the
music that was on there could change your life.
Speaker 2 (34:18):
But just having it was a thing, you know, it
took some work, it took some effort.
Speaker 3 (34:22):
And that relationship to songs changed twenty years
ago at least, right? But it continues to change
all the time. I mean, now, to a
fifteen-year-old, it's just as easy to discover Etta
James as it is to discover Taylor Swift. Right? Like,
they're no different, other...
Speaker 2 (34:42):
Than the promotional money behind one.
Speaker 3 (34:44):
There is no... they're both right there on your phone, right?
There's no... like, when I wanted to know who John
Zorn was, I had to find a weird enough record
store to carry John Zorn records. Now you just have
to have heard the name, you know, or
the algorithm has.
Speaker 1 (34:59):
But does that make you optimistic or pessimistic?
Because some of that is great.
Speaker 2 (35:05):
Right. It's both. It's just changing, you
know what I mean.
Speaker 3 (35:09):
It's just that you can't
really put that genie back in the
bottle. And people buy vinyl now,
but I don't even know if they buy it to
listen to so much as to have a tactile
experience again and be able to engage with something,
you know, that's not sort of coming from the cloud.
(35:29):
And I would say that that's where I
think the AI universe is likely
to be: that expansion to every piece of human
labor there is, right? Like, everything is possible and
nothing is special. And so hopefully
(35:53):
there are just things that are then arbitrarily special,
because we're going to put work into them. Like OK Go videos, right?
There is an easier way to make this, but we're
going to do it this way because it will mean more.
Speaker 1 (36:11):
I did a movie about, I don't know, more
than ten years ago, probably closer to fifteen
years ago, for... I don't know if you've heard
of them, the Disney Corporation. Yeah, yeah, that's right,
these guys. And they decided that they wanted to make
a movie of Winnie the Pooh, but they were going
(36:32):
to do it all by hand.
Speaker 2 (36:33):
So I did the voice of Owl.
Speaker 1 (36:35):
But the entire movie was hand-drawn, was done in
the old style, and it was, I think, the last
one they've done. And I remember asking them why they were
doing it, and they said, because we can. Because
we're Disney and we can. We can do it,
and it can be great. And it is great, it's beautiful.
And maybe it's because I know that the
(36:56):
movie was hand-drawn, it feels different to me. But
I'm sure that an AI program can
now make animation look like that. It can
make it look exactly the same way.
Speaker 2 (37:09):
My oldest kid is an animator.
Speaker 1 (37:11):
He's twenty-four, and having studied all
through that time, he's now looking at an industry which is changing.
As you say, it's like the Internet in nineteen seventy eight.
It's like, what now? What now?
Speaker 2 (37:22):
What now? What now? What now? But you have children?
Speaker 3 (37:26):
What do you?
Speaker 2 (37:27):
What do you do?
Speaker 1 (37:28):
Do you have parameters in your parenting? Is
there something you think, this is what I must teach
them in order to navigate this?
Speaker 3 (37:37):
No, they're six and a half year old twins, and
so I'm basically just hanging on for dear life, to
be honest. With young kids, being proactive is difficult. I'm
just trying to keep up.
Speaker 2 (37:53):
And I've been there.
Speaker 1 (37:53):
But it's interesting, because my oldest is coming up on
twenty-four, and iPads and stuff like that were just coming in
when he was a little kid,
so there were things to do.
Speaker 2 (38:06):
I don't know how.
Speaker 1 (38:09):
I see people at the airport where their kids have
got headphones on and they're looking at phones and all that stuff,
and I think, you don't know how lucky you are, you know
what I mean.
Speaker 3 (38:16):
And I am that generation, because we are
total fascists about screen time. We do not let our
kids have any screen time unless we're traveling, something like that.
Speaker 2 (38:25):
And then it's this incredible drug,
because they've never had that. He's exactly the same.
Speaker 3 (38:31):
I mean.
Speaker 1 (38:32):
But what's interesting, I noticed that both my kids really,
but my youngest especially, will put it down.
Speaker 2 (38:39):
It's almost like they're Tom Bombadil with it.
Speaker 1 (38:41):
The younger generation, they're like, yeah, it's a thing,
but it's not the only thing. And I think that
will happen, truly, you know. It's a thing, but
it's not the only thing. And you can't have a
visceral experience without viscerality, if that's even a word. You
have to... I mean, the idea that you can have,
you know, an AI interactive sex robot,
(39:04):
and you go, well, you can, but that's not
going to be the same.
Speaker 2 (39:11):
It might be all right, but no stakes. There's no stakes.
Speaker 3 (39:14):
There's no stakes, you know, because when everything is possible,
nothing is special, you know. It's just sort of
like, finding the human equivalent to that, the right
person, takes real effort.
Speaker 1 (39:28):
And also, it's a very interesting thing.
I'm very glad that my romantic life reached a
conclusion in terms of monogamy before the invention of
the dating apps, because I don't know how...
because now AI is involved, and I won't say who it
is because it's a little embarrassing, I think, but
(39:51):
I know a relative of mine who found out he
was in a flirty relationship with a robot, not
a real person. You know, that's crazy. Or
is it? Maybe it's okay to be in a flirty
relationship with a robot. Nobody's going to get hurt.
Speaker 3 (40:09):
Well, so you started with this: it's
always been crazy. It's always been crazy.
Speaker 2 (40:15):
It's just crazy changes.
Speaker 5 (40:16):
You know... Yeah, I believe that. But I also think
about the level of, let's say, if you were to
measure social changes with a Richter scale.
Speaker 2 (40:31):
We are experiencing several like seven to.
Speaker 3 (40:36):
Nine, you know, like really, really big earthquakes, simultaneously, right now.
We've got climate change, and we've got AI,
and social media, I think, has really,
really dramatically changed a generation of people for the worse.
And all of that is either feeding or fueling
(41:00):
or bouncing off of the rise of totalitarianism all over
the world.
Speaker 1 (41:06):
Well, totalitarianism is not new. I mean, that's
a pretty tried and tested kind of thing. That's been
around for a while.
Speaker 3 (41:14):
It sure has, but there have been a whole bunch
of big tentpole democracies that staved it off for
a couple hundred years that are not doing a good
job of it.
Speaker 2 (41:24):
You know, it's... I mean, I kind of know.
Speaker 1 (41:26):
I struggled with this a little bit, because, I mean,
the conclusion I've kind of come
around to, and this is only a personal thing, but the
way I've come around to it is that I kind of
messed around with all sorts of religions
and philosophies and theologies, and basically what I
seem to be zeroing in on in the past couple
(41:48):
of years is basically the works of Epictetus, Socrates, Seneca,
people who have been dead for a couple of
thousand years but basically said the same thing. It was like,
there's very limited stuff you can actually fucking
do, and everything else is really not your concern, you know.
(42:10):
I mean, and I think you can drive yourself crazy.
I mean, Seneca was Nero's schoolteacher.
That's a dangerous fucking job, you know. But
I think the idea that we all... because,
you know, I have this conversation with my wife and
she's like, have you seen this in the news, and
(42:31):
have you seen that news?
Speaker 2 (42:32):
I'm like, I've seen it.
Speaker 1 (42:33):
But I have a clutch mechanism, which is, what the
fuck am I going to do?
Speaker 2 (42:37):
What the fuck am I going to do?
Speaker 3 (42:39):
Oh, totally agreed. I'm not saying that I know what
to do or that there's anything we can do. But
I do think that the last hours of the Weimar
Republic were probably a scarier time than...
Speaker 2 (42:51):
The early hours of the Weimar Republic, right.
Speaker 3 (42:53):
Like, that depends.
Speaker 1 (42:56):
I mean, yes, I would agree, I think, but that's
historical perspective. It's like if you stand outside a period
of time and look at that time. You know, like
if you stood right here, right now, and you
were talking to the geniuses that were putting together the
fucking Treaty of Versailles in nineteen eighteen, you'd say, you
know what, France, maybe you should calm the fuck down
(43:17):
and not make everybody so angry that this shit all
rises up again. You know, whereas you take real geniuses
like Nelson Mandela and Desmond Tutu, who say, you know what,
we have to dismantle this, it's the only way we're
going to move forward, truth and reconciliation. And that to
me is fascinating, because I don't think
(43:43):
history begins until everybody's dead, you know. So, you know,
the way that we look at the Second World War is about
to change, you know. That's why I think you get this scary
rise of weird kind of crypto-fascist organizations, because so
many people have now forgotten.
Speaker 2 (44:04):
What was going on. They're like, well, you know, they
had nice hats, you know. I mean, it's tricky.
I don't think history repeats itself.
Speaker 1 (44:17):
There's a quote attributed to Mark Twain that says history does
not repeat itself, but it rhymes. I think that's...
and speaking to someone who does, part of what you do...
Speaker 2 (44:27):
Is you make things rhyme.
Speaker 3 (44:28):
I do often rhyme things.
Speaker 1 (44:29):
Yeah, I think that's how it appears to me.
That's what it seems like. And I don't know.
I mean, when I look at the way things are
now politically in America... I happened to be reading at
the time, just when the election was going on,
Gore Vidal's biography of Aaron Burr.
Speaker 3 (44:48):
Was fucking crazy.
Speaker 1 (44:50):
I mean, but because he's been dead for such
a long time, it's highly entertaining, you know. I don't know.
I mean, I wrestle with it, but mostly I
seem to come down to, I don't know if this
is the way you do, I come down to, I'll
deal with what I can deal with, and that's all
I can really do. Like, I don't feel
(45:11):
like I can change anybody's mind. I think it's folly
to think that you will.
Speaker 3 (45:15):
Except your kids. You're right, and that's exactly
where I'm still stuck with "I'll
just deal with it." But I can't, because I have
no idea...
Speaker 2 (45:25):
How do you prepare them for all this stuff?
Speaker 3 (45:26):
Now, that's what the song, by
the way, "A Stone Only Rolls Downhill," is about:
my kids. When I was writing it, they were
four year old twins. You cannot tell a
four-year-old the world is doomed. Not a
good idea.
Speaker 2 (45:43):
It might not be. It might not be.
Speaker 1 (45:45):
I mean, if you were to say the
world is... if you were in the middle of the
Black Death...
Speaker 2 (45:50):
You know, when one in.
Speaker 1 (45:51):
Three people in Europe was keeling over dying, when infant
mortality was off the charts, when antibiotics were just...
fucking nobody even thought about it. Like, germ theory was
like a couple of guys, and, you know, Pythagoras had
maybe mentioned it at some point. You would have said, well,
this is the endgame, clearly, you know. I mean, the...
Speaker 3 (46:10):
Well, but I would argue that, like, yeah, that time
was way worse than one hundred years before or one hundred
years after.
Speaker 2 (46:19):
Yeah, depending on where you were standing, you know.
Speaker 1 (46:21):
I mean, it's kind of like, I wrestle with it
all the time myself, you know, and I think trying
to find a sense of perspective in the world when...
and maybe this is what it is, like, as
you say, social media has changed the generations. I think
what's called legacy media
has been destroyed, you know. The kind
(46:43):
of idea that the press will somehow keep the
government in some degree of check, that's gone,
and it's been gone for a while, I think, longer
than we give it credit for. I mean, fucking, you know,
William Randolph Hearst was not... you know, I don't think
(47:04):
he was impartial, do you know what I mean? I
think... and, no, I think that on the one hand,
I get as scared as everyone else. But on
the other hand, I think ultimately it comes down
to, we'll know in about one hundred years what it
meant, you and me.
Speaker 3 (47:20):
Well, you know, there's a term in biology and evolution,
punctuated equilibrium. The world doesn't
change and evolve in some sort of steady ramp,
like a nice line. You have long periods of sort
(47:41):
of things relatively in balance, and then something happens and everything
shifts really quickly, right? And that doesn't mean that
when the dinosaurs were alive, they weren't, you know, in
constant struggle, and things weren't crazy all the time. But
also there was a moment when a comet hit, right?
(48:02):
Right? Like, when there's a vacuum
in a system like that, the entire system rearranges itself.
Speaker 2 (48:10):
You think that's where we are? You think we're
in that place?
Speaker 3 (48:15):
I think we are, yes. I think that just
because Aaron Burr had an incredibly insane life, and because
World War Two or World War One were utterly horrific
and world-transformative... those are
all moments, those are all punctuations in the equilibrium. The fifties,
(48:36):
like, the fifties that followed World War Two were not,
were not free of strife and craziness. They were full
of strife and craziness, but they also represent an
equilibrium of sorts, right? Like, you can make some guesses
as to what's going to happen next. They're not all right,
but you can trust certain institutions and
certain kinds of trajectories to be more or less as
(49:00):
you imagine they will be, you know. And right now,
I think it's incredibly hard to imagine what the world
will be like three years from now, much less ten
years from now. And I think, like,
a big part of that is technological, a
big part of that is social, you know, socioeconomic,
governmental stuff, essentially. And I do think that we are
(49:23):
at an inflection point with climate change.
It's a slow, slow change, but it's starting to have
these effects that people notice as humans. Like, I would argue,
like I said, that, you know, the conflicts in Syria
and on the southern border in the US are clearly
climate issues, but people don't recognize them as that, and
(49:45):
they're not going to be framed publicly like that. People aren't
saying, let's fix the climate so that there won't be
an immigration issue across the Mediterranean. That's not what people
are saying, you know. But they are saying,
holy shit, LA just burned down, and now there are Canadian
wildfires that last all year. You know, people are
noticing that the erratic climate is causing
(50:08):
real human issues. And that
may not be a three-year change, that might be
a ten-year change or a twenty-year change. But
when you combine that with the other ones
that are going on right now, I do think we
are likely at one of those inflection points.
Speaker 1 (50:23):
I think that's a that's a fair hypothesis. I think
that could be argued pretty reasonably.
Speaker 2 (50:30):
What will you do? Will you go back to LA?
Speaker 3 (50:34):
Probably not, but I'm not sure. Mostly, I mean,
I love LA. I love my house in LA,
I love my life in LA. But the
long game for LA looks like a lot more of this.
Speaker 1 (50:49):
Yeah, you know, I live in a much colder place
with less fires, but other problems.
Speaker 2 (50:56):
That's what it is. It's been fascinating to talk to you.
Speaker 1 (50:59):
I thought we would have an interesting conversation, and I've
certainly been very interested in hearing what you have to say.
Thank you so much for making time for us. I
really appreciate it.
Speaker 3 (51:11):
Thank you. This has been so much fun
for me too.
Speaker 2 (51:14):
Thanks very much. Now fuck off and deal.
Speaker 3 (51:16):
With your kids. I will.