Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Uh, It Could Happen Here is the podcast that you're
listening to. I'm Robert Evans, the person that you're listening to,
and one of the people who does this podcast. Boy,
what a glorious introduction that was. Let me
(00:28):
also introduce some human beings who you might know. First,
we have Chris, and we have James, our correspondents
in the field, joining us today. Also joining us is
James's Spanish Civil War era Mosin-Nagant. Yep, that's right. Yeah,
I'm very happy it's joining us. It's going to make
(00:48):
contributions throughout the episode. It's just gonna — it's an antique
bolt action rifle that served in three world wars, and the current one.
That's right. Yeah, and it's about to — it's about to
kick off this one now, which it might be. It
might be two in the L column for the Mosin-Nagant.
Yeah, it's — it's had a — it's
(01:11):
served a mixed bag. Um, yeah. Anyway, we're recording this
the day of the elections, so everybody's having a horrible one. Um,
I'm having a firearm-y one, yeah. Yeah. I'm still
hoping my TEC-9 comes in before Oregon votes
on its next ballot measure. Anyway, um, today I wanted
(01:34):
to talk a little bit about something that I've been
thinking about kind of constantly, which is, um — it's called
effective altruism. And the short version of this is
that, like, it is a style of thinking about charitable
giving that Elon Musk in particular has recently highlighted as
(01:58):
like, how he thinks about things. It's very popular with
the billionaire set, who are deeply invested in
getting people to think that they're saving the world, right? Um,
the folks who want to be seen as, like, looking
ahead and, you know, protecting the future of mankind
and saving the world, um, but not doing it through
(02:19):
things like paying, you know, more taxes, and supporting, you
know, less money being in politics, and all that
kind of jazz. Like, not anything that would
actually harm their personal ability to exercise power. So
it's gotten kind of attacked recently because it's associated with
guys like Musk, and because he is markedly less popular now
(02:40):
than he was, let's say, ten years ago. Um, but —
yeah, I wanted to talk because effective altruism,
which is an actual movement — there's, like, organizations that espouse this,
there's hundreds of millions of dollars in charitable giving that
gets handed out under the auspices of effective altruism — and
as a heads up, like most of it, it's fine,
(03:01):
like most of it is charities to, like, get lead out
of water and stuff. Like, it's not like effective altruism
is comprehensively some sort of, like, scam by the wealthy.
It's more of an honest theory about how charitable
giving ought to work that has been adopted by the
hyper wealthy as justification for fucked up shit and married to
(03:21):
something called longtermism, which we will be
talking about in a little bit. But I want to
talk about where the concept of effective altruism comes from.
If you read articles about this thing, most people who
study it will say that it kind of got
started as a modern movement in nineteen seventy one with
an Australian philosopher named Peter Singer, and Singer wrote an
(03:42):
article titled Famine, Affluence, and Morality. Um, I think it
was actually published in nineteen seventy two — I don't know,
one of the two, seventy one or seventy two. And
the essay basically argued that there's no difference morally
between your obligation to help a person dying on the
street in front of your house — like, dude gets hit
by a car in front of your house — you are
not more morally obligated to help him than you are
(04:04):
morally obligated to help people who are dying in Syria,
you know. Um, and obviously, like, there's a version
of truth to that, which is that we're all responsible
for each other, and internationalism is the only actual path
away from the nightmare, and when we do things like
ignore authoritarians massacring their people, it inevitably comes back to
(04:26):
affect us and, like, fuel the growth of an authoritarian
nightmare domestically. That is very true. Um, but also there's
a fundamental silliness in it, because one reason why there
is a moral difference between helping a person dying in
the street in front of you and somebody who's in
danger in, I don't know, southern China is that, like,
you can immediately help the person in front of your house, right?
(04:49):
Like, if somebody gets hit, you have the ability
to immediately render life saving aid. It's actually quite difficult
to help somebody who is, for example, getting shot at
by the government, uh, in Tibet. Right? Like, not that
you don't have a moral responsibility to that person,
but your moral responsibility to actually immediately take action when
somebody is bleeding out is higher than your responsibility to
(05:11):
try to figure out how to help people in distant
parts of the globe. Um, this is more nuanced than
I think a lot of, especially like rich assholes, like
to admit. It's more nuanced than, like — I shouldn't
say rich assholes. What the problem with this is,
is that this is the kind of revelation, like,
when you start talking this way, that feeds really
(05:33):
well into a fucking TED talk. It's a perfect
fit for that kind of morality, whereas the reality is, like, a
lot more nuanced. Where — and number one, it's also like,
well, the kind of help that you would render
to somebody who's been hit by a car in front
of your house is very different and requires really different
resources than the kind of help you would give people
in, say, again, like, Syria, who are being murdered by
(05:55):
their government. Right? If somebody gets hit by a car
in front of your house, you run out with a
fucking tourniquet and a kit and you call nine one
one, right? Those are the resources that you can immediately
use. If Bashar al-Assad is firing poison gas at
protesters in, you know, Aleppo, well, your Stop the
Bleed kit is not going to help with that, one
(06:16):
way or the other. Right? A very different set of
resources are necessary. Um, so it's foolish to compare
them. Anyway, Singer did, um, and his essay was a
big hit. It's often called, like, a sleeper hit
for young people who were kind of getting into,
you know, the charity industrial complex, um, or at least
(06:36):
were considering it. Now, I found an interview with one
named Julia Wise, who currently works at the Center for
Effective Altruism, um, and she started out as
a social worker — like, to give you an idea of
the kind of people who got into this. When she
read Singer's article, um — she was a social worker — she
kind of fell in love with the concept, and when
(06:57):
it started becoming a thing, in, like, the two thousands,
it was, as she described, quote, a bunch of philosophers
and their friends, and nobody had a bunch of money.
So when Singer put it out, it was also more
kind of a, like, a way of people
kind of debating how to think about charity, which is
fine — people should always be, like, exploring stuff like that.
(07:19):
So it's not — I don't want to be, like, going
after Singer too much. Well, I do a little bit, um,
because Singer, after kind of his movement has a couple
of decades to grow, winds up doing a TED talk, um,
and the TED talk winds up kind of electrifying a
very specific chunk of the American techno set, um. And
(07:41):
you can see, kind of, in some of the
writing on this, like, the way in which his talking
about sort of the morality of charity has gotten flattened
over the years. Quote: which is the better thing to
do, to provide a guide dog to one blind American
or cure two thousand people of blindness in developing countries?
(08:01):
Which is like, I don't know, both? There's resources to
do both. Um, we — again, if you, for example,
in the United States, were to tax the billionaire class
and corporations a lot more, you could provide that blind
person in the United States, uh, with free healthcare
in a way that many countries do. Um. And we
could also continue or even expand charitable giving, maybe if
(08:23):
we were to do stuff like spend less money on
our military. Again, it's like a false choice. But
of course, the reason this choice is there
is because they're thinking about helping
people purely in the form of, like,
noblesse oblige charity, right? They're thinking about, like,
things that get improved when rich people put
(08:43):
money into them. Um, yeah, so obviously we should help,
you know, one of these groups before the other, because
it's more effective, and yada yada yada. Yeah. Yeah, well,
and I think that was one of the
things that, like — there's a second way
you can look at the original sort of problem of,
we have this same ethical responsibility to someone who just got hit
by a car or somebody on the other side of
the world. It's that, like, the other way you can
(09:04):
look at that is, like, I don't care about what's
happening to someone on the other side of the world,
so I don't have to care about this person who got
hit by a car. And that seems like what people
are doing. It's like, well, do I really have to care
about this person here, because there's someone over there? Yeah.
And, like, I can see how this lines
up with some of these, like, bigger, like, meta
ethical kind of perspectives on what equality is and what,
(09:28):
like, your ethical obligations are. But then, yeah, it seems
to just kind of be, like, a very clear, like,
very clear slippery slope to making kind of Malthusian
excuses for doing fuck all. Right, that's where
the story is heading. So — oh good. Early two thousands,
(09:55):
he does, like, a TED talk. You know, the momentum
around this idea starts to build, and it really gets
a shot in the arm in two thousand thirteen with
the work of an author named Eric Friedman. Friedman's
book, which was new at the time, was
called Reinventing Philanthropy: A Framework for More Effective Giving,
um, and he kind of extends the
(10:16):
arguments that Singer's making. One of the things that he
does is he contrasts what St. Jude Children's Research
Hospital is doing — to, like, research children's medical, like,
illnesses that kids suffer and treatments for
them, um — with the Malanje Provincial Hospital in Angola, um,
and he kind of contrasts two patients who are being
(10:37):
served at the different hospitals for life threatening conditions and concludes, quote,
I'd probably also be very angry at the donors who
are continually funding St. Jude and leaving Malanje Provincial woefully
under resourced. Why are the patients of St. Jude's so
much more worthy of life? And like, yeah, what
a ridiculous way to think about it. It's fucking asinine.
(10:59):
And the fact that many of the people who are
doing these fucking TED talks and contributing to this, like,
global tech class are the same people who are
making fucking millions of dollars off the pharmaceutical industry, which
continues to neglect the diseases that people in, like, the
colonial periphery suffer from, because there's no profit in selling
them drugs, and instead you're selling baldness cures to people
(11:19):
in America, right? Like, yes, we can — I mean, like,
if every single
person who's gotten a TED talk had
all of their wealth appropriated tomorrow, we could fund both
of these hospitals. Exactly, yes. Yeah, the world would be better.
It's fundamentally a kind of obscenity to look at pharmaceutical
(11:39):
company CEOs making hundreds of millions and billions of
dollars selling people often literal poison and jacking up the
price of things like insulin, to look at these tech
CEOs accumulating tens of billions of dollars, and to say,
donations to this children's hospital are robbing an Angolan hospital, yes,
so I won't be paying my taxes. Yeah, why don't
(12:01):
you go fuck yourself? Yeah. And anyway, like, but this
is — like, you can see who this appeals to, right?
Like, the kind of people who love the
Freakonomics books, which are bullshit regression statistics. Can I do an economics story, please? Yeah? Okay.
(12:22):
So one of my professors at Chicago was a political
science guy, um, or I guess public policy, and
there's a thing the Freakonomics guy
wrote where he was trying to prove that money doesn't
actually influence, like, doesn't actually influence legislation. Yeah. And you
know what? My professor wrote a
(12:43):
paper about that, which is — you know, and again,
this is a perfect example of how dumb
this guy is, that he doesn't — this is how economists think, right?
Like, when they go into a field,
they go in thinking they already know everything and they
can prove whatever they want. Because, okay, the thing
this guy doesn't understand, right — and this
is the thing most people in the US do not
understand about how Congress works — is that, like, all of
the shit that's happening on the floor of Congress, all
(13:05):
of those votes — that is not real Congress, right?
That is fake Congress. Nothing important actually
happens there. All of the important stuff in Congress happens
in committees. And so you can't figure out whether money
is doing anything by measuring its effects on, like, votes
on the floor, because floor votes are bullshit. By
the time the floor vote happens, all the important policy stuff
(13:25):
has already happened. And so he did this whole
thing where he — you know, he had this
great metric called, like,
uh, oh God, it's called, like,
the dairy cow coefficient, which is, like, measuring, like,
how someone should vote versus, like, how the dairy vote is running.
It turns out, you know, if you look at what
these people do in committee — no, yeah, hey, look, it
(13:48):
turns out lobby money is unbelievably effective. But because
this fucking guy had, like — and this is something that,
like — this sort of distinction between Congress
on the floor and Congress in committee, like, there's a
president, whose name I'm forgetting, who has this
famous line that, like, Congress in committee is Congress at work,
Congress on the floor is Congress at play, or something like that.
Like, this is just, like, basic shit
(14:09):
that, if you know literally anything about how a field works,
you cannot miss. If
you want a good breakdown of why the Freakonomics
guy is full of shit, Michael Hobbes and Peter
Shamshiri — I think that's his last name — have a new
podcast called If Books Could Kill, and they break down,
with, like, citations and everything, why everything in that
(14:33):
book is horseshit. But, like — the
only thing I'll disagree with you on, Chris, is I
don't think he's an idiot. I think he's very intelligent,
and I think the thing that he's smart to do
is he recognizes that there's a specific type of person —
and engineers and programmers are very likely to be this
type of person — who are kind of fundamentally, like, oppositionally defiant.
If people say, like, well, this
(14:56):
is good or this is bad, um, they want to
take the opposite stance. And if
you can provide them a way to, like, feel like they're
enlightened and smart and actually looking at the data by
doing it, then they'll take the opposite stance on stuff
like, it's bad to let people buy elections, or, it's
good to fund children's hospitals, just because somebody has made
(15:16):
them feel smart for being an asshole. Um, that's what
the Freakonomics guy does. Malcolm Gladwell does a subtler version
of it, as a general rule. Um, and that's what
fucking Friedman is doing in this book
in two thousand thirteen. I found a good review of
it in the Stanford Social Innovation Review, um, that is, uh,
(15:38):
pretty scathing — like, surprisingly scathing considering it's written by
a bunch of, like, Stanford nerds. Quote: This approach amounts to
little more than charitable imperialism, whereby my cause is
just and yours, to one degree or another, is a
waste of precious resources. This approach is not informed giving.
Um, and I think that does a pretty good
(15:59):
job of summarizing what I think is fucked up about it.
There's another thing that's really messed up, which is
one of the conclusions that they
come to here. Um — there's
an organization called GiveWell that kind of
gets formed as a result of the book Friedman writes,
and they recommend, like, not to donate
(16:22):
money to disaster assistance in the wake of the Japanese tsunami,
um, and opposed disaster relief donations in general, um, because,
quote — and this is from Friedman — most of those killed
by disasters could not have been saved by donations. Um,
which is, number one, like — donations are about,
like, rebuilding communities, generally. It's not, like, about saving lives.
(16:43):
Usually it's about, like, well, all of the infrastructure was
destroyed and it must be rebuilt. Um, but okay, guy.
Well, it's annoying too, because it's like — it's
not like there's not good critiques of, like, specifically, like,
the Red Cross. Oh, it's all fucked up. Every
single — yes, yeah. But this critique is, like, the worst possible,
like, version of the critique. Yeah, every single large charitable organization is
(17:07):
fucked up. And if you go and talk to people
on the ground, they will bitch. Like, if you go
to fucking war zones, people bitch more about NGOs than
the folks shooting at them, half the time. Like, yeah,
they bitch about it being inefficient, about the stuff they're
given being, like, bad quality, or, like, um, like, nonsense
just being handed out to be handed out, which
(17:28):
is a thing that happens sometimes, and they bitch about
well paid aid workers staying in hotels and showing up
for a couple of hours to, like, do a photo op. Um,
there's also more incisive critiques. Like — you know, that's not to
say none of it's useful. Like, for example, as many
complaints as people have, everyone I've known who has been
in a place where Médecins Sans Frontières, slash Doctors Without
(17:50):
Borders, has operated — while they have complaints about Doctors Without Borders —
are like, it's good that there's more doctors here, we
fucking need them. Um, and, you know, it's like UNHCR.
Plenty of things to complain about
UNHCR. But every refugee camp I go to, also,
people have fucking water filters in tents and shit because of
UNHCR, which isn't nothing. It's a
(18:10):
damn sight more than nothing, and it's a damn sight
more than any of these longtermist motherfuckers are doing
for people who are, I don't know, displaced by war. Yeah.
And it's like, some of the things that they're doing —
these are very strange kind of attempts to
calculate and create markets for human life and human suffering, right?
Which you see a lot. Like, I've
(18:31):
worked in nonprofits, I've worked in disaster response, I've
seen some of these things on the ground, and
you see these bizarre fucking decisions being made
by someone in an office who has likely never been
on the ground in these situations, and it inevitably results —
within these big organizations like the Red Cross
(18:52):
and MSF, but also on a governmental level, right — in
people not having the autonomy to respond in a situation
to reduce human suffering, and instead being told
to do something which is supposedly evidence based, based on
someone who's looked at the wrong criteria and come to
the wrong conclusion hundreds of miles away. And it's
bureaucrats, right? And it's like, we've —
(19:12):
we've somehow managed to create, like, the absolute worst
possible nightmare system, where you have a bunch of
government bureaucrats, and then you also have a bunch of
sort of private — we're watching
a collision of different kinds of private sector
bureaucrats. Like, you have your sort of NGO
bureaucrats, and then, you know, you
have these billionaires who are also just fucking bureaucrats, and
(19:33):
all of them are just doing box ticking, and we
get, like, just the absolute worst nightmare fusion of horrible
bureaucracy and capitalism, which is a great way to run
programs to have people not die. And, like, so much
of this comes from — the whole, like, Freakonomics
thing, to me, strikes me as, like
you said, reading the Wikipedia article about a subject and then
(19:55):
trying to find a way you can apply a
market to it, and then positing that as a solution.
The episodes we're dropping on Bastards — well,
the week before this episode will air — are about, like,
why the rent is so damn high. And one of
the complaints I have is that there's a specific class
of media people for whom the only answer they will accept
is because, uh, there's not enough multi family zoning, which
(20:16):
is just a part of why the rent is so
damn high. And reducing it to just that ignores, um,
the price fixing software that, uh, landlords use on
tens of millions of Americans, um. It ignores shit like Airbnb. It
ignores, like, the fucking problems in the construction industry, the
lingering effects of the two thousand eight crash. It's very frustrating,
(20:37):
and these kind of, like, Freakonomics guys like
to do the same thing. Like, the fucking Freakonomics dude
in particular — one of the things he got famous for
is being like, uh, you know, the dropping crime in
the nineties, this unprecedented fall in crime, was due to abortion.
Which zero — I will say again, zero — people who are
experts on the topic of crime in America agree with.
(20:58):
What they will say is, actually, there's a shitload
of different things that contributed to the declining crime, and
there's a good chance that abortion had an impact. A
bigger impact was probably getting the lead out — like,
reducing environmental lead — although that gets overstated too. There's all
sorts of different shit, including, like, air conditioning. Just the
fact that, like, yeah, now more people have air conditioning —
and guess when violence is highest? In the summer, when
(21:19):
people are stuck around each other outside. And, like, all
sorts of — computer games, because people have something else to do.
But if you're gonna be
doing the kind of, like — if you're gonna be doing
TED talk, fucking, uh, public works philosophy, then it helps
to just be able to, like, make one big Malcolm
Gladwell style fucking reveal. Anyway, that's how all these people
(21:42):
exist and how all of their morality is informed. After
two thousand thirteen, Friedman is kind of, like, followed up
by this guy named William MacAskill, who is currently —
he's a Scottish philosopher, um, which, God, it's easy to
get called a philosopher these days. Um. And he
is a personal friend of Elon Musk. When Musk's
(22:04):
text messages got released as part of that court filing,
some of them were with MacAskill, um, who was considering,
like, putting a bunch of money into buying Twitter. They
ultimately decided not to, I think because — it
seems like MacAskill just didn't trust that Musk had
any sort of plan. So he is, I will say this,
not an idiot, um, but he's wrong in ways that
(22:25):
are deeply fucked up. And he wrote a book
that is currently a bestseller — it was published in August —
called What We Owe the Future. And the gist of
this is that, like, it's merging this kind of effective
altruism with what's called longtermism, which is this argument
that morally we have to consider the impact of our
actions not just on people alive today but on
(22:47):
future people. Which is fine — there's actually a lot to
that idea — but the way it always works out is,
we can't pay attention to problems that people are suffering now,
we have to work on saving the
world from these bigger problems. Um, and again, it's
almost exclusively used as an argument by guys like
Musk: like, well, we shouldn't tax billionaires out of
(23:09):
existence, because I, you know, I see with
clarity the problems that we face, and the long term
solution is for me to be able to push for
these specific things that I think are the only way
to save humanity. Right. I'm getting ahead of myself a
little bit here. Let's talk about MacAskill again. Um, when
he was at Oxford — he's an Oxford boy, James.
Yeah. Uh, he started
(23:33):
a group called Giving What We Can in two thousand nine, uh,
and members were supposed to give away ten percent of
what they earned to the most cost effective charities possible —
which is fine, there's nothing wrong with that idea, basically —
and it was, like, supposed to be basically a
lifelong promise. Because, you know, you
assume Oxford people, a lot of them are gonna wind
up making very good money. You know, as we move
(23:54):
into our careers, this will be a more and more
influential kind of giving. Um — but yeah, put me on the board.
If they'd had me there, yeah, those meetings might have
gone a little bit different. Yeah. Over time, though, he's
kind of moved into — he's merged this — and again,
(24:16):
the whole effective altruism movement — a lot of it does
start reasonably, with people being like, are these charities we're
donating to working? How can we make sure they're effective?
Like, what can we do to make giving, um, work better?
Which is, again, perfectly fine, but very quickly gets married
to this kind of longtermist thinking, um, and they
(24:36):
focus, instead of stuff like, for example, funding hospitals, on stuff
like preventing an artificial intelligence from killing everybody, or, like,
sending people to distant planets — which are, like, cool and
sci fi and everything, but also deeply unrealistic. I'll say
it right now: our threat is not that an
AI kills us all. There's certainly a threat that different
kinds of artificial intelligences are used by authoritarians to make
(24:59):
life worse for everybody. But, by the way, Peter Thiel
is a big backer of effective altruism. He's one
of the people building that fucking AI. This is the
guy who wrote that thing about earning to give, right?
Like — he was like — this is the guy who
did that. Yeah. Okay, I'm familiar. He promised to never take
more than thirty one thousand dollars or something in income
(25:20):
over the course of a year in his life, and
give the rest to charity. He gives all his book profits to charity,
but he also runs an organization that is spending more
and more on keeping its people comfortable, because I guess
he doesn't have the money personally to spend. Anyway, I
think there's some sketchy shit there. Yeah. This whole idea —
and I'm sure we're gonna get to it today, right — like, it
(25:41):
completely overlooks our obligation, morally, to agitate for structural change,
right? Like, it says, like, if you can become a
billionaire through whatever bullshit, evil, fucking exploitative grift you can,
and then give ninety percent of that away, you're still perpetuating
a system in which one grifter gets rich and thousands
of people die without fucking clean water. But that's okay,
(26:04):
because you also donated some water filters or whatever. Like —
and it's not okay. And it makes me very angry, actually. Yeah, yeah,
it makes me angry too. And it's one of those
things — if you look at, like, here's all the charities
that MacAskill and his organization are putting hundreds of
millions of dollars into, they're not all bad. A lot
of them are good, and I'm glad that money is
going there. But there's always this strain of deeply unsettling
(26:26):
logic running through it. Now, I want to quote from
a Time article that I think, in
a very subtle way, kind of has this guy's number. When I
start thinking, in practice, if you've got
some things that look robustly good in both the short
and the long term, that definitely makes you feel a
lot better about something that is only good from a
very long term perspective, he says. This year, for example,
he personally donated to the Lead Exposure Elimination Project, which
(26:48):
aims to end childhood lead exposure, and the Atlas Fellowship,
which supports talented high school students around the world to
work on pressing problems. Not all issues are equally tractable,
but MacAskill still cares about a range. When we met
in Oxford, he expressed concern for the ongoing political crisis
in Sri Lanka, though admitted he probably wouldn't tweet about it.
The answer, he believes, is to be honest about it.
In philanthropy, big donors typically choose causes based on their
(27:11):
personal passions, an ultra subjectivist approach, MacAskill says, where everything
is seemingly justifiable on the basis of doing some good.
He doesn't think that's tenable. If you can save someone
from drowning or ten people from dying in a burning building,
what should you do, he proposes. It is not a
morally appropriate response to say, well, I'm particularly passionate about drowning,
so I'm going to save one person from drowning rather
(27:31):
than the ten people from burning. And that's exactly the
situation we find ourselves in. And, like — no, it is not.
That is nonsense, because, among other things, if you're a
random person, uh, and you have a choice between saving
someone from drowning or ten people from dying in a
burning building — well, you actually probably don't, because saving people
from drowning is a really difficult technical skill, which is
(27:52):
why people usually die when they try to rescue other
folks who are drowning. Yeah — the guy, the creator of
Yu-Gi-Oh!, died from drowning. It's really hard and dangerous, and
so is rescuing people from a burning building, which
is why we have firefighters. And guess what? A lot
of firefighters may not be very good at saving people
from drowning, because they have not trained for that. There
(28:13):
are different skills — these are both problems, but they're
different skills. But what if you instead spent that time
buying some Tesla stocks, and then sold them, and instead
invested in, uh, I don't know, finding something that
stops water from drowning people? Like, none of the
problems we have are — none of the problems — I'm
going to say right now, zero percent of the problems
(28:35):
we have, are the result of some sort of, like,
lifeguard firefighter standing in between a burning building and, like,
a yacht race gone wrong, and going, oh God, no.
It's like — he's doing the trolley problem. Like,
he's just trying to do the trolley problem.
(29:02):
It's funny that you talked about Sri Lanka, too, because
this is the perfect example. This is the
perfect example of a political crisis that is, like, completely
intractable to all of these — like, no, no, no,
none of these people donating to charities can, like, do
literally anything about that. Because, you know, like,
the crisis in Sri Lanka is
both — it is, like, it is, but
(29:23):
it is both a sort of short term crisis of,
like, you know, utterly horrific, genocidal political elites,
and then also a sort of long term crisis about,
like, the sort of structural position of specific countries
in sort of the global colonial system. This is not
something any of these people can solve. The
only way any of these people could
solve this is if the people of Sri Lanka, like, just
(29:44):
expropriated them. But, you know — because
these people — because Sri Lankans do not
have access to this guy and, like, six guns, right,
there's no way — you know, he can just
sort of sit there in his chair going, well, it's
a crisis. Am I gonna tweet about it? I'm not
gonna tweet about it. Yeah — he simply
(30:05):
talked to newspapers about not tweeting. What I would
say is, like, here's the actual solution to the
stupid problem this guy came up with. Well, if we
were to tax all of the billionaires to the point
that they weren't billionaires, and then put that into a
massive new, like, works progress fund that, instead of, like,
just building national parks, provided, like, rental assistance to millions
(30:27):
of Americans in exchange for them learning how to fight
fires and getting basic lifesaving care training and
getting trained in things, um, like that, so that they
could deal with the consequences of climate change and be
able to protect their communities effectively, and be incentivized to
gain the actual technical skills that would allow them to
protect people — well, then you would have more people capable
of saving someone from a burning building or from drowning.
(30:50):
Um, but anyway, whatever — that's my
pie in the sky leftist solution to that: use
funds taken from the rich in order to incentivize people to
gain the skills that will allow them to protect their
communities in the event of disasters. Um, anyway, whatever. Uh, so,
(31:10):
over the last decade, all of this thinking has increasingly
given way from a wonky theory on charitable giving by bighearted,
guilt ridden millennial kids — and that's how this guy
is always framed in articles, MacAskill. In fact,
I'm gonna fucking — I'm gonna scroll down here to my
notes and I'm gonna find the section of the article
to, like, show you the way he gets fucking talked
about in all of these. Quote: Thirteen years ago, William
(31:31):
MacAskill found himself standing in the aisle of a grocery
store, agonizing over which breakfast cereal to buy. If he
switched to a cheaper brand for a year, could he
put aside enough money to save someone's life? Like, that's
the — yeah, that's sort of the life you have, where
your engagement with global poverty is in the fucking Cheerios aisle. Exactly, exactly.
(31:53):
And then, yeah — a Waitrose in Oxford, I'm sure. Like, no — fuck, sorry,
I'm so fucking angry at this. And it's clearly,
very clearly — I can see that this is going towards
an excuse for incredibly wealthy people paying fuck all in
taxes, because they claim that it's not an efficient way
to do things, and they completely ignore all these structural
things which have to exist for their effective altruism to
(32:15):
occur in the first place, right? Yeah. It's, um — anyway,
this has effectively, like, over the years, given way from,
again, kind of this wonky theory by guilty millennial
kids to this pop philosophy for the fintech set, because
that's how these guilt ridden millennial kids wound up making
a bunch of money. Um. And yeah, that Time article
gives like, I just want to read another quote from
(32:38):
it about one of the other guys who's involved in
putting a lot of money into McCaskill's organization. Quote. Mr
Mr Bankman Freed makes his donations through the ft X Foundation,
which is given away a hundred and forty million, of
which ninety million has gone through the group's future fund
towards long term causes. Mr McCaskill and Mr Bankman Fried's
relationship is an important piece and understanding the community's evolution
(32:59):
in recent years. The two men first met in
two thousand and twelve, when Mr. Bankman-Fried was a
student at MIT with an interest in utilitarian
philosophy. Over lunch, Mr. Bankman-Fried said that he was
interested in working on issues related to animal welfare. Mr. MacAskill suggested
he might do more good by entering a high earning
field and donating money to the cause than by working
for it directly. Mr. Bankman-Fried contacted the Humane League
(33:21):
and other charities, asking if they would prefer his time
or donations based on his expected earnings if he went
to work in tech or finance. They opted for the money,
and he embarked on a remunerative career, eventually founding the
cryptocurrency exchange. First off, that guy absolutely did not call
any charities. Um — sorry, this was
(33:43):
from the Forbes article I used, not the Time article. Um,
first off, I don't believe that he did, but if he
did, it was something like, hey, I don't have any
skills or training, do you want money or do you
want me to volunteer? And they were like, who even is this kid? Like,
we don't need another asshole wandering around here
trying to touch the cats. Um, send us a check. Yeah.
(34:04):
And so instead of, I don't know, getting trained as
a vet tech or something where he would actually be
able to help animals, he founded a cryptocurrency exchange and
contributed to the burning of massive amounts of carbon that
will contribute to mass deforestation and the deaths of animals
around the world. That's good. I think that there's another
aspect of this which I think is sort of under explored,
which is that utilitarianism is genuinely one of the greatest
(34:26):
evils humanity has ever created. Every bad decision anyone
has ever made — if you look behind it, you can
find a utilitarian. Like, it's the basis
of all economics. It's horrible. Everything in the world —
it is an engine that allows rich people to feel
good about hurting poor people. That's what it is.
(34:46):
And that's what I think this all makes clear.
So the actual rhetoric from these people —
especially if you're just kind of encountering it out in
the wild, it's hard to argue with a lot of
the time, because they'll be like, well, look, we need
to look at what's going to help the most people,
and that's why we're, you know, setting up — none of
this matters if we don't deal with this problem or
that problem. And it's tailor made to sound profound
(35:07):
again, in, like, a TED talk or the website
for some charitable giving organization aimed at getting you to,
like, put ten percent of your income to longtermist
causes. But again, the fucked up shit crusts kind
of around the edges, for the most part, in lines
like these from a Time profile on MacAskill: The
first public protest against African American slavery was the sixteen
eighty eight Germantown Quaker petition. Slavery was only — yeah,
(35:30):
slavery was only abolished in the British Empire in eighteen
thirty three, decades later in the US, and not until nineteen
sixty two in Saudi Arabia. History encourages MacAskill to favor
gradual progress over revolution. Abolition, he says, is maybe
the single best moral change ever. It's certainly up there
with feminism, and they're extremely incremental. They don't seem that
way because we enormously shrink the past. But it's almost
(35:50):
three hundred years we're talking about. Um. That wasn't the
result of incremental change. It was the result of a battle against the
people who owned slaves, who fought viciously against any attempts to end slavery. Like, yeah,
it was a battle. It was,
in fact, a series of revolutions in a
lot of cases, including, like, the Haitian Revolution and guys
like John Brown. There was the shit in Bleeding Kansas. There
(36:13):
were a shitload of people who died fighting in order to
end slavery. Like, yeah, there's a civil war, dude — what
do you call that? That's not incremental. A million people
shot each other to death, you know. And
as far as we can talk about sort of incremental
progress, it's stuff like, okay, so
the slaves in Haiti freed themselves by means of revolution
and then sent a bunch of guns and weapons to
(36:34):
people in Latin America so that their armies could march
through Latin America and end slavery. Like, many revolutions had to
occur to end slavery, because it was a powerful system
at the center of global capital that a lot of
entrenched and heavily armed interests were willing to die to maintain.
Which also is fun, because I bet — I
(36:56):
bet if you look at those people's supply chains — and this
is almost certainly true of Elon Musk's supply chain, like,
I mean, okay, Musk's supply chains are in China, and you can
have some kind of debate as to whether the kinds
of forced labor you're going to be encountering are slavery —
like, I bet if you look at the
people who are effective altruists, you can find slavery in
their supply chains. And their arguments will be like, well,
(37:19):
I can't end slavery in my supply chain. Because, I
guarantee it, they're in the tech industry, and, like, nobody
has a laptop or a smartphone without the use of
rare earth minerals that are, like, acquired via slavery.
It's the same thing if you're wearing clothes — you have
something that slavery was involved in, because the garment industry,
slavery is literally inextricable from it. Like, the company that
(37:42):
has tried the hardest to remove slavery from
their production line, Patagonia, um, still continually finds it. Like, oh no —
they're smart, yeah, they're pretty good, I'm not going after them — but yeah,
they put a load of money into that shit and
they still find it. It is hard. Um. Anyway, um, I'm going
to read another fun quote from the Forbes article: Mr.
(38:04):
Bankman-Fried said he expected to give away the bulk
of his fortune in the next ten to twenty years. If
you're worried about existential risks of a really bad pandemic,
you sort of can't stall on that, Mr. Bankman-Fried
said in an interview. That is how his text messages
popped up among hundreds of others sent to Mr. Musk.
Mr. Bankman-Fried ultimately did not join Mr. Musk's bid.
I don't know exactly what Elon's goals are going to
be with Twitter, Mr. Bankman-Fried said in an interview.
(38:25):
There was a little bit of ambiguity there. He had
his hands full in the months that followed, as cryptocurrency
prices crashed. The Twitter deal has been volatile in its
own way, with Mr. Musk trying to back out before
recently announcing his intention to follow through after all.
In August, Mr. Musk retweeted Mr. MacAskill's book announcement to
his hundred and eight million followers with the observation: worth
reading, this is a close match to my philosophy. So
(38:50):
that's kind of the surface of where we are now.
Um, it does not — it doesn't quite get at all
of the things that are deeply fucked up, and for
that I wanted to quote from another article, um, I
found on Aeon. It's an
essay by — uh, God, let me get the author here,
because it's quite good — about longtermism. It's
(39:11):
an essay called Against Longtermism, by Émile P. Torres,
a PhD candidate at a university in Hannover in Germany,
uh, Leibniz Universität — I don't know, I feel silly every
time I try to say German, so I'm not going
to try that hard. But the article is very good,
um, and it kind of gets at how this effective
altruism movement has merged with longtermism in
(39:35):
a way that specifically exists to buoy the interests of
wealthy authoritarians around the world. Quote: This has roots in
the work of Nick Bostrom, who founded the grandiosely named
Future of Humanity Institute, FHI, in two thousand five,
and Nick Beckstead, a research associate at FHI and a program
officer at Open Philanthropy. It has been defended most publicly
(39:56):
by the FHI philosopher Toby Ord, the author of The
Precipice: Existential Risk and the Future of Humanity. Longtermism
is the primary research focus of both the Global Priorities
Institute, an FHI linked organization directed by
Hilary Greaves, and the Forethought Foundation, run by William MacAskill,
who also holds positions at FHI and GPI,
adding to the tangle of titles, names, institutes, and acronyms.
(40:18):
Longtermism is one of the main cause areas of
the so called effective altruism movement, which was introduced by
Ord in around two thousand eleven and now boasts
of having a mind boggling forty six billion dollars in
committed funding. It is difficult to overstate how influential
longtermism has become. Karl Marx in eighteen forty five declared
that the point of philosophy isn't merely to interpret the
world but to change it, and this is exactly what longtermists
(40:40):
have been doing, with extraordinary success. Consider that Elon Musk,
who has cited and endorsed Bostrom's work, has donated one
point five million dollars to FHI through its sister organization,
the even more grandiosely named Future of Life Institute. This was
co founded by the multimillionaire tech entrepreneur Jaan Tallinn, who,
as I recently noted, doesn't believe that climate change poses
an existential threat to humanity, because of his adherence to
(41:02):
the longtermist ideology. Meanwhile, the billionaire libertarian and Donald
Trump supporter Peter Thiel, who once gave the keynote address
at an Effective Altruism conference, has donated large sums of
money to the Machine Intelligence Research Institute, whose mission
to save humanity from superintelligent machines is
deeply intertwined with longtermist values. Other organizations, such as
GPI and the Forethought Foundation, are funding
(41:24):
essay contests and scholarships in an effort to draw young
people into the community, while it's an open secret that
the Washington, D.C. based Center for Security
and Emerging Technology, CSET, aims to place
longtermists within high level US government positions to shape national policy.
In fact, CSET was established by Jason Matheny, a
former research assistant at FHI who is now the
deputy assistant to US President Joe Biden for technology and
(41:46):
national security. Ord himself has, astonishingly for a philosopher, advised
the World Health Organization, the World Bank, the World Economic Forum,
the US National Intelligence Council, the UK Prime Minister's Office,
Cabinet Office, and Government Office for Science, and he recently
contributed to a report from the Secretary General of the United
Nations that specifically mentions longtermism. The short
answer is that elevating the fulfillment of humanity's supposed potential
(42:08):
above all else could nontrivially increase the probability that
actual people, those alive today and in the near future, suffer
extreme harms, even death. Consider: as I noted elsewhere, the
longtermist ideology inclines its adherents to take an insouciant
attitude towards climate change. Why? Because even if climate change
causes island nations to disappear, triggers mass migrations, and kills
millions of people, it probably isn't going to compromise our
(42:29):
long term potential over the coming trillions of years. If
one takes a cosmic view of the situation, even a
climate catastrophe that cuts the human population by seventy five percent for
the next two millennia will, in the grand scheme of things,
be nothing more than a small blip — the equivalent of
a ninety year old man having stubbed his toe when
he was two. So this is evil, right? Like, this
is vicious and vile and cruel. And
(42:52):
it's one of those things — there's a book that I've
talked about on the show a couple of times, um,
that is quite popular, called The Ministry for the Future, um,
and it's a very good book. And, like,
the basic premise of it is that
climate change is addressed, finally, and the worst aspects of
it are dealt with and, like, begin to be
repaired, because of the establishment of an organization called the
(43:14):
Ministry for the Future. It's an international organization that exists to,
like, look out for the interests of unborn people and
animals and plant species. And part of how they do
this is by murdering billionaires in their beds, uh, and
blowing up planes to end international air travel. Which is —
so there's a version — like, again, the idea that, like,
we should be thinking about people and living creatures
(43:37):
who have not yet been born is reasonable, and the
reasonable conclusion of that is: and so we should deal
with things like climate change and stop, like, thoughtlessly degrading
our environment, so that people in the future will be
able to live a quality life. Um. The argument that
these longtermists are making is, no, that's foolish, because
in a trillion years none of it will matter. And
(43:58):
I intend to be alive in a trillion years, because I
will be an immortal machine man billionaire forever. You know,
it's — these people. These people — like, you think about this:
if you believe this, literally the only thing
that you should spend your time doing is trying to
dismantle every single nuclear weapon on the planet. Like,
you should be forming your own private armies to,
(44:19):
like, storm military bases to destroy nukes. And none of
them will ever fucking do this. All these people will
back candidates who, like, want to have more nuclear weapons.
All these people will back candidates who, like — like,
you know, I wonder how many of these people personally supported
dropping a nuke in the middle of Iraq in
two thousand four. Like, God. Yeah, anyway, this is probably —
(44:41):
that's probably enough. I wanted to — at some point, I
think we will be doing a more detailed look into
some of these people.
Maybe it's a Bastards episode, but this is
just getting more relevant. And I wanted to give people —
I wanted to connect them with some, like, some
resources, particularly that article on Aeon about the
(45:03):
dangers of longtermism. And, uh, yeah, anyway, be advised:
this is what the fucking assholes — who have spent, like —
think about how many cool things the tech industry has
actually made in the last decade. It's not many, right? Like,
it's mostly been vaporware. Like, most of the different big
apps and stuff are all in the process of
(45:24):
collapsing right now. That's why the industry is falling apart
as we record this. In the metaverse! Yeah, that's right,
that's right. The legs. It's like you're sitting right next
to me, James, except you have no legs, and
your mouth is open in an endless, wordless scream. Um. Finally — anyway,
(45:44):
that's what these assholes want to do — what they've done
to the Internet, sucking the vibrancy and the life and,
like, the freedom out of this incredible creation, and
turning it into, uh, an engine for sucking your personal
data out and marketing things to you and making you
angry all the time as much as possible, and convincing
your parents and grandparents that fucking Joe Biden has been
(46:07):
replaced by a lizard man. Um, like, the people who
did that, uh, now think that we can't take care
of people today, because that would distract from our mission
to take care of people who have never been born,
a trillion years from now. Um, anyway — fuck them. It
(46:28):
Could Happen Here is a production of Cool Zone Media.
For more podcasts from Cool Zone Media, visit our website
coolzonemedia.com, or check us out on
the iHeartRadio app, Apple Podcasts, or wherever you
listen to podcasts. You can find sources for It Could
Happen Here, updated monthly, at coolzonemedia.com/sources.
Thanks for listening.