
July 11, 2025 · 45 mins

Acclaimed author Michael Lewis discusses his time with Sam Bankman-Fried and why he thinks both high finance and Effective Altruism shaped the 'Crypto King's' worldview, ultimately landing him in jail. Plus, we hear about the people fighting terrorism, cave-ins and brain-eating amoeba from Michael's new book 'Who Is Government?'.


For a full list of sources, see the show notes at timharford.com.

Get ad-free episodes, plus an exclusive monthly bonus episode, of Cautionary Tales by subscribing to Pushkin+ on Apple Podcasts or Pushkin.fm. Pushkin+ subscribers can access ad-free episodes, full audiobooks, exclusive binges, and bonus content for all Pushkin shows.

Subscribe on Apple: apple.co/pushkin
Subscribe on Pushkin: pushkin.fm/plus

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin. On the tenth of November twenty twenty two, Sam Bankman-Fried tweeted that he had effed up. As we explored in our last episode, that was something of an understatement. FTX, the digital currency exchange platform where individuals could trade cryptocurrencies

(00:38):
and store their funds for safekeeping, imploded when it emerged that Sam Bankman-Fried had been diverting funds into his crypto hedge fund, Alameda Research. Fraud? Yes, but as an effective altruist, he was doing it for a good cause, wasn't he? In his own misguided way, he thought he

(00:59):
was just following the teachings of Will MacAskill, the founder of the effective altruism movement, and that can't be bad, can it? If you've not listened to that episode yet, please do before listening to this special Cautionary Conversation with Michael Lewis, the man who had a ringside seat for the spectacular fall of the cryptocurrency wunderkind. Michael, welcome to

(01:23):
Cautionary Tales.

Speaker 2 (01:24):
Tim. Good to see you again.

Speaker 1 (01:26):
It's great to see you. I think the last time
we saw each other, I was trying to teach you
how to play an obscure German board game.

Speaker 2 (01:32):
That's exactly right. I can't even remember what book led you to get interested. Maybe The Big Short. I think it was The Big Short. But the FT were like, oh no, you can't just talk to Michael Lewis about his book. You have to talk to Michael Lewis about his book while you teach him a weird German board game. I thought it was actually ingenious, because I try to do this with subjects, and especially if you only have

(01:53):
a short time with them. You're so much better off doing something with someone than just talking to them, if you're trying to kind of get some insight into how they tick. The best job interview I ever had was to lead teenage girls through Europe when I was twenty-two years old. I went to the tour agency, and the guy who ran the tour agency said, God, I forgot you had an interview. This day we're supposed

(02:14):
to move the furniture from this office down the hall to the other office. Could you just help me do that,
and then if we have time left over, we'll talk.
And I spent an hour moving furniture with this guy
and I left. He said, I'll call you about the
interview later and I left bewildered. Yeah, like a week
later he said, you have the job. And two months
later I'm in a hotel room with my fellow leader
in like Bruges, and I said, you know, this was odd.

(02:35):
I was never interviewed, and I explained what happened. He said,
I moved that furniture too back the other way, and
it was a really smart way to kind of figure
out whether someone was collaborative, whether they could pick up
stuff quickly, how they interacted with people. So I took
it as a sign you were a clever journalist when
you did this.

Speaker 1 (02:51):
Yeah, well, or I have a clever editor. But yeah, Michael,
I'm already feeling bad that I haven't brought a board
game this time. But I'm sure we'll be fine. I'm
sure we'll be fine. We are going to talk about
the time you spent with Sam Bankman-Fried, what you made of him, what you made of the influence that effective altruism had on him. And we're also going to
be answering listener questions, and we'll be talking about your

(03:14):
new book Who Is Government. Before all of that, here is the Cautionary Tales sting. The most remarkable thing about

(03:46):
Going Infinite is that when you started working on it,
you had no idea that you were going to be
covering the fraud of the century. You were just intrigued
by Sam Bankman-Fried. And this book turned into a
very different book during the course of writing it. And
it feels like this is not the first time you've
been in the right place at the right time. I'm

(04:07):
curious how it is that you find your subjects.

Speaker 2 (04:11):
You know, I stumbled into this. I had no idea
that there was like fraud going on. So I was
asked to evaluate him by an investor who was doing
a deal with him, and it was a friend, and
I said sure, I had no idea who he was.
He turns up on my doorstep in Berkeley, and we
spent a couple of hours together, and in a couple
of hours I realized I had this character who was

(04:31):
one, he thought about the world in a very unusual way, in a very persuasive way. But two, he would possibly give me access to several places I wanted access. One
was Jane Street in the world of high frequency trading,
because he'd spent three years at Jane Street and he'd
hired a bunch of people out of there, and they're
notoriously opaque and won't talk to journalists. So I thought, ah, maybe I get to Jane Street through him. Money in politics,

(04:54):
because he was handing out money left, right and center to American politicians. And the crypto world, which he was very skeptical of, so he wasn't a religionist, so he wasn't defensive about it. So I remember just saying to him, I don't know where this is going to end up. Could I just hang out and watch? You know, he was the world's richest person under the age of thirty according to Forbes magazine. It wasn't a big brain wave on my part. It was kind of surprising he
wave on my part. It was kind of surprising he

(05:16):
let me do it, especially in view of what was
going to happen. So I just followed my nose. Yeah,
And my nose isn't like, oh, what's going to be
the next big story. My nose is more what's not dull,
and sometimes that works out, you know, sometimes that ends
up being what you should be paying attention to. So
as you.

Speaker 1 (05:35):
hung out with Sam Bankman-Fried more, you must have become aware of this whole effective altruism thing. It seemed to play a huge role in his life and how he thought.

Speaker 2 (05:46):
You're exactly right. It kind of blew my mind. I'd
never heard of this, the idea that he had gone
to Wall Street with absolutely no interest in money, and
his parents were both law professors at Stanford. They're like non-materialist people. They don't care about stuff, they
don't care about money. You know, in another generation, he'd
have been like a high school physics teacher or maybe

(06:08):
a college professor. He was not the social type who
goes to Wall Street. And he'd forced himself into it
because he had discovered, before he'd discovered Wall Street, this
movement called effective altruism that just resonated with him.

Speaker 1 (06:20):
Yeah, he was seduced into it by the philosophers.

Speaker 2 (06:24):
And they proselytize, and actually Will MacAskill told me, he says, you know, it turned out that the ideal market for these ideas was nerdy math-science people at American universities, people who were attracted to this kind of quantitative solution of how to lead your life: maximize the number of lives you save.

Speaker 1 (06:42):
I mean, just as an aside, I've been writing a column about the Trump administration's decision to cut foreign aid. And it's slightly controversial exactly how much foreign aid they've cut, exactly what they're cutting, and there are conflicting statements, but plausibly, sensible independent analysts reckon that decision is going to kill a million people a year. Yep. And it's mostly people

(07:04):
with HIV, although there are a few other things too: some vaccinations, tuberculosis. But if you have HIV, you have very effective medication. As long as you take the medication, you'll be fine. The moment you stop taking the medication, the virus starts to come back, and pretty soon you're going to die. And I'm writing about this, and somehow I can't quite get as outraged about those million people as

(07:28):
if they were being rounded up and executed for being illegal immigrants, or if they'd started some war that was likely to kill a million people. It just feels different somehow. That is dead right.

Speaker 2 (07:38):
Part of the reason it feels different is it's a number, right? Especially the bigger the number, it just becomes a number. You would feel much more enraged if you saw just a single child die.

Speaker 1 (07:48):
Yeah.

Speaker 2 (07:48):
We don't respond to the data. We respond to anecdote.

Speaker 1 (07:52):
Yeah, unless you're Sam Bankman-Fried. Unless you're Sam Bankman-Fried, who does respond to the data. You know, there's a jumble of my previous books

Speaker 2 (07:59):
that are sort of popping into my head when he starts talking this way. One is Moneyball: the Oakland A's succeeded because they ignored the anecdote and looked to the data. And two was The Undoing Project. Yeah, Kahneman and Tversky were all about the way our minds mislead us when we trust our intuitive sense of things rather than the data.
It was interesting that this was wholly persuasive to this

(08:24):
kind of person, and not just Sam Bankman-Fried, but there was a whole crowd. I mean, it was cult-like, as MacAskill said. They tend to be kind of on the spectrum. They tended to be male, they tend to be math or physics. They tend to be socially awkward. These were people for whom emotion was a weak
guide to action. Whatever you or I feel that leads

(08:47):
us to do things, they didn't feel so much, but
they could talk themselves into using logic and numbers as
a guide to action in the way that emotion is
a guide to action for us.

Speaker 1 (08:59):
Are the numbers a better guide to action than emotions?

Speaker 2 (09:02):
Depends on what you're doing. Yeah, I mean, let's go back to the Moneyball case. Most of the time in baseball, you are way better off being guided by the statistical information than by your nose. Every now and then that's not true. But most of the time in life you don't really have the numbers. Like, how are you going to decide who you're gonna marry, or where you're gonna live,

(09:22):
or even where you're gonna go to college, or even
where you're gonna have dinner tonight. How do you reduce that to a statistical problem? We are kind of forced by the dearth of statistical information to constantly rely on our intuitive judgment, and that leaves us kind of sleepy to the possibility that there is some statistical information that could inform our judgment. When there is, though, yes, I think it is better. Kahneman and Tversky

(09:44):
showed, it's just amazing how, even when you're looking at people we would all think of as experts, doctors, algorithms outperform the experts when rendering diagnoses. That kind of thing.

Speaker 1 (09:56):
One of the things that struck me exploring the story for Cautionary Tales was when effective altruism took what seems to be a little bit of a weird turn. So on the one hand, you've got the people who are like, okay, I've got one thousand dollars and I want to save the maximum number of lives. They crunch the numbers and they go, yeah, it's probably bed nets impregnated with anti-mosquito insecticides. If you give it,

(10:19):
you give out a lot of bed nets, you could
save a life for whatever one thousand dollars, three thousand dollars.
So it's cheap. It's cheap, and that's better than maybe
spending the money on vaccinations, even though vaccinations are good
and vaccinations are better than spending the money on work
to improve governance. And improving governance is better than giving it to a donkey sanctuary, because all of the donkeys

(10:40):
are living in luxury. They're all fine. So that all makes sense up to that point. You go, the effective altruists are onto it. This is great, I'm with them. I'm with them too, so I feel the same way. I'm with them too.

Speaker 2 (10:50):
They just make me at that point, they just make
me feel guilty for how I'm living my life.

Speaker 1 (10:54):
And then they start having these conversations like, well, you know,
there could be a lot of people in the world
in the future. The future could go on for a
very long time. We could be talking about trillions of people.
We could be talking about intergalactic civilizations or human civilization
lasting a million years. But maybe AI ruins all that.

(11:15):
So maybe if we have a workshop about AI safety,
maybe there's only a one in a billion chance that
it will do anything. But if it does something, it
could save a trillion lives. So a one-in-a-billion chance of saving a trillion lives, that's a thousand lives, right? That's really good, and so it's totally worth spending fifty
thousand dollars on our AI workshop. At which point you

(11:35):
start to think, where did it go wrong? Or are they right?
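The expected-value arithmetic Tim is sketching here is easy to check. A minimal sketch, using only the hypothetical figures quoted in this exchange (a one-in-a-billion chance of mattering, a trillion lives at stake, a fifty-thousand-dollar workshop, roughly one life per bed-net donation of a few thousand dollars); none of these are real estimates of anything:

```python
# Illustrative only: these are the hypothetical numbers quoted in the
# conversation, not real estimates.

def expected_lives_saved(probability: float, lives_if_it_works: float) -> float:
    """Expected value: chance of success times the size of the payoff."""
    return probability * lives_if_it_works

# Bed nets: roughly one life saved per few thousand dollars, near-certainly.
bed_net_lives = expected_lives_saved(probability=1.0, lives_if_it_works=1.0)

# AI-safety workshop: a one-in-a-billion chance of mattering, but a
# trillion lives at stake if it does.
workshop_lives = expected_lives_saved(probability=1e-9, lives_if_it_works=1e12)

print(bed_net_lives)            # 1.0
print(workshop_lives)           # 1000.0 -- "that's a thousand lives"
print(50_000 / workshop_lives)  # 50.0 dollars per expected life saved
```

The payoff arithmetic is trivial; as Michael says next, the shaky part is the one-in-a-billion probability, which is a guess, and the whole result hangs on it.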

Speaker 2 (11:39):
Am I wrong? So I had exactly the same feeling. It jumped the shark around twenty twelve, and Toby Ord
writes this book about existential risk where he's putting numbers
on the likelihood of various events wiping out humanity, and
it's AI and pandemics and meteor strikes and nuclear war
and so on and so forth. Very hard to put

(11:59):
numbers on that, but of course you can't do the calculations without the numbers, so they just accept roughly good numbers, and it's just not at all clear to me the numbers are okay at all. The math becomes much shakier. That's one thing that happens. The numbers become much bigger. You're not just talking about saving lives in Africa right now, you're talking about all future humanity and infinite people. And
You're talking about all future humanity and infinite people. And

(12:23):
so the sums of money required to address these problems also become much, much bigger. One thousand dollars will be useful in Africa right now. One thousand dollars right now to prevent AI from eating us all one day is nothing. Sam Bankman-Fried sits down and does a back-of-the-envelope calculation. He thinks he needs at least, you know, one hundred billion dollars to make a dent in this

(12:44):
or that problem. So all of a sudden, it gets very grandiose. And even if they were never entirely interested in the emotional resonance of saving the life of a small child in Africa today, there was at least some emotional content there, and all of a sudden

(13:04):
it vanishes. It's purely an abstract math problem. And then,
of course, as you just said, what can't you justify
when the numbers are that big, the number of people
you're going to save that big, what can't you justify
doing to attack the problem? It seems perfectly rational to
do whatever you need to do to make one hundred
billion dollars to eliminate or at least defray this existential risk.

Speaker 1 (13:27):
Yeah, this is the end of the world. We're talking about the end of human civilization. Surely you'd be able to cut some corners at that point.

Speaker 2 (13:32):
So what you're doing, also, if you think about it,
is you're puffing yourself up. You're a superhero now. You're going to save humanity. You may only be increasing
the odds that humanity is saved by a few percentile,
but nevertheless you're in the realm of saving humanity. So
it creates a grandiosity to the whole thing.

Speaker 1 (13:52):
And well, as I understand it, the Sam Bankman-Frieds of the world have said it: if I'm giving myself a five percent chance of saving the world, then I'm five percent of Superman.

Speaker 2 (14:00):
Right there we go. So this is how he was
thinking when I first met him, and I thought it
was bonkers, but I thought it was interesting. It's not wrong,
you know, it's not obviously wrong. It's probably obviously right
that there are these extinction-level events that could happen. The question is, like, is it bonkers to think you could do much about them? There's something that goes on when

(14:22):
you're doing things that have such a remote probability of
having any effect. It's very different from buying a bed
net and saving a kid. The degree of uncertainty gets so high. And so when you look at what happened to the EA movement around Sam, they had their arguments about this, but the people who moved with Sam

(14:44):
all bought into you know, we're saving humanity, and they
left behind a collection of original EAs who sort of
disapproved and who continued to, you know, buy bed nets
for kids in Africa.

Speaker 1 (14:56):
Do you think it is fair to blame effective altruism for what Sam Bankman-Fried did, and you know, the fraud and him going to prison and all of that? To what extent can you trace it all back to

Speaker 2 (15:08):
Effective altruism. Can we assign fractional blame?

Speaker 1 (15:11):
I think that's totally in the spirit.

Speaker 2 (15:12):
Let's, in the spirit of Sam Bankman-Fried, do a little shares of blame. So I would say, the little sliver of blame that effective altruism gets is in how it was sold to the Sam Bankman-Frieds of the world. You are this nerd with a high sense of your own self-importance, but it hasn't been appreciated by the world, and you are coming of age in

(15:35):
a world where very weirdly, you can get rich very
quickly because your particular gifts are all of a sudden
in demand at Jane Street and Hudson River Trading and
Citadel and all these quantitative trading shops and also tech firms.
So all of a sudden, you're the most monetizable twenty-two-year-old on the planet, except maybe a professional athlete.

(15:56):
And if you unleash your money making skills for the
good of this movement, you're a superhero. They're given a
kind of license to behave in unorthodox ways.

Speaker 1 (16:08):
Maybe it never occurred to Will that somebody would go, oh, well, I guess I could commit a massive fraud, and as long as I save the planet, that's fine. I mean, possibly it never occurred to the philosophers that anybody would think to do that.

Speaker 2 (16:19):
It's possible. It's funny that you say that. I think that's possible. So I think if Sam Bankman-Fried is a creature of anything, effective altruism is really important, but so is Jane Street. The high-frequency trading world,
they live to game systems. You're in there to figure
out how to game markets, and people who are really
good at playing the game are the people who win.

(16:40):
And Sam was really good at playing the game. Obviously
you're not supposed to break the law, but it puts
you in a certain frame of mind when all you're
doing is gaming markets day after day after day, you're looking for little edges. And if I were to assign another sliver of blame, it's kind of to the grown-up world of finance. It was obvious to anybody who walked into FTX

(17:00):
that there was a basic conflict of interest. He had
this private trading firm that he'd started first, out of which grew this public crypto exchange, and his private trading firm traded on the crypto exchange and was in
a little hut next door to it in the Bahamas.
So some blame there, but of course also blame to Sam.

(17:21):
What is it in Sam that leads him to do this?
You know, he danced his way out of lots of
problems before. I think he thought he was smart, so
smart that he could kind of figure this out. And
in fairness to him, I think he was overwhelmed. If
you'd seen their business, your first reaction would be this
is pure chaos. There's no organization chart. No one knows what anyone's job actually is. The corporate psychiatrist is

(17:44):
the person who knows the most about the business because
he's the only one everybody talks to. You know, it's
like one thing like that after another. What it didn't feel like, and still doesn't feel like to me, and I get in trouble when I say things like this, is malice. What he wasn't and isn't is like this
natural crook. He doesn't have cruelty in him. Oddly, he

(18:04):
doesn't have a lot of dishonesty in him. He has some,
but up to the moment that it's discovered that he has the money in the wrong place, that he's used his customers' money to do all kinds of things he didn't have permission to do, there's not really a trace in Sam Bankman-Fried's life of criminal behavior or corrupt behavior. Didn't cheat on tests, didn't cheat at golf,

(18:25):
never played golf. You know, you go back to his childhood,
there's no sign of anything like this, and so it's
not like this is who he is. It makes it
even a weirder event.

Speaker 1 (18:34):
Do you think he was a kind person? Is he a kind person?

Speaker 2 (18:37):
Yes, I mean, you know, with qualifications. But he behaved very well towards the little people around him who helped him. He would get worked up and scream every now and then. But I think because he was on the receiving end of it a lot when he was a kid, I
think he was highly sensitive to suffering. I think suffering

(19:01):
bothered him. And he was also highly averse to conflict,
Like, he was really uncomfortable getting in a conflict. He'd go and hide rather than fight. Every now and then he'd get upset and everybody around him would get scared. But it was very private. It was like watching a little volcano go off. So yeah, I think he's basically a kind person.
And to this day, I'm still in touch with him.
He writes a prison diary and he sends it to me.

(19:24):
He's pretty careful about his interactions with other people. He's
in no conflict. People kind of like him because he
kind of watches out for them a bit. I mean, what he did was really wrong, but when you know this personality, it's even stranger.

Speaker 1 (19:37):
He did. Interesting, very interesting. Well, Michael, stay with us.
I'm going to give you questions from our loyal listeners,
particularly on the subjects of kindness and altruism, and we
shall hear them after the break. We're back. I'm talking

(20:00):
to Michael Lewis, the author of Going Infinite: The Rise and Fall of a New Tycoon, which tells the story of Sam Bankman-Fried, the world's most infamous effective altruist. And Michael, we've asked our listeners to send in their questions about altruism. Are you game for answering a few?

Speaker 2 (20:18):
Sure?

Speaker 1 (20:19):
So here's one from, I think, Nocum. Apologies if I've mispronounced your name. They write: as an admirer of Mr. Lewis's books since Liar's Poker, when I heard he was answering questions about altruism, I had to ask this.
According to the latest data I could find, the most
important cause for effective altruists, after poverty and health, is
AI risk. There are real challenges to the adoption of AI,

(20:43):
but to put it so high on the list of
causes to donate to seems misguided at best. Can you
explain how AI overtook such obvious harms as global warming
or the public benefits of the spread of free markets,
free speech, or democracy as the greatest good effective altruists can do? All the best, Nocum.

Speaker 2 (21:04):
This is one of those really great questions I probably can't answer satisfactorily,
but it is a great question. I think in the
minds of Sam Bankman-Fried and the people around him, they thought of these existential risks in two ways. How salient are they, how likely are they to actually cause problems? And how tractable are they, what can we actually do about this? And in Sam Bankman-Fried's

(21:28):
mind when he did that calculation, it wasn't AI that
bubbled to the surface. It was pandemics. He threw much
more money into pandemic prevention than AI. Even though he
thought AI was the greater risk. He couldn't figure out
what to do about it.

Speaker 1 (21:41):
Would this have been pre-COVID, when he was throwing money into pandemic prevention?

Speaker 2 (21:45):
Was pre COVID. When he was thinking about it and
when he started to have real money we were in
the thick of COVID. It was like, if a really
even more deadly one comes along, how can we be
better defending ourselves against it or detecting it and preventing
in the first place. That kind of thing the one
move he made in AI prevention other than funding people
with small sums of money who had interesting ideas, but

(22:06):
he bought a big chunk of Anthropic, which was the kind of Salon des Refusés for OpenAI, people who thought that OpenAI wasn't paying enough attention to the risks. And that, funnily enough, ended up being worth many, many
billions of dollars to his creditors. So why were they
so alive to AI as an existential risk? Because your

(22:28):
question is right, it's not obvious. It's not obvious to me.

Speaker 1 (22:31):
Let me flip it around and say, why are they
not worried about climate change as an existential risk? For example?

Speaker 2 (22:37):
The answer to that is they think that climate change,
no matter how bad it gets, there'll still be human beings alive.

Speaker 1 (22:41):
Yeah, and they're probably right. Yeah, I mean, it's not going to turn the Earth into Venus, right? It could make things really uncomfortable. We could really regret we didn't
do more earlier. But the idea that it ends the
human race seems unlikely.

Speaker 2 (22:54):
Yea.

Speaker 1 (22:54):
So they're looking for things that will actually, like, end it, absolutely end human civilization, and there, I guess, you're looking at nuclear weapons and AI as more plausible.

Speaker 2 (23:03):
Maybe that's right. AI has the capacity to do the job completely in a way even a pandemic wouldn't. I don't even think nuclear weapons are that plausible. So they could play out a scenario where there were no longer people because of AI.

Speaker 1 (23:16):
It throws a light on the difference between a long-term effective altruist and the rest of us. The rest of us would go, hey, if there was a nuclear war and ninety-nine percent of the human race was killed and the rest of us were reduced to the Stone Age and had to rebuild from scratch, that sounds incredibly bad. Whereas an effective altruist is like, well,
if you think really long term, yeah, and on a

(23:37):
two million year time horizon, that's right. It's not the
worst thing.

Speaker 2 (23:40):
It just shows you where you can go when you're
untethered from normal human feeling and convention. That's the real thing about the effective altruists: at bottom, it's a status movement. It's a group of people who
feel underappreciated, slightly ostracized, who know in some ways they're
smarter than their peers. Essentially, the starting point is everybody

(24:01):
else is stupid, they don't know, and it takes them
to weird places.

Speaker 1 (24:05):
Let me throw in a thought from listener Matt. I'm going to paraphrase, but it's relevant, I think, to what you were just saying, Michael. So Matt conjures up the idea that people who eat meat are in a way very good for pigs and chickens and cows, in

Speaker 2 (24:23):
the same way that people who hunt ducks are very good for ducks, because you've created the need for these
animals to exist. Yeah.

Speaker 1 (24:30):
You know, if a Martian came down and looked and said,
you know, what are the dominant species on Earth, then
there's a good case that some of the dominant species
are the species that humans eat. And the reason they're
dominant there are so many of them have taken over
all these different ecosystems is because that's what the humans want. Anyway,
he draws a parallel between that perspective and effective altruism.

(24:54):
Is it too harsh to think of effective altruists as
benign butchers ready to sacrifice real people today for the
sake of a notional gain to humankind over eons? So there's no end to the suffering you can tolerate today, as long as you're maximizing the quantity of humanity in the long run.

Speaker 2 (25:11):
I'd be willing to play this game with him, except the effective altruists themselves do not acknowledge they were willing to inflict suffering on people today. They just thought they were shifting their attention from people today to people
in the future. They were shifting their neglect, if you
want to think about it the other way. So they
weren't actively thinking of themselves as inflicting suffering on present humans,

(25:33):
although Sam did. Although Sam did, although the logic of
their argument would lead them to do it if necessary.

Speaker 1 (25:40):
Yes, interesting. A question from Victor. Victor asks, why is no one talking about why we need altruism in the first place? What are we doing wrong as societies that some people are left behind and there's a need for altruism? I suppose this gets to the idea that if you had a benign government, or if you had the right rules, or the invisible hand of Adam Smith. But if we

(26:03):
somehow organized society better, no one would have to be kind or altruistic, because all these problems would be solved.

Speaker 2 (26:08):
This is a fair question too. Without human sympathy, and let's not call it altruism, let's just call it kind of a basic sympathy for our fellow creatures, without that, why would you bother to organize society in a kinder way? Like
that is what's at the bottom of attempts to organize
society so that you don't need it. It's a funny

(26:29):
situation in a way, because when I think of altruism, this kind of selflessness, every time you scratch the surface you find some human motive for what they're doing. Selfish might be too strong, but it's not exactly selfless.
So when you ask this question, my mind goes in
a completely different direction, and my mind goes in the
direction of do we actually even have altruism? Is there

(26:51):
such a thing? I think it's more complicated than that.
The minute you were sitting in a room with Sam Bankman-Fried and his effective altruists at FTX, you realized they were engaged in a kind of competition with other effective altruists. Who's going to be the biggest deal in the effective altruism community? It's like, who's going to save the most future lives? They never put it quite so crudely,

(27:13):
but it was in the air. Maybe that's good. You've got these kind of killer competitive instincts, and it could be people killing each other with machetes, or it could be Wall Street traders trying to make the most money. The beauty of early Sam Bankman-Fried and his crowd was: so we have this machine called Wall Street. It has gotten better and better at extracting rents. The people

(27:34):
who are the peak predators on Wall Street now are the high-frequency traders, the Citadels, the Jane Streets, the Jump Tradings, the Hudson River Tradings, the Virtus, all these places generating more money for individuals than has ever been generated
on Wall Street before. And all of a sudden, the
kind of person who's good at that has this religion

(27:56):
about giving the money away. I thought, for a brief
moment, there was this kind of Robin Hood thing that might go on. And in fact Jane Street, which is the leader of the pack, Jane Street started to worry that too many of the people that they were recruiting were effective altruists. And the problem with the effective altruists is they couldn't control them in the way you control

(28:17):
a normal person who just wants a fourth house and
a third yacht. They didn't have the same materialist needs.
They were doing it for this kind of quasi religious reason,
and it made them much harder to manage. But I thought,
what a great problem. Wouldn't it be cool if, instead of, like, reforming Wall Street, because we'll never do it, it sort of weirdly reformed itself, because the kind

(28:38):
of person who got into the position of making the
most money felt like it was his religion to give
it away. That would have been cool.

Speaker 1 (28:47):
Yeah, no, it would have been, it would have been. So a quick question from our listener Richard, which I think is tied to that point. It's an etiquette question really.
He says, if you give money to charity, should you
tell everybody about it?

Speaker 2 (28:59):
Well?

Speaker 1 (28:59):
Should you just do it? Should you just do it quietly? Well, on the one hand, you should do it quietly, because it's undignified and it's not about you, it's about the charity. But
on the other hand, if you tell your friends you're
giving money to charity, then maybe that will encourage them
to give as well.

Speaker 2 (29:12):
That's an interesting way to put it. I don't think
people who put their names on buildings are doing it
because they want to encourage other people to put their
names on buildings. It's self-advertisement. Yeah. I was raised to be very quiet about this sort of thing, that if you give money, you're always listed as anonymous. I've had organizations ask if they can put our name on it, and when they want that, I let them

(29:33):
do it. My answer to the etiquette question is: let
the recipient decide whether they want your name on it
or not, what's better for the recipient, and do whatever
is better for them.

Speaker 1 (29:45):
Thank you, Michael, brilliant answers. Hold on with us, and
let's talk about your new book, Who Is Government. After
the break, we're back. I'm here with my fellow Pushkin podcaster,
the brilliant writer Michael Lewis. Michael, you very kindly agreed

(30:08):
to come on the podcast and talk about Sam Bankman-Fried and talk about Going Infinite. And then I got a
message saying, oh, by the way, Michael has a new
book out. You should probably ask him some questions about that.
I have to admit my heart sank a little bit.
I thought, I don't really have time to read another
book in preparation for this podcast. And of course, the

(30:29):
moment I opened the first page, I was immediately drawn
in and I loved the book. So congratulations. But tell us about Who Is Government?

Speaker 2 (30:37):
So I wrote a book set in the
first Trump administration called The Fifth Risk, where I wandered
around the administration, the executive branch, and got essentially an
education from the various departments that Trump himself refused to get.
So how the Agriculture Department worked, how the Energy Department worked,
what went on inside these places. And the longer I
spent there, the more taken I was with the actual

(30:59):
characters in government. Whatever the stereotype of the bureaucrat is
in the American mind, they violated it. There were these
breathtakingly devoted public servants who were experts in all kinds of arcane and in some cases spine-tinglingly frightening fields, who were doing the work that kept the society together. And

(31:21):
I thought, you know, there really was a project in coming back and just doing profiles of these people. And it wasn't so much because I saw DOGE coming. I was surprised DOGE came. It was more that I thought just generally
the conversation around American government had gotten so dumb because
people didn't really appreciate what the government did and who

(31:41):
the people were who did it. So I recruited six writers,
Dave Eggers, John Lanchester, Geraldine Brooks, Casey Cep, W. Kamau Bell, and Sarah Vowell, none of them really conventional journalists. Kamau Bell's a stand-up comedian, Dave and Geraldine mostly novelists,
same as John, and just dropped them into the government,
said look, find a story, and I did two of them.

(32:03):
And the idea was just, like, inoculate the American public against these really stupid critiques of their government. I mean, these things ran one by one each week in the eight weeks running up to the election, and then we put them together in the book because they got so much
The thing that really got me was actually the quality
of the material. Like the first story is about a

(32:24):
guy named Chris Mark who figured out how to prevent
the roofs of coal mines falling in on the heads
of coal miners, and you think, well, that's an arcane problem,
and how big a deal could that be? Fifty thousand
American coal miners killed by roof falls in the last century,
and who knows how many more around the world. And there was just an imperfect science of how to keep

(32:45):
the roof of a coal mine up. And how this person comes to his expertise, why he does it, how he does it, is literature. It was just like the stuff of novels. You find that over and over again in the government.

Speaker 1 (32:58):
The story about Christopher Mark, we don't want too many spoilers, but his father is, kind of, among other things, an expert in why Gothic cathedrals don't fall

Speaker 2 (33:06):
Down, and he rebels against the father.

Speaker 1 (33:08):
Yeah, they don't get on at all, they don't get on. He leaves home and

Speaker 2 (33:11):
Refuses to get a college education. He goes and joins
the working class and then essentially reprises his father's career underground.
And when I tell him that, he gets angry at me. No,
it has nothing to do with my father. Yes, my
father figured out how the roofs of Gothic cathedrals don't
fall down, but that has nothing to do with how
the roofs of coal mines don't fall down.

Speaker 1 (33:30):
Yeah, And of course it's it's the same problem.

Speaker 2 (33:32):
It's the same basically, some little differences, but from any
kind of perspective outside of his own, it's the same problem.
And so that there were these psychological portraits to draw
that were just fun. I don't know how you think
about what you do. I think about what I do
is like goal mining. I'm a prospector. I wander around,
you know, the mountains of California, kicking up dirt under

(33:54):
my feet, looking for a place where I can sink
my pick and maybe find some gold. And it's kind
of random when I find it. There's not a great
science to the finding of the gold. You get good
at sort of analyzing the landscape. You got a sense
of this might be and where it might not be.
But it's still a lot of luck involved. I felt
that with the fifth risk, I found this unbelievably rich

(34:18):
vein of ore. I can only scoop out a fraction
of it because there was so much of it. And
I was working with my hands, and I had a
bucket on my back, and there was all this stuff
left behind, and I thought it would all be gone by the time I came back to it. And in fact, it's like no one would bother to go find the mine. No one's interested.

Speaker 1 (34:37):
There's this line by Geraldine Brooks, one of the writers
in Who Is Government. She's profiling a guy who is
a jiu-jitsu instructor, tennis coach, fights terrorists and pedophiles, but is a qualified accountant and works for the Internal Revenue Service. And she says, you know, if
this was a novel, this would be malpractice, right, you

(34:58):
can't just make this sort of stuff up. But because
it happens to be true and this guy is a
real person, I'm actually allowed to have him as a
character in this story.

Speaker 2 (35:06):
And it tells you something. This person who is like
the profit center of the United States government because he's
busting up cyber crime rings and raking in their bitcoin
and sticking it in the treasury. He's generated billions with
his team, billions of dollars of free money for the government.
And the Trump administration has disabled him. They've fired half

(35:26):
his unit and he can't do what he did before.
That our population isn't just completely outraged by this is incredible to me, but they wouldn't know to be outraged unless they knew the story. And you only know the story if you read Geraldine's piece, because it's not in
the news.

Speaker 1 (35:41):
Otherwise, it's astonishing. And he busted pedophiles. There's one point where somebody from Hamas tweets and says, oh, you can donate to the revolutionary cause, send your bitcoin to this address, and he redirects it, basically hacks their bank account. I have to say the details of exactly how this happened were lost on me, but anybody who donated bitcoin, it ended up going to victims of

(36:01):
state-sponsored terrorism, right, and anybody who clicked on the Hamas logo got directed to Rick Astley singing Never Gonna Give You Up. So who says the bureaucrats don't have a sense of humor? Oh no, they do. What they don't have, by and large, is a sense of themselves as characters. This was the thing that all

Speaker 2 (36:17):
the writers came away thinking. Like, I'd find this person who had done this unbelievable thing, and I'd say, I want to talk to you about it, and he goes, well, it wasn't really me, it was the team, you really have to talk to my bosses. There was very little ego. I guess what it is: these jobs self-select for people who really like doing big, important things
but don't care much about credit or money. It's hard

(36:39):
to believe that such people still exist in American life.
Everybody else seems to be looking for fame and fortune.
These are still like the opposite of reality TV stars.
They got interested in a problem. They've worried the problem
to death for thirty years, has had enormous consequences, and
they don't expect anybody to pay attention.

Speaker 1 (36:54):
Indeed. They get nervous when people do, or at least the bureaucracy around them gets nervous.

Speaker 2 (36:59):
Well, that's right. That's actually a really important point, because that was the other thing all the writers noticed. And this is maybe a way to explain why there's this inefficiency in information about this: the bureaucracy has gotten so used to all attention being bad attention.

Speaker 1 (37:15):
So a journalist shows up and says, I'd just like to know what you're doing and why you're doing

Speaker 2 (37:20):
it, and they assume this is going to end with, like, a congressional hearing and me getting fired and humiliated and prevented from doing anything for

Speaker 1 (37:28):
the rest of my life. Whereas in fact, it's like, no, no, I just want to know what you're doing, because it's amazing what you're doing, and I want to tell everyone what you're doing. That's right, in a good way.

Speaker 2 (37:35):
They themselves are not that wary, because they aren't doing anything bad. It's the political people above them who are
worried that this story will somehow make the White House
look bad, and they can't imagine how it would make
the White House look good.

Speaker 1 (37:48):
Yeah, and there is a story to be told, I think, also about how the poor reputation of government bureaucracy
just makes their life harder, irrespective of the politics itself.
I think you profile the woman who studies rare diseases.

Speaker 2 (38:04):
Yes, so she doesn't really study rare diseases. She's Heather Stone. And let me just tell you how I found her, because that sort of explains what her role is in the world. So back when I was working on my COVID book, The Premonition, one of the characters was a researcher named Joe DeRisi, who's like, he was a superhero in bioresearch. And Joe

(38:25):
he was working on COVID, but at the same time he had discovered what he thought might be a treatment for a rare brain-eating amoeba called Balamuthia. I'd never heard of this thing. Nobody had ever heard of this thing. It was discovered in, like, the nineteen nineties at the San Diego Zoo.

Speaker 1 (38:41):
But subsequently, now I'm having nightmares about it. So thank you for telling me.

Speaker 2 (38:46):
Tim, you should be. You should, yeah, because there are some tens of thousands of cases each year in the United States of people dying of unidentified encephalitis. It's like, something's going on in their brain and they never figure it out. They die, it's just encephalitis. There was a woman whose brain was being eaten by something, they couldn't figure out what, who rolls into the UCSF emergency room. Joe gets involved.

(39:10):
Eventually she dies. He gets involved too late, but he's able to identify the bug that's in her brain, and it's this Balamuthia mandrillaris, it's called. It's a brain-eating amoeba. He then, very cleverly, says, well, like, what could you do to treat this? In his lab, he has his graduate students bombard Balamuthia with every

(39:31):
drug that's been approved either in Europe or the United States,
and find that one drug, a UTI drug used in
Europe called nitroxylene, kills the Baalamuthia clearly doesn't kill people.
People are taking it for UTIs, so like, why not
try this and lo and behold even though it's rare.
Pretty shortly thereafter, someone else rolls into the emergency room

(39:54):
has the same symptoms. They find he has Balamuthia in his brain. They give him the nitroxoline and he survives.

Speaker 1 (40:00):
Yeah.

Speaker 2 (40:01):
I say to him, I said, man, that's great. Now
everybody will know there's at least something you could treat
it with. And he says, nope, it doesn't work that way. He said, maybe we'll be able to publish a scientific paper. Maybe doctors will notice it. But there's every possibility that if you roll into a hospital with Balamuthia anywhere else in America, they won't even know we did this work.

Speaker 1 (40:21):
There's two problems, right? So one problem is you didn't run a randomized trial, so you can't be sure. Maybe it was a fluke.

Speaker 2 (40:26):
Correct.

Speaker 1 (40:26):
It's still interesting that you had a theory, and you gave him the drug and he got better. So maybe. But then your second problem is, you know, how do you tell people that? Maybe, yes.

Speaker 2 (40:38):
But with these rare diseases, you're never going to have a randomized trial. And when they're fatal, if you roll in, Tim, with Balamuthia in your brain, would you rather they throw some nitroxoline in you or not? We know what's going to happen if they don't: your brain is going to get eaten. That's bad, so

(41:00):
why not? And the other side of this is that because it's a rare disease, the pharmaceutical industry has no interest in it. It's like, there's no money in it, not enough people are going to have this happen to them. By the way, the way you apparently get it is by ingesting dirt, so be careful with dirt when you garden. Don't put your hands to your mouth. Anyway, so Joe
says to me, this is a natural place for government

(41:23):
to intervene, to at least collate, because people like me
are doing this stuff all the time. And there's a woman in the Food and Drug Administration named Heather Stone who, kind of all by herself, basically decided to tackle this problem. She's created an app that tries to gather every instance of a rare disease being treated around the world,

(41:43):
how it was treated and what the outcome was. There's hope for people who have rare diseases. But the government does not have the energy anymore to get behind Heather Stone's creation. They're unwilling to really promote it. She's a woman on her own, going into medical conferences trying to persuade doctors to plug in their rare disease treatments, their cases,

(42:04):
into her app. It's going nowhere. What happens, the story in the end, is about a girl, a six-year-old girl in Arkansas, who rolls in, and it takes them forever to find that she's got Balamuthia. But she has Balamuthia, and the completely screwed-up way in which her life is saved is this: the little girl in Arkansas's parents google around and find

(42:26):
that Heather Stone has been thanked by Joe DeRisi for helping him get his hands on nitroxoline, and they get personally in touch with her, and she makes sure that they get the drug, and her life is saved. But at the very same time, just like forty miles from Joe DeRisi's office in California, another little girl got Balamuthia, never heard of

(42:47):
the cure and died. And it's sort of like a
parable of good and bad government, Like, do you what
kind of government do you want to have? A government
that has emboldened and strengthened to actually follow through on
this really good idea of how to deal with an
intractable problem, or a government that actually has kind of
lost its spirit and its energy because she created the
app cure idea. It's called it should be useful, it

(43:11):
should be something people pay attention to. It should be something
that the government throws its credibility behind. But then you know,
the government has less credibility. And it's a story of
the frustrating of the impulses of people who previously might
have done really great things in the government. So it's
a sad story. It's a sad story masquerading as a

(43:32):
happy story, but a kind of amazing story.

Speaker 1 (43:36):
It's absolutely amazing.

Speaker 2 (43:37):
There's this thing that kind of springs from between the
lines of the whole book. These people feel like people
who have figured out the way to lead a meaningful life,
and this brings us back to effective altruism. One of
the keys to a meaningful life is to find ways
outside of yourself, find ways to live for things other

(43:58):
than yourself. You know, Sam Bankman-Fried groped towards this
in his own weird way. But these people put their
finger on it right away. There are problems I can
solve and it will help others. And never mind how
much I'm paid or whether I'm acknowledged for it. And
I'm not wired to do that. I need more attention
than I should. But they are and we should be

(44:20):
so grateful for them instead of heaping scorn and derision upon them.

Speaker 1 (44:24):
I've been talking to Michael Lewis. Michael is the author of Going Infinite and, with co-authors, of the new book Who Is Government. Michael, it has been great to talk to you.

Speaker 2 (44:34):
Thank you. It's always fun, Tim.

Speaker 1 (44:37):
You can listen to Michael's podcast Against the Rules wherever
you get your podcasts. As for me, I will be
back next week with another Cautionary Tale. For a full
list of our sources, see the show notes at timharford.com.

(44:58):
Cautionary Tales is written by me, Tim Harford, with Andrew Wright, Alice Fiennes and Ryan Dilley. It's produced by Georgia Mills and Marilyn Rust. The sound design and original music are the work of Pascal Wyse. Additional sound design is by Carlos San Juan at Brain Audio. Ben Naddaff-Hafrey edited the scripts. The show features the voice talents of

(45:22):
Melanie Guttridge, Stella Harford, Oliver Hembrough, Sarah Jupp, messaam Monroe,
Jamal Westman and Rufus Wright. The show also wouldn't have
been possible without the work of Jacob Weisberg, Greta Cohn,
Sarah Nix, Eric Sandler, Carrie Brody, Christina Sullivan, Kira Posey,
and Owen Miller. Cautionary Tales is a production of Pushkin Industries.

(45:45):
It's recorded at Wardour Studios in London by Tom Berry. If you like the show, please remember to share, rate and review. It really makes a difference to us, and if you want to hear the show ad-free, sign up to Pushkin Plus on the show page in Apple Podcasts or at pushkin.fm/plus.

Host

Tim Harford

