
July 4, 2025 38 mins

A radical thought experiment transforms the lives of a new breed of philanthropists, as they follow the logic of altruism to extraordinary lengths. The most famous convert to the Effective Altruism movement, Sam Bankman-Fried, is either a humanitarian hero, a con artist at an astonishing scale, or most bafflingly, both.


For a full list of sources, see the show notes at timharford.com.

Get ad-free episodes, plus an exclusive monthly bonus episode, of Cautionary Tales by subscribing to Pushkin+ on Apple Podcasts or Pushkin.fm. Pushkin+ subscribers can access ad-free episodes, full audiobooks, exclusive binges, and bonus content for all Pushkin shows.

Subscribe on Apple: apple.co/pushkin
Subscribe on Pushkin: pushkin.fm/plus

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin.

Speaker 2 (00:24):
In a penthouse apartment in the Bahamas, a billionaire is
hosting a meeting. It's the kind of place you might
expect to find a billionaire, marble floors, a grand piano,
a balcony with a hot tub, and views of the marina.
As the billionaire and his colleagues debate what to do

(00:47):
with his money, they look to one man in particular
for wise advice. A business analyst? A financial expert? No:
a moral philosopher who's devoted his career to thinking about altruism.
This is the second of two cautionary tales about altruism.

(01:12):
In the first, we heard about a scientist called George Price,
who helped to unravel the mystery of how evolution produced
altruistic behavior, and who then became the most extreme altruist
you could imagine, giving away his last penny and the
coat on his back. But wait: perhaps the billionaire in

(01:37):
the Bahamas is an even more extreme altruist. The only
reason he ever wanted to make money was to give
it away. He looks more like a student than a billionaire,
with his baggy cargo shorts, crumpled T shirt, and disheveled hair.
He's turned his penthouse into a dorm room. There are

(01:58):
bean bags for napping on, monitor wires trailing haphazardly across
the marble floor, a cheap bookcase full of board games,
a freezer stuffed with three dollar vegetable biryani from Trader Joe's.
Over the next year, he wants to give away a
billion dollars, and he wants to do it as effectively

(02:21):
as possible, hence the moral philosopher. The year is twenty
twenty two. The billionaire's name is Sam Bankman-Fried, and
his altruistic activities are about to be interrupted by arrest
and imprisonment. I'm Tim Harford, and you're listening to Cautionary Tales.

(03:11):
It's nineteen seventy-two, in London. George Price is spiraling,
convinced that Jesus wants him to give all his possessions
to homeless people. Fifty miles up the road, in Oxford,
a philosopher called Peter Singer publishes an essay titled Famine,

(03:32):
Affluence and Morality. Singer asks us to imagine that we're
walking along past a muddy pond, going about our day,
when we see that in the pond a small child
is drowning. The pond is shallow; we could easily wade
in and save the child, but that would ruin the nice

(03:55):
new clothes we're wearing.

Speaker 1 (03:57):
What do we do?

Speaker 2 (03:58):
Of course, of course we wade in and save the child.

Speaker 1 (04:02):
If we didn't, we'd never be able to live with ourselves.

Speaker 2 (04:07):
But think about this, says Singer: across the world, a
small child is dying from hunger. We could save that
child's life by giving money to charity less than the
cost of the nice new clothes we were willing to ruin.
Surely our obligation to give to the charity is just

(04:28):
as strong as our obligation to wade into the pond.
Morally speaking, it doesn't matter if the child we can
save is right there in front of us or ten
thousand miles away. If you follow Singer's logic, spending money
on nice clothes instead of donating it to starving children

(04:49):
is just as immoral as walking past the drowning child
in the pond. Follow the logic further, and as long
as there's one starving child in the world, it's immoral
to own anything we don't really need. We should keep
on giving until we're only just better off than the

(05:10):
starving child ourselves. Nobody lives like this, of course, well
nobody except George Price. Peter Singer's essay became a fixture
in undergraduate philosophy classes, including the one I took. Students
tended to have one of two reactions. Either they tried

(05:32):
to find some flaw in Singer's logic, or they conceded
that Singer might be right, but shoved that thought firmly
to the back of their minds so they could resume
their normal lives without constantly thinking about all the starving
children they were thereby condemning to death. Will MacAskill was different.

(05:57):
In two thousand and five, aged eighteen, MacAskill read Peter
Singer's essay. He thought Singer was clearly right and decided
he should walk the walk by giving what he could
as a student. That wasn't easy. Students never have much money,

(06:17):
and MacAskill did also want to make friends. He tried
to compromise. When his friends went to the pub for
a drink, MacAskill ordered tap water, then quietly refilled the
glass with cheap lager he'd bought from the store. MacAskill
got his degree in philosophy and a job in academia.

(06:41):
He decided that the first twenty six thousand pounds a
year of his salary would be enough to live on
about thirty three thousand dollars. Anything he earned above that
he'd give away. He researched the most effective ways to
donate. Some charitable causes, it turns out, give you far

(07:02):
more bang for your buck than others. Bednets, for example,
save lives in countries with malaria by stopping mosquitoes from
biting you while you sleep. By one estimate, around three
thousand dollars spent on bednets would save one life. MacAskill

(07:23):
met others who shared his ideas. A movement emerged. MacAskill
and his colleagues asked themselves what it should be called,
and came up with the name effective altruism. They became
the effective altruists, committed to giving away a significant chunk
of their income. In a modest basement office in Oxford,

(07:47):
MacAskill and his colleagues set up the Center for Effective Altruism.
They ate cheap vegetarian food, supermarket baguettes and hummus, and
debated the most effective ways to be altruistic. For instance,
might deworming pills do even more good than bednets per

(08:08):
day dollar spent? The numbers said they might. People with
money started to ask mccaskell's advice on where to donate.
That gave him a dilemma, because mccaskell was aware of
studies that show classically handsome people are more persuasive at
getting donations for charities, and mccaskell had always been conscious

(08:32):
of the gap between his two front teeth. Should he
invest in braces to make himself more handsome? On the
one hand, the money he spent on braces couldn't then
be spent on bednets or deworming pills. On the other hand,
it might make him a more effective advocate for those causes.

(08:53):
MacAskill asked his old friends about this moral dilemma.
Will, they said, if you want to get your teeth fixed,
get your teeth fixed. In a profile of MacAskill for
The New Yorker, one friend recalls, it felt like he
subsumed his own humanity to become a vehicle for the

(09:13):
saving of humanity. MacAskill was getting asked for another kind
of advice too, career advice. Students at Oxford University wanted
to know what line of work they should go into
if they wanted to do the most good. Should they
become a doctor in a poor country, for example, or

(09:34):
a medical researcher to try to cure cancer. MacAskill came
up with a surprising answer: none of the above.

Speaker 1 (09:43):
You are at a top university.

Speaker 2 (09:45):
He told them, you have a chance at careers that
could make you lots of money.

Speaker 1 (09:50):
Why not make money and give it away?

Speaker 2 (09:53):
If you become a high flying banker, for example, you
could easily fund a dozen doctors in poor countries, far
more effective than becoming a doctor in a poor country yourself.
The logic was impeccable. MacAskill called the idea earning to give.

(10:13):
In twenty twelve, MacAskill visited Cambridge, Massachusetts, to spread his
ideas at other top universities. He heard about a student
at MIT who might be receptive. A physics major in
his junior year, unkempt, a bit odd, but clearly brilliant.

(10:36):
MacAskill sent the student an email: let's have lunch. Sam
Bankman-Fried was surprised to get an email from a
philosopher at Oxford University. Who is this guy? Why is
he inviting me to lunch? Sam was bored of his
physics degree, just as he'd been bored at school throughout

(10:59):
his childhood. He was good at maths and a card
game called Magic: The Gathering, but bad at social interaction.
He remembers having to teach himself when it's considered appropriate
to smile. His classmates, he thought, saw him as smart
and maybe not all that human. He didn't feel close

(11:21):
to anyone except for one kid who also liked the
card game, Magic: The Gathering. That kid remembers Sam as a
rare combination of hyper rational and extremely kind. Sam rationalized
his way to a belief system. I guess I should

(11:41):
care the same amount about everyone, which is pretty much
what Peter Singer said all those years ago. When someone
made the case to Sam that his beliefs were inconsistent
with eating meat, Sam thought about it and concluded, this
sucks because I love fried chicken. But they're right. He

(12:02):
became a vegan. In his cargo shorts, crumpled T shirt,
and battered sneakers, Sam met Will MacAskill for lunch. He
wasn't really sure what he wanted to do with his life,
he told Will. Before he came to MIT, he'd thought
maybe he'd become an academic, but he now realized that

(12:24):
he'd find academia far too boring. Will pitched Sam on
his earn to give idea. If you want to make
the world a better place, he told Sam, you should
set out to make lots and lots of money. Cautionary
Tales will be back after the break. Sam Bankman-Fried

(12:57):
finished his degree at MIT and got a job on
Wall Street at a trading firm. The job involved spotting
tiny inefficiencies in financial markets, patterns in data that others had overlooked.
It was all about making rational calculations and thinking in probabilities.

(13:17):
It wasn't easy, but if you were good, you could
make a fortune. Sam was a natural. In his first year,
he was paid three hundred thousand dollars, in his second,
six hundred thousand, in his third a million. He gave

(13:38):
most of it away to good causes, including Will MacAskill's
Center for Effective Altruism. How much might I be earning
in ten years, he asked his bosses. If you keep
doing as well as you are, they said, maybe as
much as seventy five million dollars a year. But Sam

(14:01):
wasn't happy. I don't feel anything, he confided to his journal,
or at least anything good. I feel nothing but the
aching hole in my brain where happiness should be. Sam
began to get interested in cryptocurrency in twenty seventeen.

(14:22):
Crypto was still a very new phenomenon. It was hard
to know what to make of it, an important emerging
asset class or just some complicated scam. New coins were
being launched all the time, but unlike shares in say
Apple or Amazon, they were often completely unrelated to anything

(14:43):
in the real world economy. Sam's trading firm wouldn't let
him touch crypto.

Speaker 1 (14:49):
It was far too risky. To start with,

Speaker 2 (14:54):
Crypto was bought and sold on exchanges that aren't regulated
in the same way as stock exchanges. Crypto was relatively
easy to steal, or to misplace. If you lose the
password to your bitcoin wallet, it's not like losing the
password to your online banking. You can't call a help
desk and get another one. Still, Sam saw an opportunity.

(15:20):
The nascent crypto markets were far less efficient than the
financial markets he was used to operating in. The same
coins could trade on different exchanges for different prices. Sam
decided to quit his job and set up his own firm.
He'd use the techniques he had learned on Wall Street
to trade in crypto, But what about the risk of theft.

(15:44):
With a few furtive keystrokes, an employee might divert coins
into their own personal account in a way that would
never work for Apple shares. Sam had a genius solution
to that problem. He would employ only effective altruists. If

(16:04):
all his employees were just as committed as he was
to giving their money away, they would feel no temptation
to enrich themselves by stealing from the firm.

Speaker 1 (16:14):
It was perfect.

Speaker 2 (16:19):
By twenty eighteen, Sam's new company, Alameda Research, had
employed a couple of dozen effective altruists and raised one
hundred and seventy million dollars from investors. But things got
off to a rocky start. The first problem was Sam's
leadership style. One employee recalls he was expecting everyone to

(16:41):
work eighteen hour days while he would not show up
for meetings, not shower for weeks, have a mess all
around him with old food everywhere, and fall asleep at
his desk. Then there was Sam's new bot. He wanted
to automate buying and selling coins on different exchanges. That

(17:03):
was a tried and tested idea on stock markets, but
stock markets worked more reliably than crypto exchanges. His management
team were worried: if this went wrong, it could go
very wrong, very quickly. When you switch on this bot,
they told Sam, you have to watch it like a

(17:23):
hawk and be ready to switch it off straight away
if it starts losing money. Sam agreed. He switched on
the bot, then fell asleep. The biggest worry of all
was that, well, four million dollars' worth of crypto had

(17:43):
just disappeared.

Speaker 1 (17:45):
Where had it gone? Had somebody stolen it? No one knew.

Speaker 2 (17:51):
Sam's management team wanted to tell their investors. Let's not,
said Sam. I reckon there's an eighty percent probability that
it turns up somewhere. In Sam's hyper-rational mind, that
was basically the same as them still having eighty percent
of the four million dollars, and it would be perfectly

(18:13):
reasonable to put that in their accounts. We can't do that,
said Sam's management team. That's not how the world works.
The management team at Alameda Research lost patience. Sam was
a brilliant trader, but hilariously ill suited to running a company.
They walked out. Half the employees followed. The investors pulled

(18:37):
out three quarters of the cash they'd put in. Still,
that left Sam with forty million dollars to play with,
and now there was no one to complain when he
did things his way. Sam turned on his bot and
let it run. In Oxford, Will MacAskill and his philosopher

(19:04):
colleagues were thinking. Remember what Peter Singer had said
ago about how distance wasn't morally important. We should care
as much about a child starving ten thousand miles away
as a child drowning in a pond right in front
of us. MacAskill began to think we should treat time

(19:27):
the same as distance. We should care as much about
children who might be born in the future as children
who exist right now. Following that logic leads to some
strange conclusions. The future could last a long time. There
might be trillions upon trillions of future humans, far more

(19:48):
than the mere few billion alive today. But those future
humans will never be born if today's humans carelessly go
extinct in the next few decades. What might cause that?
A genetically engineered pandemic, perhaps, or a rogue superintelligent AI.

(20:11):
So perhaps the most effective thing altruists can do is
fund academic research into how best to prevent those risks.
Of course, most of that research won't lead anywhere, but
a small probability of a huge payoff can still outweigh
the certainty of a small payoff.
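The narration illustrates this trade-off with concrete numbers a moment later. As a minimal sketch, the expected-value logic can be written out, using the episode's own rhetorical figures (a near-certain life saved per roughly three thousand dollars of bednets; a one-in-a-billion chance of saving a trillion future lives), which are illustrations, not real estimates:

```python
# Expected lives saved = probability of success x lives saved if it succeeds.
# All figures below are the episode's illustrative numbers, not real-world estimates.

def expected_lives(probability: float, lives_if_success: float) -> float:
    return probability * lives_if_success

bednets = expected_lives(1.0, 1)        # near-certain: one life per ~$3,000 of nets
workshop = expected_lives(1e-9, 1e12)   # one-in-a-billion shot at a trillion lives

# On pure expected value, the long-shot workshop "wins" by a factor of ~1,000,
# even though it will almost certainly save nobody at all.
print(bednets, workshop)
```

This is exactly the kind of calculation the long-termists make: a tiny probability multiplied by an enormous payoff dominates a certain but modest one.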

Speaker 1 (20:35):
Think of it like this.

Speaker 2 (20:37):
If you donate three thousand dollars to buy bednets, you
can be fairly hopeful of saving one life. But what
if instead you put your three thousand dollars towards holding
an AI safety workshop. The chance that it will lead
to an important breakthrough is minuscule, say one in a billion,

(21:01):
But if it does, it might save lots of future lives,
say a trillion. One in a billion times a trillion,
if you think about it rationally, is basically the
same thing as saving a thousand lives. Far more effective,
then, to fund AI workshops than bednets. This new school

(21:23):
of thought became known as long-termism. Will MacAskill got
to work on a book to spread the ideas more widely.
At Alameda Research, they finally found the missing four million
dollars' worth of crypto. It hadn't been stolen after all:

(21:44):
there had been a computer glitch. It had been sent
to an exchange without an accompanying note about who owned it.
When Sam finally realized which exchange might have it and
called them up, they were astonished: how has it taken
you this long to contact us. Sam's bot, meanwhile, was

(22:07):
doing well. Alameda Research was making money, but Sam wanted more.
He'd realized that the real money making potential in crypto
wasn't in trading on someone else's exchanges. It was in running
an exchange of your own. Sam came up with a

(22:28):
clever design for a new kind of crypto exchange, one
that would let its users gamble on the future price
of various coins. Many of those people would end up losing.
That's the nature of gambling, win or lose. Sam would
take his cut, just like a casino. The exchange Sam

(22:52):
had in mind wouldn't be legal to run in America,
so he set it up in the Bahamas. He called
it FTX. It quickly became a huge success. It ran
a Super Bowl commercial in which characters played by Larry
David is shown new inventions through the ages: the wheel, the toilet,

(23:15):
the light bulb. Larry mocks them all: that's stupid. At
the end, he's shown FTX and sneers dismissively.

Speaker 1 (23:27):
It's FTX. It's a safe and easy way to get
into crypto. I don't think so, and I'm never wrong
about this stuff.

Speaker 2 (23:35):
Never. The tagline: don't be like Larry, don't miss out.
The AD's message is clear. You might not understand crypto,
just like Larry David's characters didn't understand the wheel or
the light bulb, but it is going to be just
as important. Don't miss out. Who cares if you don't

(23:56):
understand it? Gamble on it now. Earn to give, Will
MacAskill had advised Sam Bankman-Fried. Did anyone care how
Sam was making his money as long as he was
giving it away? Cautionary tales will be back after the break.

(24:28):
In twenty twenty two, Will MacAskill published his book What
We Owe the Future. The organization he helped set up,
the Center for Effective Altruism, moved into impressive new premises.
It bought Wytham Abbey, a grand fifteenth-century estate just

(24:48):
outside of Oxford, to host workshops on subjects like AI
safety and pandemic risk. Some Effective Altruists felt queasy. Was
this really a better use of money than bednets? Sure,
said others. If we hold our workshops in a centuries-old
building, that'll help to focus everyone's minds on a

(25:11):
long term time frame. Sam Bankman-Fried was fully on
board with Will MacAskill's new long-termist thinking. One
rational way to donate his money, Sam decided might be
to get politicians elected who knew something about AI and pandemics.
Politicians like Carrick Flynn, an earnest, young, effective altruist who

(25:36):
had worked on pandemic prevention, then decided to run for
Congress in Oregon. Carrick Flynn didn't know that Sam
Bankman-Fried had decided to throw money at his campaign. Flynn
was watching YouTube, sipping Diet Mountain Dew, when YouTube cut
to an ad.

Speaker 3 (25:57):
Carrick Flynn faced poverty and homelessness, but he pushed through
to college on a scholarship and a career protecting the
most vulnerable.

Speaker 2 (26:07):
Flynn was so startled he covered himself in Diet Mountain Dew,
and that was just the start. Soon, the voters of
Oregon's sixth Congressional district could hardly look at a screen
without encountering an ad for Carrick Flynn.

Speaker 3 (26:23):
Carrick Flynn, the Democrat will create good jobs.

Speaker 2 (26:27):
Oregonians quickly got sick of hearing the name Carrick Flynn
and suspicious. These wall-to-wall ads must be costing
a fortune. Who was paying? Reporters found out that it
was a crypto billionaire who lived in the Bahamas, and
demanded of Carrick Flynn, why is Sam Bankman-Fried so

(26:52):
very keen to get you elected? I don't know, Flynn protested.
I've never met him. I've never talked to him. Flynn said
he assumed it must be because of his interest in
pandemic risk. Of course, the reporters didn't believe that.

Speaker 1 (27:09):
They said, he.

Speaker 2 (27:10):
Must want you to do something involving crypto. Flynn became
increasingly bewildered. I'm not a crypto person, he protested. I
don't know much about it. I've tried to read about it.
I didn't really care. Flynn finished a distant second in

(27:31):
his election. For every vote he received, Sam had spent
something like one thousand dollars on ads. To put that
another way, for every three votes Carrick Flynn received, Sam
could have bought enough bednets to save a life.
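The arithmetic behind that comparison is straightforward; a quick sketch using the episode's rough figures (about one thousand dollars of ad spend per vote, about three thousand dollars of bednets per life saved):

```python
# Illustrative figures from the episode, not precise campaign accounting.
ad_dollars_per_vote = 1_000       # roughly $1,000 of ads for each vote received
bednet_dollars_per_life = 3_000   # roughly $3,000 of bednets saves one life

# Number of votes whose ad spend equals one life's worth of bednets:
votes_per_life = bednet_dollars_per_life / ad_dollars_per_vote
print(votes_per_life)  # 3.0 -- every three votes cost a life's worth of bednets
```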

Speaker 1 (27:52):
But in the new long-termist

Speaker 2 (27:54):
view of effective altruism, the money spent on Carrick Flynn
hadn't been wasted. It had always been a long shot
that Carrick Flynn's political career would end up preventing some
future pandemic from wiping out humanity. But if it did,
it could save trillions of future lives. If you thought

(28:16):
rationally about altruism, funding Carrick Flynn ads instead of bednets
made perfect sense. In the last two episodes of Cautionary Tales,
we've heard two wildly different stories about people who took
altruism very seriously indeed. George Price's altruism was driven by revelation.

(28:44):
A vision of Jesus told him to give away whatever
he had to whoever asked him. He ended up as
thin as a stick with rotting teeth, sleeping on a
mattress on the floor of a squat. Sam Bankman-Fried's
altruism was driven by rationality. A moral philosopher told him

(29:04):
to make lots of money and donate it effectively. He
ended up encouraging people to gamble on crypto so that
he could put more money into politics. Taking altruism very
seriously indeed can take you to some strange places. Will

(29:27):
MacAskill flew into the Bahamas for a penthouse discussion about
the best ways to help future people. Sam had been
trying out a new idea. He identified a hundred experts
in AI and pandemic risk and sent each of them
a million dollars out of the blue, no strings attached.

(29:50):
Use it well and I'll give you more. He planned
to give away a billion dollars over the next year,
but how? More political campaigns? More workshops at Wytham Abbey?
As it turned out, the question was moot, because
Sam's dark secret was about to be discovered. He hadn't

(30:14):
just been earning to give. He'd also been defrauding to give.
When Sam set up FTX, he couldn't get a US
bank to open an account. Its activities were too legally murky.
That meant FTX had no way of taking dollar deposits
from its customers. But you know who did have a

(30:37):
dollar account, Sam's company, Alameda Research. When customers opened an
account at FTX, they wired their deposits to Alameda. Alameda
could and should have kept that money safe for the
FTX customers, but they didn't: they used it to trade with.

(30:59):
This wasn't legally murky; this was very illegal indeed. What
was Sam thinking? Sam was thinking that nobody need ever
find out. Alameda's trading was making profits, and with this
extra money to play with, it would make even more profits.

(31:19):
And Alameda had plenty of assets to fall back on.
It owned crypto worth many times more than those customer deposits,
so whenever an FTX customer wanted their deposit back, he
thought it wouldn't be a problem. As Sam told the
author Michael Lewis, it felt to us that Alameda had

(31:41):
infinity dollars. But then crypto prices fell. Alameda now had
finite dollars. FTX experienced the equivalent of a run on
the bank when all the customers rushed to withdraw their
deposits at once. Alameda suddenly had to scramble to find

(32:02):
the money to pay them back. Remember when Alameda had
lost sight of four million dollars? It hadn't got
any better at keeping track of what was where. In
his book Going Infinite, Michael Lewis describes a comically frantic
hunt for Alameda's assets.

Speaker 4 (32:24):
Its CEO would come on to the screen and announce
that she found two hundred million dollars here or four
hundred million dollars there, as if she just made an
original scientific discovery. Some guy at Deltec, their bank in
the Bahamas, messaged Ramnik to say, oh, by the way,
you have three hundred million dollars with us, and it
came as a total surprise to all of them.

Speaker 2 (32:45):
Alameda couldn't gather its money in time. FTX was declared bankrupt.
Sam was arrested and extradited from the Bahamas to the US.
Michael Lewis makes the case that Sam wasn't so much
criminal mastermind as overgrown teenager, incredibly reckless, and incredibly disorganized.

(33:10):
The bankruptcy lawyers eventually located enough assets in Alameda to
give FTX depositors all their money back with interest. But
reckless and disorganized is hardly a compelling defense. He took
money that wasn't his and spent it according to whatever
logic suited him. Sam was convicted of fraud and sentenced

(33:34):
to twenty five years in prison. Sam Bankman-Fried is
now known for his crimes, but it's his altruism that
interests me, and the two have a surprising amount in common.
Sam purloined his customers' deposits because he made a hyper
rational calculation that he'd probably get away with it and

(33:57):
didn't think much about the fallout if it all went wrong.
Sam gave money to politicians, not the poor, because he
made a hyper rational calculation to prioritize future people over
people actually suffering today. Both these calculations remind me how
Sam's teenage classmates described him: smart and maybe not all

(34:22):
that human. George Price was deeply depressed by what his
own work said about what it means to be human.
Our altruistic instincts evolved to serve our selfish genes. When
we feel the urge to do something nice, it tends
to be the kind of thing that, for our ancestors, might

(34:44):
have helped their relatives or forged a friendship. Remember the
contrast drawn by Peter Singer. We wouldn't hesitate to wade
into a shallow pond to save a drowning child, but
we don't feel the same urge to give money to
save a starving child on the other side of the world. Why?

(35:07):
When you think from the genes' point of view,
it's not hard to explain. The child who's drowning right
in front of us might plausibly be a distant cousin
or have parents who feel forever in our debt. The
child who's starving half a world away, not so much.
Our selfish genes can help to explain why we are

(35:30):
the way we are, but they can't tell us what's
the right thing to do. We need our rational minds
for that. It is depressing that we ignore the starving child,
because evolution simply didn't build us to care that much.
The moral philosophers are right that we can use our

(35:52):
rational minds to transcend our selfish genes. Then again, I'm
not sure it's any less depressing to ignore the starving
child because we've thought long and hard about it and
decided to fund an AI workshop instead. If we
follow the logic of altruism far enough, it can take

(36:14):
us to places that don't feel human at all. So
perhaps we shouldn't beat ourselves up too much if we
succeed in transcending our selfish genes only by a little bit,
if we manage at least to do something good for
people who aren't family or friends, and we don't give

(36:35):
everything away like George Price, or feel angst about getting
braces like Will MacAskill, or donate our cash to long-shot
chances of saving unborn trillions like Sam Bankman-Fried.
It may not be a rational approach to altruism, but

(36:55):
it is a human one.

Speaker 1 (36:57):
There are worse things in the world than being human.

Speaker 2 (37:14):
A key source for this episode is Going Infinite, The
Rise and Fall of a New Tycoon by Michael Lewis.
I will be speaking to Michael Lewis next week about
his time with Sam Bankman-Fried, and we're going to
be answering your questions on altruism and kindness. This episode
of Cautionary Tales also relied on Gideon Lewis-Kraus's profile

(37:37):
of Will MacAskill in The New Yorker. For a full list
of our sources, visit timharford.com. Cautionary Tales is
written by me, Tim Harford, with Andrew Wright, Alice Fiennes,
and Ryan Dilley. It's produced by Georgia Mills and Marilyn Rust.
The sound design and original music are the work of

(37:58):
Pascal Wyse. Additional sound design is by Carlos San Juan
at Brain Audio. Ben Nadaf-Hafrey edited the scripts.
The show also wouldn't have been possible without the work
of Jacob Weisberg, Greta Cohn, Sarah Nicks, Eric Sandler, Christina Sullivan,
Keira Posey, and Owen Miller. Cautionary Tales is a production

(38:20):
of Pushkin Industries. If you like the show, please remember
to share, rate, and review. It really makes a difference
to us. And if you want to hear the show
ad-free, sign up to Pushkin Plus on the show
page on Apple Podcasts or at pushkin.fm/plus.
Host

Tim Harford