
December 5, 2025 41 mins

Cory Doctorow has spent decades helping to shape the way we think about the modern internet. He is a campaigner against monopolies and surveillance, and for digital rights. His new book Enshittification: Why Everything Suddenly Got Worse and What to Do About It analyses how the internet giants have captured us and become not quite as good as we had thought they were. On this episode of Ways to Change the World, Krishnan Guru-Murthy speaks to Cory about the broken systems we are living in and what we can do to try and make things better.


Strong language warning.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
We have found out lots of ways in which Google just
deliberately made things worse for its users to make things
better for its shareholders. They didn't have to worry about
being good because you were never going to find another
search engine and switch from Google to them.
So when the moment came, Google decided to make stuff worse.
How long have we got before Amazon just has everything?
The market can remain irrational longer than you can remain

(00:21):
solvent. Trump has basically ended the
American Internet and now we get to decide what a post-American
Internet looks like. It's up for grabs and I want to
grab it. Hello and welcome to Ways to
Change the World. I'm Krishnan Guru-Murthy, and
this is the podcast in which we talk to extraordinary people
about the big ideas in their lives and the events that have

(00:42):
helped shape them. Now, my guest this week has
spent decades helping to shape the way we think about the
modern Internet. He is a campaigner against
monopolies, against surveillance and for digital rights.
And he has a new book analysing how the Internet giants have

(01:02):
captured us and changed and become not quite as good as we
had thought they were. The book is called
Enshittification, with the subtitle Why Everything Suddenly
Got Worse and What to Do about It.
And it's about the broken systems that we are living in
and what we can do to try and make things better.

(01:23):
Cory Doctorow, welcome to Ways to Change the World.
How would you change the world? Well, it depends on whether you
want to know, like, broad goals or technical tactics.
So my broad goal is that we should have an Internet where
when people make mistakes and do things that are bad, they go out
of business and get replaced by someone else and maybe get

(01:46):
punished by the government or lose all their workers.
I have some tactics about how I would get there, but that's my
goal. Not an Internet run by perfected
people, but an Internet where the defects of people don't
become a permanent fact of life. I think we all know what
enshittification means instinctively, but can you lay

(02:09):
out the process by which you think it happens?
Yeah, so enshittification describes this three-stage
process of platform decay, the process by which these platforms
that we love turn into giant piles of shit.
In stage 1, the platform is good to its end users, but it finds a
way to lock those users in such that when they move to stage 2,

(02:30):
which is making things worse for end users in order to make
things better for business customers, those users can't
readily depart. They have something that's
keeping them there. In stage 2, you bring in
business customers and you lock them in.
You make them dependent on those captive users.
And then in stage 3, you withdraw the value from those
captive businesses and from the captive users, leaving behind a

(02:52):
kind of mingy homeopathic residue of value that's
calculated to be just the minimum needed to keep users
locked to the platform, businesses locked to the users,
and all the value is harvested for executives and shareholders.
OK, so, so let's go through a couple of examples.
I mean, do you think this applies to all the giants first
of all, or are there any good guys?

(03:13):
No, I think it's really that all the firms are bad.
And I think that, you know, we have to understand this as a
product of incentives. So even if you think the firms
are good, now they have this enormous pressure to worsen
themselves because their competitors are doing it.
And because we've rigged the policy environment so that when

(03:35):
their competitors do terrible things, they make more money
rather than going out of business, which means their
shareholders and the market are going to demand that they do
this. It's really not
about like whether you can get someone who's sufficiently
heroic and moral to run a business that they don't do this
to you. It's whether we can get our
policy makers to stop creating the enshittogenic policy

(03:57):
environment that rewards companies for being terrible.
That's a great word. What was it again?
Enshittogenic. Enshittogenic. I can do this all
day, like a Latin student declining a difficult verb.
The enshittocene is the era we're living through.
We would like to disenshittify things, but there are
anti-disenshittificatory forces massed against us, and so on.

(04:19):
I I like speaking English. It's a mongrel language where
you can just make up words and then make up words based on
those words. It's quite fun.
I love that we don't have a language Academy who tells us
what the words mean. So I wanted to go through sort
of three examples, if you don't mind.
And I know you're used to doing this because you've done a lot
of interviews around this, but sort of bear with me.
I mean, I wanted to do sort of Facebook because I think that's

(04:40):
one we all kind of understand has changed.
And perhaps for the worse. Amazon, because a lot of people
think, well, we choose to use that because it's good for us.
And Google because despite all the debates around Google, you
know, Google obviously claims to be a force for good and to be
campaigning for the ability to migrate data and so on.

(05:03):
So let's just go through those three examples, beginning with
Facebook. Sure.
So Facebook I think of as the prototype case here.
They really are kind of the people who lay bare what
enshittification looks like when it's at home.
So when Facebook kicks off, you know, first, obviously they
exist as a way to hook up American college kids who have .edu

(05:24):
addresses. But then they open to the
general public in 2006, and they've immediately got a
problem, which is that everyone who could conceivably be using
social media already has an account on Myspace.
And so they go to those Myspace users and they say, look, I know
you love the rollicking conversation that you get on
Myspace. It's quite fun to hang out with
all those people. But the last thing anyone should

(05:45):
ever want is to be using social media that's owned by an evil
billionaire who spies on you. Come use Facebook.
It's run by an extremely normal man who will never spy on you.
And so we pile into Facebook and we lock into Facebook.
And so this is stage 1. And the way that we lock in is
something that economists call the collective action problem,
which, you know, you may know as, like, you love the six

(06:07):
friends in your group chat, but boy, are they a pain in the
ass. And not only can you not agree
when it's time to leave Facebook, you can't agree on
which pub you're going to go to on Friday night.
And so you get locked in because you love those people more than
you hate Mark Zuckerberg. And you, you are stuck on his
platform and that inaugurates stage 2, which is making things
worse for you because he knows that you love your friends more
than you hate him. So he can make you hate him

(06:29):
a little more by bringing in some business customers.
So to the advertisers, he says, give us a remarkably small
amount of money and we will target ads to them with
incredible fidelity. And because we're such proud
craftspeople, we have paid for a whole building full of engineers
that are going to do nothing except police ad fraud.
So when you give us a pound to show an ad to a specific kind of

(06:50):
user, we're going to find that user and stuff that ad right in
their face. He goes to the the publishers
and he says, do you remember we told these fools that we weren't
going to show them things that they didn't ask to see?
Obviously that was a lie too. Just put excerpts from your own
website here on Facebook. Add a link back to your own
website. We'll cram those nonconsensually into the
eyeballs of people who never asked to see

(07:12):
them and they'll click your link and give you free traffic and
you can monetize that traffic as you see fit.
So that's stage 2. Now these businesses become very
dependent on Facebook. And I think that unless you're
like a small trader or someone who's got an econ degree, you
probably don't really think much about how easy it is for a
powerful buyer to lock in their sellers.
I mean, we all know about Monopoly because we've all had

(07:34):
rainy afternoons playing the terrible board game until we
wanted to kill our family and ourselves.
But no one's ever played a board game called Monopsony, which is
the corollary of Monopoly. It's the powerful buyer problem.
And powerful buyers, they don't have to be that powerful to
exert a lot of leverage over the people who sell to them.

(07:54):
You know, if you imagine that you're running a coffee shop on
a block with four other coffee shops and you're next to an
office tower, and the workers in that office tower amount to 20%
of your gross receipts, that's hardly a majority.
But if that business goes bust and your gross revenues fall by
20% overnight, that's probably the end of the business, right?
You're not going to be able to make your loan payments.

(08:15):
You're not going to be able to make payroll.
You're going to have to ask yourlandlord for rent relief.
And you might never recover because 20% might as well be
100% if we're talking about a sudden drop-off like that.
So advertisers and publishers become very dependent on
Facebook. And so things get worse for them
too. Advertisers find that ad prices
are going way up, targeting fidelity is going way down, and

(08:36):
ad fraud explodes in a way that, I think, beggars belief.
Publishers find they have to put longer and longer excerpts,
eventually the whole article, before it will get recommended
to someone or even shown to the people who subscribe to them.
And if they dare put a link back to their own website, that
article is fully suppressed because the link might be quote
unquote malicious. And so now Facebook is squeezing

(08:57):
those people. They are squeezing you.
You have a feed that's basically devoid of the things you asked to
see, with just the minimum Facebook calculates will keep
you there. And meanwhile, Facebook is in
this equilibrium where they've left behind just enough value to
keep everyone locked in and harvested the rest for
themselves. But they're also in this very

(09:18):
precarious state because the difference between I hate
Facebook, but I can't stop coming here and I hate Facebook
and I'm not coming back is razor thin, right?
You get a live-streamed mass shooting, people bolt for the
exits, the market stages a mass sell-off, and the company
panics. And being technical people, they
have a technical euphemism. They call it pivoting.
I think I know the answer to this, but is there any defence

(09:40):
in the fact that along the way there was a lot of economic
activity? You know, publishers were
getting a lot of views, were making money along the way,
people were selling, and until it got bad and until the
algorithms changed, there was quite a lot of good stuff going

(10:03):
on. Yeah, I think that is a defense.
I think that's a defense of the idea that if we can make the
policy environment better, we don't have to make the people
better. You know, I would very much
like to see Elon Musk and Mark Zuckerberg in the poor
house and relegated to an, you know, unimportant appendix in
history. But the fact is that, you know,

(10:23):
they ran their companies better when they faced constraints,
when they had to worry about competitors, when we didn't let
them buy their competitors, when they had to worry about
regulators, when we didn't have a cartel that captured its
regulators, when they had to worry about workers, when
workers had power because they were scarce, a power that they
failed to consolidate through unionization, which means that
now that they're not scarce, they don't have power.

(10:44):
And when they had to fear new market entry, when they had to
fear people entering the market with new technologies that could
siphon off users and workers and value by offering something
better that modified their existing technology, something
like an ad blocker or a jailbreak for an App Store or
for a printer or for a car or for any other product, they had

(11:05):
to treat us better. But then, you know, we expanded
IP law to make that kind of reverse engineering illegal.
And so now they're just shorn of any discipline. And you
know, it wasn't Mark Zuckerberg who passed the acts in
Parliament or who got Brussels to enact the regulations or who
caused Congress not to update privacy law for, for 50 years.

(11:28):
The last consumer privacy law in America to come out
of Congress is a prohibition on video store clerks disclosing
your VHS rentals. Literally anything else that
violates your privacy is fair game in America.
That was the policy makers. They did it, right?
And so, you know, I don't like these guys, but I think

(11:48):
that if we're going to hold anyone's feet to the fire and if
we're to have any hope of having the next round of imperfect
people nevertheless deliver value to us, we need to get the
policy environment right.
OK, well, we'll come to policy.
I mean, I just want to sort of still explore.
Does this kind of apply to everyone?
I mean, Amazon now. Now, obviously we've all

(12:10):
heard criticism of the way Amazon treats workers, and we're
beginning to hear more about how Amazon treats the businesses
selling on it. But for the consumer, don't we
still just get whatever we want for a reasonable price?
Well, the reasonable price part and actually whatever we want
part are both up for grabs. So what you saw with Amazon's

(12:31):
history is a long period of increasing squeezing of
suppliers and workers. Yeah.
And, as you say, broadly, I think there's been
this kind of Thatcherite idea called consumer welfare,
which is that so long as monopolies are only screwing
workers and suppliers, but delivering value to consumers,

(12:51):
that should be fine. And so for a long time we had
this this thing from Amazon where, you know, you could even
be like a bit of a sharp operator, right?
You could, you could call up Amazon or e-mail Amazon and say,
oh, that thing that I was supposed to get never turned up.
And they just ship you another one and bill the merchant, which
might be some small traders using Amazon as a platform.
So there was a long time when Amazon would have your back, but

(13:12):
that has ended. So let's take just one example
of of how this goes wrong. Amazon's search business, they
call it an advertising business.When I wrote the book, it was a
$38 billion business. Now it's a $58 billion business.
It's worth double the total advertising revenue of every
news outlet in the world every year.

(13:35):
So what is the search business? Well, it's an auction for the
search results. So we've replaced the fitness
factor for successful products in the economy.
That was: do you make lots of people happy enough to pay you
more than it costs to make this? We replaced it with the fitness factor of: do
you make Amazon's algorithms happy enough by parting with
enough money and raising your prices while you do it to sell

(13:57):
to people who are sort of locked into the Amazon marketplace?
Because even if you shop somewhere else, the prices have
gone up there too, because Amazon's imposed this economy
wide tax. So, you know, this is not a
consumer welfare story. And indeed, the outgoing
head of Biden's Federal Trade Commission, the equivalent of
somewhere between the Office of Fair Trading and the Competition

(14:19):
and Markets Authority. They're, you know, the world's
largest, most powerful consumer regulator.
This woman called Lina Khan. She made her bones when she was
a law student. In her third year at Yale, she
wrote a Yale Law Journal article called Amazon's Antitrust
Paradox. I think it was 2017.
And it did something that no law review article has ever done in

(14:39):
the history of the world, which is went viral.
People who weren't lawyers read a law review article, and it was
basically a rebuke to the people behind consumer welfare theory, to
say, look, you predicted that ifcompanies were allowed to
destroy their suppliers and their workers, that they would
never turn on consumers. And you were wrong.
And Amazon proves it. And, you know, within a few

(15:01):
years, she was the most powerful consumer regulator in the
history of the world. And she brought an important
case against Amazon that Amazon briefed against by saying she
had bias against Amazon because she knew too much about them.
And only regulators who hadn't studied and published important
articles about Amazon should be allowed to seek to regulate
Amazon, out of a sort of absolute knowledge vacuum.

(15:25):
And where do you think we are sort of on the cliff edge of
competitors going out of business?
High Street stores, shopping malls, you know, small online
platforms. You know, how long have we got
before Amazon just has everything?
It's pretty bad. I mean, whether or
not they have everything is actually less important than

(15:47):
whether or not they structure the market elsewhere.
You know, you still have like corner shops, but because you
have discriminatory wholesale discounting to big-box
retailers, it is often the case that the guy at the corner shop
ordering a case of fizzy drinks pays more wholesale than you

(16:09):
would pay retail at a big warehouse store like a Walmart.
And so it just means that they're always going to exist at
the periphery, that they can never grow, and that they could
be extinguished at any minute. And they will live and exist in
increasing desperation. And also, they'll probably have
to, you know, find other ways to raise prices around the edges on

(16:31):
things that you're quite desperate for, like the
infamous water-soluble umbrella that every newsagent puts out
the minute the rain starts falling and charges whatever £20
for, because otherwise they just go out of business.
So it just makes the whole market sour.
I'm not going to, like, bet on the day when Amazon's the only
seller. You know, famously, the market
can remain irrational

(16:53):
longer than you can remain solvent.
But you know, there's a a corollary to that out of
finance, which is Stein's law, which is anything that can't go
on forever eventually stops. And I think that just as
dangerous as the market entirely run by Amazon is the market in
which Amazon like gets angry at the UK or makes some bad
decision where things get much, much worse precipitously and we

(17:15):
can't reboot a pluralistic retail sector to pick it up.
And then we just end up with shit even worse than
what we have now. You know, the problem with a
benevolent dictator isn't merely whether they're sometimes not
benevolent, it's also whether they're sometimes not competent,
right? If they if they make a mistake
and there's no one to stop that mistake and nothing to pick up
the slack, then we're all prisoners of their folly.

(17:39):
OK, so, so what about a company like
Google, which still claims to want to do no evil, that says...
Well, they took that out of the mission statement.
They still say they want
to organize the world's information and make it useful,
which is arguably the thing they're not doing.
But they still say, well, look, you know, we believe
in openness. You can, you know, you can
take your data. You can, you know, you can have

(18:00):
control over how much we know. In what sense is Google also
guilty of enshittification? Well, I mean, Google does
not make good on any claims about whether or not its users
can control what it knows. Very infamously, in one of the
antitrust cases against Google, there was a set of internal memos

(18:21):
published by the Department of Justice, where the
person who ran Google location services said, I've turned
off location tracking in, I think it was, 12 places, and I've
just found a 13th where I'm still being tracked.
So, you know, the, the, I think you should think of the privacy
dashboard on your Google settings as being something like
the Fisher-Price steering wheel that you clip to your kid's

(18:45):
car seat so that they can pretend to drive.
It's not connected to anything. But, you know, speaking of the
Department of Justice and the antitrust cases against Google,
we have found out lots of ways in which Google just
deliberately made things worse for its users to make things
better for its shareholders. And in the search case, as
reported by Ed Zitron, we see a set of memos going back and

(19:08):
forth between two factions of Google executives about the
destiny of Google search. So the precipitating event here
is that Google search revenue growth had stalled, and it had
stalled for the excellent reason that Google had a 90% market
share. So you do not grow your market
share from 90%. I mean, sure, you can raise a

(19:30):
billion humans to maturity and make them Google customers, and
that's a product Google calls Google Classroom.
But it takes a while, right? And in the meantime, the market
would like to see increasing returns on this product with a
90% market share. And so you have one executive,
this guy called Prabhakar Raghavan, who's an ex-McKinsey
guy who comes to the company from Yahoo and who's in charge

(19:50):
of Google search revenue. And his proposal is why don't we
just make Google search worse? If we turn off all the things
where we try and guess at what you meant to make your query
better. Some of that is just
query stemming. So that would be, like, you search
for trousers and it runs a background query for pants as
well. Some of that is spell checking,

(20:11):
some of that is like context awareness.
So, you know, there was this news cycle here in the United
States where a person who worked for the US government walking
down the streets of DC saw a National Guardsman as part of
their occupying force and threw his submarine sandwich at this
guy. And it was quite a cause célèbre.

(20:33):
They called it assault with a deli weapon.
And if you search for submarine sandwich on that day, it would
be nice if Google was like context aware enough to float
those to the top because it's more likely on that day that you
are searching for information about that news cycle.
And so Raghavan's proposal is we just turn this stuff off.

(20:53):
So you search for Submarine Sandwich and you don't get the
National Guardsman story. So you have to search again, you
have to type Submarine Sandwich, National Guardsman.
Maybe you don't find the story then.
So then you type Submarine Sandwich, National Guardsman DC,
and then you see the story. And that's three chances to show
you ads. They didn't have to worry about
being good because you were never going to find another
search engine and switch from Google to them.

(21:14):
And so this made them sort of immune to competitive pressure.
And so when the moment came, Google decided to make stuff
worse. But I mean, the competitive
pressure for Google has arrived now, hasn't it, with AI and
ChatGPT and their own browser. And you can see young people
have already switched from the Google default to ChatGPT as

(21:36):
their default. So so will that, will that
change behaviour do you think? Or will ChatGPT just end up
doing the same thing? Well, first of all, yeah, AI is
incredibly enshittification-prone because it makes these mistakes
that they grandiosely call hallucinations, but which we
would call product defects. And it's impossible to tell

(21:59):
whether the AI chatbot has done something that is to the
company's benefit and your detriment because of these, you
know, sort of ineffable, inexplicable hallucinations,
AKA product defects, or because they put their thumb on the
scales. So did you get directed to a
product that the company gets a big Commission on because that

(22:23):
was the best product, or was it because the AI hallucinated?
Did your query, which would normally burn 2 tokens, burn 200
tokens and cost you £5 instead of 5p?
Was that a hallucination, or was that because the company
decided that this was a way to cheat you?

(22:44):
It's very hard to tell. And you know, like deposing the
engineers and pulling the server logs and making the case would
be really hard. So there's an enormous
temptation to do this. But there's a much bigger
problem with AI, which is that it's not real.
AI has spent $700 billion on capital expenditures thus far.
They are making at most $60 billion a year, but it's not

(23:06):
anywhere near $60 billion a year. To take just one slice of
revenue from that $60 billion figure:
That's the $10 billion that Microsoft gave to OpenAI and
OpenAI gave back to Microsoft, which Microsoft then booked as
revenue, right? This is like someone in a green
apron outside of a Starbucks handing you a voucher and you go
and you get a latte and Starbucks announces that they

(23:27):
made £7 from you. They didn't make £7 from you,
right? So this is not revenue, right?
They have $50 billion max, and it's less than that, to recoup
$700 billion. And how long do they have to
make that much money before that $700 billion is a write-off?
Well, they are amortizing or depreciating those GPUs that
they bought on a five year depreciation schedule.

(23:48):
So that would mean that they would have to somehow make $700
billion at $50 billion a year over five years.
Except privately they'll tell you that the GPUs burn out
in two years. Meanwhile, you've got 7
companies, the so-called Magnificent 7 that are between
31 and 33% of the S&P 500. This is like the FTSE index but
for America. And they're just handing around

(24:09):
the same $100 billion IOU very quickly and pretending that all
of them have that in their bank account.
And so when the crash comes and the crash will come, most of
these foundation models are going to be switched off, right?
Like, it's not just that they would have to charge you to use
this. They would have to charge you a gigantic amount of money
to use this. You know, they are talking up

(24:30):
the demand for this, but you know, if they had a lot of
demand for it, presumably one of the things they could do rather
than borrowing lots of money or rather than, you know, seeking
more money in the capital markets is they could start
charging money for it. That is the traditional way by
which firms manage demand that exceeds

(24:50):
their capacity, and they use some of that revenue to build
out the capacity. I think they know that if they
charged even a penny for this, that especially the young people
who are typing into OpenAI for their search results would
just run for the exits. And that's before we even get to
whether or not AI can be a good search engine or whether or not

(25:10):
even if AI could be a good search engine today, whether it
could be a good search engine tomorrow.
Because of course the reason that these companies have the
data they have is because they scrape public websites that they
now substitute for. And I'm of the view that the
copyright question is kind of a distraction.
I think even if we reform copyright, or clarify

(25:33):
copyright to make it unambiguous that you could stop a search
engine from training its model with your material if you wanted
to, the firms that represent the entertainment
industry, who are my boss, right?
The people who pay me as a writer, that the thing they want
more than anything is just not to pay me anymore.
And so they would say, well, from now on, if you want to do

(25:54):
business with us, you have to sign over that right.
And then we're going to train a model and we're going to try and
put you out of business. So it's really about like the
distribution among the people who make a meal of me, like who
gets my legs and who gets my arms.
It's, it's not about like protecting me from being
devoured by these companies. But you know, the real
problem is that like, who's going to let them scrape their

(26:15):
stuff, right? Not, like, who's going
to sue them over it? Who's going to let them?
Who's going to welcome their crawlers if the only thing that
you get at it is the cost of serving the query and no search
traffic back? And so I, I just think it's not
real. Like that's the problem with it.
It's not real. It's like asking what we'll do
when the anti gravity machine arrives.

(26:36):
OK, so look, people listening to this are thinking, right, what can
I do? And I, you know, I suppose the
big point is you as an individual can't do that much,
can you? I mean, this, this, this is a
government task. And and not just one government.
It's the world. Well, yes, I think that, you
know, trying to shop your way out of monopoly is like trying
to recycle your way out of the climate emergency.

(26:58):
You know, the reason billionaires want you to vote
with your wallet is because there's not many billionaires,
but they have very thick wallets.
And so the only vote they ever win is the wallet vote.
They never win the vote vote because there's like 6 of them.
So, you know, this is the whole point of trying to get you to
fetishize your consumption choices rather than become a
member of a polity. And I think you have to become a
member of a polity. Obviously, electoral politics in

(27:20):
the UK are in quite a ferment. And, you know, where the
policies land, where the smaller parties end up, that is, like,
entirely up for grabs. And what we've seen in other
periods of dislocation is that smaller parties can become
kingmakers or they can even win surprise majorities or become
surprise members of a coalition. So that's one way you can get

(27:41):
involved. But there's also the nonprofit
sector. So I have worked for the
Electronic Frontier Foundation, which is an American NGO
with a strong European presence. I was their initial European
director, and you can get involved at eff.org.
I also helped found a UK equivalent to EFF, something
called the Open Rights Group, which you can find at
openrightsgroup.org. And it has community groups in

(28:03):
cities and in the nations, in Wales and Scotland and so on,
that work on these issues. But you said that it has to be
all the countries in the world. I actually think that's not
quite true. I think this is the secret
strength here of the tech policy battle as it stands, that tech
is so slippery and so flexible that if we were to repeal the

(28:27):
laws that ban reverse engineering and modification,
that you could imagine some smart person down in what we, I
guess, have to still call Silicon Roundabout.
Or maybe, you know, Manchester. Or, I teach at the Open
University, maybe it'll be someone in Milton
Keynes who figures out how to reverse engineer an iPhone and

(28:48):
raises the capital and makes the little dongle that you can buy
in the checkout aisle at the Asda that jailbreaks your phone
and installs a third party App Store.
And maybe they sell apps, but maybe they actually sell that to
everyone who wants to have an App Store.
And because digital is so slippery, anyone in the world
who wants to buy that product could just get it from you over

(29:09):
the Internet, right? So I am, like all the best
Americans, a Canadian. And Canadians have figured out how
to make Americans happy by selling them reasonably priced
insulin, which we stick in the post and mail to them.
That's a relatively inefficient process.
But Canada, the UK, some EU member state, maybe Ghana, which

(29:31):
has a lot of engineers, or Nigeria, which has a lot of engineers,
and not only that, a lot of engineers who've just come home
because America has made it clear that if they are caught on
the streets in America, they'll be kidnapped by masked goons and
sent to a gulag. And there's capital there that
has fled America because it would like to be involved in a
business whose success isn't solely determined by how
many Trump coins it buys. The country that does this first

(29:55):
could make, I think, tens if not hundreds of billions of dollars
destroying the rent-extraction machines of American big tech
and then exporting those products to everyone in the
world, including Americans, who are frankly the first victims of
this stuff. They were patient zero in this
epidemic. So I do think that, like, we just
need one country to break from the bullying that the US Trade

(30:18):
Representative subjected them to for 25 years, which said: you must not
allow your entrepreneurs and your technologists to put our
technologists out of business. I mean, you mentioned, you
know, you've spent a lifetime, an adult lifetime, sort of
campaigning on this stuff. I mean, given where you've
analyzed that we are, doesn't it suggest that the pressure

(30:40):
groups and the protest groups don't work or aren't working?
Well, you know, I will say that we've had plenty of
victories and plenty of defeats over that time.
If we'd only ever lost, I probably wouldn't have made it
this long. But we have lost a lot of these fights.
And I think what's changed is the coalition partners we can
recruit. You know, anytime you see a

(31:02):
group of people who've been trying to make something happen
for a long time unsuccessfully, and they become successful, you
should assume that they found some coalition partners.
It's not that, like, they just happened on a cool new hack to let
the same people apply the same amount of pressure but
force-multiply it through technology or through some messaging
campaign that suddenly makes it work.

(31:23):
It's because they found a way to set aside their differences with
other people and make it go. So a lot of people who've been
involved in consumer rights and privacy rights and labor rights,
they've been arguing about this tech stuff for a long time.
And as you say, we have not succeeded to the degree that I
would like us to have succeeded. Mostly we fought a holding
action, and on the whole we lost ground.

(31:44):
But we won. Why do you think that is?
Well, because our adversaries were the most
wealthy people on earth, who made a lot of money by stealing from
the public and then rolling some of that money back into
influencing lawmakers. Right? When the former deputy PM
of the UK gets put on 4 million a year to run around Europe and

(32:05):
say you mustn't break up Facebook because we're the only
thing defending European cyberspace from Chinese
communism. But we have new allies, all
right? So on the one hand, you have
firms and entrepreneurs and technologists who'd like to just
make a hell of a lot of money. But then you have another set of
allies, which are the national security hawks, who I think are
going to be the decisive partners in this coalition

(32:27):
because Trump has made it really clear America does not have
allies and it doesn't have trading partners.
It has adversaries and rivals, and it will attack them using
technology. And now you have Brussels
scrambling to build the EuroStack, to try and build European
servers with European services that European businesses can
relocate to. But unless we legalize

(32:49):
jailbreaking and reverse engineering, they're not going
to get their data out of these government ministries.
No one's going to copy and paste a million documents by hand.
And it's not just data, because, if you'll recall, when the
Russian looters stole those Ukrainian tractors, the John
Deere company bricked the tractors, right?
Which, like, it's cool if you're a cyberpunk writer like me; it's
kind of fun to think about. But if you think about it for 10

(33:11):
seconds, you realize, well, anyone Donald Trump doesn't
like could have all their tractors bricked too.
So there is such an impetus for fixing this.
So now we have this three-legged stool of entrepreneurs, digital
rights activists and national security hawks, including people
worried about China, because this is also a risk for Chinese
inverters and solar batteries. It's also a risk for Chinese
telecoms companies. We need to jailbreak all of this

(33:33):
technology and put open-source software in it.
And this is finally the moment, I think, where we have enough of
a coalition to do it. And of course, when you talk to
any government minister pretty much anywhere about this stuff,
they say, well, we can't act alone.
You know, we can't declare war unilaterally.
We have to work with everybody else.
What could the British government do?

(33:56):
Yeah, for example, without European agreement or American
agreement at all? Right.
So instead of saying, let's take our deeply, literally,
figuratively and notably enshittified water and commit
it to an endless array of data centers funded with debt that
will be white elephants when the bubble bursts, and try to

(34:19):
fire civil servants and replace them with defective chatbots,
we could say, well, we're not in the EU anymore.
Article 6 of the Copyright Directive, which bans British
businesses from competing with American businesses by raiding
their highest margins? We don't need to follow that
anymore. We are going to unleash capital
to do this, to strengthen our digital sovereignty, but also to

(34:42):
do our industrial policy. And so this is a giant
opportunity. And it would be something that,
rather than indiscriminately hitting the American economy...
You know, in Canada, we've done retaliatory tariffs, which
is to say we made everything we buy from America more expensive.
It's a very weird way to punish America, right?
It's like punching yourself in the face as hard as you can and
hoping the downstairs neighbor says ouch.

(35:04):
And it's totally indiscriminate. Like, tariffing soybeans just hits
some poor farmer who, you know, lives in a state that begins and
ends with a vowel and has never done anything bad to Canada, and
who is tormented by the John Deere company, who rip him off
every time he needs to fix his tractor.
Rather than whacking that poor sod, we could make the tools
that let him fix his own goddamn tractor.
We could put John Deere and the other tech companies that gave

(35:26):
millions of dollars to Trump out of business.
And we could float our own domestic tech sector that
everyone in the world would buy products from.
And why do you think British politicians aren't more
interested in this kind of agenda?
I mean, are they just scared? I mean, I imagine they would

(35:47):
just regard that as a declaration of war on Donald
Trump's America, and that's something Britain can't do.
Yeah, I think that 25 years of activism by the US Trade
Representative has lobotomized our policymakers.
They no longer see this idea that someone else's margin can
be their national opportunity. They really think that you

(36:08):
should not upset other people's apple carts, that disruption is
something that only Americans get to do,
that when they do it to us it's progress,
but if we do it back to them, it's piracy.
And I think they've just forgotten this policy response.
I think a lot of consumers and even technologists have
forgotten this policy response. You know, I am a proud dropout
of the University of Waterloo, which grandiosely

(36:32):
bills itself as MIT North. And they were kind enough to
bring me back for the 50th anniversary of their computer
science program. And I gave a talk about this
stuff. It was more than a decade ago.
It's kind of what set me on this path today.
And I had this grad student raise her hand and say, you
know, I can figure out how to build a thing like Facebook, but

(36:54):
I can't figure out how to get people to abandon their friends
and come to it, because they love their friends.
And what do I do? And I told her about how
Facebook launched, because we've kind of forgotten this, too.
When Facebook started and it was competing with Myspace, their
pitch wasn't just, you know, come and use our superior
service with its better privacy policy all alone, right?

(37:16):
And sit in smug superiority, rereading that privacy policy
until your stupid friends get the message and come join you
here. They gave those users a bot, and
you gave that bot your login and your password,
and it would go to Myspace several times a day.
It would scrape everything waiting for you on Myspace in
your feed and put it in your Facebook feed.
And you could reply to it, and it would push it back out again.
So you could eat your cake and have it too, right?

(37:39):
You could see what was happening on Myspace without being a
Myspace user still. And when I told this to this
grad student, she was like, but can we do that?
Like, wouldn't that be wrong to do?
And the coda to this is, I ran into her a couple of years ago.
I was doing a signing in New York, and this person comes up

(38:00):
to the front of the signing queue at The Strand and says,
you know, I saw you at the University of Waterloo on our
50th anniversary of our computer science program.
And I said, I had the most remarkable conversation with a
young grad student there. And she said, that was me.
And I've devoted my life to exploring how jailbreaking and
reverse engineering can make technology better and undo the

(38:21):
worst excesses here. So I think that when you awaken
people's imaginations to this, right, it's not just that
they're cowardly or supine. It's not just that they're
bullied by billionaires. They've just literally
forgotten. They've forgotten that there's a
totally honorable tradition in technology of reverse
engineering a legacy platform to migrate users to a new platform,

(38:43):
and that this has been the great engine of dynamism in digital
technology. Right? When Lotus 1-2-3 launched,
you could open your, oh God, what was the other spreadsheet
called? The one that came before it?
VisiCalc. You could open your VisiCalc
spreadsheets in Lotus 1-2-3, and if you couldn't, Lotus 1-2-3 would
have died on the vine and we never would have gotten Excel.

(39:04):
Do you think you've sort of hit a moment with the book and
these ideas? I mean, these ideas have been,
you know, you've been bashing away at them for a
while. But I guess we're all
starting to get the extent to which we've been captured.
So I suppose, how hopeful are you that you're onto something now?

(39:26):
Well, I'm glad you used the word hope and not the word optimism.
I have no time for optimism or pessimism.
I think they're both a form of, like, vulgar fatalism: the belief
that what we do doesn't matter, right?
Things will get better or things will get worse because of the
great forces of history. I think we're the great forces
of history, that what we do matters.
And hope is the idea that if you can make a material change to
your environment, even if you can't see your way all the way

(39:49):
to where you're trying to get to,
if you can ascend the gradient just a few steps towards that
future, then from that new vantage
point you can see pathways that are open to you that were
occulted when you were down at the bottom there in the valley,
and that in that stepwise fashion, maybe with some
traversing and doubling back and so on (it's messy), you can
find your way to a better world. And I'm very hopeful because, as

(40:11):
you say, for a quarter of a century, most of my adult
life (I'm 54 years old, and in just a couple of weeks I'm going
into my 25th year of this), I've been pushing on a door
that wasn't just closed, but treble-locked.
And now the door is open a crack.
And people who are just kind of
waking up to this stuff, some of them say to me, what are you so

(40:33):
excited about? That door is only open a crack.
And I'm like, you're goddamn right it is open a crack.
Like, this is the most incredible thing.
I've been waiting for that door to be open a crack for a quarter
century. Let's push on that door, right?
I think that we are really at a moment, and it's weird that it's
Trump that's delivered it to us.
But you know, when life gives you SARS, you make

(40:53):
sarsaparilla, right? Trump has basically ended the
American Internet, and now we get to decide what a post-American
Internet looks like. It's up for grabs, and I want to
grab it. Cory Doctorow, thank you very
much indeed. Thank you.