All Episodes

October 20, 2025 21 mins

In a Better Offline exclusive, Ed Zitron reveals how much Anthropic spent on Amazon Web Services in 2024 and 2025, and how the costs of running their services are increasing linearly with their revenue, suggesting there may be no path to profitability for LLMs.

(Free) Newsletter: www.wheresyoured.at/costs/

Want to support me? Get $10 off a year’s subscription to my premium newsletter: https://edzitronswheresyouredatghostio.outpost.pub/public/promo-subscription/w08jbm4jwg

YOU CAN NOW BUY BETTER OFFLINE MERCH! Go to https://cottonbureau.com/people/better-offline and use code FREE99 for free shipping on orders of $99 or more.

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Cool Zone Media. Hello and welcome to a special exclusive episode
of Better Offline. I'm your host, Ed Zitron. As a
result of discussions with sources and documents viewed of the

(00:24):
amounts billed on Amazon Web Services, I am for the
first time in history able to disclose how much AI
firms are spending on AWS, specifically Anthropic and a coding
company, Cursor, its largest customer for API services. I can
exclusively reveal today how much Anthropic spent on AWS for
the year twenty twenty four and from the beginning of
twenty twenty five through the end of September twenty twenty five,

(00:45):
and from what I can see, their compute spend may
vastly exceed what has previously been reported. Furthermore, I can
confirm that through the end of September twenty twenty five,
Anthropic has spent around one hundred percent of their revenue
in twenty twenty five on Amazon Web Services, spending two
point six six billion dollars on compute against an estimated
two point five five billion dollars in revenue. Go to

(01:06):
the newsletter. I source the whole goddamn thing, and if
I'm honest, this piece is the culmination of several months
of articles about how Anthropic's business tactics have, maybe,
turned the screws on their biggest customer. I can exclusively
reveal today, as well as many other numbers in the newsletter,
that Cursor's Amazon Web Services bills doubled from six
point two million dollars in May twenty twenty five to
twelve point six million dollars in June twenty twenty five,

(01:29):
and have stayed inflated since Anthropic increased the costs with
the launch of priority service tiers, an aggressive rent-seeking measure.
I need to be clear: I cannot one hundred percent
guarantee that's what did it. I'm going to hedge my
bets very hard on that, but it certainly bloody well
seems that way. It's my gut instinct. I'm not going
to say it declaratively, but I'm going to show you
why I believe this. And I admit I struggled with

(01:50):
how to turn this into an episode, because the newsletter,
which is on my free feed, is a series of
numbers and analyses that, if I just read them aloud,
would sound extremely dull and at times be quite hard
to follow. It's not something that naturally plays well for radio.
So instead of giving you the audible version, I'm going
to give you the CliffsNotes and speak to a
degree of vindication I feel on reading these costs. So
let's start with a number: one point two two five

(02:12):
billion dollars. That's how much Anthropic spent on Amazon Web
Services in the third quarter of twenty twenty five. They
spent eight hundred and twenty nine point seven million in
Q two twenty twenty five, and six hundred and ten
million dollars in Q one twenty twenty five. Oh, and
one other number: they spent one point three five billion
dollars on AWS in twenty twenty four. So yeah, said
another way, talking of their twenty twenty five numbers,

(02:34):
Anthropic's spend on AWS doubled over the course of three quarters.
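If you want to check the arithmetic yourself, here's a quick back-of-the-envelope sketch of those quarterly figures as stated, in millions of dollars (my own sanity check, not anything from Anthropic's books):

```python
# Anthropic's quarterly AWS spend as stated above, in millions of dollars
q1, q2, q3 = 610.0, 829.7, 1225.0

# Spend roughly doubled from Q1 to Q3 of twenty twenty five
print(round(q3 / q1, 2))  # about 2.01

# The three quarters add up to the ~$2.66 billion through-September figure
total_2025 = q1 + q2 + q3
print(round(total_2025, 1))  # 2664.7, i.e. roughly $2.66 billion
```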
Now a little backstory about Anthropic that's necessary to understand
this fully. Anthropic was originally invested in by both Google
and Amazon. According to The New York Times, Google owns
around fourteen percent of the company. Analysts estimate
Amazon owns somewhere between fifteen and eighteen percent, and both have,

(02:55):
in not so many words, said that they're the main
or primary compute partner for Anthropic. It's unclear how much
Anthropic spends on Google Cloud, but SemiAnalysis believes they're
a big client, and that's about as much detail as
I can get from anywhere I've really looked. In any case,
Anthropic is spending effectively every dollar they make on Amazon
Web Services, and Amazon appears to be booking this

(03:17):
as revenue, though I can't directly confirm that. I
do know these numbers are cash; they're after credits.
In recent months, Anthropic has lowered the amount of
revenue they're spending on it, to eighty six point two
percent in Q three twenty twenty five, which is an
improvement from Q two twenty twenty five, where they
spent one hundred and six percent of their revenue, and
Q one, where they spent one hundred and seventy five

(03:38):
percent of what they made on Amazon Web Services. It's
quite horrifying when you say it out loud. Now, if
you're thinking that, because these numbers are quite close,
this might suggest that Anthropic's costs are improving, think again.
Anthropic's Amazon Web Services costs have a habit of massively spiking.
For example, their AWS bill leapt from three hundred and
eighty three point seven million dollars in August twenty twenty

(04:00):
five to five hundred and eighteen point nine million dollars
in September twenty twenty five. That's one hundred and thirty
five million goddamn dollars. And my hunch is it's because
they have a massive problem where Claude Code users are
each costing them thousands of dollars despite only paying one
hundred or two hundred dollars a month. There's also the
nasty matter of Google Cloud. Anthropic's Amazon Web Services bill

(04:20):
is two point six six billion dollars from January through
the end of September, as I said, and that is
pretty close to the two point five five billion dollars
in revenue. But if Anthropic's spend on Google Cloud was
only twenty five percent of what they spent on AWS,
its compute cost would jump to three point three three
billion dollars through the end of September, way more than
it brings in. If it's half of what they spent

(04:41):
on Amazon Web Services, this becomes a three point nine
nine billion dollar compute bill, and if they spend
the same amount, the bill becomes five point three billion dollars,
and again, that's just through the end of September. Another note:
Cursor's spend on Amazon Web Services is comparatively small, but
includes some spend on Anthropic's models, because Amazon is allowed
to sell them. And I believe that the reason that

(05:01):
they do this, because they do directly pay Anthropic, like,
they actually send money directly to them, is because Amazon
offers significant discounts in some cases for running models through
their service. I think it's their Bedrock service, and my
source confirmed that this was the case, though I could
not get granular data on what exactly Cursor's spend was
on Amazon. Like, I can't say, oh, they use this

(05:21):
model or that model. Now, Cursor spends most of their
compute money directly with Anthropic, as well as every other
model developer whose models they use. AWS is a small
piece of the puzzle, and while small, its spending data
provides evidence of how much this shit actually costs, though
I also concede that some of the money Cursor spends
with AWS likely goes to the non-AI parts of

(05:41):
the business, like file hosting and other tech infrastructure. Nevertheless,
the timing of the massive jump in Cursor's AWS bill,
from six point two million dollars in May to twelve
point six million dollars in June, directly correlates with the
massive changes made to their product, increasing the costs on
any users that wanted to use Cursor in the way
they had in the past by making them face the
actual costs of serving models on a per-million-token basis.
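That doubling isn't a figure of speech, by the way; taking the monthly bills as stated, in millions of dollars:

```python
# Cursor's AWS bill as stated above, in millions of dollars
may, june = 6.2, 12.6

# The June bill is just over double the May bill
print(round(june / may, 2))  # about 2.03
```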

(06:04):
I've written about this a lot, by the way. It's
hard to describe it in detail because it would
take forever. But around mid-June, Cursor had to change
everything, because mysteriously they had to stop spending so much
money on their customers. Their customers were burning a hole
in their pocket, and I think we can kind of
see why. Cursor's costs have also never come down, spiking

(06:25):
to a high of fifteen point five million dollars in July,
dropping to a still-high nine point six million dollars
in August, only to spike again to twelve point
nine million dollars in September, though I cannot declaratively state
that this is exactly what happened. Cursor's costs doubled immediately
following the addition of Anthropic's service tiers in late May
twenty twenty five, which require an upfront commitment of token

(06:47):
spend and token throughput, and when Cursor announced the launch
of their two hundred dollar a month Ultra plan amidst massive
product changes, they cited how it was, and I quote,
made possible by multi-year partnerships with OpenAI, Anthropic,
Google and xAI, and that their support was instrumental in
offering this volume of compute at a predictable price. Now, really,

(07:18):
I'm being as fair as I can. Another factor might
be that the new Claude four models were significantly more expensive.
It's entirely possible that all of these things are true.
I just want to make sure I cover my bases,
because I do not know for sure. But the timing,
the timing, man. And another thing, you know what? Anthropic
also launched, a week before service tiers, a competing product

(07:40):
to Cursor called Claude Code, one that they could run
with as little restraint as they'd like, to drain as
many monthly customers away from Cursor, who is also their
largest customer, buying Anthropic models through their API. Real fucking mystery, right?
If it quacks like a duck, wears a T-shirt
that says duck, and Claude tells you "you're absolutely right,
that's a duck" when you upload a picture of it,
it's probably a fucking duck. But I obviously can't say

(08:01):
for sure. I need to be explicit here with what happened, though.
Anthropic supplied access to their models to a company, Cursor,
and then released a product, Claude Code, that did exactly
the same thing as that company, Cursor, turning it both
into a customer and a competitor in the process, creating
a massive conflict of interest, as not only did Anthropic
have an incentive for that customer or competitor to fail,

(08:24):
though they also needed their compute revenue, which is kind
of a bugger. Anthropic also had the means to make
this failure happen in the most painful and expensive way
possible by worsening the terms on which that competitor acquired
the compute it needed to function. Could be a coincidence,
I guess. And when I say compute, I mean tokens.
Just... I'm reading a script, okay? Where was I going?

(08:44):
Anyway, I'm not going to turn this into a massive,
sprawling episode about this company. I wanted to give you
the raw information so you can go and read the
detailed analysis I did. It's free, by the way, don't worry.
But now I want to talk about how all this
made me feel, because that's what makes this show unique,
I think, and the appropriate way of rounding this out.
I'm going to be honest: I find what it looks like

(09:05):
(and I'm hedging my bets again) Anthropic did to Cursor
truly disgusting. Cursor hit five hundred million dollars in annualized
revenue in the same month that they then saw their
costs double, dramatically reducing the value of their subscription product
at the apex of their success. Yes, Cursor is an
unsustainable AI company, I know, and like all of these companies,

(09:25):
has no path to profitability. Anthropic should have always
charged sustainable rates, even if it meant that it wasn't
possible to build a big company based on their models. Sadly,
we don't live in that universe. And while you could
make the case that startups like Uber didn't at first
charge sustainable rates, I'd argue that the reason why its
initial rates weren't sustainable was the steep upfront
cost of customer acquisition, which is a problem that could

(09:46):
be solved through the lifetime of the customer, and Uber
had the means to gradually ratchet up the costs of
rides, or incrementally reduce the cut that they pay
to drivers, in a way that wouldn't be immediately painful. Furthermore,
Uber never had a fuel problem. What Anthropic has is
a fuel problem: they have a compute problem, for the
amount that they're paying to run their goddamn services. Cursor
is also Anthropic's largest customer, and the timing of priority

(10:09):
tiers coinciding with the moment when they were growing
fastest is a suspicious and potentially disgraceful move. While you
could describe it as a necessary step in the direction
of sustainability, that plausible excuse is undercut by the overall
timing of the move. One cannot ignore how close the
launch of these tiers was to the launch of Anthropic's
Claude Code, a product that lacks Cursor's flashy front end

(10:29):
but performs similar functions, all subsidized by Anthropic's massive hoards
of venture capital and its chummy relationships with hyperscalers like
Amazon and Google. The thing is, even with these moves,
Anthropic still spent a dollar and four cents on Amazon
Web Services for every dollar they made through the end
of September twenty twenty five, and that's for just twenty
twenty five, by the way. Their costs increase nearly linearly with

(10:51):
their revenue. And while they've improved from January, when
they spent a remarkable two hundred and twenty seven percent
of their revenue on AWS, they still spent eighty eight
point nine percent of it on AWS in September. Now, if
you're thinking that hearing how close these numbers are, like
I said, means they're somehow approaching profitability: good lord, no.
I'm repeating myself, I realize, but I really need you to come away

(11:11):
with this, with reality in your brain. These digital Mister
Beans very likely spend comparable sums on Google Cloud, and likely
another billion or two on salaries, data, and, I don't
know, that one point five billion dollar settlement with all
the authors that they just agreed to. This company absolutely
fucking sucks. I don't care if you like Claude Sonnet
or Claude Opus. I don't give a fuck. Claude Opus
and Claude Sonnet are not worth burning billions of

(11:33):
dollars a year in cloud costs, fueling an environmentally destructive,
plagiarism-charged pseudo-company that would roll over and die
within months if it didn't constantly get fed billions of
dollars a year. What are you gonna tell me, they're
gonna turn this ship around? They're gonna make some sort
of autonomous AI coder? You know that's bullshit. Every goddamn
one of you boosters knows that's total bullshit. I'm sure

(11:53):
Sonnet four point five is somewhat better than Sonnet four,
but what does that actually mean? Anthropic raised twenty billion
dollars this year? Do we give them more next year?
I've heard reports that they're actually targeting twenty billion dollars
in annualized revenue, so one point six seven billion
dollars a month in revenue, by the end of next year.
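For the record, that monthly figure is just the annualized target divided back out; a quick check, taking the twenty billion dollar target as stated:

```python
# The reported $20 billion annualized revenue target, in billions of dollars
annualized_target = 20.0

# Annualized revenue is a monthly run rate times twelve, so divide by twelve
monthly_run_rate = annualized_target / 12
print(round(monthly_run_rate, 2))  # about 1.67, i.e. $1.67 billion a month
```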
It's an absolute fucking joke. But the only thing funnier

(12:16):
than that joke is that it will likely cost them
twenty five billion dollars to make that fictional money. And
where, pray tell, is that coming from? And why? Why? What
is so remarkable about this company that gives them a
free pass to burn two point six six billion dollars on
AWS in fucking nine months? I'm not talking about your
cynical "oh, Amazon is booking it as revenue, crony capitalism"

(12:36):
answers here. I'm not, I'm not. I'm talking about
the scientific or technological reasoning for keeping Anthropic alive. And yes,
I feel exactly the same way about Open AI. What
possible achievement does Anthropic have that warrants this needless, endless,
sprawling financial destruction. Why are we rewarding a company with
bad business practices for making a product that loses more

(12:57):
money the more money it makes? I'll even try and
see this through the eyes of an AI booster. (Damn,
all I'm seeing is blue and yellow.) Anyway, even
from here, the only reason to keep Anthropic alive is
because you see these companies as sports teams. You see
Dario Amodei as the equivalent of Dan Campbell or Greg Popovich.
You root for them and their causes because you think

(13:17):
that if they win, you as a fan will be rewarded.
You don't think too hard about what it is that
Claude Sonnet or Claude Opus do, and you find enough
ways that this is somewhat kind of useful to you,
and you use those reasons to justify the proliferation of
a wasteful and destructive technology. What exactly happens here? Anthropic's
AWS bills are not really going down. They've normalized in

(13:40):
an eighty eight to ninety five percent range, and they're
clearly going to stay there. And if your argument is
they'll go down, your argument is quite literally "nuh-uh."
Go read SemiAnalysis for seventeen hours and come up
with some demented GPU-based argument about InferenceMAX scores,
pretend like you give a shit, come up with a
real argument against mine, because I am working harder
at this than you are. And if you believe otherwise,

(14:02):
you should ask yourself why the guy who has said Sam
Altman is full of shit countless times in a premium
newsletter got this scoop and you did not. But that
actually leads me to a key question: how long do
we hand Anthropic, and by extension OpenAI, billions of dollars?
And for the first time in your goddamn life, it's
time to ask, what if I'm right? What if these

(14:24):
companies are incapable of becoming profitable? What if there really
is no massive demand for generative AI? Do you really
think Anthropic will make one point six billion dollars a
month sometime in twenty twenty six? Do you really think that?
And even for Amazon, it's kind of shit, wow, to
do a couple billion on one hundred and five billion
(14:45):
dollars of capex. I might have even said this later
in the script, but just thinking about it makes me
feel a little crazy. And look, I get there's a
middle ground here, where people say that there's some sort
of use case that sort of works for AI, where,
if you hit it hard enough with good enough prompts
or whatever: that you like it for search, that you
brainstorm with it, that it helped you pick out a hat,
that you used it to solve some sort of problem once.

(15:08):
I just want to ask you: how much are those
anecdotes really worth to you? How impressed with these things
are you? Would you pay double, triple, quadruple? Would you
pay on a metered basis, where those little flights of
fancy cost you a few cents, then ten cents, then
a dollar? Because that's how much it costs to provide
these services, and at some point you're going to be
made to pay for it one way or another. Advertising

(15:43):
won't be the answer, by the way. The literal only
company to try advertising in large language models is an
AI search engine company called Perplexity, and they just paused
accepting new advertisers to, and I quote Adweek, "rethink
how ads fit into its AI search experience." They made
twenty thousand dollars in twenty twenty four in advertising revenue.

(16:04):
Are we supposed to be impressed that Perplexity made enough
revenue to buy a second-hand Toyota Corolla? There are
people making more money than that slinging fucking Herbalife. And
this is literally the exact company that should have succeeded,
based on any kind of "ads will fix everything" argument.
And they couldn't even buy courtside tickets at the NBA playoffs.

(16:26):
The costs are increasing linearly with revenue, and I've fucking
proved it. I am open to any compelling arguments that
can explain how this ever changes, and my god, if
you say Trainium, I will absolutely lose my shit. Chips
aren't fixing this, by the way. If your answer is
that Anthropic will make some sort of theoretical ultra-powerful large
language model or invent AGI, you are a goddamn mark.

(16:49):
You are being conned. Look, join me. I'm serious. There's
no harm in being wrong. I've been wrong tons of
times in my life. Being wrong and admitting you're wrong
is an act of bravery. Shit, I actually kind of get it.
This stuff feels, if you let it, like it's doing
something for you, even though interacting with it is actually

(17:09):
draining you because you're constantly having to find ways to
make it do what you want it to do, to
the point that when it actually does something for the
first time, it almost feels magical. You feel very powerful,
despite the fact that you have been put to work
to make automation work. That's not how automation is meant
to work. And sure, there are software engineers out there
who have, like any good software engineer, found a way

(17:31):
to take the useful parts of LLMs and use them to,
to quote Carl Brown of Internet of Bugs, make
the easy things easier. Then there are the ones that
are spending more time than they were building software,
prompting LLMs and rewriting claude dot md files, and thinking
that, because things sort of worked after they hit enter,
they're privy to a great becoming. And there are
the victims, of course, of vibe coding startups, companies that

(17:54):
sell the outright lie that somebody who cannot read or
write software can write secure, effective and functional software. Look,
I'm serious, join me. If you're an AI booster, I
don't care. Everybody is welcome in reality. I don't care
who you are. I don't care if I've called you
a booster and given you a verbal swirly one hundred times.
Now is the time to accept that this software is

(18:16):
too expensive, too destructive, and too wasteful to continue backing.
I'm not even saying you have to say "fuck AI"
or shun ChatGPT like you're an Amish teenager that
looked at porno. But it's time to be loud and
direct that these products are not worth the egregious and
perpetual annihilation of billions of dollars every fucking year. I
don't even know if this means you have to stop

(18:36):
using them. I don't want you to, but, like,
what are we gonna do? These things are not gonna
go away because you stopped using Claude. They're gonna go
away because you stop talking about them. They're gonna go
away because they cost too much and their pay pigs
stop paying them. What I am advocating for is for
everybody to openly discuss that the amount of money it

(18:58):
costs to run these companies is at odds with what
they have built, are building, and will build in the future.
Nothing they are building is moving towards superintelligence or AGI.
No combination of Amazon Trainium or Google TPUs is going
to usher in the birth of the machine god. The
products they make are, at best, and in inconsistent moments,
kind of cool, but one hundred times more often mediocre, unreliable,

(19:21):
and outright ridiculous, even if you really get a lot
out of these models. Do you think that these companies
should be allowed to burn billions of dollars every year?
How much do you think they should be allowed to burn?
And how much is too much for you? It's time
to start having this conversation and having it publicly, especially

(19:42):
as Clammy Sam Altman bloviates about building two hundred and
fifty gigawatts of data centers in seven goddamn years, at
the cost of one third of America's entire fucking economic
output in twenty twenty four. Anyway, this has been a
big day for me, so I'm going to leave it there.
It's a huge scoop. I'm grateful that I get to
do this every day, grateful for you listening, and grateful
for your reading. I hope you've enjoyed this episode and

(20:05):
thank you, as ever, for supporting my work. Thank you
for listening to Better Offline. The editor and composer of
the Better Offline theme song is Mattosowski. You can check
out more of his music and audio projects at mattosowski
dot com, M A T T O S O W

(20:28):
S K I dot com. You can email me at ez at
Better Offline dot com or visit Better Offline dot com
to find more podcast links and, of course, my newsletter.
I also really recommend you go to chat dot wheresyoured
dot at to visit the Discord, and go to
r slash Better Offline to check out our Reddit. Thank
you so much for listening. Better Offline is a production

(20:49):
of Cool Zone Media. For more from Cool Zone Media,
visit our website coolzonemedia dot com, or check us
out on the iHeartRadio app, Apple Podcasts, or wherever you
get your podcasts.

Host

Ed Zitron



© 2025 iHeartMedia, Inc.