
July 24, 2025 37 mins

In part two of this week's three-part Better Offline, Ed Zitron walks you through how little money there is in generative AI, how Anthropic and OpenAI are killing their own customers, and why there may never be a profitable LLM company.

YOU CAN NOW BUY BETTER OFFLINE MERCH! Go to https://cottonbureau.com/people/better-offline and use code FREE99 for free shipping on orders of $99 or more.

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Zone Media. Hello, and welcome to Better Offline. I'm your host Ed Zitron. Subscribe to the newsletter, buy the merchandise.
It's all in the notes. And we're on the second

(00:23):
installment of our three-part Hater's Guide to the AI bubble, and the cracks within the generative AI industry, and how they're becoming bigger and scarier, and the potential economic meltdown caused by a collapse in generative AI spending. Well, it's not really generative AI spending, it's literally just fucking GPUs. And I think it might be sooner and likelier than many think. Toward the end of the last episode,

(00:44):
we talked about one of the inane comparisons we hear between today's nation-state-sized spending on gen AI capital expenditures and the investments that Amazon made when scaling Amazon Web Services, which was literally the foundation of cloud computing at scale. I would say someone's going to email and say I'm wrong. Not going to read it. And I had to cut things short because we ran out of time.

Speaker 2 (01:03):
But I want to.

Speaker 1 (01:04):
Continue the conversation, because I think it's important to examine this comparison thoroughly, if not just to explain why it doesn't work. It's also, I want to stop. I want to stop hearing it. When people say it to me, I just want to send them this fucking episode and say, leave me alone, buddy boy.

Speaker 2 (01:18):
But but bye.

Speaker 1 (01:20):
But the first point I want to make in this episode is that generative AI and large language models do not resemble Amazon Web Services or the greater cloud compute boom, and generative AI is not infrastructure. Now, some people compare LLMs and their associated services to Amazon Web Services or services like Microsoft Azure or Google Cloud. They're giant, multi-billion-dollar operations that basically share their server capacity with

(01:43):
companies wanting to run stuff on the Internet or within their own systems. A very fudgy way of putting it: they help make sure that applications work online. These are very, very useful services, and by the way, people are wrong to make the comparison between them and the LLMs, as I'll get into now. Amazon Web Services, when it launched, comprised things like, and forgive
Services when it launched, comprised of things like and forgive

(02:04):
me how much I'm going to dilute this, Amazon's Elastic Compute Cloud (EC2), where you rent space on Amazon's servers to run applications in the cloud, or Amazon Simple Storage Service (S3), which is enterprise-level storage for applications and storing things. It's not just like a simple hard drive. It's redundancy, it's making sure it's copied in places so latency comes down, tons of other things. But in simpler terms,

(02:25):
if you were providing a cloud-based service, you used Amazon to both store the stuff that the service needed and do the actual cloud-based processing, the compute, which loads and runs applications but delivers them to thousands or millions of people online.

Speaker 2 (02:38):
And this is a huge industry.

Speaker 1 (02:40):
Amazon Web Services alone brought in revenues of over one hundred billion dollars in twenty twenty four. And while Microsoft and Google don't break out their cloud revenues, they're similarly large parts of their companies, and Microsoft has used Azure in the past to patch over shoddy growth. These services are also selling infrastructure. You aren't just paying for compute, but the ability to access storage and deliver services with low latency, so users have a snappy experiences

(03:02):
wherever they are in the world. And I know I just said a snappy experiences. I'm not editing it. The subtle magic of the Internet is that it works at all, and a large part of that is the cloud compute infrastructure, an oligopoly of the main cloud providers having such vast data centers. This is much cheaper than doing it yourself, to a certain point. Dropbox moved away from Amazon Web Services as it hit scale, for example,

(03:23):
but this also allows someone else to take care of the maintenance of the hardware and make sure it actually gets your stuff to your customers. You also don't have to worry about spikes in usage, because these things are usage-based, hence the elastic, and you can always add more compute to meet demand, or just have it at a particular time. There is, of course, nuance: security-specific features, content delivery services, database services. There's nuance behind these clouds.

(03:47):
You're buying into the infrastructure of the infrastructure provider, and the reason these products are so profitable is that, in part, you are handing off the problems and responsibility to somebody else. And also, most web applications are not that demanding of cloud compute. They might be expensive to provide to millions of people at scale, but Facebook was not a super complex, I don't know, website depending on thousands or millions of GPUs. And that's before the idea that there are multiple product categories you can build on top of something like AWS, because ultimately cloud services are about Amazon,
Microsoft and Google running your infrastructure for you. Large language
models and their associated services are completely different, despite these
companies attempting to prove otherwise. And it starts with a
very very simple problem. Why did any of these companies

(04:31):
build these giant data centers, and why did they fill them full of GPUs? Amazon Web Services was created out of necessity. Amazon's infrastructure needs were so great that it effectively had to build out the software and hardware necessary to deliver a store that sold theoretically everything to theoretically anywhere, handling both the traffic and customers, delivering the software that runs Amazon dot Com quickly and reliably, and well, making

(04:52):
sure things kept working, making sure they were stable. And it didn't need to come up with a reason for people to run web applications. People were already running applications client-side on their computers, and Amazon realized that doing so at scale in the cloud would be valuable. Or they were already doing so in a way that was likely not particularly cost-effective, and the ways they were doing so were inflexible,

(05:13):
and they required specialist skills and indeed physical infrastructure and personnel. They were quite expensive. So Amazon Web Services took something that people already did, that there was actually proven demand for, and made it better and scaled it. Eventually, Google and Microsoft copied them, because that's all they can do.
And that appears to be the only similarity with generative AI: that, due to the ridiculous costs of both the data centers

(05:33):
and GPUs necessary to provide these services, it's largely impossible
for others to enter the market.

Speaker 2 (05:38):
You know.

Speaker 1 (05:38):
After that, generative AI feels more like a feature of cloud infrastructure rather than the infrastructure itself. AWS and similar megaclouds are versatile, flexible, and multifaceted. Generative AI does what generative AI does well, and that's about it. You can run lots of different things in AWS. What are the different things you can run using large language models?
What are the different use cases and indeed user requirements

(06:01):
that make this the supposed next big thing? Perhaps the argument is that generative AI is the next AWS or similar cloud service, because you can build the next great companies on the infrastructure of others: the models of, say, OpenAI and Anthropic, and the servers of Microsoft. Okay, okay,
let's humor this point too. You can build the next
great AI startup, and you have to build it on

(06:22):
one of the megaclouds because they're the only ones that
can afford to build the infrastructure. One incy-wincy, teeny-weeny small problem: companies built on top of large language models don't make much money, and in fact they're almost all deeply unprofitable. But let's establish a few flats to get going. I said flats. Flats. Jesus Christ.

Speaker 2 (06:41):
Facts.

Speaker 1 (06:42):
Here are the facts I'm establishing. Outside of one exception, Midjourney, which claimed it was profitable in twenty twenty two, which may not still be the case, I've actually reached out to ask them and they didn't get

Speaker 2 (06:52):
Back to me.

Speaker 1 (06:52):
Every single LLM company is unprofitable, often wildly so. Outside of OpenAI and Anthropic, and Anysphere, which makes the AI coding app Cursor, there are no large language model companies, either building models or services on top of others' models, that make more than five hundred million dollars in annualized revenue, meaning month times twelve. Outside Midjourney's two hundred million ARR and Ironclad's one hundred

(07:14):
and fifty million ARR, also fucking Perplexity, there are only twelve generative-AI-powered companies making one hundred million dollars annualized, or eight point three million dollars a month, in revenue. The database, then, this is the Information's generative AI database, doesn't have Replit, which also announced it hit one hundred million in annualized revenue. I've included it in my

(07:37):
statement of facts. Of these companies, two of them have been acquired: Moveworks, acquired by ServiceNow in March twenty twenty five after the company shit the bed big time, and Windsurf, which was acquired by Google and Cognition in July twenty twenty five in one of the most annoying deals of all time. But for the sake of simplicity, I've left out companies like Surge, Scale, Turing, and Together, all of whom run consultancies selling services and training stuff

(07:59):
for training models. Otherwise, there are seven companies total that
make fifty million dollars or more annual recurring revenue, which
is four point one six million dollars a month.

Speaker 2 (08:08):
Now, none of this is to say.

Speaker 1 (08:09):
That one hundred million dollars isn't a lot of money
to you and me.

Speaker 2 (08:12):
I just want to be clear.

Speaker 1 (08:13):
If you want to give me one hundred million dollars,
I'll do anything. I'll oink like a pig for you, anyway. But in the world of software as a service or enterprise software, this is chump change. HubSpot had revenues of two point six three billion dollars in its twenty twenty four financial year. Three years into this crap, and generative AI's highest-grossing companies are OpenAI, ten billion annualized as of June, and Anthropic, four billion annualized as of July.

(08:36):
Don't like saying that word. Both of them lose billions a year after revenue. There are really three problems here. Businesses powered by generative AI do not seem to be popular; those businesses that are remotely popular are deeply unprofitable; and even the less popular generative-AI-powered businesses are also deeply unprofitable. But I want to start somewhere, because I keep hearing about fucking Cursor. Let's fucking start with Anysphere

(09:01):
and their app Cursor. It's an AI-powered coding app, and they have five hundred million dollars of annualized revenue.

Speaker 2 (09:09):
Pretty great, right, ha.

Speaker 1 (09:11):
It hit two hundred million dollars in annualized revenue in March and then hit five hundred million in June, after raising nine hundred million dollars. That's amazing, Ed. Ed, it's time to walk to the garage. Ed, it's over for you. Wrong. It's a mirage. Cursor's growth was the result of an unsustainable business model that it's now had to replace with opaque terms of service, dramatically restricting access to models, and

(09:32):
rate limits that effectively stop its users using the product at the price point they were used to. Go to r slash Cursor on Reddit. Take a look. Take a look at how happy everyone is.

Speaker 2 (09:42):
I want to know why.

Speaker 1 (09:43):
My peers in the media don't seem to have the
ability to talk to actual fucking customers. It's ridiculous. This
company is circling the drain, and nobody seems to want
to talk about it, despite how big a deal that is.

Speaker 2 (09:54):
Oh.

Speaker 1 (09:54):
Also, Cursor is horribly unprofitable, and I believe they are a sign of things to come in generative AI. A couple of weeks ago, I wrote up the dramatic changes that Cursor made to its service in the middle of June for my premium newsletter, and discovered that they timed these changes precisely with Anthropic, and OpenAI to a lesser extent, adding service tiers and priority processing, which is tech language for pay us extra if you have

(10:15):
a lot of customers or face rate limits or service delays.

Speaker 2 (10:17):
Asshole.

Speaker 1 (10:19):
These price shifts have also led to companies like Replit having to make significant changes to their pricing model that disfavor users. People are finding, in really simple terms, that what they used to get for twenty bucks is much, much, much, much, much smaller. Cursor users hit rate limits, Replit users are hitting rate limits, and even then, when they try and do the same things, they're spending way more

(10:39):
money if they go pay-as-you-go. It's a complete farce. But I'm going to repeat some of this stuff from the premium newsletter, because there is a timeline of events that I believe is going to be in The Big Short 2: AI Boogaloo. All right. On or around May fifth, twenty twenty five, Cursor closed the nine-hundred-million-dollar funding round. On or around May twenty second, twenty twenty five, Anthropic launched Claude 4 Opus and new

(11:00):
models with Sonnet and Opus, both of them kind of well known for coding. And on May thirtieth, twenty twenty five, they added service tiers, including priority pricing specifically focused on cache-heavy products like Cursor. A cache is where you put stuff that you're going to be looking at regularly, take a look at it, and use it more readily. Cache is C-A-C-H-E, by the way, and is generally something that's for efficiency.
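
For a rough sense of why a toll on the cache stings so much: prompt caching exists to make repeatedly resent context, like a codebase, far cheaper than normal input tokens. Here's a sketch with purely illustrative per-million-token rates, not any vendor's actual price list:

```python
# Illustrative per-million-token rates (hypothetical, not a real price list).
BASE_INPUT = 3.00    # normal input tokens
CACHE_WRITE = 3.75   # writing a prompt prefix into the cache costs a premium
CACHE_READ = 0.30    # re-reading cached tokens is roughly ten times cheaper

def cost(tokens, rate_per_million):
    return tokens / 1_000_000 * rate_per_million

# A coding tool resends the same 50,000-token codebase context every request.
context_tokens = 50_000
requests = 100

uncached = cost(context_tokens, BASE_INPUT) * requests
cached = cost(context_tokens, CACHE_WRITE) + cost(context_tokens, CACHE_READ) * (requests - 1)

print(f"uncached: ${uncached:.2f}, cached: ${cached:.2f}")
```

At these made-up rates the cached bill is roughly a tenth of the uncached one, which is exactly why cache-heavy coding products depend on caching, and why surcharging access to it hits them hardest.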

Speaker 2 (11:22):
The idea that you.

Speaker 1 (11:23):
Would add a toll onto the cache is fucking disgusting, and it only targeted coding startups. But on May thirtieth, twenty twenty five, Reuters reported that Anthropic's annualized revenue hit three billion dollars, with a key driver being code generation. This translates to around two hundred and fifty million dollars in monthly revenue. On June ninth, twenty twenty five, CNBC reported Open

(11:43):
AI had hit ten billion dollars in annualized revenue. And yeah, when they said annual recurring revenue, they meant annualized. But the very same day, they cut the price of their o3 model by eighty percent, which competes directly with Claude 4 Opus, by the way. This was a direct and aggressive attempt to force Anthropic to either lower prices or compete. It's just shitheads fucking around with assholes. On or around June sixteenth,

(12:07):
twenty twenty five, Cursor changed its pricing, adding a new two-hundred-dollar-a-month Ultra tier that, in their own words, was made possible by multi-year partnerships with OpenAI, Anthropic, Google, and xAI, which translates to multi-year commitments to spend, which can be amortized as monthly amounts. A day later, on June seventeenth, Cursor dramatically changed the offering for its twenty-dollars-a-month subscriptions

(12:28):
to usage-based, where one got at least the value of their subscription, so a twenty-bucks-a-month person would get more than twenty dollars of API costs in compute, along with arbitrary rate limits and unlimited access to Cursor's own slow model, which its users hate. Then, on June eighteenth, Replit, another vibe-coding company that I had previously mentioned, announced its effort-based pricing, increases that were massive.

Speaker 2 (12:50):
July first, the Information reported.

Speaker 1 (12:52):
That Anthropic hit four billion dollars of annualized revenue, making three hundred and thirty million dollars a month, an increase of eighty-three million dollars a month, or just under twenty-five percent, in the space of a month.

Speaker 2 (13:04):
Hmm.

Speaker 1 (13:05):
Where could that money have come from? In simpler terms, Cursor raised nine hundred million dollars and very likely had to hand large amounts of that money over to OpenAI and Anthropic to keep doing business with them, then immediately changed the terms of service to make them worse

(13:28):
for their customers. And as I said at the time, and this is a direct quote from my newsletter, while some may, no, I can't do the Kevin Roose voice, I'm doing my own stuff, pardon me. While some may believe that OpenAI and Anthropic hitting annualized revenue milestones is good news, you have to consider how these milestones were hit. Based on my reporting, I believe that both companies are effectively doing steroids, forcing massive infrastructural costs

(13:50):
onto big customers as a means of covering the increasing costs of their own models. There is simply no other way to read this situation. By making these changes, Anthropic is intentionally making it harder for its largest customer to do business. By the way, Cursor is their largest customer. It's creating the extra revenue by making Cursor's product worse by proxy. What's sickening about this particular situation is it

(14:10):
doesn't really matter if Cursor's customers are happy or sad. Both OpenAI's enterprise priority access API and Anthropic's, in this case, require a long-term commitment which involves a minimum throughput of tokens per second as part of their tiered access program. If Cursor's customers drop off, both Anthropic and OpenAI still get their cut, and if Cursor's customers somehow outspend those commitments, they'll either still get rate limited or Anysphere will incur more costs.

Speaker 2 (14:35):
Why do you care about this?

Speaker 1 (14:37):
Well, Cursor is the largest and most successful generative AI company by far, and these aggressive and desperate changes to its products suggest, A, that its products are deeply unprofitable, and B, that its current growth was the result of offering a product that was not the one it would sell in the long term. Cursor misled its customers, and its current revenue is, as a result, highly unlikely to stay at this level. Worse still, two Anthropic engineers

(14:59):
left the Claude Code team to go and work at Cursor two weeks ago, and they have already come back. This heavily suggests that whatever they saw over there wasn't compelling enough to make them stay. As I also said, while Cursor may have raised nine hundred million dollars, it was really OpenAI, Anthropic, xAI and Google that got that money. At this point, there are no profitable AI startups, and it's highly unlikely that the new pricing

(15:21):
models by both Cursor and Replit are going to help.

Speaker 2 (15:23):
These are now the.

Speaker 1 (15:24):
New terms of doing business with the big model companies: a shakedown where you pay for priority access or tiers, or face indeterminate delays or rate limits. Any startup scaling into an enterprise integration of generative AI, which means in this case anything that requires a level of service uptime, has to commit to both a minimum amount of months and a throughput of tokens, which means that the price of starting an AI company that gets any kind

(15:46):
of real market traction just dramatically increased. Well, one could say, oh, perhaps you don't need priority access. But the need here is something that can be entirely judged by Anthropic and OpenAI in a totally opaque manner. They can, and they will, throttle companies that are too demanding on their systems. That's proven by the fact that they've done this to Cursor multiple times. But okay, why does Cursor matter so much? And it's simple. Generative AI will not

(16:09):
get big on selling consumer software, and without an enterprise SaaS story, they're dead. And I realize, I know, okay, folks, it's kind of a little boring hearing about software as a service, despite the fact that it's a huge, several-hundred-billion-dollar industry. But this is the only place where generative AI can really make money. Companies buying hundreds of

(16:29):
thousands of seats are how industries that rely on compute grow, and without that growth, they're going nowhere. To give you some context, Netflix makes about thirty-nine billion dollars a year in subscription revenue from consumers, and Spotify about eighteen billion.

Speaker 2 (16:43):
These are the.

Speaker 1 (16:43):
Single most popular consumer software subscriptions in the world, and OpenAI's fifteen point five million subscribers suggest that OpenAI can't rely on them for the kind of growth that would actually make the company worth three hundred billion

Speaker 2 (16:54):
Dollars or more.

Speaker 1 (16:55):
Cursor, as it stands, is the one example of a company thriving using generative AI, a software company selling software, and it appears its rapid growth was the result of selling a product at a massive loss. As it stands today, Cursor's product is significantly worse, and its subreddit is full of people furious at the company for the changes. In simpler terms, Cursor was the company that people mentioned to prove that startups could make money by building

(17:17):
products on top of OpenAI and Anthropic's models. Yet the truth is, the only way to do so is to grow, and to grow is to burn tons of money. While the tempting argument is to say that Cursor's customers are addicted and will keep paying, this is clearly not the case, nor is it an actual business model. I don't like people that say this. I've never had a drug addiction, but I know people that do. It's nothing like software.

(17:39):
Stop making that comparison. It's insulting to the victims of addiction.
But anyway, this story showed that OpenAI and Anthropic are actually the biggest threats to their own customers, and will actively rent-seek and punish any of their success stories, looking to loot as much as they can from

Speaker 2 (17:52):
Them before they copy their products.

Speaker 1 (17:54):
To put it bluntly, Cursor's growth story was a fucking lie. It reached five hundred million dollars in annualized revenue selling a product it can no longer afford to sell and could not afford to sell long term, suggesting material weakness in its business and in any and all coding startups. It's also remarkable, and a shocking failure of journalism, that this isn't in every single article about Anysphere. I'm doing

(18:16):
this part time. Why am I the asshole here? Like, I don't know. Really, though, I do have a question. Where are all the consumer AI startups? I'm genuinely serious.

Speaker 2 (18:27):
What have you got for me?

Speaker 1 (18:28):
Perplexity. Perplexity. Perplexity only has one hundred and fifty million dollars in annualized revenue, and they spent one hundred and sixty-seven percent of their revenue in twenty twenty four, fifty-seven million dollars of spending on revenues of thirty-four million dollars, on compute services from Anthropic, OpenAI and Amazon. They lost sixty-eight million dollars, and worse still, they still have no path to profitability and
worse still, they still have no path to profitability and

(18:50):
it's not even making anything new. They're a search engine,
they have an AI browser. But don't worry. Professional gas
bag Alex Heath just did this insane and flumm mixing
interview with CEO Aravins Ravinas, who, when asked how it
perplexed you would become profitable, appeared to experience what seems
to be a stroke like I'm about to read something

(19:12):
to you and it's gonna sound strange, but this is
exactly what was said. Maybe let me give you another example.
You want to put an ad on meta Instagram, and
you want to look at ads done by similar brands,
pull that, study that, or look at AdWords pricing of
one hundred different keywords and figure out how to price
your thing comparatively. These are tasks that could definitely save
you hours and hours and maybe even give you an arbitrage over what you could do yourself, because AI

(19:33):
is able to do a lot more and at scale.
If it helps you to make a few million bucks,
does it not make sense to spend two thousand dollars
for that prompt? It does, right, So I think we're
going to be able to monetize in many more interesting
ways than chatbots for the browser. I want to be
fucking clear about something. Alex seems like a nice guy.
If someone said that to me, I'd ask them if
they could smell toast. I'd be like, Aravind, mate, are

(19:56):
you okay? How many fingers am I holding up?

Speaker 2 (19:58):
Aravind? You all right?

Speaker 1 (19:59):
Did you hit your head on something? The ceilings don't
seem that low in here. But mate, you're just spewing
utter fucking nonsense. I've read this paragraph multiple times. I
do not know what he's getting at. I think he's
suggesting something about how you could ask it to tell
you what to do with ads.

Speaker 2 (20:15):
I don't know. I don't know.

Speaker 1 (20:18):
This is probably the biggest consumer AI company that isn't OpenAI, and their CEO speaks like an insane person or a stupid person. Check out the Business Idiot trilogy for what I think there. I also mentioned them earlier. But I don't, I don't want you to talk to me about AI browsers. Anyone humoring AI browsers is being an imbecile for some reason. They are not

(20:40):
a business model. How are people going to make money
on the browser. Hm hmm, what do these products actually do?

Speaker 2 (20:45):
Oh, they can poorly automate accepting LinkedIn invites. Wow. Wow, it's like God himself has personally blessed my computer. A big fucking deal.

Speaker 1 (20:55):
In any case, it doesn't seem like you can really build a consumer AI startup that makes any real money or approaches being a real company, other than ChatGPT, I guess. And that's because the generative AI software market is small, with little room for growth and no profits to be seen. Arguably, the biggest sign that things are troubling in the generative AI space is that we use the term annualized revenue

(21:15):
at all, which, as I've mentioned repeatedly, means multiplying a month by twelve and saying, that's our annualized, baby. The problem with this number is that, well, people cancel things. While your June might look great, if ten percent of your subscribers churn in a bad month, due to a change in your terms of service, for example, that's a huge chunk of your annualized revenue gone, and likely gone forever.
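
The arithmetic behind annualized revenue, and what churn does to it, is simple enough to sketch (figures illustrative):

```python
def annualized(monthly_revenue):
    # "Annualized revenue" is just the latest month multiplied by twelve.
    return monthly_revenue * 12

# One hundred million dollars annualized is only about $8.33M a month.
monthly = 100_000_000 / 12

# If ten percent of subscribers churn in one bad month, the annualized
# figure built on that month drops ten percent too: a $500M ARR story
# quietly becomes $450M, with nothing forcing those customers to return.
arr_before = annualized(500_000_000 / 12)
arr_after = annualized(500_000_000 / 12 * 0.9)
```

The fragility is the point: the whole "annualized" figure leans on a single month holding up.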
But the worst sign is that nobody is saying the

(21:36):
monthly figures, mostly because the monthly figures fucking suck. One hundred million dollars of annualized revenue is eight point three three million dollars a month. To give you some scale, Amazon Web Services hit one hundred and eighty-nine million dollars, or fifteen point seven five million dollars a month, in revenue in two thousand and eight, two years after founding, and while it took until twenty fifteen to hit profitability, it actually hit break-even in two thousand and nine, though it reinvested cash in growth for a few

(21:57):
years later. And I should be clear, them doing that justified so many startups burning cash. So many startups were like, yeah, look at AWS, they were investing in growth, which is a fair thing for companies to do. But I'm being an asshole. Right now, there is not a single generative AI software company that's profitable, and none of them are showing the signs of the kind of hypergrowth that
(22:17):
previous big software companies had. Oh, Cursor technically is the fastest-growing software-as-a-service company of all time? It got there by basically lying. Cursor is never bringing back the product at the twenty-dollar price point that they were selling. They're never doing it. The money they earned was earned. It's not fraud, because they didn't do it.

Speaker 2 (22:40):
I guess it was deceptive, but it's not really... it's just fucking lying.

Speaker 1 (22:44):
It's just lying. And who knows what happens to Cursor now. But you know what, I'm harping on Cursor a bit. What other software startups are there?

Speaker 2 (22:51):
Glean, Glean, fucking Glean, Glean, everyone loves to talk about.

Speaker 1 (22:57):
Enterprise search company Glean, a company that uses AI to search and generate answers from your company's files and documents. Fun fact: Salesforce's own Slack has now blocked them from searching Slack. Just arsehole-on-arsehole violence. In December twenty twenty four, Glean raised two hundred and sixty million dollars, broadly stating that it had over five hundred and fifty million dollars in cash, with best-in-class ARR growth.

(23:18):
A few months later, in February twenty twenty five, Glean announced it had achieved one hundred million dollars in annual recurring revenue in the fourth quarter of FY twenty-five, cementing its position as one of the fastest-growing SaaS startups and reflecting surging demand for AI-powered workplace intelligence. In any case, ARR could literally mean anything, as it appears to be based on quarters, meaning it could be an average of the last three months, I guess. Anyway, in June twenty

(23:41):
twenty five, Glean announced it had raised another funding round, this time raising one hundred and fifty million dollars. It troublingly added that, since its last round, it had surpassed one hundred million dollars in ARR. We're five months into the fucking year and your revenue is basically the same. That isn't good. That isn't good at all. Also, what happened to that five hundred and fifty million dollars in cash?

(24:02):
Why did Glean need more? Hey, wait a second. Take a look at this. Glean announced their raise on June eighteenth, twenty twenty five, two days after Cursor's price increase, and the same day that Replit announced its similar price hike.

Speaker 2 (24:12):
It's almost as if the.

Speaker 1 (24:13):
Dramatic pricing increases have affected them, due to the introduction of Anthropic's service tiers and OpenAI's priority processing.

Speaker 2 (24:20):
But I'm guessing. I know, I'm guessing.

Speaker 1 (24:22):
But it is kind of weird that all of these companies raised money and all announced these things around the same time.

Speaker 2 (24:42):
Hey, that reminds me, I got another problem.

Speaker 1 (24:45):
I got another problem here, because I think that there is another reason why these cycles kind of keep repeating. You get a company that grows, and then they kind of go nowhere, because, well, the company doesn't really seem to have a total addressable market much bigger than one hundred million ARR. And I think it's a little simple.

Speaker 2 (25:03):
It's quite simple.

Speaker 1 (25:03):
In fact, there really are no unique generative AI companies, and building a moat on top of LLMs is near impossible.

Speaker 2 (25:11):
If you look, man, am I going to get some emails about this, but bring them on.

Speaker 1 (25:15):
If you look at what generative AI companies do, and note the following is not a quality barometer, it's probably one of the following things. They're either a chatbot, one you either ask questions of or talk to, and this includes customer service bots; searching, summarizing, or comparing documents, with increased amounts of complexity of documents or quantity of documents to be compared, and this includes being able to ask questions of documents; web search; deep research,

(25:39):
meaning long-form web search that generates a document where some parts of it will inevitably be hallucinated or derived from low-quality sources; generating text, images, voice, or in some rare cases video; using AI (generative AI, I mean) to write, edit, or maintain code; transcription; translation; or photo and video editing. Every single generative AI company that isn't OpenAI or Anthropic (and honestly, kind of those two as well) does one or a few of these things, and I mean every one of them. And it's because every single generative AI company uses large language models, which have inherent limits on what they can do. LLMs can generate, they can search, they can kind of edit, they can sometimes transcribe accurately, and they can sometimes translate, well, much less accurately,

(26:21):
I guess. Within weeks of Cursor's changes to its services, Amazon and ByteDance released competitors that, for the most part, do

Speaker 2 (26:27):
Exactly the same thing.

Speaker 1 (26:28):
Sure, there's a few differences in how they're designed, but design is not a moat, especially in a high-cost, negative-profit business where your only way of growing is to offer a product you can't sustain. The only other moat you can build is the services you provide, which, when your services are dependent on a large language model, are dependent on the model developer, who, in the case of OpenAI and Anthropic, could simply clone your startup, because the only valuable intellectual property is the models, and

(26:51):
those models are theirs. You may say, well, nobody else has any ideas either, to which I say, I fully agree. My rot-com bubble thesis suggests that we're all out of hyper-growth ideas, and yeah, I think we're out of ideas related to large language models too. At this point, I think it's fair to ask: are there any good businesses you can build on top of generative AI or large language models? I don't mean AI features

(27:14):
bolted onto an existing product; I mean an AI company that actually sells a product that people buy at scale that isn't called ChatGPT or Claude. In previous tech booms, companies would make their own models, their own infrastructure, or the things that make them distinct from other companies. But the generative AI boom effectively changes that by making everybody build stuff on top of somebody else's models, because training your own models is both extremely expensive and requires vast amounts of

(27:37):
infrastructure and just pure power. As a result, much of this boom is about a few companies (really two, if we're honest) getting other companies to try and build functional software for them, and these companies, OpenAI and Anthropic, are their customers' weak point in a relationship that veers from symbiotic to parasitic at a moment's notice. I cannot stress enough how bad OpenAI and Anthropic are for

(27:58):
their business customers. Their models are popular, by which I mean their customers' customers will expect access to them, meaning that OpenAI and Anthropic can, as they did to Cursor, arbitrarily change pricing, service availability, and functionality based on how they feel that day, or whether they need to pump their annualized revenue for investors.

Speaker 2 (28:14):
Don't believe me.

Speaker 1 (28:16):
Anthropic cut off access to AI coding platform Windsurf because it looked like they might get acquired by OpenAI. They never were. They just harmed that business. They just cut a hole in them. Why? Because they might touch another business. The most anti-competitive shit in the world, and everyone sat there clapping like a fucking seal. Disgusting even by big tech standards. This fucking sucks, and these

(28:38):
companies will do it again. But you know what, let's talk about the actual uses of generative AI, because the limited number of use cases exists because large language models are all really, really similar. Because all large language models require more data than anyone has ever needed (like four times the amount of data on the Internet), they all basically have to use the same thing, either taken from the Internet or bought from one of the few

(28:58):
companies that sell it: Scale, Surge, Turing, Together, or whoever. While they can get customized data or do customized training and reinforcement learning, these models are all transformer-based and they all function similarly, and the only way to make them different is by training them, which doesn't make them that much different, just better at things they already do. And good lord, generative AI is so ungodly expensive, and the

(29:20):
training is as well. By the way, they have to pay real humans too, which they hate doing, and even when they're paying outsourced labor in Kenya at two dollars a pop, they're still losing a ton of money. It's really crazy, actually, how badly built all of this is. And I already mentioned OpenAI's and Anthropic's costs, as well as Perplexity's fifty-million-dollar bill in a year to Anthropic, Amazon, and OpenAI, off of a

(29:42):
measly thirty-four million dollars in revenue. These companies cost too much to run, and their functionality doesn't make enough money to make them make sense. And the problem isn't just the pricing, but how unpredictable it is. As Matt Ashare wrote for CIO Dive last year, generative AI makes a lot of companies' lives difficult thanks to the massive spikes in costs that come from power users, with few ways to mitigate those costs. One of the ways that a company

(30:04):
manages its cloud bills is by having some degree of predictability, which is difficult to do with the constant slew of new models, and the demand for new products to go with them, especially when said models can and do often cost more with subsequent iterations, not necessarily for much return. Except, if you're a company like a coding company, your customers are going to actually ask you for the new models.

(30:26):
As a result, it's hard for AI companies to actually budget. But Ed. What was that? Ed?

Speaker 2 (30:31):
What about agents?

Speaker 1 (30:32):
Aren't they the thing that will eventually make the insane
broken calculus behind generative AI actually work?

Speaker 2 (30:37):
What is that accent, anyway? Anyway.

Speaker 1 (30:42):
Let me tell you about agents. The term agent is one of the most egregious acts of fraud I've seen in my entire career writing about this crap, and that includes the metaverse. When you hear the word agent, you're meant to think of an autonomous AI that can go and do stuff without oversight, replacing someone's job in the process, and companies have been pushing the boundaries of good taste and financial crimes in pursuit of them. Most

(31:03):
egregious of them is Salesforce's Agentforce, which lets you "deploy AI agents at scale" (that's a quote) and "brings digital labor to every employee, department and business process" (another quote, from Salesforce's website). These are two blatant fucking lies. Agentforce is a goddamn chatbot program. It's a platform for launching chatbots. They can sometimes plug into APIs that

(31:24):
allow them to access other information, but they're neither autonomous nor agents by any reasonable definition. Not only does Salesforce not actually sell agents, its own research shows that its agents (and agents in general) only achieve around a fifty-eight percent success rate on single-step tasks, and I'm going to quote The Register here: this means tasks that can be completed in a single step without needing follow-up

(31:45):
actions or more information. On multi-step tasks, so, you know, most tasks, they succeed a depressing thirty-five percent of the time. Last week, OpenAI announced its own ChatGPT agent that can allegedly go and do tasks on a virtual computer. In its own demo, the agent took twenty-one minutes or so to spit out a plan for a wedding, with a destination, a calendar, and some

(32:06):
suit options, and then showed a pre-prepared demo of the agent preparing an itinerary for how to visit every major league ballpark. That's baseball, for the non-Americans out there. In this example's case, the agent took twenty-three minutes and produced arguably the most confusing map I've seen in my life. You can see the map in the newsletter version of this episode. It's hilarious. It missed out every single major ballpark on the East Coast, including Yankee

(32:28):
Stadium and Fenway Park, which are two of the most well-known stadiums in sports, and added a bunch of random ones, and, like, one in the middle of the Gulf of Mexico. What team is that, Sammy? The Deepwater Horizon Devils? Is there a baseball team in North Dakota?

Speaker 2 (32:42):
Clammy?

Speaker 1 (32:43):
Sammy. Sammy. I also should be clear this was a pre-prepared example. This is the best they had. I want to see the cutting-room footage on this, because you best bet that that map looked like straight dogshit. As with every large language model product (and yes, that's what this is, even if OpenAI won't talk about what model it uses), results are extremely variable. Agents are difficult because

(33:06):
tasks are, after all, difficult, even if they can be completed by a human being that the CEO thinks is stupid. What OpenAI appears to be doing is using a virtual machine to run scripts that its models trigger, regardless of how well it works. And it works very, very, very, very poorly and inconsistently. It's also very likely expensive to run. In any case, every single company you see using the word agent is trying to mislead you. They're

(33:29):
lying. Glean's AI agents are chatbots with if-this-then-that functions that trigger events using APIs, which means if an event happens, another thing will be triggered: not taking actual actions, because that is not what LLMs can do. ServiceNow's AI agents that allegedly "act autonomously and proactively on your behalf" are, despite claiming they go beyond better

(33:49):
chatbots, still ultimately better chatbots that use APIs to trigger different events using if-this-then-that functions. Sometimes these chatbots can also answer questions that people might have, or trigger an event somewhere. Oh right, that's literally the same thing. The closest we have to an agent is any kind of coding agent, which is to say they can make a list of things that you might do on a software project

(34:10):
and go and generate code and push stuff to GitHub when you ask them to. And they can do so autonomously in the sense that you can just let them do what a model that doesn't know anything and has no consciousness thinks is right, based on its corpus of data and the things it has access to. And it's about as safe as that sounds. When I say ask them to go, I mean that these

(34:30):
agents are not intelligent at all. They do not have intelligence, and when let run rampant, they fuck up everything and create a bunch of extra work. Also, a study found that AI coding tools made engineers nineteen percent slower. Nevertheless, none of these products are autonomous agents, and anybody using the term agent likely means chatbot. And all of this is working because the media keeps repeating everything these

(34:51):
companies say. It's a disgrace. We need to stop this. I realize we've taken kind of a scenic route here, but I needed to lay the groundwork, because I really am alarmed. According to a UBS report from the twenty-sixth of June, the public companies running AI services are making absolutely pathetic amounts of money from AI. Microsoft, according to UBS, is making annual revenues of somehow

(35:13):
less than The Information's reported two point one billion dollars. ServiceNow is making less than two hundred and fifty million, Adobe less than one hundred and twenty-five million, Salesforce less than one hundred million. Now, ServiceNow said two hundred and fifty million dollars ACV, annual contract value. This may be one of the more honest explanations of revenue I've seen, putting them in the upper echelons of AI revenue. Unless, of course, you think about it for

(35:35):
a couple of seconds and think, are these all AI-specific contracts, or perhaps they're contracts where you've taped AI onto the side?

Speaker 2 (35:41):
Who gives a shit? These are also year-

Speaker 1 (35:44):
Long agreements that could churn, and according to Gartner, over forty percent of agentic AI products will be canceled by end of twenty twenty-seven. And really, you gotta laugh at Adobe and Salesforce, both of whom talk such a goddamn fuckton about generative AI, and yet have only made a measly hundred and twenty-five million in annualized revenue from it.

Speaker 2 (36:02):
Pathetic crap, dog shit.

Speaker 1 (36:05):
These aren't futuristic numbers, they're barely product categories, and none
of this seems to include costs.

Speaker 2 (36:11):
Oh well, good grief.

Speaker 1 (36:14):
Look, a lot of what I've been saying is reminiscent of previous podcasts, and I've gone over this a lot, so I really want to make it clear that the signs are very troubling, that the things I've warned you about the past couple of years are only getting worse, and that the cliff is only getting closer. When we tumble off of it, things may get really, really bad. In the next episode we'll talk about

(36:34):
how, and what that tumble might look like, and the noises I'm going to make when it happens.

Speaker 3 (36:47):
Thank you for listening to Better Offline. The editor and composer of the Better Offline theme song is Mattosowski. You can check out more of his music and audio projects at mattosowski.com, M A T T O S O W S K I dot com. You can email me at ez at betteroffline.com or visit betteroffline.com to find more podcast links and, of course, my newsletter. I

(37:09):
also really recommend you go to chat dot wheresyoured dot at to visit the Discord, and go to r slash

Speaker 2 (37:14):
BetterOffline to check out our Reddit. Thank you so much for listening. Better Offline is a production of Cool Zone Media.

Speaker 1 (37:22):
For more from Cool Zone Media, visit our website coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.