
November 12, 2025 57 mins

This week’s Better Offline is a recording of a panel held by Clarion West in Seattle earlier this year with Cory Doctorow and Ed Zitron, discussing enshittification, The Rot Economy, and the destruction of modern tech platforms, moderated by Whitney Beltrán.

Cory Doctorow: https://craphound.com/
Whitney Beltrán: https://brightwhitney.com/

Get involved with Clarion West: https://www.clarionwest.org/get-involved/give/

Clarion West is a nonprofit literary organization that runs an acclaimed six-week residential workshop every summer, online classes and workshops, one-day and weekend workshops, a reading series every summer, and other events throughout the year. At Clarion West, you’ll be among award-winning and best-selling writers in science fiction, fantasy, games, horror, and more. Our workshops and classes are taught by world-class instructors from across the field of speculative fiction. Wherever you may be in your career, whether novice or sage, we offer a diverse listing of classes that is packed with valuable information to take your writing to the next level.

Want to support me? Get $10 off a year’s subscription to my premium newsletter: https://edzitronswheresyouredatghostio.outpost.pub/public/promo-subscription/w08jbm4jwg it would mean a lot!

YOU CAN NOW BUY BETTER OFFLINE MERCH! Go to https://cottonbureau.com/people/better-offline and use code FREE99 for free shipping on orders of $99 or more.

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

Email Me: ez@betteroffline.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Al Zone Media.

Speaker 2 (00:05):
Hello and welcome to this week's Better Offline. I'm your host, Ed Zitron. So this week is going to be a recording of a panel held in Seattle at the Seattle Public Library with my good man Cory Doctorow,

(00:27):
and we were discussing enshittification, the Rot Economy, and the destruction of modern tech platforms. We were moderated by the wonderful Whitney Beltrán, and big up to Clarion West, a wonderful nonprofit group out there. You should definitely support them, everything's in the episode notes. Have fun with this, I certainly did.

Speaker 3 (00:49):
Well.

Speaker 4 (00:49):
Welcome everybody again, thank you for making it out on this Wednesday night.

Speaker 5 (00:54):
We're really excited to have you.

Speaker 1 (00:55):
All.

Speaker 5 (00:55):
The energy in the room is great.

Speaker 4 (00:58):
We're all here to talk about what is happening to
us and seems to be spinning wildly out of our control,
and what it is and how it's gonna affect us
and what we can possibly do about it future facing.

(01:18):
I have two really, really illustrious guests with me tonight, Cory and Ed. I did my research beforehand and watched all of their other interviews with other people. I'm not gonna do as good of a job, so I'm really counting on them to shine. But for those of you who maybe just, you know, showed up by accident, I would love for first Cory and then Ed to do

(01:40):
a brief introduction of themselves before we launch into this
really incredible topic.

Speaker 5 (01:46):
So Cory, why don't you go ahead, then?

Speaker 3 (01:49):
I'm Cory Doctorow. I have worked with the Electronic Frontier Foundation, a digital rights group, for a quarter of a century now. I've been an activist, kind of a journalist, and a writer. I'm a recovering library worker and bookseller, and a Clarion West instructor every now and again. And I'm very pleased to be back here in Seattle to talk with you folks.

Speaker 6 (02:12):
All right, so yeah, let's open up this pit. I'm Ed Zitron. I'm the host of a podcast called Better Offline, about technology and the decay of society. I also write a newsletter called Where's Your Ed At that I mostly started because I was depressed. It's got eighty-two thousand subscribers now somehow. But yeah, I write about tech and the decay of society, and I had to learn a bunch of economics to do it, which was extremely exhausting.

(02:32):
But I enjoy it. Thank you for being here.

Speaker 4 (02:39):
Okay, so we're just gonna have like a little bitty five minutes on the basics before we launch into the complexities of the situation, where I'm gonna start with an anecdote, which is: one day I woke up and Google sucked. I don't know what happened, but one day I was looking for a thing, and I'm like, this is so much worse than I am used to, and that tripped me into a world of it sucks

(03:03):
on purpose.

Speaker 5 (03:04):
Here's why. And I would love for Ed and Cory.

Speaker 4 (03:09):
To tell me why Google sucks.

Speaker 3 (03:11):
Well, do you want to start by talking about Prabhakar Raghavan?

Speaker 6 (03:16):
I love saying that motherfucker's name. So I don't know if any of you have read The Man Who Killed Google Search, which was a piece that I wrote by accident. I found a bunch of notes from the Department of Justice. Anyway, long story short, around the early part of twenty twenty, there was a thing called a Code Yellow at Google, which just meant, and I'm going to use some financial terms here, that there was query weakness in Google, which

(03:38):
means that not enough people were just asking Google things. Now, you may think, well, Google ideally gets you an answer, right? No, no, no, you pleb, how dare you, what do you mean you get an answer? Fuck you. You need to look at ads. Sundar Pichai and all of his McKinsey friends need to do their seance or whatever. Nevertheless, there was an internal argument at Google. There was a guy called Ben Gomes

(04:00):
running search who literally said, I'm worried that all Google cares about is growth. Turns out he was right. Around June that year, a guy called Prabhakar Raghavan took over. He was formerly the head of ads, and he pushed Ben Gomes to change things to make Google Search worse, to keep people on Google Search. So the decay of Google Search really started then. It was actually a few

(04:21):
years beforehand. They were slowly making it so it was harder to tell sponsored links and ads from regular search results, to the point it's pretty much impossible now, and most of the results are pretty bad as well. But long story short, Google deliberately made things worse. And then the guy, he's called Prabhakar Raghavan, remember that name, he mandated basically, and I can't prove it

(04:44):
but it matches up, that they just reduced the quality of results. They allowed websites that had been previously pushed down to come back spamming results, so that you ended up farting around on Google more, seeing more impressions, and bing bong, number go up. Prabhakar sadly, he sadly moved on. He's alive. He is now the chief technologist at Google.

(05:04):
I would say, like, thirty percent of my work was about this problem. The funny thing is that Prabhakar was riding high, taking credit for a massive, massive amount of Google's growth. He was doing so well, and then he got made to take over Gemini, Google's AI. And then there was a thing in, I think it was early twenty twenty four, maybe it was twenty twenty five, where Google Gemini was

(05:25):
generating Chinese George Washingtons, and the conservatives were going, there's a Black Nazi, there's a Chinese George Washington, I can't, my tiny little brain is vibrating at one thousand miles a minute. And for some reason Prabhakar Raghavan had to take responsibility. And this is not a charming man.

(05:45):
This isn't a man who's, like, a political navigator. He's just rude. He's very good at being rude. In the end, sadly, Prabhakar's work has been done. Google sucks now. And as Cory will tell you in his excellent book and across his wonderful literature, this is a problem with basically the entirety of the Internet.

Speaker 3 (06:03):
Yeah, so Ed's got this way, I don't know, there we go. I do have an honorary PhD in computer science. So Ed has got this way of talking about the ideology that says, let's make Google worse in order to increase profits. And you have to understand, you know,

(06:24):
when Google hit this code yellow, they had a ninety percent market share in search, so of course query growth had slowed. Right, how do you increase query growth when you have a ninety percent market share? You can raise a billion humans to maturity and make them Google customers. That's a product called Google Classroom. But it doesn't work quickly, and so they needed something else to goose growth, and

(06:45):
they came up with this idea: let's make Google worse in order to make more money. And you see in these documents that had surfaced this ferocious debate between technologists who want to do the right thing, in the form of Ben Gomes, and business people who want to do the wrong thing, in the form of Prabhakar Raghavan. And what's interesting about this is that although for many years you can imagine fights like this played out at Google,

(07:07):
where the side that wanted to make things better won out against the side that wanted to make things worse, in this case you see this guy losing, and he's losing because his argument consists of, if I made Google worse, I would feel bad about my work and my life. And Prabhakar Raghavan's argument is, if we make Google worse, we'll make a lot more money.

Speaker 6 (07:27):
And there's one other thing as well. Jerry Dischler, who's now the head of ads, one of his things was, look, I'm not saying that revenue needs to control ads, however, we will have a shared reality. That's basically what he said. It's fucking, when you read these things, it's chilling, and makes you think of things you can't say in public.

Speaker 3 (07:46):
So, so, you know, you have to wonder, like, what is it that created this environment? Ed calls the mindset that says, well, if you can, you should worsen things to make money, the Rot Economy, and I think it's a very apt phrase. And the question that I try to interrogate in Enshittification is what gave rise to it? Right,

(08:07):
What created the enshittocene? And my conclusion is it wasn't because you shopped wrong. Right? It's not because, oh, you didn't pay for the product, so you became the product. You know, farmers who buy half-million-dollar tractors are exploited by John Deere, which won't let their repairs go live until they pay a two hundred dollar call-out

(08:27):
fee for a John Deere person to come out and type an unlock code into the keyboard. Now, that is not a free ad-supported tractor, right, that's a tractor they paid, you know, six figures for, and it won't work until they pay ransom money. So it's not because you shopped wrong. It's also not because these guys are the wrong guys to be running the company. They are terrible people.

(08:51):
But the reality is that these Zucker-Muskian mediocrities that run these companies are not smart enough to be causes. They must be effects, right? They're responding to an enshittogenic environment created by policy. And that policy, in the case of Google, is the policy that oversaw for decades Google's

(09:12):
serial acquisition of both vertical and horizontal competitors, so that a company that had only made one really successful consumer-facing product, which they made a millennium ago, right, their search engine, and that had almost, with that exception, failed to launch anything internally except for things they bought from other people in anti-competitive acquisitions, to the point

(09:33):
where they bought all the shelf space, right, so nowhere else's search engine could take root. They're bribing Apple to the tune of more than twenty billion dollars a year not to enter the search market and have a direct competitor that would erode their margins. So they've become too big to care. And this is really my thesis, right, that as much as Ed is right to be angry at Prabhakar Raghavan, there are people alive today, and not

(09:56):
so recently dead, who presided over shifts in our policy environment, where they were warned at the time that the decisions that they were contemplating would have the absolutely foreseeable effect of rewarding firms that did bad things to us, who took those decisions anyway, and today are, like, hanging around polishing their fake Nobel Prizes in economics and collecting

(10:17):
six-figure consulting fees, working for blue chips, and not being held responsible at all, much less worrying that when they go out abroad amongst us someone might be sizing them up for a pitchfork. And so that's the thing that I want to recover in this book, and the thing that

Speaker 5 (10:31):
I'm going to jump in here.

Speaker 4 (10:32):
That's okay. So this is Seattle, and collectively, as Seattleites, we tend to have certain feelings about capitalism. Would you say, either of you, that this phenomenon, right, this Rot Economy, is just an inescapable fact of capitalism? Or is there

(10:55):
real policy change, is there some other way we could do this, that we could eschew this Rot Economy and actually still have functionally capitalism in some way, and the things that we want, and improved lives?

Speaker 6 (11:08):
So a lot of people talk about going back in time and killing baby Hitler. Sure, fine. I also think baby Reagan should be on the table. The same goes for that fucker, that fuckhead Milton Friedman. Finally, an audience with class. Because however you feel about capitalism,

(11:30):
the whole growth-focused capitalism started with Milton Friedman and the free market nonsense. I'm not going to spend as many angry hours as I need to to get that out of my system. The flaws of capitalism are obvious, but the real problem started once you removed the regulation, once you allowed Reagan and his various judges to pull away regulation and just effectively stop doing antitrust. Lina Khan, who

(11:51):
was good, was remarkable because she tried. Lina Khan is excellent, I've had her on the show, I'm going to have her on again actually, fantastic, but she was remarkable particularly because she actually knew what she was talking about. I know, I was surprised, and she also was like, what would stop these companies from doing things? What would be the incentives? The incentives were

(12:12):
created over the course of decades. Yes, capitalism is at its root doing what capitalism will do, but by allowing it to be unrestrained, you allowed the markets to become growth drunk, because that's all the markets care about. That is the center of the Rot Economy. Everything is growth. Every single thing is driven by growth, to the point that the AI economy is just, like, sixty billion dollars of real revenue. It's not even about whether the product

(12:35):
creates money anymore. It's just number go up forever. It
sounds simple, but it is at the root of everything.

Speaker 3 (12:42):
Yeah, I mean, I think Ed makes a really good point here. And you know, look, I'm not someone who
believes that markets are the best or only arbiter of
how we allocate resources. I think that there are other
ways that we can do things. But even if you
are the kind of, like, Elon Musk-addled libertarian who can't open your copy of Atlas Shrugged anymore because the

(13:04):
pages are all stuck together, and you think that the only thing our government should ever do is enforce contracts, you still want them to enforce contracts, right? And for the referee to referee the game adequately, they have to be more powerful than the players on the field. So I tell my libertarian friends, look, the smallest government you

(13:25):
can have is determined by the largest corporation you're willing to tolerate. And the decision was taken in the late nineteen seventies, first under Carter and then accelerating under Reagan and through the rest of it, and it's really coterminous with the tech industry. Remember, Ronald Reagan went on the campaign trail the year the Apple II Plus went on sale. The tech industry grew up, and as

(13:47):
it got bigger, antitrust got smaller. It's really the first post-antitrust industry, which is why it looks the way it does. And when we decided that we would no longer enforce anti-monopoly law, we really did set in motion these very foreseeable outcomes. And the people responsible for it insisted on their pro-monopoly posture,

(14:08):
which boiled down to this, that monopolies are efficient, that
when you observe a monopoly in the wild, what you
must be seeing as a company that is very pleasing
to people. Because if a company isn't pleasing to people,
then other people will enter the market and take away
your monopoly. And so any monopoly that you find tautologically
by circular reasoning, must be a good company, right. And

(14:31):
so if Apple is controlling all of the app market,
if Google is controlling ninety percent of the search market,
what you are observing is a firm that is so
pleasing that everyone is voluntarily using them, and if they weren't,
then they would be outcompeted. And so they said, well, let's tolerate monopolies as efficient. Let's not engage in this perverse labor of punishing companies for pleasing the people who've

(14:54):
elected us to represent them. Let us celebrate these new, efficient monopolies. And you do that for forty years, and suddenly every sector becomes a cartel. Right? We have this in glass bottles, vitamin C, eyeglasses, intermodal shipping, rail, the internet, which, as Tom Eastman says, is five giant websites filled with

(15:16):
screenshots of text from the other four. We have it in semiconductors. We have it in professional wrestling. We have it in plastic bags filled with sterile saline. And one company controls not just the hospital beds, but also the coffins. Think about that for a minute. Right? So this rampant monopolization did ultimately give rise to this world where

(15:40):
firms didn't have to worry about being disciplined by markets.
But it also meant they didn't have to worry about
being disciplined by governments. Because when you boil a sector
down to just a handful of firms, it's very easy
for them to decide what line of bullshit they're going
to feed to their regulators. And because they are so awash in money, because they don't compete head to head, you know, for Google, twenty billion dollars a year to

(16:01):
Apple not to enter the search market is a bargain
because the erosion of both of their margins if they
were competing head to head in these markets that they've
actually divided up amongst themselves like the pope dividing up
the New World, would cost them both far more than
twenty billion dollars. They'd have to hire each other's employees
and pay them more. They'd have to offer us cheaper things.
They'd have to be better to advertisers, they'd have to
be better to publishers, and that would erode their margins.

(16:23):
And so you end up with this moment where they
have all of this money, they find it very easy
to come to a single position, and their regulators do
as they're told, And so we end up with both
regulatory capture and market capture arising out of the same
set of policy choices, And so two of the most
important forces that we count on to punish companies that

(16:44):
do bad things, to ensure that when Prabhakar Raghavan and Ben Gomes are fighting, Ben Gomes can say more than, this would make me feel bad about my life's work. He could say, we're going to have our lunches eaten by someone smarter than us, or we're going to get smacked around by a regulator. Without that, at that moment, he's going to lose that argument. And so this is how you end up with the devil of

(17:06):
your worst nature on your left shoulder, whispering in your ear and winning the argument over the angel of your better nature on your right shoulder telling you to do the right thing.

Speaker 4 (17:24):
So on that note, our regulators are not really regulating right now in general. And for us, the average person, right, this is all really over our heads power-wise. Like, we as individuals don't really have power to do anything about getting Google to behave right. But what do we

(17:45):
do in our personal lives when we're affected by this? Right? Like, I work in video games, I'm working on Cyberpunk 2, and I feel the effects of the Rot Economy and enshittification on my job every day. It's very, very, very stressful. You know, I was talking to a woman earlier, her name's Carolyn. Hi, Carolyn. About, you know, AI's effect on higher ed. So

(18:09):
for us, for the people in this room, do you
have any ideas or guidance for like, how do we
live our lives?

Speaker 3 (18:16):
Now? What do we do? So?

Speaker 6 (18:18):
We live in a high information, low processing environment. The
reason that regulators don't do much is not only because
they're incentivized not to, but also they don't know shit
about fuck. In fact, when you go around reading a ton of media, when you read a bunch of regulatory stuff and you read what the government says about things, they don't know what they're talking about. I don't even mean slightly. I mean that the markets have been running

(18:39):
away with this idea that generative AI is the future.
The media has been reporting that generative AI is replacing jobs.
Neither of these are true. They're not true. There are
some jobs being eroded by LLMs, but it's jobs like translators, which were already being eroded by machine translation, and, I forget the exact terms, art directors, so real jobs, but contract work, and also bosses that wouldn't pay real

(19:01):
people anyway. The world itself is run on ignorance, not
just social ignorance, but just straight up factual ignorance about
what AI can do. And the reason I pick AI is because this is where you can actually be disruptive. You hear about a fucking data center popping up, go to the town hall meeting. Make your mayor upset. I'm deadly serious.

(19:23):
You should. I saw one brave soul at a Wisconsin data center meeting who just quoted my work. Love him. Going to these places and actually not making a scene, just going in and saying, hey, there's only sixty billion dollars' worth of revenue in this industry, there's no growth outside of ChatGPT, look at these companies. OpenAI burned nine point two billion dollars in the first half of this year. That's fucking crazy. That's a completely crazy thing.

(19:46):
There's no path to profitability for these things, on top of there not being that much demand. I'm not just quoting things in my newsletter at you. This is the shit that you should be saying to your elected officials. Show up at town hall meetings. They hate it. They do not want to see you, but they have to. So you show up and you tell them these very simple things, and they'll say, well, what I read in the newspapers. Shut the fuck up, man, I don't want to hear

(20:08):
it from you. I don't know, a little Bowser, I guess. But nevertheless, showing up and actually just saying the very basic things and sticking to your points will do a lot. I know it sounds small, but really that's where you get things done, because these days, these data centers are probably not getting built anyway, but if you can undermine them, then they're really not getting built.

Speaker 3 (20:30):
You know, when the early reviews started to come out for Enshittification, there was a kind of common theme that emerged from some of these reviews. It was, I found this book very exciting and interesting. It gave me
a way to attach a handle to something that had
been kind of big and diffuse and frustrating to me
that I couldn't make sense of. And now that I
got this handle on it, I feel like I can
do something with it. And I read all of these,

(20:52):
you know, suggestions for how we can make policy at the end, and what that would do in order to forestall or roll back enshittification and make a better world, but there wasn't anything I could do personally in my life as a consumer. And you know, they're right. They're right. I don't think your personal consumption choices really play a role in changing our systemic problems. Now, by all means,

(21:17):
if there's a business you want to support, go to Third Place Books instead of Amazon in order to get your books, because it'll make their lives better and you'll have a bookstore in your community. But don't kid yourself that it's going to fix the structural problem of Amazon. The structural problem of Amazon is that regulators allowed them to buy all of their competitors, such that they were able to corner markets, and then they were able to

(21:39):
lock people in too. And then they were able to lock people into their platform through getting them to pay in advance, by a year, for their shipping with Prime, such that merchants found that they couldn't sell anywhere else, and then they were able to lard onto those merchants forty-five to fifty-one percent junk fees on every dollar they brought in, and then they were able to hit them with this

(22:00):
Most Favored Nation deal that says that if you raise your prices on Amazon to recover some of those forty-five to fifty-one percent junk fees, you have to raise your prices everywhere else, at Target and Walmart and mom-and-pop stores and your own warehouse store. So Amazon started to impose a worldwide tax on all consumption, right? And that's not a thing you solve by changing your own consumption habits. You know, by all means, if you

(22:23):
feel that Twitter is bad for your mental health, then it probably is, and you want to go to Bluesky, you want to go to Mastodon, sure. But just don't fetishize that as the thing that's going to make a systemic difference. Right? The thing that makes a
systemic difference is intervening not as an individual but as
a polity. So, as I've mentioned before, I work at
the Electronic Frontier Foundation. We have a national network of

(22:43):
grassroots groups, including several here in Seattle, called the Electronic Frontier Alliance. If you go to efa.eff.org you can find some of these local chapters, and they work on things like limiting police use of facial recognition and purchases of surveillance technology, privacy for abortion seekers, limitations on the use of digital infrastructure to track down people

(23:05):
that ICE is chasing, right-to-repair laws. Washington's got a really good one. All of those things start at these grassroots levels, and it's getting involved as a polity that makes a difference. I know it's easy to despair if you think you can't solve things individually through your own consumption choices. We've been told for forty years that you have to vote with your wallet. The reason rich people want you to vote with

(23:27):
your wallet is they have thicker wallets than you. Right,
you are always going to lose that election, right, So
you know, don't vote with your wallet. Be a citizen.
Someone in the reception before this asked me about what
advice I would give to tech workers, and statistically a
bunch of you are probably tech workers. So you know,

(23:49):
the tech workers did have this chance to consolidate their power and they missed it. So for a long time, one of the forces that constrained enshittification was workers themselves, because tech workers are this uniquely constituted workforce. Historically, the tech sector has always had very low union density but an enormous amount of worker power, and that's

(24:09):
because tech workers were both very scarce and very, very valuable. The National Bureau of Economic Research estimates that the average tech worker in Silicon Valley and in Seattle was adding a million dollars a year to their employer's bottom line. Right? That's why they gave you free kombucha and massages, and why they'd hire a surgeon to freeze your eggs so you could work through your fertile years. It wasn't because they

(24:30):
loved you, right, it was because they were afraid of
you getting a job across the street. Now we know how tech bosses treat the workers they're not afraid of losing. Right? That's the Amazon driver who's got the AI camera that takes points off if they look away from the road to check something to one side or the other and it decides that their eyeballs were in the

(24:51):
wrong orientation. It's the warehouse workers that are injured at three times the rate of other warehouse workers in the sector. It's everyone who pees in a bottle. It's the people who assemble iPhones in China and have suicide nets around the factory. Right? That's how they treat the workers they're not afraid of. So there was this opportunity at one point for tech workers to use that power and consolidate it through a union,

(25:13):
and they missed it, right, because tech workers thought that
they were temporarily embarrassed founders. They didn't think that they
were workers. And they thought that because their bosses would
meet them in monthly town hall meetings where they could
ask impertinent questions about corporate strategy, that their bosses thought
that they were peers. But your boss didn't think you
were a peer. Your boss thought you were a problem
to solve. And after half a million layoffs in the

(25:35):
tech sector, they're not afraid of you anymore. There's other workers who will take your job. You can no longer say, I refuse to enshittify the thing I missed my mother's funeral to ship, and you can't hire someone else to replace me, because they'll just fire you and hire someone else to replace you. And so now is the time to unionize.

Speaker 6 (25:52):
Yes. Also, if you have anything pertaining to the revenues or spend of Anthropic or OpenAI, you could email me that information, and I could add shit, piss, and fuck between the numbers. They love it on the podcast.

Speaker 3 (26:12):
And so you know, it might feel like a bad
time to be unionizing because we no longer have the
National Labor Relations Board as it was constituted under Biden.
In fact, it's been so illegally denuded of commissioners that
it can no longer form a quorum and investigate unfair
labor practices. So it can feel like this is a
bad time. But here's the category error Trump is making.
Trump thinks that the reason we have unions is because

(26:34):
we have the National Labor Relations Act. It's backwards, right? Long before unions were legal, we had unions, right? And the union peace represented by the National Labor Relations Act was brought about because bosses were scared of what their workers were doing to them at that point, because militancy had gotten so intense that they sued for peace, and

(26:56):
that peace came in two parts. Part of the National
Labor Relations Act describes what your boss can't do to you,
but a lot of the National Labor Relations Act is
about what you can't do to your boss. And one
guess which half of the National Labor Relations Act has
been most vigorously enforced. So Trump thinks that we fired
the referee, and so that means all the players have
to leave the field. He's wrong. When you fire the referee,

(27:18):
it means there's no more rules. Right? And there's a reason that fascists attack unions first: it's because the opposite of fascism is solidarity.

Speaker 4 (27:35):
Yes, excellent answers all around. And just to summarize, what I heard was: don't vote with your wallet, unionize. So take that out the door. And efa.eff.org, plug it.

Speaker 5 (27:47):
Okay.

Speaker 4 (27:48):
My next question for you is one that a lot of people have been thinking about, which is the bubble. The bubble is coming. We've all heard about the bubble. Is the bubble real? What is gonna happen?

Speaker 6 (28:00):
Then?

Speaker 4 (28:00):
If we are tiny investors that have our 401(k) in index funds, what the fuck do we do?

Speaker 3 (28:05):
Buy long poles to dig through rubble with, and canned goods?

Speaker 5 (28:10):
Let's try again, ed.

Speaker 6 (28:14):
Go back in time. But if you can't, know that gravity exists. Right now the market is going absolutely batshit insane. I'm not a stock analyst, I cannot give you financial advice, but right now, number go up, so maybe you could. I don't own stocks. I'm a psychopath. I live in cash and invest in words. The problem we have right now

(28:34):
is that Nvidia is the largest stock on the stock market. There has never been a more problematic fact than that eighty-eight percent or more of their revenue is selling these fucking GPUs. You want to know what happens when you plug in those GPUs? They start losing you money. Nobody, not a single company other than Nvidia, is making any kind of profit on AI. In fact,

(28:55):
they're burning billions. The reason I tell you this is you can't really navigate away from the bubble right now other than just selling, I imagine. So know what you're going into. Everything you're reading Oracle say right now about their relationship with OpenAI is bullshit. And to get specific, Oracle has a five-year-long, three hundred

(29:16):
billion dollar contract with OpenAI. Sounds amazing, right? They just need four and a half gigawatts of data center capacity. Any guesses how much they have? Two hundred megawatts. Now, that's less than four and a half gigawatts.

Speaker 3 (29:29):
That's a lot less.

Speaker 6 (29:31):
That's a lot less. And I must be clear, I actually mean power, not IT load. In IT load they've got about a buck thirty. So not to worry though, because OpenAI also doesn't have the money. Nevertheless, the stock has run. AMD now has a deal where they're gonna sell chips to OpenAI. Who the fuck knows who's paying. Nvidia, same deal. Whenever you see OpenAI involved, just don't. Just don't

(29:51):
do it. There is no avoiding a bubble popping now, there just isn't. So the smartest thing you can do is stick to the fundamentals. We're going to get into the more hysterical phase now. We're going to see some crazy shit. I had someone suggest the other day that OpenAI may do a SPAC IPO. I think it's insane, which is why it's possible. Do not let

(30:11):
your money touch OpenAI under any circumstance. That company is cancer. I'd say the same thing about Anthropic. But just be aware that right now the stock market and all of the associated media is going to tell you, oh, there's a bubble, but I've read in many places, oh, bubbles can be good. They can never be good. That's why we call them bubbles. I've seen it in multiple headlines,

(30:31):
where they're like, yeah, but this is a good bubble. Yeah,
you know, I've got the good kind of cancer. I've
got the good kind of diarrhea. Anyway, the point is
knowledge is power here, and you're going to look at
the media, and the media is going to say, AI, number go up, everything great. Broadcom's gonna ship ten billion dollars of chips, ten gigawatts of chips, to OpenAI

(30:53):
by the end of twenty twenty nine. It costs fifty billion dollars and two and a half years to make a gigawatt of data center capacity. Everyone is going to make it feel like they're saying this is inevitable. Know that it's not. I realize I can't give you much better advice than that, but just know that gravity exists. This cannot succeed. On top of the fact that everyone's unprofitable,

(31:15):
it's not actually that popular either. ChatGPT is very popular because a lot of people love being driven insane or trying to fuck it, I think. And terribly for me, that's some of my friends. No, sorry, Kellen. But it's frustrating as well, because I'm sure all of you

(31:37):
have felt the poison of generative AI within your workplace, and these people will tell you that AI's coming, you must learn to use AI. The reason it's not able to do your jobs is it's shit. Like, I realize that all of this may sound very elementary to you, sounds like most of you get it. You're going to keep reading that it's replacing coders. It's not the case. It isn't. That is a fucking lie.

(32:01):
Everyone's saying it, Sundar Pichai, Satya Nadella, Andy fucking Jassy. These fucks love to say this stuff. It's not true. So really, the advice I can give you is this is going to pop. It's going to happen, so act accordingly.

Speaker 3 (32:18):
So let me advance a theory of the less-bad and more-bad bubble, if not a good bubble. That would be nice. So some bubbles have productive residues and some don't, right? So Enron left nothing behind. Now, WorldCom, which was a grotesque fraud, as some of you will remember, right, they raised billions of dollars claiming that they had orders for fiber. They dug up the streets all over the world.

(32:40):
They put fiber in the ground. They didn't have the
orders for the fiber. They stole billions of dollars from
everyday investors, people who just wanted to go through their
old age without starving to death or going without a roof over their head. The CEO died in prison and it was good riddance. But there was still all that fiber in the ground. Right? So I've got two gigabits symmetrical fiber at home in Burbank because AT&T

(33:02):
bought some old dark fiber from WorldCom. Because fiber lasts forever, right? It's just glass. So once it's there, it is a productive residue. Right? So what kind of bubbles are we living through? Well, crypto is not going to leave behind anything. Crypto is going to leave behind shitty Austrian economics and worthless JPEGs. Right? AI is

(33:24):
actually going to leave behind some stuff. So if you want to think about, like, a post-AI-bubble world, and I'm just, I just got edits from my editor, I wrote a book over the summer called The Reverse Centaur's Guide to Life After AI. And if you want
to think about a post AI world, imagine what you

(33:45):
would do if GPUs were ten cents on the dollar,
if there were a lot of skilled applied statisticians looking
for work, and if you had a bunch of open
source models that had barely been optimized and had a
lot of room at the bottom.

Speaker 6 (33:58):
Right, I've got to push back on this. Okay, these AI GPUs are mostly owned by private equity firms and big tech. The majority of the GPUs are not owned by people who will let them enter the market, and there are really not a ton of, like, if the idea is that with the GPUs someone will work something out, sure. But you

(34:18):
don't think all the king's horses and all the king's men would have come up with something else? Because that's the thing. The thing that I worry about, the thing that terrifies me about this bubble, is this is not useful infrastructure at all. Everyone loves to say, oh, it's just like the dot-com bubble. It's nothing like that. It's absolutely not. It's not even like the fiber. At least the fiber was somewhat useful. These GPUs, the amount of power alone, sure. And one of the reasons that Open

(34:40):
AI and Anthropic have been able to have their monopolies is because they have the capital, and no one else is going to have the capital to run these things. And I mean, they'll be selling A100s, they'll be going for fifty cents an hour. It's just, what terrifies me is, and I'm not saying you're doing this,
There are people already trying to rationalize this. They say, well,

(35:03):
and you were the one that actually told the stories of the dot-com era, where there were useful services, basically, before AWS. Right? I don't see that usefulness here.

Speaker 3 (35:10):
So I take your point. I think that there's going to be a lot of firms in receivership, right? I don't think that private equity bosses' preferences are going to enter into it. I think that there's going to be a lot of firms in receivership. And I do think that when you contemplate the intersection of optimization of existing open source models, you know, think about what happened when DeepSeek entered the market. Right,

(35:33):
You've got Chinese firms that are prohibited from using the
more advanced GPUs. So rather than doing this sort of
display of how serious they are about AI by spending
as much money as they can, they said, okay, well, what can we juice out of, you know, previous generations of GPUs? And then they got some pretty impressive results.

Speaker 6 (35:52):
Yeah, but what results? Like, so they made it cheaper, sure, but what actually happened as a result of DeepSeek, other than everyone in all of Silicon Valley going, that didn't happen?

Speaker 3 (36:01):
Sure, we get that they can do it cheap. But my point, my point is not about the market reality, it's the material reality. So I'm talking about what happens after the market pops, right? So you know, I've seen people do interesting and useful things with AI. I think you've probably seen people do some useful and interesting things. I'll give you an example. I was writing an essay and I couldn't remember where I'd heard a quote

(36:23):
I'd heard in a podcast. I couldn't remember which quote it was, so I downloaded Whisper, which is an open source model from OpenAI, to my laptop, which doesn't have a GPU, right, a little commodity laptop. I threw thirty hours of podcasts that I'd recently listened to at it. I got a full transcription in an hour, and my fan didn't even turn on.
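(For readers who want to try what Cory describes here, this is a minimal sketch, not part of the panel, assuming the open source openai-whisper package and ffmpeg are installed locally; the folder name and model size are illustrative.)

```python
# Illustrative only: local, CPU-only transcription with the open source
# "openai-whisper" package, roughly the workflow Cory describes.
# Assumes: pip install openai-whisper, ffmpeg available on PATH.
import glob
import whisper

model = whisper.load_model("base")  # small model; runs without a GPU

for path in sorted(glob.glob("podcasts/*.mp3")):  # hypothetical folder of episodes
    result = model.transcribe(path, fp16=False)   # fp16=False suppresses a CPU warning
    with open(path + ".txt", "w", encoding="utf-8") as out:
        out.write(result["text"])                 # plain-text transcript per episode
    print("transcribed", path)
```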

Speaker 6 (36:40):
That's awesome.

Speaker 3 (36:41):
Yeah, so I know tons of people who use this. In the title of the book, reverse centaur refers to this idea from automation theory, where a centaur is someone who gets to use machines to assist them. A human head on a machine body, right? So, you know, you're riding a bicycle, you're using it.

Speaker 6 (36:56):
Thought it was with human legs.

Speaker 3 (36:58):
Yes, and a reverse centaur is, that's right, a machine head on a human body. Right? It's someone who's been conscripted to be a peripheral for a machine. Right? And, you know, I should say, you talked about good cancer. I have the least bad kind of cancer. I've got a very treatable form of cancer. But I'm paying a lot of attention to stories about cancer. And

(37:20):
you know, there are open source models, or AI models, that can sometimes see solid mass tumors that radiologists miss. And if what we said was, we at the Kaiser oncology department are going to invest in a service that is going to sometimes ask our radiologists to take a second look to see if they missed something, such that instead of

(37:40):
doing one hundred X-rays a day, they're going to do ninety-eight, right, then I would say, as someone with cancer, that sounds interesting to me. I don't think anyone is pitching any oncology ward in the world on that. I think the pitch is: fire ninety percent of your radiologists and have the remainder babysit

(38:01):
the AI, have them be the accountability sinks and moral crumple zones for a machine that is processing this stuff at a speed that no human could possibly account for, have them put their name at the bottom of it, and have them absorb the blame for your cost-cutting measures. And so, you know, when I hear people talk about AI, right, I hear programmers talk about AI doing things that are useful. Like, one that I've heard many programmers say is, I

(38:24):
had one data file in a weird format that, as a one-off, I needed to get into a different format, and I had ways that I could check it and tell what was going on, and so I one-shotted it with an AI. I did some checksums and it was great, and it saved me an hour. That's a centaur, right? Sure. But here's the thing.

Speaker 6 (38:41):
How many of those actually exist at any given time? And how many of them are, with the DGX boxes that Nvidia's launched, like, I could see a future where large language models are run locally, sure. Well, so I question how much of the cancer-related stuff would actually be generative, but that's a separate discussion, because one of the things these people do is they conflate AI that's been around

(39:02):
for a while with generative AI. And my concern is, I don't know, a lot of the, like, anecdotal one-shot stuff. And Carl Brown of Internet of Bugs, if any of you have ever seen it, fantastic YouTube channel, you should look him up, he's fucking brilliant, he said it makes the easy things easier and the harder things harder. Sure, if we could have that client-side, awesome. But

(39:23):
I don't really think any of this GPU infrastructure, even in receivership, is gonna, like, you can already get discount A100s, H100s, H200s. Can't wait for the discount Blackwells, pieces of shit.

Speaker 4 (39:49):
I'm going to put my fingers between the tigers here.

Speaker 3 (39:54):
I really want to squeeze out one more example here. It's a really good one because it's about a nonprofit people should be supporting as well. It's a nonprofit called the Human Rights Data Analysis Group, hrdag.org. It's run by some really brilliant mathematicians and statisticians. They started off doing statistical extrapolations of war crimes for human rights tribunals, mostly in The Hague, and talking about the aspects of

(40:17):
war crimes that were not visible but could be statistically
inferred from adjacent data. They did a project with Innocence
Project New Orleans, where they used LLMs to identify the linguistic correlates of arrest reports that produced exonerations, and they used that to analyze a lot more arrest reports than
they could otherwise. And they put that at the top
of a funnel where lawyers and paralegals were able to

(40:39):
accelerate their exoneration work. That's a new thing on this earth, right? It's very cool. And I'm like, okay, well, if these guys can accelerate that work with cheap hardware that
today is out of reach, if they can figure out
how to use open source models but make them more
efficient because you've got all these skilled applied statisticians who
are no longer caught up in the bubble, then I

(41:00):
think we could see some useful things after the bubble. That's my argument for why this is fiber in the ground and not shitty monkey JPEGs.

Speaker 5 (41:08):
Thank you both, we got there.

Speaker 4 (41:11):
So what I think I heard was, we feel like the GPU infrastructure is just kind of fucked, that's not going to be useful afterwards. But also the actual technology itself could possibly be useful in a number of use cases that could help socially.

Speaker 3 (41:27):
Just, if workers get to decide how they use their tools, they generally will be able to make some good decisions about it. And new tools for workers who are skilled and get to decide how they use them, that's great. You know, look, I've met a video editor who changed the eye lines of two hundred extras in a crowd scene, right, using a deepfake. And he

(41:48):
was like, yeah, I was, you know, sitting with the director. We thought, wouldn't this scene be really interesting if they were all looking that way instead of this way? And they were able to do it. And I'm like, okay, that's a new tool on this earth. Is it, like, generative? Yeah, it's a deepfake, right, they just moved the eyeballs.

Speaker 6 (42:02):
Yeah, but the deepfake, is that generative AI? Because this is really a point.

Speaker 3 (42:07):
It is, because, well, they made new pixels where pixels didn't exist before by making an inference, right.

Speaker 6 (42:13):
Right. But that doesn't mean it's the same thing. The reason I say this is not because you're wrong about that, that example is great. It's just these pigs have got rich conflating the useful with the useless. Okay, okay, all right, he's also right. But thank you, sirs.

Speaker 4 (42:30):
I do have one, one small anecdote to add myself. Yes, I am of value.

Speaker 5 (42:36):
I work in the video games industry.

Speaker 6 (42:37):
Of course you do.

Speaker 4 (42:38):
I work in the narrative department, which means I write a lot of dialogue, and in my pipelines we've actually, we have found a use for generative AI that doesn't steal anybody's work and makes us work about fifteen percent faster. So what we've done is we got permission from our SAG actors to train our personal closed generative

(42:59):
model on their voices, plug that into what we call robo voice, so our lines get text-to-speech instead of the horrible Amazon Polly monstrosity that's like, hello, my name is robot. And what we're able to do is hear the lines and iterate on them faster, so that we're done by the time we actually get to record with the SAG actors at their SAG fees, and

(43:21):
nobody has lost any work. So I think my point is, it's cool, right? Like, generative AI absolutely does have some uses. Is it worth burning down the planet? I can't answer that for you. It's a Wednesday, right? But there's, like, there's baby and there's bathwater, and I think it behooves us all to continue to have the discussion, right, to not one hundred percent close the door.

(43:42):
And on that note, we are going to switch to answering questions from the audience, and I have one picked out already. Since we're all adults here, I thought I would start with a spicy one, which is: wouldn't slowing AI development in the United States just allow China to dominate all of us?

Speaker 3 (44:01):
Oh good, Oh good?

Speaker 6 (44:05):
So I love hearing the shit that was said about the Soviet Union said again. Oh no, what if we don't have ChatGPT like China? Oh no, what will we ever do if we don't have a chatbot we can fuck, like China? It's nonsense. The reason that China is pulling ahead of America in any way is because they have a massive renewables initiative and also terrible

(44:27):
working conditions. But putting all that aside, that whole thing is built on an inherently Sinophobic idea and also just nonsense, which is that America does not have the resources to get this. Nowhere, ever, has anywhere near this much money been shoved into one thing for years, not the dot-com boom, not the post-Telecommunications Act boom, which Cory

(44:50):
probably knows way better than I do. But I think there was four hundred billion dollars put into that nonsense. Nobody has ever had more chances and more money to do anything. Using China as a convenient excuse to spunk money away every month is a fucking stupid idea. But also, to develop what? Look at the last releases from Open

(45:10):
AI. Atlas, oh wow, a fucking web browser. Great, I've never used one of those. Oh, it can read the web page and then fuck up and not buy a thing. Wow. I can just take five edibles if I want to not use Amazon properly. But in all seriousness, there is no development to be made that is not being made in America. Shit-tons of talent, tons of

(45:32):
it Chinese by the way, tons of Chinese engineers, as you all know, like, it's a very common thing. There is no development that's being missed. What are we gonna do, oh no, we can't hand Anthropic five billion dollars a year to destroy for no real reason? It's not a cogent argument. Because look at what China

(45:53):
seems to be doing: about the same thing as America is doing with large language models, with way less. That just says more about large language models than anything.

Speaker 3 (46:01):
So there's a long history of saying that we should
subsidize and protect large American firms to defend America against
foreign firms. So for many, many years, the argument against
breaking up AT and T was that they were our
national champion and that they were keeping us safe from

(46:21):
foreign aggressors. In the mid nineteen fifties, AT and T
was almost broken up. The DoD intervened to say that
if we lost AT and T, if it wasn't intact,
America might lose the Korean War. So AT and T
won the Korean War because they got another thirty years. Right,
the only people arguably who won the Korean War were

(46:41):
AT and T, including both sides in the Korean War.
So when the eighties rolled around, right, there was this lawsuit in the seventies, then the eighties rolled around and we were once again thinking about breaking up AT and T. There was this long argument about how there was this belligerent foreign power in the Pacific Rim: they weren't original,

(47:01):
they stole our IP and cloned our technology, and they would destroy our high-tech industries if we did not have a giant company to defend us against them. That country was Japan. Now, it turns out that the major project of AT and T was not defending America from Japan. It was preventing Americans from getting

(47:22):
modems, because AT and T really did not want you or anyone else in America to be able to provide a service to one another without them being able to veto it or charge rent on it. Right? If you think about, like, the rollout of caller ID: six dollars ninety-nine per month when it rolled out, right? This was what it cost you to find out who was calling you before you picked up the phone. You can't

(47:44):
do that once people have modems. There is no caller ID for email. Right? If your email provider said that until you click the message in the list pane you don't get to find out who the from address is, you would just change email providers. So by controlling the network, by centralizing the network, they were able to just, basically, like, kneel on the throat of the American tech industry.

(48:08):
And so it is always the case that when we
defend monopolists in order to prevent foreign firms from destroying
our high tech sectors, what we're actually doing is we
are defending the firms that are structuring and controlling the
market domestically for our own innovation. So these very large firms that are doing AI now, AI is only, like, one of the things they do. Obviously we have Anthropic

(48:29):
and OpenAI, but, you know, Microsoft, Google, Apple, Oracle.
These are firms that are effectively market structures. They get
to decide what products exist and what products don't exist,
how much they're going to cost, and who can see
them and who can use them. Right, And when we
defend them in the name of preventing China from pulling
ahead in AI, what we're effectively doing is we're saying
that we should maintain this shadow FTC that has more

(48:52):
power than the FTC ever managed to exercise, but that never exercises it in the interest of the American public, only in the interests of their shareholders.

Speaker 4 (49:02):
Thank you. All right, we have about five minutes left, and I run a tight ship, so this is the last question. So the question is: I have a high school senior who is technically minded. Ten years ago I
would have told her to get a CS degree and
go into tech. What should you tell someone to study now?

Speaker 6 (49:25):
Finance. It's interesting, well, I want to tell you a little story. So I taught myself economics in this past year and a half. I'm a fucking idiot, I just want to be really clear. I don't consider myself particularly smart, however you feel is up to you. The point is, the world right now, the media, tons of analysts,

(49:45):
tons of tech founders, don't know shit from fuck. Those are technical terms. In all seriousness, finance is not as difficult or complex as it sounds. And indeed the world runs on money. But also these corporate structures are run fairly simply, but they dress them up in these confusing and annoying terms that actually, when you break them down, are pretty simple. There are people to blame, they've got houses,

(50:08):
they're very flammable. But in all seriousness, these people, they're empowered through ignorance. They're not empowered through ability, intelligence, or even making good products. They are powerful because we have, and I'm not trying to sound paranoid, this is just the truth, media that in many cases does not understand

(50:28):
the things they are writing about, or indeed have the capability of reading an earnings statement. This stuff seems very scary. You may say, I'm not a numbers person, I couldn't possibly. You are a numbers person. It's not that hard, it really isn't, if I can do it. I failed mathematics years and years in a row. It really is fairly straightforward. They literally format things to make them boring, they word

(50:50):
them to make them seem confusing. You can pick apart a corporate structure. Any one of you can do it. That is where anyone can have an effect. I think a high schooler could do it. You want to change the world? You pull away their power. You take away their ability to obfuscate their wealth and the way they accumulate power by learning how these things work. I've written a lot

(51:10):
about it. I had to learn everything you read. It's so long because I'm learning as I go. It's ten thousand words because I'm teaching myself, like, what the fuck is this? Anyone can do this. You want to teach a young person anything? Honestly, teach them who the fuck Milton Friedman was, teach them how the world got the way it got, but also teach them how companies are structured. It is not that complex. And the reason that you get knocked

(51:32):
off course is because the powerful people go, no, it's not that simple, it's actually very mystical. Sam Altman, he is not just a con artist, he's not just a guy that's really good at convincing rich people to give him money, he has secret brilliance. No, he doesn't. None of these people do. The McKinsey eggplants, these fucking people. Long story short, learn finance, seriously, and

(51:55):
you don't have to go to school.

Speaker 3 (51:56):
It's good advice. You know, there's this idea from the finance sector, this acronym MEGO: my eyes glaze over. It's when you lard so much complexity into a prospectus that no one can get through it, and they assume that a pile of shit that big has to have a pony underneath it. I have different advice, though. So my daughter started college this year and she wasn't sure what

(52:19):
to do, and we had a lot of talks about
what to do. She didn't take my advice, but I'll
tell you what I told her, which is that if
you don't know what you want to do at university,
don't go to university. Go to college and become an electrician.

Speaker 6 (52:31):
Abso-fucking-lutely.

Speaker 3 (52:32):
There is so much work for electricians, and we are going to be solarizing for the next forty years. There's going to be infinity work for electricians. It's like being a plumber, but you don't have to touch poo. You can be an electrician who just does emergency call-outs when money's getting low and you charge five hundred bucks an hour. You can be an electrician on a cruise ship. You can be a theater kid.

(52:54):
You can be an electrician on a job site. You
can be an electrician for the government. You can be
an electrician on a battleship. You can go abroad and
be an electrician because electricity is the same, right? And it's interesting work, and you get paid on the job. Right, you get paid for your apprenticeship,
and you can be in a union. And if you

(53:17):
decide later you want to learn more and you like it, you can become an engineer, and if you don't, you can put yourself through college by being an electrician and learn finance.

Speaker 6 (53:27):
And one other thing, seriously: I've talked to basically every power analyst there is at this point, or read their work, and I will say, there's always space for someone who knows electricity.

Speaker 3 (53:38):
One hundred percent.

Speaker 6 (53:39):
You will make so much, I'm not kidding. There are people who will pay you fifteen hundred dollars an hour.

Speaker 3 (53:43):
Yeah, it's crazy. Or HVAC. HVAC plus electricity? Oh hell yes, that is a killer combo.

Speaker 6 (53:50):
No, really, you will make so much money, I'm not even kidding. And it's like, the electricity stuff, that's hard, I can't do that. The numbers? Easy peasy.

Speaker 3 (53:58):
Yeah, if you get your numbers wrong, you don't electrocute yourself.

Speaker 1 (54:01):
So there.

Speaker 3 (54:03):
No one ever fell off a roof doing math.

Speaker 4 (54:06):
That's true. Well, Ed, Cory, we have time for one more question. Okay, sneaking that in there, and we are at time. Okay. Oh no, now you're making me panic.

Speaker 5 (54:17):
Okay, Oh no, I got a certification at work.

Speaker 6 (54:22):
Anyone used to compute?

Speaker 5 (54:23):
No, we're going, We're going hang on, hang on.

Speaker 6 (54:34):
I texted Sam Altman the other day and he didn't text back to me. Just a fact to share with the audience. Dario from Anthropic, too.

Speaker 4 (54:44):
This is a short-answer lightning round, which is:

Speaker 5 (54:48):
If you had to guess, what is the

Speaker 4 (54:50):
timeline we are looking at for the AI bubble to pop? Make your bets, people.

Speaker 6 (54:56):
Lightning answer: Q two, twenty twenty six. Oh.

Speaker 3 (55:00):
So I am a firm believer that the market can remain irrational longer than you can remain solvent, so I'm not going to try and guess. But I think it's coming, and, to your point, Ed, I would say that the number of foundation models that will be around after the crash very likely could be zero.

(55:20):
I'm not saying that it must be zero, but I think it very likely.

Speaker 6 (55:22):
The open ones will be around.

Speaker 3 (55:24):
But yeah, the open models. You can't kill an open source model if people like it and copy it.

Speaker 6 (55:28):
As a fringe model. Or they'll kill Claude.

Speaker 3 (55:31):
Yeah.

Speaker 4 (55:32):
Yeah, all right, so we have immediately, and later than you can stay solvent.

Speaker 5 (55:39):
So don't try to guess.

Speaker 3 (55:40):
Yeah, save your stacks, don't try and short that market. By all means, practice digging for canned goods in rubble; use that time.

Speaker 6 (55:48):
Effect on my electrician.

Speaker 3 (55:50):
I had both my hips replaced and my cataracts done just in case civilization was about to fail. I saved one of my femurs. I had a cane topper made, cast in brass. I had it scanned in three D. If you go to Internet Archive dot org, slash doctorow dash femur, you can get a twelve hundred DPI STL file so you can print my diseased femur. I wanted to make soup stock, but my wife said no.

Speaker 5 (56:16):
And on that note, everything's gonna be fine. It'll be fine. All right, have a wonderful night, everybody.

Speaker 1 (56:20):
Thank you, Thank you for listening to Better Offline.

Speaker 6 (56:31):
The editor and composer of the Better Offline theme song is Mattosowski. You can check out more of his music and audio projects at Mattosowski dot com, M A T T O S O W S K I dot com. You can email me at ez at Better Offline dot com, or visit Better Offline dot com to find more podcast links and of course my newsletter. I also really recommend

(56:52):
you go to chat dot Where's your ed dot at to visit the Discord, and go to r slash

Speaker 1 (56:56):
Better Offline to check out our Reddit. Thank you so much for listening. Better Offline is a production of Cool Zone Media.

Speaker 4 (57:04):
For more from Cool Zone Media, visit our website, coolzonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.