
August 7, 2024 · 113 mins

In the third live-to-tape episode of Better Off Live, Ed Zitron is joined in-studio in Los Angeles by Cory Doctorow and Brian Merchant to talk about the forces that have turned the tech industry away from innovation - and how we might turn the tide against them.

CORY DOCTOROW:

https://pluralistic.net/

https://x.com/doctorow

BRIAN MERCHANT: 

Blood In The Machine Book: https://www.hachettebookgroup.com/titles/brian-merchant/blood-in-the-machine/9780316487740/?lens=little-brown

Newsletter: https://www.bloodinthemachine.com/

https://x.com/bcmerchant

Ed's Socials:

https://www.twitter.com/edzitron

https://instagram.com/edzitron

https://bsky.app/profile/zitron.bsky.social 

https://www.threads.net/@edzitron

Newsletter: wheresyoured.at

Reddit: https://www.reddit.com/r/betteroffline

Discord: chat.wheresyoured.at

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Cool Zone Media.

Speaker 2 (00:05):
Hello, and welcome to Better Offline. I'm your host, Ed
Zitron, and this is the third, I was about to say inaugural,
but I'm not sure that's what that word means, episode

(00:25):
of Better Off Live, the live to tape radio show
that I shove in there when I'm in cities. Today,
I'm joined by two titans, titans of the tech industry. Yes,
I've got Cory Doctorow here, of course, author Cory
Doctorow, and Brian Merchant, other author and journalist. And
I am so excited because today we're talking about how
we fix the entire tech industry.

Speaker 3 (00:46):
Yeah, sounds like an easy order. I don't know what
we're gonna do with the excess time after we're done
with that.

Speaker 1 (00:51):
Looking forward to it. That's probably twenty minutes, yeah, something, max. Yeah,
we'll talk about our various holidays.

Speaker 2 (00:58):
Brian just got back from the French Riviera. It
sounds like that's true.

Speaker 3 (01:00):
Yeah, I did. We got stuck in Philadelphia, so my
brain is still.

Speaker 2 (01:04):
The French Riviera of America, of America. Yeah, and
welcome to scholarly debate, gentlemen. So the thing that I
keep thinking at the moment, the thing that really, I had to,
it was like pessimism versus optimism that brought
you two together today with me, and I keep coming
back to this, which is everyone's kind of depressed in
tech at the moment. It feels like everyone's very negative.

(01:26):
And Brian, I know you've written a ton about Luddites,
obviously, and I'm wondering how we actually reverse this trend. Like,
how do we actually get away from the fact that
everyone feels miserable with their devices right now? Because I
love tech. I adore my gizmos, my gadgets, my apps.
I do care that they're messing with me, and
I write a lot about it, but I want everyone

(01:46):
else to enjoy it again. I just don't... how do
we even begin that? Well.

Speaker 3 (01:50):
I think that the story that the tech industry wants
to tell us is the origin of this depression, because
the tech industry wants us to think that you have
to take the bad with the good. Right that you know,
if you want to have a friend, Mark Zuckerberg has
to be your friend too. If you want to search
the internet, Sergey Brin needs to know about it. If

(02:12):
you want to have a phone that works, then Tim
Apple gets to charge everyone who helps you use it
thirty percent of whatever you pay them. And their argument
is that, like, building a phone that works that doesn't
have this thirty percent vig is like making water that's
not wet, right, right. And that's just, like, patently untrue.
And so I think that, you know, to the extent

(02:34):
that we're like God, you know, I like being able
to talk to my friends and being able to you know,
download a movie without having to go to the video store,
and I like all of these other things. I guess
I just have to live in the apocalyptic wasteland or
I have to go and you know, live in a

(02:54):
shack in the woods and be the Unabomber. And I'm
here to argue that there's, like, you know, we could
seize the means of computation. We could decompose the prix fixe
menu into an à la carte, a little French content
for you there, Brian, and we could have a new
good Internet that consists of the stuff we like and
throws away the stuff that's bad.

Speaker 1 (03:16):
Yeah. And that's why, you know, the Luddite is
such a handy epithet for them, in its, you know,
incorrectly deployed sort of context. That's why they use it, right?
They say, oh, if you're against any one
of those things that Cory just brought up, then you
are, in fact, against the whole of technology. And then
we can sort of deride you, we can write you off,

(03:37):
we can cast you aside. And it has made for
the last twenty years or so at least kind of
you know, actual sort of substantive critiques and attempts to
reconfigure some of those systems much more difficult. You have
to clear that bar first, you have to say,
you have to start every argument with, well, I'm not

(03:59):
a Luddite, but I actually like technology. But I
think that and we are only now kind of reaching
a point where we have enough sort of a critical mass,
enough momentum where we don't have to sort of, you know,
dither around and sort of make all of that,
all of that...

Speaker 3 (04:13):
Sort of pretext. It's clear now, I think.

Speaker 1 (04:15):
And to your point, the Internet is such a
festering mass in so many ways that it's just self-evident.
You log onto this thing and Google barely works. Everything's
tracking you. Surveillance capitalism is at work in every
nook and cranny, and anybody who uses
it, besides just a few isolated apps, can kind of
feel it. It makes your skin crawl these days. So

(04:37):
the task that we have now laid out before us
to try to fix this stuff, I think is clearer
than it ever has been. And I do think, you know,
I often get criticized for being a pessimist or a
doomsayer, but I'm really not.

Speaker 3 (04:49):
I'm actually quite idealistic.

Speaker 1 (04:51):
I think that there's a lot that we can do
that's fairly straightforward to start sort of fixing some of
this stuff.

Speaker 3 (04:58):
And I know Cory and I are.

Speaker 2 (05:01):
In agreement on all of it. But yeah, and I think that, oh,
we'll get to where you disagree later. That's where the scholarly
debate is. Because this is the woke Lex Fridman show,
of course. But the thing I keep coming back
to as well is these people who are like, I'm
a techno-optimist. Not just the capital-T, capital-O
Andreessen types and the fans of Nick Land,

(05:21):
I guess, but even the people criticizing, say, TechCrunch
for being against, I don't know, a writer saying fuck
people who don't like the AI,
which is bullshit. Sorry, Connie. The thing is, these people
don't seem to like tech. They seem to like business.
They don't seem to, it's not like I see Balaji
sitting around digging around on his fucking iPhone playing a game.

(05:43):
I don't see Marc Andreessen, or hear Marc Andreessen
talk about, even using technology. None of these people seem
to experience tech. They're just experiencing the ways in which
McKinsey and their various outgrowths are sandwiched in
between the layers of society, and it's so fucking
frustrating, because I, too, get called a pessimist. I have

(06:04):
people on the Reddit, the people who are, I don't know,
actively stealing the hubcaps off of cars as they
listen to the show. They're pissed off, like, oh, you
just don't like it, you're just being negative, you're just
a pessimist. No, I love this shit, and I'm tired
of not loving it. I'm tired of it being bad. Facebook
used to be great. I know that that company's deeply evil,
even from the early days.

Speaker 3 (06:25):
I mean, it was founded so that Mark Zuckerberg and
his friends could non-consensually rate the fuckability of
their fellow Harvard undergraduates. It's, like, truly evil.

Speaker 2 (06:32):
But then even then there was a good side of it,
where, like, I was in college in two thousand and
five, at Penn State, back when it was going, and, like, there.

Speaker 3 (06:40):
Was something magical. You suddenly had these nights, for.

Speaker 2 (06:43):
Better or for worse, that you might have forgotten, or
moments that you might have forgotten, these
shitty digital camera pictures. You had people that you'd met
in passing, that you'd never meet again, except you could
now. And now, what, over fifteen years later, Facebook actively
intervenes if you try, if you're like, let's
find this person. Like, haha, no, no, no.

Speaker 3 (07:06):
So, can I draw a distinction? So I think
that your frame of optimism, pessimism, is not very
useful, because optimism and pessimism, I think, are beliefs
that don't leave a lot of room for human agency. Right.
Optimism is just this idea that history is running
on rails and it's running to a good place, right,

(07:27):
And pessimism is the opposite belief. But neither of them
really admits of a place where we act. And
I like hope, right, yeah, which is the idea that, like,
if you don't like your situation and you don't know
how to make it fully better, if you can find
even just one thing you can do to materially improve
that situation, you will ascend the gradient towards the future

(07:48):
that you want. And that as you attain that new
vantage point, you may find yourself able to see new
ways to make things better that were occulted when you
were further down the slope. And so you know, it's
not the most efficient way to get from A to Z,
but it's also much better than trying to treat

(08:08):
your activism and your life like a novel. Right. I'm
a novelist. I know how novels work. In novels, you
have a very neat dramatic arc where you do a thing,
and then you do the next thing, and you do
the next thing. You build to a climax and either
it's triumphant or it's tragic, and then that's the end.
In real life, there's a lot of backtracking, and the
fact that we can't see a novelistic way from A

(08:29):
to Z just tells us that we're living in the
real world and not a simulation.

Speaker 1 (08:33):
So where do we find hope then? Like, really, I
think that that is it. Yeah, I mean, I
would just also sort of comment on that.
I do think that there's, you know, there's utility
in pessimism, right, and there's utility in optimism as well. I
don't think it necessarily forecloses any futures. There's,
of course, you know, the pessimism
of the intellect and the, you know, the optimism of

(08:54):
the will. I think that's endured for a reason, and
I think it's applicable here. And I think these two
sort of, like, brief snapshots that you just offered of
Facebook, to use it as an example, sort of can
be illustrative in a sense, right? Like, you know,
it was founded for truly demonic purposes by a troll

(09:15):
man who continues to like wield influence over what is
now one of the most powerful platforms of all time.
And yet in that sort of interim period before it
became that from its evil origins, there was this period
where users had more power right where they were connecting
each other. I mean, some of that was somewhat illusory
because they're still subject to all of the you know restrictions. Well,

(09:39):
at the very least, Mark Zuckerberg was scared of
his users being upset. Yeah, and there was a
you know, the users were dictating the growth. I mean,
we had this kind of fertile period twenty years ago
or so, especially with Web two point zero, where it
was basically the users were taking the lead.

Speaker 3 (09:56):
Like Twitter was.

Speaker 1 (09:57):
Like nothing before. Users said, oh, what about a hashtag?
What about these features? What about, you know, how
can we sort of build this into something that's useful
for us? And, you know, a couple of, uh, you know,
white dudes in their twenties
got pretty lucky because they owned some of that infrastructure
at the time, but they at least had

(10:20):
the foresight to sort of let users lead, and to
see what was, you know, what was working,
what wasn't, what was popular, what helped. Obviously that's
imperfect as a model of genuine, sort of generalized
growth too, but it's something that is still powerful to me. Right,
we have a more democratic model of developing technology that
is not just limited to the company in Silicon Valley,

(10:42):
certainly not just limited to one founder, and how do
we sort of knock down some.

Speaker 3 (10:47):
Of those barriers.

Speaker 1 (10:48):
This is what Cory writes about a lot, like, especially
sort of knocking down the monopolistic and
the oligopolistic instincts and tendencies that these companies have acquired
now that they've concentrated so much wealth and power. But
we do see flashes of what can be. We still
have models of what's still good. I found myself pointing
more recently to like, Wikipedia, for all its flaws, still

(11:10):
a really interesting organization that still operates and creates really accurate,
good and enjoyable and useful information without you know, having
succumbed to so many of these immense like pitfalls and
become a disgusting cesspool like many of its other contemporaries.

Speaker 3 (11:27):
You know, I think that what we're getting at here
when we talk about what's hopeful about the future and
when we look back on the past, is that we
see these people who were at very best imperfect, I
speak as an imperfect person myself, and sometimes actually just
genuinely terrible people, who nevertheless produced things that we like.
And that's a very hopeful thing because it suggests that

(11:47):
We don't need perfect people to make good things.
Imperfect vessels can still hold something of value. And
so you have to ask yourself, why were they better
then and why are they worse now? And Ed, I
think you got at it right, right in the bullseye
a few minutes ago, when you said Mark Zuckerberg was
afraid of his users. Right. So I think that these

(12:09):
firms used to be disciplined by external forces, and by
some internal forces their workforce, and that they no longer
feel that discipline. Right. There was a time when firms
worried about competition. Hm, that's not so much a worry anymore.
We relaxed antitrust laws, and the long-run
effect of forty years of relaxed antitrust laws is five

(12:31):
giant websites filled with screenshots of text from the other four.
In the memorable phrase of Tom Eastman, and so there's
nowhere to go, right. Lily Tomlin used to do this
bit on Saturday Night Live, where she'd play an AT&T
operator doing commercials for the Bell System. They'd
end with her turning to the camera and saying, we
don't care, we don't have to. We're the phone company.
When you have a ninety percent market share in search,

(12:52):
when you've got twenty five billion dollars a year to
spend making sure that nobody ever sees a search engine
other than yours, because you're the default everywhere there's a search.
Why do you need to make the search good? You're
just the only game in town. You don't care, you
don't have to, you're Google. Right? So they didn't have
to worry about competition. They used to have to worry
about regulation, and we've seen some pretty big regulatory interventions.

(13:15):
You know. The real big one was Microsoft and the
DOJ for seven years over antitrust. And they don't have
to worry about that either anymore because once the sector
is really concentrated, it finds it really easy to figure
out what their lobbying position is and to stay on message.
If you remember the Napster Wars, right, there were like
one hundred little tech companies that in aggregate were still

(13:37):
much bigger, like orders of magnitude, bigger than seven entertainment companies.
The seven entertainment companies kicked their ass, yeah, because two
hundred companies are a rabble, right, seven companies are a cartel,
and so, thank you. Yeah. No, the cartel thing is
my mate

Speaker 2 (13:51):
Phil Broughton's, safety officer and a nuclear safety expert. But
in general he's been telling me cartel, cartel, cartel. And
I'm really coming round to this thing, that it's
not just it's not just about the monopoly. It's about
the fact that they're all sitting there shaking each other's hands.
Apple being paid, what, nineteen, twenty billion in the year

(14:11):
by Google to not do...

Speaker 3 (14:15):
So how is that? Google and Facebook carving up
the ad market with the collusive Jedi Blue
arrangement that was revealed by the Texas AG's lawsuit against them.
So yeah, I mean, these companies they pretend that they're
you know, bitter enemies, but they're all chummy when it
comes to the stuff that matters. And you know, once

(14:35):
you have your regulators captured, then you also can mobilize
law against your adversary. So it's not just about flouting
the law, it's about using the law against your enemies.
And what used to happen with tech... and one of
the things, you asked what we love about tech? One
of the things I love about tech is how flexible
it is, right, how you can always change it so
that it does what you want. And one of the

(14:56):
things that we got out of the concentration in the
tech sector is a capture of regulators that led to
an expansion of IP law that has now made it
illegal to reconfigure your own stuff.

Speaker 1 (15:07):
Right.

Speaker 3 (15:08):
So imagine that the three of us are having
a board meeting about our ad strategy for our website,
and you're chairing it, Ed, and you say, well, guys,
our KPI here is ad revenue, top-line ad revenue.
I figured out if we make the ads twenty percent more obnoxious,
we get two percent more top-line revenue. Brian, who
may not give a damn about users, can still stick
his hand up and say, Ed, has it occurred to

(15:29):
you that when we make the ads twenty percent more obnoxious,
forty percent of our users will type how do I
block ads into a search box? And then our revenue
from those users goes to zero forever, because they're never
going back to type how do I start seeing ads again.
And so when you raise the price of ink, you
have to worry about third-party ink cartridges. When you
make MySpace terrible, you have to worry about bots that
let people leave MySpace but still have their messages

(15:49):
imported into Facebook. You know, all these things that made
it very easy to lower the switching costs and go
where you wanted. Well, IP laws basically made that illegal.
Jay Freeman calls it felony contempt of business model, where
breaking DRM or violating terms of service becomes a crime.
And so companies just put DRM and terms of service
around their stuff so that anything you do that displeases
their shareholders is a felony. And then the last thing

(16:10):
that used to discipline these companies, something near and dear to
Brian's and my heart, was their workforce. Yes, because tech
workers were powerful, there weren't enough of them, and they
could get a job somewhere else. And yeah, their bosses
suckered them into working every hour that God sent by
telling them they were on a holy mission, and so
they missed their mother's funeral and their kid's Little League
game to ship the product on time. But that did
mean that they would feel profound moral injury when their

(16:32):
boss said let's make this shitty, and because they could
get a job somewhere else, they would tell their boss
to go fuck themselves. But two hundred and sixty thousand
tech layoffs last year one hundred thousand year to date
this year, tech workers aren't telling their bosses to go
fuck themselves. And so you have a boss who every
day went to the wall where there's a giant lever
labeled enshittification and yanked it as hard as they could.

(16:53):
And they couldn't get that lever to budge very far. Competition, regulation,
their own workforce, interoperable hacks, they all kept the lever
from moving freely. Now the lever just goes all the
way to one hundred, and from the boss's perspective, they're
just doing what they did every day of their lives.
They don't know why we're so angry at them.

Speaker 1 (17:10):
Yeah. And I think this also circles back to, you
know, what we were talking about earlier, about, you know,
the need and the imperative to sort of be critical
and to sort of oppose certain aspects of how tech
and tech companies especially are unfolding. Because the current state

(17:31):
of play, all these things that Cory's describing that are
absolutely true, you know, they didn't just, you know,
they didn't arrive prepackaged that way. I love that you
just walked us through all those three things that
no longer happen, and it's in large part a result
of the fact that there were a lot of missed opportunities to regulate,
a lot of missed opportunities to sort of constrict this

(17:53):
growth before it became what it is. My favorite example is, like, I
think it was twenty thirteen when the Obama administration had
a chance to sue Google, when it became quite
clear that it was engaging in monopolistic behavior and it
was becoming sort of a monopoly on search, and they
basically just kind of let them off with the slightest
slap on the wrist possible and said, go for it.

(18:14):
I mean, we all think about tech today and all
these people who are you.

Speaker 3 (18:18):
Know what happened there?

Speaker 1 (18:18):
Exactly what happened there. It was so Obama. I mean,
how far back do we want to go?

Speaker 2 (18:25):
That specifically twenty-thirteen era, because there's the current antitrust
lawsuit now. But what was the predecessor?

Speaker 1 (18:30):
Well, it was just, there was an FTC probe, and
basically the Obama administration could have sort of
green-lit a more aggressive investigation, a more aggressive suit into
Google's monopoly practices. You know, this was ten years ago,
so it was before it had become so sort of

(18:51):
like, obviously and so stultifyingly sort of dominant. What
does it have, ninety percent of the search market? So
it's literally just, like, the textbook definition.

Speaker 3 (19:01):
Of a monopoly in search. So it's twenty thirteen. That's right.

Speaker 2 (19:05):
So Prabhakar Raghavan at this point had joined Google, going
back to The Man Who Killed Google Search, my favorite story,
real piece of

Speaker 3 (19:11):
Shit. Motherfucker there, Prabhakar, come on the show. I'll take you, I'll take you on at the mic.
I'll take you out in the mic.

Speaker 2 (19:16):
But what's interesting is that that times really interestingly with that,
so that that interrogation never happened, they never really went deep,
and just then they hired Prabhakar from Yahoo after he'd
done his work destroying Yahoo's search, and that was when
Google began to sour. And it feels like that could
have been a complete... that's a historic moment. That's a

(19:36):
moment where everything could have changed.

Speaker 1 (19:38):
I mean, it was one moment of, you know, a
number that they could have. But yeah, and you know,
Cory has documented what's happened

Speaker 3 (19:46):
Since. It's just old-school.

Speaker 1 (19:47):
Payola practices, right? Like, Google has just sort of really
just kind of strangled the market, not through innovation, not
through introducing great products or ensuring that the user has
the best experience, but just by you know, sheer force of.

Speaker 3 (20:00):
Paying people out.

Speaker 1 (20:01):
The Apple deal is a great example, just literally
funneling billions to another tech company to make sure that
its product shows up at the top of another tech
product's, you know, service.

Speaker 3 (20:13):
So, yeah, you know, I think that if we want
to understand why Obama took the decision he did and
what's changed, you have to understand that during the Carter administration, right,
so we have to go way back, we started to
see the rise of a crank theory of antitrust law,
and that crank theory was that antitrust law was
being misapplied, that if you saw a monopoly in the wild,

(20:36):
what you'd actually encountered was something that was very efficient. Right,
if everyone goes and buys the same product at the
same store, uses the same search engine, it's the best.
Wouldn't it be perverse for the government to spend public
money to take the best product and make it worse, Right,
And so that was the theory that we got under Carter.
Reagan really liked it because it was very monopoly friendly, and

(20:57):
every administration since has bought into it, as did every
economics school in the country and eventually in the world.
This is a very Thatchery kind of ideology, but it
was also heard in the EU, thank you. And, you know,
that these ideas kind of raced around the world and
became orthodoxy, and they were funded by very rich people.

(21:20):
So there's a thing called the Manne seminar, M-A-N-N-E,
that is, sorry, so the Manne seminars are these
free junkets for federal judges where they teach them about
antitrust law, and they're run by rich guys, and
they fly them to luxury resorts and they explain how
antitrust law should work. And there's been quantitative work

(21:42):
on this where they took federal judges and they looked
at their decisions before and after going to a Manne seminar,
and they found that after going to a Manne seminar,
they weighed more heavily in favor of monopoly. And so
they built up this edifice of precedent and
theory and orthodoxy. And Obama's FTC said, oh, we're going
to let them walk. What they were doing was just

(22:03):
continuing a grand tradition of Bush one, Bush two, Reagan, Carter, Clinton,
right like this was the way that things were done.
It was an absolutely wasted opportunity. But it's more or
less what Trump did afterwards, and it's what we did
all the way up to the Biden administration, which for

(22:24):
reasons I think that have nothing to do with Biden's
own ideology, but have everything to do with the way
that he operates as a party power broker, where he wants
to give everyone a little something. We got a
handful of appointees at the tops of some of the
most important agencies that really oriented themselves towards smashing corporate power.

(22:46):
Lina Khan at the Federal Trade Commission, Jonathan Kanter at
the DOJ Antitrust Division, Tim Wu was in the White
House for several years, and so on. And these people
have really transformed things. We've seen more policy transformation in
the last four years than in the last forty. It's
been head-spinningly great. It's wonky, it's hard to
get your head around. It's very structural, right, it's about

(23:09):
changing the rules of the game, which takes a while
to show up in the game. Right, But boy, oh boy,
it's pretty substantial. Do you think it could go anywhere, though?

Speaker 2 (23:18):
Do you actually? We talk about hope. Should we have
hope from this?

Speaker 1 (23:22):
I mean, it's very, it's very tenuous. It
has been great, you know, Lina Khan in particular. I mean,
I wish the administration would do kind of a better
job of touting its wins, because all this stuff is
just so popular. What Lina Khan has done in awakening sort
of the FTC from its kind of coma.
Yeah, its coma, its decades-long coma. And then

(23:45):
attacking some, I mean, who doesn't hate Ticketmaster? Like, who
doesn't, you know, who loves logging onto the Internet
as it is today? Like, these are entities
that are popular to go.

Speaker 3 (23:57):
After. Junk fees, non-competes.

Speaker 1 (23:59):
Yeah, or Amazon, which has recently turned so hostile to
its own, you know, third-party vendors and small businesses
that it relies on to operate, that it's strangling them.
This is a perfect example of what the you know,
the Democrats and Biden are supposed to be all about
standing up for, you know, the little guy, the
small business owners, you know. And I think the problem

(24:21):
with all of these gains is that they're just tethered
to an administration.

Speaker 3 (24:25):
Right, so if.

Speaker 1 (24:26):
If Trump wins, I mean, some of them, you know,
some of them could be more durable.

Speaker 3 (24:32):
They could be. But I mean, Lina Khan is not
going to be the head of the FTC if Trump wins.
They're already coming for her. Yeah.

Speaker 1 (24:39):
Well, I mean, Democratic donors are coming for her too,
because tech companies don't like Lina Khan either.

Speaker 3 (24:44):
So you have guys like, I think it was, is
it Reid Hoffman? Yeah. Oh god. But that company that
just turned down a thirty-six billion dollar acquisition offer
from Google did so because they didn't want to go
through that antitrust hell. Wiz, it is, the security company. Yeah.
And so, you know, you have had, if nothing else,
four years of companies not being acquired and instead trying
to be product-led and make a thing that people

(25:06):
want that they can sell for more than it costs
to make, which is, you.

Speaker 2 (25:09):
Know, that's an insane I don't.

Speaker 3 (25:13):
I was raised by socialists, that's right, and even I
can recognize that this is what capitalism is supposed to look.

Speaker 2 (25:19):
Like. My dad ran part of the NHS, like, I'm
the same way. But also, this is also something that
really fucking bothers me about tech. You showed me your
Framework laptop beforehand, and for the listeners, the Framework is
a laptop that you can actually effectively rebuild, replace the
motherboard and the screen and all these things, as a
regular user. There is still cool shit happening. But also,
I assume that company makes more money than it spends

(25:40):
on its stuff. Tech can do this, and I feel
like the commons, there's this weird paradox at the moment.
We've got tech saying, well, it's actually, these companies should
be able to, to quote Ilya Sutskever, formerly of OpenAI,
scale in peace. And by
the way, Ilya, if you somehow hear this, you will

(26:01):
never have a moment of peace. As long as I
have a microphone. Now that you said that, I'm going
to be watching you, I'm going to be popping out
of the toilet like the Skibidi meme that the children
look at. But the thing that frustrates me is they
want this world where they can burn as much money
as possible, but also the biggest of them can print money,
prints so much money, more money than any company should

(26:21):
have while also providing their services. And it's just, how
is it? How do they not want to be the
company that just makes more money than they spend, that
people like?

Speaker 3 (26:32):
And is it just that they don't give a shit?
Is it? Is it Sundar Pichai and his
ilk just don't care? Well, I think that what you're
looking for, the phrase you're looking for here, is what Lina
Khan calls too big to care. Right? If you think
back to the moral philosophers of capitalism, who were, you know,
writing at the time of the Luddite uprisings, and
they were saying, look, the thing that a

(26:54):
profit-driven economy as opposed to a rent-driven economy
will get us is a world in which, instead of
the lord knowing that the peasants will have to bring
in the harvest every year because they're legally obliged to
they can't leave the land, the lord, the capitalist will
have to mobilize capital, bring free labor into contact with it,
alienate them from the surplus of that labor, and watch

(27:18):
over their shoulder all the time for someone who shows
up with a better machine, a better process, a better
pitch for their workers. You know, if you own the
building that a coffee shop is in, and that coffee
shop goes bankrupt because a better coffee shop has just
opened across the street, you're fine. Actually, you're great, because
now you have an empty storefront. You could rent on
the same block as this new hot coffee shop. Right,

(27:38):
If you own the coffee shop, if you're the capitalist as opposed to the rentier, you're fucked. And so every capitalist aspires to being a rentier, right? They don't want to operate in an environment in which they're like the fastest gun in the West, who's always waiting for that kid to show up and say, "Draw." Brian?

Speaker 2 (27:56):
Actually, bridging off of that, I think it's time to ask the, honestly, very basic question: what did the Luddites actually ask for? Because I've heard this from people who do not like this term and people who do. But explain it for the listeners, because you hear this phrase, Luddism, mentioned quite a lot.

Speaker 1 (28:10):
Yeah, well, you know, so there's no enduring Luddite manifesto where all the Luddites, and the people who were out on the streets who were part of the campaign, you know, of the eighteen tens, were organizing basically against this new

(28:31):
class of entrepreneur that was rising up to basically exploit their labor: to use machinery to divide labor, to use children to run it, to force them to compete, and to drive down their wages. So the Luddites are basically against all of that. They were...

Speaker 3 (28:51):
Truly, they were.

Speaker 1 (28:54):
In many ways, you know, a reactionary force. They had
their back against the wall. They weren't a pro active
political party or anything like that. So you can't necessarily
say every Ledite believe this or this is what they wanted.

Speaker 3 (29:06):
But there are a number of things that.

Speaker 1 (29:08):
are clear about what they were seeking, and you can really boil it down to, basically, a seat at the table.

Speaker 3 (29:17):
This was a profoundly.

Speaker 1 (29:18):
anti-democratic way to develop and deploy technology that they were experiencing, and they went to great lengths, even before they became Luddites, to point this out. They went to Parliament and said: these new machines are being bought up by this new rising class of capitalists. They are being staffed by either migrant workers, or children, or precarious

Speaker 4 (29:40):
workers. Napoleonic war orphans, right? Exactly. They literally would pipe in orphans from London into these factories to staff them, and they wouldn't pay them until they turned eighteen. And so the Luddites, or the people who became Luddites, would point this out, and they would ask for some very basic, very common-sense reforms, asking for minimum wages,

(30:02):
protections from fraud. And that went on for ten years, and they got laughed out of Parliament time and time again, until any regulations that did govern the trade were thrown out altogether. And so when the Luddites become Luddites, they're basically protesting all of the above: the fact that this was a profoundly undemocratic process, that they are being

(30:23):
forced to either succumb, to go work in the factories, or to give up, which wasn't an option, by the way. This isn't a diverse economy like the one we recognize today. Right, exactly: learn to code.

Speaker 1 (30:39):
You can't even learn to code. You grew up in a weaving town? Guess what, your job's pretty limited to weaving. So these are people who were remarkably at the whims of the market. And when the market was as poor as it was, and these capitalists were in control of it to the extent that they were... you know, they tried to reason with a

(30:59):
lot of these factory owners. They tried before it came to what it came to, when they organized this guerrilla rebellion that was incredibly popular. This was the new Robin Hood movement, basically. And what they wanted was, yeah, a seat at the table. They opposed the "machinery hurtful to commonality." That was the line, wasn't it?

Speaker 2 (31:20):
Seems like it wasn't just that, because the common misunderstanding of Luddism is that they just wanted to destroy the machines because they didn't like them. And it sounds like, I don't know, if the market forces had given a seat at the table to the workers, they might not have fucking rebelled.

Speaker 1 (31:33):
Yeah, and I've heard you, Cory, make this observation too. Again, if, like, someone had come to town
with the machine and said, all right, let's figure this out.
Let's get together, like, how can we all benefit from
this machine?

Speaker 3 (31:44):
I saw it over there.

Speaker 1 (31:45):
I bet the whole town could really benefit from having
access to this machine that makes our working lives a
little bit.

Speaker 2 (31:50):
Maybe we make the machine better if we talk to the people who'd use it.

Speaker 1 (31:53):
Yeah, maybe we can all make a little bit more money together. You still have yours, if you're the, you know, merchant capitalist who was, before, kind of the middleman in charge of distribution.

Speaker 3 (32:04):
Maybe this will make you more money too. But it
didn't work like that.

Speaker 1 (32:08):
It always developed where either somebody would come in from out of town with a lot of money and set up something truly enormous, or a factory owner, or a prospective factory owner, would recognize the productive capacity of these new machines and then try to build them for personal gain. And this whole process

(32:32):
really was deeply conflicting to the vast majority of factory owners too, because they saw a handful of people using the machines this way, in a way that at the time was very clearly corrosive to the social contract. Right? If you have one person saying, I'm gonna use this machine, I'm gonna hire kids to run it, I'm gonna profit as much as I can, drive prices as low as I can to beat these guys, you're fucking

(32:54):
up the entire town economy, you're fucking over your neighbors. It's a disgusting thing to do, and you were hated for it. So the people who did it were willing to be hated. They were the ones who were willing to take that ire, and few were. Some people walked away from the trade rather than, you know, automate their machines, and some, you know, sided with the Luddites and

(33:16):
were quietly kind of cheering, like, yeah, go smash those guys' factories. But by and large, you know who backed them? The state. Once the state threw its lot in with the factory owners and backed them up with arms and with punitive policies, it was, you know, pretty clear.

Speaker 3 (33:31):
I was going to... I want to point out two things that I think are really important that Brian omitted here. The first one is that Brian wrote the definitive book about this. Yes, it's called Blood in the Machine. You should read it. It's very good. And the second one, which I think you skipped over maybe too quickly, is that they already had the legal right to a seat at the table. They did. There were laws in Parliament that gave them that right. The firms involved were

(33:53):
moving fast and breaking things, and the Luddites went to Parliament and said, we would like the law obeyed, right? And this is an early example of Wilhoit's law, you know: that conservatism consists of exactly one proposition, that there should be powerful people whom the law protects but does

(34:13):
not bind, and everyone else whom the law binds but does not protect. Right? And this is what the Luddites were actually... you know, the thing that drove them into the streets was that they watched these people effectively stealing from them, and Parliament going along with it, and they were like, well, I guess we just got to do some self-defense here.

Speaker 2 (34:31):
And I think you can draw a direct fucking line from the Luddites to people who would call someone like me, or someone better, like Molly White, a cynic. And I think it's this thing where people like everyone at this table, Molly White, David Gerard when talking about cryptocurrency, Gary Marcus talking about AI,

(34:55):
Roger McNamee talking about everything, these people are not cynics or critics. I guess they are critics, if you just frame any criticism as being a critic. What they're doing is saying, hey, can we apply very basic fucking societal structure to this stuff that is thieving, that is burning things?

Speaker 3 (35:11):
Hey, do we want this thing that seems.

Speaker 2 (35:14):
to enrich, in the case of cryptocurrency, a very small and shadowy group. And the ones who aren't shadowy are very annoying. And these people, like the Winklevosses, who lost a billion dollars of their customers' money... that's fine. But when you talk about this stuff, it's: well, you just don't like technology, you just don't want progress. Motherfucker, none of

(35:35):
this is progress.

Speaker 3 (35:37):
From the very beginning, this was the goal.

Speaker 1 (35:39):
It was clear to anybody in eighteen twelve, when the Luddite movement was at its apex, that this was a popular movement. People were coming out in the streets to cheer the Luddites, not the factory owners, because they felt solidarity with them. Yeah, because they saw the shape that industrial capitalism was taking. They knew who the winners were going to be, they knew who the losers were going to be, and they cheered the Luddites on.

(36:01):
And so when the state comes out and has to make its case against the Luddites, it has to demonize the Luddites, it has to try to win, and it does so, again, by making frame-breaking a capital offense. You break a machine, you can get killed, you can get hanged.

Speaker 3 (36:17):
Oath-taking too, right. Oath-taking too: saying that you were a Luddite became a hangable offense. You could get hanged.

Speaker 1 (36:23):
That's why there are no real records left of, sort of, their meeting minutes or anything, and any letters between them are very few and far between.

Speaker 3 (36:32):
Yeah.

Speaker 1 (36:33):
Classic. And then they mobilized troops. There were more troops in England fighting the Luddites than there were fighting Napoleon.

Speaker 3 (36:39):
They brought them back from the front, right? They brought them back from France to fight the Luddites.

Speaker 1 (36:43):
In the tens of thousands at one point. So you could literally see the lines drawn. Industrialization was very literally almost a civil war, and the state is making the case against the Luddites. And it's propaganda, you can call it, it's propaganda at the time, in its proclamations and its declarations. It's casting the

(37:05):
Luddites as backwards-looking, as deluded, as malcontents. You know, they must be... they're smashing the thing that...

Speaker 3 (37:15):
Part of their job.

Speaker 1 (37:16):
Yeah, they're smashing, they're smashing progress. This form of criticism is already there. So when the Luddites do lose, when they are hanged en masse, that language is used in the trials against them. And we can see why: because the state needs to have this enemy. They have to be able to paint this portrait of anybody who's

(37:37):
against the shape that technological development is taking, anyone who's criticizing any element of that. Like, wait, are there too many losers or winners? No, shut up, you're a Luddite. Get out. You stand down, or we will hang you.

Speaker 2 (37:52):
And I think this is a really... I'm so glad you went into this specific thing, because it really feels like now... I don't know if the state is... I think the state's laziness, or, like, a lack of focus, is the problem. It's not that the administrations are saying people who don't like AI are stupid, or people who don't like crypto. They're just like, I don't really get this, and these people have got so much money.

(38:13):
And I love money. I love getting paid, I love being lobbied. This is great. So I'm gonna go with the guys with the money. Whereas I think that actually the optimists, and I don't just mean Andreessen or Balaji, I mean the AI fantasists, all these people who claim that if you're against AI, well, you don't care about progress... I actually have a counterpoint, which is

(38:35):
they are against progress. They don't want anything new to happen. They want the next Google. And when I say the next Google, I don't mean Google Search, I mean the giant monopolistic entity that just eats people and shits out money. And I'm just gonna put it bluntly: they are the cynics. They are the pessimists. Like the Scale AI guy, like all of

(38:58):
these people who go around saying, well, if you don't believe we need to do this, we need to train as much as possible, that's what the future looks like... Fuck no. These people don't care about tech.

Speaker 3 (39:07):
Can I give you a little gloss on this from Molly Sauter, who's very smart about this and who was the first person I encountered to point this out? Which is that AI, and indeed all statistical methods of prediction, are intrinsically conservative, because if what you're doing is predicting what should happen next

(39:29):
based on what already happened, what you're doing is driving toward a future that looks like the past. So think of it this way, right? You pick up your distraction rectangle and you type "hey," and then it finishes it with "darling," because that's what you type every time. No matter how cool you might be, you might want to type "hey, asshole," but it wants you to type "hey, darling." Right? And now, if you type something you've never typed before,

(39:49):
it's going to pick the thing that most people say, right? So you're going to get either the statistically average version of who you used to be, or the statistically average version of who everyone else is. There is something intrinsically and inseparably conservative about the project of making the future look like the past. Now I want

(40:10):
to mention something that, again, I learned from Brian's very excellent book Blood in the Machine, which is that Frankenstein was a Luddite novel, and was widely understood as a Luddite novel. It's a cautionary tale about what happens when an entrepreneur gets to build a machine without talking to his neighbors about what it should do, and it was really widely understood as this. And the irony here,

(40:30):
and I say this as a fully paid-up member of the Science Fiction Writers of America and a science fiction novelist, the irony here is how many of these torment-nexus-building weirdos read cyberpunk as a suggestion instead of a warning, and defend what they're doing as carrying on the visionary tradition of science fiction. The first science fiction novel ever

(40:54):
was Frankenstein, and it was a novel about why you shouldn't just build whatever goddamn machine you feel like building.
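Cory's phone-keyboard example is, at bottom, frequency-based next-word prediction. As a toy illustration (hypothetical code, nothing like a real keyboard's model), a predictor trained only on your past messages can only hand your past back to you:

```python
from collections import Counter, defaultdict

def train(corpus):
    """Count which word follows each word across a corpus of messages."""
    follows = defaultdict(Counter)
    for message in corpus:
        words = message.lower().split()
        for cur, nxt in zip(words, words[1:]):
            follows[cur][nxt] += 1
    return follows

def predict(follows, word):
    """Return the statistically most common continuation, i.e. the past."""
    options = follows.get(word.lower())
    return options.most_common(1)[0][0] if options else None

# Your history dominates: you've almost always typed "hey darling", so that's
# what the model hands back, even on the day you meant "hey asshole".
history = ["hey darling", "hey darling", "hey darling", "hey asshole"]
model = train(history)
```

However the counts are weighted, the most common continuation wins, which is the "statistically average version of who you used to be" that Cory describes.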

Speaker 2 (41:00):
What I like about Ready Player One... first of all, one of the worst books ever written.

Speaker 3 (41:04):
Be nice to Ernie. I like that book. It's very fun.

Speaker 2 (41:11):
I believe it is. We are not going to turn this into a debate on that book. I'm in so much trouble.

Speaker 3 (41:17):
I like your book, Ernie. Come on the show. You'll kill me.

Speaker 2 (41:22):
But the thing is, with that, that book is not a positive book. No, nothing about that story feels good. But also, the central plot is this shanty town this child escapes from every day, and then, using his distractions from his terrible life, he is able to decode the thoughts of a deeply troubled, mentally unwell man as he

(41:46):
is chased by greedy people who don't really care about any of it, and all of them are obsessed with not the interest in the stuff, but the existence of the stuff. And Mark Zuckerberg looks at that and goes, that's actually the single best thing. I'm going to build it.

Speaker 1 (42:02):
That's... this is just what Cory was talking about, right? It speaks to the conservatism of a lot of these tech executives once they've built a company that's big enough, where, you know, you're casting around for your next thing because you don't actually want to do any, you know, real innovating yourself.

Speaker 3 (42:17):
You outsource it to a sci-fi novel.

Speaker 1 (42:20):
That's why we've seen this sort of generation of wish lists from cyberpunk, from the metaverse. I mean, that was just Mark Zuckerberg saying, hey, my next thing is the metaverse.

Speaker 3 (42:36):
And that comes from... your future is being a legless, sexless, low-polygon, heavily surveilled cartoon character in a virtual world I stole from a twenty-five-year-old dystopian cyberpunk novel.

Speaker 1 (42:44):
Exactly. Elon Musk, the Cybertruck, that's from Blade Runner. For some reason, it's from Blade Runner, which is dystopian, by the way. Or now OpenAI, which says, oh, you know, it's gonna be like Her, the movie Her. They're just saying it like that, right? Like, that's sort of the extent of the imagination at work here. And I do think that that is, like, an inherently
And I do think that that is like an inherently

(43:05):
sort of conservative and restrictive way to operate, and, you know, we're seeing the limits of it now. I don't even know if they actually want to build these products, or if they just want to win a hype cycle; it's a way that they can get people to talk about it.

Speaker 3 (43:20):
I just want to call out that Elon Musk is eerily similar to F.W. de Klerk.

Speaker 2 (43:26):
Well, if you're a legless, sexless creature and you feel your life's restricted, perhaps you could check out one of the following products. And I guarantee you the following products will not in any way embarrass me or have some kind of almost ironic connotation with what they run next to. So please enjoy the following ad. And we're back. So I think it's

(43:55):
funny you brought that up, Brian, where it's like, it's not even obvious that they care about what they're building. It's not even about the stuff anymore. And when I look at a company like Google or Microsoft... what do they do? Like, if you really sit down and say, what does Google do? What does Microsoft do? You can look at Apple and say they make consumer electronics.

Speaker 3 (44:17):
That's fair.

Speaker 2 (44:18):
I think that that's a fair thing. They make laptops, they make phones, you use them, and sometimes they force you into an abusive system called the App Store. But nevertheless, you can tell what they do. Google and Microsoft are like holding companies. They're private equity firms that have no...

Speaker 3 (44:34):
Like formal shape.

Speaker 2 (44:36):
Because this whole week I've been sitting down and reading about something we will get to in a minute, which is how OpenAI can compete with Google Search. And by the way, they cannot. It's impossible. It's actually impossible. It just feels like it's not that they don't have focus, it's that they can't have focus. There's simply nothing... they can't make anything anymore. They can make more similar stuff,

(44:58):
but when was the last time Google came up with something truly innovative?

Speaker 1 (45:03):
Yeah, I mean, you can kind of tell by the rush and the desperation to, sort of, build some sort of AI product that can work, or that can at least capture some attention. Because it's not... I mean, this AI Overviews thing that they rushed out the door, and were just ruthlessly and widely and deservedly mocked for making, you know,

(45:23):
it's not even clear, I think, to anybody how that serves Google's bottom line. If anything, it loses them money, because they have, ostensibly, this remarkably potent revenue generator in Google Search. I mean, that's the only real answer to your question, I think. What does Google do? It runs a search engine that

(45:44):
it has gotten tired of running, or its backers and partners have gotten tired of running, or people are convinced that it's no longer the future and that it's not going to be useful to anybody. But I mean, to me, the search engine is still far superior to using ChatGPT or any generative AI product that I've come across yet. And, you know,

(46:07):
the talent, as I understand it, at Google has been shifted away from running the search operation, the search department. Now it's all on AI, which is, again, this nebulous form. I mean, Google famously sounded this code red that said, all hands, we've got to catch up to OpenAI, we've gotta

(46:28):
figure out a way to make this happen. And we just see this sputtering, all these little false starts and turds rolled off into the Internet that are more mocked than used, and certainly not profitable.

Speaker 3 (46:41):
I want to try and make the heroic case for Google, okay? Which is a tough case to make, but I'm gonna make it. So, Google... I think you asked, when was the last time Google made something truly innovative? I think it was twenty-five years ago. Google has not made a successful product since Search. They've bought a lot of successful products, right? They bought their mobile stack, they bought their ad tech, they bought video.

Speaker 2 (47:00):
What about Go, for example? Wasn't Go fairly recent? The coding language.

Speaker 3 (47:06):
Their programming language, yeah, sure. I'm getting to that. Yes, they bought YouTube, right? Yeah, they bought YouTube. They bought YouTube after Google Video failed, so they made a video platform that didn't work, and they bought one that did. Great, right? They bought their ad tech stack. They bought their customer service. They bought Docs, they bought collaboration, they bought Maps, they bought all of it. Wait, where

(47:27):
did they buy Docs from? Who was... I forget what that company was.

Speaker 2 (47:30):
That's... I feel like I'm fairly well versed in tech. I thought they built it.

Speaker 3 (47:36):
No, no, they didn't build it. So, like, first, depending on how you want to parse it, right? Because you have things like their Hotmail clone, which was very successful. You know, Gmail, yeah, right. They made that in house, they didn't buy that from someone; however, it's a clone of Hotmail, right? Sure. And so they have these products. But what they

(47:58):
did for these products, and this is the heroic part, the part where I'm going to bend over backwards to be fair to them, is that they scaled them and they made them reliable. Right? That's crazy. And look, one of the things that we critique capitalism for is that it underinvests in maintenance, it underinvests in infrastructure,

(48:18):
it underinvests in what we could broadly call care work. And at its best, what Google did was, in a kind of patrician way, become a serial acquirer of nascent competitors. It didn't necessarily extinguish those competitors, although it did in some cases. But primarily, what it did was scale them up to utility scale, and then maintain them to a very high degree of reliability. Now, I

(48:40):
think that if you want to understand Google's internal culture, and I think you did a good job talking about it when you wrote "The Man Who Killed Google Search," it's that there's a distinction between people who want to do care work and people who want to do rent-seeking, right? And so, you know, there's a version of even keyword

(49:02):
advertising that I think we can tell a heroic story about. Right? If you are a novelist and you've written a science fiction novel, and it's like a novel by someone else, right, it's comparable to it, then you could buy the AdWord, and you could say, when someone searches for Neal Stephenson, you launch this guy's career by, at the top,

(49:22):
having one little ad slot that's like, have you read so-and-so?

Speaker 2 (49:24):
And so it's the foundation of, like, the good bits of capitalism, if you can call it that. Like the honest marketing: you like this, you might like this.

Speaker 3 (49:34):
Yeah, yeah. And so there's a lot, you know, in being able to reach into these niches. This is also a thing that Facebook has done quite well. And, you know, when Facebook has defenders, they are often small business people who could not address a market without fine-grained targeting. Now, that fine-grained targeting needn't be behavioral. It could be content-based. Right,

(49:55):
you could target... if you want to target plus-sized cheerleaders, you could find the Facebook group for plus-sized cheerleaders and sell your uniforms to them. And I'm sure that there are a bunch of people who are plus-sized cheerleaders who are really pissed that the uniforms for plus-sized cheerleaders suck, and who'd be really glad to see those ads. Right? You don't have to spy on them, you don't have to follow them around the web. I don't need to know who they are. Yeah. But that's what they've done

(50:17):
at their best, right? And the rent-seekers just want to make money by providing as little to everyone else as they can, and taking in as much as they can. They want to own a factor of production, and they want to be insulated from the risk, because other people who produce useful things will have to license or buy
(50:37):
that factor of production from them in order to do
the useful thing. But whether or not they do the
useful thing or not, Like think of Nvidia right in
videos selling video cards to people who are going to
go out of business. That doesn't matter to Invidio and
they already sold them gets a stake in Think about
Unity right, who said, oh, we're going to take not
We're not just going to sell you the dev tools
to make your video game. If your video games succeeds.

(51:00):
we're going to get a royalty for every game you sell, right? And they called it "shared success." Now, note that this rhetoric of, like, shared success is not the rhetoric of shared risk.

Speaker 2 (51:12):
Kind of like a parasite shares your success.

Speaker 3 (51:15):
Yeah. You know, shared risk, right? Shared risk, shared reward makes a certain degree of sense if they're saying, okay, well, I'll tell you what: we've got a deal where we will give you discounted or free tools and development money and whatever, we'll invest in your company, right, as a form of shared success, and we will also take a shared risk in your failure. That's a very different matter.

Speaker 2 (51:37):
Yeah, because shared risk is inherent to actual shared success.

Speaker 3 (51:41):
We are taking a risk.

Speaker 2 (51:42):
But it's almost... I think this keeps coming back to basic evolution, Darwinism, probably two different things. Now, I've said I did not do well in science, but nevertheless, I'll get around to this: there are no predators for these people anymore.

Speaker 3 (51:56):
There's nothing they're afraid of, no discipline. Like, it's the cartel thing: they want everyone to stay the same, so they know who they're looking at every day.

Speaker 2 (52:04):
We can bar anyone out of the area if we don't like them. We can make growing difficult. We know the venture capitalists, we know the commercial real estate owners, we know the people in government office. We can make this really difficult for them and really easy for us. But also, it feels like we need predators in the tech industry, and not the kind who worked at Google, like Andy Rubin,

(52:27):
and I think the.

Speaker 1 (52:29):
Well, and as long as you have a company, I mean, that has any, you know, degree of success, or that becomes as big as Google does, I think just, like, the base logics of capitalism are going to mean that you grow the number of people who are the rent-seekers, and who are interested in protecting those revenue streams at any cost. And you know

(52:52):
that the halo of a company like Google may dim,
it's still kind of extant.

Speaker 3 (52:57):
You know.

Speaker 1 (52:57):
I just wrote this article about a former Google privacy engineer named Tim Libert, who was a big security and privacy guy in the academic field. He went to Penn, he did time at Oxford, and he was a professor at Carnegie Mellon. And he really, you know, was

(53:18):
really a big critic of Google. And then Google hired him, and he agreed to go because he thought, well, all the things that I'm talking about, all the things that I want to fix, Google is the largest purveyor of these issues of privacy. He looks at cookies and the way that Google tracks your data. So he figured, you know, if I'm going to have a shot at fixing this, I'm just going to go into the mothership.
(53:40):
And sure enough, the way that he describes it to me, his job was split between two basic tasks. One, he's talking to other, younger hires who still kind of believe in the idea of a technology and a service that can really help and serve people, and who want to try to make sure that Google, you know, meets its privacy commitments and is in line with the law.

(54:02):
And then he would go into the meetings with the executives and management, and they would spend most of the time telling him why they couldn't do all the things that he, and all of the people who were actually working in the trenches at Google, wanted to do. And it just became this two-year-long clash of: well, there's this privacy law, don't we have the tools

(54:23):
and the capabilities to fix it? Don't we want to do it? And then butting up against management.

Speaker 3 (54:27):
Why would we do that? Why would we do that?

Speaker 1 (54:30):
It would be easier just to, like, pay the fines as they come than to actually fix it. The fines are a price, and...

Speaker 3 (54:35):
They need to.

Speaker 2 (54:36):
I feel like they really need to change the fines so they scale, sure. Or here's my other thing, and Daron Acemoglu said this as well: then take it a step further, literally put them in jail. And I'm only kind of kidding. I mean, for example, the CrowdStrike situation. I brought this up on the last

(54:56):
Better Off Live as well: Satya Nadella should be looking at potential jail time, or the CEO of CrowdStrike should be looking at jail time or personal liability. If you work for a public company with excess revenue, blah blah, and you make it so that they can't just do stock splits or some weird shit so they get away with it including stock revenue, you should potentially forfeit like ten percent.
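Fines that scale, as Ed describes, are simple arithmetic. A hedged sketch (the percentages and dollar figures here are made up for illustration, loosely shaped like a greater-of penalty structure):

```python
def scaled_fine(flat_cap, pct, annual_revenue):
    """Return the greater of a flat cap or pct% of annual revenue,
    so the penalty grows with the company instead of staying a rounding error."""
    return max(flat_cap, annual_revenue * pct // 100)

# Flat $10M floor vs. 4% of revenue (illustrative numbers only):
print(scaled_fine(10_000_000, 4, 50_000_000))       # small firm: 10000000
print(scaled_fine(10_000_000, 4, 200_000_000_000))  # giant firm: 8000000000
```

The point of the `max` is exactly the one made above: for a small firm the flat number bites, while for a giant the percentage does, so the fine can never be "just a price" that shrinks, relatively, as the company grows.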

Speaker 1 (55:19):
You know, it's funny that you say this, because the other part of the story is: Tim built this system called webXray that can search for every privacy violation that a company is committing on the web, and the idea is to make it into, like, a sort of assembly line for class action lawsuits. So you can say, okay, you know, the approach of people emailing Google and saying, hey, you're
maybe Google, maybe people emailing Google and saying, hey, you're

(55:41):
in violation of Germany's new privacy law, can you please address this? That isn't really working. But if you can say, okay, boom, here's a search engine that can show you exactly how many times a minute, or an hour, it's violating these laws, regularly...

Speaker 3 (55:56):
Kind of like lawsuits that have costs, you know, attached to them.

Speaker 1 (55:59):
Then maybe you can... so there are people, sort of, like, fighting against this.
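A minimal sketch of the idea behind a tool like webXray: flag the third-party hosts a page pulls resources from. (This is an illustrative toy, not webXray's actual implementation, which drives a real browser and matches hosts against an ownership database; the page and domains below are invented.)

```python
import re
from urllib.parse import urlparse

# Naive asset-URL matcher: absolute http(s) URLs inside src/href attributes.
ASSET_URL = re.compile(r'(?:src|href)\s*=\s*["\'](https?://[^"\']+)["\']', re.I)

def third_party_hosts(html, first_party):
    """Return hosts the page loads assets from that aren't the first party."""
    hosts = set()
    for url in ASSET_URL.findall(html):
        host = urlparse(url).hostname or ""
        # Skip the site's own host and its subdomains (e.g. a first-party CDN).
        if host and host != first_party and not host.endswith("." + first_party):
            hosts.add(host)
    return sorted(hosts)

page = '''<script src="https://example.com/app.js"></script>
<script src="https://tracker.adnet.example.net/pixel.js"></script>
<img src="https://cdn.example.com/logo.png">'''
```

Run against a crawl on a schedule, a list like this becomes the "how many times an hour" evidence Brian describes, rather than a one-off complaint email.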

Speaker 3 (56:03):
This is maybe a good time to talk about the structure of regulation: which regulations work and which ones don't, and why, and so on. So one of the key things to understand about regulation is that a regulation that's hard to administer will be administered less. Right? So if you make a rule, for example, that says, if

(56:24):
you're a platform that competes with the platform users, so
your Amazon, you sell your own goods, but you also
sell goods by your competitors. As a fairness requirement, you
must put the best match for the user's search at
the top and not just your own products. So this
is what's called a ban on self preferencing. The problem
is to administer that, you have to be able to

(56:46):
kind of peer into the mind of the Amazon executive
who decided to put the Amazon product at the top
of the search, because if they believe that they made
the best product, then they haven't violated the rule.

Speaker 2 (57:00):
And I was going to say, it feels like the
word best is the problem then.

Speaker 3 (57:03):
Right. So it's a very difficult thing to discover. Moreover,
you might have a sincere but incorrect belief about what's best,
because we're really good at talking ourselves into thinking that,
you know, the stuff we make is good.

Speaker 2 (57:16):
And being empathetic — maybe you truly have seen more of
this thing being built. Like, yeah, that's what they'd say.

Speaker 3 (57:22):
Yeah. So you might have a good reason to do it,
and you still might be wrong. So as an administrative matter,
figuring out whether or not someone is self-preferencing is
really hard. Now, this isn't the first time this has
come up. When the Sherman Act was passed in eighteen ninety —
the first antitrust law in the United States — self-preferencing
was a huge problem, because you had railroads that
owned freight companies, and they competed with the freight that

(57:43):
they shipped. Right? You had banks that hadn't been split,
where you had retail and investment, so you had banks
that loaned money to local businesses and owned local businesses
that competed with their debtors. Right? And so we didn't
create a rule that says, okay, you must treat everyone fairly,

(58:03):
because determining the fairness was too hard. What we said
is: you have to be structurally separated. You either are
a bank or you're an investor, but you can't be
a bank that invests. You are either a railroad
or you're a freight company, but you can't be a
railroad that owns a freight company. This is called structural separation.
And the thing is, you do lose some efficiencies — maybe

(58:24):
there's some cool things you can do with vertical integration —
but what you get is an administrable rule. Because I can
tell really easily whether a railroad owns a freight company;
I can't tell whether they're dealing with it fairly.
You can tell from orbit whether a railroad owns a freight company.

Speaker 2 (58:42):
So just imagine you're a dummy like me — it's pretty easy.
Yeah. So why has this not been
done with, I don't know, Microsoft and OpenAI, for example?

Speaker 3 (58:54):
This is very interesting. So think about the ad tech
platforms. Ad tech is split into three pieces: you have
a demand-side platform, a sell-side platform, and a marketplace.
The demand-side platform represents advertisers; the sell-side
platform represents publishers, who have places where the advertisers can place ads;
and the marketplace is where the agents for them
gather together to bid. And so you land on a

(59:15):
web page, and there's that lag before it loads. That surveillance lag
is all of these little auctions taking place, where someone says,
you know, I have an eighteen-to-thirty-four-year-old
manchild from Queens who has been recently searching for
gonorrhea and owns an Xbox: who will pay to advertise
to them, and how much will you pay? Right?
And so there's this little thing going on in the background. Right? Okay.

(59:37):
So Google and Facebook each own a demand-side platform,
a sell-side platform, and a marketplace. And moreover, they
have rigged the game so that if you use one
part of it, you pretty much have to use the
other parts. And they're both publishers and they're both advertisers. Now,
fifty-one percent of every advertising dollar goes to
Google and Facebook. The historic share for all intermediaries

(01:00:01):
in advertising-supported media was fifteen percent, so they've tripled
the share for intermediaries. So this is like
if you and your partner were getting divorced and you
showed up and you realized you both had the same
lawyer who was also the judge, who was also trying
to match with both of you on Tinder, and then
after the divorce was settled, you discovered that you didn't
get the house. Your partner didn't get the house. The

(01:00:21):
lawyer got the house. Right? This is the thing. So
last year we had a bill introduced in Congress called
the America Act. And one of the great tragedies of
American politics is our brightest political science graduates go to
work in the Senate, where all they do is come
up with acronyms for acts. So, you know, the America
Act stands for something — the A is for America, the

(01:00:42):
country that I love; M is for motherhood, the value
I admire. So the America Act says to the ad
tech platforms: you have to split, you have to be
structurally separated. You can represent sellers, you can represent buyers,
you can be a marketplace, but you can't be all three.
And the two main co-sponsors of the America Act
were Elizabeth Warren and Ted Cruz. Right — wow. That is

(01:01:04):
a pretty bipartisan bill, because even someone as terrible as
Ted Cruz can look at an arrangement in which someone
represents the buyer and the seller and is the marketplace,
and is capturing three hundred percent of the traditional share for
representatives who perform this function, and say: this market cannot
be fixed with a code of conduct. This market can

(01:01:25):
only be fixed structurally.
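[Editor's note: the auction flow Doctorow describes — demand side, sell side, marketplace — can be sketched as a toy second-price auction. The bidder names and dollar amounts are invented, and real-time bidding systems are vastly more complicated than this.]

```python
# Toy sketch of a real-time bidding auction: a sell-side platform offers
# an ad slot, demand-side platforms bid on behalf of advertisers, and the
# marketplace clears the auction. Names and numbers are illustrative only.

def run_auction(slot: str, bids: dict[str, float]) -> tuple[str, float]:
    """Second-price auction: highest bidder wins, pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

# Dollars-per-impression bids from three hypothetical demand-side platforms.
bids = {"dsp_alpha": 2.50, "dsp_beta": 1.75, "dsp_gamma": 0.90}
winner, price = run_auction("homepage_banner", bids)
print(f"{winner} wins at ${price:.2f}")
```

The structural complaint in the episode is that when one company plays the role of `dsp_alpha`, the sell side, *and* the marketplace running `run_auction`, no outside party can verify the auction was fair — which is the argument for splitting the roles.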

Speaker 1 (01:01:28):
Or at least he recognizes the politics of taking on big tech,
right, as something that is applicable to his constituents. Right?
Because, you know, I don't know that most
Texans necessarily care about all that,
but they do care about fairness. They do care about
the fact that big tech has gotten so powerful.

Speaker 3 (01:01:47):
It is really interesting that we've come to the place where
it has become a truly bipartisan issue. But look,
nothing moved out of the Senate last year.

Speaker 2 (01:01:56):
So how the hell do we move this shit forward — like,
to the real meat of it?

Speaker 3 (01:02:04):
What do we do? How do like?

Speaker 2 (01:02:06):
Maybe it's not us. My editor Matt Hughes,
he suggested an idea. I can't do this, because it would make
me morally corrupt, but I'm just saying: American software patents, right,
they're extreme. And — this is Matt's idea —
there are a lot of patents you can file; patents
for anything.

Speaker 3 (01:02:24):
I'm not saying you do this.

Speaker 2 (01:02:25):
In fact, I know a lot of my listeners: right
now you're holding up a drugstore, you're working on
a new kind of tasteless poison, or you're just
regular stealing — it's a Friday, it's an easy day for you.
Here's an idea: surely someone could just anticipate the things
that OpenAI was going to do in the future
and just start filing patents, and then use those patents

(01:02:46):
to mire them in endless bullshit lawsuits. Not saying
they should do that. But if you're not doing that,
how do we actually break these people up? Is there
a way for regular people to contribute towards this actual change?
Because it feels like relying on the government is the problem.

Speaker 3 (01:03:05):
Well, let me tell — yes, sorry — let me tell you
a story.

Speaker 2 (01:03:07):
I don't mean that in, like — by the way — the
ancap way, like we can't rely on the government;
it just doesn't seem like the levers of government are moving.

Speaker 3 (01:03:13):
Let me tell you a story, yeah, about
a man who was in the wilderness, a man who
had ideas that everybody thought were crazy — a man called
Milton Friedman. Right — oh, Christ. In the nineteen sixties, Milton
Friedman had this idea that all of the gains of
the Great Society and the New Deal should be rolled back.
Now, there were problems with the New Deal and the
Great Society — they excluded Black people, they excluded gay people,

(01:03:34):
they excluded women to a large extent, and so on.
That wasn't Milton Friedman's problem with it. His problem with it
was who they had included, not who they had excluded. And he
was like, we need more forelock-tugging plebs, right — like,
you shouldn't expect to have a dignified retirement, you shouldn't
expect that your children will go to school for free,
and so on. And people would say, Milton, people hate that shit —

(01:03:54):
how are you going to make that shit happen? And
he would say that in times of crisis, ideas can
move from the periphery to the center. So our job
is to keep ideas lying around, so that when the
crisis arises, we can move them into the center. And
that crisis was the oil crisis of the nineteen seventies.
So I like to quote Milton Friedman, because I like
to imagine that he looks up from that spit that's

(01:04:16):
protruding from his mouth and gargles a curse as he
hears his words. Him and Jack Welch — and, you know,
the demons laugh and they throw more molten shit on
his naked body. But, you know, if there's one thing
that Milton Friedman has bequeathed us, besides this theory of change,
it's an abundance of crises. Right? We have so many

(01:04:37):
foreseeable crises on our horizon — crises of technology, climate crises,
economic crises, legitimacy crises for the Supreme Court, and so on —
and having these ideas lying around, being part of
a group of people who say, look, right now this
is impossible, but just
because it's hard, it doesn't mean that we should do

(01:04:59):
something easier if it won't work. You don't look for
your keys under the lamppost. I get it — if
you wanted to get there, I wouldn't start from here.
But this is the thing we have to do. And
if we keep talking about the thing we have to do,
then when the crisis comes and someone says, Jesus Christ,
we've got to do something here, we can say: here's
the thing we have to do.

Speaker 1 (01:05:16):
So it starts with breaking them up, then. I mean,
I think both Cory and I would probably agree
that that's pretty high on the priority list. It makes a lot of things easier,
and it would make a lot of things better. And
it's something that's also within the realm of
possibility, for a lot of reasons. Again,
it might be a flimsy bipartisanship right now,
but it is a bipartisanship, nominally, and so it's

(01:05:39):
something that's actually more imaginable than a
lot of other things — and it's happened before, with the Bell System. Yeah.
And, you know, I think there's a lot that you
can do. I think, as we've seen
the tech industry's wealth concentrate and the industry sort of calcify,
and each of those things that Cory mentioned earlier on gets sealed off

(01:06:02):
to most people — well, you see more anger, and you
see an industry that doesn't historically have a lot of
interest in organizing, or in activism, becoming much
more so. So I think that is another area that's
ripe for action. You probably have a lot of listeners
who are in the tech industry or related to the
tech industry — and outside Google, that might be the mechanics, right,

(01:06:26):
they might already be unionized — but they should.

Speaker 3 (01:06:31):
But organizing, I think is a.

Speaker 1 (01:06:33):
Really really ripe area where you can really start to
move the needle and how.

Speaker 2 (01:06:39):
Can software developers actually unionize? Yeah, they can.

Speaker 1 (01:06:43):
So — one of my organizing efforts — I was
at Medium, and Medium was, at the time —

Speaker 3 (01:06:49):
They've since jettisoned all of their editorial employees.

Speaker 1 (01:06:52):
Like me. So there's none — none of those left. They did it
like three times as well. Yeah, I was, I think, the last —

Speaker 3 (01:06:57):
Time the third wave Choral bleaching. Yeah, they just launch
them out of a cannon everything once in a while.

Speaker 1 (01:07:04):
But it was half editorial and half tech,
and there was genuine and, uh, you know, really
animated interest in organizing, and we got so close —
we got within one vote. And for them, it wasn't about —
you know, I'm sure they wanted their salaries to be a little bit higher

(01:07:25):
or whatever, but that wasn't the driving, uh, motivating factor.
It was that they didn't like a lot of the direction
that Medium was going. They didn't like a lot of
the policies that Medium had. They didn't like a lot
of, uh, you know, the power
that the founder had to wield over the entire platform.
And organizing is one way to sort of claw back

(01:07:46):
some of that power. I mean, in some cases, it's
almost like when you have a weird situation like Meta,
with Zuckerberg at the top as, like, the boy king who
is just going to rule it forever, barring —

Speaker 3 (01:07:57):
Some fireable offense. Right, exactly.

Speaker 1 (01:07:59):
Well, most tech companies are not like that, and
you can begin to, you know, wield more
influence internally if you join some of those efforts. There's
a lot of great unions and organizers working in that
space right now. So that's just one thing that you
can start to get involved in.

Speaker 3 (01:08:15):
Yeah — let me say, you know, you said, can
we do this? And you said, we have to break
them up. Well, the thing that stops us from
unionizing is also the thing that stops us from breaking
them up, which is that they're powerful. Right? And so
the reason that we practiced antitrust law the way
we did until the Carter administration was not just because
we wanted to preserve choice or competition. It was because
when companies get too big to fail, they become too
big to jail. Right? Once a company's more powerful than

(01:08:37):
the government — IBM, you know, was sued for antitrust
violations from nineteen seventy to nineteen eighty-two, and every
year, for twelve consecutive years, they spent more on lawyers
to fight the DOJ Antitrust Division than all of
the lawyers at the DOJ Antitrust Division cost, combined,
for all antitrust cases in America — for twelve consecutive years.

Speaker 2 (01:08:55):
There's no way to limit their ability to fight back
as well, because that'd be undemocratic.

Speaker 3 (01:08:59):
So once they're a monopoly, right — you know,
the best time to stop a monopoly is before it forms.
But look, before there is a law, there must be
a crime. Lenny Bruce points out that if
we have a law against fucking a chicken, someone must have
fucked a chicken at some point. Right? So
before we had antitrust law, we had

(01:09:21):
monopolists — right, that's where we got the antitrust
law from. And specifically, we had monopolists like John D. Rockefeller,
who is still the richest man that ever lived — the
most powerful, the most ruthless, terrible piece of shit — and
we brought him low. And the way that we
did it — it happened in living memory, like, nineteen twelve.
So there's people alive today who were mentored by the

(01:09:41):
people who did it, right, and then there's people alive
today who are those people's mentees, who are still working
and still on the job. Right? It's not like embalming pharaohs.
It's not a lost art from a fallen civilization. It's a
thing we know how to do.

Speaker 2 (01:09:53):
And yeah — how did we bring him low, though?

Speaker 3 (01:09:57):
It's a great story. So, a woman called Ida Tarbell was
the first woman in America to get an undergraduate science degree —
she was trained as a biologist. She was the
daughter of a Pennsylvania oil man who was destroyed by
the Standard Oil Trust — wow — and she went to work
for Collier's, which was one of the biggest magazines in America at the
time, and she serialized
a two-volume muckraking history that pitilessly laid out exactly

(01:10:22):
how the John D. Rockefeller scam worked. It had a
very good title — she called it The History of the
Standard Oil Company, Volumes One and Two. It was super viral,
and at the end of it, the inchoate political will
that we'd had — that said, like, something is really wrong —
turned into a very sharp political will that said: we
know how the scam works, and we demand that something
be done about it. And the Sherman Act, which

(01:10:43):
had been hanging around for twenty years, finally was taken
off the shelf and mobilized against John D. Rockefeller, and
we brought him low.

Speaker 2 (01:10:49):
So — I'm so glad you brought up that
story, which I just learned. First of all, I'm gonna
look at Cool People Who Did Cool Stuff — if Maggie's
not done that, she needs to hear this.

Speaker 3 (01:10:59):
And she then became a leading suffragist and was one
of the greatest orators of American history.

Speaker 2 (01:11:04):
Just — first of all, Ida Tarbell: original poster. Yeah,
like — genuinely, like, amazing. But also, I think
this speaks to something. I'm speaking to anyone who is
a reporter who's listening to this right now: there is
this common sense of, why are we doing this? Why
are we cataloging this? They're still gonna do these things —
why bother? And the answer is that fucking story. Hindenburg Research —

(01:11:29):
I don't know if I agree with all their methods,
but the fact is, we need more people out there.

Speaker 3 (01:11:34):
And I do this too. Well.

Speaker 2 (01:11:35):
You don't necessarily just write "these fuckers are bad." They are — right?
But you write why, and write it in detail, because it sounds like,
historically, that actually eventually works.

Speaker 3 (01:11:45):
You know, there's a phrase out of
the finance sector, MEGO. It stands for "my eyes
glaze over," and it's the idea that if you make
the prospectus thick enough, people will just assume that it's
good, because they can't possibly read it — like, there must be
a pony under it. Right? Unraveling MEGO — whether that's, you know, Margot
Robbie in a bathtub explaining synthetic collateralized debt obligations, or,

(01:12:07):
you know, my last couple of novels, The Bezzle and
Red Team Blues, which are about explaining how real-world private
equity scams, cryptocurrency hustles, and so on work —
is, I think, the first step on the path to
making people feel like they can see where the weak
points are in this very large but very
brittle edifice, and they can start kicking at the

(01:12:29):
weak point.

Speaker 2 (01:12:29):
Because I think that's something — Google is very powerful. They
have tens of billions of dollars — like, sixteen billion
dollars of profit every quarter. They are unbreakable by traditional means. However,
they are also culturally weak. They are a patchwork of
people who've been there two minutes and twenty years. They
didn't fire someone like Ben Gomes, the guy who ran Google Search

(01:12:49):
before Prabhakar took over. They put him in
the corner as SVP of Education, because they know if Ben left,
he could very easily at least raise money to build
a real Google competitor. But the thing is, the asshole
doesn't talk to the mouth or the elbow — nothing, there
is no forum. But as they get bigger, they are

(01:13:10):
weaker on some levels. And I think it's just something
I'm doing with the show — and actually, both of you
have done amazingly with your work — breaking down these
weak points, these pain points. Brian, you did the great
piece about Mechanical Turk, for example — how it's very
similar to how generative AI works. And it's just —
I think it's strange: I've heard, and I won't name names,
but I've heard people kind of make the suggestion that, well,

(01:13:32):
why should we bother?

Speaker 3 (01:13:34):
Why? Well, they're so —

Speaker 2 (01:13:35):
Powerful. And the answer is: they're actually terrified of this.
They do not want this. This is scary to them. So
do it every day.

Speaker 3 (01:13:43):
Yeah?

Speaker 1 (01:13:43):
Look at — I mean, again, we mentioned this earlier, but
look how Google reacted when a single credible —

Speaker 3 (01:13:50):
Competitor entered the field.

Speaker 1 (01:13:51):
Oh but they ran around like chickens with their heads
cut off and they scrambled to find a product.

Speaker 3 (01:13:58):
It was internal turmoil.

Speaker 1 (01:13:59):
It was, if anything, not the reaction of a
company that is convinced of the power and security of
its products and its placement in the marketplace.

Speaker 3 (01:14:12):
This was.

Speaker 1 (01:14:13):
This was a company that is, as you were saying,
pretty profoundly insecure, and reliant on all these deals —
where it's paying out a great deal of money — in
order to maintain its position.

Speaker 3 (01:14:25):
I mean, it seems to me more —

Speaker 1 (01:14:28):
Like a house of cards that could collapse if
the right card is pulled out. And to add
on to what Cory was saying — uh, you know, another
work that I read while I
was researching the Luddite book, which was one of the
first efforts to sort of, quote, rescue the Luddites
from condescension, was E. P. Thompson's The Making of

(01:14:49):
the English Working Class, and it tries to tell the
story of, like: how did we get
to the point where workers came together and said, wait
a minute, we have enough power to actually make
some change?
some change.

Speaker 3 (01:15:01):
How did that?

Speaker 1 (01:15:02):
How did it happen? Because it didn't exist before. How did
workers start to see themselves as part of a class,
or have solidarity? And it's just filled with all these examples,
you know, of little fights that have otherwise
been completely lost to history, where people stood up for
something and, you know, were crushed, and then everybody forgot

(01:15:22):
about it.

Speaker 3 (01:15:23):
By and large.

Speaker 1 (01:15:24):
But the neighbors didn't forget, the local newspapermen didn't forget,
the families didn't forget. They're recorded in oral histories. They
formed this tapestry. So I feel like it's always
a bad idea to feel like nothing that we're doing
matters or registers, because it does — and it can reverberate,
and it can echo, and it can coalesce, and it
can snowball really quickly.

Speaker 2 (01:15:46):
And if things aren't really working out for you right now —
if the crowbar won't get the ATM open, if even
driving a car into it doesn't work, but you still
have some leftover money — perhaps you could check out
one of the following products. I think that this will
solve probably every problem you've ever had.

Speaker 3 (01:16:01):
I hope it's SCRIPTO. So I want to bring up
one way in which Google is unexpectedly weak — in which
its strength is a weakness. And this is a

(01:16:23):
characteristic of all the big tech companies: they
do the same shitty thing everywhere. Which means that
if there's a case built against a tech company by
a regulator in a big, powerful country like the
United States, all of those facts will be almost identical
in much smaller countries like South Korea. We've actually seen
this, where cases against Apple for antitrust have been effectively

(01:16:45):
recycled in South Korea and Japan — and successfully so. Right? So
all of the world's governments are facing the same adversary.
So even if these companies are bigger than most countries,
they're not bigger than all countries.

Speaker 2 (01:16:58):
When you say "a shitty thing," what do you mean? Oh —
you know, like, specifically, Apple —

Speaker 3 (01:17:02):
Apple has this Danegeld system where they take thirty cents
out of every dollar you spend on an app, and
it's super anti-competitive, and they have all these ways
that they block people from trying to circumvent it, and
it lets them control certain markets — and it's
illegal under most countries' systems of law. Here's
a fun fact for you: after World War Two, the

(01:17:22):
American technocrats who went around and rebuilt the legal systems
of Europe and Japan just basically copy-pasted American antitrust laws.
Oh yeah. And here's another funny thing: they were
keenly aware of how important monopolies were to the rise
of fascism and the Axis powers, and so one of
the things they did when they got to Germany and
Japan and Italy was smash their monopolies, because they said

(01:17:46):
monopolies lead to fascism.

Speaker 2 (01:17:48):
But the thing as well is, monopolies are bad
for everyone.

Speaker 3 (01:17:52):
It is a bipartisan issue.

Speaker 2 (01:17:54):
Roger McNamee, who I mentioned — early Facebook investor, though. Yeah,
a really good friend of mine.

Speaker 3 (01:17:58):
He's wonderful.

Speaker 2 (01:17:59):
He really is. Like, I was talking to him about something
I was writing — and with enshittification, I did
the Rot Economy, which is very similar, but I know my
central failing in it: it was these monopolies. And
it's just this sense where you get these Sam Walton
types who want the monopolies; you get these Andreessen
types who want to invest in monopolies for the money. But
they're shitty businesses, monopolies. Like, Google is probably

(01:18:22):
an incredible business, but it inevitably must die. Like, it's going
to die eventually — and fast or slow, it must, because
eventually you run out of humans.

Speaker 3 (01:18:34):
Well, this is the problem Amazon is facing, right? They
keep looking at the rate at which they're burning out
warehouse workers and saying, well, all available semi-skilled workers
will have worked at an Amazon warehouse and no longer
want to do so, or have been permanently maimed —
originally, twenty twenty-four was their initial projection.
That was their clarion call that made them start investing

(01:18:55):
more heavily in automation. Not in treating workers better,
not in making it better to work there — yeah, no, no, no,
not in the slightest, no — just in adding
more of that. Now, I have a lot of lefty
comrades who are skeptical of this anti-monopoly stuff, and
they say, look, I don't want a million choices. I
don't want to go to Amazon looking for a light
bulb and find ten million light bulbs. You already do, right? Well,

(01:19:18):
you do, right, exactly right. I want to go and, like —
I want to live in an orderly, regulated world, where
I don't have to use the magic of the market
to make sure that, like, my water isn't poison, or
that things aren't bad, and so on. And there's something
to that. I don't know that everything has to be a market. We
can have regulated monopolies, we can have state-run —

Speaker 2 (01:19:36):
Things like healthcare. Yeah — competition. But the —

Speaker 3 (01:19:40):
Thing that that competition gets you is not merely choice.
It's a weak corporate sector. People really fail to understand
this at their peril because although these companies appear to
be at one another's throats, when you look more closely
at it, they're really quite chummy. We've surface around us
several times. Yeah, so you know, Apple says, oh, we're

(01:20:04):
the non surveilling company, and Google and Facebook are the
bad ones. They surveil and we're so different from them.
But Apple like puts a surveillance tap in the pocket
of every iPhone owner by defaulting to Google. I mean,
Apple wants to block surveillance unless they're getting paid for it,
so they block Faceto and.

Speaker 2 (01:20:21):
They don't do unless it's their and they.

Speaker 3 (01:20:24):
Do their own surveillance. Right. Google and Facebook allegedly are
our great adversaries in the ad tech sector, except they
have this illegal collusive arrangement where they divided up ad
tech like the pope dividing up the new world called
Jedi Blue. Right, you have Cheryl Sandberg moving from one
firm to the other. Right, she was a Googler before
she was at Facebook, and.

Speaker 2 (01:20:44):
Before that, McKinsey; before that, Larry Summers' Treasury Department.

Speaker 3 (01:20:47):
Yeah. And Facebook is stuffed with ex-Apple people, and
Apple is stuffed with ex-Facebook people. The way
that you ascend through the ranks of a highly concentrated
sector is through lateral moves, because within the firms
there are no boxes on the org chart for you
to rise up to — either someone has to die or
become disgraced, right? And so it's only when a firm

(01:21:08):
seeks someone from outside that you can actually jump up
the ranks, which is why you see, for example, people
from Disney jumping to Universal, and people from Universal jumping
to Disney, to ascend. Right?

Speaker 2 (01:21:18):
Just to push back a little bit — isn't some of
this just domain expertise, though?

Speaker 3 (01:21:23):
In the sense that that's why they're doing it? Sure.
But, you know, this is another problem with regulation:
when a sector becomes highly concentrated, the domain
expertise is siloed within the firms that you're hoping to regulate.
And, you know, there's a not-wrong point
that people make about the revolving door, which is that
if a sector consists of five companies, then everyone who

(01:21:43):
understands those companies works for one of them. So where
else are you going to get a regulator? Right? Of course
they hire ex-Googlers.

Speaker 2 (01:21:49):
And at this point, you have places where you'll only
really hire someone if they've worked at a big tech firm. Sure —
that's the only way you'll know how to work
these very similar machines.

Speaker 3 (01:21:57):
Well, I'm speaking of regulators — sorry. The reason the
FCC is hiring ex-Comcast people, supposedly, is that, like,
who understands how the cable networks work except for people
who work for one of the two or three big
cable operators? Right? So there's a grain of truth to this.
It's not entirely true, but there's a grain of truth
to it. And the fix for it is not to
let cable companies mark their own homework by parachuting their

(01:22:19):
own people into the regulators. The fix is to make
them small enough to be comprehensible, right — to make them
small enough that outside parties can see what they're doing.

Speaker 2 (01:22:30):
But also, a marketplace with tons of choices makes
for a weaker corporate sector but stronger companies, I would argue,
because if you make a good fucking product, you can
win a market. And I realize that this is not
a perfect system at all, and that doesn't always happen,
but at least to some extent, Apple has dominated because
their products aren't imperfect. And I know, listeners — I know you

(01:22:52):
think I'm not hard enough on Apple. I'm doing an
App Store episode. It's so hard to write — it wants
to be five thousand words, Jesus fucking. But the thing
with Apple is, they've done pretty well because they make a good phone,
good laptops — the laptops are great. These are
good pieces of tech that do the job well. And
yes, they use monopolistic forces to keep where they are,

(01:23:12):
but it also feels like Google's phone division is like,
why does this never work? And it's because — like,
Android phones are good, but — why
doesn't Google try and, say, pay for iMessage? Well, they shouldn't.
They should actually push RCS much harder. But
they don't push it that hard. They could put a
lot more money behind RCS, or try and
find a way to work without it — I know it's coming in —

(01:23:33):
but it's like Google doesn't really give a shit about
Android doing super well.

Speaker 3 (01:23:37):
They just don't want.

Speaker 2 (01:23:37):
Apple to have everything, I guess. It's so strange.

Speaker 3 (01:23:40):
It's hard to play Kremlinologist here, but I want to
say that, while competition is obviously imperfect, yes, in
tech we do have a kind of secret power. And
you asked, you started this conversation by asking, what excites
us about tech. One of the things that excites me
about tech is a kind of nerdy principle, which is
that all computers are Turing-complete universal von

(01:24:01):
Neumann machines. They can all run every program, which means
that any program running on a computer that you have,
whether it's your pocket distraction rectangle, your laptop, or your
game console, another program can run that modifies how it
works so that it works for you and not someone else.
Your printer can always take someone else's ink, your car
can always have its surveillance technology disabled, and so on.

(01:24:22):
And what that does is it means that with tech
you have this separate competitive capacity, which is the ability
to compete by modifying existing technology. Right? Like, Lexmark used
to be IBM's printer division, right, and they made laser printers.
And there was a company out of Taiwan, not Malaysia,

(01:24:43):
that made refill kits for their toner cartridges, and the
toner cartridge had a little crude microchip in it. This
is in the early two thousands, is when embedded systems
were expensive and very primitive. It had a sixty four
byte program and when you used up all the toner
in the cartridge, the sixty four byte program would toggle
a switch that said, I'm now an empty cartridge, and
if you put more carbon powder in the cartridge, it

(01:25:05):
wouldn't work because the cartridge told the printer it was empty.
So Static Control reverse engineered this sixty four byte program, not
very challenging. They made another chip and it reset it
to say, right, I am a full cartridge. And when
that happened, Lexmark sued them, and they sued them, arguing

(01:25:25):
that they had violated their copyright, that they bypassed an
access control for a copyrighted work, which is prohibited under
the Digital Millennium Copyright Act. And the judge at
the time said, like, where is the copyrighted work. It's
not the carbon in the printer cartridge. They said, no,
it's the sixty four byte program. They said, well, you know,
like, software is copyrightable, but

(01:25:46):
a sixty four byte program isn't. It's not even a
haiku, right? Like, it doesn't rise to the level
of complexity that attracts a new copyright. And so they
threw the case out. Punchline: today, Lexmark is a division
of Static Control, right, because they went from refilling
the cartridges to buying the business. Ah. Right. So today,
if you were to try and make that argument about HP,

(01:26:08):
which has this disgusting habit of pushing out fake security
updates that break your printer's ability to use third-party ink.
It breaks on its own. Sorry, that too. But if
you were to do this and you were to go
back to court and say the program in the chip
in the cartridge that stops you from
using third-party ink doesn't rise to the level of copyrightability:
that's like a twenty six cent system on a chip

(01:26:29):
that's running a full Linux stack with, like, BusyBox
and a network stack. You could probably attach a monitor
to it if you wanted to, right? And, like,
there's just no argument that that's not copyrightable. Now, with tech,
if we could clear some of these rules that have
been mobilized by big tech to block new market entry,
to block reverse engineering, to block all this other stuff,

(01:26:49):
we could do lots of things to compete, Like the
iPhone is a great platform for someone else to make
an app store, right.
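The cartridge lockout Cory describes can be sketched in a few lines of Python. This is a toy model, not the actual firmware: the real thing was a roughly sixty-four-byte embedded program, and all the names here are made up for illustration.

```python
class CartridgeChip:
    """Toy stand-in for the tiny program on the toner cartridge."""

    def __init__(self):
        self.toner_grams = 100
        self.empty = False  # one-way flag, set when the counter hits zero

    def use_toner(self, grams):
        self.toner_grams = max(0, self.toner_grams - grams)
        if self.toner_grams == 0:
            self.empty = True  # from here on the chip reports "empty" forever


class Printer:
    def print_page(self, chip):
        # The printer trusts the chip's flag, not the physical toner level.
        return "refused: cartridge reports empty" if chip.empty else "printed"


def refill_carbon_powder(chip):
    # A physical refill restores toner but cannot clear the flag.
    chip.toner_grams = 100


def aftermarket_reset(chip):
    # The Static Control-style replacement chip simply reports "full" again.
    chip.empty = False


chip = CartridgeChip()
chip.use_toner(100)
refill_carbon_powder(chip)
print(Printer().print_page(chip))  # refused: cartridge reports empty
aftermarket_reset(chip)
print(Printer().print_page(chip))  # printed
```

The whole lawsuit, in other words, was over which side of a one-line boolean a competitor is allowed to flip.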

Speaker 2 (01:26:55):
Cydia! I was one of the, like, I was really
into the homebrew community on iPhone. Yeah, and you know what,
some of it was broken as shit, but you could
do so many things. There were weird little things that connected
the systems, probably in ways that were extremely leaky, but
they were dead in the water. Now geohot makes weird
autonomous car things that kind of work.

Speaker 3 (01:27:17):
It's just, yeah. But you know, if you remember, in
the early two thousands, Apple was dying because Microsoft wouldn't
make a functional version of Microsoft Office, right. And the
way Steve Jobs resolved that was not by begging Bill
Gates to fix Office. He just had technologists at Apple
reverse engineer Microsoft Office and make iWork, Pages, Numbers, and
Keynote, which perfectly read Word, Excel and PowerPoint files. Now, it

(01:27:39):
was janky at first, because Microsoft kept cycling the file formats, yes,
in order to break compatibility. But that hurt Microsoft more
than it hurt Apple because Apple had their own internal
file formats. Microsoft just had these file formats, and they
kept making them more obfuscated and complex, and then they
had to support them all. So eventually Microsoft sued for
peace and they standardized those file formats. That's where we
get docx, pptx, xlsx. Those are the

(01:28:01):
XML versions. And the reason you can copy and paste
styled text into a browser, and then into LibreOffice,
and then into iWork, and then back into Office, is
because they're all using this standard. If Apple could not
use the courts to shut down something like Cydia,
and if they were required to just have hand-to-hand
guerrilla warfare with Jay Freeman and his engineers, they

(01:28:24):
would eventually sue for peace, because they are fighting an
asymmetric battle where they have to make no mistakes in
the iPhone, and Jay and geohot just have to find
one mistake they've made, and every time they do that,
they have to go back to the drawing board. Alright.
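Those standardized formats are why that round-trip works: a .docx is literally a zip file full of XML that anyone can read. A real document has several more required parts ([Content_Types].xml, relationship files, and so on), so this is only a minimal sketch of the idea in Python:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# The WordprocessingML namespace used inside word/document.xml.
W = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

# Build a minimal stand-in "document" in memory: one zip entry of XML,
# the same basic shape as word/document.xml inside a real .docx.
xml = (f'<w:document xmlns:w="{W}"><w:body>'
       f'<w:p><w:r><w:t>Hello from OOXML</w:t></w:r></w:p>'
       f'</w:body></w:document>')

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("word/document.xml", xml)

# Any program that speaks zip + XML can now pull the text back out,
# no reverse engineering required.
with zipfile.ZipFile(buf) as z:
    root = ET.fromstring(z.read("word/document.xml"))
    text = "".join(t.text or "" for t in root.iter(f"{{{W}}}t"))

print(text)  # Hello from OOXML
```

That openness is exactly what the pre-standardization, obfuscated binary formats prevented.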

Speaker 2 (01:28:40):
It's like the IRA talking to Margaret Thatcher: you need
to be lucky always, we only need to be lucky once.

Speaker 1 (01:28:44):
Yeah. And it should be pointed out that I
think Apple's in-house designers were watching what people
were doing in those jailbroken systems, and
a lot of the things that
were really popular, they ported into iOS
and future designs. You know, obviously not the interoperability or
the things that would be more radical, but just, like,

(01:29:05):
design flourishes and features, like they were watching that community
and taking notes from it.

Speaker 3 (01:29:09):
Every good designer starts by paving the desire path.

Speaker 2 (01:29:12):
And I mean, one of my favorite examples of this,
where, yeah, I don't know how the
courts would have gone, because this is video games,
but the origins of World of Warcraft were in
numerous forums, in numerous guilds in EverQuest. There's a big
thing with the Plane of Time, with the Shadows of Luclin expansion.
Anyone who knows this story, by the way, email me,
I love you: easy at Better Offline dot com. But
long story short, there was a massive bug with the

(01:29:34):
final raid, and a bunch of people, Furor being
one of them, said, I'm fucking done with this game.
This company doesn't give a shit. They can't even finish
their goddamn work. It's disgusting. And Blizzard, who are now
kind of horrible thanks to Activision, Kotick and all them,
said, huh, well, what if we just got all these high
level players and asked them, how would you like EverQuest

(01:29:55):
to be better? What would a better MMO look like?
World of Warcraft was an amazing MMO. I genuinely loved
WoW at the beginning, and to this day it's probably the best
I ever picked up. And it did that because of this crazy
idea of, we could really compete with pretty much the
only successful Western MMO with some exceptions, by taking the
things they like and doing better at them, by iterating

(01:30:16):
on their work and doing great things. If we could
do that in tech. Even the small examples we have
in tech of that working are so good. I mean
Anker being my favorite one. Like, they have gallium nitride.
They've managed to make shit really powerful and really small,
and yeah, it's made every other battery manufacturer have to
fucking compete. I'm sure they didn't do it first, but

(01:30:37):
Anker has made a successful business out of seeing other
people's shit and going, what if this was good?

Speaker 3 (01:30:44):
Let me make one more point about why user adaptability
is good. It's not merely because it lets firms discover
new ways of pleasing us, or because it lets
us make our lives easier. It's because when our technology
doesn't work for us, it is often because the circumstances
that we're in have changed for the worse. Okay, right,

(01:31:04):
your printer. You need to get one more page out
of your printer, you're really desperate, and it won't
print because it doesn't have any magenta in it, even
though you're printing a black and white document, and the
black and white document you're printing is the thing that
you need to bring to the bail bondsmen because your
kid's just been arrested. Right The people who are finness
this again, the people who make the technology can never
anticipate all the ways that the people who use the

(01:31:26):
technology are going to need to use it, not just
in the immediate future, but in the decades that will
come afterwards, because this technology lives in our stack forever
and ever. If you want to read an amazing document
about how technology accretes: the British Library's post-mortem on
their ransomware attack, where they talk about how they were
early adopters, and then, because they could never shut
down the system, they just put more and more layers

(01:31:47):
on top of it. Each layer was a seam that
could be exploited by ransomware creeps. It's one of the
most important tech documents I've ever read. So you know
when things are broken, when people in the future are
trying to maintain or adapt the things that you've made.
When the climate emergency hits and the power goes out,
or the internet is not available or not available continuously,
or a website that's supposed to be checked every time

(01:32:09):
you launch for a license key goes down because there's
been a big cyber attack or because CrowdStrike has gone down.
That is the moment where being able to change how
your technology works is going to be the difference between
success and failure, a good life and a bad life,
sometimes life and death. And the hubris of
thinking that you can anticipate all the ways that people

(01:32:32):
are going to use it, all the bodies they'll be in,
all the circumstances they'll live through, is so monumentally arrogant
and reckless, and ultimately, although you may not always make
the best decision about how the technology around you should work,
that decision should still always vest with you.

Speaker 2 (01:32:51):
So as we wrap up here, I want to finish
on generative AI. Brian, do you see any value,
this is such an asshole question to end on, I
realize, we've got like ten minutes, but, like, do you see
any value in this? I think this is a good
thing to wrap on.

Speaker 1 (01:33:08):
I think like a lot of the technologies that have
kind of surfaced or have become fixations of Silicon Valley
over the last five years, as it's sort of cycled
through the latest next big thing, as its investors are
desperate to find said big thing, because it hasn't really had

Speaker 3 (01:33:26):
One in a while.

Speaker 1 (01:33:28):
I think, you know, generative AI has found a
few more use cases.

Speaker 3 (01:33:32):
Than whatever the metaverse or Web

Speaker 1 (01:33:35):
Three or crypto did, so therefore it's been a little bit stickier.
And you know, as a technology in itself,
is there something interesting there?

Speaker 3 (01:33:47):
You know? Yeah, like, I think there's something there.

Speaker 1 (01:33:50):
We should be having researchers and developers looking into
large language models and how to make them better and
how to make them more interesting, seeing what they can and
can't do by kicking the tires. But do we need every
major tech company running massive data farms doing basically the
same thing as the cost of energy and compute skyrockets

(01:34:13):
and a company that made video
game chips as of two or three years ago is
now suddenly more valuable than Apple, and than the entire
London Stock Exchange, I think I saw.

Speaker 3 (01:34:26):
Something like that.

Speaker 1 (01:34:26):
Yeah, then you have an issue. And
this is the problem with our current mode of technological development.

Speaker 3 (01:34:33):
It's not just, is the technology in question good?

Speaker 1 (01:34:37):
Is this in any way a sensible way to
go about seeing how it's.

Speaker 3 (01:34:42):
Useful or who it could be useful for, or what
use cases it might have? And I think absolutely not.
I think it's so hard.

Speaker 1 (01:34:50):
To distinguish that question right now because it's gone from
zero to one hundred and twenty and it's spinning out
in all directions, and so many bad decisions have
been made in its rollout. It's been deeply nonconsensual
to a lot of the parties whose work has
been harvested into this thing. So you have a bunch

(01:35:12):
of people who are furious at the way that the
models were trained in the first place.

Speaker 3 (01:35:17):
It's being set against creative markets where people.

Speaker 1 (01:35:20):
Are now worried about their jobs, so you have them angry.
On the other side, you have ordinary people who are
just worried about is this going to automate my job?
How's it going to affect me? And so you have
this immense climate of not just fear but also of anger,
and it's really hard to pull out. You know, like, well,
is generative AI like a good technology in terms of

(01:35:42):
the use cases to which it is being put right now,
I would say absolutely not.

Speaker 3 (01:35:48):
Yeah.

Speaker 2 (01:35:48):
It feels almost like the most egregious example of everything
we've been talking about. When these companies have no natural predators,
when they have no one to go, shit, you can't
do that, customers will be mad. Because otherwise they'd say, so,
what is ChatGPT, exactly? Why are we doing this
and not more useful stuff?

Speaker 1 (01:36:07):
Well, again, I would say I would push back on
that a little bit because it did rack up organically,
you know, one hundred million users or whatever. So there
were some people who were like, this is a fun
toy to play with. Okay, well, but
then we transitioned to, like, well, that's not really
gonna make us any money, so how do we get
it to make any money? And then all of a
sudden it's like, oh, maybe we can just like automate

(01:36:29):
some stuff. And it becomes enterprise software that's
being sold, you know, to consultancies,
to fintech firms, to anyone,
anyone who will listen to an
OpenAI salesman's pitch, and that's how we wind up
where we are today. Where you have hundreds of thousands
of these enterprise seats sold to a dubiously useful technology.

Speaker 3 (01:36:52):
Uh.

Speaker 1 (01:36:53):
People are rightfully concerned that it's, you know,
threatening their livelihoods. Even
though it doesn't do a good job.

Speaker 3 (01:37:02):
It sucks.

Speaker 1 (01:37:03):
This is a point that Cory makes a lot,
is that it doesn't actually often create a superior product. It
almost never does. Yeah, it's just good enough for
a manager to say, all right, we'll take it.

Speaker 3 (01:37:14):
It's cheaper. But I wanted to.

Speaker 2 (01:37:15):
Push back on one thing, and I don't necessarily know
if you'd disagree. But yeah, I don't know if
ChatGPT grew organically. The reason I say that is this:
when you have no hypergrowth markets, when you have no
big new thing, and you have what I would argue
is members of the media willfully engaging in a marketing campaign. Yeah,
run by, and I would say, by the way, like all

(01:37:36):
evil systems, I don't know how much of this was conscious.
I don't think people sat around and said, now we shall
all agree to write about ChatGPT.

Speaker 3 (01:37:44):
No.

Speaker 2 (01:37:44):
I think there was just a desperation in the media for
something new to write about, connected with the desperation in
venture capital, which connected with the desperation in the market.

Speaker 3 (01:37:53):
Boom yeah.

Speaker 1 (01:37:54):
And then into the vacuum comes this new apparently viable
product about which any number of narratives can.

Speaker 2 (01:38:00):
Unfold. Exactly, and you get, the chatbot is telling me to leave
my wife, wow. You know, like Kevin Roose, the GOAT
of freaking out about the computer.

Speaker 3 (01:38:08):
But you really, you really, you know whether or not.

Speaker 1 (01:38:12):
The one thing that I would say in its corner,
that maybe suggests some of that was organic, is that
it hasn't claimed that user growth has increased much
since then. We're like a year later, and they're still saying

Speaker 3 (01:38:25):
Like, well, we still got one hundred million years.

Speaker 1 (01:38:27):
So you'd think that if they were inflating that figure,
they would at least find some, you know,
metric to put together. Active users, now we've got two
hundred. Yeah, exactly, some Musk-ian, you know, minutes viewed
per syllable generated.

Speaker 3 (01:38:41):
Yeah. So I think that, you know, maybe a useful
analogy here, although not a perfect one, is gamification. Right. So,
if you've read Jane McGonigal's old work on this stuff,
you know, Jane was doing this incredible stuff where,
on the one hand, she'd suffered a neurological
injury, and she needed to do a lot of
very repetitive and difficult and generally low-compliance rehabilitation for it.

(01:39:05):
People don't recover from those injuries because it's so hard
to do. And she made a game out of it, right,
and it really worked. And, she's a game developer,
she started figuring out how to bring these ideas out. So
there were some marketing crossovers. I don't remember where she was,
but, you know, she worked on I Love Bees,
which was the Halo ARG, and she worked on a
bunch of others, like, really, Tombstone

(01:39:26):
Hold 'Em, or whatever it was called. Tombstone Hold 'Em.
People got married playing these games like they were really fun, right,
And then a bunch of people said, oh, well, we
could use these to get Amazon warehouse workers to
gamify how long they can go without urinating so that
they can meet quota. Right. And we look at that
and we go, oh, this is horrific, But playing games

(01:39:47):
is not horrific, right, right, And I've been following people
doing weird, playful things with stuff we would now call
generative AI for a really long time. This guy called
Shardcore is, like, a digital artist who's been making weird videos
and stills forever, really cool, fun stuff. You know, literally
a decade of this, which lends itself to generative tools. Yeah,
and you know, today I look around and I see

(01:40:07):
people who like play a D and D game, and
then at the end, the LLM has generated a transcript
of the game. It generates a précis of the transcript,
it illustrates it with fun epic moments from it. It
doesn't matter if the orc has six fingers, right? And
everybody looks at that and they go, hahaha, that's great,
and then they fire the customer service reps for your insulin.

Speaker 2 (01:40:32):
For the company that runs the thing that generates the
orc pictures.

Speaker 3 (01:40:35):
You know, no, for the people who supply you
with dialysis, you see, or who determine whether or not
you can legally get insulin on your insurance. And you
find yourself arguing with a chatbot about whether you're going
to get dialysis, whether you live. And then you die, right?
And that's not a problem with the existence of tools
that do weird shit, right, that is a problem. And

(01:40:58):
now the thing is that total addressable market of epic
illustrations from last night's D and D game is very small. Yes,
and it may be that some of the productive residue
of this bubble will be standalone models that run on
commodity hardware. Yeah. And in the same way that, like,

(01:41:20):
the productive residue of the WorldCom bubble was a
lot of dark fiber in the ground that's being lit
up now. The productive residue of the dot-com bubble was
a couple hundred thousand humanities undergrads who were coerced into
dropping out, learning Python, Perl and HTML, and went on
to create a bunch of really playful things that turned
into Web two point zero. That might be a productive residue. It
doesn't make the bubble good. Bubbles are always bad. They
steal from retail investors, mom-and-pops, and leave them eating

(01:41:43):
dog food into their retirement or sleeping in a cardboard box.
They're very bad, right? But Enron was a bubble
that left nothing behind, right? NFTs left nothing behind
but shitty JPEGs, and worse. That bubble
was bad. Yeah, but in this case, maybe
something good comes of it.

Speaker 1 (01:41:58):
Right.

Speaker 3 (01:41:59):
It doesn't make the thing good, but maybe something good
comes of it. Maybe there's a productive residue, maybe not.
I think so, yeah.

Speaker 1 (01:42:06):
I mean, you've probably been watching the Olympics, or at
least read about it. I think one of the
really telling things here is
you can look at how these companies are, like, trying
to advertise generative AI services. So the first, like, Google,
you know, Gemini ad, or I forget what it was called,
maybe it was AI Overviews, but it was like,
you know, embrace this technology, and then you

(01:42:27):
can use it to scan all your photos, and our
new AI will tell you what your license

Speaker 3 (01:42:33):
Plate number is. Is this a problem anybody has? You
can't just go outside and.

Speaker 2 (01:42:37):
Look at the So that one is the useful one.
But also I don't know if it's generally if AI,
because it's just that's not they've now started wrapping these
things in to try and sell the dog shit.

Speaker 3 (01:42:48):
Or look at this one.

Speaker 1 (01:42:48):
Did you see the ad for
Meta's AI, where they're, like, walking through
Little Italy, and she's like, Grandpa,
weren't you here when it was, you know, the beginning
of Little Italy? And she's like, Meta, tell me,
you know, tell me what Little Italy looked like, now
we can see it. And it, like, produces, like, a

(01:43:09):
slop picture of, like, a pretend Little Italy, when it's
just, like, there are photos of Little Italy.

Speaker 3 (01:43:16):
Yeah, you could. They're in the public record, aren't they?
You could also just look them up.

Speaker 1 (01:43:20):
Or the Google one where they were like, oh,
does your daughter want to write a
letter to an Olympic athlete?

Speaker 3 (01:43:26):
Did you see this one? Yes?

Speaker 1 (01:43:28):
And it was like, have Google Gemini use
generative AI to write the letter. And, obviously,
this has been roasted, because the whole point of
an activity like that is you
introduce your child to this, you know, to this
art of writing something down and engaging with this dream

(01:43:51):
of, you know, and you actually participating in an
activity, not just, like, fuck it, hit a button.

Speaker 3 (01:43:56):
And you can imagine what it would be like to
be the athlete after that video goes around. That's twenty million
AI slop letters. Where do I begin? So I have
a friend who's a professor who tells me that now
it used to be that if someone asked you to
write a letter of recommendation for a grad student, you'd
be very selective because you wouldn't want to just say

(01:44:16):
yes to everybody because it's a lot of work. And
then the people who got those letters really valued them
because they knew that you were very selective. But now
you take three bullet points about anyone who asks, and
AI expands them into a slop recommendation letter, which
means that they're now getting so many recommendation letters that
they ask AI to summarize them back into three different
bullet points, three extremely lossy bullet points, and the entire

(01:44:38):
system has broken down.

Speaker 2 (01:44:39):
Hose in the ass, hose in the mouth. It's my favorite
one of these commercials, by the way, the Copilot
ad for Microsoft at the Super Bowl, where, one
of the things, eagle-eared listeners will hear this and say,
I just smashed the window, but I know this bit.
There's a bit where you type and say,
write the code for my three D game.

Speaker 3 (01:45:00):
Yeah, it doesn't do it. I did it.

Speaker 2 (01:45:03):
It creates a document telling you how to make a
three D game. And I think the point I'm
coming towards, and I'm glad you both brought this up,
is GPT really isn't the problem. Transformer-based architecture cannot
do the things they're saying it can, but it isn't the problem.
The problem is the people building generative AI are actually
deeply uncreative. Because, Cory, the person you cited

(01:45:24):
who makes the cool art, that's someone who's effectively, like,
I don't know, this is a very messy way of
describing it, but getting a robot with sixty arms and
throwing paint at the wall to make something random. Sure,
that is art, because you are operating the machine. But
generative AI, the way it's being peddled, is depriving
people of actual operational expertise or any kind of domain
expertise. You're not writing a letter that you

(01:45:46):
care about because you're not writing the letter.

Speaker 3 (01:45:47):
Yeah, yeah, I.

Speaker 1 (01:45:49):
Mean that's I just I wrote a whole newsletter about
this because it was bothering me so much. How many
times they've heard it said, mostly by vcs and you
know AI advocates saying well, it democratizes creativity. I mean
in in what in what what is art? What is
more democratic than just producing art? Already? This is not

(01:46:13):
something that needs to be further democratized.

Speaker 2 (01:46:16):
What actually democratizes creativity is access to the arts. It's time and
space to access the arts and contribute. The best way
I got better at writing was writing, but also reading,
which was my privilege of my parents being able to
afford the internet, afford books. These are the actual ways.
But none of these people are creative, so how the
fuck would they know?

Speaker 1 (01:46:36):
And this is my whole point is, if anything, it's
doing the exact opposite. It's creating an automated
process by which you can pump out as much shit
as possible, thereby pitting people who create art for a
living and hope to make a living off of it
against a machine that can generate as much slop as
the client can handle. And it's just

(01:46:57):
pushing down wages. It's making it harder for artists to
make a living. It's doing anything but democratizing.
And it's making extant relationships between, like, a publisher
and an artist, or a writer and
somebody who's commissioning work, much more important, thereby
depriving people who are new to.

Speaker 3 (01:47:18):
The field of opportunities. Uh, you know.

Speaker 1 (01:47:20):
Ted Chiang makes this point in his great New
Yorker essays about AI, that it's just, it's depriving.

Speaker 3 (01:47:28):
It's it's kicking down that ladder.

Speaker 1 (01:47:29):
Basically, it's like, people who practice a craft by doing
stuff that's less than, you know, authoring a novel that
you know you're going to get a full rate for,
or, you might still just be learning to ply
the trade by writing, you know, marketing copy. Well,
generative AI is going to vacuum up all of
those jobs and those opportunities. It's the anti

(01:47:51):
democratizing force.

Speaker 3 (01:47:52):
Really, I was just trying to look up a book,
but I failed. There's a wonderful artist, cartoonist, and novelist
named Lynda Barry, and I co-taught a writing
workshop that she was also teaching at. And Barry is
a great believer in the idea that you can anyone
can draw, and that drawing is itself a gateway to

(01:48:14):
a kind of creative understanding of yourself in the world
that is uniquely powerful. And as someone who doesn't draw,
I was skeptical of this claim. She's written a couple
of very good books about this, and she's also won
a MacArthur Fellowship. And I came in and I looked
at what my writing students had done after spending a
week drawing with Lynda Barry. They were not there to

(01:48:36):
learn to draw. They were there to learn to write,
and they were remarkable, like they had a free and
loose arm, in a way that, you know, a
lot of writers don't. This is a very intensive writing workshop
called the Clarion Workshop. It's for science fiction writers at
UCSD and I'm a graduate of it and a board
member of it and an instructor. And it's often characterized

(01:48:57):
as a kind of boot camp. It's a six-week workshop,
and I think I
was teaching week four, and usually by week four these
kids are zombies. These kids were not zombies, right? They
were really open and free and really thinking expansively. So,
you know, I think that like giving everyone a week
with Linda Berry democratize his art. And I think there's

(01:49:18):
you know, look, I think like typing prompts into an
LLM is, in its own way, like being Jackson Pollock
and throwing paint at the wall, if that's why and
how you're doing it, right? Like, if you're fooling around, right,
if you're fooling around, you can get to
somewhere that you can't get to if you're trying to

(01:49:40):
get there on purpose. And I just don't think that
we're being invited to fool around. Yeah, I think that
that's the whole thing.

Speaker 1 (01:49:46):
I think artists would not give a shit about generative
AI if it wasn't being put in opposition to their livelihoods,
if it wasn't threatening

Speaker 3 (01:49:54):
Their economic security.

Speaker 1 (01:49:56):
As many people as want to can
go play around at home and make
different variations for their own personal enjoyment. I don't
think anybody would begrudge them that, if that
wasn't simultaneously being used as justification, as promotional
material essentially, by these companies who are then going to

(01:50:16):
go sell to other firms.

Speaker 3 (01:50:18):
And I know you're gagging to go to an ad break.
I want to say one last thing. We've got to
wrap this bad boy up. So I want to say one
last thing here, which is that if you pauperize every
single commercial illustrator in the world with OpenAI, you
will pay the kombucha bill for fifteen minutes for their
senior engineers. That's the only reason they're doing this. This
is a great tragedy. The reason they're doing this is

(01:50:39):
not because there is a huge market opportunity in pauperizing
commercial illustrators. It's that it is an exemplary opportunity. It's
not that there's a it's not that the market will
then return their investment. It's that it will be a
very visible example of what you can do with open AI.
It's a convincer.

Speaker 1 (01:50:56):
It's, in effect, one of the few that actually works, yes,
that can be dependably relied on to produce results that
you can point to and say, hey, look.

Speaker 3 (01:51:04):
That kind of works. Yeah.

Speaker 2 (01:51:05):
Yeah, So guys, it's been so wonderful having you. I
could go another three hours, but that's just me. Cory,
Where can people find you?

Speaker 4 (01:51:13):
So?

Speaker 3 (01:51:13):
I'm at pluralistic dot net and as I've mentioned a
few times, I write science fiction novels and other kinds
of books, mostly published by Tor Books at Macmillan. You
can get them wherever you get books.

Speaker 1 (01:51:26):
Brian, Yeah, I write a newsletter called Blood in the Machine.
It's also the name of the book of the similar theme.
You can check either out.

Speaker 3 (01:51:36):
Yeah, bloodinthemachine dot com.

Speaker 2 (01:51:39):
Yes, and hopefully I'll put all of the links to this stuff in there.

Speaker 3 (01:51:44):
But it's always good to hear it out loud. Guys.

Speaker 2 (01:51:46):
Thank you so much for joining me, and listeners, thank you so much for listening. When you get out of the clink, you can email me at ez at betteroffline dot com. And that's zed for my Canadian and UK listeners. I just found out Canadians say zed as well.

Speaker 3 (01:51:59):
Zed offline. We're like serial killers, where we look just like everyone else, and we say zed.

Speaker 2 (01:52:04):
That's right. I just learned that, having met Canadians.

Speaker 3 (01:52:07):
Wonderful.

Speaker 2 (01:52:08):
All right, thank you so much for listening, and Corey Brian,
thank you so much for joining me.

Speaker 3 (01:52:11):
Thank you, thank you, thank you for listening to Better Offline.

Speaker 2 (01:52:22):
The editor and composer of the Better Offline theme song is Matt Osowski. You can check out more of his music and audio projects at mattosowski dot com, M A T T O S O W S K I dot com. You can email me at ez at betteroffline dot com, or visit betteroffline dot com to find more podcast

(01:52:43):
links and, of course, my newsletter. I also really recommend you go to chat dot wheresyoured dot at to visit the Discord, and go to r slash betteroffline to check out our Reddit.

Speaker 3 (01:52:50):
Thank you so much for listening. Better Offline is a production of Cool Zone Media. For more from Cool Zone Media, visit our website, coolzonemedia dot com. Check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.