Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Welcome to Tech Stuff.
Speaker 2 (00:16):
I'm Oz Woloshyn, and I'm thrilled to announce that the Week
in Tech is back and it's growing. Instead of Karah
and I recounting the essential news, today and every Friday
from now on, I'm going to be joined by three
of the best writers covering Silicon Valley. Reed Albergotti, technology
editor for Semafor. I read your newsletter religiously, and we
had a lot of fun on Tech Stuff a few
(00:37):
months ago with a story you wrote comparing different AI
companies to characters in The...
Speaker 1 (00:42):
Wizard of Oz.
Speaker 3 (00:43):
Welcome back. Thanks, it's good to be here.
Speaker 2 (00:46):
Kyle Chayka writes the Infinite Scroll newsletter for The New Yorker. Kyle,
last time you were on Tech Stuff, you talked about
your holiday tech gift guide, which leaned quite surrealist. Happy
to have you back.
Speaker 1 (00:57):
It was very fun.
Speaker 2 (00:58):
Thank you. And Taylor Lorenz, who joined a great discussion
on last week's panel and is back this week. Taylor
writes the User Mag newsletter and hosts the Power User podcast.
Speaker 1 (01:08):
Taylor, thanks for joining us again.
Speaker 4 (01:09):
Thanks for having me.
Speaker 2 (01:11):
I've been following each of you for a long time,
and many of you have also known each other for
a long time, or at least you all know Taylor.
Speaker 5 (01:19):
Yes, I don't think Reed and I have ever met,
but I follow his work, of course. Likewise, likewise.
Speaker 2 (01:24):
So just imagine we're all friends sitting around a table
in a living room with huge microphones in our face,
just shooting the breeze about technology. And let's really take
our listeners on a journey, because each of you lives
and breathes your beat and has, as far as I'm concerned,
better insights and access than anyone else in the world. Okay,
(01:46):
let's get into it. And Reed, let's start with you.
You have what could be described as an insiderish story
this week: a company conference, except not just any company conference,
the massive developer conference held by the multi-trillion-dollar
company Nvidia.
Speaker 1 (02:02):
Yeah.
Speaker 3 (02:03):
Yeah, it's funny you say insidery, because I was in
this press conference with Nvidia CEO Jensen Huang yesterday
and I think there were three hundred reporters.
Speaker 1 (02:11):
In the room.
Speaker 3 (02:12):
If you can imagine that. I was thinking, like, I
wonder how many reporters are in like the White House
right now, you know. Like, it was an insane amount
of interest in, and coverage of, this thing. But he said
something that I thought was interesting; I think this
made the rounds yesterday. He said, you know, his
life philosophy is three things. Essentially, he said: don't
(02:34):
get fired, don't get bored, and don't get killed. Words
to live by, literally. Right, exactly. And
he said living in that triangle is higher risk
than you think. And I sort of thought about that,
and I'm like, that actually does explain a lot of
what you're seeing from Nvidia, where it's just a company
like all the Mag Seven, all these big tech companies,
(02:56):
end up in this place where you get to a
certain size and success and you end up just competing
in everything, on every front. And the fronts are just
expanding for Nvidia into all these new areas. And the
big, I think, externality that has happened since, you know,
last year's big Nvidia conference here, although they
(03:17):
seem to do something like this a few times a
year now, is that OpenClaw happened, and
that's been sort of... I think of it as like
the ChatGPT moment that everyone's ignoring, because the world is
on fire, and it's also
not something that anybody can just
fire up and use like ChatGPT. But it's like a
(03:40):
glimpse into the future, where if you talk to people
out here who are using this, you're like, oh, of course,
of course this is where things are headed. But it's
also quantum computing, it's video games still, it's like everything, space.
Speaker 2 (03:55):
Yeah. So, you in the audience were smiling when Reed was
talking. Can you give us a little bit of the backstory,
like, what are the essentials we need to know about
OpenClaw? And then, why were you grinning?
Speaker 5 (04:04):
Man, Well, I was grinning about the fact that I
don't think anyone outside of tech understands it at all yet.
I think there are these memes coming out of China
of, like, AI anxiety after this OpenClaw moment, from
people who are now scrambling to let agents take over
their inboxes and their text messages and their computing lives.
(04:27):
And I think there was one incident in which an
OpenClaw agent just deleted someone's entire email inbox or
something like this. So it's like, I don't know, it's
like vaping or something. It's like, let's just inhale it
straight into our bodies and like, see what happens. But
I do, as Reed said, I think it's this tidal wave
(04:47):
that everyone's ignoring or just can't quite understand.
Speaker 2 (04:51):
So OpenClaw is this orchestra of AI agents who basically have full access
to your computer and all your emails and everything you've
ever done, and then make decisions on your behalf.
Speaker 3 (05:00):
This guy I was talking to who was doing this,
one of these early adopters, was telling
me just how he has completely, blindly turned his life
over to his OpenClaw implementation. And I'm like, aren't
you worried it's gonna delete your inbox or do
something really bad? And he's like, oh, I think it will.
(05:20):
He's like, but I'll just, I'll learn from it. It'll
teach me a lesson that will probably give me
some idea for my next company.
Speaker 2 (05:25):
And I'm just like, okay. But the OpenClaw thing
was happening before the Nvidia event. Why was it
the topic du jour of GTC?
Speaker 3 (05:34):
Yeah, and by the way, I don't even know if
it was the topic du jour. It's just, to me,
the best example of how this thing happens
and now Nvidia has to scramble and
figure out how to play in that space.
Speaker 2 (05:47):
And the way Nvidia is stepping into the OpenClaw space
is essentially building their own product around OpenClaw, called
Nemo Cure.
Speaker 3 (05:54):
Is that right?
Speaker 1 (05:54):
Reed? That's right.
Speaker 3 (05:55):
And right now it's so expensive and so risky
from a security standpoint to do this, to just turn
your life over to many hundreds of agents and different
AI models. But the prices are coming down so
much, and the infrastructure is being built around it so
fast, that it just means a lot more tokens, a
(06:15):
lot more revenue for companies like Nvidia and all the hyperscalers.
But also a new field, a new
battlefield, to try to gain market share or potentially
lose market share. And so Nvidia is doing
this stuff around that.
Speaker 2 (06:30):
And they're trying to create a more secure architecture
around this open-source thing, right? To basically harness the
power of OpenClaw but make it somewhat safer.
Speaker 1 (06:39):
Right, that is kind of the goal.
Speaker 2 (06:41):
Taylor, what are you thinking about OpenClaw? I
saw you smiling as well, and we
will come back to your story after this.
Speaker 4 (06:46):
Yeah, I have FOMO, and I want a Claw. I
want one of these little bots. I was debating
buying a Mac mini; I just can't justify the, like, five
hundred dollars or whatever it costs these days. I kind
of feel like that person who said that they
might learn from it. Now, I wouldn't give it
(07:06):
unmitigated access to my email. But I do think that
Reed is right that we're probably
moving towards this sort of agentic future, or whatever
you want to call it. And I don't know,
you guys know how time-consuming it can
be to be a freelancer. I'm like, even if
this thing could just invoice companies for me,
it would be amazing.
Speaker 5 (07:24):
Yeah, it feels like it's proving out that AI will
be used a lot by normal people. Also, like we'll
all be using these large amounts of tokens, Like it's
going to be embedded much more deeply in our workflows
than just using a chatbot.
Speaker 2 (07:38):
Yeah. On Nvidia specifically, I mean, it's interesting, right?
Taylor and Kyle, you both focus on kind of the
intersection of technology and culture. I was thinking this
morning about how to relate
Nvidia to both of your beats, and I had the
rather depressing thought that maybe all culture now sits downstream
of Nvidia.
Speaker 5 (07:56):
Well, I was thinking about that, because
I do feel like this is one of the biggest
moments for Jensen Huang as, like, a meme, right? Suddenly
his image was everywhere, and he's been wearing the same
leather jacket for years or decades or whatever, and there
are crowds of people around him. And so suddenly it does
feel like a cultural movement as much as a technological one,
(08:20):
for sure.
Speaker 2 (08:21):
Reed, coming out of this story, I mean, what...
Speaker 1 (08:23):
What's the you know?
Speaker 2 (08:24):
One of the interesting things was that, with all these incredible
announcements and more and more revenue,
Nvidia's stock price was, I think, flat or
slightly down. How did people react to this?
Did that correspond to the mood in the room? Or
is it just that they've had so many
wins that the market has kind of priced them at
this place and there's nowhere more for them to grow?
And what are the competitive threats?
Speaker 3 (08:45):
It's funny. The thing about
Nvidia is it's almost too simple a story, right?
There's these things called tokens that we're all using, right?
That's what we're using when we
query ChatGPT, and people can't get enough of them,
and they're willing to pay a lot of money for them.
And you could see with this OpenClaw stuff,
the more tokens you use, the better
(09:08):
it gets, and people are paying for it. And it's like, okay,
so these companies are making a product that is pretty profitable,
has good margins, that everybody wants, and they can't make
enough of it. That's just a good business.
But it's like too simple for tech. Normally,
tech's like, we're gonna get all these people, these eyeballs,
and it's gonna be free, and then we'll figure out
(09:30):
how to monetize it and we'll offer all these free services.
And it's like, wait, you mean you're just gonna charge
people a lot of money for a product and make
a bunch of money on it? That's so weird.
It's like a normal business.
Speaker 2 (09:43):
Taylor, Kyle, I'm curious, before we move to the next story:
what are your burning questions about Nvidia? I mean,
the biggest company in the world.
Speaker 4 (09:50):
I think, you know, that's a really good question. I
don't know. I mean, I kind of come at it
from the same angle as Kyle, maybe, of, like,
I'm very interested to see how they navigate this
AI backlash, how Jensen kind of ascends into this
role of a very public CEO. I mean, he's
(10:11):
been CEO for so long, and I think most of
his time as a tech leader has been kind of
behind the scenes, unlike a Steve Jobs.
Speaker 2 (10:18):
He didn't rebrand; he's worn that leather jacket
for a really long time.
Speaker 1 (10:21):
He already he already.
Speaker 4 (10:26):
But I also think it's interesting. You know, we had,
like, Sam Altman, the OpenAI CEO, at the Vanity Fair
Oscar party, where a famous playwright, I guess, called
him, like, a Nazi. And there is this kind of
burgeoning anti-tech sentiment, especially from more left-leaning
spaces. And so I don't know, I don't know
that we'll see Jensen Huang out at, you know,
(10:47):
Hollywood parties, but I do think that he's going
to have to step into his public image and curate
a public image for himself.
Speaker 3 (10:53):
That is right on.
Speaker 5 (10:54):
It's like, Nvidia is a brand, but what is
it the brand of? A chip, or tokens?
It's a utility rather than a particular product. And
I think I saw, at a press conference or just
the scrum afterward, Jensen just said, everyone, the Nvidia ecosystem
is rich. It's this kind of Midas thing, like everything
(11:16):
I touch turns to gold. But also that feels kind
of cursed, like you're flying too close to the sun.
So I'm curious how much higher it can even go.
Like, are we just getting started, or is
this a kind of threshold moment?
Speaker 1 (11:35):
Taylor, moving right along.
Speaker 2 (11:37):
You mentioned techlash, and you brought our attention to
a story about a Senate committee hearing this week with
the title "Liability or Deniability: Platform Power as Section
230 Turns 30." Now, before we get into that hearing:
you've been covering Section 230 for a decade. You put
(11:58):
out a six-part YouTube series devoted to it. You spoke
about it at South by Southwest last week. What drives
your fascination with this law? What do we need to
understand about it, and what might change based on all
of these legislative efforts?
Speaker 4 (12:14):
Yeah. So, Section 230 is a law that, frankly, should
be very non-controversial. It effectively
says that you, as a speaker of content online,
should be held liable for your own speech. It's a
foundational law; it's what allows the Internet to
exist, because the Internet is not like a public space, right?
(12:35):
It's not like the real world, where
there is a physical world out there.
So you need people or companies to go
in and set up spaces in this world, whether that's
a website, whether that's a forum, whether that's a social
media platform, et cetera. And, you know, there
was this really, really bad article written about a decade
(12:56):
ago on Section 230 that claimed that, I
guess, Facebook was using Section 230 to
not police hate speech. Ironically, Section 230
is what allows platforms to do moderation. But that kind of
set off this decade-long discussion where pretty
much everyone in power wants to transform the Internet from
(13:17):
this open place where people can freely discuss
ideas to something more like the mainstream, traditional media,
where there's top-down, corporate control over
speech and every single platform or forum admin or person
online, you know, is responsible for the speech
that they host.
Speaker 2 (13:36):
Reed, I noticed you nodding along to Taylor. I'm always curious, when
people are nodding, what they're nodding about.
Speaker 3 (13:41):
No, I mean, I just think what she's saying
makes sense, and you're, I think, explaining it in
a good way, for sure. I don't think there was
anything, you know... I don't think there's
anything controversial in that. It was sort of laying out
the facts.
Speaker 2 (13:55):
Now to the controversy. Taylor, take us away.
Speaker 4 (13:57):
Well, so, you know, as the Internet has grown
and these platforms have become more powerful, a lot of
people in power want to repeal Section 230. This
would devastate the Internet. It would transform the Internet from
a place, again, where there is user-generated content to
a place where the whole Internet is Netflix, right, where
(14:17):
you have to apply to a platform to get your
content on there, and nobody can reshare content.
Speaker 3 (14:24):
You know.
Speaker 4 (14:25):
Section 230 is what allows you to retweet a
tweet or forward an email. Removing Section 230 would
not allow you to engage in that way.
It would just be sort of a
Netflix-style Internet. Obviously, this is something people in power
really want. They don't like that average people can
go on the Internet and connect with each other.
So there's this aggressive effort to repeal Section 230
(14:46):
that, ironically, the big platforms are also now behind, because,
with the rise of AI, they feel comfortable pre-screening every
single piece of content via AI to ensure
that any speech on the platform aligns with what the
government wants. This is why Meta has been sort of
integral to these efforts to chip away at Section 230.
Speaker 2 (15:08):
This is so interesting, because I think the way Section
230 is commonly understood, and you mentioned that Wired
article from a few years ago, I haven't read it,
but it's essentially that the battle over Section 230
is the battle where regular people are fighting back
against the tech companies and their ability to harm
people with content and not be held responsible.
Speaker 4 (15:28):
Well, Section 230 protects those regular people. Section
230 ensures, again, that you as a user without
a lot of power have the ability to speak
to power. It allows for user-generated content. If we
didn't have that, it would consolidate power among big tech. Now,
you know, as somebody that's covered tech for a
(15:49):
long time, especially in light of the techlash over
the past decade: about a decade ago, these far-right
religious fundamentalist groups and organizations that have always wanted a
censored Internet, groups like, you know, the Heritage Foundation,
realized that by claiming repealing Section 230
was cracking down on big tech, they could get a lot
of, I think, leftists and liberals to fall
(16:11):
for their ruse and go along with this. Now, it's
important to note they did have a big victory. Back
in twenty eighteen we saw FOSTA-SESTA pass. FOSTA-SESTA
was the first major carve-out to Section 230
that was framed as cracking down on big tech. I
would ask people, you know: do you think Facebook and
Google are more or less powerful today than in twenty eighteen?
I think we know the answer to that.
Speaker 2 (16:32):
And the hearing this week with Senator Ted Cruz: what's
going on here? What can we expect, and what's
different from the bipartisan legislation introduced by Lindsey Graham and
Amy Klobuchar and others late last year on Section 230?
Speaker 4 (16:44):
It's essentially the same thing. There is this Sunset
Section 230 Act that, again, has been brought by
Democrats that want to seize control over online speech.
I have to say, I know this is bipartisan; a
lot of issues related to mass surveillance and censorship are bipartisan, unfortunately,
as I told you.
Speaker 2 (16:59):
You did, last week, when I think you said mass
surveillance is always a bipartisan...
Speaker 4 (17:03):
Issue, exactly. And so, you know, I think it's
kind of terrifying that we're seeing this happen under the
Trump administration. I mean, I certainly couldn't do my job,
I wouldn't have a job as an independent journalist, if
Section 230 didn't exist. But yeah, there's this big
hearing to discuss all of this on the Hill.
Speaker 1 (17:19):
Go ahead, Kyle.
Speaker 5 (17:19):
I have some of the same worries as Taylor here,
that Section 230 does protect a lot of speech
on the Internet. It's meant to separate, a
little bit, the platform from the content on it, and
it has allowed Facebook to exist, and Instagram and
YouTube to exist. But I think if we look forward
into the future, these tech giants are going to find
(17:39):
ways around whatever regulations exist. They will find ways to
profit from this situation if it's repealed, and the smaller platforms,
the more independent spaces, spaces that don't have access to
millions of AI tokens, as we were just discussing, will
kind of fall by the wayside, and we will be
left with that narrowed, more homogenized Internet. I think there
(18:02):
are good and bad parts of Section 230;
it's not just one thing. But it does really
protect our ability to put things out on the Internet,
and for Internet hosts to support the publishing of things
online without always being responsible for every single thing.
Speaker 4 (18:18):
I just want to say, I actually don't think
there are bad parts of Section 230. I think
it's such a short, uncontroversial law. Meta and Google
and all these big tech companies have been using so
many other laws to evade responsibility and to consolidate corporate power,
and we could pass so many laws. We could reform
things like the Computer Fraud and Abuse Act, which Meta
(18:39):
uses to crush competitors. We could, you know, prosecute them
for anti-competitive behavior. There are
so many legislative and broader political things that
we could do to curb the power of these big
tech companies. But chipping away at Section 230 only
chips away at the power of users, you know, to
speak truth to power, and I think that's really dangerous,
especially right now.
Speaker 1 (18:59):
Kyle.
Speaker 2 (18:59):
I want to come back to you, because you wrote
a book called Filterworld, and you wrote about Section
230 in that book. Here's what you said:
"The problem with Section 230 is that ultimately, and bizarrely,
the law makes it so no one is currently responsible
for the effects of algorithmic recommendations."
Speaker 4 (19:16):
That's not true. How would you say no one is responsible?
Speaker 5 (19:20):
Well, the law that Section 230 was based on,
or one of the precedents, was this bookstore ruling, where
a bookstore cannot be held responsible for the contents of
the books that it's providing. And I think the
platforms, and the way the Internet works, have evolved since
the time that Section 230 was put into place,
and algorithmic recommendations and kind of top-down
(19:44):
censorship or filtering have become more aggressive. And so I
think Section 230 has protected a lot of that
algorithmic filtering action over the years.
Speaker 4 (19:55):
Kyle, I just did an episode, which I hope you'll
watch, on my channel last week about this idea that
Section 230 wasn't written to protect algorithms. The two
cases that led to Section 230, the Prodigy
and CompuServe cases: both of those platforms used algorithms
to recommend content. Algorithms are speech as defined by the
(20:15):
First Amendment. I think what people need to realize,
and I'm not saying this is you, Kyle, is that a lot
of people have problems with speech law. And what Section
230 does is effectively just make it cheaper to
win cases on First Amendment grounds. So, you know, the
same cases that a lot of these smaller platforms
win on Section 230,
it only costs them fifty to one hundred thousand dollars to
(20:36):
fight that case and win. But if they were to
fight it on First Amendment grounds, it would cost
five million to ten million dollars. So, you know, algorithmic
editorial choices are speech in themselves. And the idea has
always been to give these platforms the power of publishers,
to make them publishers, but without the sort of
liability of treating them as newspapers. As Kyle said, it's
(20:57):
much more like treating them as a bookstore. But a
bookstore does put books in the front window, right? A
bookstore does make recommendations to readers. And so I think
there are ways to hold these platforms accountable for, you know,
a lot of the bad effects that they've had on society.
But again, I don't think Section 230 is the
law to do that through.
Speaker 3 (21:16):
Isn't that what these groundbreaking social
media lawsuits that are happening right now are sort
of based on? That the algorithm is, you know,
the thing that these companies should be held liable for?
Speaker 4 (21:29):
Well, yeah, it's just very funny, because, you know,
there is no problem with the algorithm; the problem
is the content, right? What people have problems with is
the content. If everything on Instagram,
for instance, was just Wikipedia articles, and that's the only
thing you could see, but it had
the exact same recommendation algorithm, would you have a problem
(21:49):
with it? The answer is no,
these people would not, because we have platforms that are
like that, online learning platforms that operate with
the same sort of engagement-based recommendation algorithms, and people
don't have a problem with it. Or something like Spotify,
which has an enormous amount of data on you and
will feed you, you know, if you're depressed,
more depressing songs, and maybe those depressing songs, you know,
(22:11):
make you want to kill yourself.
Speaker 5 (22:14):
Not having This is one of the cases of British
woman who was unfortunately pushed a lot of depression content
from Pinterest and then it ended up committing suicide, dying
by suicide. And you know, like there are problems with
algorithmic recommendations and particular types of content, but they're, as
Tyler was saying, they're also better legal frameworks and better
(22:35):
law packages that are being discussed now that can address them.
So some like EU laws, like the Digital Services Acts
that let you modify your algorithmic feed like things that
give you more rights to your data and your information.
So we can use other laws to target these issues
kind of on the downstream end for sure, But.
Speaker 3 (22:56):
But ultimately, this is about who can be sued for what, right?
I mean, that's what it really comes down to. Can
you win a lawsuit for harm caused
by something you saw online, or content you saw online?
Speaker 4 (23:08):
Right. Well, I think we've seen this moral panic forever,
right? And I would argue, yes, you can right now
win a lawsuit. I mean, you can sue the person
that made that content that is,
you know, allegedly harmful. You can sue them, and they
can be held responsible. And that's a really good system.
What we don't want to do is make it so
(23:29):
that every platform is legally liable for every piece of
content on their website, or there will be no such
thing as journalism on the Internet, because all of these
platforms will simply, you know, not want to host any
sort of legally questionable content.
Speaker 3 (23:43):
And a lot of content is anonymous, or it's
not in the US. So you could sue, but
you're not going to get much out of it. So
the only, you know, deep-pocketed entity you can really
go after is the platform, right? The tech company.
Speaker 4 (23:55):
Well, I think a lot of these
critics of these issues need to
get their story straight. Do you have a problem with
online anonymity? Okay, there are better ways to address that.
Do you have a problem with the fact that there's
harmful content abroad that's being brought to Americans? Okay, let's
address that through, again, lots of different things. Or do
you have a problem with the content on social media?
(24:16):
And if you have a problem with the content on
social media, what I would posit is that a lot
of this stuff is the same moral panic
that we see in a lot of
content-based freakouts, where it's like, you know, you don't want
your child seeing XYZ. And that veers into
speech law, and we should be very careful about
how we ask the government to regulate content.
Speaker 2 (24:37):
It's interesting: Senator Ron Wyden, who drafted this law, said
a couple of things recently.
Speaker 4 (24:42):
You know.
Speaker 2 (24:43):
One, he said: without Section 230, anyone who merely
shared a story or allegation about Epstein and his associates
on their social media could be sued by Epstein's deep-pocketed
pals, along with the site that hosted those posts.
But he also said he was open to reform,
not full-on repeal, but some tweaks of Section
230, to address some of the concerns people have.
Speaker 4 (25:06):
I think with that he's talking about AI, primarily, because
I think the next big question is:
as these AI platforms generate their own forms
of speech, how should they be regulated? How should
large language models be regulated? And I think we're seeing
them use Section 230 to claim, well,
this is just sort of equivalent to user-generated content.
(25:27):
And I think those are the questions, yeah.
Speaker 3 (25:30):
Yeah, I would love to hear your view. I don't
know if Section 230 is really going
to be fully repealed ever; I mean, that seems unlikely.
But just as a thought experiment:
imagine a world where it does
get completely repealed. What would actually happen on these
platforms? What would change?
Speaker 4 (25:49):
Yeah, I think we don't have to imagine. We saw
what happened when we chipped away at Section 230.
So again, FOSTA-SESTA is the first major amendment,
and it's the biggest carve-out to Section
230 that we've ever had. FOSTA-SESTA is a law
that passed back in twenty eighteen that was supposed
to hold platforms liable for content that would incentivize sex trafficking.
(26:10):
All of the same arguments they're making today about Section
230, we went through all of this a decade ago,
and we can actually see exactly what happened. What happened is,
first of all, it didn't reduce sex trafficking even in
the slightest. What it did is mass-censor LGBTQ content
and abortion content from the Internet. We can also look
at other countries that have more authoritarian laws, like China,
(26:32):
and I know people have a lot of complicated thoughts
on China, but China does have a very restrictive Internet,
where you cannot say things that are not approved tacitly
by the government, and the government effectively deputizes their big
tech companies as state censors. And that is the world
that we're moving towards here in America. And I think
that that should concern anybody that cares about civil liberties
(26:53):
or free speech online. So I think we should protect
Section 230. But I'm all for cracking down on
these big tech companies and regulating them heavily,
you know, in many other ways.
Speaker 2 (27:03):
We're going to take a quick break now, but when
we come back, we'll hear from Kyle about why taste
has become such a buzzword in Silicon Valley. Stay with us.
(27:31):
Welcome back to Tech Stuff. Kyle, you're up: you
wrote a column this week in The New Yorker about
why tech bros are now obsessed with taste.
Speaker 1 (27:40):
I actually LOLed reading it.
Speaker 2 (27:43):
Here's my favorite quote. The entrepreneur and former ByteDance
engineer, is it Coney Wang or Connie Wang? Oh, I've
probably said it completely wrong, echoed a new Silicon Valley axiom
in a blog post, writing, quote, in the AI era,
personal taste is the moat. Startups apparently need taste, this
(28:04):
is you, like AI needs data centers.
Speaker 5 (28:09):
Just trying to have fun, you know. I feel like
this was kind of my way into the AI
culture beat, in a way that's not just talking
about, like, humanoid robots killing people or something like that.
It's this kind of mania that Silicon Valley and, like,
tech bros broadly, on X in particular, have with the
(28:31):
word taste. They're all trying to make their taste
better, and have taste in AI models, and, you know,
train themselves essentially to desire better things.
Speaker 2 (28:42):
And you were just so sick of it. Guys,
how did you become aware of the phenomenon?
Speaker 1 (28:47):
When did taste, when did the taste
radar ring like a fire alarm?
Speaker 4 (28:52):
How can you escape it? If you're on tech Twitter,
it's all they've been posting, that idea.
Speaker 1 (28:56):
Yeah.
Speaker 5 (28:56):
I mean, I'm kind of a canary in the coal mine
here because I've been writing about taste and algorithms for
like ten years now, so I'm very sensitive to the
use of that word. And I just started seeing it
crop up more and more and more. And I think
the threshold moment or like the climax of this was
like a few weeks ago or something two weeks ago.
(29:17):
Paul Graham, the technologist, like, great entrepreneur, tweeted, like, in
the AI age, like we all need better personal taste,
and I was just like, oh my god, Like personal
taste was the realm of like hipsters and DJs and like.
Speaker 1 (29:33):
You know.
Speaker 2 (29:35):
I worked in the mid two thousands. I feel like
it's crazy that this word has been claimed by the
technology industry, like.
Speaker 5 (29:41):
The industry that is widely seen as the most tasteless
thing on earth, like no shade, like I cover tech,
I love tech, I love the Internet. Tasteful, it is not.
Speaker 2 (29:52):
Like, does taste mean, like, cigars and whiskey to these
folks? What's the, like, what's
beneath the saying?
Speaker 5 (30:02):
Man? I mean the definition that I read out of,
like interpreted from all of their tweets, is like taste
is basically what is profitable like to them, like, having
good taste means knowing how to choose the right thing
that makes money. It means knowing how to attract more
customers to use your like AI dating coach. It's maybe
(30:22):
the choice of like a logo or an Instagram ad
like that's on the more literally tasteful end of it.
But I took issue with this like mechanistic or
utilitarian use of the word when really, like, to me,
taste is about enjoying art and culture as a human
with feelings.
Speaker 4 (30:39):
Well, I think it's so funny because do you guys
remember when Joyce Carol Oates kind of like dunked on
Elon Musk and she was just like, you don't enjoy
like art and literature. Like there was all this discussion
of like what does Elon Musk do? And then Elon
Musk sort of attempts to clap back and he starts
replying to these like movie review accounts and then he
tweets, like, this is a great, or, like, Homer's Iliad
(31:01):
is a great book, and then he like linked to
like some other book that was like the Odyssey or something,
and it just like to me, like when I think
of tasteless tech people, I think of Elon Musk and
like this desperation to like be seen as like culturally
kind of I guess insightful. And yeah, they none of
(31:21):
these people have any taste. They're making slop, but.
Speaker 3 (31:23):
They're also living so far in the future,
right that they're imagining a world where like the robots
are doing everything, and it's like, well, what like there's
this like existential crisis. It's like what am I at now?
Like I used to write the code and I don't
do that. I don't do any you know, I guess
I have taste. I mean I think that's.
Speaker 5 (31:43):
Telling the robot what to do. Like, now,
now I'm exerting myself, or expressing myself, by bossing it around,
exactly.
Speaker 1 (31:52):
Which is a bit depressing.
Speaker 5 (31:54):
And there was another quite viral Marc Andreessen moment like
in the last few days, I think, where he just
says that he has no introspection. It's just like there's
nothing going on here up here in the head. I
have no thoughts, I have no sense of self. And
that was I mean, that was just in a nutshell,
(32:14):
like how can you have taste if you have no
thoughts about anything? If like you're only moving forward.
Speaker 2 (32:22):
I want to ask you both, Read
and Taylor, about this, as West Coasters. There
was a story in Bloomberg last year that I just
absolutely adored about how all the tech bros now want
to build these, like, extremely tall and ugly monuments, too.
And I'm just wondering, like, what are you seeing on
the, like, are you seeing the physical architecture, I mean
(32:43):
East Wing style, be remodeled in this era
of a new taste.
Speaker 3 (32:48):
Oh, no one's building anything out here. So I don't think so.
Speaker 4 (32:51):
They said it's time to build, but there's no.
Speaker 1 (32:57):
Yeah.
Speaker 4 (32:58):
I think it's interesting kind of how they seek to
exert their influence on the physical world. I mean, Kyle
has written so much smart sort of stuff about this,
but it is you know, I and I think of
Kyle's work a lot when I think of like this,
you know, the sort of liminal, sterile spaces that were
very, sort of, associated with the tech
world in the twenty tens. I do wonder what kind
(33:21):
of tech-dominant aesthetic is emerging in the
twenty twenties. Like, is it the slop, is it the
like what what is kind of like the dominant like
tech aesthetic. I'm curious, Kyle, like, what you're
seeing. Have you been.
Speaker 3 (33:36):
To Cursor's office.
Speaker 1 (33:37):
No.
Speaker 3 (33:38):
There it's like very, like, warm, like, living room chic.
You come in and like there's just a pile of shoes,
like hundreds of shoes because everyone just goes barefoot, and
it's like you just feel like you're in this like
it's like everything's wood and warm, and you're just like, oh,
I'm like I'm not at work, I'm like in my
(33:59):
like cozy living room, which is different from the Google
thing, you know, like Google was very bright, like
bouncy balls, sit at your desk, and, you know,
it was just like this fun playground atmosphere. Now
it's like I don't know, I don't even know if
that's the question you're asking, but it just popped into
my head, like you should, you should visit Cursor's office.
(34:19):
I kept my shoes on. I was told
I could leave my shoes on, but I wasn't sure
if I was being judged. Actually, I was definitely being judged.
Speaker 5 (34:26):
But this is like the shaman cult vibes, I think,
like, what goes with the quest to invent AI
Speaker 1 (34:32):
God?
Speaker 5 (34:33):
You have to like have your cult in like a
shag carpeted room, everyone's wearing robes or whatever. You take
your shoes off, you ensconce yourself surrounded by AI agents
and you, you know, transcend reality.
Speaker 2 (34:48):
Kyle, reading your story, I got to thinking of
a quote I last thought about when I was studying
literature at university fifteen years ago, from William Wordsworth, which was,
every great and original writer, in proportion as he is
great and original, must himself create the taste by which
he is to be relished.
Speaker 5 (35:08):
What a great quote.
Speaker 1 (35:09):
What a great quote.
Speaker 5 (35:10):
I mean, I need to write that one down, or put it
on the wall of my startup office. Yeah, Like it's
it's the job of an artist or a creator or
whatever to create a new paradigm, right, to create a
new sensibility and project it into the world and like
teach an audience in some ways to appreciate it or
(35:31):
understand what they're trying to do.
Speaker 1 (35:33):
But if you if you.
Speaker 5 (35:34):
Put that task onto an AI startup, like, I'm not
really resonating with what you want me to do here,
because what you want me to do is like wear
a pendant on my neck that insults me via text
message or, you know, tells me to do things I
don't, like, want to do. There's that one startup that's
like AI can rent a human to do labor for it.
(35:57):
Like that that seems to be the taste they have
in mind. So I guess I just don't like, I
don't so far enjoy the sensibility that they're projecting, and
I wish they could come up with something a little better.
Speaker 3 (36:09):
There was an interesting, uh, well, sorry Taylor, there's
just this other thing that happened with Nvidia.
Did you see this DLSS 5 meme
happening online? They have this new thing where they use
generative AI to, like, upscale the graphics of video games,
and so they showed, like, before and after pictures of
the characters, and people, like, really took issue with it
(36:31):
because they're like, well, that's not the artist. I mean,
it wasn't really explained, like, it's still like a Facetune. Yeah,
there were these hilarious memes about it, like it was
like Jensen Huang before and after, and he was like
a busty woman after, you know, and it's like, you know,
it gets to this question though of I mean even
(36:53):
just the announcement of it. It's like there wasn't a
thought of, like, are people going to be
confused by this and think that you're
actually, like, changing the artist's vision here, and
you know, and the reaction to it and it is
you know, I mean, it's not exactly what you're talking about,
but it just popped into my head.
Speaker 4 (37:10):
But it is kind of similar because it's like how
is it altering art? And how is it altering kind
of like the media that we've consumed. There was also
a viral controversy over the Pretty Little Liars book, which
I guess was updated on Amazon Kindle. Like, it was like
the digital versions of it were updated to include modern
references to things like TikTok and stuff, which really kind
(37:33):
of like changes things in the story a little bit.
And so people were tweeting screenshots like what the heck,
Like why does a book need like an update, or
like why do we have to kind of like alter
our media to like make it conform with today's like
visual standards or like you know, cultural standards. And I
think there's a lot of rejection of that, whereas like
a lot of these tech people are like, wait, but look,
(37:55):
we have this shiny new version, you know, and it
can keep up with the times. And it's like, but
we appreciate that it didn't keep up.
Speaker 3 (38:00):
With the times.
Speaker 5 (38:01):
Yes, this is like the taste of AI. Like, all culture is
fan fiction, everything should conform to my personal preferences. Everything
should be like glossy and upscaled, like the video game filter.
It's just kind of like optimized, I don't know, optimized sheen.
That isn't very tasteful. Like if you go back to
the Wordsworth quote, like, that's timeless. It was from a
(38:24):
long time ago. We don't need to update it with
like rizz or whatever. Wordsworth doesn't need to say
six seven or something.
Speaker 2 (38:35):
But I do think, Kyle, you put your finger on something
that's really interesting to me, which is these tech overlords
imagining themselves into a future where their creations have been
so successful there is nothing left for them to do,
and the idea of taste as this kind of last
bastion of like what makes us human or gives us value?
I mean, thinking about the hipster movement earlier, it was
(38:56):
kind of like a I don't want to summarize a
hipster movement, but in a sense it was like, well,
we may not play by the rules of like you know,
finance culture, but like we have our thing which is special,
and we have our taste. And so the idea that
that tech bros may on the other side of their
great achievement join the hipsters of yesteryear in feeling slightly
alienated by the world in which they live is kind
(39:17):
of delightful.
Speaker 5 (39:18):
Oh, I think that's so accurate, Like what else is
left but to take up carpentry and like make a
ceramic mug for yourself and like tinker with your espresso machine.
I think that this is extremely pretentious, probably, but I've
been reading a book of Basho poems, the, like, Japanese
haiku poet from the seventeenth century, and like all he
(39:40):
was doing was wandering around observing nature and writing very
short poems and hanging out with his friends. And I
feel like that's all these tech bros actually want to
do in the end, and they could just do that,
Like they could skip all of this and just go
live in a cabin.
Speaker 4 (39:56):
I don't think they have the, like, introspection
to do that. Like, I think.
Speaker 3 (40:01):
They, like. So that's what one of the Anthropic people
said when, you know, when you leave an
AI company, you have to write a whole diatribe about
why you left. And he was like, I'm gonna go
write poetry. That's what he decided to do. So, you
know, they are doing that. Nice.
Speaker 4 (40:16):
Yeah, let's see how that poetry turns out.
Speaker 3 (40:20):
Shelley Banjo wrote this great column last week, Who's who's
my editor?
Speaker 4 (40:24):
Now?
Speaker 3 (40:25):
Who's now at Semafor. And she was like, she
asked She talked about asking this friend of hers who
has great taste in books, he reads a lot of books,
like what business books should I be reading right now?
And he just geminied it and sent her the Gemini
list and it was like the number one recommendation was
like The World Is Flat by Tom Friedman. She's like,
what are you talking about?
Speaker 4 (40:46):
Like?
Speaker 3 (40:46):
This is, like, horrible. So, you know, I think you're
gonna have to, that's going to be the
new thing. It's gonna be like, I have book recommendations
that were not generated by AI. You know, that's,
it's real taste.
Speaker 1 (40:59):
That's all we have time for today.
Speaker 2 (41:01):
But I want to end our discussion with a simple question.
Taylor had this one last week, but for Kyle
and Read it's fresh. Who had the best week in tech?
And who had the worst?
Speaker 1 (41:10):
We'll start with you, Taylor. No, let's start with Kyle.
Speaker 5 (41:16):
So I think the best week in tech was Nintendo
with Pokopia, the, like, new Pokemon slash Animal Crossing slash
Minecraft game, and it seems to have just been like
a huge hit, even though people were cynical about it
and kind of thought it was a you know, hack job,
sellout move. It's delightful. Everyone loves Pokemon and hanging out
(41:38):
and building little gardens for their Pokemon to hang out in.
So I think Nintendo had.
Speaker 3 (41:43):
A great week.
Speaker 1 (41:44):
Who had a bad one?
Speaker 5 (41:45):
I mean, I was gonna say Sam Altman getting called
a Nazi, as it came up before. That's pretty,
pretty bad. That is pretty bad. Read?
Speaker 3 (41:56):
I mean, I don't know. I think, probably
just because it's, like, fresh in my mind and I
was talking about it, but I did, I did, I
do think that Nvidia had kind of a good, a
good week this week, even though their stock, I mean,
I'm not looking at their stock price.
But maybe it's just me crystallizing in my head, like,
where they're at in terms of their grand ambitions.
(42:19):
But I don't know. I mean, I think it just
continues to be, like, this Anthropic thing with the Pentagon.
You know, Anthropic has had this big fight with the
Pentagon over exactly how their AI models should be used
in warfare and also in surveillance of Americans, and the
Pentagon is saying, you know, we don't want any restrictions
(42:41):
on these models, and in fact, now we're designating you
a supply chain risk. I just can't imagine that this
is like great for for Anthropic.
Speaker 2 (42:49):
I mean, that's very interesting because last week's takeaway was
that Anthropic had the best week because
of all the user growth and love that it had
gotten.
Speaker 1 (42:57):
From their stance on this.
Speaker 3 (42:59):
I know, I know, and it's like, okay,
you can look at that and you're like, yeah, I
mean, sure, it could work out for them
really well, right, like this sort of, like, Apple
versus the FBI thing that sort of
helped, like, solidify their privacy-centric marketing. But I
just think in the end, they're not
(43:21):
a consumer company really, so, like,
their app store numbers don't really ultimately matter that much.
It's like, they have to win in enterprise, and I
think, like, ultimately, in the long run,
I don't think companies
are going to view this as, like, you know, a
big bonus. It's just a draw. The
(43:43):
more this fight gets, like, drawn out, I
just have a feeling it's not that great for them.
Like, their competitors. Like, Google has somehow totally avoided
this controversy, which I think is, like, fascinating, and in
a lot of ways, they're, like, the real one,
the one everybody should be competing with, and, like, OpenAI is kind
of getting dragged into it too. So I don't know,
maybe Google had the best week actually I don't know,
(44:04):
just by not being like no one's talking about them
in this context.
Speaker 4 (44:09):
I know, when you're a tech company, sometimes you could
just have the best week by no one speaking about you.
Speaker 3 (44:14):
Right, right, It's true, It's true.
Speaker 4 (44:16):
I think Jensen, I'm gonna kind of, like, piggyback.
I mean, I do think Jensen Huang had a good week.
I think, like, I mean, correct me if I'm wrong,
but I didn't see, like, a huge amount of backlash
considering, like, kind of the stuff that he was talking about.
I mean, maybe there was, like, a little drama over
at Nvidia, but also, like, he is ascending, to use
the vernacular term, you know, into, like, CEO status. Like,
(44:38):
I mean, even at the Nvidia, you know, conference, I
think they were selling, like, sweaters with Jensen Huang's, like,
face on it. So I think he's kind of, like,
you know, when you look at the clout index, maybe,
like, his is going up and Sam Altman's is going down.
I would say Sam Altman probably had the worst week. Also,
Uber's former head of self driving wrote this great piece
(45:00):
about actually how his self driving Tesla crashed and taught
him a lot about, like, AI risks, and I thought,
I don't know if that's a bad week, but I thought, like,
you know, that was a good lesson for that man
to learn, like.
Speaker 5 (45:14):
Your email inbox getting deleted, like, you learned something.
Speaker 1 (45:19):
People need to learn.
Speaker 4 (45:20):
These lessons for themselves.
Speaker 1 (45:21):
There you go, forced reflection. Thank you so much. That
was really fun. That was fun.
Speaker 4 (45:25):
Thank you.
Speaker 1 (45:51):
That's it for Tech Stuff.
Speaker 2 (45:52):
I'm Oz Woloshyn. This episode was produced by Eliza Dennis
and Melissa Slaughter. It was executive produced by me, Julian
Nutter, and Kate Osborne for Kaleidoscope and Katrina Norvell for iHeart Podcasts.
The engineer is Charles de Montebello for CDM Studios. Jack
Insley mixed this episode, and Kyle Murdoch wrote our theme song.
A special thank you to Reed Albergotti, Kyle Chayka, and
(46:13):
Taylor Lorenz. Please check out all the work they put
out into the world. We're very lucky to call them
friends of the pod and please do rate and review
this show wherever you listen to your podcasts.