Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Welcome to Tech Stuff. I'm Oz Woloshyn. I'm thrilled to
announce that the Weekend Tech is back and it's growing.
Instead of Karen and me recounting the essential
news, today and every Friday from now on, I'm going
to be joined by three of the best writers covering
Silicon Valley. And if you've been listening to the last
few Fridays, today's guests will be familiar names. Reed Albergotti,
(00:37):
technology editor for Semafor and a resident of the Bay Area,
always in the thick of it.
Speaker 2 (00:42):
Reed, welcome back. Thanks for having me.
Speaker 1 (00:43):
Kyle Chayka writes the Infinite Scroll newsletter for The New Yorker.
Kyle's a resident of Washington, DC, but hopefully not a
victim of the disastrous Polymarket pop-up that occurred there
last weekend. Or Kyle, were you left in the dark?
And Nitasha Tiku of the Washington Post. She haunts
the halls and perhaps sometimes even the consciences of the
(01:05):
big tech companies in Silicon Valley. Welcome back. Hey everyone, you've
all joined us before, so the format will come as
no surprise. But remember this is a discussion, so please
butt in. We have no fear of crosstalk, and you
three already have the inside scoop. We love a humble
brag and even a not so humble brag, So don't
be shy. Let's get into it. Nitasha, we'll start with you.
(01:28):
There was big news out of a Los Angeles courtroom earlier
this week. What was the trial about, and what surprised
you most about the verdict?
Speaker 3 (01:36):
The trial was a bellwether case looking
at social media addiction, and the last two companies remaining,
a couple of them settled, were YouTube and Meta, and
the jury found that Meta and YouTube were liable for
product liability. Basically, it's this novel argument that this set
(01:58):
of lawyers has brought, kind of using the same
argument that they used for Big Tobacco. You know,
it's a way to get around Section two thirty,
which is a shield that tech companies use: you know,
they're not liable for user-generated content. So the jury
was instructed not to look at any of the content
(02:19):
in particular, but think about, you know, the product decisions
that Meta and YouTube had made: infinite scroll, notifications, beauty filters.
Speaker 1 (02:28):
And what made the case particularly interesting
in this respect, I think, was the discovery of the internal communications
within the companies, right? That was kind of a key
thing for the jury.
Speaker 3 (02:36):
Yeah, it was really damning, because, you know,
Meta had data showing that, I think, thirty percent
of ten-to-eleven-year-olds in the
US are on Instagram, and they knew that even though
they have an age limit of thirteen. That is, I
think, one of the stats that, when Mark Zuckerberg was testifying,
(02:59):
you know, there was a like a lot of tension
between him and the lawyer who was questioning him over
this kind of stuff. You know, he kept kind of
pushing back and saying, like you're misrepresenting this, like you know,
this wasn't intentional. He said, you know, the idea of
banning beauty filters, which was something that came up again
and again because it was related to the plaintiff in
(03:20):
the case. He said it would have been paternalistic to
do something like that, and they just stopped, like, automatically
offering them. I mean, to me, it was just, like,
I remember when we first started talking about these features
as addictive. Like, I was looking back, I was working
at Wired at the time. Like, there was, like, a
Wired Guide to Internet Addiction, twenty eighteen, right? Like, it's
(03:43):
almost a decade later, talking about some of these same
features, and it feels like the world
has moved on, right? Like, some of the
same people involved in this fight about dark patterns and
addictive social media, they're now obsessed with AI. They are
on Capitol Hill lobbying about AI. And here we are,
(04:05):
you know, looking at...
Speaker 1 (04:07):
Litigating yesterday's news, in some respects.
Speaker 3 (04:09):
Like last decade's news, you know, in a way. I
mean, there's billions of people, and young people, still
on social media. It's just, it's hard not to feel
the paltriness of the verdict in a way,
you know, because what was on trial is
whether or not they, you know, inflicted personal harm on
(04:32):
the plaintiff. She suffered from body dysmorphia, suicidal ideation, self-harm,
and Meta was arguing that, you know, obviously
addiction and mental health issues are multifactor
things. They talked about abuse she faced as a child.
They looked at her therapy on the stand, right? Yeah,
(04:56):
I mean, just the idea of Meta bringing in your
therapy sessions is, like, you know, it just feels, like
I said, kind of inadequate to the task. Like,
the jury wasn't voting on, hey, did Meta kind of
mislead the American people about the age limits on their app?
And, you know, did they know that these features and
product designs could be addictive? That's, like, not the verdict.
(05:17):
You know, the verdict is, like, around addiction, which is
just so much harder to...
Speaker 1 (05:24):
I think it.
Speaker 3 (05:24):
You know, that's part of the reason why you see
even some critics and consumer advocates not happy with
what the jury ultimately decided.
Speaker 1 (05:36):
And so Reed, you were saying before we started taping,
sorry to cut you off, that you were very interested
in Nitasha's take, and Nitasha actually used the word nihilistic.
Speaker 2 (05:43):
So I would have.
Speaker 1 (05:44):
Thought this would be like kind of a course for celebration.
But I mean, read, what was your question.
Speaker 4 (05:48):
I was going to ask if you think this
will stand up on appeal, you know, and does this
ultimately go to the Supreme Court?
Speaker 2 (05:55):
What happens?
Speaker 3 (05:57):
I mean, I don't know, I'm not enough of a
legal expert to say, but in terms of, like,
looking at the effect on Meta stock, wondering, you know, what's
going on in OpenAI or any of the Chinese
apps today based on the verdict, I would say almost nothing.
Like, I just was looking back at my coworker, Naomi
(06:17):
Nix had this great piece with some internal documents from
Meta, and it was Adam Mosseri, the head of Instagram,
talking about how they needed to focus on teens. That
was, like, their number one business priority heading
into twenty twenty-four. This document he sent a
few weeks after these cases were initially filed. So, like,
what is going on internally inside these other companies? Like,
(06:41):
I think the best hope is that, you know, maybe
it makes OpenAI a little bit more cautious. I mean,
you know, there are sixteen hundred other cases. This is a
bellwether case; you know, this was just one plaintiff.
They're trying to get a sense of how juries will
respond to this. Snap had settled. There's lots of permutations
(07:04):
this could go. But in terms of, like, genuinely changing
the product decisions and practices and the mentality behind these
companies, whose financial incentive is growth, right? Like,
part of what they were saying is, we need to
get them as tweens in order to keep them as teenagers.
This is what the internal documents at Meta, at Instagram,
are saying.
Speaker 1 (07:23):
But Nitasha, you used the word bellwether twice. Bellwether suggests
this is meaningful.
Speaker 3 (07:28):
This is meaningful. It's a bellwether for the other cases.
You know, will the plaintiffs get some kind of
day in court? Will they get justice? Will they get
some monetary restitution? What I am talking about is: will decisions
at these companies, probably multi-trillion by the time they feel the impact
(07:49):
of this, multi-trillion-dollar companies, be made so that
everyone can benefit from a sense of accountability for their decisions?
Speaker 4 (08:00):
Well, maybe I can be the nihilist, because I think,
you know, if you look at it...
Speaker 1 (08:06):
I want to be the anti-nihilist. I mean, I've
not been following it nearly as closely as any of you,
but I've been following this stuff for ten years. And
the idea that courts would accept an argument that goes
around Section two thirty, holds the technology companies liable for
the algorithms they create and the addictive behaviors those algorithms
promote, and provides monetary relief, even though it's peanuts, I think,
(08:27):
six billion dollars. But nonetheless, I mean, how did it
start with smoking? How did it start with Big Tobacco?
I mean, I don't know if there
was one huge case that changed the whole industry,
but, you know, there was a kind of
groundswell of bellwether cases like this one that started
to promote real change.
Speaker 4 (08:42):
I think the nihilist view is, like, Big Tobacco still
went on. You know, they're still around today,
and they've gone into other areas like food, you know,
packaged food. And basically a bunch
of class-action attorneys got rich. And, you know, if
people stopped smoking, it wasn't because of those cases.
It was because people were educated on the harms of
(09:04):
cigarettes, and society changed, right?
Speaker 2 (09:06):
I mean, well, that's.
Speaker 3 (09:07):
Because they were forced to change the marketing. I think
part of what makes me, you know, part of my
interpretation of the verdict is also the fact that on
the same day Trump named Mark Zuckerberg to his like
Science Council. And you know, I you know, I've been
following these companies also very closely for a long time,
and I know that, you know, bad press is something
(09:28):
that really motivates them. And we're living in a time
where it doesn't.
Speaker 1 (09:33):
Matter so much anymore. And in fact, if you can
go on TBPN or on All-In, you know, whatever
the mainstream media say about you is probably way less
relevant than it was under the Biden administration.
Speaker 5 (09:43):
Suddenly, but it feels like the past, Like it does
feel like legislating ten years ago, when the platforms are
already moving on and we're experiencing new problems.
Speaker 1 (09:51):
But I'm looking thirstily at you
to give us an optimistic take, because presumably, when you
were writing your book, Infinite Scroll, this was the type
of lawsuit and outcome that you were, in some sense,
hoping might happen.
Speaker 5 (10:04):
I mean, I think it's nice to see the mechanics
of a platform separated from the content in some ways.
Like, we should be directly looking at the mechanisms of
a feed and the algorithms and the recommendations and how
you get bombarded with content as a user. Like,
Section two thirty is supposed to protect platforms from being responsible
(10:25):
for user-generated content,
but there are so many borderline editorial
decisions, I think, that digital platforms make that shape how
we consume things and what we consume and how we
are addicted to our platforms and feeds. So, like, that
gives me some optimism, even though the six-million-dollar
(10:47):
verdict is so minuscule as to seem like a joke, unfortunately.
Speaker 2 (10:51):
I think.
Speaker 4 (10:52):
I mean, coincidentally, you know, cigarettes are coming back now, right?
Like, they're making a comeback, and, you know,
maybe it's not a coincidence, honestly. No, I'm talking about
actual cigarettes, the kind that you light on fire and
inhale the smoke. It's a new
trend, right? It's cool again, especially with
(11:13):
young people.
Speaker 1 (11:14):
Celebs are lighting up all over the place.
Speaker 2 (11:17):
Back in Hollywood.
Speaker 4 (11:18):
I mean, I just think there
are vices that people have, and clearly
social media is, like, a vice. You know,
people sit there, we've all done it, like, you're
just scrolling and you're going, what am I doing right now?
This is horrible? And you just keep doing it, and
it's like, you know, we all know it's bad. And
I think society is, like, building up the
scar tissue, or, like, you know, the defenses for it
(11:40):
right now, and that's totally separate from these legal cases
that are happening.
Speaker 5 (11:45):
What if posting is the vice?
Speaker 2 (11:47):
Though?
Speaker 5 (11:48):
Sorry. If posting were the vice, if we legislated against posting,
then no one could consume, and it would be all good.
Speaker 4 (11:55):
Hey, if you want to make posting illegal, I would
be the happiest person on earth, because I hate posting.
Speaker 2 (12:02):
A naga.
Speaker 3 (12:03):
I mean, posting is the vice because of intermittent variable rewards, right?
Like, they know that, you know, you'll just keep
pulling down to refresh if you think that you're
going to get some kind of feedback. But I mean,
I think what Kyle's bringing up about the focus on
product design is actually why I
(12:24):
feel like such a sense of optimism, that if people
understood the mechanics of how these social networks work,
what the designers know, how it functions...
Like, I remember when I learned the term intermittent variable reward.
Has it stopped me?
Speaker 2 (12:40):
No?
Speaker 3 (12:40):
Have I continued to waste like years of my life
on these networks?
Speaker 2 (12:44):
Yes?
Speaker 3 (12:45):
But yeah, but I know what's happening to me, you know.
Speaker 1 (12:47):
And to me, there's something different about a lawsuit, though,
where you get discovery of tens of thousands of pages
of documents where the executives are saying to each other,
we know this is causing harm, essentially, let's continue to
lean in, versus the odd, like, Cassandra who does a
Netflix doc or, like, the open essay from a, like,
former employee. Like, I think that these lawsuits in the
(13:09):
aggregate will have an effect. I mean, there was another
one in New Mexico that Meta also lost, relating to
not protecting children from sexual predation on the platform. That
was a much bigger settlement, I think. So I'm going
to push back a little bit against
the tide of these...
Speaker 4 (13:23):
Well, what about Australia? They banned it, right? I mean,
they just banned it under sixteen. You can't use it.
Speaker 5 (13:30):
I mean, like more is happening elsewhere.
Speaker 3 (13:33):
I mean, this is what I'm thinking about. You
are inside the, like, you know, product marketing design
team at ChatGPT. Like, what is this
verdict doing to you? Are you just not putting it
in email? Are you not putting it in writing? Or
are you actually saying... or are you just not
(13:53):
studying addiction.
Speaker 4 (13:54):
You're probably discussing it. You're probably discussing it with ChatGPT,
and I cannot wait for the discovery on that.
Speaker 2 (14:01):
That's going to be unbelievable.
Speaker 3 (14:03):
I mean, I think that the... sorry, the extent
of the internal documents that came to public light. I
mean, this is, like, you know, you're right, Oz.
Like, if it was twenty seventeen and we got
access to this, it would have felt like vindication, partly
because it is really hard on an individual level to
(14:23):
stand up to some of these dark patterns. I think,
you know, we feel that keenly, probably in the way
that we spend our time.
Speaker 4 (14:33):
Well, Nitasha, you and I spent, like, all weekend, that
one weekend, looking through all those Facebook documents, right?
Do you remember that? Like, the whole
Frances Haugen leak.
Speaker 3 (14:43):
I was actually on mat leave.
Speaker 2 (14:44):
But oh yeah, this is just as wild.
Speaker 1 (14:49):
So can I just ask you, before we move on from
this story: there were two quite cinematic, dare I
say it, scenes in this lawsuit. One involved a
jar full of M&M's, and one involved a pair of
Ray-Bans.
Speaker 3 (15:01):
So, the jar full of M&M's. This was the
plaintiff's lawyer that I mentioned, Mark Lanier, I'm not sure
if I'm pronouncing his name right. So he takes out
a jar of M&M's and he's like, this represents revenue.
And he takes out a handful of M&M's and he's
just like, you know, they won't even feel it. This,
like, won't even make a difference. And then at some
(15:22):
point he also, like, chips a little
bit of the hard candy shell off, and
he's like, two hundred million, they won't even know this
is happening. But there was an interview, I think in
the New York Times, with a couple of the jurors
as they were leaving, and they said they especially did
not want to do what the lawyer seemed to be
(15:42):
hinting towards, which is, like, give them a massive settlement,
or sorry, a massive payment, because they said they wanted
to focus a little bit more on the issues, you know,
thinking about some of these product decisions and the impact
on users.
Speaker 1 (15:57):
Interesting. And the Ray-Bans?
Speaker 3 (16:00):
Oh, the Ray-Bans, yeah. Oz, what did you
think about that?
Speaker 1 (16:05):
Well, I heard that some of the Facebook execs turned
up to court wearing the Meta Ray-Bans, recording the
proceedings, and had to be told off. Again, to your
point about, like, whether or not the tech executives care
about making the, you know, judicial system feel good
in the way they may have done a few years ago,
like when Mark put his suit on, it feels like
(16:26):
we're in a different era.
Speaker 2 (16:28):
Wait, exactly. Like, I was a court reporter.
Speaker 4 (16:30):
I used to have to hand in... well, actually, I
was one of the only people who didn't have to
hand in their cell phone in the Southern District. But
I always thought, like, why can't people know
what's happening inside the courts? Like, what's the big deal?
Speaker 3 (16:42):
I wish I had an M&M's visual, you know. But
the same lawyer also asked Mark if
he'd gone over the plaintiff's account, and then proceeded to
pull out, like, a thirty-five-foot wall of her
selfies and asked him to look at it.
So he obviously has a flair for the dramatic, which, yeah,
(17:04):
only the Meta Ray-Bans caught, I guess.
Speaker 1 (17:08):
Okay, moving right along. OpenAI made an announcement
this week that I found quite surprising: no more
Mickey Mouse AI videos. What's going on here?
Speaker 2 (17:19):
Yeah?
Speaker 4 (17:19):
I mean, they got rid of Sora, their
video generation tool. And what I thought was so
interesting about that, it's funny, because it kind of ties
in with what Nitasha was talking about, with
what OpenAI is going to do now with social media.
There was sort of a social media app that was
part of it. But what I thought was so interesting
was, like, the real reason is that they just
(17:41):
can't. They don't have enough tokens to go around, right?
And they're seeing this huge, profitable, you know,
boom right now, like with OpenClaw, which we talked
about last week, you know, just causing people to use
way, way more compute. And they're kind of seeing where
this is heading, and they have to make tough decisions
about where they want to allocate their compute, and this
was not a profitable area. Like, it's just, you know,
(18:04):
they had this deal with Disney, people, you know,
people liked making these videos, but it just took up
too much compute power for what it actually produced
in profit. And so for me, it
was sort of another bellwether about where the industry
is headed with AI. Like, it's funny, I think
a few months ago it feels
like we were having a debate about whether
(18:26):
this was a bubble and the whole thing was going
to crash down. And now it's like there is so
much demand, and it just keeps going up and up
and up, and I just find it incredible
what's happening.
Speaker 5 (18:37):
It's just not demand for personalized slop videos, it's demand
for doing your coding for you and, like, organizing your email.
Speaker 2 (18:46):
Yeah, maybe that's a good thing.
Speaker 4 (18:47):
It's not, like, as harmful as social media when it's
like you're just trying to get your AI agent
to check your...
Speaker 2 (18:53):
Email for you.
Speaker 4 (18:54):
But, you know, obviously there's
going to be slop. Like, this is not going to
go away. But it's just, like, there's a huge battle
right now between these companies over who has the most compute.
I think, like, I remember Dario Amodei...
Speaker 1 (19:11):
Mean, this is literaltru but this is literally like they
don't have enough data sent to Alas to do sora
is that is that what this comes down to when
you say there's not enough tokens.
Speaker 4 (19:20):
I think they don't have enough data center hours to
just do everything they want to do, and they have
to make decisions, and it's like, how much are we
going to put into training? It's a
zero-sum game within the companies, because they
all only have so much compute. And yeah,
it's also a zero-sum game, right? I mean,
we could talk about that too, but it's also like
(19:41):
a zero-sum game between the companies. Like, whoever
locks up the compute today, that means, like,
that compute is not available to their competitors tomorrow. And
there's only so much that can be made,
like, there are only so many extreme
ultraviolet lithography machines that can be made in a year. So
to me, it's almost
(20:03):
like a wartime economy. Like, we're just gonna have to
start rationing things.
Speaker 2 (20:07):
And I find it.
Speaker 4 (20:08):
It's, like, incredible, and I don't think there's enough
attention in society on that part, the just
sheer economic issue that this is all bringing
up. Nitasha?
Speaker 3 (20:18):
Okay, can I ask a question,
Speaker 1 (20:20):
Though?
Speaker 3 (20:20):
So, I was just, like, scrolling on Sora
this morning, which I haven't looked at since it came out,
and this is what I'm curious about. Like,
OpenAI has a foot in absolutely everything, right?
And obviously, you know, even when it came out,
when Sora was very popular, they could barely afford it.
(20:42):
It sounded like, you know, to keep Sora going. But, like,
is the demand for Sora in particular so high?
Because I think it, like, totally fell off. I was wondering
how they even got it to the top of the
app store when it came out, because, like, this is
a company that is so obsessed with Twitter and has
not yet, I think, figured out how to market to
(21:06):
people who are on other social networks. Like, if you look
at, you know, the main characters that they let you play
around with: like, one of the Sora creators, Bill Peebles, right,
like Sam, this guy Gabriel from OpenAI.
And I was looking, and they haven't posted for four months,
you know. So I'm not saying that, like, video generation
isn't valuable. I think, like, Iran has shown us how valuable
(21:26):
and coveted the, you know, the fruit narratives, the
fruit sex narratives are, if we're talking about what's
happening this week. But Sora itself, I just don't think
it was, like, particularly great,
and I don't think that the dynamics of the app
were very compelling. So it might sound very good for
them to say, like, we're buckling down, but basically they
(21:49):
had no discipline. They were trying to do everything. Now
Fiji's come in, right, and said, like, hey, we're losing,
you know, we're not competitive on the things that people
really are obsessed with, ChatGPT, coding, and so let's reallocate
our compute. But, like, I would like a corrective
on the "Sora is very popular" narrative.
Speaker 1 (22:08):
I want to come to you, Kyle, and I want
you to put your cultural historian hat on and take
us all the way back to October twenty twenty-five,
the mists of time, when Sora was the number one
app in the App Store. Talk about Sora from the
point of view of a cultural observer of technology.
Speaker 5 (22:26):
I mean, I think it attempted to be
a social network of generative AI. Like, the point was
that you would make your own videos with AI. Its
major mechanic was that you could transform yourself into an
AI doll, essentially, and have your friends reanimate you in
crazy different situations. The example was Sam Altman: he was one
(22:49):
of these digital puppets, and many people put him in
videos trying to steal chips from, like, a Target or
getting arrested. And it's like, that is a
horrifying use case. Like, I'm not sure who actually wanted
to be this digital puppet, and I think they were
betting too hard on this being a form of viral content.
(23:11):
Like, we still kind of haven't figured out, I mean,
maybe it's the slopaganda Iran war videos, but we haven't
figured out, like, what is the unit of AI content
that is the most successful. So Sora was, like, an
attempt to figure that out. But I think what it
ended up proving was that no one wants the feed
of only slop. Like, now we see it dispersed into
(23:33):
our Twitter feeds or on Instagram or TikTok or whatever,
and that's kind of fun, but seeing it wall to
wall is, like, I don't know, eating only those M&M's
or something.
Speaker 4 (23:43):
I never thought that the social media part of it
was interesting, to be honest. Like, you see these, oh,
we rose to the top of the app store, like,
anybody can be at the top of the app store
for a day. It's like the New York Times bestseller list:
like, you can buy a box of books and
get yourself on the bestseller list for...
Speaker 1 (24:00):
That story is slightly more complicated, sadly, than...
Speaker 4 (24:05):
Like, don't pay attention to these stories. But, like,
you're totally right, both of you are totally right.
It's just that's not the most interesting part of
the story. Like, yeah, companies make failed social media attempts
every day, and, you know, it doesn't matter.
Speaker 1 (24:19):
But I want to push back on you again, though, because if
I look at last year, remember Scott Galloway saying, oh,
like, you know, the Studio Ghibli moment and Sora,
like, they're a total waste of time, they should be
focusing their resources on more, like, profitable things, which is
what you're saying, Reed, as well. Yeah. On the
other hand, outside of the, like, Silicon Valley
technology and New York finance bubble,
(24:41):
AI wasn't really... it was just an idea, right? And
then, with that Studio Ghibli moment in March last year
and with Sora, like, every regular person in the world
got to see the magic of AI themselves and, like,
experience making a Sam Altman puppet or, like, turning
their family photos into Pixar. That may not have been
ultimately very helpful to OpenAI's business, but in terms
(25:03):
of opening people's eyes to the way this technology can
create magic, maybe people will look back and say it
wasn't so great for OpenAI, but actually, from a
cultural point of view, it was quite impactful.
Speaker 2 (25:14):
Yeah, I hear that. It's a valid argument. I think I.
Speaker 3 (25:17):
Would say like the argument at the time was that
they discovered something because they were saying, hey, look we
understand people, because people want to see themselves in the videos,
and that's why you know this app that will allow
you to make a doll of yourself, Like that's why
we're winning and other people aren't. And I just think
it's very interesting that they did not understand the dynamics
(25:39):
of, like, why people use it. They built
a good tool, but people still want to share it
with their friends. And I think it's important to point
out when people who want to, like, change the, you know,
landscape of the Internet and upend the economy don't, like,
actually understand how people think and work.
Speaker 5 (26:00):
And there are other video models that are really successful.
Like, AI-generated video is huge, and it's a
huge medium, but it's just, like, not everyone needs to
be making AI clips or wants to be making these things.
And to Nitasha's point, you might not want to see
yourself in them. You want to see Donald Trump, like,
you want to see copyrighted or famous individuals who you
you want to see copyrighted or famous individuals who you
(26:22):
want to manipulate, like dolls, And that's much more problematic content,
like harder to defend legally. I would bet I don't.
Speaker 3 (26:30):
I mean, I don't know. Like, I saw someone say,
you know, oh, maybe in a week or two we'll
find out that, like, you know, something was on this network.
Like, usually the problem with a lot of this stuff
is CSAM, right? So that's, like, the absolute worst nightmare,
like, AI-generated CSAM that somebody's creating. I'm just
saying, like, they have this kind of monolith idea,
(26:53):
like, if you like generating Studio Ghibli images or videos,
then you must want, like Kyle was saying, like, this
feed where it's all of that. And people are really
thinking about this as a tool, right? On TikTok, people
are using it super creatively, you know, to come up
with different characters and narratives. I think people want to
incorporate this technology, you know, not, like,
(27:16):
live or die by it.
Speaker 1 (27:18):
Reed, I want to give the last word to you on
this story. You've written in your newsletter, quote, the
entire economy and global supply chain is being reoriented right
now around a single commodity, the token, and most people
are only vaguely aware it's happening or how it will
affect their lives. And I just want to ask
you to explain that. You've said a couple of
times we were kind of missing the main story by
(27:40):
talking about the social aspect of Sora. So what is
the big takeaway here for you?
Speaker 2 (27:45):
Yeah?
Speaker 4 (27:45):
I mean, first of all, like, I agree with everything
you're saying about this stuff. Like, I think their product
sense here is really off, right?
But then at the same time, like, I mean, yeah,
AI video creation is a popular thing that is going
to continue. It's going to change Hollywood. Like,
you know, there are lots of other companies doing this,
(28:06):
but I think the really important thing to
look at is, like, there's just only so many resources
to go around, and they have to make these tough
decisions and focus on what actually makes sense from a
product perspective, which brings me to this broader
point, like, the economy orienting around AI. It's like we're
we're seeing like massive shortages now of you know, like memory,
and of course there's questions about what's going to happen
with energy prices.
Speaker 1 (28:35):
I don't know, when you say memory, you don't mean
data centers, you mean literally, like, sticks of
memory that are more expensive than they were?
Speaker 4 (28:41):
Will sticks of memory, right, which go into everything right
from your cars to you know, your Nintendo switch. Like
it's just consumer product prices are growing up like way
beyond what you would expect with inflation, and I think
it's just going to continue, Like what's the next thing?
Speaker 2 (28:57):
Is it copper? Is it other stuff?
Speaker 4 (28:59):
So this is actually gonna have a real impact on,
like, what we're able to buy and how much things cost,
in a way that, like, these are the most
important things in society, right? Like, our pocketbooks, like, how
much money is in our bank account. Like, that's
the kind of stuff that really hurts. And I
think we're just totally ignoring that, and we're, you know,
(29:20):
we're looking at, like, I don't know, chatbots and whether
they make us sad. But the whole economy
is changing and orienting around this thing, and
I don't even know what the effects are, but I
know that there's gonna be a huge impact.
Speaker 1 (29:38):
We'll just take a short break now, but then we'll
hear from Kyle about a universal feeling: hatred of weather apps.
Speaker 2 (29:45):
Stay with us. I'm still so mad Apple bought Dark Sky.
This is what this is.
Speaker 1 (29:50):
The whole thing. Welcome back to Tech Stuff. Over the break,
all of our panelists were giggling about Kyle's story about
(30:12):
how we all hate weather apps. But Kyle, tell us
the story first, and then I want to hear Reed
and Natasha's takes.
Speaker 5 (30:18):
Once upon a time there was a weather app called
Dark Sky. It launched in twenty ten. It was amazing.
It gave you live radar, it predicted precipitation into the future,
it sent a push alert when it was going to
rain near you, and it was great. And then Apple
bought Dark Sky in twenty twenty, and then in twenty
(30:39):
twenty three, Apple shut down Dark Sky after integrating some
of its features into Apple Weather, a common story in
Silicon Valley of startup, you know, acquisition and destruction. But the
good news is that one of the co founders of
Dark Sky, this guy Adam Grossman, and another co founder
as well, have a new weather app that we can
(31:01):
all enjoy, which is called Acme Weather. And I wrote
this piece this week kind of mourning Dark Sky and
also discussing what does make a good weather app,
because at least for me, it's a source of constant frustration,
and particularly so many people around me have complained about
Apple Weather lately that I suspected something had just gotten
(31:21):
worse, like enshittification had struck the weather app genre.
So this was my investigation, and I have to say
Acme Weather is a huge improvement on Apple. Yeah, I
recommend it, I recommend it. I would say it's well worth
the twenty five.
Speaker 1 (31:38):
Dollars a year, which has Reed smiling.
Speaker 2 (31:41):
I'm a customer.
Speaker 1 (31:43):
I will buy it.
Speaker 4 (31:44):
I mean, I am now. Like, as of right now,
as soon as this is over, I'm paying
the twenty five dollars. Like, I loved Dark Sky. I
was so, so sad when Apple bought Dark Sky, and
this makes me very happy.
Speaker 1 (32:01):
Kyle, what made you choose to write this story this week?
Speaker 5 (32:03):
Well, the spring weather has been completely bizarre, especially in DC.
We've gone from like eighty five degrees and sunny to
hailstorms within the space of twenty four hours, and so
my weather apps were really failing. And I have this
nostalgia for Dark Sky, and so I wanted to
test out ACME Weather and I ended up really appreciating
(32:23):
the graphic design of it. It's very clear. It uses
a lot of text.
Speaker 1 (32:28):
They show different weather models, right, so you can
choose your own adventure of the weather rather than
a top down, ex cathedra pronouncement on what the
weather will be.
Speaker 5 (32:38):
Exactly. So my conclusion is that most weather apps
are too confident, like they're too sure in their predictions
and their numbers and their icons. And what I liked
about Acme is that it kind of admits to this uncertainty,
and it includes visualizations of different predictions and different weather
models right on that home screen, so you can kind
(32:59):
of tell if a prediction is not that certain, Like
it's not just going to tell you what's going to happen,
it'll give you this indication that I don't know, maybe
the afternoon's a little shaky, like don't totally trust me,
and it just, I don't know, it made me feel more
reassured as a user.
Speaker 1 (33:14):
Is there a betting function built into the app?
Speaker 5 (33:16):
Oh my god, betting on the weather. Don't say that
too loudly. This is like, no, no. And the nice
thing about Acme, as I came to understand through this work,
most weather apps just use the same data, like they're
all just accessing this government data and weather company predictions
and so they're reskinning the same stuff. But Acme actually
(33:39):
has its own machine learning based on geography. It's kind
of making its own predictions and customizing them. So I
think it actually is a technical improvement on other apps.
Speaker 4 (33:51):
Do you think, by the way, I mean, NOAA was
cut pretty badly during the whole DOGE thing. Do you
think that's actually making weather prediction worse?
Speaker 5 (34:00):
NOAA? Yeah, national, the National Oceanic and Atmospheric Administration?
Speaker 4 (34:08):
Basically the weather guys. That's where we got all
Speaker 2 (34:12):
The data from.
Speaker 1 (34:13):
Yeah.
Speaker 5 (34:13):
So in reporting this, I spoke to weather app founders
to see what their problems were, which is a remarkably
large set of people, and they all were talking about
NOAA being defunded, and also just
people's perception. Like, as Trump has been degrading science and
talking about defunding these institutions, the weather app founders were
(34:37):
saying that their users trust data less. There's like this
aura of incorrectness about it. They're more likely to be
mad at an incorrect weather prediction. So I think it
is like a governmental thing as well.
Speaker 1 (34:51):
Natasha, I'm curious for your take here, because
there's this weird thing happening right now where
there's a new boom, evidently, in weather prediction apps.
There are, you know, prediction markets going crazy with Kalshi and Polymarket.
It's no longer possible as of yesterday to get the
live wait times from JFK because they're too long. So
(35:14):
it's now totally unpredictable when you go to
the airport whether or not you will be able to
get on your flight. Like, do you have any
kind of wider thoughts about what's going on in the world?
And we also have technology like Oura and Whoop, and
there's this kind of intersection of monitoring, predicting, prescribing
with health apps and stuff. What's your
(35:36):
take on Kyle's story?
Speaker 3 (35:37):
Like monitoring the situation. I mean, just from this conversation,
I'm wondering, like how gendered is interest in monitoring the situation?
Like, I personally do not, I guess, have extremely strong feelings,
but I just, like, don't understand how, you know, there's
so many microclimates in the Bay Area. So like
(35:58):
for me, looking up Oakland versus, like, elsewhere,
it's just not helpful. And I think I'm just not...
I would just like to know, like.
Speaker 2 (36:08):
In California is what you're saying.
Speaker 3 (36:10):
Yeah, I live in California. But yeah, I mean, I
did think the NOAA stuff was extremely sad, and to
me it seemed like this is an amazing case
for machine learning, right, like for the technology that we have.
I don't know why you wouldn't apply like the powerful
prediction algorithms that we have to like deploy more sensors
(36:32):
and give people like even more robust data about what
to expect in their microclimates. So yeah, it just
it just felt like, why are we always trying to
use machine learning to like ape humans when there's so
many cases where machine learning could help us? And,
you know, maybe it could help me, like just figure
(36:53):
out what I'm supposed to wear on a weekend when
I'm going to three different cities in the East Bay.
Speaker 1 (36:59):
Oh no, we lost Natasha. Hopefully she'll be back.
But in the meantime, Kyle, do you agree
with Natasha? Is there something inherently bro about the situation room?
Speaker 5 (37:08):
Well, you know, come to think of it, I only
talked to male weather app founders.
Speaker 2 (37:13):
Is that true now?
Speaker 4 (37:14):
That the one that then to be honest here in
the situation room here, Well.
Speaker 5 (37:21):
You know, it speaks to DC that like the Polymarket
prediction bar was like the biggest news in town.
Speaker 1 (37:28):
Talk about that. Tell that story as a kicker for
us, man.
Speaker 5 (37:31):
I mean, so Polymarket, which is obviously the prediction market
that is trying to market itself in the US the
most right now, uh, tried to open this bar in
DC to participate in this meme of monitoring the situation,
which is the admittedly brilliant behavior of trying to take
in so much data, take in so many predictions and
(37:53):
tweets and live streams and whatever stock market tickers, so that
you just understand what's going on. But in a beautiful
metaphor for our internet environments, the eighty televisions that Polymarket
had in this bar did not work. So it was
a broken space in which to monitor the broken situation,
and it turned out to be a pretty bad party.
Speaker 2 (38:15):
I would say, that's very funny.
Speaker 1 (38:17):
Well, that's all we have time for this week. I'd
like to ask who had the best week in tech
and who had the worst week in tech. I think
Natasha literally had the worst week in tech because she
dropped off the Riverside call when her computer stopped working.
So sorry, Natasha, but we loved having you on this week,
and we're sad not to have your take on the highs and lows.
But Reed, who do you think had the best week
and the worst week?
Speaker 4 (38:36):
I don't know. I mean, I think you've got to
say, like, Meta, really Meta, and Google to a lesser
extent, had the worst week.
Speaker 1 (38:44):
Yeah, I would think that, but you look at the
stock price, and to Natasha's point, it didn't move. In
fact, it moved only slightly after the trial. It's down
two percent in the last seven days. I mean, I
thought that was striking. They lose two bellwether cases
and the stock market doesn't react. I mean, maybe that
makes it the best week for them, to be immune
to the judicial system.
Speaker 4 (39:03):
I think stock price is always the wrong metric to
look at to measure what this means for individual tech
companies. I mean, their stock price didn't really go
down even after Cambridge Analytica. I think they realize they're
(39:24):
making money, like, it's fine. But it just means
they're less and less relevant to me.
Like, the world is changing even outside of
this lawsuit. I think the lawsuit is
not even really the biggest issue. It's just, the
whole world of social media is, you know,
completely changing. And like, personally, no one I know
(39:46):
is like on Facebook or really maybe Instagram, but like
it's not like such an important part of people's lives anymore.
And that doesn't mean it's a bad business. They're
gonna be fine. They're gonna
still make a bunch of money. And you know, I
don't know who had the best week,
who had a good week?
Speaker 2 (40:03):
Who, Kyle? Could we have multiple choice?
Speaker 5 (40:07):
I kind of feel like Anthropic continues to have a
great week, and I just feel like Claude has become
a kind of Ask Jeeves of twenty twenty six,
it's like Claude is like Gray Scott.
Speaker 1 (40:20):
It's Ask Jeeves, but Jeeves goes and does it for
you rather than just giving an answer. That could be better.
Speaker 4 (40:25):
Back to that point, I saw some VC posting on
X about how he had sort of run into the
limits, like when you're using too many tokens on Claude,
you get these blocks sometimes. And he's like, oh, I'd
gone over and, you know, tried Codex, and it's great,
it's actually really great. And I was like, this guy's
(40:47):
an investor in Anthropic. Like, what the hell is going on?
Speaker 5 (40:50):
Like, this is vibe coding psychosis. This is
what I should write about now. But it's
like we're just getting sucked in so hard to these
things, and we're anthropomorphizing them ourselves now, and
that's, I don't know, it seems bad.
Speaker 1 (41:06):
That's it for the Week in Tech. Thank you so
much for joining us.
Speaker 2 (41:09):
Great to be here.
Speaker 1 (41:34):
For Tech Stuff, I'm Oz Woloshyn, and this episode was produced by Eliza
Dennis and Melissa Slaughter. It was executive produced by me,
Julia Nutter, and Kate Osborne for Kaleidoscope, and Katrina Norvel
for iHeart Podcasts. The engineer is Kathleen Conti from CDM Studios.
Jack Insley mixed this episode, and Kyle Murdoch wrote our
theme song. A special thanks to you, Reed, Kyle, and Natasha.
(41:55):
Please check out all of the extraordinary work these three
put into the world. We're lucky to call them friends
of Tech Stuff, and please also do rate and review
the podcast wherever you listen.