
April 3, 2026 49 mins

This week: Iranian propaganda gets a Lego makeover — and it's going viral. Kyle Chayka (The New Yorker) tracked down the collective behind the AI-animated videos flooding your feed. Nitasha Tiku (The Washington Post) was in a documentary, The AI Doc: Or How I Became an Apocaloptimist, and the press tour feels like ChatGPT doomsday déjà vu. Reed Albergotti (Semafor) celebrates Apple’s 50th birthday, but wonders if the company is entering its Microsoft era. Plus: SpaceX files for IPO — it could be the largest in history. 

Additional Reading: 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:11):
Reed, Nitasha, Kyle, good to see you. Two things struck
me this week in the world of tech. One, the
NASA launch, and two, the SpaceX IPO, the largest
of all time. Reed, I know you're a bit of
a space head.

Speaker 2 (00:27):
Wait what do you mean by that?

Speaker 3 (00:28):
Exactly?

Speaker 4 (00:34):
No, I was excited to see the Artemis take off.
Did you watch? I didn't watch it live.

Speaker 3 (00:40):
No, shouldn't the astronauts be live streaming now?

Speaker 2 (00:43):
I think starlink has to get better for that to happen.

Speaker 1 (00:46):
Well, they should be. They need to become astronaut
influencers and get in on the future billionaire-led space missions,
like Katy Perry.

Speaker 3 (00:54):
The first Bishop for creators only.

Speaker 1 (00:58):
Exactly. Nitasha, what is it with billionaires in space?

Speaker 5 (01:01):
Oh?

Speaker 6 (01:02):
You know, it's, like, sleeping under their
desk, it's just a core part of the mythology.

Speaker 1 (01:11):
Okay, let's get into it. Welcome to Tech Stuff. I'm
Oz Woloshyn, and this is The Week in Tech, where
I'm joined by three of the most plugged-in tech
reporters in the world to break down the biggest tech news,
decode emerging trends, and debate what actually matters for us.
We're joined by Reed Albergotti, who is the technology
editor at Semafor. Reed, good to be here.
Kyle Chayka, who writes the Infinite Scroll newsletter for The

(01:32):
New Yorker.

Speaker 3 (01:33):
Hello, nice to see you all.

Speaker 1 (01:35):
And Nitasha Tiku from The Washington Post. Hey guys.
So we'll start with you, Kyle. This week, you were
particularly taken by slopaganda circulating about the Iran war. I
want to hear all about that. But before that: The
New Yorker magazine, where you work, the most hallowed
journalistic institution in America, arguably,

(01:58):
puts out these best-of lists at
the end of each year, like the best of television,
the best of theater, the best of opera. In December,
you published the best of slop. So congratulations. What is
it with you?

Speaker 3 (02:10):
Inside man?

Speaker 7 (02:12):
It was The Year in Slop, and it was a
retrospective of all the AI-generated garbage that we had
to wade through as people on the internet over the
past year. And I think the scariest part of putting
that together was realizing how much better everything got. Like,
the video quality of baseline AI-generated stuff was vastly,

(02:36):
vastly higher at the end of last year as opposed
to the start of last year. So I really got
a firsthand view of that, unfortunately, and thus we're
in twenty twenty six, when it's better than ever.

Speaker 1 (02:45):
Do you remember that scene in Harry Potter where there's
this kind of boiling cauldron of something, and Dumbledore
and Harry are there, and Dumbledore puts in his
wand, and then Harry can see all of Voldemort's memories
because of this bucket of churning whatever it is?
I feel like AI slop is that for the collective unconscious.

Speaker 3 (03:05):
I think.

Speaker 7 (03:06):
So, I mean, we're all left wading through, trying to
sort out the past decades of stuff, and it's open
to anyone. Like, whatever anyone can imagine can now become
a high-production-value animation, which, you know, it
might sound utopian, but in practice it's actually horrible.

Speaker 1 (03:25):
So I think what's interesting this week, though, is that
we kind of definitively crossed the Rubicon from slop
into slopaganda. I want to play a clip from the
X account of the Iranian Embassy in The Hague, which
features a cartoon animated Donald Trump at a press conference and
what I think are his internal demons.

Speaker 6 (03:45):
Why did you attack the Monov school? Go on, lie, Lie, Lie.

Speaker 8 (03:59):
A week.

Speaker 2 (04:03):
We didn't hit the moknob school.

Speaker 5 (04:05):
America doesn't have Tomahawk missiles at all, and we care
deeply about the Iranian people.

Speaker 1 (04:12):
They did his voice much worse than SNL. What was
going on here, Kyle?

Speaker 7 (04:18):
Well, over the past week or two, or a few
weeks possibly, we've seen this trickle of AI-generated
propaganda from various Iranian sources, and one popular format
that has emerged is Lego videos, which take on the

(04:39):
aesthetic of the Lego movies, including Batman and the Pharrell biopic,
which I never quite understood, but anyway. They use Lego
toys to animate the Iranian war, and, you know, imagine
the White House burning down and Donald Trump with his
pants on fire, and hordes of armies marching on

(05:01):
Tel Aviv and Tehran, and just, like, a panoply of
horrifying things, but made of children's toys. And
so I was watching these, and they're, like, high production value,
they're going super viral, they have a lot of American fans,
and I was wondering who was actually making them. So
I actually got in touch with the collective of Iranian

(05:24):
students and graduates who were putting them together. How? I
mean, in the classic social media internet reporter move of
scrolling way far back on their social accounts before they
got banned and, you know, contacting them that way. And
then promptly everything was erased over the weekend, because the
platforms caught on to what was happening. And yeah, I

(05:47):
mean, they decided to use this Lego aesthetic because it
was cute and simple and elicited sympathy for the Iranian
people, and kind of cast this, like, moral quality over
their whole concept. And they told me that they could
go from a highly worked-out script, like these are
very written-through videos, they're very complicated in their plotting,

(06:10):
to a full animation in twenty-four hours. So that's
the Iranian Lego propaganda machine.

Speaker 1 (06:16):
Nitasha, Reed, have you been experiencing these
videos, or the slopaganda coming our own way?

Speaker 2 (06:23):
I have seen the lego videos.

Speaker 4 (06:24):
I was. I'm glad that you did
the reporting, because I was wondering the same thing.

Speaker 2 (06:29):
It's like, wait, who is making this?

Speaker 4 (06:31):
And, like, clearly it's coming from the Iranian government
in some capacity, but, like, the Ayatollah is definitely not...
like, is

Speaker 8 (06:40):
Lego sacrilegious or something? You know, they're
speaking the language that we all speak, right? Like,
there's a lot of cognitive dissonance there, watching the videos.

Speaker 6 (06:50):
That's so interesting that you got them to talk to
you about the scripts, because that is what I've been
wondering with both, like, the propaganda Legos and then also
that low-grade fruit porn that is just also populating
my feed. There's that, and then
there's something else, I don't know. I keep sending them
to my coworkers, so I might be... no, but like,

(07:12):
everyone has millions and millions of views, and I was just like,
oh my gosh, what if, instead of watermarking, in some way
you were mandated to put your prompt in any, like,
AI-generated thing? I really, really want to know, like,
how involved the human was. So it's, like, super interesting
that they wrote out these scripts, which are

(07:33):
so appealing to Americans and so Americanized that I was like,
is it Iranians? But it looks like you answered that question.

Speaker 1 (07:40):
Well, I want to ask that question, I want
to ask that question about who this is really for.
But before that, I'm not sure if any of you
saw the Chinese one, which is like a mini animated
doc with the Great White Eagle, and basically it was
the best primer on the Strait of Hormuz, the underlying
geopolitical forces governing what's going on. I mean,

(08:01):
it was an amazing distillation of everything I've
read in, like, you know, smart places in the last
couple of weeks, as a six-minute animated video with
Donald Trump as a great white eagle and the new
Ayatollah as a black household cat with red eyes.
But, like, it was very tightly scripted. I have to say,
it was impressive. The information density was very compelling.

Speaker 4 (08:24):
Let's take the war out of it, though.
I know you said this sounds utopian but it's not,
but it kind of is. Like, imagine unlocking all
this creativity that's in, like, some... I'm imagining
them as, like, kids, you know, young people in Iran
making these videos, who've, like, you know, never in
their wildest dreams thought they'd be able to make something

(08:46):
like this, and now they can. And, you know, of
course this is under terrible circumstances, but, like,
in another world, they're just, you know,
making entertaining videos, and, like, hopefully we have
ways to bring the good stuff to
the top and keep the slop, you know...

Speaker 3 (09:04):
And this is the good stuff.

Speaker 7 (09:05):
Like, now I'm imagining the equivalent of Lonely
Island or CollegeHumor or whatever, but in AI slop videos.
But that's what was kind of striking about the Lego videos,
I think: they look so good. Like, they're really...

Speaker 1 (09:19):
Not really AI slop. Like, when I think about AI slop,
I think of, like, an octopus
with, like, a cat coming out of it and then
going on fire and then living under the sea.

Speaker 2 (09:31):
It's good. The new slop is good, it isn't sloppy.

Speaker 7 (09:36):
Yeah. Well, it just struck me that, I mean, almost
anyone on the internet now has access to similar AI models,
like, whether it's open source or a Claude Pro subscription
or, what, Seedance, the Chinese video generation model.
Anyone can access these, and they're proliferating faster every single day,

(09:58):
and now anyone can make the Lego movie in one day.
And I think we're, like, only starting to imagine the
consequences of that. As Reed was saying, it can unlock
tons of creativity, but there's also this cognitive dissonance of:
is this piece of media valuable or meaningful or not?

Speaker 6 (10:16):
Yeah, when you said utopian, like, my mind went
straight to, what about the, like, click-farm equivalents?
Like, there's just, you know, people trapped in
an internet cafe in the Philippines, like, kind of making
individualized propaganda for all of us all day.

Speaker 1 (10:32):
Actually, just before we come out of this story,
I do want to ask you about some of
your previous reporting on deepfakes and the harm they
cause. I mean, there was obviously the deepfake
nonconsensual pornography, it's a horrible story and a
horrible issue. There was a fear that deepfakes would
also become kind of tools of, like, political chaos-sowing.

(10:53):
What seems to have happened is quite the opposite, where,
in fact, no one really cares about deepfakes in
the political realm. People actually want, like, the most almost
absurdist animations, which are clearly not trying to be real.
I wonder what's going on there.

Speaker 2 (11:05):
Yeah.

Speaker 6 (11:06):
I feel like we started to notice this with elections
in the Global South, in Southeast Asia, you know.
I know my editors in DC were extremely concerned about,
like, some kind of election manipulation, but instead people ended
up using it sort of like, oh, you have access
to Photoshop, and, like, all of a sudden a

(11:26):
bunch of humanities majors and, I don't know, political operatives
can do whatever they want. So in India they were,
like, splicing candidates into Bollywood videos, and they were making,
like, sweet cartoony images for a
strongman candidate. So, yeah, I thought
it was interesting. Kyle, I'd love to know what you

(11:47):
thought about this, that it almost seemed to, like, defuse
the tension. Like, there were, of course, those videos where
you see the bombs heading at the Statue of Liberty,
but it felt like, we're all in this
together with this one crazy...

Speaker 7 (12:04):
I don't know, sort of specifying that the Statue of
Liberty in this case is, like, an evil goat demon, Baal.
Yeah, it's the Baal statue version of the Statue of Liberty. But yeah,
I think that's it, the cartoonishness is the appeal. And
I think what brings so many people around the world
together with these beautiful videos is their hatred of Donald Trump,

(12:27):
and everyone's sheer incomprehension of what is even going
on in this war in the Strait of Hormuz.
Is it happening, is it not happening, is it over,
is it just beginning? Literally no one knows, so it
might as well be Legos.

Speaker 6 (12:40):
I mean, the way that they worked in the, like,
you know, Epstein files as the reason that he started
this war... Just, I mean, imagine
imagine something like that around the Iraq War.

Speaker 1 (12:51):
In fact, on the Iran video I
played at the beginning, which we only heard the audio
of, the ending title card actually
says Epstein's client. But Nitasha, changing gears, but
staying in the world of film and TV: you were
recently featured in a documentary that's gotten a lot of
press, called The AI Doc:

(13:13):
Or How I Became an Apocaloptimist, and I
want to play a part of Oprah's interview with one
of the central voices in that documentary, Aza Raskin from the
Center for Humane Technology.

Speaker 6 (13:26):
How could AI physically eliminate the human race?

Speaker 5 (13:29):
It's actually hard to imagine all the ways AIs could
wipe humans out. AI is already better than almost all
humans at cyber hacking, and so you could imagine
one of the things that an AI could do is
take out all electricity, water, hospitals, transportation, across every country
in the world, all at once. Now, that doesn't wipe

(13:50):
us all out, but you could imagine the amount of
damage that would do.

Speaker 1 (13:53):
You'd see chaos and craziness, exactly.

Speaker 5 (13:55):
And we're only, you know, five missed meals away from anarchy.

Speaker 3 (13:58):
Did you say we're only five missed meals away from anarchy?

Speaker 8 (14:01):
Yeah.

Speaker 5 (14:01):
I think about what happens in New York City if
you can't get food.

Speaker 2 (14:04):
Yeah, I don't think a lot of people have thought
about that.

Speaker 1 (14:08):
Yeah, the soundtrack is particularly dystopian without any video.
But Nitasha, tell us about this film and your role in it.

Speaker 6 (14:19):
My role was minuscule. I actually finished watching it to
the end yesterday, and I have two sentences, so I
would not say I was featured in the piece. But yeah, it's...

Speaker 1 (14:28):
You're a talking head, right? You're an expert in the film.

Speaker 6 (14:32):
Yeah, but, like, for two seconds. There's many in it...
basically, Reed, you probably have interviewed half the people
in it, at least, or more. A lot
of familiar faces. And it hasn't been doing that well
at the box office, it looks like. However, there's, like, millions
of views for the trailer, and the press around it

(14:53):
has been big. They've had people who are in it on
The Daily Show this week. This Oprah clip that
you just played, it's really... actually, in the clip, when
she says, like, we're five missed meals away from anarchy,
they actually show the text across the screen,
which I just thought was, like, wow. And that's the

(15:14):
little clip that starts this hour-long program, too. She
talks about, like, how could AI physically eliminate us all?
And I'm just kind of transported back to twenty
twenty three. It's just really interesting to have these conversations
about some of these doomsday scenarios that feel very divorced

(15:35):
from, you know, the current... some of the very
real ways that we're watching chaos unfold. You know, there's
nothing about, like, what if two egomaniacal men are having
a contract dispute, or what if
your president is trying to distract from the Epstein files?

(15:57):
Like, it just feels like we're kind of back to that,
you know what I mean? Like, back to that
post-ChatGPT era of introducing the public to
existential risks.

Speaker 1 (16:08):
It's also interesting, because Oprah did an ABC special last
year with a lot of the same big names, you know,
with Sam Altman and Dario and all the
big names in this industry. And when that came out,
I was struck by this kind of huge, not dissonance,
that's the wrong word, but this huge ocean between how

(16:28):
people, you know, like you all who cover this industry,
and how people in Silicon Valley, talk and think about
what's going on. And, I mean, Oprah is a
relatively good gauge for, like, where the wider conversation is, maybe.
And so, I mean, this does feel, like,
frankly, quite dated, and yet, at the same time, to
your point, it's gained millions of views on YouTube, and

(16:50):
it's all over The Daily Show and the morning shows. Yeah.
So who's the movie for, actually? And
what does the marketing of the movie say about it?

Speaker 6 (17:01):
Who's the movie for? I think that the director was
trying to make it for the everyperson who is
feeling this, you know, kind of amorphous anxiety,
and the point is to try to figure out how
you're supposed to emotionally approach this technology. Like, you

(17:21):
know, you understand that something strange is coming, you don't
know what it is, and you don't know what it
means for you and your family. And I think, you know,
I was trying to think about, okay, net net,
is the documentary better for AI literacy? And I think
you do get a better sense of, like, the initial
training phases. You know, they talk a lot about, like,

(17:43):
how much of the internet is put into it, but
then there's, like, no discussion of the amount of work
that the companies do, the amount of product-making decisions,
the amount of data that they have to commission in
order for it to get better and better. And it
just kind of removes agency from the companies in

(18:05):
this way, you know, where it seems inevitable. And
it's just really interesting, having spoken to a lot of
the people that were featured in the documentary. Like, you know,
it's not misrepresenting them, but if
you give these people a chance to zoom in on
something more concrete and not take the, you know, five-thousand-foot view, they're very chatty and, like, would have

(18:28):
given you an earful. So, you know, I'm excited
for that movie to come out.

Speaker 4 (18:33):
I guess, how much do you think it actually helps
people with these emotions, versus, like, just sort of
feeding that sort of
anxiety and fear that people have, which I
think is actually largely irrational, to be honest.

Speaker 2 (18:48):
You do? What do you... yeah?

Speaker 1 (18:49):
I do, I do. With the doom, the doomerism?
This will kill us all, all that, this will make labor,
you know, tremendously strained? Which
part of it do you think is irrational?

Speaker 4 (19:00):
I think it's all irrational, to be honest.
I mean, we don't know. Like, the thing is,
it's like we were just talking about now: we thought,
you know, deepfakes were gonna be one thing, and it
turned out they're another thing. You know, there's gonna
be downsides to AI, or let's just
call it software automation, right? Like, that's what it is.
There's gonna be downsides to this, like, massive build-out

(19:22):
of data centers and, like, reorienting the economy around this
thing, these tokens. But, like, we don't really know what
that is yet. And I think sitting back and, like,
worrying about it is sort of, you know, it's
like the old cliché: it's like, you know, paying
interest on a debt you may never owe. So that's
kind of my view on all this stuff. But I
don't know what this documentary... I haven't seen it,

(19:44):
so I'm just curious: does it help with that,
does it help clarify the situation, or does it just
sort of feed the anxiety more?

Speaker 6 (19:53):
I mean, I think that there is, I guess,
room to be concerned and to worry. But
it's just so industry-voiced, it's basically only
asking tech people what could potentially happen. And I feel like,
if you want to think about how human civilization has
adjusted to massive change, you talk to historians. Like, you

(20:14):
basically talk to almost anybody besides the people who are
building the technology. He obviously does it
from a first-person perspective, the director,
and it's like, you know what, I got to talk
to the CEOs. And it's like, there's no shortage of
footage of the CEOs talking about this. Like, this is...

(20:36):
it's really not. It's hard to get them to stop
yapping, in some ways. Like, a lot of these guys
are on a constant press tour, and I just feel
like I would love to hear from people who study
labor markets, you know, how change traditionally unfolds, like,
what are different scenarios, in that way of thinking. I

(20:57):
just feel like the binary that is proposed
between, like, utopia and dystopia, I think we already know
who benefits from that. It, you know, makes
the technology sound inevitable and powerful in a certain way.
And we have a lot of data now,
it's twenty twenty six. Like, we can look

(21:17):
at: how successful is this worldview? How successful has this
approach been in predicting the harms we've already seen?

Speaker 1 (21:24):
Did they ask you those kinds
of questions when you were interviewed
for the film? Like, what was the
tenor of what you were being asked to contribute?

Speaker 6 (21:32):
I think I might have been pretty early, because I
did suggest a lot of people. They did make me
explain, like, effective altruism, and give them the... like, I
remember using my hands a lot for, like, the
doomer landscape. And then they had to call me back
after Sam was blipped out for a second, to ask,
like, what was really going on? Is it because, you know,

(21:53):
like, Q-star? They had something. And I was like, no,
it's very... very interpersonal.

Speaker 1 (21:58):
He was out of OpenAI for, like, twenty-four hours, right? Yeah.

Speaker 6 (22:01):
Yeah, yeah, yeah. A week? Or, I don't know, Reed, you
probably remember it better than I. A few days.

Speaker 2 (22:05):
I just remember it ruined my Thanksgiving.

Speaker 1 (22:08):
Kyle, have you bought your tickets for the movie
this weekend?

Speaker 3 (22:10):
I have not bought a ticket yet.

Speaker 7 (22:12):
I have seen it all over social media. Like, this
seems to be the media artifact of AI right now.
And it struck me as very vintage, like it's this
kind of hyped-up doomerist reel of people saying
that this is going to take over the world and
destroy things. And I don't know, I feel like the
current concerns are less about an AI god killing us all

(22:36):
and more about the economy redirecting toward data centers and
accelerating climate change, and then, you know, destroying the Earth
that way as a kind of secondary effect.

Speaker 3 (22:47):
So yeah, I don't know.

Speaker 7 (22:47):
It felt like a bit of, I don't know, super-sincere hyping up. And as Nitasha was talking about, I
think the CEOs are incentivized to make this technology appear
as powerful and world-conquering as possible, and they're not
necessarily going to be the best sources for you on
how this will affect a normal person.

Speaker 4 (23:07):
And even on the doomsday stuff, it didn't seem
like they had much of an imagination there either.
Like, have you read, you
know, If Anyone Builds It, Everyone Dies?

Speaker 2 (23:17):
You guys read that book.

Speaker 4 (23:18):
It's like, they come up with, like, much better scenarios
for how AI kills us all. You know, it's
just, like, a Yudkowsky... Yeah, exactly. Like, they're like, yeah,
I mean, it's just going to, like, co-opt humans,
and eventually the humans will help it build some
bioweapon that we won't even realize we have for, like,
a generation, and eventually we all die. It's, it's

(23:40):
a much... you know, like, I feel like, if you're
going to go doomsday, go all the way.

Speaker 2 (23:45):
You know.

Speaker 1 (23:46):
This is why I said that you were a space
head at the beginning of the episode, Reed, because I remember
the sci-fi you referred to, when corporations not controlled
by governments can create an extractive economy in space and
take people there and then keep them enslaved in space.

Speaker 2 (24:01):
That's right, that's right. Like the company from Aliens, right?

Speaker 3 (24:04):
What if they make Alien? You know, like, that's so
much better.

Speaker 4 (24:09):
I mean, I asked one of the authors of that
book on stage once, Nate Soares, and I'm like...
this was before I broke that whole, like, Pentagon-Anthropic story, but I was interested in the topic
of autonomous weapons, and I'm like, what do you think
of autonomous weapons? And it was like a layup. I'm
like, he's gonna just say, this is so crazy, I
can't believe we

(24:32):
would ever do this. And he's like, no, I'm pretty
much for autonomous weapons. Like, AI doesn't need autonomous weapons
to kill us, and, like, they could actually help reduce
human deaths on the battlefield. And I'm like, wow, that
came full circle.

Speaker 1 (24:46):
I read Sebastian's biography of Demis Hassabis
this week, because Sebastian was on the show a
couple of days ago. And yeah, apparently the way he
impressed Elon early on was saying to Elon, by the way,
going to space won't save us from superintelligent AIs, because
they'll be able to build rockets too and kill us

(25:07):
while we're up there. But just to
come out of this story, I want to pick up
on a word you used, Kyle, which is sincere. And
certainly the documentary comes across as very sincere. However, you know,
there is this odd thing where these AI doomers and
AI boosters are kind of two sides of the same coin,

(25:30):
and the doomers, in a sense, okay, maybe they're not
becoming billionaires by building AI companies, but they're certainly becoming
among the most sought-after media talking heads in the world.
And so, like, is it truly in good faith?

Speaker 3 (25:43):
I mean, from my perspective, it's not.

Speaker 7 (25:46):
I think there is a hype cycle, and people are
cashing in on different sides of the hype cycle. Meanwhile,
Anthropic and OpenAI are cruising toward IPOs that
will make many millionaires and billionaires, and that will change things.
You know, there are, like, more imminent consequences, I think,
that we can talk about instead of total doomer apocalypse.

Speaker 6 (26:08):
It's interesting, because the whole documentary is about incentives,
and yet there's very little examination of the
incentives of the people that got the most airtime. And
I think there is this sense that it's
hard to operate in AI, like, writing about AI, without
kind of knowing where everybody is coming from. You kind

(26:31):
of have to have this map in your head, because
I think it looks like there are many different voices
saying the same thing, and so there is consensus. But,
you know, many of those people are from policy groups
that are trying to influence legislators towards a specific view,

(26:52):
like, they share a lot of similarity. So, you know,
I just wish we would move away from
looking at a person's, like, kind of personal politics,
and just look at the CEOs' actions, and then,
for these kinds of advocates, look at, you know, where

(27:13):
they're coming from. You know, if they're giving you a
blackmail scenario from the Anthropic paper, to what end? And
how does that reframe how you think about what
policies to put forward? Because this is aimed at policymakers,
I think we should think about it as its own
kind of propaganda.

Speaker 1 (27:32):
That's a really interesting point you make, because obviously, in
the Trump era, it's sort of tempting to think, well,
like, the train has left the station, there'll never be regulation.
Like, we've just essentially decided that there's going to
be unlimited, unregulated technology. But of course, you know,
there's midterms coming up this year, in two years' time

(27:53):
there's another presidential election. And so are you suggesting, in
some sense, that this is kind of building towards a kind
of policy-driven techlash post-election?

Speaker 6 (28:04):
Yes. I mean, this is, you know... Some
of the people who are represented there are affiliated
with groups who are arguing that there is massive bipartisan
support for a long list of legislation. I wouldn't say
it's, like, a crisp kind of policy proposal, but, you know,
they're meeting together with faith groups, with children's advocacy groups,

(28:28):
with unions, and saying, like, we all have common cause
in regulating AI. So I just think that
this is the tip of the spear, and we should
pay attention to where these narratives are pointing government,
and where these narratives might end up affecting all of us.

Speaker 1 (28:52):
We're going to take a short break now before we
come back to celebrate Apple's fiftieth birthday by asking what
comes next for the second largest company in America. Welcome

(29:16):
back to Tech Stuff. Reed, it's kind of incredible Apple's
been around for fifty years. I mean, it started just
after the original Moon landings, which is kind of crazy
to think about. On Wednesday, April first, Apple celebrated this milestone
by putting out a statement titled Fifty Years of Thinking
Different, and amidst the praise of misfits, rebels, and troublemakers,

(29:38):
there is a telling quote: At Apple, we're more focused
on building tomorrow than remembering yesterday. What does that mean?

Speaker 4 (29:46):
Yeah, I mean, usually when you turn fifty in Silicon Valley,
you don't tell anybody. Like, you try to keep it
a secret.

Speaker 2 (29:51):
But that was looks mixing. It's kind of interesting.

Speaker 4 (29:56):
No, I think there's a fascinating thing
happening with Apple right now, where they basically missed AI.

Speaker 2 (30:05):
Right.

Speaker 4 (30:05):
They were sort of... they were caught. You know,
I don't know how that happened, but they didn't
see this revolution happening. I think it had something to do
with being in Cupertino, inside this, like, spaceship
building, and not going outside into the future, right? But
I think that, like, the defense of Apple
from the Apple fanboy crowd, which, you know,

(30:28):
they're very... there are many of them, and I'll probably get hate mail
for even saying this. But it's like, you know, look,
yeah, I mean, Apple's great at this: they just
watch technology develop and then they jump on top
of it and, you know, benefit
from it. And I think that's, like, a very lazy
defense of what's happening. And the other
thing is, like, well, you've got this phone, and you

(30:50):
know, they're gonna be able
to run all these AI models on the device, and
so that's going to be a huge advantage for Apple.
And it's like, I don't think that's really how it's
gonna work. I mean, it never
helps in the tech industry to miss a technology
wave. We've seen companies do it. I'm not saying
Apple's gonna disappear overnight or anything like that, but you know,

(31:12):
like the reason people buy iPhones is not because it
has some great hardware or is a piece of great hardware.

Speaker 2 (31:18):
It's because they're locked into the ecosystem, right? They

Speaker 4 (31:21):
don't want to be a green bubble, or name your,
you know, photo-sharing product or something like that.
But, you know, we don't know what the on-device,
what the edge compute is going to look like
that's necessary to make, you know,
whatever this future on-device AI is. And Apple can't
move very quickly with its phones because it sells so

(31:42):
many of them. So when it comes to
new hardware, that's why they're always, like, many years behind
Samsung on the hardware, because Samsung doesn't have as big
of an install base, so they don't need to sell,
you know, millions and millions of these things immediately when
they come out with a new product. So it's gonna
be really, really hard, I think, for Apple. You know,
the next fifty years, I think, might be tough.

Speaker 1 (32:04):
I can see Kyle. I can see Kyle smiling, and
I think he might be drafting some hate mail. Yeah.

Speaker 7 (32:08):
I was just remembering the recent marketing campaigns for the
latest iPhone models, and they just all promised various AI
functionality and just did not deliver on it at all.

Speaker 3 (32:19):
Like I when I bought.

Speaker 7 (32:20):
My most recent iPhone, I kind of knew the AI
was not going to come through. Like, that was the
messaging in the tech community at least, but the public
didn't know that. And it just seems kind of embarrassing
for the company that they are missing their own goals,
like they're missing their stated aims with their AI functionality. So,
I don't know, it doesn't look like strategy from the outside.

Speaker 1 (32:42):
You don't think there's anything there consumers like? For example,
we just talked about that movie, right, and the kind
of AI doomerism. There's no argument that, perhaps from a
consumer point of view, having a technology company that
is not emotionally associated with AI and doom may be an
interesting thing? I mean, it was interesting, too:
Cook in that letter didn't mention the words AI or artificial

(33:03):
intelligence once. I mean, this is a tech CEO
letter in twenty twenty six. That can't have been an oversight.
I mean, that must have been a decision.

Speaker 4 (33:09):
Well, look, the thing is, we should stop calling
it AI. I know that's not gonna happen. But, you know,
it's software, okay? Like, it's like saying I hate software.
Like, okay, fine. And there could be an
anti-software lobby, and everybody lobbies. But, like, you know,
when you have a product that can just like do
stuff for you and save you a bunch of time

(33:29):
and is like very useful, like you're just going to
use it, right, that's just how the world works. Like
I'm sorry, nobody's going to be like I'm morally opposed
to this software that's like, you know, allowing me to
spend more time with my kids. Like it's just, you know,
it's just kind of a silly concept.

Speaker 7 (33:43):
I found it kind of interesting, just to jump in:
the wave of new devices coming out for AI,
the little gadgets and pendants and stuff, they all look like
Apple products. Like, the vocabulary is so Apple-y, the graphic
design is Apple-y. The whole vibe comes from Apple. But
Apple is, like, not anywhere.

Speaker 1 (34:04):
And Jony Ive, who designed the iPhone, is working on.

Speaker 4 (34:07):
AI. So, like, why are those products coming out, right?
It's because you can't do what you
want to do on the iPhone. They won't let you.

Speaker 7 (34:17):
I'm just waiting for the real AI Tamagotchi. That's my dream.
I can't wait. It needs to happen.

Speaker 6 (34:25):
Wait, Reed. But if we're locked in because it's
a closed ecosystem, doesn't that give them, at least,
doesn't it buy them time to just come in when, you
know, when it's not AI, AI, it's, here's your
useful assistant? Like, I don't know, some of these
companies, right? Like, fifty years, they've
been able to parlay their consumer lock-in from

(34:48):
across a generation.

Speaker 4 (34:50):
So yeah, like Xfinity, for instance, right? Like, they will
continue to exist. Like, I'm not arguing against that at all.
Like, I think Apple will go on for a long time.
This isn't an argument that Apple's
going to implode and disappear. But it's like
Microsoft when they missed, you know,
search and then mobile. Is Microsoft still there?

(35:12):
It's still a very valuable company, but not as
relevant, and it went through some dark times. Maybe,
you know, maybe they can make a recovery or something
like that, but I think it will be dark times
for Apple. It's not going to have the same, you know,
the same relevance as it used to once this
stuff starts to take off. And I'm not an expert
on the timelines. This could take

(35:34):
several years. But the writing's on the wall.

Speaker 2 (35:37):
That's all. That's my argument.

Speaker 1 (35:39):
I also, you know, I disagree slightly
with you, Reed, that people don't buy Apple because of the
product. Like, I agree the ecosystem lock-in is very powerful,
but also, like, I was an Android user for many years, and I
bought an iPhone, and I was like, wow, this is just a
lot better. I'm not going to get another Android. And
I recently lost my AirPods when I was traveling, and
I was like, I don't know about one hundred and
fifty dollars for AirPods. I'm going to get a fake copied
(35:59):
one on Amazon for thirty bucks. I put
them in, and I was like, I...

Speaker 4 (36:04):
Thought, look, I have AirPods in my ear.
I have an iPhone sitting, actually I have, like, three
iPhones sitting on my

Speaker 2 (36:11):
Desk. Like, I'm on a Mac computer.

Speaker 4 (36:14):
Like, I totally get that. It's a
great consumer experience. This is not
me being like, I hate Apple and I won't
buy anything by Apple. No, it's a really
nice consumer experience; it all works really
well together. But I'm just looking into the future. I'm like,
this idea of opening up
an app and using it, which we all

(36:35):
do today, is not the future. Like, I'm not opening up a banking app and
transferring funds and stuff. I'm just gonna tell an
AI agent to do that, and you can do that
today with, you know, OpenClaw. That's very early adopter technology,
but it's a glimpse into the future. And
I can tell a chatbot, like, go, you know, transfer some

(36:56):
funds at my bank, go book me a flight, right,
and without ever opening an app. And all that's happening
outside the mobile ecosystem. And that's kind of where
it's heading. And it just makes the thing that
Apple built, that very nice consumer experience,
not as important anymore. It's going to be
sort of, like, there. I don't think there's a walled garden,

(37:18):
at least I hope not, because I think it would
be bad for consumers. Like, I don't think there's like
a walled garden approach to that.

Speaker 7 (37:23):
It feels like everyone's jostling for control of the new interface,
like, what are we going to use to interface with
our AI agents and our OpenClaws and whatever. And,
you know, Jony Ive and OpenAI are trying to
do that with some pendant thing that sits on your
desk, plus the thing that goes in your pocket. There's
been all sorts of little experiments and different trials. I

(37:46):
feel like there's more opportunity right now for some other
company to take this away from Apple and to come
up with something else. And I was just recalling how
Apple has kind of bailed on its other innovative hardware projects,
like there is no TV, there is no car. And
the new possible CEO who people talk about is John Ternus.

(38:06):
He's a hardware engineering guy. Like, it's not clear where
it's going.

Speaker 6 (38:11):
Did you guys see that picture of Joe Gebbia from
Airbnb with, like, what looked like the new device? I
mean, if that's the new device, I will keep my
phone. You know, like you were saying, Kyle,
it's so derivative of early Apple. So yeah, I see
what you're saying. I guess, like, and I don't even

(38:32):
know why I'm defending Apple's hardware. I guess I like
that Tim Cook didn't put AI in his letter, because
it is meaningless. Like, if you're thinking about
how to give people real utility, you don't need to.
Like, there is a chance that you could just
come in on the back end. I mean, fifty-year-old
companies have a very hard time innovating.

Speaker 4 (38:52):
I think the new interface that you're talking about is
voice, right? Like, I mean, it won't
be all voice, but, you know, that's how we're going
to control this stuff.

Speaker 6 (39:00):
Don't you think? Yeah, but what is this big pebble,
this big pebble? I'm not going to carry around a
big pebble.

Speaker 2 (39:07):
Like you're gonna have to.

Speaker 4 (39:09):
We're gonna have to have some device, right, because you're
gonna want some screen occasionally to look at something visual. Or,
you know, maybe it's in glasses as well. I
think that technology is, like, really far away.

Speaker 6 (39:20):
I hope they'll consult more women as they're doing this, because they
did not for VR and...

Speaker 7 (39:27):
Tamagotchi all the way. So we've been here before.
There's the Pokemon version, there's the Digimon little thing.

Speaker 3 (39:35):
It's gonna be great.

Speaker 6 (39:39):
Yeah. I was just gonna say, because I can't picture
how to carry around, like, that big pebble. I
was like, where am I going to put it? It's
going to be at the bottom of my bag?

Speaker 1 (39:46):
Who has the inside scoop on the CEO
succession? Is Tim Cook
going to go this year?

Speaker 6 (39:51):
Oh?

Speaker 2 (39:51):
I don't know.

Speaker 4 (39:52):
I mean, I just, I'm not, like, deep in
the company anymore, so I would just be guessing.
But, I mean, for sure, it's the end of
Tim Cook's career, you know, and that's apparent, right?
Like, when exactly that will happen, I don't know. But yeah,
it seems like every six months there's a new

(40:14):
successor, a new name that comes out. And, yeah,
none of it's super exciting. You're like, could
we get Steve Jobs? Like, is he available?

Speaker 1 (40:22):
Like, I just... Adam Neumann might be looking for a job.

Speaker 4 (40:25):
Maybe Adam Neumann, like, I don't know, something
exciting. Though maybe that's not good. Maybe Apple doesn't
want someone exciting. Maybe the best thing is to be
like Comcast or, you know, whatever, and just ride
it out as long as possible.

Speaker 1 (40:38):
When they did the transition from
Jobs to Cook, it was from the visionary to the technocrat.
It's hard to go from the technocrat to a new visionary, right?
You're kind of, you're locked in at that point a little bit.

Speaker 3 (40:47):
Yeah, for sure. We need Jensen Huang and his leather jacket.
You know.

Speaker 1 (40:54):
Well, that's all we have time for today. But I
want to end with a quick round robin: who had
the best and worst week in tech? I'm gonna go first.
I think Anthropic always wins either the best
of the week or the worst. I think it's the
worst this week, with the leak of that code. Interesting.
What do you think, Reed?

Speaker 3 (41:10):
Well?

Speaker 4 (41:11):
I thought, I mean, the news of Oracle laying off,
you know, a huge chunk of their whole company, that
has got to be a really bad, bad week for Oracle.
And it's all because, well, again, people will say, like, oh,
the AI, it's taking our jobs. And it's like, well,
actually it looks a little different, right? It's like, well,
they need to invest a huge amount of money in AI.

(41:33):
They don't really have the cash, and so they're just
like looking in the couch cushions and it's like, yeah,
laying off a bunch of employees is okay, that's going
to get us like a little bit closer to what
we need to invest in this whole thing.

Speaker 1 (41:46):
And it's like just a general sixty minutes.

Speaker 4 (41:50):
Right, exactly. So it's like, I mean, I
don't know, this is like how we don't really know
how AI is going to like affect the world. It's
like all these people just lost their jobs because of AI,
but not because AI took their jobs right, and I
think it's a bad week for Oracle, But also like
that is to me just a good example of what

(42:11):
we were talking about earlier.

Speaker 7 (42:13):
On that note, I was going to say the worst
week in tech was for journalists, because of this whole
AI journalism discourse debacle, whatever. Should we use AI
in writing? Like, I think the answer is that it's
going to happen, and we're just kind of adapting to
this new situation. And at the same time, one of

(42:34):
the controversies was that this New York Times book reviewer,
who's a British novelist, used AI, he said, to expand
a book review draft. He couldn't hit his thousand-word mark,
which is crazy, and so he used AI, and he
was caught kind of plagiarizing via AI. And my question

(42:54):
is just, who are you, as a British novelist, to
not even finish a review of a thousand words, like...

Speaker 1 (43:01):
And this kicked off a bigger journalism debate. And Natasha, I
want to know your thoughts on Claude Code. But also,
I mean, we've got three worsts
of the week. I think hopefully you
can bring us the best of the week.

Speaker 6 (43:12):
Oh, I have another worst of the week. I was
gonna say Mercor. Mercor is the company that, you
know, it kind of, it hires those, like, PhDs
in biochemistry to create data that goes to the big
frontier labs, to put the PhDs in biochemistry out

(43:33):
of business. And they had this massive, massive leak. They
had video interviews with individuals. They have so many
clients and I feel like we're just starting to get
a sense of where is that data going to go?
You know, does this mean it's like out on the
internet and everybody will have access to this like top

(43:54):
tier post-training data. That one, that one is, like,
I'm not usually that interested in cybersecurity, but I really
want to see where this one goes.

Speaker 1 (44:01):
And how does that relate? Is it a separate
story to the Claude leak? Just tell us a
little bit about that before we go on.

Speaker 6 (44:07):
Yeah, the Claude leak, I think, is spiritually very
different, because you already have a lot of that
code on your computer, so it wasn't as, I mean,
it was illuminating for people. But this is, like,
proprietary information that they owed to their customers, and, you know,
there was no way that they wanted even a smidge

(44:30):
of this to get out.

Speaker 1 (44:31):
Okay, I have a best of the week: the guy
from Medv, or Medvi, who created the first unicorn of two.
Him and his brother, their first billion-dollar run-rate
company selling GLP-1s and Viagra on the internet.

Speaker 7 (44:45):
It's a thing. Like, they're just doing the marketing and
talking to patients.

Speaker 1 (44:51):
And I think that they're somehow connecting
patients and doctors in a quicker way than is
otherwise possible, to bypass the old-fashioned prescription process.

Speaker 6 (45:01):
Kill Mill is the first billion dollar one person, two
person company.

Speaker 2 (45:06):
It wasn't born, you know.

Speaker 6 (45:09):
I mean, there's.

Speaker 2 (45:11):
Something... No, but it's just, it's just boring.

Speaker 4 (45:14):
I'm just saying that was, like, so
web one point oh. Like, let's do something different.

Speaker 1 (45:18):
You know, he has a digital avatar of himself to
deal with his day-to-day life as well, the founder.
So he's already going all in on the
AI moment.

Speaker 2 (45:26):
He should be a... Go ahead, Reed. Can I throw in? I mean, we said it earlier.

Speaker 4 (45:31):
I mean, I think SpaceX, right? It's gonna be the
largest IPO ever, right? They filed confidentially
this week, right? I mean, that's, like, you know, you
gotta hand it to them. And, you know,
also, just, people are talking about space a lot, right?
You've got the Artemis launch, you've got the Project Hail
Mary movie.

Speaker 1 (45:51):
You know, I feel like Natasha's not such a space stan.

Speaker 3 (45:56):
Can go to spaces going to space.

Speaker 2 (45:59):
I just wish, I...

Speaker 6 (46:03):
wish people would read more Octavia Butler if they're thinking
about space, rather than, you know, like, what is he,
Robert Heinlein. Uh, I'm very curious to look at
the SpaceX IPO, because I think we're going to have
some, like, Oracle-type moving stuff around
in order to pay for those data centers. And, you know,

(46:26):
it's just been rolling up and rolling up companies,
right? Like X into xAI, xAI into, is it
into SpaceX now?

Speaker 4 (46:35):
And I'm like, blinking. Well, it's, yeah, it is SpaceX,
but it's going to be Tesla.

Speaker 2 (46:43):
It's going to be Elon Corp, right. That's the plan.

Speaker 4 (46:46):
They go public, and then they'll join with Tesla
in some capacity, and you're just...

Speaker 6 (46:52):
You're just, I mean, do we think there's a chance
that, if people look at the numbers, like,
he's an incredible showman for stockholders, right? So do
you think some of that will change if we see
the numbers?

Speaker 4 (47:06):
I mean, I think you just have to ask retail
investors what they think, and I think the
answer is, like, we love the Elon, right? I mean,
he is the Steve Jobs of our day, right? Like,
he's got the, he can convince people of anything. You know,
you can tell a story about space data centers and

(47:26):
humanoid robots and all that, and I think, you know,
that formula still works, right? I mean.

Speaker 1 (47:34):
Kyle, is he the Steve Jobs of our age?

Speaker 7 (47:38):
Oh man, I mean, Steve Jobs was not sending us
to space. He could be like the Thomas Edison or whatever,
like, if not the pure inventor, then at least
the showman who can dress all of this stuff up
for public consumption. Speaking of space, though, I also want
to add my best of the week, which is a

(47:59):
Nintendo and Super Mario Galaxy Odyssey, whatever, which seems
to be a rousing success for children everywhere.

Speaker 4 (48:10):
You know, my kids are going to the movie
later today, so yeah.

Speaker 1 (48:15):
It's a hard choice between an AI Apocaloptimist and
the Hail Mary space movie, I guess, for your kids.

Speaker 4 (48:23):
I haven't even seen Project Hail Mary yet, so
the Apocaloptimist is gonna have to wait
for me.

Speaker 3 (48:33):
In it.

Speaker 1 (48:33):
That's it. Thank you.

Speaker 3 (48:37):
Thanks for having us.

Speaker 2 (48:39):
That was fun.

Speaker 1 (48:54):
For Tech Stuff, I'm Oz Woloshyn. This episode was
produced by Eliza Dennis and Melissa Slaughter, executive produced by
me, Julian Nutter, and Kate Osborne for Kaleidoscope, and Katrina
Norvell for iHeart Podcasts. Jack Insley mixed this episode, and
Kyle Murdoch wrote our theme song. Special thank you to
you, Reed, Kyle, and Natasha. Please check out all three
of these incredible journalists' work, and we're very lucky to

(49:16):
call them friends of the pod.
