Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
PJ (00:10):
Welcome back to Tricky Bits
with Rob and PJ.
Uh, folks, we know it's been a little while again, but we wanted to have a bit of fun today.
You know, I remember a few years back, Steve Jobs would get up in front of one conference or the other, whether it be WWDC or
(00:34):
a bespoke announcement, and he would describe in great detail a new revolution that they had created at Apple, whether it was the iPod, the iMac, the iPhone.
And he would always end with the words: and it's available today.
(00:57):
Fast forward to last year, mid-2024.
And we've got Apple Intelligence, which is one of the entries in the latest zeitgeist for the race for AI, where it's going to read your email, it's gonna tell you when your mom's coming into town.
(01:18):
It's gonna set up things in your calendar.
It is going to be that magical personal assistant that you never knew you needed, but you do.
And Rob, where is it at this point in time?
We've been waiting for Apple Intelligence.
It's, it's on the way.
We sold the iPhone 16 on it.
Rob (01:40):
That's a good question.
I don't think you're gonna get it in the iPhone 16 timeframe.
So all those people who bought it, all those ads on Apple's website that have "compatible with Apple Intelligence" and all that, it's going away.
Because as of yesterday, Apple are now pulling all the Apple Intelligence
(02:02):
marketing material, because somebody recommended they do so.
The Better Business Bureau was like: yeah, if it's not available, you shouldn't say it is.
But that is the ultimate question of, like, where is it?
And Apple have not done this before.
Yes.
They've said, like, the Vision Pro, it'll be available in six
(02:24):
months, or this model of this phone or iPad or Mac or whatever will be available on this date.
And it might be a week out, it might be a month out.
They've never said it'll be available and sold a product on it.
And it's still not available.
It's, it's very not Apple to do this, but I think they realize how shit
(02:50):
Apple Intelligence really is, and they've marketed the hell out of it.
I mean, they even pulled the YouTube ad.
They had all those ads where, it's kind of creepy, but in the ad a girl shows a photo of, like, a professor or something and says: who is this?
And when did I meet them?
And Apple Intelligence answers, like, it was this.
(03:12):
And it was this time when you, uh, met them.
It's kind of creepy, 'cause it's full-on stalker, uh, vibes from there.
But it's functionality which they wish existed, and it's never going to exist just due to all the privacy concerns.
I mean, on one end, Apple's saying: oh, we're a very private company, we don't take your data.
(03:32):
But then you can say, who's this? And it'll answer you.
It's, it's utter bullshit; from day one it was.
It's very much of: we could, AI could do this, and it won't be our AI, and it won't be an AI in the near future, but here's a world where this technology makes sense.
PJ (03:52):
Apple made this tremendous announcement last year.
It was very forward-looking. Now we're getting a sense of, I mean, it's nearly a year later.
We are nearly at the next WWDC, and the only signs of Apple Intelligence that we've had so far are
(04:13):
some slight cosmetic changes to Siri, as far as I can tell; a connection to OpenAI; and now I can do Image Playground on my phone, as well as this new button I've just seen, called Visual Intelligence, that might be part of the, the latest beta.
So we've got all of these AI-esque features that have come
(04:35):
together, but we don't have the promise that was talked about, with the private cloud, with a private model.
You know, to date, none of this stuff has actually appeared.
Rob (04:49):
I'm not sure it's going to
appear as they originally
intended.
It all fits into the forcing of AI down our throats, and they don't know where the market's going.
I think Apple's being very reactive to the AI world, where usually they're very proactive: they go, this is what we're doing,
(05:09):
this is what the product's gonna be.
Whereas for AI, they've been very reactive in responding to what others in the market are doing.
And when they made all these announcements, maybe that was the forward-looking thing, and now it's a year later; and a year in AI terms is a hell of a long time.
Now, what they originally planned: some of it makes sense,
(05:32):
some of it doesn't make sense.
Um, I still can't get over the fact that it feels like they're just forcing it down our throats.
This is Apple's moment: the same as Microsoft when they missed mobile, this is Apple missing AI, and they just don't understand it in the way that we want to use it.
Or not; I don't want to use it at all, to be honest.
PJ (05:52):
In terms of the missing, it really has a presumption that there's something there.
Now, I'll fully admit, uh, I've actually been enjoying using Cursor for various different projects that I've been pulling up really quickly, whether it's an, an app or a backend, or I've even been doing some embedded stuff with it, which has been
(06:13):
kind of fun.
But beyond that use case for me, and maybe doing a little bit of, of research stuff, Rob: is there a "what" that Apple is missing right now that's in the greater marketplace?
Like, where are they being left behind?
Rob (06:32):
They don't really have anything that makes sense in their product range.
They are throwing stuff at the wall to see what happens.
They've got the fancy removing objects from photos and whatnot, but where would it make sense? Siri.
It would make a lot of sense at the Siri level if she had context, but she doesn't, so that would be the obvious use
(06:53):
case.
If they're gonna keep this intelligent system, which, I'm not sure Siri should even exist anymore, but if they are gonna keep it and they have this AI, why are they not putting them together?
And they've just announced they're delaying the new Siri again.
So they obviously are working on this, but where is it?
No one's even seen a demo of it, which is very rare for
(07:13):
Apple.
Like, usually some reporter or TV person or whoever it may be, some person with a lot of exposure, gets to go to Cupertino and see the demo hands-on.
And this comes back to what you originally said.
They used to have live demos on stage, where Steve Jobs would just do something and it would work, or sometimes not work.
(07:35):
And Steve had an amazing ability to dig himself out of any hole and still come out like the hero.
And I think that's why Steve did live demos, 'cause he didn't care what happened, because he knew, in his own confidence, that he could get out of the situation that the demo would put him in.
And they had lots of demos that didn't work, and it always worked out in the end, at least worked in their favor.
(07:58):
I think when we got to Covid, we started doing these presentations via video, which could be very carefully edited and very carefully crafted.
And we went away from the live demo.
And I think now we're just in an extension of that, where it's utter bullshit that they just pull out, of like: look, we could do this, and we made a cool video.
(08:20):
And no, none of it's real.
And I say that because no one's seen it.
I mean, we went from an actual live demo of "available today" to a video that's utter bullshit; with all the ads that they had, also utter bullshit, without anybody even seeing it in
(08:40):
person.
And then Apple's so secretive, nobody inside the company's gonna talk about it.
And we have had reports of: oh, this team got fired and replaced by this team.
It seems very much like they don't know what they're doing either.
They're good at making videos, though.
PJ (08:54):
Well, so is Hollywood.
I mean, we, we've had the promise, since Minority Report, of this magical AR interface that has never come to be.
So you can make a good video and, and concept-car this stuff out, but there is a real difference, as you're saying, to actually delivering on it.
Rob (09:12):
Or show it to somebody.
Like, nobody has seen even a hint that this could exist.
Of: yes, we could have this AI, we could have Jarvis, and we could have the Minority Report displays and whatever.
These things could exist.
Yes, somebody thought of it for the movie; somebody could think of it in real life.
But you'd think, if
(09:37):
they're gonna show a movie of "who's this person, and when did I meet them?", they'd at least have a demo of that, where it's kind of clunky and kind of broken and not ready for release, but the concept is there.
Nobody's seen it.
So does the concept even exist, or are they just literally full of shit?
PJ (09:54):
It is a really dangerous place to be if, if they truly are full of shit.
The progression that we've seen out of Apple has been, as I talked about at the very top: it's available today.
A, a difference of the Steve Jobs era, where for the most part it was: it's available the day of announcement. Then it
(10:15):
moved to: it will be available at a particular fixed date.
For example, the release of a given operating system or an upgrade to iOS.
Then finally we got to the point of what we're seeing now, of: this could be something that we can do at some point in the future,
(10:35):
where it became "late fall", "early spring", a lot less refined around these dates.
Do you think we are seeing an erosion of Apple's credibility in the marketplace to deliver on innovative features?
Rob (10:52):
Yeah, they haven't
innovated since a decade ago.
What have they done that's realistically new?
Very new, like true innovation.
The phone was obviously fantastic.
The iPod was fantastic.
The Mac, the original iMac when Steve Jobs came back: that was groundbreaking at the time.
After that, though, it's
(11:12):
kind of been like: well, everyone else is doing this too, and, well, some people are doing it better, so we should copy them.
The actual innovation is gone.
I mean, even the watch: yeah, it's a decent watch, but Fitbit already existed, and the idea of a smartwatch already existed.
Apple just did it slightly better than everybody else because of their ecosystem.
And I think
(11:33):
the watch played into the ecosystem, where I think with the Vision Pro the ecosystem kind of failed them, because it, it's not a good ecosystem for the AR world.
But they had to integrate it, and they did the best they could, but it's still clunky.
Um, that's their last product.
And I think the market responded the same way, in respect to what
(11:56):
I'm saying.
It's like, nobody really wanted it. And I worked on the damn thing, so, and I knew at the time, nobody would want this.
But yeah.
To answer your question: innovation is not a thing in Apple anymore.
It's like, rinse and repeat is the cycle we're in.
Like, what's new in iOS for the last decade?
PJ (12:11):
I, I mean maybe they'll add
a fifth camera.
Rob (12:13):
Exactly.
It's basically a camera.
Your phone is now a camera, and all the AI processing that it does that's of any use is related to editing your photos.
PJ (12:22):
Are we seeing a pattern, then? And, and maybe there's a broader marketplace question here we don't, we don't have to get into.
But where, running after AR, now running after AI: are we desperate to get a new paradigm-shifting innovation?
Do you think it's actually market forces, where it's like,
(12:42):
hey, we're addicted to fast change, big change, innovation?
Look, we've seen the internet.
Hell, even WiFi was an incredible innovation at the time.
We've gone to mobile phones.
Like, it's, it's almost as if we need something new just to keep the whole thing driving forward.
Rob (13:00):
Oh for sure.
'Cause markets, where are they growing?
Can they keep selling the number of iPhones that they're selling?
They can sell 'em to existing customers, but are they actually getting new customers?
Are they taking a dent out of Android? 'Cause it's the only other place to take 'em from.
Yeah, they're probably growing into third-world countries slowly, and so is Android.
But are they actually getting a bigger market in actual number
(13:23):
terms?
And if not, then who are their new customers?
Like, this is all capitalism: yeah, infinite growth forever.
At some point it comes to an end.
And I think, for people who have picked the camps: most people who have an Android phone don't want an iPhone, and most people who have an iPhone don't want an Android.
There's a handful who switch back and forth, or work on both, or things like that.
(13:44):
But generally people have picked their camp and they're staying there.
So I think Apple have to go after these new markets because, A, it's expected by the shareholders, and B, they do fear getting left behind.
Inside they have this, man, we've talked about this a thousand times, of like: oh, we're the smartest people in the,
(14:06):
uh, room.
But they're also incredibly insecure about that statement that they like to make, 'cause even they know they're full of shit, and they're not actually the smartest people in the room.
The smartest person is in that company over there, doing something cool.
And I mean, Apple's never been first.
They've always kind of been second and done it better.
But, but even that's going away.
(14:27):
With AI, they're not even second; they're 10th in line.
It's like everybody else seems to be doing it a lot better.
And I think Apple's ecosystem plays into this.
Like, we couldn't do a Copilot thing for OS X.
You could; anybody could have written Copilot for Windows.
Microsoft happened to do it themselves.
(14:47):
But anybody could have written that, uh, software, because the APIs that Microsoft use are available to everybody else.
They may not be well documented, but they are available. And if you call 'em and you're not supposed to, all the documentation says is "not recommended".
You can call this; they don't say you cannot call it.
Whereas it would be impossible for me and you to write a system-level AI
(15:09):
component for OS X.
It would be utterly impossible for iOS.
This walled garden is stopping me and you, or people over there, more importantly, who are very good at doing AI, from integrating it with Apple.
You could do an app, but you couldn't do system-level integration, and there's a big difference there.
Only Apple can do the system-level integration, and ultimately that will bite them, because they are not the smartest people in
(15:32):
the room, and they are very slowly figuring this out.
And I'm not sure they know how to fix the problem, 'cause they're not gonna open it up.
PJ (15:38):
How much do you think this is a, a between-a-rock-and-a-hard-place situation for them?
And let's frame it this way.
Apple has touted for a very long time: we want to be the most private, we wanna be the most secure of devices; we want to, we want to specifically lock things down to keep it safe for
(16:00):
you.
This was a decision that was made very early on.
They've stuck with it, in many cases even with, like, macOS.
They've gone even further into it by sandboxing apps and making sure that they can't talk to each other now.
Rob (16:16):
They've done it the crude
way though, haven't they?
PJ (16:18):
It's the crown jewel for them.
But in this case, like, are they stuck? Because do they have to break old promises in order to create this innovative environment you're talking about, Rob?
Rob (16:30):
It's all or nothing at the moment, yeah.
Initially it was a good thing, because my mom doesn't know anything about settings and how to configure things, and it just works; and that's good until it doesn't just work.
And now they've dug this huge hole where everything is in the hole and no one can get out.
If I choose to use another app store, then why can't I make that decision?
(16:51):
I know this app store's dodgy and I'm willing to do it.
Maybe it's a burner phone.
Maybe I have to do it because my work tells me I have to do it.
Whatever it may be, this mentality already exists.
Lockheed Martin have their own app store.
Apple internally have their own app store.
So app stores exist with apps that are not signed by Apple.
PJ (17:15):
Mm-hmm.
Rob (17:15):
Apple give them an enterprise master certificate; they get to sign their own apps, and you can download them from the internal store.
No questions asked about what those apps do.
Um, obviously for Lockheed Martin it makes sense, because you can't be sending this app to Apple, sort of, 'cause it's probably secret or has top-secret clearance or things like that.
PJ (17:34):
yeah, yeah.
Rob (17:35):
So those apps can't be sent out.
So the, the mentality already exists that this needs to happen.
It just, why can't it happen on a bigger scale?
Like, if I want to go to the app store where I can get, uh, dodgy software, why can't that be my choice?
If I'm giving up my security or my privacy, then that's my choice.
PJ (17:55):
So I framed it a second ago as being a philosophical problem, uh, with privacy and security.
Uh, now I'm gonna be a bastard and frame it as a financial problem, because, to your point earlier, Apple is slowing down in terms of its growth, in terms of how many new devices it can sell.
(18:16):
As of its last earnings report, what was its greatest area of growth?
Rob (18:21):
Services.
PJ (18:23):
Yeah, so it actually creates a disincentive for Apple to want to open stuff up to other people's stuff, rather than directing users to their own homegrown services that they can create and sell.
Rob (18:41):
That's true, but they're
not the best at doing these
services.
When we go back to the Apple Intelligence mindset, they are not the best at doing this.
So they have to open it up to others, and they are doing that with OpenAI and things like that.
But they kind of do it in a very crude way of, like: oh yeah, we could stay private if we do this; but we can't do this and stay private, 'cause we have
(19:02):
to send your data to OpenAI.
And, like, they did it the way a high school kid would do it.
Oh yeah, we'll do this. And then, hard switch, this goes to OpenAI.
If they were willing to put resources or engineers on this, they could have done a much, much better thing.
Again, I think, yes, the services are big and they want to grow these services, but are they just gonna become a services company?
If they're just gonna be a services company, why don't they just do all these services for Android?
Because there's a massive market that they're not even tapping yet.
I, I think internally they're conflicted.
I mean, yes, you can get Apple Music and Apple TV for Android and, uh, things like that.
So they know that market's
(19:44):
there. I mean, they're not morons.
Well, they are, but they're not.
But why is iMessage not available for Android? 'Cause it's the golden goose.
And is RCS gonna change that? Maybe, maybe not.
I dunno. It's, it's interesting.
But the Apple Intelligence thing is a, a problem that doesn't fit in this space.
(20:05):
It's like, it's not a good thing to have in a, in a walled garden.
And they get to pick who you farm your data out to.
Yeah, it's like, their business deals are all this.
Of like: why don't they just make it so there's a service interface where others can plug into it, and then anybody could provide the backend?
PJ (20:22):
Well, I think it's this
confluence of these constraints
they've put on themselves.
They wanted to be the most private. I like that.
They wanted to be the most secure. I like that too.
Rob (20:32):
I like it too, until I don't.
Like, how do you turn it off?
Like, right now, I don't want the security.
I need to do something that I know is dodgy.
I decided to do it.
I want to be able to turn it off.
They've, they're very black and white.
It's like, if you have access to this, then the entire system falls apart.
It's not a very, it's not very secure in a modular sense.
(20:55):
It's, there's a wall around the outside of the whole thing.
And if you get in the wall, all bets are off.
So it's actually bad engineering in the grand scheme of things.
If you look at what other operating systems do, it's like: yeah, this is still secure.
We can still have security and privacy even if you have full access to this part.
PJ (21:15):
Let me poke on this for a sec, Rob, because Apple, for the longest time, really has been a consumer electronics company.
It's a consumer hardware company, and I'm hammering on the word "consumer" for a second here, in contrast to someone like Microsoft, which obviously sells consumer software but is vastly more friendly to developers,
(21:38):
either in its operating system or the tools it provides.
Like, everything that Microsoft provides is really first-rate for developers.
Apple seems like it has a very tense relationship with its developers.
It needs them to provide apps, but also sort of hates them at the same time.
Rob (21:58):
Oh yeah.
Apple are the absolute worst.
It's, I mean, even Xcode doesn't even give errors in the same way as other development tools.
Of like: just tell me what module you're compiling, and tell me the error in that module.
I don't need you to take them all and put them in a window for me in some order that you decide.
It makes working on Apple platforms very different to work with.
(22:18):
And even the languages they use. Of like: you will use Swift.
It's like, I don't wanna use Swift; my engine's in C.
Everything they do is not normal.
PJ (22:26):
Well, it's under that aspect
of control, right?
Rob (22:32):
Yeah: you, if you wanna work on our platform, you will do this.
You will use Swift if you want to use the modern UI.
And if your engine, a game engine, for example, or an app or whatever it is, is written in C... let's say the game engine, 'cause it's a real hard one and it's caused a lot of problems for me personally.
It's: okay, so I've got this engine and it's written
(22:53):
in C, and it's got a backend for Vulkan and it's got a backend for DirectX, which are both C/C++, and now I need to do a Mac version.
So now I have to deal with Objective-C, 'cause I'm not gonna deal with interfacing into Swift.
And so now I have to do an Objective-C backend just so I can call the APIs, because Metal has no C bindings at all.
(23:16):
That's just a horrible thing to have to deal with.
Lots of bugs, lots of marshalling of data.
Just, just awful.
And they could just provide a C interface for this one use case.
They, they could say: we recommend you don't use it, but it's here if you need it.
Which Microsoft do all the time.
PJ (23:30):
Again, does this derive from a philosophical issue that both companies, you know, differ on?
On one hand, Microsoft is very game-friendly; it's been in their DNA for the last several decades.
Apple is not.
Rob (23:45):
Games, I think, is gonna go the similar way as the AI.
I, I think, on one hand, Apple is the biggest gaming platform, um, because, well, they are: they have the phone and they have the Apple TV and things like that.
But, but no one thinks of them that way.
When you look at what Microsoft and Sony do for actual game
(24:05):
developers, it's an order of magnitude different, and 10 orders of magnitude better.
Apple could have been in the perfect position to be the, the winner in the game space.
They could just make a slightly more powerful Apple TV and it would be as powerful as a PlayStation.
But they're not willing to document the hardware.
(24:26):
They're not willing to help you make things more performant.
It's all: we'll write you a tool to do this, we'll make a better compiler, we'll make new hardware, blah, blah, blah.
It's, it's the very software-engineering approach, which games still haven't caught up with.
The games are like: this is the hardware we have, we're gonna make it go as quick as we can.
They have the mindset that we need to do this, 'cause this is how our modern GPUs work, but the
(24:48):
ecosystem that goes with it just isn't there.
Actually, talking of the GPU APIs brings us back to the 10,000-pound elephant in the room.
PJ (24:56):
CUDA.
Rob (24:57):
That's CUDA, yeah.
Like, all AI outside of Apple is pretty much done on CUDA at some point, especially for the training side.
The inference side, maybe not so much, but the training side is all CUDA, which is why Nvidia has such a huge position in the AI world.
PJ (25:17):
Yeah, just as a slight interjection, to be fair: most of the inference can happen locally or on other devices besides Nvidia.
But we can also confidently say that, since most inference is also in the cloud, inference is happening on Nvidia.
Rob (25:34):
Yeah, so how does that affect Apple? Going back to the Apple Intelligence of, like: are they doing it with CUDA and just using the cloud, or are they trying to do it in-house?
Which also puts 'em at a disadvantage, because now they have all this open-source R&D code that they could use, but they can't, because it's all written in CUDA.
(25:55):
And having been someone who's ported code out of CUDA, it's incredibly difficult.
CUDA has a lot of built-in things that just make things work.
Maybe not the most efficient way, but it does just work. And getting that out of there, and then having to kind of rearchitect the same sort of scaffolding that it needs, becomes very
(26:17):
difficult.
It's not always an easy one-to-one mapping, even if you have access to the internal APIs.
It's a problem I don't think they know how to fix, because they've burnt the bridge with Nvidia, so they can't go back, and they can't make a version of CUDA.
So what are they doing?
(26:37):
And they're just doing it in Metal code and kind of ad-hocking it together.
Not the best way.
And I think that's part of why they're so far behind: they can't use the tools that everybody else uses.
They have to do it in-house, and they're not doing it in a way... they're doing it the Apple way, which isn't the way that this market went.
PJ (26:58):
It's another useful data
point to talk about it from the
philosophy.
'Cause, like, on one hand, Apple has made a huge
investment into its own silicon.
It's designed its own neural engines.
It has the unified memory.
It, it has touted these things in terms of being so powerful.
Now, now they have a choice, you know, for these clouds they want to build.
Do they go this route, which it sounds like they are, of doing
(27:24):
it with Metal, trying to replicate all this stuff, really reinventing the wheel for the Apple hardware?
Or, how bad would it be as a press release, if it ever got leaked or got out, that they were using Nvidia hardware underneath the hood to take advantage of everything that's out there?
Rob (27:46):
They should stuff that press release, is what they should do.
I mean, yeah, Apple silicon is powerful with respect to its power consumption.
Is it a 5090?
PJ (27:59):
no.
Rob (28:00):
No, not even close.
And no matter how you fiddle the specs, or the stats, or your axis-less graphs in your presentations, it's not.
I mean, when the M4 came out, they compared it to a 4090 in very specific cases, and maybe in a few ways it is slightly faster.
But, uh, generally, in a general app using
(28:23):
general GPU techniques, the M4 hasn't got a chance in hell against the 4090.
Um, and that's a 4090. We're not even talking about the H-series
PJ (28:37):
Oh
Rob (28:37):
cards.
PJ (28:38):
For, yeah.
Yeah.
Rob (28:39):
Nvidia, obviously, are not stupid.
They have a lot of this stuff figured out, and it obviously works, because everybody uses it.
So if Apple want to be competitive in the AI space where everything is done in CUDA, they also need to kind of be in that CUDA space; otherwise they're reinventing the wheel.
They have to build the platform first before they can even do any good AI.
(28:59):
And they have to build the tools, they have to build the debuggers, they have to build everything, and they have to kind of re-implement everything that's already out there.
They could just take it and, uh, use it, and they do some of that.
I mean, you look at, uh, PyTorch: there's a Metal implementation of PyTorch and blah, blah, blah, which makes other people's AI useful on Apple.
(29:19):
But at the big scale, no one's using PyTorch.
At the big scale, it's all done natively.
PJ (29:26):
So, you've been inside of Apple.
You've come to what I think is a, is the correct engineering conclusion, which is: screw trying to do all this stuff for training on Apple silicon.
It's fine for inference.
We need to just step up, buy the NVIDIA hardware, start using that, at
(29:48):
least for our backends.
How does that go over with the executives? In terms of telling that to an exec who's invested billions of dollars in Apple silicon, as well as all the marketing credibility?
Rob (30:03):
It doesn't. That's why it hasn't happened.
PJ (30:05):
Okay.
Rob (30:06):
Yeah, you'd never sell that
inside Apple.
Not at all.
It would be a huge, huge PR and executive problem.
And don't forget, the executives are fairly disconnected from the actual bits that happen inside the processor.
So they work on these, they work on the same specs that they bullshit to everybody else.
PJ (30:26):
So, this is an interesting contrast, though.
Because, you know, I don't often put Google into a favorable light, but they develop their own TPUs for doing AI, uh, operations, and also have zero problem saying: yeah, we, we put in a massive purchase order to Nvidia to pick up a whole bunch of these cards.
Rob (30:49):
That's the Google approach.
It's like: Nvidia do this better, we'll just use Nvidia.
We are not in the market for remaking all of this.
We're not in the market for reinventing the wheel.
We'll just use them.
Yes, it's a competitor, but does it matter?
They have the TPUs for inference.
They can train on the Nvidia side, they can infer on the TPU side, and they can learn how to make better TPUs based on what Nvidia have
(31:10):
already done.
Apple just don't do that.
It's just not in their DNA.
It's like: we'll do it in-house, or, or we'll buy something.
But there's nothing to buy, 'cause everybody's already owned; all the big AI companies are all related to other $3 trillion companies.
PJ (31:27):
Yeah, they, they really all
have been snatched up.
And, and again, I mean, to
Rob (31:32):
I mean, they, they do hire,
they do poach back and forth and
things like that, and blah, blah, blah.
But one person can't make this happen.
PJ (31:40):
Play out the scenario of trying to buy an AI startup.
So, Rob, you're running an AI startup right now.
What hardware are you using to train your models and do inference on them?
Rob (31:55):
Yeah.
There's not even an answer, is there?
It's, it's Nvidia.
PJ (31:57):
Right.
So if I bought you, it's equivalent to me saying I need to go get Nvidia hardware at this particular point in time.
Rob (32:06):
Yeah.
And it'd be easier just for them to do it, but, I mean, they have the, they have the people.
I think they're struggling because of other restrictions: the walled garden and the hardware and things like that.
They open up a lot of research papers and they do document a lot of what they're doing, so they have smart people.
Why is it not playing out?
(32:26):
It's something else.
It's not just that. They may be the smartest people in the room, but something else is crippling them.
PJ (32:32):
Let's look at like, you
know, kind of this idea of we
need to do it all ourselves
Rob (32:38):
Well, before we go there, let's, let's talk about, like, why it is, uh, crippling.
First of all, it's like, is it even needed?
Do, does anybody want this product? That's the first thing.
I think that's a crippling, uh, question.
And secondly, how AI up to now, to date, has been: it'll hallucinate, it'll tell you wrong information.
(32:59):
It's, it's only as good as how you interpret it.
PJ (33:03):
Yeah.
Rob (33:04):
You can't rely on it being a hundred percent correct.
And you can see that with Google's, uh, AI Overview.
When you do a Google search, it's like, it's mostly right, but sometimes it's horribly wrong.
And it's only when you know the subject that you go: oh, that's actually very wrong.
How often is it wrong on things you don't understand so well?
Is that a place where Apple want to be? In this kind of, yeah, it's
(33:26):
kind of a gray world of like,eh, it's full of shit, but, or
it's elucidated to just makesomething up'cause it doesn't
know the answer at all.
Um, you see this in code where you use, like, Copilot or Cursor. If you're doing JavaScript, for like, how do I draw a circle in an HTML page,
(33:47):
it'll get it right every time. But if you start going into the weeds of, like, how do I program Nvidia GPUs directly, register-level stuff, there's very little other than the Nouveau driver documented on the internet. So it doesn't have much information, but it will attempt to answer it and generally gets it very wrong, and it will take
(34:10):
you down this wrong path. But is that a place that Apple want to be? If they integrate this current level of AI smarts into Siri, she'd be worse than she already is.
Go back to the original example. Daring Fireball had the same example in theirs too. And it was an Apple, uh, it was an Apple example to begin with,
(34:30):
of, like, asking Siri, what time does my mom arrive at the airport? And it's, it's a deep question when you get into it. Because A, where's the information? Is it in an email? Is it in Messages? Is it FaceTime? Did someone just write it on a note? Wherever it is, somewhere in the Apple ecosystem of data
(34:52):
is the information about my mum arriving. But where, where's she arriving at? Is she arriving at the airport, the train station? Who's my mum? How does Siri know who my mum is if I don't have a contact called Mum? And if, if you say, oh, my mum's name is whatever, then she has more information. But that's context.
(35:12):
Then again, Siri doesn't currently have any context, and that context gets much bigger as these questions get more difficult. Let's say you ask Siri to book me a trip to San Diego. Well, now she has to know when you're going and what you're going for. Do you want to get there a day early? Or maybe you're going for a conference and it starts at 9:00 AM, and you wouldn't wanna fly
(35:34):
in before 9:00 AM, you'd want to go the day before. What sort of hotel do you like? Uh, all of these personal preferences that she'd have to ask you. And she could ask you a million questions and you could answer them, but you wouldn't remember them. So it's, I, I think this back and forth context building with a,
(35:54):
a smart assistant is only useful if it remembers it. If I have to tell it the same answers to the same questions every time, it would be a lot of work. Whereas if she was smart enough to go, oh, last time you stayed, you'd rather be downtown than by the beach, maybe that's where you want to be. Maybe you're going to a city that doesn't have a beach, so therefore it's not even a question you'd want to ask.
(36:16):
If you booked me a ticket, you'd ask three or four questions and then you'd just do it. And if you booked me another ticket, you'd be like, okay, I know how this is gonna work out.
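Rob's point, that a back-and-forth assistant is only useful if it remembers your answers across sessions, can be sketched in code. This is a minimal, hypothetical illustration (the class, questions, and file path are invented for the example, not anything Apple ships): a tiny preference store that persists each answer so the same question never has to be asked twice.

```python
import json
import os

class PreferenceMemory:
    """Toy store for assistant preferences: ask once, remember across sessions."""

    def __init__(self, path):
        self.path = path
        self.prefs = {}
        if os.path.exists(path):
            with open(path) as f:
                self.prefs = json.load(f)

    def recall_or_ask(self, question, ask):
        # Return the remembered answer, or ask the user and persist the reply.
        if question not in self.prefs:
            self.prefs[question] = ask(question)
            with open(self.path, "w") as f:
                json.dump(self.prefs, f)
        return self.prefs[question]

path = "/tmp/prefs_demo.json"
if os.path.exists(path):
    os.remove(path)  # start fresh for the demo

# First trip: the assistant has to ask.
mem = PreferenceMemory(path)
hotel = mem.recall_or_ask("hotel area", lambda q: "downtown")

# Later trip (a new session): the answer is recalled, the question never re-asked.
mem2 = PreferenceMemory(path)
hotel_again = mem2.recall_or_ask("hotel area", lambda q: "beach")
print(hotel, hotel_again)  # downtown downtown
```

The second session's fallback answer ("beach") is never used, which is exactly the "don't ask me the same thing twice" behavior being described.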
How do you get that level of context? They can buy training data from OpenAI or Google or somebody, and they're not training it on your data. So they have a fully trained model and it doesn't need to
(36:37):
train on my data to read my email. So Apple's AI backend can read my email and get context of what my email says without ever having seen any of my data. That's something that they could purchase. But how much of my data does it need to have to be smart enough to have
(36:59):
context for a simple question? What time does my mom arrive at the airport? Or not even at the airport, what time does my mom get here?
PJ (37:05):
there's that longevity question you're getting into, I mean, which is again the, the tension of the privacy of the data, right? I want a personal assistant that's personal to me. And for that assistant to be personal to me, so I'm not answering the same questions, like you're saying, three or four times over and over and over again,
(37:26):
uh, data needs to be stored someplace. And ideally it's like, I can ask on my phone, I can ask on my computer, I can ask on my HomePod. So this data isn't stored locally to any of those things, unless you're syncing across the board all the time, which probably wouldn't work that well anyway. So there's a, there's this question then of, okay, then
(37:48):
where is this private data? 'Cause it's a lot of personal information about where I live, you know, what my credit card number is, like, where I like to stay, what my location's going to be. It's now gonna be held by somebody. And I think Apple wants to bank on the idea that, hey, we've been trustworthy with your data so far.
(38:08):
We're gonna hold this stuff in the cloud in some kind of secure, private location. That's the only way that you, like, get away with creating any sort of notion of a long context: to be able to build up some corpus of data
Rob (38:24):
Yep.
PJ (38:25):
itself to you.
But by necessity, that's personal data.
Rob (38:29):
That's highly personal data. That's not even, that's not even like, yeah, you read a few of my emails. It's just like, where are you right now? Think if you had a personal assistant, someone who works for you daily, and she or he is your personal assistant. Think how much they know about you. I mean, go into the celebrity world of personal assistants, there's NDAs and things like this,
(38:51):
'cause they know everything about you. Do you really want that to be in, in an Apple database somewhere? Even though they've never had any big, big data breaches doesn't mean they won't have.
PJ (39:03):
Well, I mean, as the data
value goes up, the,
Rob (39:06):
The opportunity cost goes up too. Yeah. It's like, or like, maybe no one's ever looked, 'cause it's like, who cares? I mean, the best way to not get hacked is to not be relevant. But if you are now the most relevant target with the best information, then how does law enforcement fit in? Are they going to open this up? Like, your actual personal assistant probably won't tell
(39:26):
the cops where you were. Would Apple, if the cops show up with a, with a, uh, subpoena? Is this truly personal, always encrypted? If it's encrypted, how are they reading it? There's all of these questions of like, well, if you can pass it between my devices and somehow have some context to reply to, you obviously interpreted that data, whatever
(39:48):
it is, whether it be encrypted in transport or not. It's usable by you at the backend. It can't all be local, 'cause it's too much power. It's a massive problem as to how you do it.
Even if they don't use your data to train, like I said, they could buy that data. They can get the ability to parse context without any of your
(40:09):
data. But as soon as they start applying that to you specifically, they have a little database with all your information in it. That's highly personal data. Whereas it's like, a credit card number, I don't care about it, because I can just get a new credit card,
PJ (40:26):
sure.
Rob (40:26):
like, like, a thousand websites will probably have my credit card number, and it's probably publicly on the internet too. Uh, it's just a thing, you know? Yeah. Sometimes some data has to be locked down, and sometimes it's better to approach data as, what would happen if somebody got this? I'm in the mindset now that a social security number is just public information. So let's
(40:48):
assume it's public. Same with a credit card number, assume it's public, and there are other ways to protect it. But some of this data can never be in that state. So it has to be private, it has to be secured. Yeah. That's why we have safes, that's why we have things. We put it away, keep it safe. And why would we give all this information to Apple?
(41:10):
And if we don't, then Siri will never be useful, no matter how smart she gets at the backend, without my data and without context of historical data points on me: where did I stay last time I went to San Diego, why did I go to this conference, what are my travel preferences, my credit card number, what are my preferences across the board?
(41:30):
And then extend that to kids and your wife and your business associates. Things that I could ask an assistant, and with maybe one or two questions back, they'd figure the rest out, because they, they can go find my flight details and put you on it, blah, blah, blah. So in this miraculous world that they put
(41:50):
in videos, where this just works, the details of how that just works are kind of scary.
PJ (41:57):
It gets to this conflicting set of constraints, and the privacy thing is a big constraint. And, you know, we're touching upon it: Facebook, Google, they're able to create statistical cohorts. Like, you and I might get served the same ads because we're round about the same age and, you know, demographics, blah, blah,
(42:19):
blah, blah. But to take it to the level where it's useful to Rob, where it's useful for PJ, there's a lot more personal data that's needed. Then on top of that, there's the question of, Apple, even though it's growing more in the services space, has not traditionally been a services company.
(42:41):
So there is a question of how you build up that set of competencies there. And then on top of that, there's, oh, we don't get to train on the same hardware as everybody else, we have to do it ourselves on Apple silicon. So now we've got all these constraints layering on top of
(43:02):
each other. With all that in play, Rob, does it then make sense why, you know, this stuff is, like, taking forever or may never appear?
Rob (43:10):
I'm rather in the never-appear camp, because unless it's actually a personal assistant, it's still gonna be clunky and useless. It's still gonna be like, I'm done with it, I'll just do it myself. The data problem is the crux of the problem. The, the technological backend and all that, they can work through that, or buy Nvidia or
(43:31):
not tell us, or whatever they wanna do. I don't care. It's the data problem: you can't be useful if you don't know anything about me.
PJ (43:37):
Yeah.
Yeah.
I.
Rob (43:40):
It's, it's, you just can't do it. So where does that data go? And would I be comfortable with Apple, with all of their privacy constraints that they've had over the years? And it's, I don't want that level of information about me to, to leave my filing cabinet.
PJ (44:01):
Come back to this, this one, because I, I think this is actually a lot of fun. But let me detour for a second and then round back, 'cause I know it was a question I asked you yesterday, and I'm gonna ask it again. So, if Google had in Android the ability to do this personal assistant stuff, would you switch from Apple to Android?
Rob (44:22):
I know yesterday I said I would, but I thought about it more, and I probably don't want Google to have that information either. Uh, I'm probably less likely to want Google to have that info, but
PJ (44:35):
a sec.
Rob (44:36):
they could probably derive a lot of it from what they already have. So I don't really know, would I switch? I'd like to switch away from Apple, because they're starting to annoy me, and I think a killer app on Google might be enough to make me switch. And I don't use my Mac as much as I used to.
(44:57):
My iPad is kind of, nah, don't use that. The whole integration between Apple devices, my watch, I hate. Uh, I don't need much to switch. I use Linux on a daily basis. I'm on a Mac right now, but I use Linux on a, on a daily basis. I use a bit of Windows for the graphics stuff, and I don't have anything that's keeping me on Apple. So if, if there was a killer app that's like, I need that, then
(45:21):
that would be enough. Would I switch for an ultra smart assistant? Again, what's going on with my data is the problem.
PJ (45:30):
So I get it that you don't trust Apple, but the reason I was, I'm poking you on it, has to do with: would you switch to Google, knowing,
Rob (45:39):
I trust Google even less than I trust Apple.
PJ (45:41):
Okay.
So the, the killer assistant actually is not the killer app for you to switch, switch off of, right?
Rob (45:48):
Right now, I don't think the killer assistant can exist, because of this problem. If, if you are willing to give up everything that they could possibly want to know about you, then that app could exist. But I am not willing to give that up.
(46:09):
I'm not even willing to give a lot of this to, like, HR of a company that I would, that I would work for. There's a lot of information you wouldn't even share with your spouse, and things like that. And, and then it also spreads. Yeah, how about my kids? All of this stuff: over time, this scope of data grows. And obviously if you know someone really well, they've
(46:31):
known this, and they've built up that same bundle of knowledge about you over time. And unless it knows that, it will never be useful,
PJ (46:42):
So are we saying that an AI assistant could reasonably only exist in a country like China, where all of your personal information is gonna be known by some government entity and its related corporations?
Rob (46:56):
But is it? There's always gonna be context that isn't known by the government, no matter how much they know about you. Which seat do I want to sit in? Do you wanna sit in the window seat, the aisle seat, the middle seat?
PJ (47:05):
but even
Rob (47:07):
You could take stats off the market, like, oh, I was, I was trained on all of this, he's probably not gonna want a middle seat, he'd want window or aisle. And it's like, yeah, there's data out there, but does that apply to me? If I travel with somebody else, I'll happily take the middle seat, 'cause I'd rather sit with them than be in a different seat.
PJ (47:26):
I mean, that's derivable
data though.
I could look at your history
Rob (47:29):
Yeah, but you have to know.
You have to know my history
PJ (47:31):
Right.
I'm,
Rob (47:33):
and switches back and forth
again.
Context is important.
all this higher level contextand it's context on top of
context, um, I don't think thisapp can ever exist.
And I think they're barking upthe wrong tree.
They're doing a me too of like,oh, we need to be in the AI
space and we're just gonna putsome bullshit out there of what
we could do.
And it's never going to happen.
(47:53):
And if that doesn't happen, whocares about the rest of it?
It's the only useful AI app thatApple can have is to make sir
smart.
Outside of that, it's useless.
PJ (48:02):
I'll recall back to something I said a year ago, ahead of our WWDC episode, our predictions episode. That is, I said Apple should just stay away from this, because I don't think there's a place for it yet in the Apple ecosystem. They went whole hog into it and went into fantasy land.
(48:24):
And I think where you're coming to, and I think I, I agree with you, is: either they need to accept that this app can't exist without an interminable amount of context that's required for it, or they have to make a really hard choice and open up the system to someone who can create a set of apps where this is useful. Beyond that, you are just caught between a rock and a hard place.
Rob (48:55):
Yeah, I don't think, even if they open it up, I'm still not gonna wanna share that level of data with somebody else. Unless somebody shows up and they're like, we are ultra secure, we do it this way. Maybe a new way of handling data which no one's thought of yet. But, but that would have to be replicated today by Apple, because otherwise they wouldn't be allowed to do it to the level required, due to the closed walled garden.
PJ (49:15):
we
Rob (49:16):
So there are, there are, I mean, it's all local. Maybe you have a server in your house and it all stays there, and, and it's on your network, it never leaves your own premises or whatever it is. And that's how they distribute across all the devices and whatnot. And, and
PJ (49:32):
how
Rob (49:33):
could have this.
PJ (49:34):
where I use Tailscale to create a VPN for myself. I'll use Tailscale for that, sit models on my PC, and then just hit it from either my phone or from
Rob (49:46):
Yep.
So,
PJ (49:48):
then it's not going
anywhere.
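The setup PJ describes can be sketched as follows. This assumes a self-hosted model server (for example, something shaped like Ollama's HTTP API) running on a PC that is reachable over a Tailscale address; the host, port, endpoint path, and model name here are illustrative assumptions, not a documented Apple or Tailscale API.

```python
import json
from urllib import request

# Hypothetical Tailscale address of the PC running the model server.
MODEL_HOST = "http://100.64.0.1:11434"

def build_prompt_request(prompt, model="llama3"):
    """Build an HTTP POST request for a self-hosted generate endpoint.

    The endpoint shape mirrors common local-model servers; adjust it
    to whatever server you actually run on your own network.
    """
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return request.Request(
        MODEL_HOST + "/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_prompt_request("What time does my mum get here?")
print(req.full_url)  # http://100.64.0.1:11434/api/generate

# Actually sending it requires the server to be up on your tailnet:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```

Because the traffic stays inside the tailnet, the prompt and any personal context never leave machines you own, which is the privacy property being discussed.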
Rob (49:49):
So maybe that's what they do. Maybe they have a little, it's a, it's a Mac mini that just sits in your house, and it does all of the mining, and all of the context management is all done locally. And that's, that's a game changer, if that's a thing they can do. But would they be able to sell it? I mean, as a technical solution it's viable. As a commercial solution?
(50:09):
As a consumer solution? Mm. Maybe commercial, because enterprise will be like, yeah, we'll buy a new piece of hardware, it's fine. Consumer? How do you even convince them they want it?
PJ (50:21):
No, it's an excellent point. I mean, imagine sitting it inside a Mac mini or an Apple TV. Like, beef up the Apple TV a little bit, and then all of a sudden, oh, next time I upgrade it, oh, I could just put it there. Great. It just sits as my, my little repository of information.
Rob (50:39):
Yeah, and it could get information from the whole house then, 'cause everybody else could use the same thing. But the whole, I dunno, it's not an Apple thing to do, but it, it's, it's a technically viable solution. Whether they could make it work, I, I don't know. But even with that, it's like, how does that get backed up? Where's that going? Is that encrypted and stored in the cloud encrypted?
(50:59):
Or can you put it on another device? Who knows? Would you ever be comfortable with any corporation having that much information? And how and where is it stored? They're never gonna be honest with you. They're never gonna say, it's stored right here, this is ours, 'cause if they tell you too much information, it gives you the ability to access other people's. The whole world of security is like, there's still a lot of,
(51:20):
yeah, security through secrecy is bad, but there's still a lot of secrecy in the high level things, of, like, how do you even get to the data?
PJ (51:27):
It is true. Security through obscurity might be bad, but it's still prevalent,
Rob (51:31):
But again, who's comfortable with this? I would, I still wouldn't be that comfortable with it. I'd be more comfortable with it being on a box in my house, but I'm not entirely sure. And it's just a longer term problem. Okay, once it exists, the cat's out of the bag. Is it a thing that the government could get a hold of? Is it a thing that the police could use to prove or disprove something?
PJ (51:52):
an important
Rob (51:53):
sure whether I'd be comfortable with any of that. So I'm a firm believer that this app cannot exist, no matter how much they make videos about how cool it is. Think about that app, where she shows the picture of the professor and is like, who is this, and when did I meet him? How does it know who he is? It's like, I said, it just used all your face scan data, that they said they'd never use for anything else, to identify
(52:15):
somebody else.
PJ (52:17):
No Rob, it uses bought face
scan data.
It's totally, totally legit.
Rob (52:22):
How is that a viable thing? Yeah. Are they going out to one of these face ID websites and just giving them a scan of the face? I don't want 'em to use that website. I don't want 'em to have any data. I don't want them to have my data. I don't want them to have my photos. So, like, each individual site that it's using, which we have no control over, is its own individual set of
(52:44):
problems. But, but yeah, that, that's just a creeper app. Why can't you just go up to some girl on the street and be like, oh, who's that? Yeah, tell me what you know about her. Oh, she lives here, she does this. Or maybe she gives you enough information that you can mine it in your head and be like, okay, I know. And now you've just made a perfect stalker app.
PJ (53:02):
This is like another one of those issues where we're at the edge, where, like, a lot of personalization is awesome, but also awful at the
Rob (53:11):
not everybody has good
intentions.
PJ (53:13):
has good intentions.
Rob (53:16):
What's your take on it? I just gave my take. What's your take on, on this? Do you agree with what I said, or would you be comfortable giving your information up for flexibility or convenience?
PJ (53:26):
I, I don't think there's a huge amount of utility for me to say, go buy me a ticket, for the amount of trouble that it could get into. And I'm using that as kind of the base example. It, it really is almost not worth it to me. It is so easy to do online, where I'm comparing prices. It takes, like, 10 minutes,
Rob (53:47):
I.
PJ (53:47):
it's still worth it to me
'cause I.
get exactly what I want and I'mnot, don't know if I wanna pay
for some AI assistant to go dothat.
all said, even though I don'tfind a whole lot of utility in
it, I think this app will existin a, maybe not in a perfect
(54:08):
form.
I think it will exist becausesomeone will just say, Hey look,
this is super cool stuff and itwill be done with a lot of
unintended consequences.
Rob (54:18):
Rabbit tried that, didn't they? They made that little Rabbit box that was supposed to be able to do things like this, and it was ultimately more of a scam than anything. But the catch was the backend, and all these questions came up then too, of, like, how does it know all these things? And, and it's very personal. They can't use generically mined data for things like this.
(54:39):
It has to be yours, and it has to come from questions it asked, or you fill a massive questionnaire out, or it gets it from somewhere else, which is equally scary. But, but yeah, it's, it's like, I don't think it needs to exist. And you made a great point: I can buy a plane ticket in two minutes.
PJ (54:55):
so I
Rob (54:56):
I.
PJ (54:57):
and again, I'm gonna be talking theoretical land, 'cause I, I can't think through a good example. But if I had an assistant that was able to do stuff for me in parallel, where it was, it was more cost effective, but it required a whole lot of my personal information, then in theory I could be more productive.
(55:19):
And again, we're in theory land at this point in time. That's why I sort of think, in China, if I have no right to my data, or in another country where I have sort of no right to privacy in some sense, could these models be developed to make people there more productive? Are there operations where it's like, oh, go out and do this,
(55:39):
and even though you're gonna write me the crappiest code, you'll make me a, a little app that I want, which then allows me to do something else? This is where I, I start to question, like, you know, are there going to be advantages in countries with far less data protections than ours? Simply from a productivity and cost standpoint. And again, I'm
(56:02):
not saying, Rob, that I want to give up my data. I don't. It's really just trying to think through the economics of, is there some scenario where someone can get advantaged by this stuff, over and above, let's say, someone in the US or the EU?
Rob (56:16):
Yeah, it's a good point. But then, do I really need to save the five minutes it takes me to book a plane ticket? I'm not a CEO. Think who has live-in personal assistants. Yeah. Uh, a secretary or someone who is a true personal assistant. I'm not that busy.
PJ (56:32):
Right.
Rob (56:32):
I could just do it myself, and I'm just as productive as I was, 'cause I've got everything I need to get done, and I have time to do these things. There are people who are so busy, or think they're so busy, that they need somebody else to do all this for them, and they're willing to give up that level of privacy to another person. And there's consequences if that person does things with that
(56:55):
information. What are those consequences in the data world, when a corporation does things with your data?
PJ (57:02):
Right.
Uh, here's a, here's something where it borders on personal information, but here's a, an example where I could be vastly more productive. So, when I was managing a whole bunch of people, one of the things I had to do was, there are evaluation reports, and they would provide me data on what their, their code
(57:24):
check-ins were, the design docs they created, to a certain extent descriptions of what they did. And then I had to regurgitate that and, you know, put it into a different form for other managers to see, and justify the ratings and all that. Quite frankly, some of that is corporate, bordering on personal, data. I would love to have an AI that would just pull up all the check-ins
(57:49):
that someone made, connect that to what the impact on the business would've been, and go through the design docs to say, okay, it should be this, this, and this, and here's the justification for it. Maybe some, some human intervention afterwards just to contextualize stuff. But quite honestly, that is something where it's like, that
(58:11):
would make my life a lot more efficient, as opposed to me spending 2, 3, 4 hours per person writing up these reports.
Rob (58:17):
But it makes your life easier. It being wrong has great consequences for somebody else, who didn't do anything, who was just misinterpreted by an AI, and you didn't catch it in your quick scan of it because you trusted it. So it's like, what's the consequence of it being wrong, versus the consequence of you being wrong?
(58:38):
If you continuously get performance reviews wrong, you'll get reprimanded by your manager, or you'll get fired, or whatever it is. There's a consequence for you not doing those performance reports properly. What is the consequence when the AI gets it wrong and you skipped it, because you only gave it five minutes per person instead of three hours a person? I get why you want it. The consequences for somebody else
(59:01):
are far worse with the AI model.
PJ (59:04):
In either case, whether I did it myself or used a tool, if the thing I represent as this work is wrong, I should get reprimanded. Like, no question. And I think that is, you know, if it saved me going from three hours to, like, half an hour or an hour, like, that's a huge productivity win for me.
Rob (59:25):
Okay.
I will buy that.
If, if you'll takeresponsibility for it
PJ (59:29):
I
Rob (59:29):
then,
PJ (59:29):
way you can look at it.
Rob (59:31):
but a corporation wouldn't
take res, a corporation wouldn't
take responsibility for it.
That's my point.
so in this case, okay, and thisfalls back to where I, I've
always said is AI and little usecases is quite useful.
These big, large language modelsthat kind of do everything badly
aren't that useful.
There's much better use casesfor AI in solving specific
localized problems, and thatthat kind of fits.
(59:54):
It's a big one.
It's starting to go down the RLMmodel because it has to be able
to read check-ins and thingslike that.
But it's, it's a, it's a closedworld use case where yes, it's
the big backend and it has a lotof access to a lot of data, but
it's, it, it's scoped.
There is a scope that it, itfits in.
Um, kind of fits into what I'vealways said about, like I said,
(01:00:17):
AI has some very good use casesin very specific, realms, but.
Me personally, I'm on.
I mean, right now it's like youlike to use the copilot stuff.
I've tinkered around with it andI don't like it because right
now I'm working on missioncritical code that has to pass
DO 1 78, that has to pass FAArequirements of, it takes me
(01:00:39):
just as long to verify that thiswould pass than it does to just
write the code in the, in the,uh, first place.
You can't, uh, say to an ai,write me a mission critical, do
1 78 FAA compliant navigationsystem.
PJ (01:00:54):
And I think that's, I think that's a hundred percent fair. There is, to your point, a small set of use cases, or there's scoped use cases, where I want it to do very specific things. And it's not a human language compiler, where it can take exactly that specification and spit out exactly the code at the other end. It'd be interesting to see someone train it to do that,
(01:01:16):
to make that happen.
Rob (01:01:17):
Again, training data. In this specific case, there's not enough example code out there.
PJ (01:01:21):
And I think there, there's an interesting case here, where, like, you sent me the, uh, you know, the C++ rules for, I can't remember, it was for the Joint Strike Fighter, right?
Rob (01:01:33):
It was, it was Lockheed Martin's doc, uh, document. Yeah. It's a public document, and it's very nice rules. But you could give a programmer that document and say, now write code, and you'd write code differently to what you wrote without that document. And you could be like, I'm aware that I'm obeying all of these rules today. You can't do that with an AI. You can't say, take that code and write it in the style of
(01:01:55):
this document.
PJ (01:01:55):
It's an interesting question of, where, like, this sort of, you know, that document could be considered the rules of an expert system.
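The idea of treating a coding-standard document as expert-system rules can be sketched as below. This is a hypothetical toy, not the actual JSF/Lockheed Martin checker: each rule is a predicate over a line of code, and a documented exception carries higher priority than the general rule it overrides, which is exactly the precedence-and-conflict problem being discussed.

```python
# Toy expert system over coding rules: general rules plus documented
# exceptions, where the highest-priority matching rule wins.

RULES = [
    # (priority, name, predicate over a line of code, verdict)
    (1, "no-malloc", lambda line: "malloc" in line, "violation"),
    # Documented exception: calls marked as external-library glue may allocate.
    (2, "extern-lib-exception",
     lambda line: "malloc" in line and "// extern-lib" in line, "allowed"),
]

def check(line):
    """Return the verdict of the highest-priority rule that matches the line."""
    matches = [(prio, verdict) for prio, _, pred, verdict in RULES if pred(line)]
    if not matches:
        return "ok"
    return max(matches)[1]  # highest priority takes precedence

print(check("buf = malloc(64);"))                 # only the general rule fires
print(check("buf = malloc(64);  // extern-lib"))  # the exception overrides it
print(check("int x = 0;"))                        # no rule applies
```

The hard part Rob goes on to describe is precisely what this toy glosses over: real rule sets conflict in ways that need a human-documented rationale, not just a priority number.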
Rob (01:02:03):
They literally are, but there's a lot of contention too. Yeah, they conflict. What do you do when this one is an exception for this one, and which one takes precedence, blah, blah, blah, and why? And a lot of it is, document the reason. Like, I made myself an exception to this rule because of this: I have to call this external library, and my rules conflict with their rules, whatever it may be.
(01:02:25):
All of this crops up all the time, and, and AI coding is terrible at this. So imagine how bad it would be booking flights, or figuring out when your kid comes home from college and how to get them home safely. It, it's just not gonna happen, and Siri's not gonna get it. Alexa and OK Google seem to have gone away. So Siri's the only one left, and she's not getting this level of
(01:02:49):
smart, ever. And if she doesn't get it, she stays useless, she stays irrelevant. I'll just ask her to play music, is all I do now, it's all I will do. If they make a new one without Siri, what's the point of Apple Intelligence? It's all these little localized problems, which AI is good at. Is, is that what they're gonna do? Is that what Apple Intelligence is gonna become?
(01:03:10):
Remove this object from a photo: good use for an AI. Remove the audio from a song so I can sing over it: silly use, but technically a good use. Is that what Apple Intelligence is gonna become? Just a bunch of little bespoke problems that they solved with localized AI models that they can run local, that
(01:03:31):
they can train on their shitty Apple hardware, and you can run them on the ANE on your Mac, on your Apple TV, on your phone, whatever it may be. I think that's the path they're going down. It's not a coherent Apple Intelligence. It's a little tiny trinket of a tool that they could put under the Apple Intelligence badge. I think that's what it's gonna be.
PJ (01:03:52):
I think it's good in the sense that I think that's the path all these tools are going, and I think it's bad in the sense that, you know, in Linux, I can go and pipe sed and grep, all these things I could just, like, put together to formulate, like, one bit to the other. I didn't, like, I don't want the monolithic program.
(01:04:14):
What I want is to know when to use these tools in particular situations, and be able to, like, go and do that thing. And it does it great.
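The Unix-style composition PJ is describing, small single-purpose tools piped together instead of one monolith, can be sketched even outside the shell. This is just an illustration of the design idea (the stage names and log lines are made up for the example): each stage does one thing to a stream of lines, and a tiny pipe helper chains them like the shell's `|`.

```python
from functools import reduce

# Small single-purpose "tools": each takes a stream of lines and yields
# a stream of lines, in the spirit of grep and sed.
def grep(pattern):
    return lambda lines: (l for l in lines if pattern in l)

def sed(old, new):
    return lambda lines: (l.replace(old, new) for l in lines)

def pipe(*stages):
    """Chain stages left to right, like the shell's | operator."""
    return lambda lines: reduce(lambda acc, stage: stage(acc), stages, lines)

log = [
    "INFO boot ok",
    "ERROR disk failure",
    "INFO shutdown",
    "ERROR disk timeout",
]

# Equivalent of: grep ERROR | sed 's/disk/ssd/'
errors = pipe(grep("ERROR"), sed("disk", "ssd"))
print(list(errors(log)))  # ['ERROR ssd failure', 'ERROR ssd timeout']
```

The point of the sketch is the contrast PJ draws: each tool is trivially reusable and scriptable on its own, which is exactly what a sealed, UI-only feature is not.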
Rob (01:04:24):
It does, but that's, that's not what Apple's doing.
PJ (01:04:26):
I get that.
Rob (01:04:27):
You can't, it, you can't get the edit list that it made to your photo to remove the lamppost in the background. It's just something that it does. Can you script it from the command line? Probably not. These are very disjoint, bespoke tools, all user-interfaced. You can't combine them yourself into bigger tools.
(01:04:47):
That's just not an Apple thing. Yeah, it's,
PJ (01:04:50):
the
Rob (01:04:50):
it's, it's,
PJ (01:04:51):
you know like, like
something's gotta give in some
constraints somewhere.
Rob (01:04:57):
No, Apple Intelligence is gonna be a bunch of useless trinkets. Some are useful to some people, but generally they aren't useful to anybody.
And that's what they'll do.
They'll add new trinkets under the Apple Intelligence banner, and that's what it will be.
(01:05:19):
I mean, technically, what is Apple Intelligence?
Why do you need special hardware?
PJ (01:05:22):
I, so, I mean, I'll be very
honest
Rob (01:05:25):
We've talked about this
before.
It's like my old phone, my iPhone 14, I have in my hand right here.
It could do the job.
It'd take a bit longer.
It might not be a great experience because it's gonna sit there and churn, but is that why they limited Apple Intelligence to the 16?
And this comes back to the walled garden.
(01:05:45):
They can do that because we have no control over our own hardware.
Uh, but I'm pretty sure there's nothing in any of these apps that they've released under the Apple Intelligence banner that wouldn't work on my older hardware.
It's just they don't want it to work.
PJ (01:05:59):
So at some point in time, then, I think Apple will make a change if it becomes an existential threat, and I think it's a question of whether they believe there's actually an existential threat here.
If there's not, then I agree: Apple Intelligence just becomes a set of trinkets that just operate independently.
If there is an existential threat, then you have to say, am
(01:06:22):
I gonna release the privacy constraint? Am I gonna release the scriptability constraint? Am I gonna release the API constraint? Am I gonna release the Nvidia constraint?
Like, which thing am I willing to let go of in order to create something competitive in the marketplace?
Because in my mind, if it's vaporware and it's not an existential threat,
(01:06:46):
it doesn't matter.
Rob (01:06:49):
It should stay vaporware, is what it should do.
And secondly, these existential threats: are they real? Like, Microsoft missed the mobile space, and for a while it was like, oh, it's gonna be the end of Microsoft.
They came out just fine.
They pivoted, did other things.
Yeah, they tinkered with it.
And I think the Windows Phone is gonna be the
(01:07:11):
equivalent of Apple Intelligence.
It's gonna be a half-assed attempt at this new tech that we are not a part of, but we desperately want to be a part of.
And it's gonna end the same way.
And Apple will just go on and do services, and everyone will forget they ever made it.
PJ (01:07:28):
And I think that's an incredibly valid way to go. Again, if they'd said a year ago, we're just not gonna touch AI right now, we would've been in the same place, except we wouldn't be ragging on Apple.
Right.
Rob (01:07:42):
But we would, because we'd be talking about how bad the stock price dropped and the lawsuits that the investors filed against the CEO, blah, blah, blah.
A lot of this is driven by: we have to be involved, because we're a public company.
PJ (01:07:55):
I truly, you know, this is like the one spot where I really go hard in the other direction, where I think Tim Cook could have said, look, we don't think that there's a great use case right now for AI that we can ship.
So we're gonna wait until there's a clear market leader or
(01:08:16):
a clear, like, paradigm shift for us to go after.
Don't worry. We'll put our best people on figuring this out.
I think that would've been enough.
But again,
Rob (01:08:26):
I agree.
PJ (01:08:27):
it's all counterfactual at this point in time.
So I'm wondering what the real stock price is.
Because what we're saying is the stock price is being buoyed by vaporware.
So now it's actually possibly worse.
Rob (01:08:45):
Oh yeah.
I'm not disagreeing with you, man.
I'm just saying that would've happened.
Steve Jobs always had this attitude where he didn't care what the stockholders thought. Like, I'll do this. But now they do.
Um, I think the change in leadership over the years at Apple has made it more, not Apple-like. Apple started off as the counterculture company, and
(01:09:07):
everything was different, and the whole Think Different thing.
That was true when Steve Jobs was there.
But like I said before, today Apple runs on the ethos of this dead guy who's been dead for 10 years, and it's still like, well, this is what Steve would do.
But is it? Without him being alive to ask him, you don't know.
(01:09:28):
So it just kind of becomes this almost folklore thing, and it just gets dissolved, and it's just more corporate speak at this point.
PJ (01:09:36):
So what else could they
Rob (01:09:38):
Yeah.
PJ (01:09:39):
have done?
Quite honestly, I mean, I think Apple's stock is in vastly more danger if people really start to hammer on where the hell this Apple Intelligence thing is.
Like, you made a promise and then failed. People are going
Rob (01:09:54):
They sold hardware on a promise, and people will remember that.
'Cause that phone was a thousand dollars.
PJ (01:10:01):
Right.
Rob (01:10:03):
And what did they get?
They got nothing.
They'll get nothing, because the features won't be out till the next version of iOS, at best.
Sorry.
And there'll be a new phone then too.
PJ (01:10:14):
This is, I think, actually dangerous for them because, you know, we're looking at two promises broken in the last two years.
First was the notion of the Apple Vision Pro being something like the new zeitgeist that was gonna create all this revolution. That didn't happen.
Then they, I think, very recklessly pivoted to this Apple Intelligence stuff and said, hey, we're gonna,
(01:10:37):
we're gonna take the thunder from all this AI stuff, and that hasn't happened yet.
So, you know, what's the next promise that they're gonna try and make?
Because now the pressure's on. Like, the right thing to do would be to kind of back off, hold off the horses, even if it takes a stock hit, just maybe take some more conservative,
(01:10:58):
cautious steps.
If they make a bigger promise on top of that, they gotta deliver on it.
Rob (01:11:04):
Well, that's what they're gonna do.
How often does Apple backpedal?
Very, very rarely.
Um, yes, they've delayed the new Siri and they've delayed Apple Intelligence, the key features that we still don't have.
They should have just come out and said it originally, to be honest: Apple Intelligence is a collection of smart utilities
(01:11:29):
and we'll add more over time; we'll start with a few here that we can.
Basically, be honest is what we're saying.
We're not saying anything more than, don't bullshit us with produced videos.
Go back to live presentations, live demos of what it can do today.
If you can't sell it today, don't sell it today.
And people make mistakes.
(01:11:49):
No one's saying you can't make mistakes.
It's just, don't bullshit us.
PJ (01:11:52):
I think them coming clean about this, you know, doing a realignment, doing some mea culpas and saying, look, we're gonna, you know, we'll do better, I think goes a long way.
Yeah.
There are some short-term consequences.
But I think in the long term it's better than trying to keep making up for shit by making bigger and bigger
(01:12:15):
promises.
Because when you do fail, and fail big, that's when companies really get hammered.
Rob (01:12:23):
Yeah.
And I think, like I said, this is Apple's moment that Microsoft had with mobile, and they'll be fine in the long run.
They'll be fine, but they won't be a company you think about when you think about AI, just like Microsoft isn't a company you think about when you think about mobile.
PJ (01:12:40):
And that's fine.
It is a totally fine thing not to think of Microsoft and mobile.
Rob (01:12:45):
And I do think, for Apple to be a big player in this, everything needs to change.
The walled garden needs to change.
Let other people do this, because they're better at it than you are.
It's not one of these things that's the cause of the problem.
It's a little bit of everything, and it's death by a thousand cuts.
PJ (01:13:01):
I agree.
I think it's the overlap of all these constraints, and I don't know if you need to remove every constraint. I think you need to remove at least one and make a decision about which one you're gonna remove.
Again, do you wanna remove the privacy bit?
Do you remove the closed garden approach?
Do you, you know, tone down the promises you're making?
(01:13:23):
Do you go to Nvidia hardware?
I mean, make a decision someplace, but you're backing yourself into a corner, and it's hard to get out.
At the end of the day, it's okay. It's natural and good for companies to say, look, we made a mistake here.
Like, we're gonna do something better.
We're gonna match what the marketplace wants.
(01:13:45):
Like, that's really understandable.
I think trying to do it all, that's a great recipe for disaster.
Rob (01:13:51):
For sure, and we'll see.
WWDC is right around the corner, and maybe I'll have to eat every word I said, 'cause they'll come out with this perfect solution that we've not thought of.
But I'll believe that when I see it.