
October 6, 2025 · 48 mins


We trade last‑minute schedules and kid chaos for a deep dive into how modern phones leak data, why “Ask App Not to Track” isn’t enforcement, and what a third platform built for privacy and free speech looks like. Joe shares his Apple-to-Unplugged journey, the Raxxis findings, and practical features that make privacy usable.

• zero‑to‑one background from Nomi acquisition to Apple services
• motivation for a third platform beyond Apple and Google
• Raxxis test revealing 3,400 sessions and 210,000 packets in one hour
• third‑party data brokers, pattern‑of‑life risks, Fourth Amendment gaps
• layered threat model from passive tracking to seizure and signals
• emergency reset, false PIN wipe, and hardware battery cut‑off
• first‑party vs third‑party privacy and ecosystem incentives
• “Ask App Not to Track” as preference vs permission
• Time Away to reduce engagement and regain attention
• firewall, USB data blocking, 2G limits, Bluetooth controls
• camouflaged VPN and operational noise in repressive networks
• app compatibility layer and broader app sourcing without Google
• clear business model: hardware and subscriptions, no data sale




Support the show

Follow the Podcast on Social Media!

Tesla Referral Code: https://ts.la/joseph675128

YouTube: https://www.youtube.com/@securityunfilteredpodcast

Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_01 (00:00):
How's it going, Joe?
It's great to get you on the podcast.
You know, we put this thing together pretty last minute, and I was surprised that it actually worked out.
Usually with last-minute podcasts there's a million things that happen that just take over and bump it out even further.
So I'm glad to, you know, be able to get you on.

SPEAKER_00 (00:22):
Thank you so much, Joe.
I appreciate it.
And yeah, the timing worked out great.
I happen to have been traveling a ton recently, but we have a nice window of quiet today here back at home.
So I'm really glad it lined up.

SPEAKER_01 (00:35):
Yeah, I have quiet for maybe three more hours, until my little terrorists come home from daycare.
It's always just crazy as soon as they come home.
And how old are they?
I've got a two-and-a-half-year-old and then a four-month-old.
So the four-month-old, you know, crazy, but the two-and-a-half-

(00:57):
year-old.
I'm in a similar boat.

SPEAKER_00 (00:59):
So our youngest is one, we have a two-year-old (those are two girls), and then we have four boys: eight, nine, ten, eleven.
So I understand.

SPEAKER_01 (01:08):
Jeez, you have six kids?
That is... I do, yeah.

SPEAKER_00 (01:12):
We're blessed.
We're blessed.

SPEAKER_01 (01:15):
Man.
That is hard.
I'm like I'm at two right nowand I'm questioning if I want to
go for three.

SPEAKER_00 (01:21):
Like, just go for it.
Stack 'em.
Keep going.
Stack 'em up.
It's great.
It's sort of... especially as they get older, they start taking care of their siblings.
It becomes more manageable.

SPEAKER_01 (01:30):
Yeah.
Yeah, because I'm, like, in the middle of my PhD right now, and I'm just like: I need two hours to do this.
You know, and then two hours turns into four hours, and, you know, I'm up till two or three a.m. doing it, and then I'm up early with the kids.
It's just crazy.

SPEAKER_00 (01:48):
Yes.
I do not promise you will get any sleep with more kids.

SPEAKER_01 (01:52):
Yeah.

SPEAKER_00 (01:53):
That will not happen.

SPEAKER_01 (01:55):
I mean, I don't really need it, I guess, you know.
I haven't really slept for so long now.

SPEAKER_00 (02:01):
It's like that's sort of the boat I'm in.
Yeah.

SPEAKER_01 (02:04):
Yeah.
With the first one, I thought that it would be like a giant problem.
And it was in the beginning, but after like three weeks I got so used to only getting four hours of sleep that, literally, if I got to bed at like 10 p.m., I would be up wide awake at 2, 3 a.m., totally fine, have my entire day, take a quick nap, and

(02:28):
then continue on, right?
Like, it was so crazy how quickly I adjusted, but it was brutal for those two, three weeks.
It's so terrible.

SPEAKER_00 (02:40):
Yeah, there's a transition cost, and then it normalizes, and then I think I'll probably be in that zone for, you know, another 15 years.
And, you know, so it goes.

SPEAKER_01 (02:50):
Yeah, yeah, no, for sure.
Well, Joe, you know, I want to start you off with, you know, telling your background.
How did you get into the space?
You know, did you start in IT, did you start in security, or did you start somewhere else and kind of, you know, pivot over, right?
Just kind of follow the winding road of life.

(03:10):
And here you are, right?
So what does that background look like?

SPEAKER_00 (03:14):
Sure.
My professional career started actually running a family business in the advertising production space, which was really exciting.
We innovated a lot of live, real-time, human-emotion-capture-based 3D animation technology.
Um, it was a great small business in New York City where my dad and I had these two companies together.
He had started it and I had a chance to get involved in it.

(03:34):
And that went really well.
We grew that business a lot.
It basically used 3D animation to replace drawings to plan out large-budget commercials.
So we focused a lot on VFX and storytelling and sort of quick productions for clients to gauge commercial ideas.
That was great.
In 2013, '14, I wanted to transition out of this family
(03:57):
business.
I'd also seen that the advertising market was consolidating and price pressure was happening a lot in the production space.
At that time, a friend of mine and I started talking about a new idea for a video app.
And a lot of the skills that I learned working with teams to build technology for the animation company, we turned

(04:18):
that into building a video app for iPhone.
That was called Nomi.
And Apple acquired that in 2015.
So that's how my journey at Apple started.
My first meeting at Apple was pitching Jony Ive in the design studio on my phone app.
I was very excited.
It was sort of this entrepreneur's dream, right?
Where, like, the curtain opens and they're like: we want to buy your company.

(04:39):
I was very excited.
But that kicked off a career in consumer tech at Apple, where I led a special projects division in the services group.
I was sort of like one of the zero-to-one guys at Apple.
I would be tasked with new product ideas.
Sometimes that was leadership saying: we need a solution for this.

(04:59):
Sometimes it was me saying: hey, we should do this.
But I led a cross-functional team that innovated a number of new products that shipped on iPhone and iPad, and also dealt a lot with other platforms.
I would say we looked at a lot; we touched a lot of the Apple ecosystem.
But my focus there really refined over time to on-device intelligence while retaining customer privacy.

(05:23):
I was very excited about being able to design experiences that enabled great customer experiences without allowing Apple or any developer to see customer data from the device.
So privacy was a major focus of mine.
When I saw Eric talk about the UP Phone and Unplugged (Erik Prince, who was one of the founders of the company) last

(05:44):
summer, I saw him on a podcast.
I was already very invested in the sort of privacy space at Apple.
And I had, you know, over time increasingly felt like we really need a third platform.
Apple and Google are great at what they do.
But I think, although they're different companies, they're very similar in a lot of ways.

(06:06):
Their business models are symbiotic.
Apple is in many ways a leader in privacy, but it also sells real estate to Google, for a lot of money, to be the default search engine on iPhone.
So I felt, you know, the need both for a new platform that really focused on customer privacy big time (like completely, not partially, but completely).

(06:28):
As well as a third platform that was less willing to control what information could be seen by customers on the platform.
This is another area that I'd become increasingly concerned about: the US having really two ways of experiencing the internet, and both of them sort of agreeing on what can and can't be said.
And I think for a lot of people, whether it was COVID stuff or

(06:52):
other stuff that at certain times was politically sensitive and then turned out to be obviously true and something we've all accepted, that felt like another reason for a third platform.
So when I saw Eric talking about Unplugged, I was very interested.
I actually initiated contact.
I reached out.
My initial approach was just as a supporter, sort of like: hey, listen, I'm on the inside of this.
(07:13):
I think what you're doing is great.
I actually had a little constructive feedback about how they could sort of think about things, nothing, you know, proprietary or whatever, but that just began a relationship.
And, you know, the more I leaned in and the more I learned, really the more concerned I became about the status quo of the two dominant platforms and the more optimistic I became

(07:35):
about the prospects for a new, alternative, independent platform.
So that's brought me to the final decision, which I made last May.
This was a very challenging decision for me.
I mentioned I have six kids.
You know, Apple was very good to me and it was a hard call.
But in the end, for many, many reasons sort of coming together, it felt like absolutely the right thing to

(07:58):
do.
I will say the timing was sort of serendipitous, because the very day that I had this kind of yes-or-no conversation with Eric was the very day that I had gotten some really good news on something I'd been focusing on for years at work, and it was like: yay, great news.
Wow, I'm about to walk away from this.
And, you know, but that's how God lined it up.

(08:18):
So here we are.

SPEAKER_01 (08:19):
Yeah.
That's a... so that's a fascinating journey, right?
When you started, you know, that first company, I mean, looking back on it, did you even have the mentality of, like, hey, maybe I can sell this thing and, you know, make X amount of money, right?
Like, it's kind of just how life happens, you know?

(08:40):
I mean, I don't know how else to put it.
Like, you take the leap of faith, you know, and then life happens, right?
And you kind of, uh, end up in this position where you're at Apple with, I mean, a dream job, right?

SPEAKER_00 (08:55):
Like, I mean... this is totally correct.
Yes, that is a good way to describe how it happened.
You know, I personally have an understanding about this, you know.
I'm a Christian; I believe God had this all written out for me.
I also can say I had zero visibility into what the plan was.
And I also will say that, you know, another thing that has been clarified for me is that almost every positive turn in

(09:17):
the journey that I've described was on the heels of some disaster, right?
Like, you know, I just told you the kind of cliff notes, and it sounded like this, but it was actually like... yeah.
A lot of people quote Steve Jobs on a bunch of things.
There's one quote of his that I really hold close,
(09:38):
where someone, a reporter, asked him: what's kind of the secret ingredient of a great product?
And he starts by saying how not to do it, which is: people get this notion that if they have an idea and they can just go tell people to figure it out, it'll just work out.
But that's not how products happen.
Products happen by endless iteration of individuals who discover, bouncing off of failure, what the thing really

(10:01):
wants to be.
And I believe that's true for a product.
I think it's also been true for my career, which has not been conventional at all.
I didn't finish college; I have no official credentials of any kind.
What I have been, you know, really blessed with is incredible people to work with who've taught me a lot, a lot of good breaks, a lot of bad breaks that we've sort of continued on

(10:21):
and pushed through and turned into opportunities, both in the previous company and also in the journey within Apple.
I mean, the journey within Apple was, you know, great news from one executive, then the next day it would get shot down.
And, oh, we've got to come back and... you know, it was a lot, a lot of that.
And it continues now, you know, with a different problem set,
(10:42):
but one that I think is very important and very well timed.
This issue of, you know, concern both with data privacy, with free speech, and with not having a lot of alternatives for these products that we spend our whole day on is, I believe, increasingly important to more and more Americans.

SPEAKER_01 (10:59):
Yeah, no, it absolutely is.
It's, like, at the forefront, you know, of my own mentality and how I view security and privacy.
And now I have little kids, so, you know, the security paranoia in me is turning into, like: okay, well, how do I, like,

(11:20):
protect their existence?
You know, because this thing called the internet isn't going away, you know.
LLMs are not going away; you know, the genie's out of the bottle.
So what do we do?
What do we do?

SPEAKER_00 (11:34):
I completely agree with you, and I share those concerns, which is another... I mentioned there were many reasons that drove me to this decision, and that was a big one: I look at my kids and I think about their future.
What I hadn't fully understood, and what honestly was a learning journey for me: Apple, I mentioned, is sort of of two minds when it comes to privacy.
I was on the end of: invent cool things that protect customer

(11:57):
privacy.
Great.
What I realized, though, is that while I was doing Apple first-party services that may have taken that approach, mostly what we do on iPhones as customers is third-party applications.
And the truth is, I sort of imagined that my approach to first-party software at Apple just magically applied to third-party software, which I have come to see as naive,

(12:18):
with "naive" being the understatement of the century.
So we recently actually did a test.
I don't know if you've seen this; it's on our site.
It's really eye-popping, but we hired this cybersecurity firm called Raxxis.
And we asked Raxxis to take our phone and an iPhone and put them both on closed networks with 33 standard apps we all see, you

(12:38):
know, from Spotify to Pinterest or whatever, Expedia, stuff we use all the time.
Then select Ask App Not to Track for every iPhone app and watch the traffic that goes on and off the device at a really acute level.
Pardon me.
What we saw blew my mind.
The iPhone, even after Ask App Not to Track was selected for

(12:58):
every app, in one hour opened 3,400 sessions with known third-party data harvesting servers.
Meaning the apps that were launched had SDKs in them that reached out to these servers to transmit... what?
Location in any way they can get it, of which there are many ways; any other types of fingerprinting; trying to get on Bluetooth to see other devices that are around; the orientation of the phone; whether you're moving or still; all that stuff; what you're doing in the app.
In those 3,400 calls that were completed, 210,000 packets of data were transacted.
This is in one hour.
And this is a phone with just 33 apps.
Most phones have hundreds of apps.
So if you imagine this: this is one hour.
This is happening 24 hours a day.
Right?

(13:40):
So we're talking about, you know, roughly 58 packets of data per second in this test.
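The per-second and per-day rates follow directly from the two figures quoted in the test; a quick sanity check of the arithmetic, assuming the measured hour is representative of the whole day:

```python
# Figures quoted from the Raxxis test: one hour, one phone, 33 apps.
sessions_per_hour = 3_400
packets_per_hour = 210_000

# Per-second rate over that hour.
packets_per_second = packets_per_hour / 3600   # roughly 58 packets every second

# Naive 24-hour extrapolation, assuming the measured hour is typical.
sessions_per_day = sessions_per_hour * 24      # 81,600 tracker sessions a day
packets_per_day = packets_per_hour * 24        # 5,040,000 packets a day
```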
Now the good news is our device stopped all of them.
So it shut down calls to all of these known data harvesting servers.
But again, this is an example of, like, a discovery, right?
So it was that test which led us to realize, like: wow, these

(14:00):
numbers are crazy.
So we created this feature.
So can you see this uh dashboardon my device?
Yeah.
What's the number there in themiddle?

SPEAKER_01 (14:09):
The middle, 976.
Oh, 977.

SPEAKER_00 (14:13):
977.
Okay.
That's the number of times... it's 11:24 here.
That's the number of... 978.
That's the number of times today that my device has stopped the apps on my phone from opening sessions with third-party data harvesters.
By the time I go to bed, that'll be 5, 6, 7,000.
Jeez.
So why does that matter?
Okay.
Back to our kids and the future and security.

(14:34):
What I didn't know, what I have come to get very educated on, is that there are layers here.
Number one, I did not understand; I thought all tracking was the same.
Google, Meta... there are differences; they're different species, and they each have different risks.
But what we're talking about here is third-party tracking services that are part of this data harvesting ecosystem that
(14:55):
is behind this huge world of online advertising.
The data that is gathered here is publicly purchasable by basically anybody.
This is not something I understood.
I did not understand that there were reams of data spilling off all of our phones that is being held in databases, analyzed, and publicly purchasable.
So I've now done this myself.

(15:16):
With a few bucks and, you know, a website, you can go to an advertising firm, a data harvesting firm, and basically design campaigns to pinpoint devices that go to a certain address and that open certain websites.
You can say: put this on a phone in this area that reads this

(15:36):
website: Breitbart versus CNN.
But you can also reverse engineer this data.
It's very easy to buy a bunch of data from a certain area and say: hey, I know this person goes to the gym here and goes to school here.
Show me all the phones that do that.
Oh, there's only one phone that goes to those two places, and here's where they sleep.
This is all totally doable.
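The "show me all the phones that do that" step he describes is, at heart, a set intersection over purchased location records; a minimal sketch, with invented device IDs and place names:

```python
# Invented sample of broker-style location records: (device_id, place) visits.
visits = [
    ("device_a", "gym"), ("device_a", "school"), ("device_a", "home_12_elm_st"),
    ("device_b", "gym"), ("device_b", "airport"),
    ("device_c", "school"), ("device_c", "mall"),
]

def devices_seen_at(place):
    """All device IDs with at least one recorded visit to `place`."""
    return {device for device, p in visits if p == place}

# "Show me all the phones that go to both the gym and the school."
candidates = devices_seen_at("gym") & devices_seen_at("school")

# Only one device matches, so every other place it visits is now attributable
# to that person, including where the phone "sleeps".
pattern_of_life = {p for device, p in visits if device in candidates}
```

With real broker data the records are timestamped lat/lon pings rather than labeled places, but the intersection logic is the same.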
I'm not the one lifting the curtain here.
(15:58):
Byron Tau wrote a book on this called Means of Control; about a year ago, this became popular.
The main subject of his book is a guy called Mike Yeagley, who we work with, an intelligence person who really revealed this when he went to the head of the DOD and did this process I just described to get the home addresses of an entire Delta team, which is obviously

(16:18):
like a super secret thing, right?
So when we think about the future and LLMs and all this data that's out there: it's purchasable by our government.
So that's the first risk, in my view: there's no Fourth Amendment protection on this data.
It's considered third-party-doctrine-covered data because you've shared it with the company.
It has zero Fourth Amendment protection.

(16:39):
So that means the government can, and does today, buy this data and say: show me the people who go to gun stores, show me the people who go to crypto conventions, show me the people who go to gay bars, whatever.
Happening all the time.
That's one. Two: foreign countries are doing this, profiling people in our country who have important jobs or who might be involved in government or defense.
Like, this is huge, risky stuff.

(17:00):
Also, individuals can do it.
Crazy individuals or private investigators are doing this all the time in court cases now.
Right.
So this is, I think, the big one; like, there are many layers, as we often describe.
We see it as sort of a pyramid of risk of smartphone data and security.
But this is the big one in our mind because it's happening so

(17:21):
much, and the data requires no warrant to access and has literally no constitutional protection.
And I don't think anyone understands that this is happening.
So this is sort of a big area that we see as a huge risk.
This is sort of the base of the pyramid for us: this third-party tracking, this publicly purchasable data about all of us, which, again, doesn't just cover you, right?

(17:42):
You can very easily find out your wife's phone, right?
And where she works, or where you work.
It's crazy.
It's very easy to draw patterns of life for people from this.
I can actually show you, if you're interested, an example of this.
It's crazy.
So this is like the bottom of the pyramid, right?
And then the next level up, above passive data harvesting, is what

(18:03):
happens if your phone is actually taken.
That's another thing I don't think people realize: we're walking around with devices that have our most personal thoughts, our relationships, our banking information.
And whether it's a criminal or a law enforcement situation, or if you're crossing a border... by the way, if you're going on a vacation and you're going into another country, you have no rights to protect that phone, right?

(18:25):
So, you know, in this scenario, we have unique features in the product, not just to block the third-party tracking, but to make it very easy to basically, like, remotely wipe your phone.
So one of the things we have, which I love, is, um, we call it emergency reset.
I have, like, a false PIN on my phone, which I won't tell you.
But if someone demanded that I open my phone, I can tell them a code that will wipe the phone and replace it with

(18:48):
essentially fake data.
I can also tell the phone: hey, if I'm not back here in 30 minutes, or two days, or two weeks, wipe.
Uh, things like that, you know?
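The two behaviors just described, a duress PIN that wipes and presents fake data, plus a wipe-if-not-back deadline, boil down to a small piece of unlock logic. A minimal sketch with made-up PINs and class names; a real device would use hardware-backed key storage and a proper KDF, not this:

```python
import hashlib
import time

def _digest(pin: str) -> str:
    # Illustrative only; real devices derive keys in secure hardware.
    return hashlib.sha256(pin.encode()).hexdigest()

REAL_PIN = _digest("2468")    # made-up PIN for the sketch
DURESS_PIN = _digest("1379")  # the "false PIN" that triggers a wipe

class Handset:
    def __init__(self, return_deadline_s=None):
        self.wiped = False
        # Optional dead-man timer: wipe if not unlocked before the deadline.
        self.deadline = time.time() + return_deadline_s if return_deadline_s else None

    def _wipe(self):
        self.wiped = True  # stand-in for erasing keys and loading decoy data

    def unlock(self, pin: str) -> bool:
        if self.deadline is not None and time.time() > self.deadline:
            self._wipe()      # owner never came back in time
            return False
        if _digest(pin) == DURESS_PIN:
            self._wipe()      # coerced entry: wipe, then appear to open
            return True       # the coercer sees a "successful" unlock
        return _digest(pin) == REAL_PIN
```

The key design point is that the duress path returns success, so the person demanding entry has no signal that the real data is gone.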
And then lastly, there's the top of the pyramid, which is all of the unintended electronic tracing that can happen, or penetration that can happen, outside of the kind of commonplace third-party stuff.

(19:09):
And, you know, this is things like turning off your phone and not realizing that when it's off, it's actually pinging towers to do some data transmission that you aren't aware of, and you've just left a trail of your location, right?
So we have a very simple feature: a hardware switch right here.
With a little pen or pencil, you can just flick it, and it separates the battery from the electronics, and the phone becomes completely inert.
It's like putting it in a Faraday bag, right?

(19:30):
So we have a ton of customers, you know, many of whom are in very sensitive security-related jobs.
And they have to go into situations where, like, maybe they're walking into a room with very senior people in a defense-org situation.
They need to physically turn the phone off, because for someone like that, you kind of never know, right?

(19:51):
Like, is my phone being turned into a monitor and I don't know it?
You know, the only way to get around that in software is with hardware that literally shuts it down.
So, in any event, you know, again: these security risks related to our phones are myriad, they're layered, and they're definitely not going away.
Like, as more and more of our lives are lived online, right?
As our screen time grows, as more of our lives and

(20:13):
relationships are conducted on these phones.
And as you pointed out, as the technical capabilities via LLMs emerge to give everyone the tools to quickly synthesize data sets and target people and extract information, the risks here are just gonna keep multiplying.

SPEAKER_01 (20:29):
Yeah.
That's crazy, right?
So I'm a huge Apple iPhone guy.
I've got MacBooks, I have iPads, I love the products.
You know, I have new ones coming in because they just launched some devices, right?
And I guess this is what I would say, right?
I think there's probably a delineation that you also hinted

(20:53):
at... that there's a delineation between, like, Apple-native services or apps and third-party services and apps, and how they're both using, you know, the resources and the limitations of the device.
Can you talk about that a little bit?
Because it sounds like Apple itself is indeed following the privacy and security practices that it claims it has and

(21:16):
offers to its users.
It sounds like the limitations are looser for third-party apps because they aren't able to control it the exact same way.
Does that make sense?
Am I off?

SPEAKER_00 (21:28):
Or... yeah, I think that makes sense.
And I'm certainly not speaking on Apple's behalf.
Right.
And, you know, I will say that I was very proud of the privacy work I did at Apple.
What we discovered in this test... I would say a couple of things really emerged for us.
We also tested a Samsung in that same test.
And it actually made fewer calls to third-party data harvesters,
(21:52):
by about half compared to theiPhone.
Now, some people saw that andsaid, Joe, does that mean the
iPhone is less secure?
I did not take it to mean that.
What I took it to mean,anecdotally, but I I stand by
this, is the iPhone is a muchricher target for data
harvesters.
The value of iPhone users ismuch more valuable to
advertisers than the valuable ofAndroid users on average, by an

(22:14):
order of magnitude.
It's not a small difference.
So my takeaway is not: oh, therefore iPhone is less secure than Android.
I don't have any information to suggest that.
But what I do see is that iPhone apps, third-party apps targeted at iOS, appear to me from this test to have more

(22:34):
data harvesting eyeballs on them, because there's so much value attached to iPhone users, especially American iPhone customers.
So regarding the delineation between first- and third-party data: Apple speaks very publicly about its own policies with data.
And I think that those are admirable.
And I've never seen anything that would indicate a divergence

(22:56):
from those.
We should double-click into that, though.
Yeah, yeah.
Because, you know, while Apple has services that are designed to prevent Apple apps or third-party apps from seeing things... like, one of the things I was involved in is the smart share sheet.
It's one of these examples of an out-of-process system

(23:16):
experience.
Like, you know, when you go to share a photo in iOS and a sheet comes up, and it has suggestions of who you might share with.
Our team designed that.
And that experience is out of process from any application.
What that means is the app gets backgrounded, the sheet comes up, and you're interacting with the OS at that time, meaning the data about who you're talking to is in the OS, not the app.

(23:37):
I would consider that, like, an elegant... I personally am excited about that solution.
Apps can't see that data.
Okay, great.
To me, that is one type of choice.
Another type of choice is the symbiotic deal that Apple has with Google.
That is a very different type of choice, in my perspective, right?
In which, rumored, tens of billions of dollars... I don't

(23:59):
have any private knowledge, you know, whatever.
I've seen news articles that say that a lot of money is transacted to essentially rent the default-search-engine space on iPhone.
So those two choices, to me, represent different types of choices.
Like, in one case, the default is privacy.
In another case, the default priority is revenue.

(24:20):
So those seem divergent to me.
When it came to this Ask App Not to Track situation, I was really alarmed by the outcome we saw in that test.
And again, I'm just speaking here as a customer; no special insight here, other than as someone viewing this from the outside, and this test is by a third party.
But I will say, when I looked at the language of Ask

(24:42):
App Not to Track, just as a result of this test, it really hit me.
I was like: this is different than I thought it was.
And I really slowed down.
And it's like, I thought this was a permission.
Like: can this app have access to my contact list or not?
And then I looked at other permissions, and those were saying: can this app have access to your content?

(25:04):
Can this app have access to your camera?
Can this app have access to your location?
This wasn't saying that.
This was saying "ask app not to track," which is very different.
The other ones weren't saying "ask app not to access my camera."
They were saying: are you gonna allow the app to access your camera?
And I realized... I sort of started pulling that thread.

(25:26):
And I realized, from my perspective, the language and the presentation were less than clear.
And as a customer, I thought I was setting a permission, but I wasn't.
I was asking it not to do something, and it just seems like such a strange thing to ask.
Like, why would I want an app to track me?
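The distinction he's drawing, a permission the OS enforces versus a preference the app is merely asked to honor, can be caricatured in a few lines. This is a toy illustration of the concept, not how iOS actually implements either mechanism; all names here are invented:

```python
class MockOS:
    """Toy OS: camera access is a permission, tracking is only a preference."""
    def __init__(self):
        self.camera_allowed = False          # permission: the OS enforces it
        self.tracking_requested_off = True   # preference: purely advisory

    def read_camera(self):
        if not self.camera_allowed:
            raise PermissionError("camera blocked by the OS")  # fails closed
        return "frame"

class TrackingSDK:
    """Toy third-party SDK embedded in an app."""
    def collect(self, os_):
        # Nothing in the toy OS blocks this code path; honoring the flag is
        # entirely up to the SDK, which may fingerprint regardless.
        if os_.tracking_requested_off:
            pass  # a well-behaved SDK would stop here
        return "fingerprint-uploaded"
```

A permission fails closed at the OS boundary; a preference only expresses a wish and depends on the other party's good behavior.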

(25:46):
Like that, like if I go to aburger joint and I order a
cheeseburger and the waitresssays to me, Oh, hey, before I
go, I gotta ask you, uh, wouldyou like the chef to not spit in
your burger?
I'd be like, Wait, why are youwhy are you asking me that?
Does the chef want to spit in myburger?
Like, what's happening here?
You know what I'm saying?
Like, it seems like such astrange.
So, my sense here is that thisis an area where we we really

(26:09):
want to differentiate.
And again, I described areas where I see Apple make really smart decisions about privacy, other ones that are not as clear, and then it sort of gets even less clear.
We want it to be really, really, really simple.
So, like, we have some basics in our product that eliminate any of this ambiguity.
Number one: we make zero revenue off customer usage.
Like, we have no upside at all.

(26:31):
I think everyone who makes phones knows that the more someone uses the phone, the worse it is for their life.
Phone usage is, like, directly correlated with worse health outcomes, loneliness, polarization, all these things.
Okay.
But these other platforms make more money the more you use the phone.
They just do.
You know, we don't.
We make money by selling phones and by selling subscription

(26:52):
services.
That's it.
That allows us to do something pretty cool, like, you know, this here at the top.
How many minutes since I've unlocked?
Eight.
Eight.
Okay.
This is my actual favorite feature of the phone.
Okay.
I use my phone too much.
It's chronic.
You know, I'm blessed with this beautiful family, and I have a terrible tendency to sit down at dinner and, like, keep working on

(27:14):
the phone.
It's very bad.
A famous speaker I love once said: you know, because of phones, we can work anywhere.
So now we work everywhere.
I'm guilty of this.
So, like, at family dinners or church on Sunday, my favorite thing now is I put my phone in my pocket, and I have this impulse: let me check it.
Let me look, let me look, let me look.
And I'm like: no, I know what's gonna happen.

(27:35):
I'm gonna let it sit there.
And when I leave church or when I leave dinner, I'm gonna pick the phone up, and it's gonna be like 75 minutes since I looked at the phone, and I am gonna be the boss of my phone.
Like, for me, that's enormously valuable.
And I feel like this reframing... like, forget Screen Time, which is about the phone.
We called the feature Time Away, which is about, like, you building something positive, not minimizing something negative,

(27:57):
right?
So we want to help people increase their time away from the device rather than making them feel like, you know, they need to feed this sort of addictive habit.
So what I'm getting at here is: when we start having, like, privacy and business-model blurriness, we start ending up, I believe, with less clear loyalty to the customer only.

(28:20):
And we end up having to make sacrifices and compromises for other interests, like Wall Street or whatever, right?
And we don't want to have to make those compromises.
So I would say, to answer your question around the delineations: my observation is other companies have some great policies, some blurrier policies, and some really blurry

(28:41):
policies.
We want there to be, like, zero confusion.
We want the phone you buy out of the box to, like, obviously not sell your data, obviously discourage apps from harvesting your data, and make it really easy to know what's happening.
The last thing I'll say is: you can't really have security without transparency.
And what I mean by that is, if you don't know what the phone is

(29:02):
doing, then how do you knowwhere your security boundaries
are?
Right.
This is why we're reallyshifting a lot of our experience
in the product to like showingpeople what's happening.
That dashboard I just showed youis the tip of the iceberg.
Like, you know, when you tapinto it, you get like a pretty
detailed graph of like whenthese calls are made by what
apps.
And we're we're gonna keep goingin this direction so customers

(29:23):
can really understand likespecifically what servers are
communicating with their device.
So, yeah, for for us, likeeliminating the ambiguity is a
big part of the sales pitch.
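The per-app transparency described here, which servers each app talks to and how often, can be approximated from an outbound connection log. A minimal sketch in Python with a hypothetical log format (not Unplugged's actual implementation):

```python
from collections import defaultdict

# Hypothetical connection-log entries: (timestamp_sec, app, remote_host).
LOG = [
    (0, "weather", "api.weather.example"),
    (5, "weather", "tracker.adnet.example"),
    (9, "weather", "tracker.adnet.example"),
    (12, "notes", "sync.notes.example"),
]

def calls_by_app(log):
    """Summarize which servers each app contacts and how many times."""
    summary = defaultdict(lambda: defaultdict(int))
    for _, app, host in log:
        summary[app][host] += 1
    # Convert nested defaultdicts to plain dicts for display.
    return {app: dict(hosts) for app, hosts in summary.items()}
```

A dashboard like the one described would render this kind of summary per app, so repeated calls to a tracker stand out immediately.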

SPEAKER_01 (29:34):
Huh. Yeah, that's crazy, because I remember when Apple came out with that new feature, right? Ask App Not to Track. And I questioned it myself. I'm like, well, it's saying "ask." That's not shutting it off. And then you go into the settings and it looks like all

(29:54):
of the other permissions, right? I mean, it looks the exact same: same color, same option. And so in my head, I'm thinking, well, maybe it's just poorly worded, and it's doing the same function, just worded differently to appease, you know, Google or third parties

(30:16):
that are on the app. Surely, if I say Ask App Not to Track, it's gonna limit how much Meta is tracking me, right?

SPEAKER_00 (30:25):
And, by the way, I believe there are some limitations that occur. It's not very easy to understand, but the more I've been reading into it, it appears that it may, and I believe it does, do something like hide the device's ad ID. However, what we see clearly is our phone has no ad ID, and that does not stop these apps from trying to fingerprint

(30:47):
in other ways. All of these businesses, and this is a multi-hundred-billion-dollar business of data harvesting on phones, all of these SDKs are invested in finding ways to fingerprint devices, follow you around, and get your location for everything you do, regardless of what settings you choose on your phone. And this is what we're trying to stay way ahead of. So it may be that, for example, the ad ID is shut off in that

(31:10):
scenario. I believe that's possible. The issue is they don't need the ad ID. There are numerous ways that these applications and SDKs fingerprint the device and end up with the same result. So, again, to your point, the other thing that does seem clear, as far as I understand the language and the literature on this, is that the OS is telling the

(31:34):
app a customer preference, but, as far as I understand it, it's not enforcing the customer's decision or monitoring what happens. And that's what we're seeing in this test, right? I'm just using an app where I say Ask App Not to Track, and the app is making insane repetitive calls to data harvesters, sharing information from the phone.
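For context on the fingerprinting claim: even with the ad ID hidden, a handful of stable device traits can be combined into a workable identifier. A hedged illustration of the general technique, with invented trait names, not any specific SDK's code:

```python
import hashlib

def fingerprint(device):
    """Illustrative only: hash stable, non-ad-ID device traits into one
    identifier, the way tracking SDKs are described as doing when the
    advertising ID is unavailable."""
    traits = "|".join([
        device["model"],
        device["os_version"],
        device["screen"],                  # resolution + density
        device["timezone"],
        ",".join(sorted(device["fonts"])),  # installed-font set
    ])
    return hashlib.sha256(traits.encode()).hexdigest()[:16]

device_a = {"model": "PX-1", "os_version": "14",
            "screen": "1080x2400@420", "timezone": "UTC-5",
            "fonts": ["Roboto", "Mono"]}
```

Because the inputs are stable across sessions, the same device keeps producing the same identifier, with no ad ID required, which is why hiding the ad ID alone does not stop tracking.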

SPEAKER_01 (31:52):
Wow, yeah, that is really eye-opening, honestly. And so, I have one of the Unplugged phones. I haven't switched over to it yet as my main, mostly because I'm just a little bit lazy with everything going on. It's just like, I'll get to it, you know, and experience it.

(32:12):
But what I really am interested in: so you've got the ad part of it, the tracking part of it, down, it sounds like. Are you expanding the security posture of the phone in other ways? I'm thinking of, like, Wi-Fi attacks, Bluetooth attacks. Are you looking at securing it in those ways as well?

SPEAKER_00 (32:32):
Yeah, there are already existing features, and there's more in development. By the way, let's make sure we get you on the latest OS, because we dramatically revamped the OS and now it's much easier to switch over. I understand the older one could be a hurdle; it's an order of magnitude easier now. So one thing I will call out here is that in the privacy

(32:54):
center, one of the things we did was change it to make it a lot easier to drive. We created this four-tab experience that allows you to pick from the firewall, which is here, to the usage stats, which include your time away and how much you're picking up the phone. But also these device controls. This is really important.

(33:15):
So here at the top level, I can determine, like, I want no ads and trackers, I want no porn, I want no gambling on the phone. But I can also control: do I allow USB to transmit data? This is very important. People will go plug their phone in at an airport for a USB charge and not realize they're hooking up to something that's pulling data off the phone.

(33:35):
So this feature is important, because it allows you to get the benefit of power from USB without letting data get pulled off. We also have obvious things like Bluetooth controls, but limiting 2G is really important too, and I think a lot of people miss this. IMSI catchers leverage the super-insecure 2G network to be able to get identifiers off phones.

(33:57):
Right.
So these are just examples of ways we're hardening things that other phones are sort of leaving as blind spots. And I think you're gonna be seeing a lot more from us in this area as we move forward. So yes, everything from the physical off switch, which is, by the way, the most secure way to make sure the phone is not being compromised, physically turning it off, to a clearer way

(34:20):
than other phones provide of easily understanding, in one simple-to-use place, the layers of access that you're granting: these are ways we're addressing these issues. You know, we also provide, and I'm sure you've seen this on the original device, at the top level, right below my time away, the ability to control with software the OS's access

(34:43):
to camera, Wi-Fi, microphone, location, et cetera.
I can turn these features off actually outside of the operating system. And this allows an extra layer of protection in case the OS itself is compromised. So again, we see all of these as facets of a diamond, where our

(35:03):
goal is to increase protection in each area and just stay ahead in every area. I think the areas where you'll see more work from us in the coming cycles are in the prevention of first-party data harvesting. We've talked a lot about third-party data harvesting, but a lot of the bigger companies have first-party harvesting, meaning they don't call out to third-party SDKs; they

(35:24):
build it into the tunnel of their own app. And a lot of action is taken there. So that's a focal point for our coming releases as well.
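The ads-and-trackers control mentioned above typically works as a firewall-style match of outbound hostnames against a blocklist, including parent domains. A minimal sketch with made-up domains (not the shipped implementation):

```python
# Hypothetical tracker blocklist; real firewalls ship curated lists.
TRACKER_DOMAINS = {"tracker.adnet.example", "metrics.sdk.example"}

def blocked(hostname, blocklist=TRACKER_DOMAINS):
    """Block a lookup if the hostname or any parent domain is listed."""
    parts = hostname.split(".")
    # Check "a.b.c", then "b.c", then "c" against the blocklist.
    return any(".".join(parts[i:]) in blocklist for i in range(len(parts)))
```

Matching parent domains is what catches subdomain tricks like `api.tracker.adnet.example` without needing every variant listed.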

SPEAKER_01 (35:32):
Wow.
So it sounds like I could actually maybe go to China and not be fully, you know, tracked through my device.

SPEAKER_00 (35:41):
Well, we do have a VPN, a unique VPN that is, to our knowledge, one of the only VPNs that's operational in some of the more repressive environments. We have a number of, I would say, security-oriented professionals who are operating in the worst environments you can imagine on our device, and our VPN is designed to hide the

(36:04):
fact that it's a VPN in these environments. There are a number of countries that have, you know, extremely closed internets and prevent VPN usage. And our VPN is uniquely designed to camouflage itself among normal consumer traffic.

SPEAKER_01 (36:18):
How's that even possible? Basically, don't give me your trade secrets, but...

SPEAKER_00 (36:26):
Let me just give you a corollary example, okay? As I mentioned, a lot of teams, people who are on the pointy end of the spear, right, use our device. And something we've learned is that our device is so much quieter than other phones that electronically savvy adversaries can pick out our device, because it transmits so much less

(36:50):
signal. Other phones, you don't realize, are constantly calling these data harvesters. They're also constantly calling home base, their own OS home base, right, with information, updates. Our phone is very quiet, right? In most cases, this is an enormous advantage. But if you're in an environment where an equipped, savvy

(37:10):
adversary is trying to find an individual who's different from other individuals, this quietness is actually a liability. So for those people, we literally deploy special software that transmits fake activity. So it's just indecipherable from a normal phone, right? Meaning it looks like someone scrolling through Instagram and

(37:30):
ordering pizza or whatever, you know?
Wow.
So that is really interesting. These are things you end up discovering when you're dealing with these different areas.
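The fake-activity idea reduces to scheduling plausible decoy traffic at human-ish intervals so a quiet device blends in. A toy sketch; the app profiles, payload sizes, and timings are invented, and nothing about the real software is implied:

```python
import random

# Invented decoy profiles: (app_label, base_payload_bytes).
DECOY_APPS = [("social", 512), ("maps", 256), ("food_delivery", 128)]

def decoy_schedule(duration_sec, rng):
    """Return (time_offset, app, payload_bytes) decoy events spread
    over a window, with irregular human-ish gaps between bursts."""
    events, t = [], 0
    while True:
        t += rng.randint(20, 90)   # 20-90s between bursts
        if t >= duration_sec:
            return events
        app, size = rng.choice(DECOY_APPS)
        events.append((t, app, size + rng.randint(0, 64)))  # jitter size

# Ten minutes of cover traffic; seeded so the sketch is reproducible.
schedule = decoy_schedule(600, random.Random(7))
```

The randomized gaps and mixed "apps" are the whole trick: to a passive observer, the device's emissions resemble ordinary consumer use instead of conspicuous silence.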
Now, again, most of us aren't in that situation. The problem most of us face is not, you know, state actors trying to target us to put a drone on our head, right? God forbid. What most of us are facing is this passive

(37:51):
extraconstitutional data collection in which our whole lives are being recorded in public databases. And I think I'm not alone in feeling that, after COVID and seeing what can happen in a crisis, maybe we don't want that information out there. Maybe it's better for us to follow the direction of our founders. Something I've really become obsessed with is the

(38:12):
founders' intent with the post office. To me, this is the most important lesson for us today in our internet age. Right. When the founders wrote the Constitution, a big part of their debate was how we needed a private, secure way to transmit information to maintain the republic: both private letters, but also to get news.

(38:33):
So Article I of the Constitution establishes the post office. Literally, right after establishing currency, they established the post office. Right after this is the Post Office Act of 1792, which established that it's a felony to open another citizen's mail. It established that it's a crime for people in the post office to sell or leak any information about what was

(38:54):
shared or mailed. So their vision was: for us to retain a republic, we need totally secure, private, encrypted communication and the ability to get news without anyone knowing who reads what newspaper, for example. What changed is in the 70s, with a couple of Supreme Court cases in the Burger Court, which established, and at the time it was

(39:16):
phone companies and banks that had some third-party data, that, hey, if you've shared information with a phone company or a bank, you don't have an expectation of privacy in it. That has been applied to our phones, which are very different. The amount of information, I just mentioned hundreds of thousands of packets per hour, the amount of information flowing off of

(39:37):
our phones and what that makes discoverable about us is really dystopian. So this is really where we want to, you know, we don't think legislation's gonna fix this, although maybe there are some improvements that can be made. We really just think we need better products that are more aligned with the interests of customers and don't have these other financial interests involved.

(39:58):
So our vision here is: let's innovate our way back to an internet communication and data infrastructure with products that are designed to protect our rights rather than sell out our information. Again, whether that's first party, third party, whatever. We just want it very easy to understand, with no exceptions or blurry terms of service.

SPEAKER_01 (40:21):
Hmm. That is so fascinating. Just this entire world, I mean, what you're describing, right, is essentially tradecraft. Right? I mean, that's really what it is, because of the way that you have to think of how you would be tracked, and what you bring into a certain area, and how it can be used

(40:43):
against you, and how you would subvert, you know, those data collection techniques and whatnot. You know, this phone is absolutely user-friendly, which is not something that you would expect from a highly-secured-by-default device. You would expect that kind of device to be something that, you

(41:06):
know, someone highly technical like myself, who lives in the weeds, is using, right? Like only that kind of person would use, or could even use, this device and actually understand what's going on and whatnot. But it's huge for this device to be designed the way

(41:28):
that you guys designed it and be as user-friendly as it is, which I was actually pretty astounded by. I was impressed by that.

SPEAKER_00 (41:36):
You know, that's really great feedback, and I'm really happy to hear that, Joe. And we want to keep making it easier and easier, because I think your sensibility is where we are; we are very aligned. So far, think of it like a two-by-two, an old-school business diagram. You've had phones that are really easy to use, like Apple and Google, but, as we've pointed out with these tests, at their

(41:57):
heart they're not really private as ecosystems. Then you've had other phones that are sort of more private, but they're very hard to use, and they require loading special software and OSs and bootloaders, or they don't run apps, right? You're sort of forced to choose: you can run normal apps and have no security, or you can have no apps and

(42:20):
have some. And our vision is: what if there's an easy-to-use daily-driver phone that works just like your iPhone? It's simple. You know, we have cloud storage that holds all your stuff. We have a password manager, so you can easily run your apps, all the things that simplify the experience, without the giant security and privacy

(42:42):
risk. That's what we're trying to solve for. And we think the market there is meaningful: people who want an independent phone that's very easy to understand, that out of the box is protecting your privacy, but just works like a normal phone, you know? And we just did something that really, I think, is gonna even further amplify the automatic ease of use, which is,

(43:04):
you know, for us, we run Android without Google, which means a lot of apps require Google to do stuff, and we have to route their calls, for example for notifications or maps, away from Google to an open-source alternative. And as we've gotten better and better with this, what we call the compatibility layer, which is now making pretty much every app work totally great, we can open up our app center.

(43:25):
So, you know, we have, let's say, the top 10,000 apps that our customers download most on our servers. So if they're ever censored by big tech, they're alive on our servers. However, when you do a search now, you have access to an open-source library of all Android apps. So if you search for some really niche IoT app, like I had the other day, I needed an app for my daughter's baby

(43:48):
monitor. And I was like, man, normally I'd have to call our guys: hey, can we test this app? But now we've integrated it into the flow, where I can just search for it and it shows up as an external source. It's available in our app center, but it's not on our servers yet, because it's not super popular. And, you know, because of this compatibility layer, these just

(44:11):
work, right? So this is one of the reasons why we think this is possible today. Again, this easy-to-use-but-private is possible because apps are ubiquitous; there's sort of already an app for everything. So our job is, A, make sure we have fluid compatibility with apps, and then B, design the features of the OS that prevent those apps from taking data in unclear and dangerous ways.
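A compatibility layer of the kind described can be pictured as a routing table that redirects an app's Google-service calls to open alternatives while leaving everything else untouched. A simplified sketch; the service names and endpoints here are invented:

```python
# Hypothetical reroute rules: requests bound for a Google-backed
# service get redirected to an open-source alternative.
REROUTES = {
    "push": {"from": "fcm.googleapis.com",  "to": "push.opensource.example"},
    "maps": {"from": "maps.googleapis.com", "to": "tiles.openmap.example"},
}

def route(service, requested_host):
    """Return the host a request should actually be sent to."""
    rule = REROUTES.get(service)
    if rule and requested_host == rule["from"]:
        return rule["to"]
    return requested_host  # unrelated traffic passes through unchanged
```

The app never changes: it asks for the Google endpoint as usual, and the layer transparently substitutes the alternative, which is why ordinary apps can "just work" on a Google-free OS.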

SPEAKER_01 (44:31):
Yeah, because when you secure the device by default, you're able to just put anything into the walled garden or the sandbox, right? And it'll be fine. Right. When did that feature come out, the one pulling third-party apps from another source that you talked about?

SPEAKER_00 (44:48):
This is literally rolling out right now.
You'll see this in a week.
Wow.
Okay.
Yeah.
We can send you an APK so you can run it locally and give us any feedback. But we're rolling this into the OS literally right now. Our guys are at the logistics facility in Nevada right now, flashing all of the devices with the new operating system that includes this feature.

SPEAKER_01 (45:08):
Wow.

SPEAKER_00 (45:09):
Which is just such a great innovation.

SPEAKER_01 (45:12):
You know, you guys should just throw my device into an internal test pool. Like, I'm happy to test it.

SPEAKER_00 (45:20):
Great.
Well, we'll put you on the list, man.
Yeah, that'd be fun.
I mean, yeah, definitely.

SPEAKER_01 (45:27):
Man, well, Joe, you know, I really do my best to stay on time, right? I know we started a little bit late today, but it was a fantastic conversation. I really try to stop on time, because I know everyone's schedule is so busy that we're not always able to go over, but I really do appreciate you taking the time out of your day to come on and talk to me about this.

SPEAKER_00 (45:49):
Joe, thank you so much.
I really appreciate it.
It's been super fun.
We'll make sure to get you that software, too.

SPEAKER_01 (45:54):
Yeah, absolutely.
Well, before I let you go, how about you tell my audience where they can find you if they want to connect with you, and where they can find the Unplugged device if they want to actually buy one. There will be a link in the description of this episode, an affiliate link that people can use to go purchase the phone at a small discount.

SPEAKER_00 (46:13):
Thank you so much, man. Yeah, so use Joe's link, and we're at unplugged.com. We are about a week out from shipping right now, so if you order right now, you'll get a phone in the second half of this month. We are also on X and other social media apps. I'm posting on our X channel frequently, so that's probably the easiest place to find me: our Unplugged X account.

(46:34):
And we're putting up videos there all the time. And we're really trying to educate people through the internet on these issues, and we would just love feedback. So that's gonna be the easiest place to message us; DM me there. We also have a support mechanism on our site, and also in the phone. One thing I want to make sure people know is that we have a US-based team of experts who help

(46:56):
people get set up on the phone and answer any questions. We really want input and feedback, and we're there to help. I end up getting involved in many of these conversations myself, so don't be shocked if one of those emails makes its way to you. But again, this has been really fun, and we're super grateful.

SPEAKER_01 (47:14):
Yeah, absolutely.
Well, I'm really happy that you came on.
Well, thanks, everyone.

SPEAKER_00 (47:18):
Thanks, brother.

SPEAKER_01 (47:18):
Yeah, yeah, absolutely. Well, thanks, everyone, for watching this episode. I hope you really enjoyed it. Go ahead and go pick up a phone; it's a fantastic device. I was very surprised by it. Again, I'm an Apple fanboy, you know, so I'm very used to the Apple ecosystem. I absolutely hate Android with a passion, but when I picked up this phone, it was very easy to use, first time out of

(47:40):
the box, no issues. All right. Well, thanks, everyone. Peace, brother.