All Episodes

July 16, 2025 56 mins

Jordan Bravo and Stephen DeLorme return with a news-packed episode covering the latest privacy violations and surveillance schemes. They discuss Trump's plan to create a master database of Americans using Palantir, WhatsApp AI accidentally leaking user phone numbers, Meta and Yandex exploiting Android phones to track browsing habits, and Ford's patent for cars that report speeding drivers. Plus, Jordan shares updates on his sovereign computing journey including anonymous phone services, Alby Hub lightning setup, and self-hosted lightning addresses.

 

Show Notes: https://atlbitlab.com/podcast

 

00:00 Introduction and Digital Footprint Philosophy

00:35 Welcome to Sovereign Computing Show

00:51 ATL BitLab Sponsorship

01:55 Production Updates and Schedule Changes

03:18 News: Trump Taps Palantir for Master Database on Americans

06:02 Discussion: Government Data Collection Reality

08:50 Advice: Minimizing Digital Footprints

09:42 Personal Anecdote: Marketing Work with Surveillance Tech

13:17 News: WhatsApp AI Mistakenly Shares User's Phone Number

18:07 Analysis: LLM Context and Security Rules

24:01 WhatsApp Metadata and AI Concerns

24:59 News: Meta and Yandex Android Tracking Exploit

28:34 Technical Details: localhost Port Listening

30:56 Instagram Microphone Surveillance Discussion

34:23 News: Ford Patents Car Surveillance Technology

38:37 Future of Autonomous Vehicles and Privacy

40:06 Privacy Alternative: Toyota Hilux No-Frills Truck

42:08 Jordan's Sovereign Computing Updates

42:31 Text Verify for Anonymous Phone Verification

45:12 Steven's Experience with Simple Login App

48:01 Mint Mobile Payment Issues and AT&T Alternative

49:55 Self-Hosting: Albi Hub Lightning Node Setup

51:48 Self-Custodial Podcast Boosts with Podverse

52:09 Self-Hosted Lightning Address with RustDress

54:20 Nix Package Repository Work

55:05 Wrap-up and Contact Information

55:46 Outro and Bitcoin Tips

 

Mark as Played
Transcript

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
The thing we advocate for is
minimizing our digital footprints
and stopping or, or minimizing
the leakage of our personal data.
And that will continue to
benefit everyone who does that.
If you are thinking long term and
taking steps to always leave a minimal
digital footprint, then in the future,

(00:22):
whether it's a current administration
or a future administration, doing
something untoward with personal data,
the less of it you have out there,
the less you can be harmed by this.
welcome to the Sovereign Computing
Show, presented by ATL BitLab.
I'm Jordan Bravo, and this is a
podcast where we teach you how to

(00:43):
take back control of your devices.
Sovereign Computing means you own your
technology, not the other way around.
This episode is sponsored by ATL BitLab.
ATL BitLab is Atlanta's
freedom tech hacker space.
We have co working desks,
conference rooms, event space,
maker tools, and tons of coffee.

(01:04):
There is a very active
community here in the lab.
Every Wednesday night is
Bitcoin night here in Atlanta.
We also have meetups for cyber security,
artificial intelligence, decentralized
identity, product design, and more.
We offer day passes and nomad passes
for people who need to use the lab only
occasionally, as well as memberships
for people who plan to use the lab
more regularly, such as myself.

(01:25):
One of the best things about
having a BitLab membership isn't
the amenities, it's the people.
Surrounding yourself with a
community helps you learn faster
and helps you build better.
Your creativity becomes amplified
when you work in this space,
that's what I think at least.
If you're interested in becoming
a member or supporting this space,
please visit us at atlbitlab.
com.

(01:45):
That's A T L B I T L A B dot com.
Alright, on to our show.
Welcome to the Sovereign Computing Show.
I'm Jordan Bravo and I'm recording
here today from the heart of Atlanta
in a TL Bitlab with Steven de alarm up.

(02:09):
And we wanna remind you that you can
boost into the show with Fountain
or another app, such as Pod verse.
You can also email us at
sovereign@atlbitlab.com.
That's S-O-V-E-R.
EIG gn@atlbitlab.com.

(02:29):
And remember, we now can be found
with the name Sovereign Computing.
So if you can look in any of your
podcast players, whether that's Apple,
Spotify, et cetera, et cetera, and you
will be able to find the search results
for the Sovereign Computing Show, and
you can subscribe there and you can
listen in any app that you choose.

(02:50):
It's been a few weeks since we've released
an episode, and we apologize for that.
We were getting a new
production flow going.
We have a new editor and, hopefully we
will be able to return to our weekly
schedule now, and you'll, you'll see them
once a week, typically on the same day.

(03:12):
But, we hope you'll stay tuned
for a lot more good episodes
coming up in the future.
All right, today's, we have a
few news articles that we want to
take a look at, stuff that's been
going on recently that's relevant.
The first one we're gonna talk
about is the, the headline is
Trump Taps Palantir to Create

(03:35):
Master Database on Every American.
And if you haven't heard about this.
the Trump administration is planning
to collect data on Americans,
and they are contracting with
the company Palantir to do this.
Palantir is a military contractor.

(03:55):
the article that we have on the
screen here is the new republic, new
republic.com, and this is a pretty,
uh, typical like left wing biased,
publication, but just keep that in mind.
However, it still is reporting facts here
that are, you are gonna be the same no
matter where you get the information.

(04:16):
So there, this article injects its
opinion a little bit here and there.
You know, it says stuff like, far right,
billionaire, Peter Thiel, da da, da.
But the point is, Trump
has enlisted the firm.
founded by Peter Thiel to carry
out his executive order for,
instructing government agencies

(04:37):
to share data with each other.
And the, the danger here that we see
at least is that this is a, another
of the, we talked about this problem
before when Doge was going through
the, social security information.
But the problem here is that this,
they want to create a giant database

(05:00):
of Americans, personal informations.
And I, I think ostensibly this is for
Trump to be able, to Trump and the
administration to be able to crack
down more on illegal immigration.
I don't know what the supposed benefits
of it would be, but the problem

(05:21):
for people who care about privacy.
Is that this is a large,
centralized database of personally
identifiable information and
any database can be cracked.
And the, bigger a database of information,
the more of a centralized honeypot.
It is a bigger of a
target it is for hackers.

(05:42):
and, you know, rogue agencies,
rogue, governments, et
cetera, that could leak it.
And then.
Once they do that, that's a huge
treasure trove of, of personal
data that they can share.
Steven, do you have any thoughts on this?
I don't know.
I don't know that I'm
nearly as alarmed by it.

(06:02):
I don't know if that that's wrong of me,
but, at least my read of this article is
that, This is data that these government
agencies already have and you know,
it's like Palantir is just being tapped
to help, like, organize this data and
make it more accessible and, you know,

(06:26):
easily searchable and all of that.
So yes, the idea of a like woo,
easily searchable master database
on all US citizens sounds scary.
Having said that, it's kind of like,
are we gonna ask, it's like it,
it's weird to ask an organization

(06:48):
to like function ine effectively.
I don't know.
I guess it, it's like, it's just,
it's rational for any organization
to like, want to have its data easily
searchable and indexable and all that.
I mean, maybe that sounds
naive of me to say that.
so I don't know.
I guess it's, I'm not saying I love it,
it's just, it, it's sounds expected.

(07:10):
I mean, I think we should really
like to a certain degree you have,
we have to get used to this sort
of thing because, well actually
this will be the second time today.
I've said this on a podcast, but,
I think when you look back at like,
what happened with, with Doge, and,
and you have like the speed at which
they move with Doge was incredible.
Right?
And, and you had, um.

(07:32):
You know, young people getting
involved in government, young people
with technical skills, using AI to
move much quickly, process massive
amounts of data very quickly.
Doge moved quicker than we are used to
government moving, and you know, we might
have to get used to that being the norm.

(07:55):
Like we're used to thinking of governments
as being very slow moving things, I think.
But.
using AI and using technology
and using more motivated people,
that may not always be the case.
and I think we might have to get used to
the idea that like, especially when the
pendulum swings and, you know, eventually
we'll have a left wing president

(08:15):
again, that next administration very
well could, do the exact same thing.
If they were motivated enough, they could,
Put in very fast moving, highly motivated
people with ais, who, you know, are,
are able to make change very rapidly.
So, I don't know.
I, I'm not saying I love it, I

(08:36):
just, it's kind of expected to me
and I, I, you know, I don't know,
I don't know if that makes sense.
Yeah, I see what you're saying.
I, I think this doesn't really change
anything as far as advice goes.
the thing we advocate for is
minimizing our digital footprints
and stopping or, or minimizing

(08:58):
the leakage of our personal data.
And that will continue to
benefit everyone who does that.
if you are thinking long term
then and,
and taking steps to always leave a
minimal digital footprint, then in
the future, whether it's a current
administration or a future administration.

(09:19):
Doing something untoward with personal
data, the less of it you have out there,
the less you can be harmed by this.
Yeah.
Or any of it.
I, I'll tell a fun slash scary story
that, about this kind of thing, and
my background was doing a lot more
marketing design work, like video and

(09:39):
motion graphics and all that kinda stuff.
And, the, with the, the company I
worked for, we, you know, worked a
lot more with like, tech companies
or like B2B Tech products and stuff.
I used to have to work on like a lot
of videos and stuff that were for,
Like kind of tech products, targeted
at like, you know, governments and
stuff like, you know, they might be

(10:00):
marketed to, police or like emergency
services or disaster response.
Like, not all like bad stuff, right.
obviously like a disaster response.
I think you, you want that to be
very coordinated and efficient.
But I, I remember this one
product I had to do a, a video
for was very alarming to me.

(10:20):
'cause when I was being shown the, like,
videos and screenshots of the product,
I was like, holy crap, they can do this.
The reason I'm reminded of this
particular product is because, it was,
you know, they were telling me like,
oh, you know, this, this is like.
They're trying to compete with Palantir
or they like, want to be like seen as
like roughly analogous to Palantir.

(10:41):
and so I was looking through this
product and like basically, you know,
you could just type in a person's name
and it would just like pull up this
like web of details about the person.
It was very creepy.
Like you could just type in somebody's
name and it would generate this like
chart for you and it would be like.
So the name would be in the center,

(11:02):
and then it would have like little like
lines going out to different nodes and
it would say like, these are all the
addresses that this person's connected to.
And then these are all the phone
numbers that this person's connected
to, and these are all the Twitter
handles they're connected to, and,
and so on, and so on and so on.
Then it would map out and it would,
it would like connect to them
to like known associations like.

(11:24):
For example, it might be like, like let's
say you were connected to an address,
will it also show you another person
that was connected to that address too?
Like, um, uh, you know, like might,
maybe it was like a former roommate
you had or someone who co-signed on
the lease with you or whatever it was.
Or maybe it was someone else
who is also connected to the

(11:45):
same phone number or whatever.
And so like, this data has
to exist in some form, right?
Like.
Either, you know, at the phone company
or at, you know, uh, you know, if you've
signed a lease for a, an apartment or
whatever, um, you know, all this data, it
has to already be out there, but I think

(12:07):
it just makes it more easily searchable.
But it was very just like alarming to me
that somebody could just go, like, buy
a product like this and it would be able
to easily search all that information.
And then could like present it in this
nice, like easily digestible, graphic.
So it's pretty alarming out there,
like how much data is like, already

(12:27):
out there and already in databases and,
if a government wants to buy access
to one of these products and like, you
know, get all this stuff, they can,
I, I guess what wasn't clear to me
when I was working on the project was.
Are they buying access to the data
or are they providing the data
and using the product to make the
information more easily searchable?

(12:47):
That's what I wasn't clear on, but
yeah, these product exists and they've,
they've been out there for a long
time on the market, and this kind
of goes back to something we saw on
another podcast where it's like this,
this thing has always been a problem.
The amount of data that's out
there and you know, your privacy.
being leaked and compromised.
We just only worry about it depending

(13:08):
on how we feel about the president.
But it's always a concern.
It's always an issue.
and I don't know, I
just, I, I was alarmed.
So anyways, that's my personal story.
Roughly what year was that,
that you saw that product?
Uh, 2014, I think if memory serves.
Okay.
So you can imagine how in the past.
12 years.
It's only, or 12 ish years.

(13:29):
11 ish years.
It's only gotten, yeah, 11 years.
A lot scarier with the, a advances
in AI and data collection.
Oh, I'm sure it has.
I mean, that was supposed to be like
cutting edge stuff at the time that it
was like this thing identifies connections
between people, places, and things
that a human would have trouble doing.
So now with like a agentic

(13:51):
ai, it's probably even better.
Oh yeah.
I mean, you probably just type,
like, you probably just upload like
a blurry photo of somebody and it,
you know, probably pulls up a full
dossier and backstory and all that.
I don't know.
Well, with, with that thought,
it's behooves us to stay vigilant.

(14:11):
Yep.
Alright.
I, I think we've talked about this
article All there is to talk about.
the next one that we're gonna talk
about is the headline is What's
WhatsApp AI Mistakenly, let's see here.
It's terrifying.
WhatsApp AI Helper mistakenly

(14:32):
shares user's number.
I don't know why I said it in that voice.
So, we've got the article on the
screen for those of you watching
and we'll of course have all of
these links in the show notes.
But the, the gist of this article is that.
Some a, a WhatsApp user was using
WhatsApp's built-in meta ai, which is
provided by Meta, the parent company,

(14:53):
and it, he asked for the phone number
of somebody in his contacts, you know,
he said, call so-and-so, and what
the AI did was it pulled a number out
of seemingly nowhere and called that
number and it was the wrong number.
But it wasn't just a random number,

(15:14):
it was the number of another WhatsApp
user, just a stranger to this person.
But it was a real number and
it was real WhatsApp user.
So the question is, where
did it get that phone number?
It seems like it got it out
of the WhatsApp database.
And so the, if that, if, if it's not

(15:35):
clear that why that's a big deal.
The AI.
Was somehow granted access to, uh,
like if, if the AI has administrative
access, let's say, on every user in
the database, then the info from one
user is being leaked to another user.
Yeah, so the, the, basically what

(15:56):
happened is the, if I understand the,
uh, assistant, it was asking for the
customer support phone number for some
company, the Trans Penning Express,
and then the AI came back with the
phone number of, not the customer
service, but another WhatsApp user.

(16:17):
And so the concern is that
the agent has like access to
a database of WhatsApp users.
Yeah, I think I explained it
incorrectly, so thank you for that.
the idea is this AI should have just

(16:38):
been able to do a simple task for you.
Like, Hey, look up the public
number of this company, trans PanIN.
And you know, that should be
scrapable from the web that,
or their Google Maps listing.
You know, many places where you can find
a company's public contact information.
But instead, what the number that

(16:59):
the AI dug up was a private WhatsApp
user, which shouldn't be available.
so the question is how did the
AI get that number and then.
When the user asked how, like,
where did that number come from?
The ai just kind of the chat agent
just kind of backpedaled and said

(17:22):
like, what are you talking about?
I didn't do that.
So it's a little, it's a little
crazy like the AI gas lighting
and trying to cover its tracks.
but the weird thing about this is
there's no good or definitive answer.
there might be some follow
up articles later on, but.
From this article alone, we
don't have any more information

(17:43):
on where the AI got that number.
Yeah, that is interesting.
I mean, I think to to like
it, it does say on here.
Smith Hurst wanted to know why he had
shared the private number, which it turned
out belonging to James Gray, 44 property
industry executive from Oxford Shaw,
who is also a WhatsApp user and whose

(18:05):
number appears on his company website.
Ah-huh.
So, I don't know if I understand the, uh,
allegation that it's a private number,
given that it is, does appear on the,
the company website and, given that
it's on a company website, it could be,
given that it's on the, from a company
website, it could have been pulled from
a publicly available, like web search.
Like in theory, if we want this like

(18:26):
WhatsApp AI to be able to find stuff on
the web, for us it, it has to be like
an agent that's able, like, capable
of like, either searching a database
of like web search results or, query
and Google or whatever on its own.
so I think that just the fact that
it's on the guys company website easily
could easily explain how it got there.
Or it could be like a

(18:47):
database of WhatsApp users.
I don't know.
There, I, I think we don't really
know, but that could be that in
terms of like the AI covering its
tracks, uh, I just wanna say, yeah.
I did not see that part of the
article where the, it says that this
guy's number is public is a avail
available on his public website.

(19:07):
So to me that really lessens
the impact of this story.
You know, it's like you said, if you
can scrape the web publicly and find
the number, then it's less scary to
me that, you know, the it, it didn't
necessarily come from a private database.
So, I don't know.
It kind of feels like more

(19:28):
of a nothing burger article.
A little bit.
I mean, it definitely indicative of how
the general public perceives this stuff.
Smither said he did not believe
the chat bot and told it.
Just giving a random number to someone
is an insane thing for an AI to do.
It's terrifying.
Smithers said after he raised a
complaint with meta, if they made up
the number that's more acceptable.

(19:49):
The overreach of taking an incorrect
number from save data is, has
access to his particular worrying.
So I don't know.
It's,
I think when you look at how,
like you brush up against the
limitations of LLMs and agents.
When we see stuff like this because they,

(20:11):
like, they sometimes hallucinate stuff and
they don't know how they hallucinate it.
Like they, they just, they
don't know, like, 'cause they're
not like on all the time.
They're not like a consciousness.
They it, so they, they just sometimes
hallucinate like bad answers to things.
Right.

(20:32):
And I think one thing,
if you've ever tried to.
Program anything that involves an LLM,
like, you know, there's this idea of
like, you can give it context, right?
Like, so there's the prompt, which
the surface level prompt might be
me asking it, Hey, can you find
the phone number of this company?

(20:54):
But behind the scenes, there
might be all kinds of other
context being given to that LLM.
That might be stuff like.
You are a helpful agent.
You always respond in a polite,
you know, tone of voice.
but there might also be
deeper stuff in there.
Like, never ever say
this, never ever say that.

(21:15):
there, there could be a whole list
of complicated rules behind the scene
that's not visible in that prompt.
And, you know, that might make you
think, well, I don't like that.
I don't like the fact that there's
all these rules that I'm not seeing.
But from a security perspective, if we
want this agent in the background to go

(21:35):
run around and like scurry around finding
information for us, we have to have
some of these rules in the background.
'cause like, one, it's just not useful.
Sometimes these rules are put
in place to, you know, give it
a, a, a certain good output.
but in other, other times you
actually have situations where like.
You know, people can really get sensitive

(21:55):
information by talking to these chat bots.
Like we've had multiple demonstrations
at the AI meet up here at Bitlab,
where people have demonstrated like
jailbreaking ais and like, I mean, you
can just, you can Google search of, you
can just find all kinds of examples of
just bizarre information that LLMs will
leak if you give them the right prompts.

(22:17):
So, given that this agent.
Is probably hooked up to
some kind of database.
It has access to something.
There's probably rules, coded into
it about like, into the prompt that
we can't see that, that, you know,
guard what it can and can't do.
so I don't know.
I, if I had to guess here,
what's probably happening is

(22:37):
the reason why it's giving such.
Like kind of bullshit explanations for
why it gives the information is there's
probably rules in the coded into the
context so that we can't see that govern
what it's allowed to say to customers.
I. Yeah, something like never admit
to anything illegal or private or that

(23:00):
you did anything that would violate
users' privacy or blah, blah, blah.
Yeah.
And, and also like, you don't
want, you don't want WhatsApp users
being able to gain the AI into
divulging critical backend secrets.
Like, you don't want some kind of prompt
where the WhatsApp agent tells you
like, I'm, I'm oversimplifying this.

(23:22):
But you don't want something where the
WhatsApp agent can like tell you what
version of database software they're
running or, you know, grant access to
areas of the data information that the
user's not supposed to have access to.
And some people have like really, you
know, found security loopholes where they

(23:42):
can convince agents and LLMs to give them
information they're not supposed to get.
So I don't know that would, that would be,
that's my read on what's going on here.
But I, I don't know.
I'm not a agent expert and, I
don't know how WhatsApp's AI works.
I'm just, piecing this together
based on, you know, the, you know,

(24:06):
my experience, you know, working
with a, you know, ai, APIs and stuff.
That's interesting.
So what is the upshot of all of this?
Number one, if, if you haven't
listened to our episode on Instant
Messengers messaging apps, please go

(24:27):
back and do So, we talk extensively
about WhatsApp and the, um, metadata
collection that it, it does as an app.
And then you wanna, if you add
on top of that, the meta AI
that is now built into WhatsApp.
you're gonna run into the same issues.
So all of your, the, the AI likely

(24:49):
has access to the same metadata or, or
at least a, a portion of the metadata
that, um, the WhatsApp app itself has.
And so I, I just personally, I don't
like using WhatsApp because of all
of the metadata being collected.

(25:10):
And I know that it's profitable for meta.
so I, you know, this article doesn't
really change anything in my opinion.
It's kind of a weird
quirk, but I don't know.
It's kind of a, like I said, it
doesn't change anything for me.
Yeah.
Yep.
All right.

(25:30):
Well, let's take a look at our, the
next article that we have today.
Ooh.
This one is.
meta the company and Yandex, which if
you aren't familiar with Yandex, Yandex
is a company that is Russian based.
They, they fulfill the same role
that Google fulfills here in the us.
They are a search engine and map

(25:52):
software and stuff like that.
So what this article says is that,
meta and Yandex were shown to be.
Exploiting Android phones and
tracking data and users browsing
habits based on a, a little hack.
And for those of you who might know a

(26:13):
little thing about web development, local
host is the name of the website, quote
unquote, that's just your local machine.
this only takes place on Android.
So if you are an iOS user,
you are not affected by this.
But on Android, there were certain native

(26:34):
Android apps, that from Meta and Yandex,
and I think they list the meta apps.
Here.
It was Instagram.
do you see the other ones?
Uh, let's see.
Might have been up.
Yeah.
Oh yeah.
Here we go.
We found that native Android apps,

(26:56):
including Facebook, Instagram, and
several Yandex apps, including maps
and browser, silently listen on fixed
local ports for tracking purposes.
Yeah, so if you have the Facebook
app install on Android, the
Instagram app install on Android.
Or, um, these other Yandex apps.

(27:17):
Then what they were doing was, normally
Android has a little bit of sandboxing
and, there's barriers put in place to
prevent apps from listening to other apps.
And so what the Instagram app and
the Facebook app were doing was
taking advantage of if you are
in your mobile browser and you
visit a site and it, it had a.

(27:41):
Meta tracking cookie in it, the
cookie would then run in your browser
in the background and report data
back to the, and back to the native
apps like your, um, Instagram and,
I'm sorry, what was the other one?
The Facebook app.
Yeah.
Yeah.
And, and so by using this little hack,
they were able to break out of these,

(28:03):
these privacy sandboxes that the.
Operating system's supposed to have in
this case, and it was basically able to
track all of your, or not all, but it was
able to track a lot of your web surfing
and browsing habits within that browser.
So this was, um, and when, when this

(28:24):
article was published, and this is kind of
like a research, very technical article,
but, a security researcher did this and
was able to discover it and replicate it.
They, the response from meta was that
they are, they just quietly stopped
doing it, at least in this, these
instances, but they never, as far as

(28:46):
I know, I've been following this, they
have not responded or apologized or
said, we're not gonna do it anymore.
They just stopped doing it
in this particular case.
Now, that doesn't mean that they couldn't
try doing it again in the future or, or
a slightly modified technique, but, um.
They've been doing this for many years.
Yandex has been doing it since 2017

(29:09):
and Facebook has been doing it since.
let's see.
I don't know, I think, I think I read
earlier that they've been doing it
since at least the beginning of 2024.
But if you go to the
very top of this page.
You'll see that they say there's
an update as of June 3rd, 2025.

(29:30):
The meta Facebook pixel script
is no longer sending any packets
or requests to local host.
So essentially, Facebook
stopped doing this.
it says Yandex has also stopped
the practice we described below.
So the, this is very shady.
Obviously.
It should come as no surprise

(29:51):
to any of us at this point.
That.
Meta is doing everything in
its power to track users.
That is their business model, right?
They sell ads, that's their
primary revenue stream.
So that's what they're gonna do.
You know, meta is gonna meta.
the fact that they stopped doing it in
this particular instance doesn't really

(30:14):
give me any assurance that they're not
gonna do it again in the future or do it
on other apps or with other techniques.
But, I would say if you have the
Facebook app installed on Android or
you have the Instagram app installed
on Android, consider uninstalling it.
If you need to use these spyware laden,

(30:36):
uh, services, you can use your browser.
And, and one other thing
about the Instagram app.
This is not in this article, but we know
that the Instagram app was using the
microphone permission in the background
without explicit user permission.
Like it was just, asking for

(30:58):
it on upon install or, or like
that was the default permission.
And this is why you, you get so many
stories of people who say, oh yeah,
I was having a conversation about
something I've never talked about before.
Or my friend recommended something
to me, and next thing you know,
I'm seeing ads for it on Instagram.
Right?
These are the creepy phone spying things.

(31:18):
So it's good to be explicit and specific
phones, like when we say our phone is
spying on us, it's a very vague statement.
Like, well, what do you
mean the phone is spying?
Is the operating system spying?
Is there a hardware listening
device plugged into it?
Is it a particular app?
And so in many cases, it is an app.

(31:38):
And in many cases, that app is Instagram.
I don't know if Facebook does this
as well, the Facebook native app
on Android, it's quite possible.
It does.
It wouldn't surprise me, but if you
have either of these apps installed on
Android, consider getting rid of them.
if you have to use these
services, consider using

(31:58):
them in your mobile browser.
That'll give you protection
from these kinds of things.
And so was this like, this was
like the, Instagram app on Android
could send a request to your
browser while you're using it?
I believe what was so.

(32:19):
Insidious about this particular
technique is that it could just do
it in the background constantly.
Oh, so you didn't even need
the Instagram app open, right?
Oh, that's weird.
Yeah.
Ugh.
Okay.
Well that's gross.
I, I agree.
Okay.
Yeah, so basically this thing is

(32:40):
just like spying on your traffic.
Nice.
Yeah, it's pretty fucked up.
Hmm.
Well, don't use Facebook
and Instagram, everybody.
Yep.
And if you're interested in the
nitty gritty technical details, this
article goes into pretty good detail.

(33:02):
They have a, a, video that shows them,
like screen sharing, the Android app
with monitoring the network requests.
And you can see.
Just a stream of all of the websites
that they're visiting and how the, the
Instagram app has direct access to that.
Yeah, that's creepy.

(33:24):
This is very, very detailed report.
I love it.
Mm-hmm.
Yeah.
This person knows their stuff.
Maybe we should give 'em a shout out.
What's the author's name?
Uh, well, I think it's
actually a lot of people.
Yeah, this is like, they have a, you

(33:44):
can reach the entire team, local mess at
pm Do me authors are Anna Kath, garish,
una Kar, cio, Velina Rodriguez Nuna.
We are Sakara and Tim Lumin.
So it looks like a lot of PhD students
and some professors, a lot of them have
a, they look like they work at this same.

(34:06):
An organization called I-M-D-E-A networks.
I'm not familiar with that, but
it's probably a research institute.
Yeah, I think so.
Anyway, the link will
be in the show notes.
Check it out if you're interested.
Alright, the last article we
want to take a look at today is.

(34:27):
the headline is Ford Wants Patent
for Tech, allowing Cars to Surveil
and Report Speeding Drivers.
The headline kind of says it all,
but this has not been created yet.
This is not an actual product in the
wild, but a patent has been filed with
the US Patent and Trademark Office,

(34:49):
and it is, the idea is that they.
The car itself can report on the speed
of the car to police, and so basically
you might be going above the speed limit
and your car tells the police on you.
So it sounds very dystopian.

(35:11):
I sincerely hope that this
doesn't actually become a thing.
I hope that if it ever
did become a thing that.
People would just revolt
and refuse to buy it.
But you know, people
are kind of oblivious.
So what do you think says the patent
explicitly states this idea for a

(35:32):
system is specific for application
in law enforcement vehicles such as
the Ford Police interceptor, as it
would automate a capability that law
enforcement already have in used today?
That, so it looks like it's more
targeted at being inside of law
enforcement vehicles specifically.
I guess it's just that Ford makes
consumer cars and police cars.

(35:55):
Yeah, I, I mean, it sounds
plausible, but also I'm skeptical.
You could install it as a feature
in cars and then, you could, uh, set
it up so that, Uh, governments could
pay people to, narc on speeders.
So you could actually get, you could

(36:15):
actually get paid for sharing the data
from your car with law enforcement
and narcing on everyone around you.
Yeah.
It would be like a government app that's
in the similar vein of ways, except
instead of reporting to other ways users
that Oh, there's a speed trap up ahead.
Watch out.
Yeah, you're reporting, Hey
cops, here's this person.

(36:36):
Speeding.
Give him a ticket, arrest him.
Yeah.
And then eventually that would
solve the speeding problem
because no one would speed.
'cause they would know that they
were always, you know, because
everyone would be sharing the data
because they would want to get paid.
And so no one would speed.
Is that, I don't know.
Yeah.
Who, who knows?

(36:57):
It kind of reminds me of.
Have you seen the movie Minority Report?
Mm-hmm.
Yep.
Well, in that movie there's a, it's in the
future and there's, driverless cars, like
self-driving cars, but they're ubiquitous.
You know, every car on the road is a
self-driving car, and there's a scene
where the main character played by
Tom Cruise is running from the cops.
You know, he's been wrongfully accused

(37:17):
of some crime, and there's a chase scene.
And he gets out of his, he, he's
in a self-driving car and it's like
a high speed car chase thing, but
these cars are driving themselves.
So he, he gets outta the car
and he's standing on top of
it as it's driving itself.
And he's, and he's in traffic surrounded
by all these other self-driving cars,

(37:38):
and he's hopping from car to car.
And it's just sort of like a
river of self-driving cars.
Yeah.
And then, uh, it goes over to the side,
like these cars aren't confined to.
Just driving on the ground,
like they can drive on walls.
So the road starts going vertically,
and now he's hopping from like
vertical car to vertical car.
And it's, so I'm just, I'm picturing

(37:59):
the a, a future in which we all
have, like, cars that just report
on everything and are self-driving
and like, we just are completely
taken out of the loop as humans.
Yeah, it could be.
I think the future is, uh, with,
uh, driverless cars, you know.
Which is, there's good, there's good
things and there's bad things to that.
Yeah.
I just fear the, surveillance

(38:22):
technology packaged with them.
Yeah, totally.
But it's, it's where it's going.
So you gotta, you gotta get the
compound off the grid out in the
middle of nowhere with the, the
diesel pickup truck and, you know.
Yeah.
Well, you, you people who
are interested in privacy.
What they can do with regards to

(38:43):
automobiles is you can actually
buy older cars of course, but
not everybody wants to do that.
There is a new car coming out that
has, that explicitly is advertises
itself as not having all of this
newer technology, meaning newer
surveillance technology as well.

(39:04):
Okay, so this is the, the
car I was looking for.
Toyota has a 10 th $10,000 pickup
truck and uh, what is it called here?
The, to Toyota Hilux.
H-I-L-U-X.
Hilux pickup.
Yeah.
And so the idea is that this is
a no frills truck pickup truck.

(39:27):
And it's very basic.
Doesn't have any of the, um, a
lot of modern, extra features, but
it's just a basic pickup truck.
And this is, I guess this is
meant for low GDP markets, which
is like fancy way of saying other
countries where they have less money.
But privacy minded people have

(39:49):
looked at this and thought, Hey,
I'm interested because it has.
None of the modern technology in it that
is now so often used for spine, you know,
it doesn't have the computer tracking on,
um, location and, all of that other stuff.
So I, this is just a fun aside, I wanted

(40:09):
to mention, Hmm, the future for private
privacy respecting automobiles is
difficult, but not hopeless, I would say.
Hmm.
Yeah, that's cool to see.
But, you know, I, I don't know.

(40:29):
it's an interesting like thing
for me 'cause I, I, I do just
think that, you know, the future
is going to be like driverless.
There's just, I don't, I just
don't see like, there's so much
momentum heading in that direction.
I don't see it.
Not, so the, the big thing for me
is can somebody make a dry roadless
car that's like private 'cause.
You know so much of them right now,

(40:50):
like involve them, like being kind of
like a software as a service thing.
And eventually it'll be interesting to
see if we can get to the point to where
you can have a driverless car that's
totally compatible with society and yet,
you know, also maintains your privacy.
I want a sovereign driverless car.

(41:12):
Yeah.
And obviously for a long time we're still
going to be able to take the wheel and,
uh, intervene as a human, but I could
see a time in the future that comes when
there is no steering wheel at all and
you're just completely at the mercy of
the car, in which case it's so important

(41:35):
for the car to work for you, the owner,
not some other company or organization.
Similar to how we advocate on,
on our computers and our phones
and other computing devices.
We want them to be under our
control and working for us.
The dream is that if we get to

(41:56):
a a stage where everybody's got
driverless cars, we would also be
able to do that with our vehicles.
Yeah.
All right.
I don't have anything
else to add on this topic.
I wanna get into a few things today.
So it's been a, it's been several
weeks since we had an episode, and
I wanted to just check in and give a

(42:17):
few updates on some of my sovereign
computing journeys in the wild.
the first thing I wanna talk about is
call is when you have to use a phone
number to sign up for a, a website.
Let's say you're creating a, a new.
Google account or anywhere that
requires a phone number just to create

(42:38):
the account, just to move forward in
your interaction, but you, but there's
really no good reason for them to
actually need your real phone number.
I use a, a couple of services
that I highly recommend.
One is called text verify.com and
the other is SMS for sats.com.
But I'm just gonna focus on Text

(42:59):
Verify because they essentially
provide the same service.
And what it does is you create a free
account and then you, you can buy credits.
So yes, it is a paid service,
and you can do that with Bitcoin.
You could top up your balance
pay with Bitcoin, you could also
pay with credit card, et cetera.
And once you have a balance, you

(43:20):
can then generate a one time phone
number, real phone number that is
used solely for the purpose of getting
a two factor authentication code.
So I'm signing up for Google,
let's say, or Ticketmaster.
They've got all these examples
listed on their website.
Tinder, Yahoo, PayPal,
Uber, Walmart match.com.

(43:43):
DoorDash, when you sign up for
any of these sites, they ask for a
phone number so that they can send
a one time code to just so you can
complete your account creation.
And what you can do instead is
you log into text verified you
generate a one time number.
You paste it into whatever service
is asking for it, and then it gives
you that one time code and text

(44:04):
verified and boom, you're done.
You never have to use that
number again, and they don't
have your personal information.
So even if you have VoIP numbers
that are private that we've talked
about in previous episodes, go check
those out for more information.
Those are still.
Costly to spin up, you

(44:24):
know, they're not free.
so these text verified disposable
numbers provide a really economical
way of getting one time numbers right.
It's, it's maybe, um, 25 cents for 10
cents or something like that for a one
time number rather than where you have
to buy a reusable number and then spin it
up, pay for it, and then destroy it and

(44:46):
get another one that's, it's a lot more
costly and less efficient to do that.
Any, any thoughts on this?
Steven?
have you ever used this before?
I have not used it.
What I have been, using lately,
the, is like, kind of on a
related topic is, simple login.
and, uh, we've talked about that, you
know, previously in terms of giving

(45:07):
emails and it's like a, a similar problem.
You go up to a place, they need
your phone number for something.
Or want your phone number for something?
I've been, going, you know, started going
to the gym more recently and, uh, been
trying out like different gyms and stuff
and like, they, they all just have this
like, just atrocious onboarding experience

(45:28):
or like, even if you just want a day
pass, they like, want all this personal
information and they, they make you fill
it out at like a tablet at the front.
oh God.
It's the most annoying shit.
you know, for the email
I've been really happy.
Previously, I was just been using
Simple Login and the web browser.
Now I've actually started
using the app on my phone.
I found it really a delight to use because
if somebody demands an email from me

(45:50):
through a required input field, I can
pull out the Simple login app and spin
up a, a new one and, you know, maybe
type in the name of the gym or whatever.
And, uh, so it's been super easy.
I've been very pleased with it.
I haven't had any issues with using it.
and if you're already paying for a simple
login through like your proton or a
la carte or whatever, it's very easy.
And they have a nice feature too if you do

(46:11):
need, if you need to give it to somebody
so that they can type it in, like somebody
behind a counter, they have an option
where you can make the email and then
press the button to like, just bring up
a screen on your phone that just displays
that email, which is really handy.
'cause then you can flip the phone
around and just show them the email.
I did have an experience recently
checking into a hotel where I tried
to type in one of my simple login

(46:32):
emails and the tablet, the self check
in tablet, just thoroughly rejected
my email because, um, it was like, not
Gmail, at Gmail or at Yahoo, like it
was set to like only check for like
primary domains, which is so annoying.
But I was just, polite, but
persistent with the person.
No, that's my email.

(46:52):
And they're rejecting it.
This isn't, you know,
they're like, okay, cool.
I'll check you in manually.
Yeah.
They did it.
So, but yeah, I, I haven't tried text
verified yet, but I'd like to try it.
'cause I've had pretty good luck with
Simple Login and being able to just
very quickly make a fresh email for
every single business that I go to.
So it would be nice to, I don't know,
try ever, phone numbers as well.

(47:13):
I'm a big fan of Simple Login as well.
All right.
Yeah, check out text
verify.com at anytime.
You need to create, have a one-time, uh,
phone number, and then if text verified
is not working for you, for whatever
reason, SMS for SATs is a backup.
the next thing I wanted
to talk about is in our.

(47:35):
Yeah, previously in our episode on phone
numbers and, and getting private and
sovereign phone numbers, I talked about
how I was currently using Mint Mobile
and they used to be able to, they, I
used to be able to buy a sim and refill
the EIM by, paying online anonymously

(47:55):
by using Bit Refill where I'd, I'd
buy a vanilla Visa card with bitcoin.
And then I would use that vanilla
visa to pay for my Mint mobile eim.
A couple months ago, mint Mobile,
mint Mobile stopped accepting
those vanilla Visa cards.
they didn't explicitly say
they would stop doing it.
They just, I tried to pay for it,

(48:17):
and the account just kept failing.
I tried multiple times with
different cards and it kept failing.
So I came to the conclusion that they
are just not accepting that anymore.
So what I did was I went to.
A Best Buy in person and I bought a
physical Mint mobile sim card, and I
was able to pay for that in cash without

(48:38):
any identifying information, and that
worked great, except for I don't want
to have to do that every three months.
It was a three month SIM card, and so I
had to go back after three months, I try.
I went back and I tried to buy
a 12 month sim because I figure.
I'm okay with going once a year and
buying a sim in person if it, if I

(48:58):
get a fast bandwidth and I get to pay.
And honestly turns out they do not
sell them for 12 months in person.
I don't know if they previously did and
don't do it anymore, or if they just
never have, but it wasn't available.
So, I got another three month
just to, since I was already
there, but I, I started doing some

(49:19):
more research on other options.
One option I found is a company
called Cloaked Wireless.
They, they have, anonymous eims data eims,
very similar to silent link where you can
pay for these with Bitcoin anonymously.
However, I tried it out

(49:39):
and I do not recommend it.
The bandwidth was incredibly slow,
even slower than silent link.
So for me, that was just not
acceptable as a daily driver.
so I, I went back to my physical mint
sim, but I was still looking for an
alternative so that I wouldn't have
to buy a new one every three months.
And what I found is that at and t

(50:01):
also sells prepaid sim cards, and you
can get 12 months of that at a time.
So when my mint, my current mint is
expired, and that'll be fairly soon.
I'm going to try out the
at and t prepaid sim.
And I will let you know on the next
episode or, or when I've done that,
I'll report back and, uh, that might

(50:23):
be a good option for eims going forward
or for physical sims going forward.
Any thoughts on that?
Not really.
Sounds good to me.
Okay.
the other thing I've been
getting into is I set up a new
self-hosting app on my home server.
It's called Albi Hub.
And for those of you who haven't

(50:44):
heard of this, it's a way of having a
lightning node and some other ancillary
services tied to your lightning
node in a, in a really easy package.
So it's got a nice ux, you
can spin it up real quickly.
It's got, several LSPs liquidity
service providers or lightning service
providers that you can choose from.

(51:06):
And, um, it makes it super simple
to open a channel, have inbound
liquidity, set all of that stuff up.
It's very easy.
I think it was a delight to use.
So I, if you're interested in setting up
a lightning note and maybe you thought
this is a little difficult, I don't
know what to do, check out Albi hub.
Very nice experience.
one of the.

(51:27):
Features that I'm able to use with
Al Beh hub is now I can have self
custodial boosts in streaming.
So instead of having to use Fountain fm,
which is is nice for not having to set
anything up, but it is a custodial wallet,
I now have, through my self-hosted Al Beh
hub, the ability to plug my pod verse app.

(51:48):
And that's a, that's a
podcasting app that has.
podcasting 2.0 features, I'm able
to plug my auth information into
there and now I can stream and
boost to podcasts that I'm listening
to completely from my self-hosted
node node running on my own server.
So that's pretty cool.

(52:08):
Great experience there.
Once it's all set up.
The other self-hosting lightning related
thing that I've got set up now is I've set
up my own self-hosted lightning address
server and Noster name verification.
You might, those of you who are
noster aficionados might know
this as the nip NIP zero five.

(52:30):
Uh, standard and that just, it allows
you to have a human readable name that
looks like an email address for your
noster nub, which is a long string
of characters that nobody ever wants
to type out or have to say out loud.
So, for example, if my email ad, if my
domain name that I own is bravo.com,

(52:53):
I could have a lightning address.
That's jordan@bravo.com.
And I could also have a noser
address, that's jordan@bravo.com.
And so if somebody is going
to send me Lightning, they can
just send it toJordan@bravo.com.
They don't have to request an
invoice from me and have it be
an interactive back and forth.

(53:14):
And then for Noser, likewise, you
could look me up@jordanatbravo.com.
bravo.com is not actually my domain
name, so don't look me up with that.
Uh, I just using that as an example.
One last thing I wanted to say about
this, this self-hosted lightning address
thing, it's called Rust Dress, and our
own Natasha here at Bitlab wrote it.

(53:37):
Oh, and it is called Rust Dress because
it is a lightning address written
in rust, the programming language.
So, uh, yeah, I set it
up and it was super cool.
I got it up in a couple hours of
tinkering and now I'm a fully.
Self custodial self-sovereign
light, not lightning address,
having Bitcoin or nice rust dress.

(54:00):
I did not know he made that.
That's cool.
And I am currently, for those of you who
are Nick's nerds, I'm currently working on
packaging his rust dress application to be
available in the nix packages repository.
So it's as simple as doing like
a single command and nix and
it would install it for you.
Oh, nice.

(54:20):
That's cool.
Then it would just install it and you'd
be able to run it from your Nick system?
Yep.
Or any system that's running
the Nicks package manager.
Got it.
Yeah.
And then from there, I guess you just
need to expose the ports properly to
the outside world and all of that.
Yep.
Suite.

(54:41):
So that's everything that I've
been up to lately in terms of cyber
computing, self-hosting, and privacy.
anything that you wanted
to check in with Steven?
Any, what have you been working on?
Anything related to this topic?
Oh, not really.
yeah, I mean, I think all of my fun
anecdotes I've sprinkled in throughout

(55:02):
the rest of, uh, rest of this episode.
So yeah, I think that's
probably it for me for today.
All right, cool.
Well, that's all we have for today.
We wanna remind you that
you can boost into the show.
We'd love to hear from you.
that would be the
Sovereign Computing Show.
You can look it up on Fountain

(55:23):
and you can use Fountain to Boost.
You can also use other apps
to boost like Pod verse.
I'm particularly enamored
of that one at the moment.
You can also email us
sovereign@atlbitlab.com and.
Yeah.
Anything else you wanted to add, Steven?
No, I don't think.
No, I think so.
I think I'm good.
Alright.

(55:43):
Thanks a lot everyone.
We'll see you next time.
Catch you later.
Hey, thanks for listening.
I hope you enjoyed this episode.
If you want to learn more about
anything that we discussed, you can
look for links in the show notes
that should be in your podcast
player, or you can go to atlbitlab.
com slash podcast.
On a final note, if you found
this information useful and you

(56:04):
want to help support us, you can
always send us a tip in Bitcoin.
Your support really helps us so that we
can keep bringing you content like this.
All right.
Catch you later.
Advertise With Us

Popular Podcasts

Stuff You Should Know
My Favorite Murder with Karen Kilgariff and Georgia Hardstark

My Favorite Murder with Karen Kilgariff and Georgia Hardstark

My Favorite Murder is a true crime comedy podcast hosted by Karen Kilgariff and Georgia Hardstark. Each week, Karen and Georgia share compelling true crimes and hometown stories from friends and listeners. Since MFM launched in January of 2016, Karen and Georgia have shared their lifelong interest in true crime and have covered stories of infamous serial killers like the Night Stalker, mysterious cold cases, captivating cults, incredible survivor stories and important events from history like the Tulsa race massacre of 1921. My Favorite Murder is part of the Exactly Right podcast network that provides a platform for bold, creative voices to bring to life provocative, entertaining and relatable stories for audiences everywhere. The Exactly Right roster of podcasts covers a variety of topics including historic true crime, comedic interviews and news, science, pop culture and more. Podcasts on the network include Buried Bones with Kate Winkler Dawson and Paul Holes, That's Messed Up: An SVU Podcast, This Podcast Will Kill You, Bananas and more.

The Joe Rogan Experience

The Joe Rogan Experience

The official podcast of comedian Joe Rogan.

Music, radio and podcasts, all free. Listen online or download the iHeart App.

Connect

© 2025 iHeartMedia, Inc.