
August 24, 2025 115 mins
On Sunday, August 24, 2025, at 1 p.m. U.S. Pacific Time, the U.S. Transhumanist Party invites Josh Universe to discuss the privacy and security considerations of technological enhancement. His presentation and subsequent conversation will address the following areas:
1. Real-World Breaches
2. Defining Privacy and Security in Context
3. What is Technological Enhancement?
4. Privacy and Security Risks of Enhancement
5. Approaching Enhancement with a Privacy and Security Mindset
6. Actionable Privacy and Security Steps 
7. Questions and Answers
Josh Universe is an American analog astronaut, science communicator, and biohacker. He is the Founder of the International Biohacking Community. He is also the Founder of the Transhumanist Council.
Watch the previous U.S. Transhumanist Party Virtual Enlightenment Salon with Josh Universe of March 30, 2025, where he discussed Applications of Human Biostasis in Crewed Space Exploration: https://www.youtube.com/watch?v=Vf40zZyNqkc 
Visit the website of Josh Universe at https://joshuniverse.com/.   
Read the THPedia entry about Josh Universe at https://th-pedia.org//wiki/Josh_Universe. 
Visit the website of the Transhumanist Council at https://transhumanism.app/ 
Join Josh Universe’s social network, Science.Social: https://science.social/

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Greetings and welcome to the United States Transhumanist Party Virtual
Enlightenment Salon. My name is Gennady Stolyarov II, and
I am the Chairman of the US Transhumanist Party. Here
we hold conversations with some of the world's leading thinkers
in longevity, science, technology, philosophy and politics. Like the philosophers

(00:22):
of the Age of Enlightenment, we aim to connect every
field of human endeavor and arrive at new insights to
achieve longer lives, greater rationality, and the progress of our civilization. Greetings,
ladies and gentlemen, and welcome to our US Transhumanist Party

(00:42):
Virtual Enlightenment Salon of Sunday, August twenty fourth, twenty twenty five.
Today we have an important and in depth conversation for
you about privacy and security considerations of technological enhancement. Joining
me is our US Transhumanist Party Director of Visual Art, Art

(01:06):
Ramon Garcia, and our special guest today is Josh Universe,
who is an American analog astronaut, science communicator, and biohacker.
He is the founder of the International Biohacking Community and
he is also the founder of the Transhumanist Council. We
had a previous Virtual Enlightenment Salon with him on March thirtieth,

(01:31):
twenty twenty five, so please if you haven't watched that salon,
go after this stream and watch it as well. Josh
also previously joined us to discuss THPedia, the Transhumanist Encyclopedia,
and today we will be delving into a topic that
is extremely important to the US Transhumanist Party. In fact,

(01:55):
it is emphasized in section one of our platform. So
the very first section that we adopted on privacy, and
I will read it for you to set the context.
The United States Transhumanist Party strongly supports individual privacy and
liberty over how to apply technology to one's personal life.

(02:17):
The United States Transhumanist Party holds that each individual should
remain completely sovereign in the choice to disclose or not
disclose personal activities, preferences, and beliefs within the public sphere.
As such, the United States Transhumanist Party opposes all forms
of mass surveillance and any intrusion by governmental or private

(02:39):
institutions upon non coercive activities that an individual has chosen
to retain within his, her or its private sphere. However,
the United States Transhumanist party also recognizes that no individual
should be protected from peaceful criticism of any matters that
those individuals have chosen to disclose within the sphere of

(03:00):
public knowledge and discourse. And I think this is a
very important stance for us. I don't see either of
the major political parties taking this stance. Some libertarians may
be sympathetic to it, but the idea is, you are
sovereign over your own information, your own data, what you
choose to disclose, what you choose to keep to yourself.

(03:22):
Once you choose to disclose it, however, you need to
accept the consequences of doing that. In other words, if
you disclose something and somebody doesn't like it, somebody disagrees
with it, somebody criticizes you for it, that is their
right to do and you need to accept that fact
as well. That what you do choose to disclose can

(03:44):
have implications in the world of public discourse and how
people think about you in terms of how people choose
to interact with you. But you are in control of
the information that you put out there. Often that's not
the world we live in today, and there have been
some disturbing erosions of privacy as well as risks to

(04:05):
privacy that are associated with some emerging technologies, and we
have Josh to give us an in depth discussion of this. Josh, welcome.
We're pleased to have you today. Please proceed. Thank you.
So we're going to be going over privacy and
security considerations of technological enhancement. So, did you know that

(04:26):
23andMe, a personal genomics company, actually suffered a
credential stuffing data breach that revealed over
six point nine million users' raw genetic information?

Speaker 2 (04:37):
So, about me. My name is Josh Universe. I'm currently
a student studying astrophysics and commercial enterprise
in space. I'm the founder of the International Biohacking Community
and the Transhumanist Council. I do some work with Viva City,
which is like a network state, and I do some
analog astronaut stuff and some science communication. So first we'll go over

(04:57):
some real world breaches. We need to see, okay, why
is this important? Why do we care? Then we're going
to define what is privacy, what is security, and what's
the difference. Then we're going to talk about what is
technological enhancement. Then we're going to go into privacy and
security risks of enhancement. So what are the possible risks
of enhancing ourselves with these technologies? Then we're going to

(05:18):
approach enhancement with a privacy and security mindset. So what
can we do? How can we enhance ourselves while maintaining
privacy and security as much as we reasonably can. Then
I'm going to provide you guys some really cool, actionable
privacy and security tips. So these are things that you
can do right now. You don't have to have any special,
fancy implants or anything. These are just things that you
can do right now. And then we'll go to contact

(05:40):
information and Q and A. First, for real world breaches.
So the first one is 23andMe. This was
in October of twenty twenty three. The data exposed was personal
information and broad ancestry information, so it wasn't exact. It
was kind of like broad information like, okay, are you
from the United States? Do you have European descent?
Do you have Asian descent?

Speaker 1 (06:02):
No?

Speaker 2 (06:02):
Raw DNA files were stolen. The method is called
credential stuffing. So say you have an email and
a password that was leaked from one website. What
attackers will try to do is take that same email,
same password, and try to reuse it on hundreds of
different sites. So if you reuse your password on
multiple sites, this is a way for you to
potentially get your account information stolen. So do not

(06:26):
reuse passwords across sites. Use a password manager. We'll
get into that at the end of the presentation.
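For context, a password manager defeats credential stuffing simply by making every login unique. A minimal Python sketch of the idea (the site names are placeholders, and real managers also encrypt the vault):

```python
import secrets
import string

# Character pool for generated passwords.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One unique password per site, so a leak at one service cannot be "stuffed" into the others.
vault = {site: generate_password() for site in ["example-mail.com", "example-shop.com"]}
for site, password in vault.items():
    print(site, password)
```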
Next is the Insulet Omnipod. This was in January of
twenty twenty three. The data was personal health data and
IP addresses; the users exposed were around twenty nine thousand. The method
was a site tracker privacy misconfiguration. So this is a

(06:48):
bit complex, so let me kind of explain it. So in emails
you have things called personalized links, which are also kind
of called magic links. When you click one, there are
cookies that are loaded onto your browser from the link. It
contains identifying information, which comes from the
database, sent to your personal email, and it allows you
to log onto the website without having to type in

(07:09):
your password. Well, in this scenario, the cookie that was
in the email magic link was unsecured, which means
that third-party diagnostics like Google Analytics and
third-party advertisers that have tracking scripts on
that website were able to access the cookies from the
website's local storage. Well, somehow those cookies were

(07:32):
associated with health data, unsecured, and allowed those third party
advertisers and diagnostics agencies to access that data, which is
really bad. The device is the Insulet Omnipod
5. Now, this vulnerability was not with the
device itself; it was with the cloud infrastructure with which
the device communicates.
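As a rough illustration of the missing safeguard (not Insulet's actual stack), a web backend can flag a session cookie so that page scripts, including third-party analytics and advertising scripts, cannot read it. A minimal sketch assuming Flask, with a made-up endpoint name:

```python
# pip install flask
import secrets
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/magic-login")
def magic_login():
    """Issue a session cookie after a magic-link click."""
    response = make_response("Logged in")
    response.set_cookie(
        "session",
        secrets.token_urlsafe(32),  # opaque random token; no identifying data in the cookie itself
        httponly=True,   # not readable from document.cookie, so tracking scripts cannot grab it
        secure=True,     # only ever sent over HTTPS
        samesite="Lax",  # not attached to most cross-site requests
    )
    return response

# Run with: flask --app this_file run
```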

(07:53):
Next is Zoll Medical. Yes, this was in January of twenty twenty three again, the
same month. The data exposed was personal health data from
a cardiac defibrillator device; it is a wearable medical device. The users
exposed were over a million. The method was a simple
network misconfiguration, so an attacker got unauthorized access into an
internal network and was able to access the data. Privacy

(08:17):
versus security. What is privacy? Privacy is about the control
over who can view and use your data. So we're
talking about controlling who can have my data and who
can use my data. So an example is ensuring only
authorized officials are given access to a patient's medical records. This
is associated with the HIPAA laws. What is security? Security

(08:38):
is about protecting our data from unauthorized access. This
includes preventative safeguards to ensure that the data we
have is secure and that malicious actors are not able
to access it. For example, implementing encryption and 2FA, two
factor authentication, in a hospital's EHR system, which stands
for electronic health records. Privacy versus security. So here are

(09:01):
a few scenarios, and I'm going to ask, okay,
is this a privacy scenario or a security scenario?
This will kind of help you understand the difference between
the two. First, a brain-computer interface company sells aggregated
neural activity data to
advertisers without the users' consent. This is a privacy breach
because we're saying the company is sharing

(09:25):
that data with advertisers without our consent. So it's about control
over who has access to the data. Right. This wasn't
unauthorized, because we, like, signed some end user license agreement
and there was no exploitation going on. But the data
is being shared without our consent. Yeah, this is
concerned with how sensitive data is collected and shared. Next,

(09:47):
a hacker or exploiter remotely gains control of a smart prosthetic limb.
Well, this is a security vulnerability. This is not
a privacy vulnerability, because the hacker or exploiter is getting
unauthorized access to the data. Yeah, this is concerned with
unauthorized usage and access of the device. We don't want
that to happen; a lot of bad things could happen

(10:09):
if such a scenario actually played out. The third one is:
a government mandates its citizens to get a closed source, which
means the source code is not publicly viewable and auditable,
NFC implant, which is like a subdermal implant. We'll
get onto what that is in a moment. This is a privacy
vulnerability because this is dealing with how sensitive data is
collected and shared. So whatever data you might have

(10:29):
on an implant, say in a few years, when you have
implants that have a lot more data and communication capabilities,
if you're giving that data to the government, that's providing
the government with a lot of data that they could
use to potentially manipulate you or even serve you ads
through contracts with private organizations. So what is technological enhancement?
This is technology designed to enhance physical or mental capabilities.

(10:53):
Here are some examples: subdermal RFID slash NFC implants. I will
get onto what those are on the next slide. Even
smartphones and wearables. You might be like, what, a smartphone's not an implant? Well,
a smartphone enhances your mental capabilities. You carry it around with
you all the time, even though it's not physically attached
to you. You have it on you almost twenty
four seven. Same with smartwatches and other wearables

(11:15):
like Whoops and Ouras and all sorts of other different
smart trackers. Next is brain-computer interfaces. So these are
devices that can read and write information to and from
your brain. So an example could be the Neurosity Crown
or even the Neuralink. And finally, medical devices like pacemakers
and other ECG monitors, Apple Watches, and especially this device

(11:39):
right here, the Zoll Medical LifeVest, and this device
right here. So what are subdermal RFID/NFC implants? If you
don't know what they are, they are things that are
implanted subdermally in your skin, and they use a thing
called RFID and NFC. NFC is like a high
frequency RFID; that's what you use with things like Apple Pay. We

(12:00):
have the capsule type implants, the flex type implants, and
these are custom implants. This one is called the PegLeg.
It's basically like a Raspberry Pi that can
store data, and it has the capability to serve as
like an offline mesh network. This is very custom;
don't try to implant something like this. This one is by
Dangerous Things, which is a subdermal implant manufacturer. This is cool.

(12:22):
This has JavaCard applets, so you can do things
like unlock your Tesla, or use it as a two factor
authentication provider, kind of, so you can scan
your phone on your arm and it can show
your 2FA codes. I actually have a video of
that on my YouTube channel where I actually do that. You
can use this as a U2F or a FIDO2
NFC security key. So if you have, like, one

(12:42):
of those little YubiKeys that you put on the back
of your phone, you can actually use it for something like that.
I think I do have one. Yeah, it's basically like this;
it's like a YubiKey. You can basically use an
implant just like an NFC YubiKey. And then the capsule
implants are very basic. They can just read and write
basic NDEF data, like your website, your contact, or

(13:05):
contact information. It's like text, like a string of text
on it. You can scan it and see the contact;
scan it, it'll take you to a website, et cetera.
So that's what subdermal RFID and NFC implants are,
just if you did not know.
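To make the "basic NDEF data" idea concrete, here is a small Python sketch that encodes the kind of record a capsule implant typically holds, a single URL, assuming the third-party ndeflib package (the URL is a placeholder):

```python
# pip install ndeflib
import ndef

# A capsule-style implant usually stores one small NDEF message, e.g. a URL or contact card.
record = ndef.UriRecord("https://example.com/my-contact-card")

# Encode the message into the raw bytes that would be written to the tag's memory.
octets = b"".join(ndef.message_encoder([record]))
print(len(octets), "bytes to write:", octets.hex())

# Decoding works the same way in reverse when a phone scans the implant.
for decoded in ndef.message_decoder(octets):
    print("decoded:", decoded.iri)
```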
Okay. So: privacy and security risks of technological enhancement. Privacy risk one: data exploitation. So here,

(13:26):
we're not really considering current technologies like current subdermal
RFID implants and current publicly available BCI technology, because there's
not too much development or too much risk right now, but
especially in the future, these are very important to consider.
The risk: neural implant data or biometric information is sold
to advertisers or other actors without informed consent. This is

(13:49):
a privacy risk. Outcomes: unauthorized parties have access to your data. Well,
this could lead to things like higher health insurance or life
insurance premiums. If your biomarkers are not healthy, the information
on your respective biomarkers or even neurological signals, however they'd
be able to extract those, then that data is
sent to a third party. They can analyze that data

(14:10):
and they can be like, hey, let's increase
your life insurance rate. You don't want that. Privacy
risk two: workplace surveillance. Well, what if your workplace requires
you to get a workplace mandated RFID implant, or some
more advanced RFID implant where they're able to do remote tracking?
They can't do it now with current implants, but in the future

(14:30):
that is something that could happen, or a neurological input like
a brain-computer interface. Outcomes: employers may make compensation
packages vary based on performance data that could be
extracted through your biomarkers, your neurological brain waves, and other sources.
They could also aggregate that with, like, a time tracker on

(14:51):
your computer, and they can be like, okay, you weren't
being so productive, We're going to pay you ten dollars
an hour instead of your normal thirty dollars an hour. Also,
employers may have access to some of your data outside
of workplace hours, especially if it's an implant mandated by the workplace. Actually,
I did see something on the news where a workplace
mandated, like, subdermal RFID implants. These are like the

(15:12):
normal ones that we have today. There was no privacy
or security risk, besides, if an employee scanned their implant
to get into like an access controlled door, then they
can see, okay, this person scanned it at this time,
which is the same thing that could still happen if you
use, like, a phone to scan into a door.
So, you know, it's not necessarily a risk from

(15:34):
the implant itself, but especially
with more capabilities in the implants, there's going to be
a higher risk, especially for things like workplace surveillance.
Privacy risk three: government surveillance. We've been seeing a lot
of this on the news. Government agencies may provide incentives,
right, so they may make it seem like a good thing,
or require citizens to provide data from biological or neural interfaces.

(15:59):
For simplicity, if I'm saying a biological interface, it's like
a subdermal implant or anything but a neural implant or
brain-computer interface. For a neural interface, I'm just talking about,
like, a brain-computer interface, as of now. Outcomes: potential discrimination
in services. So, government offered services based off of information
from a neural interface, especially if aggregated

(16:21):
with data from other sources like security cameras, Internet of
Things devices. This could provide these systems, which
would likely be automated, these automated systems, higher levels of confidence. Right,
so they see that you might have littered, and then
they see some type of associated biomarker or

(16:41):
some type of neural output, and that provides a higher confidence. Well,
you could be discriminated against in using some of their services.
Or if you don't like the current leadership of an
administration and they're able to sense that based off of
data off of a brain-computer interface, well, they
could also limit some access to services, or even raise
rates on, like, health insurance, or limit the types

(17:02):
of schools you can go to. A lot of this
stuff is actually done in China, but I've seen things,
you know, with the European Union, it's been passing
a lot of regulations, the United States has too. So this
is something that could potentially become something of concern.
Security risk one: remote modification. So, motorized devices like connectivity-enabled

(17:24):
prosthetic limbs or exoskeletons could be accessed
remotely without proper security and encryption protocols. So, some outcomes:
physical harm. If someone has remote control access to a
prosthetic limb, they can make it hurt you,
smash your face, or do all sorts of other things

(17:45):
that could cause unwanted harm against yourself or even others,
which could involve a lot of liability. Yeah. Unauthorized use
of other authentication systems. So if your prosthetic arm has
access, or let's say your prosthetic hand has
access to a door, and they have these detachable hands,

(18:06):
so if they are able to remotely make a hand move,
which you can actually look up on YouTube, there's
a video of a prosthetic hand that can actually
move on its own, like, disconnected from the arm. So say,
in the future, if those have authentication credentials, those could
be used to open authenticated devices like workplace doors. Security

(18:29):
risk two: device theft. So certain types of devices that are
easily detachable, such as non-invasive brain-computer interfaces like
the Neurosity Crown or those prosthetic hands, could be stolen,
and the credentials on them could be used for unauthorized purposes,
or the data that is stored on those could be extracted
if there's a known device vulnerability. Outcomes: stolen interfaces

(18:53):
with identity or payment modules could be used for fraudulent
payments or identity theft. So say you have something like
identification documents or a DID, which stands for decentralized identifier. You
have one of those on a brain-computer
interface or detachable biological interface. Those could allow people to

(19:14):
go and run your credit card up with your hand.
Security risk three: interface malware. So this is a
bit of an advanced one, and this would require an
interface with a bunch of complex systems. But if there's
not appropriate sandboxing done to the processes that run
on a biological or neural interface, then anything can happen.

(19:36):
It depends on the capabilities of the interface. So if
it's a critical interface like a connected pacemaker, and someone
is able to put malware on that, that pacemaker
may not work and that could lead to death. Same
for a neurological or a neural interface. If in the
future there are advanced neural interfaces that can write data

(19:57):
to the brain, and that's able to contract
some type of malware on it, then they're able to
completely mess up your brain. We just
don't know; there are all sorts of potential scenarios that can
happen where they could control your brain or mess
all sorts of different things up. In any case, we do

(20:18):
not want this to happen. Approaching enhancement with a privacy and
security mindset. So what can we do? Well, first thing,
we want to adopt open source solutions. So open source is
where the source code of something is publicly viewable and auditable.
So if you go on GitHub, all those repositories, those
are things called open source repositories. So people are able

(20:40):
to see the code, they're able to audit the code
for any security or privacy vulnerabilities, and they're able to
contribute to the code. Yeah. Pros: it allows public auditing of
the code and of how data is handled, and of what subprocessors, if
any, have access to the data. So if someone has
a brain-computer interface and they look at the software
on GitHub and they see that there's a POST

(21:01):
request being sent to Google, then they could be like, hey,
they can notify the community of that respective implant and
be like, hey, this is sending information to Google. We need
to either just not use this product, or we need
to submit a pull request to get that removed. Some
cons. So some drawbacks could be: open source software may be
vulnerable to security exploits if there's a complex system, so

(21:24):
there's like a lot going on, and if there's a
small number of contributors, so, you know, if there's not
a lot of people looking over the code and something happens
to slip by in a pull request, well, that can
present a lot of vulnerabilities. And with open source software
you're able to just look at the code, so
malicious actors can look at the code and be like, oh,
there's a vulnerability, let me exploit it. Some examples of companies

(21:48):
that are using open source in their products is Neurosity.
They have a thing called the
Neurosity Crown, and I can actually show you that, one moment.
So this is the Neurosity Crown. It is a wearable
brain-computer interface. You put it on like this; it's really cool.

(22:12):
You can use a JavaScript SDK, which stands for
software development kit, and run custom applications on it. I
did a research experiment in North Dakota where I
was able to just think, and I was able
to basically send messages to ChatGPT and basically get a response
back through an earpiece, which is pretty cool. Another

(22:33):
one is OpenBCI, Open Brain-Computer Interface. They make
cool brain-computer interfaces and they make them very affordable,
and they also provide you the 3D printing files,
so if you wanted to 3D print the structure, they'll
just tell you the components and then you can build
it yourself for a very affordable price. Omi, this
is a new player. They make things called AI pendants

(22:55):
and smart glasses. They're all open source on GitHub. You
can actually buy the developer kit on their website. They
are making a more polished one
that's meant for, like, normal consumers, not hackers or
biohackers or whatever. So they're also a cool company to look
out for. Number two: using widely adopted protocols. Well, we've

(23:16):
heard a lot about this, especially with the advent of
a thing called Bitchat, which is like a Bluetooth powered
mesh communication app, and they've tried making their own proprietary algorithm.
Security researchers looked at that and they're like, here's this vulnerability,
here's another vulnerability, here's another vulnerability, and they suggested to them, hey,
this can all be fixed if you adopt the Signal Protocol.

(23:37):
The Signal Protocol is a communication protocol used by the
Signal messenger, which is one of the best known messengers
for privacy and security. So this is one thing that
we need to look out for, we need to implement, and we
need to advocate for: using widely adopted protocols. So yeah,
it is encouraged to use open source and widely adopted
protocols when developing hardware or software for enhancement tech.
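In the same spirit, here is a minimal Python sketch that leans on PyNaCl (bindings to the widely reviewed NaCl/libsodium library) instead of a homegrown cipher. It is a stand-in example of "reuse vetted cryptography," not the Signal Protocol itself:

```python
# pip install pynacl
from nacl.public import Box, PrivateKey

# Each party generates a keypair; only the public keys are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob with authenticated public-key encryption (Curve25519 + XSalsa20-Poly1305).
alice_box = Box(alice_key, bob_key.public_key)
ciphertext = alice_box.encrypt(b"meet at the lab at seven")

# Bob decrypts and, in the same step, verifies the message was not tampered with.
bob_box = Box(bob_key, alice_key.public_key)
print(bob_box.decrypt(ciphertext).decode())
```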

(24:01):
Pros: significantly higher confidence in the security standards of hardware
and software, especially for something like the Signal Protocol
that's been around for years.
It's been tested, it's been poked at, it's been everything, right,
and any vulnerabilities have probably already been addressed years ago.
So any new app that adopts the Signal Protocol is

(24:22):
automatically going to have a much higher level of confidence
from its user base that it's private and secure, and the same
goes with subdermal implants and neural
interfaces. Another pro: better interoperability. That means it's going
to work better with other devices, so that allows you
to build a better ecosystem of devices, of, like, neural

(24:42):
interfaces that also communicate with biological interfaces. So it could
get brain data and it can also aggregate it with
biomarker data and it can provide some type of response
from those respective aggregated outputs. Yeah. It could also lead to
things like mesh networks or app development. So say you

(25:03):
have multiple different types of neural or biological interfaces, you
can make a centralized app repository,
like on Linux, something like the AUR, the
Arch User Repository, or the Flatpak store, which are basically
centralized places where you can actually download apps and they
work across different types of systems. So something like this

(25:24):
could be built for, you know, subdermal implants or
neural interfaces. Once again, this company right here, Omi,
they're actually building something like this. They're actually building a
thing where people can make open source apps and they
can put them on GitHub, and they can actually get
paid for the amount of users that use the app,
no data being sold or anything like that. It's basically
the number of users that use an app. They can

(25:45):
get paid by Omi, because they are both
open source and a company that makes products;
they'll sell products, but the
products themselves are open source, which allows them to reward developers.
So creating a similar ecosystem like that for subdermal implants
would also be a great benefit of using a

(26:06):
widely adopted protocol. Number three: stronger privacy regulations. This
has been under attack so much recently. Yeah, more companies
should adopt privacy regulations like the GDPR and the California
CCPA. Pros: fewer subprocessors, those are third party companies,
would have access to your sensitive data. Also

(26:30):
a greater degree of individual autonomy. Right. So this allows
you, say with a neural interface, to not have to feel
nervous about thinking about, you know, not liking the
administration of a current government, or not liking, you know,
a newly passed law of the current government. Right, you
won't have to have that worry if, you know, if
the code is open source and you're able to see that, hey,

(26:51):
no data is being sent to a third party subprocessor.
And it's also backed by these stronger privacy regulations. Cons:
compared with companies that monetize proprietary algorithms, it may require some
type of subscription fee or a higher upfront cost. Right, if
something is free, well, they have to
make money somehow. How is Gmail making money? Well, they're

(27:13):
making money by taking data across all the different services
that they operate and selling it to third party advertisers,
so they can show you ads that they think you
would be interested in, and those ads receive significantly more
money than just normal, non-targeted ads. So
if something like this was implemented, right, then you're going

(27:33):
to have to pay a bit of a higher price,
so that company is able to make money on the
products and services that they provide, and some type of
subscription fee if they are also making software that is
constantly updated and maintained. Now, if we did use open
source software, that would help with that con. Number four:
on-device data attestation. So this is something that's

(27:56):
used right now in things called decentralized identities, or,
yeah, decentralized identifiers, called DIDs. And this basically,
it's basically like if you go to a website that's
an adult website and they want
to check, hey, are you eighteen years old? Well, we've
already seen this right now going on with a lot of

(28:17):
websites, where I think European governments want
to restrict access to all sorts of
internet websites based on your age and stuff. So this
is an open source way this could be implemented:
you can have your credentials stored on the device,
not in the cloud, so we're reducing who has access

(28:37):
to your data. And then if someone
needs to know if you have a specific piece of
information on your device, you're able to do a thing
called attestation, basically saying yes, I have this information.

Speaker 1 (28:50):
Now.

Speaker 2 (28:50):
This would be done using, probably, public-private key cryptography,
which is a more secure way of doing it and also
ensures they actually have the data, instead of just saying
yes, I have the data, or providing the full amount
of data in, like, a plain text format, which is insecure.
So data types you could attest to: medical data. So
we'd want this to only be accessible in emergency situations,

(29:12):
and this would be backed by biomarker status, for safety.
Say your heart rate flatlined, you did not have a
heart rate, then any, like, subdermal implants or epidermal
implants or neural interfaces would be able to change
to a state where they are able to attest medical
data, like are you an organ donor, or do you

(29:33):
have a cryopreservation contract, and so what are
the steps to provide that cryopreservation care before,
like, the emergency team arrives or something. Another one
is access credentials. So, a decentralized identifier, or if you
have a door authentication credential like an ID card for work,

(29:53):
you can also make it so instead of having to
carry around a wallet, which could be easily stolen, you
could have it on an implant. And if
you're authenticating with the door, so you're scanning your hand
against a door, all you
can make it do is, say, okay, the door can
say, hey, do you have this piece of data or
do you have this public key, and it can be like, yes,

(30:15):
I have this public key, generated from the private key.
Oh, the keys match up, okay, I'll let you in. And
if we want to do this in a more privacy
preserving way, you can just have it attest that
data, not who it is, or, you know, what is
your name, what is your email. Just say, hey, I
have this authentication credential, and then it would let you in.
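A minimal Python sketch of that challenge-response idea, using Ed25519 from the cryptography package. The "door" sends a random challenge, the implant signs it with a private key that never leaves the device, and the door only ever learns the public key (the device and reader here are hypothetical):

```python
# pip install cryptography
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrollment: the implant generates a keypair; only the public key is registered with the door.
implant_private_key = Ed25519PrivateKey.generate()
door_known_public_key = implant_private_key.public_key()

# Authentication: the door issues a fresh random challenge so old responses cannot be replayed.
challenge = os.urandom(32)

# The implant proves possession of the private key by signing the challenge.
signature = implant_private_key.sign(challenge)

# The door verifies the signature; it learns nothing about the holder beyond "this key is authorized."
try:
    door_known_public_key.verify(signature, challenge)
    print("Access granted")
except InvalidSignature:
    print("Access denied")
```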

(30:37):
Yeah, DIDs, which stands for decentralized identifiers. So yeah, pros: selective
data disclosure. So say, once again with this medical data,
say you're storing all sorts of biomarkers, you're storing thousands
of different types of biomarkers at thousands of different collection times.
Well, you can make it so you're only attesting, okay,
the data that needs to be attested, the data that

(30:58):
is requested, instead of giving all of your data, which
creates a much greater security and privacy risk because there's
a greater attack surface. Another one is a reduced attack surface. Right,
so you're sharing less data. It's called data minimization, which
is a common privacy practice.
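Data minimization can be as simple as answering only the question that was asked. A tiny Python sketch (the biomarker record and the requested fields are made up for illustration):

```python
# A full on-device record: far more than any single requester needs to see.
biomarker_record = {
    "resting_heart_rate": 58,
    "blood_glucose": 92,
    "organ_donor": True,
    "cryopreservation_contract": True,
    "location_history": ["..."],  # never shared
}

def attest(record: dict, requested_fields: list[str]) -> dict:
    """Return only the fields the requester asked for, nothing else."""
    return {field: record[field] for field in requested_fields if field in record}

# An emergency responder asks two yes/no questions; heart rate, glucose, and location stay on the device.
print(attest(biomarker_record, ["organ_donor", "cryopreservation_contract"]))
```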

(31:18):
Actionable privacy and security steps. So these are things where you don't need a fancy implant,
you don't need anything. These are things that everyone can do,
that you can do for free, that significantly enhance both
your privacy and your security online. First is: use a
password manager. Why would I want to use a password manager? Well,
once again, if you're reusing your passwords across multiple different

(31:40):
sites and one of those sites suffers a breach of information,
people can once again use a credential stuffing attack. So
they're using that email, they're using that password from one
site on thousands of different sites.
It provides a greater attack surface for people to access
more of your accounts. Password managers I recommend are Proton Pass,

(32:00):
Bitwarden, and KeePassXC. So Proton Pass is by the Proton
Mail company, now called just Proton. They have a free plan.
They are backed by a reputable company. They've been around since twenty
fourteen, I think, and they are cross platform, so
it works on iPhone, Android, the web, macOS,
and Windows. Cons: some features require payment, which, you know,

(32:24):
you could say it's not really a con because they're
not selling your data, but they do require payment for
some features. Bitwarden, this is widely known. This is
probably the most popular, in my opinion. They have a
free plan. They are backed by, once again, a
reputable company. It's also open source, so people can see
all the code, audit it for security

(32:44):
vulnerabilities and privacy vulnerabilities and submit changes, also called pull
requests, to address those concerns. And it's also cross platform.
It's basically available on every single type of device, every
single type of browser. Once again, cons: some features, like
TOTP, which is like your 2FA authentication codes,
those require payment. I don't really think of it

(33:05):
as a con, because you really shouldn't be storing your
2FA codes in your password manager, right, because if
your password manager gets breached, well, then your 2FA
codes get breached, and that doesn't really serve the purpose
of 2FA. Also, KeePassXC is local, right,
so you're not storing your data on a cloud like
Bitwarden or Proton. You're storing it fully locally. You
have full control of your data, and you know who

(33:27):
can access it, you know who can have access to your data.
It's fully free, right, there's no payment required, it's open source,
it's fully local. So the cons: it's not cloud based,
which may be inconvenient for some. There are workarounds. You
could take your KeePass database, which is
like a little database file, and you can put

(33:48):
that on something like Google Drive or Proton Drive and
have it encrypted, so you have the best of both worlds.
So now, two: 2FA. 2FA helps prevent some simple account
takeover attempts from someone just knowing your username and password
or having access to your email address. So this is from
least secure to most secure. So SMS, so like a

(34:09):
text message, this is the least secure, but it's better
than nothing. This is vulnerable to a thing called a SIM swap attack,
which is where someone is doing a thing called
social engineering. They'll call up your carrier, whether Spectrum or
T-Mobile or Verizon, and they'll be like, hey, I lost
access to my account. Maybe they'll play, like, a noise
of a crying baby in the background to make

(34:30):
the representative more likely to hand over your information or
change the phone that your SIM is on. This
is actually quite common. Most providers fall victim to this.
I think it's about, like, eighty percent, don't quote me,
don't quote me on this, I think it's about,
like, eighty percent of these social engineering attacks that are actually successful.
So this is something to watch out for. Someone can

(34:51):
literally call your phone provider. If they know your phone number,
they can look up, okay, who's your phone provider, and
they can be like, hey, I lost access to my phone.
I lost my phone or something. And if they do
a little bit of OSINT on you, which is
open source intelligence, they can find where you live. They
could find your birth date, then they can use that
information to help kind of verify. So say the support

(35:14):
representative asks, hey, when's your birthday, they have
the information right in front of them. They'd be like, hey,
this is my birthday, and it'll make these customer
support representatives more likely to change over your SIM to
the attacker's phone. Which then, if they
try to log into your account and it has
SMS 2FA, they can get a code

(35:34):
sent to their phone and you would never know, right.
So if someone did a SIM swap attack, you would never
know until you try to send a message to your friend,
or someone calls you, and you realize it's not
going through.

Speaker 1 (35:48):
Uh.

Speaker 2 (35:48):
And also in very rare cases, unencrypted SMSs can be intercepted.
But this is for a very high threat model. This
is something that you know, a government adversary or a
local police department would have to implement with something called
a Stingray, or a cell-site simulator, attack. These
are not too common unless you go to things like
protests and stuff. And if your phone has

(36:12):
2G enabled, so even if you have, like, a
5G phone, it can still downgrade the signal to
2G, but this can be disabled on Android devices.
Next is authentication apps. This is also known as
TOTP, or time-based one-time passwords. This is not
phishing resistant, because someone can have a fake website or

(36:32):
a request. So say someone sends an email saying, hey,
we need your 2FA code. It could lead
you to a fake website, and you could put in
your 2FA code from your phone onto that website, and
then they could send a GET request or a POST
request to, like, a login form or something on the
actual website. This is very rare, but it could happen.
But this is, like, ten million times more secure than SMS.
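For reference, those authenticator-app codes are just a deterministic function of a shared secret and the current time. A minimal sketch assuming the pyotp package, with a secret generated on the spot purely for demonstration:

```python
# pip install pyotp
import pyotp

# At enrollment, the site and your authenticator app agree on a random base32 secret
# (normally delivered as a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Your app derives a fresh six-digit code from the secret and the current 30-second window.
code = totp.now()
print("current code:", code)

# The site runs the same computation and checks that the submitted code matches.
print("accepted:", totp.verify(code))
```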

(36:56):
The most secure are those security keys, right, so those
YubiKeys, or if you have, like, a VivoKey
Apex Flex, those have the same type of NFC
security key functionality. These are fully phishing resistant; as far as I
know, there have been no known vulnerabilities or exploits.
They also require physical access to the device, which makes

(37:17):
it harder for someone to phish you or social engineer you.
Even if a fake website requests it,
it can't do it, because the site host itself has
a key, and then it uses some
type of public key cryptography to match those keys and
verify they're accurate, so you can't be phished with these.
You can also add a PIN for extra security. So,

(37:39):
say, when it prompts
you to use a security key, you plug it into the computer.
It'll also ask you for a PIN if you set
that up.
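A rough sketch of why security keys resist phishing: the signed login response includes the web origin the browser actually talked to, so a look-alike domain produces a signature the real site rejects. This is a simplified illustration of the idea, not the actual FIDO2/WebAuthn wire format:

```python
# pip install cryptography
import json
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key = Ed25519PrivateKey.generate()        # lives inside the security key
registered_public_key = key.public_key()  # stored by the real site at enrollment

def key_sign_login(origin: str, challenge: bytes) -> tuple[bytes, bytes]:
    """The browser tells the key which origin it is on; the key signs origin and challenge together."""
    payload = json.dumps({"origin": origin, "challenge": challenge.hex()}).encode()
    return payload, key.sign(payload)

def site_verify(payload: bytes, signature: bytes, expected_origin: str, challenge: bytes) -> bool:
    data = json.loads(payload)
    if data["origin"] != expected_origin or data["challenge"] != challenge.hex():
        return False  # phishing domain or replayed challenge
    try:
        registered_public_key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False

challenge = os.urandom(16)
print(site_verify(*key_sign_login("https://real-bank.example", challenge), "https://real-bank.example", challenge))
print(site_verify(*key_sign_login("https://rea1-bank.example", challenge), "https://real-bank.example", challenge))
```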
Some authenticator apps I recommend: Proton Authenticator. This is very new, but it's backed by a very
reliable and reputable company, Proton. It's also cross platform. Another

(37:59):
one is called Aegis Authenticator. This is only on Android,
but it is free and open source and also has a
nice minimal design. Security keys: I only recommend the YubiKey
by Yubico. I'm not sponsored by them, but these
are like the only things I've seen used, and these
are like what all websites say: pull out your YubiKey.

(38:21):
These are kind of like the mainstream thing. They're established
and they're a trusted company. Millions of people use them.
People at Cloudflare use them. People at almost every company
use them for 2FA. They're well supported, based on
the FIDO2 protocol and the U2F protocol. They also
have an authenticator app, so you can take your YubiKey,
either plug it into your computer or tap it against
the back of your phone, and also get those authentication

(38:43):
codes for the time-based one-time passwords.
And then, yeah, this is for the YubiKey 5C
NFC series or higher, so not just the Yubico
Security Key model, that does not support it. Specifically, the
YubiKey 5C NFC or higher. Special keys like
the Bio Series, which is right here, can also

(39:05):
require a fingerprint for authentication, which provides much more security. Third,
I recommend using something like Proton Mail or
Tutanota, now called Tuta, for email. Why? Standard email providers
like Gmail and Yahoo are known for collecting email metadata,
which is basically, like, who did you send
an email to, at what time, what's

(39:27):
the subject? What are any relevant attachments? So it's kind
of like everything but the email body itself, and sharing it
with third parties. They also have been known to share
like the full email body in the past, but they
no longer do this, which is pretty good at least
for Gmail. I can't really say for Yahoo. Providers like
Proton do not sell email metadata or the full email

(39:50):
body content to third parties, and they also minimize data
collection, right, so, say, if a government or some
other type of company tries to request information, they won't
have the information, because they've already done the
data minimization. Proton Mail, they have a large ecosystem of services.
They've got Lumo, which is like an AI, Proton Pass,
Proton Wallet, which is like a Bitcoin wallet, Proton Docs,

(40:12):
which is a Google Docs alternative, Proton Mail, which is,
like, the flagship product that's been around forever, Proton VPN, Proton Drive,
Proton Calendar, Proton everything. It also has a better user
interface, in my opinion. There's also Tuta; it's more affordable.
But the biggest thing with Tuta is, if
you have, like, a business, they have very nice white

(40:32):
label functionality, including custom webmail domains. So you could
do something like mail.transhumanist-party.org
and it will load up their webmail interface. Proton cannot
do that. Fourth: adding your email to the Have I
Been Pwned leaked credential detection. This is free. Just search
for it on Google, DuckDuckGo, or Brave and it'll

(40:56):
show up. So why? Your email and passwords from data
leaks are shown on the dark web or even on
the clearnet, like the Tea app leak, which we'll get
to in a moment. And yeah, this makes it easier
for attackers to hijack one or more of your accounts,
especially if you reused passwords. So what this will
do is it will constantly scan recent data breaches on
the dark web and notify you, hey, your email has

(41:18):
been detected in one of these data breaches. Here's the
data breach name, here's the website. We recommend you
change your password. And then you can go and change
your password as soon as you're able to, and know
before attackers are able to see your information and try
to attack your other accounts.
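Have I Been Pwned also exposes a free Pwned Passwords API you can script against yourself: only the first five characters of the password's SHA-1 hash ever leave your machine, and the matching is done locally (k-anonymity). A minimal sketch:

```python
import hashlib
import secrets
import urllib.request

def times_pwned(password: str) -> int:
    """Return how many times a password appears in known breaches, via the k-anonymity range API."""
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    request = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",  # only the 5-character prefix is sent
        headers={"User-Agent": "password-hygiene-check"},
    )
    with urllib.request.urlopen(request) as response:
        for line in response.read().decode().splitlines():
            candidate_suffix, count = line.split(":")
            if candidate_suffix == suffix:
                return int(count)
    return 0

print(times_pwned("password123"))               # appears in a huge number of breaches
print(times_pwned(secrets.token_urlsafe(24)))   # a fresh random password: almost certainly zero
```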
Five: limit who gets sensitive data. Oh boy, the Tea app. We've seen this a lot

(41:39):
on the news. Why? The more companies with access
to your sensitive data, like identification documents, your social security number,
your date of birth, et cetera, the higher the risk of that
data being compromised from a breach of a service. Right, there's more
people who hold that data, so there's a higher risk
of one of those individual providers being exposed and your

(41:59):
data being on the dark web or even the clearnet.
An example: over thirteen thousand private images, including identification documents
and face verification pictures, along with location metadata, and over
one point one million private messages, across three separate
data breaches on the Tea app, which is like an app

(42:21):
for people to post like pictures of their significant other
and provide like green flags, red flags and kind of
like vet them. The data was obtained through a public
storage bucket, no password, no protection. It was posted
publicly on 4chan; it was a public storage bucket. This is literally

(42:43):
like going to a website and just seeing the images there.
There's no encryption, no database, no server to get into.
It's literally, you can go onto a website and see it.
Say you go to the website, you go into your
developer console, go to the network tab, you can see the
POST and GET requests, and you can see, okay, oh,
there's a GET request from this website, right, and it'll

(43:05):
show the storage bucket. You can just go to that
website and you can query, which is like search, all
the different pieces of data on it, in order to
show it to you. No password, no nothing. So this
is a very, very, very bad privacy and security vulnerability.
Like, the worst thing someone could do
is have a public storage bucket.
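To illustrate how low the bar was: an unauthenticated storage bucket is nothing more than a URL that answers a plain GET request, with no key, cookie, or login. A sketch with a made-up bucket address (do not probe real services):

```python
import urllib.request
from urllib.error import URLError

# Hypothetical misconfigured bucket: anyone who finds the URL can fetch from it anonymously.
BUCKET_URL = "https://storage.example-bucket.test/user-uploads/"

try:
    with urllib.request.urlopen(BUCKET_URL) as response:
        print(response.status, response.read()[:200])  # a single anonymous request returns the data
except URLError as error:
    print("(placeholder address, as expected, does not resolve)", error)
```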

(43:29):
So besides that, some news: I actually did make a new Privacy and
Security channel in our Discord and forums to kind of
encourage more of this type of discussion and hopefully, you know,
maybe make some change. Here's our main website for the
Transhumanist Council. I'll give you guys just, you know, a
few seconds each to scan those, and then I'll go
on to the next slide, and then the main website for

(43:56):
the International Biohacking Community and our forums, so you can
scan here to go to the homepage and scan
here to directly sign up. We also have a mobile app

(44:28):
for our forums. We have one in TestFlight, which
is kind of like the beta, for iOS, and
also on Android. And then our two main Discord servers
for the International Biohacking Community, the Biohacker Lounge, and then

(44:49):
one for cognitive enhancements like nootropics, supplements, even neural
interfaces. And then finally, just our two most popular
servers for the Transhumanist Council. This is our main server
right here, all right. And if you don't mind, scan this

(45:14):
for any presentation feedback, that would be greatly appreciated. I
think I still have this on the previous one, but
you can just submit it; I'll still receive it. If
you guys wanted to scan that, all right? And yeah, finally, yeah,
if you want to contact me or send me any

(45:35):
connection on LinkedIn, you can scan this one. If you
have any questions or inquiries, you can scan here, or
if you wanna see more of my other social media or anything else,
you can scan here. It'll get you to my Linktree page.

Speaker 1 (45:48):
And that is it. All right, Thank you very much, Josh,
and we will leave the slide up while you answer
the first question, which comes from Daniel Tweed. He wonders
why aren't more people more attentive to privacy and security issues?

Speaker 2 (46:07):
Any thoughts? Oh gosh, there's a lot. Oh gosh.
I think one is just convenience. I mean, I think
people have so much. I mean, we all have so
much going on in our lives. We have events to
go to, we have people to talk with, we've got school,
we've got work. That's so much. The last
thing on most people's minds is privacy. I mean, especially

(46:27):
you know, even if you want to think about privacy
when signing up for a service, even if you want to think about it,
you click the privacy policy and it's like forty
pages of a bunch of legal jargon that's going to
be very hard to understand. I think there's just not
a public discourse around it, really, up until now. There's
been a lot more recent public discourse with things like
the clicking movement, which is more, kind of, you know,

(46:49):
individual autonomy with regard to technology, but that still has a
lot to do with privacy, and also from YouTube and
other sites implementing age verification checks using artificial intelligence and
requiring your identification documents and stuff. I think we're getting
a lot more public discourse on privacy and security. I
think it's just because people have so much going on
in their lives and privacy is like the last thing
they care about, especially with regard to convenience. People
(47:11):
they don't care about. Especially with regard to convenience. People
just want to sign up for a service and use it.

Speaker 1 (47:17):
So that's what I would say, yes, And Daniel follows
up with this comment, Yes, it comes across as overwhelming
and inconvenient, and earlier on Mike Lazine wrote essentially, this
is too much information, so I'm sticking with what works
for me. So I wonder, are there good practices that

(47:41):
are also easy enough to adopt and quick enough to
adopt that even people who have very little time could
improve their privacy incrementally?

Speaker 2 (47:55):
Yeah, I think. First thing, if you have a Google account,
go to the privacy settings. Just go to myaccount
dot google dot com. You can search it. Go to
the privacy settings. Turn off the Google Activity settings, which basically, if you have
an Android phone, or if you have the Google app
on an iPhone and you give it location permission, everywhere
you go, so say you go to Starbucks, you

(48:15):
go to a movie or something, it'll track everywhere you've
been throughout the day. And you can actually see all
of your location history on Google's website, which,
guess what, they most likely, and I can't say definitely,
but they most likely, probably, sell to third
party companies for targeted ads and stuff. So go to
Google's My Activity page, go through all the privacy settings,

(48:35):
turn off all the Web and App Activity and Location History settings, turn all
that off, lock it all down. The same with your phone:
go through your phone's privacy settings, disable anything that's unnecessary,
or disable any unnecessary permissions that apps have. Disable Apple's
advertising ID. You can just go and click reset
advertising ID, and then turn off all the analytics and

(48:57):
diagnostics data collection. So I think the best
places to go are, like, Windows privacy settings, macOS
privacy settings, iOS privacy settings, Android privacy settings, Google privacy settings,
Facebook privacy settings. Anything you use, just go to the
privacy settings and just go through all those. I think

(49:18):
that's pretty easy and kind of gets both the privacy
on hardware and privacy on software, you know, like online
services and stuff kind of taken care of.

Speaker 1 (49:28):
Yes, thank you for that response. Now you had mentioned
YubiKey, and John H also recommends using one of those,
though he says he understands those have been hacked too.
Do you know of any potential ways that a YubiKey
could be hacked and, if so, what the vulnerabilities are?

Speaker 2 (49:53):
Yeah, I don't. I'm kind of looking right now; I
see something about a side-channel vulnerability. It looks like the
only way they can be hacked, just from what I
see right here, is that someone has physical access to
the key, right, so someone physically has the key. It
looks like they might be able to clone it, but also,
something like this, it would have been discovered and given a priority

(50:14):
patch by now. This happened in September of twenty twenty four.
But really the only real way that something could be hacked,
if there's a vulnerability, it's going to be someone has physical
access to the device, which, for most threat models,
the chance of that is basically zero to none. So yeah,
any like online or remote hacks, that's not really possible
unless they break encryption algorithms, which is also very difficult,

(50:39):
especially as all the companies continue to upgrade and adopt
things like post quantum encryption and stronger encryption algorithms. So
there's not really any way they could be hacked unless someone
has physical access to the device, which is already very rare,
and if there's, like, a zero-day vulnerability that's
not patched yet.

Speaker 1 (50:57):
Yes, thank you for that answer. Now, is it also
true for YubiKey, as John H states, that not
all sites support it?

Speaker 2 (51:06):
Yeah, yeah, sadly, not a lot of sites support physical
security keys. Most of the main ones do. Facebook does,
Google does, even like those sites have upgraded like account
protection programs. So if you go on Google and search
Google Advanced Account Protection, you can make it to where
after you enter your username and password, you have to plug in
a YubiKey or other security key to get into your account.

(51:28):
No SMS backup, no email password reset. It's very
strong, and it would require a security key. So yeah,
more sites need to adopt it, but the main ones do,
and a lot of other sites are starting to adopt
physical security keys. Even on some sites, even very niche ones,
you'll see, oh, you can use a security key, which

(51:50):
is really nice. So slowly but surely more sites will
adopt it. Yes, that is a major problem right now.

Speaker 1 (51:57):
Yes, thank you, Josh, and John does point out Proton
Mail and PayPal do support YubiKey, so that is an
advantage of that approach. And also, For Freedom writes that,
imagine you become a cyborg and forget about your fingerprint verifications,

(52:17):
so that could be a concern in the future if
you ever have to replace your limbs in any manner,
make sure you still either have a way to unlock
something that requires fingerprint verification or change that to not
require it. Now, I have an issue where my fingerprints

(52:40):
apparently cannot be read very well by certain devices. And
I've had this before, where fingerprint scanners would just consistently
register a low quality fingerprint in quotations, and I was
told this would be an issue that I would be
faced with my entire life. But the last time I

(53:00):
tried a fingerprint scanner was around two thousand and eight,
so I wonder if you're aware of this: whether fingerprint
scanners have improved in the intervening years to recognize more
fingerprints so that the entire population potentially could.

Speaker 2 (53:20):
use them? Yeah, I mean, I would say they've definitely
gotten better over the years. A lot of phones are
now doing fingerprints under like the screen, so you can
tap on the screen. Not iPhones, but most of the
Android phones are. So they're definitely getting better, much much
better than like the old school fingerprint scanners that you
hook up to, that you physically plug into your computer.

(53:42):
Much better. I think the iPhone 17, like the newest model,
is also going to come with a fingerprint scanner, like,
on the side right here, like how it used to
be. Right, no, not how it used to be, but
like how, on Android phones, you can click
here and it scans your fingerprint, like on the
Pixel Fold you can click here. The iPhone 17, I
think, will support that, which is pretty nice. So yes, they

(54:04):
have been getting better.

Speaker 1 (54:07):
Yes, thank you, Josh. And also, For Freedom writes, yes,
they like when things are secured with physical devices.

Speaker 2 (54:17):
Yeah so cool.

Speaker 1 (54:19):
Yes, yes. And along these lines, John H points out,
if he were a super hacker, the password file would
be a juicy target. So his passwords stay in an
old fashioned paper book, and convenient for him means inconvenient
for hackers. So unless a hacker breaks into his house
and finds the specific notebook, they won't get the passwords.

(54:41):
On the other hand, a notebook can be lost as well.

Speaker 2 (54:44):
Exactly. I think it's also a bit inconvenient too, right;
there's a lot of security with that, but it can also
be very inconvenient. If you have a lot of sites
you log into, you don't get that nice autofill,
you don't get to just log in on your phone
and just, you know, pull up your passwords. But also
it's very secure, right. If someone wants to get
access to those passwords, they have to physically first find out
where you live, come into your home. So if you
(55:05):
where you live, come into your home. So if you
have like a home security system, you have you know,
a hard and down door and staff, they have to
get into your house. Then they have to get through
you write self defense, if you have a knife, if
you have other tools, they've got to get through you.
Then they have to access they have to find where
it is, right and then if it's in a safe,
they've got to get through this safe. Right, There's so

(55:26):
many layers of physical security, which is kind of the
best you could really get for passwords. It's just old fashioned.
Each short in a notebook, you put it in a safe.
So yeah, that's that's great.

Speaker 1 (55:38):
I like that. Yes, thank you, Josh. Now I would
like to discuss NFC and RFID implants a bit more.

Speaker 2 (55:48):
So.

Speaker 1 (55:49):
Here's a video on how transhumanists prevented the ban on microchip implants in Nevada in twenty nineteen. There was an effort by the US Transhumanist Party to achieve revisions to a bill that was proposed in twenty nineteen, which was originally quite troubling because it would have banned even voluntary microchip implants.

(56:13):
And what motivated this bill was that one of the assembly members read this article about the company that you mentioned, Josh, called Three Square Market, and Three Square Market was the company that asked its employees to voluntarily have these implants,

(56:40):
and this became a headline news story, even though the
employees didn't actually object and they had a choice not
to get implanted. And this company was in the business
of making these implants for voluntary insertion into their customers.

(57:00):
So this assembly member decided this was a huge intrusion of privacy, and this was also a risk for employees, because essentially he was concerned that employers would either coerce

(57:22):
or influence their workers to get these implants for tracking purposes. However, we brought in a cyborg magician, Anastasia Synn, to testify before the Nevada legislature, and this bill, Assembly Bill two

(57:43):
twenty six, would have banned even voluntary microchip implants that people would have used for their own personal, let's say, needs, like what Zoltan Istvan got when he campaigned for president in twenty sixteen, an RFID chip

(58:04):
between his thumb and index finger. I don't have one,
but this is where it would be located, which enabled
him to unlock his phone or open doors if he
placed his hand directly adjacent to the lock. And what
the legislators needed to be educated on was that these

(58:25):
implants require really close proximity to a device in order
to be able to interact with it. So it's not
the case that somebody could remotely track a person with
an RFID implant. Somebody would literally have to be like
this close in order to be able to track the individual.

(58:48):
And I wonder, Josh, you mentioned the risk that employers could use this tracking technology, or other entities, maybe governments, could use tracking technology in the future. Wouldn't it have to be a very different technological architecture in order to enable remote tracking? And couldn't, like, a mobile phone

(59:14):
be a much more effective tracking device if somebody installs an app on the phone and that app then relays data to the entity that is doing the tracking?

Speaker 2 (59:25):
Yeah, your phone, your phone is the best tracking device. I mean, whether it's a government adversary or a private company like Google, your phone is the best tracking device, right? You have your SIM card, which allows something called cell tower triangulation, which is basically your phone trying to say, hey, I want to get the best signal, let me keep pinging several cell towers, and then, you know, in that

(59:47):
case you're providing your IMEI, your IMSI, your subscriber identity number, and then your carrier. That's one good way for it to be a tracking device for a government. And then for private companies, all your apps, right? Each app has different privacy permissions that your phone's operating system has granted to it. You see, Google has the My Activity feature,

(01:00:10):
right, where it shows you, you know, everywhere you've been. That's the best tracking device there is, right? It has all the sensors, it has everything. So yeah, if you had some type of subdermal implant or epidermal implant, transdermal implant, neural interface, it would require all those different types of sensors and technologies. Right,

(01:00:30):
we already have them all on your phone now. Eventually, I think, you know, you've got the smart watches and stuff, you've got the smart glasses that are coming out, the smart contact lenses. I think over time the phone will start to kind of fade away; the touch phone will start to fade away. I think eventually

(01:00:51):
those devices would become more of the surveillance devices, if you got them from the wrong companies. Right? So if Apple released some type of epidermal interface that sits on your arm and collects the data from your biomarkers and stuff like that, then yeah, it could turn into, oh, subdermal implants are surveillance devices. Right? But right now,

(01:01:14):
with very simple, very close range RFID
and NFC technology, they're not a concern. Also, I do have a video that I want to share showing how close it has to be. So yeah, I will share it with you right now. Guys, as you see, this is how I use two factor authentication on websites with my subdermal electronics system. So first, I go ahead and open my

(01:01:36):
app, scan just like this, go ahead and click one time passwords, go ahead and log in, and then go ahead and enter my one time password, and boom, I am in, with the one time password from the subdermal implant. There you go. So that's basically

(01:01:59):
like what these are. Let me share a screen. That's basically what these are right here, these subdermal implants. That's what this one is right here, this type of implant. I saw a comment by Jason, something about, like, police and some type of... I think it's more of a

(01:02:19):
medical privacy question, if you're in the future, that one.
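To make the demonstration above more concrete: one-time passwords like the ones mentioned in the demo are typically time-based codes (TOTP), where the website and the token, whether a phone app, a hardware key, or an NFC implant, share a secret and derive the same six-digit code from the current time. Here is a minimal, generic sketch of that calculation, assuming the standard TOTP scheme; the secret below is the RFC test value, not anything tied to the implant shown.

```typescript
// Minimal TOTP (RFC 6238) sketch in Node.js/TypeScript.
import { createHmac } from "node:crypto";

function totp(secretHex: string, timeStepSeconds = 30, digits = 6): string {
  // Counter = number of 30-second intervals since the Unix epoch.
  const counter = Math.floor(Date.now() / 1000 / timeStepSeconds);

  // Encode the counter as an 8-byte big-endian buffer.
  const counterBuf = Buffer.alloc(8);
  counterBuf.writeBigUInt64BE(BigInt(counter));

  // HMAC-SHA1 of the counter using the shared secret.
  const hmac = createHmac("sha1", Buffer.from(secretHex, "hex"))
    .update(counterBuf)
    .digest();

  // Dynamic truncation (RFC 4226): read 4 bytes at an offset taken from the last byte.
  const offset = hmac[hmac.length - 1] & 0x0f;
  const binCode =
    ((hmac[offset] & 0x7f) << 24) |
    (hmac[offset + 1] << 16) |
    (hmac[offset + 2] << 8) |
    hmac[offset + 3];

  // Reduce to the requested number of digits and left-pad with zeros.
  return (binCode % 10 ** digits).toString().padStart(digits, "0");
}

// Both the website and the token compute the same code for the current time window.
console.log(totp("3132333435363738393031323334353637383930"));
```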

Speaker 1 (01:02:25):
Yes. So, if you have it set up to where your phone unlocks with a fingerprint, do you think cops can physically force your thumb onto your phone to unlock it if you haven't been charged with a crime yet? Now, I do know that there needs to be probable cause
I do know that there needs to be probable cause
for a search, and even in cases where there's probable cause,

(01:02:48):
the law limits certain prerogatives. Like, if a policeman pulls you over in Nevada, let's say you've been driving erratically, you do not have to consent to a breathalyzer test.
And the Nevada Supreme Court actually struck down the idea
of implied consent as unconstitutional. So if you get pulled

(01:03:11):
over and you don't take a breathalyzer test, you could
still get convicted for reckless driving, or if they have
a different way to find that you've been under the influence,
you could still be convicted of driving under the influence,
but you cannot be convicted of violating this implied consent notion

(01:03:35):
just because you refused to take the test. So
there have been other cases, say, where the FBI tried to get Apple to essentially unlock the security systems on Apple
phones to install a backdoor as a surveillance mechanism, and

(01:03:57):
Apple refused. Apple actually held out and did not allow
the FBI to do that, to Apple's credit. So very
interesting considerations though, and this is where policy advocacy I
think is extremely important, because through policy advocacy we can

(01:04:18):
shape the legal landscape in terms of what police are
or are not allowed to do, especially for people who
haven't been charged with a crime. For people who have
been charged with a crime, a judge can issue a warrant under the Fourth Amendment. The way that it is, the right to privacy is not absolute, in the sense

(01:04:38):
that if there is a criminal case ongoing, yes, the police can search your items, your home, your computers, whatever may provide relevant evidence. But you have to be on
trial for a crime. And I think that's the way
it should remain.

Speaker 2 (01:04:56):
So there is probable cause, and if there's probable cause, then there's biometrics. If your phone has, like, a fingerprint reader or Face ID, then they can make you unlock that device. If you do not have biometrics enabled and there is probable cause, you would need a warrant to get into a device with a pin code.

(01:05:17):
They could ask someone, hey, what's the pin code to the device, but that person can invoke their right to remain silent, and therefore they would not be able to get the data off the device, since they don't have the pin code, unless they got a judge-signed court warrant to search specifically including that device and its respective passcode,

(01:05:39):
where then the person would have to give up the passcode. So it's actually more private, with respect to if you are in that type of situation where there is potential probable cause for a police officer or other government official to

(01:06:00):
try to request information off your phone, to actually not use biometrics, because then they would have to actually get that warrant instead of just making you unlock it with biometrics, which they can do with probable cause.

Speaker 1 (01:06:15):
Yes, very interesting. Now, Jennifer Hughes writes that she thinks phones will fade with the development of telehaptics. And I've been wondering about this, because it seems to me, as new devices have been introduced, to some extent mobile phones have consolidated certain old devices, like traditional phones and MP3

(01:06:41):
players or CD players. Even before them, I'm old
enough to have had portable CD players where you could
put a CD player in a little handbag and you
could put like twenty CDs in there as well and
alternate among the CDs, and that was your entertainment on

(01:07:04):
a long trip. But of course with first devices like
the iPod and other MP three players, and then with
mobile phones, that became obsolete. However, the desktop PC has
not become obsolete because it is a much better tool
for productivity than mobile devices that just don't have an

(01:07:25):
interface for typing something out very quickly. And even, say, for graphic design or any sort of complex manipulation of images or symbols, having a lot of real estate on your screen, having a mouse, is still not beaten by any sort of voice commands, any sort of shortcuts on

(01:07:45):
smaller devices. So I wonder. Some futurists predict that in
the future computers will be embedded into objects, into the
broader environment. And surely there could be some of that,
But would that make dedicated, purpose specific devices go away necessarily?

(01:08:06):
Because I don't see anything replacing a PC anytime soon for, let's say, deep focused productivity. But what do you think, Josh?

Speaker 2 (01:08:13):
Sure, yeah. So I think, with regard to a computer, I think it's a very good input machine. Right, you've got the mouse, you've got the keyboard. They are very good, simple and tactile input machines. I think if that were to be replaced, it would be through improvements in user interfaces. Right,

(01:08:33):
we already have, you know, the folding phone, right? You've got the phone where you've got one screen and it'll fold up into two, and you've already got your tablet. You're already replacing the iPad, which is kind of designed for, you know, on-the-go productivity. So I think as technology improves over time, as user interfaces improve over time,

(01:08:55):
it could make the computer obsolete, especially as, you know, you can do all sorts of things on a smart watch, you can do all sorts of things even now with the Apple Vision Pro headset or the Meta Quest Pro headset, or even smart contacts, or even smart glasses. They have glasses that can,

(01:09:16):
like, transcribe people speaking other languages in real time, which is kind of cool, and it shows up on a little screen, a little HUD, in your glasses. So I think over time those types of standardized devices, like the computer, probably will go away. They're not going

(01:09:36):
to be there forever. I think, you know, if time runs, you know, thousands of years into the future, I think most likely it's going to be embedded in some type of neural or biological interface combination. You're already kind of seeing that with the new Apple Watch: you can basically pinch your fingers and it can kind

(01:09:57):
of control some things on the Apple Watch. Same with the Apple Vision Pro. You can use this tap, use this, like, swipe. So those types of novel input methods, just using your hand, can control things on the Apple Vision Pro, which I think in some situations can be much more intuitive than a mouse, right, especially when we're doing things in, like, a three-D space.

(01:10:19):
Instead of having to hold a VR controller or something, or trying to do CAD work with a mouse, you just have your hands, right? You have your hand, you have maybe, like, a sensor attached on your wall, and that provides much more fine-grained control. So I think over time that could definitely replace standard interface devices like a computer. I think devices that are, as someone said, kind of in, like,

(01:10:41):
the public space, I think those would provide a significant privacy concern first, a lot of privacy vulnerability, especially if there are multiple people using the same device. In certain ways the future could occur, right, if there's much more of a shared future where multiple people are using the same input device,

(01:11:03):
like you go to a public library, right, and several people are using the computers at the public library, there's a significant privacy concern. Right, that's why it says, you know, never log in, or make sure to log out of this account, if you're using a public computer. So I think, yeah, over time standardized computing systems like the computer will fade away with improvements to user

(01:11:24):
interfaces and input technologies, like, you know, your finger and your hand and stuff. And I think public, environmental computing systems, I think those will provide or impose a greater privacy and security risk.

Speaker 1 (01:11:42):
Yes, And I think any sort of system where it's
possible for one's data to be commingled with that of
others will pose a greater privacy and security risk. Now,
Daniel Tweed had a related comment. Not backing up crucial
data is another widely neglected security matter. And I bring

(01:12:04):
this up because as technologies advance and one generation of
technologies transitions into another, sometimes it may be difficult to
extract data from the older format of technology, like libraries
for decades have backed up documents in what is called microfiche.

(01:12:26):
I don't know how much you know about microfiche, Josh,
but back in my day, so they had these special
readers where you insert this film into the readers and
then you would be able to look at an old
article kind of through a screen. It almost resembled an

(01:12:46):
overhead projector in terms of how it appeared, except it
wasn't necessarily projected onto a bigger screen.

Speaker 2 (01:12:56):
I think, yeah, it's like you put the paper in and then it has a little light and it shows it, yes.

Speaker 1 (01:13:02):
Right. So this was an old technology when I was
introduced to it during the let's say, beginnings of the
proliferation of personal computers into households, and these libraries that
created a lot of these microfiche documents then had the

(01:13:23):
challenge of transferring them onto computer formats that people could
actually read. So the concern would be if computers were
to become obsolete, at least PCs as they are now,
then somehow we would have to transfer those files. There
would have to be format compatibility, and the new devices

(01:13:46):
would have to have the ability to read them, and
you would have to be able to recall them, Like
you wouldn't have a mouse, you wouldn't have the file
hierarchy that exists on a PC. So somehow you'd have
to be able to navigate those maybe through your smart glasses.
You would see a similar kind of menu and you

(01:14:09):
would select, and there would be gestures for selecting in
order to bring up those files. So very interesting. But yes,
in whatever format you can, back up your data, because chances are, if it's confined to one device, it is

(01:14:29):
not going to be preserved over a sufficiently long timeframe.

Speaker 2 (01:14:34):
Sure. Not only that, it could, I mean, it could be considered a privacy vulnerability. I will say, if you back up data, there is more of a security risk because your data is in more places. But mainly it's an issue with regard to data availability, right, making sure you're able to access your own data. Yeah, I recommend backing up. If you have a server, make sure not only

(01:14:56):
the file system itself is backed up, but all the individual programs. Like, if you have a program that has a JSON file, right, for all your settings, make sure that's backed up too, so you have multiple layers of backups. For backups, I recommend what's like the one-two-three backup system. You have one on-site backup, on two different types of mediums.

(01:15:19):
So, right, maybe you use macOS Time Machine, and then you have a local NAS, a network attached storage system, with your other Timeshift backup, which is basically like the Time Machine backup system on macOS. And then the three stands for having a third, off-site backup system. Right, so you have

(01:15:39):
maybe a linked Synology NAS, where maybe your friend has a thing called a Synology NAS, which is basically a fancy network attached storage system, where you can send your Timeshift backups to them and they can hold it, and they can't see the data, they can't access it, because it's encrypted and only you have the

(01:16:00):
private key. So yeah, I recommend backing up your data.
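As a concrete illustration of the local portion of the backup approach described here, below is a minimal Node.js/TypeScript sketch that mirrors one folder to two destinations, a local drive and a NAS mount. The paths are assumptions for illustration only; the off-site, encrypted copy would normally be handled by a dedicated backup tool (Time Machine, Timeshift, or the NAS's own sync software) rather than a plain copy like this.

```typescript
// Sketch: copy a source folder to a local backup drive and a NAS mount,
// each into a date-stamped folder. Paths are illustrative assumptions.
import { cpSync, mkdirSync } from "node:fs";
import { join } from "node:path";

const source = "/home/user/documents";       // data to protect (assumed path)
const targets = [
  "/mnt/backup-drive",                        // local drive, a second medium
  "/mnt/synology-nas/backups",                // network attached storage on the LAN
];

const stamp = new Date().toISOString().slice(0, 10); // e.g. 2025-08-24

for (const target of targets) {
  const destination = join(target, `documents-${stamp}`);
  mkdirSync(destination, { recursive: true });
  cpSync(source, destination, { recursive: true }); // full copy; real tools do incremental snapshots
  console.log(`Backed up ${source} -> ${destination}`);
}
```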

Speaker 1 (01:16:02):
Yes, thank you, Josh. And we have a number of other questions from the YouTube audience, but first let's go to Art Ramon for his questions.

Speaker 3 (01:16:16):
Yeah, I've always heard that maybe some of these portable devices would be powered by six G. I haven't heard anything lately about this. And also, gesture-based computing that started.

Speaker 2 (01:16:31):
Like a decade ago it died.

Speaker 4 (01:16:34):
I remember sort of suggesting using sign language interfacing,
and now it seems like some of the sign language
programs I've used have much better recognition of sign language input.
But I really haven't seen development of anyone trying to
use that as an input to any other interface, or

(01:16:55):
say, like, the Apple Vision Pro, which is already discontinued. But any thoughts on that?

Speaker 2 (01:17:02):
Yeah. So, the Apple Vision Pro, I think the main reason it's discontinued is because it was like three thousand dollars, which for most people is absurdly expensive, especially considering that you can get, like, the Oculus 3, I think, for like five hundred dollars. The rumor, too, is they're planning to maybe make a cheaper version of it that still has all the same features at a

(01:17:24):
much more affordable price point. But I also think right now
we're kind of in that intersection where, you know, you have your fancy computers and stuff, and now we're starting to see AI pendant devices, we're starting to see smart glasses, we're starting to see smart contacts, you know, in very early development, and we're starting to see, you know, brain computer interfaces. Right,

(01:17:44):
these are all taking different types of input interfaces. And you know, you've got the smart watch, you've got the smart rings, you've got all these different devices. So I think right now we're kind of at the intersection of transitioning from that, you know, keyboard and mouse type system, or, you know, tap and

(01:18:06):
swipe type system, to more of the gesture-based systems that we're starting to see on things like, you know, the Vision Pro or even some of those AI pendants. With some of those pendants, you can still do this same type of gesture. I think it's not the Rabbit R1; there's another type of pendant that has, like, a little camera, like a little laser camera, and you can still do some of those gestures to control it,

(01:18:28):
like to swipe through information on it. It's like a little laser system that kind of points at your hand, and you can kind of swipe up and see information. So we're not there yet. I do think over, like, the next ten years gesture-based systems will become more mainstream, if not gesture-based systems with your hands, then more like systems with, like, neural thought. I think there are some

(01:18:52):
devices that are starting to use that type of technology, mainly just things like the brain computer interfaces. You can just think and you can, like, switch songs on Spotify or talk to ChatGPT. But that's all developer stuff right now; these have a public SDK. So I think right now we're kind of in the building phase with new types of input mechanisms

(01:19:14):
compared to, like, you know, the keyboard and mouse or the tap and swipe systems, you know, seen on phones and computers.
I think six G, yeah, six G. I don't know too much about six G. I mean, it's the sixth generation, you know, I know that, but, I believe, yes, okay, sure, yeah, okay. Yeah,

(01:19:38):
so I would assume it has, like, the wireless air charging thing. I've seen a few demos of it. You have like a little base station, and then you can have your phone sitting anywhere and it receives very small bursts of power. Yeah, I mean,
I think it depends on how the data is transmitted, right? If the data is,

(01:19:58):
you know, encrypted in transit, then there would be, well, it's already, I mean, just using a phone now, using five G, you're already sending data over, you know, cellular signals if you're connected to a cell tower. I mean, I wouldn't really see it being too much different besides it charging your devices. You're already sending the data over

(01:20:20):
that cellular medium. So unless it's, like, going to support a mesh system of some type, which wouldn't really make sense anyway, because you'd use Bluetooth Low Energy.
I don't see any other additional security concerns, aside from maybe some type of man-in-the-middle software, like if the

(01:20:43):
government or some type of private company, I don't know who's really building that infrastructure for six G, if they're, you know, adding some type of data collection beacon, you know, like middleman transmission software that's able to catch some of that metadata. Maybe they can't catch some of the normal data, like web browsing data or anything like that, but

(01:21:06):
any, like, metadata, if there's something kind of in the middle collecting that, besides the cell tower, that could provide additional privacy concerns, if that is part of the infrastructure, or if that is part of the foundation for the six G infrastructure. But yeah, I don't see any additional privacy or security concerns, because we're already transmitting data

(01:21:29):
using cellular, and I would say you could even argue it's more secure, you know, compared to something like five G or four G, with which, using a thing called cell-site simulators, CSS devices can actually downgrade those signals into less secure, unencrypted two G signals, and then the data sent between

(01:21:52):
those can also be easily intercepted. Those are common at things like protests, or in places like New York, the NYPD, they have some devices like that, also called stingrays. But yeah, I don't see any major risks, since we're already sending data over the cellular networks.

Speaker 4 (01:22:12):
So yeah, power is definitely a limitation right now. I mean, the Apple Vision Pro has a huge battery pack that you have to wear. I've seen on TV some people who have to wear a court-monitored ankle bracelet, and the judge gets on them for not charging the device. Yeah, battery power is definitely an issue.

(01:22:38):
Oh yeah, if there was any way to get some sort of wireless system to charge devices, I mean, that would be great.

Speaker 2 (01:22:47):
Yeah, I agree. I mean, there are already protections you can do with, like, cables. Right? So if you go to an airport, most people tell you, you know, don't use the USB plugs with the chargers in the airport, because, you know, when you plug a charging cable into a phone, not only can you charge it, you can also send and receive data through it.

Speaker 1 (01:23:06):
Right.

Speaker 2 (01:23:06):
That's why if, you know, you have like a MacBook and like an old school iPhone, and you're going to send music from your Mac to your iPhone, or you're going to upgrade or downgrade your iPhone, you connect it to your MacBook. And on Android, I know, at least on GrapheneOS, you can actually make it so only power can be transmitted through that phone's USB

(01:23:28):
Type-C port. So maybe there could be some extra protections added if there is that wireless power transmission. So I think, yeah, you could make it, I don't know, I mean, maybe you can make it like a separate channel, a separate encrypted channel for data, and then make the power channel, the channel

(01:23:49):
that the power is received and transmitted on, extra secure. I mean, I don't really see any other big vulnerabilities. Maybe you can make it two different types of channels and then make the power one much more secure, just because, you know, it allows it to have no data transmission along with the power. But I don't really see any major vulnerabilities, thank you.

Speaker 1 (01:24:14):
Yeah, of course, yes, very interesting discussion. And ShallMeReadMe Thirteen writes that six G has a certain hertz range. And there's also a comment by Mike Lausine: right now, though, a lot of places are still stuck with four G, especially in places that are in rural areas and not

(01:24:35):
in super cities like New York City. Well, it can be worse. So I have five G generally in my area, but when I go up to Lake Tahoe, I have visited there intermittently for the past fifteen years, and they still haven't been able to get any sort of reasonable

(01:24:57):
mobile internet connection on the California side of Lake Tahoe.
You go there and often there's not five G, four
G or three G. There is what is called LTE,
which is so weak that it's barely usable at all,
and sometimes there's just no signal, even though it's a wealthy area. They could easily have placed some cell towers there.

(01:25:22):
I am not sure what the problem is with getting
actual infrastructure built in many parts of the country, even
where there are resources and where there's demand. But infamously,
the Biden administration had this initiative for a broadband internet

(01:25:43):
coverage that was to be made universal across the United States,
and they spent billions of dollars. Nothing got built. And this even, let's say, devolved to the extent that a lot of Democrats are upset about it. Ezra Klein and Derek Thompson made a point of this in the recent book Abundance as to the convoluted bureaucratic process that

(01:26:08):
was put in place, where it would have taken years just to vet any application by a state to get the federal funding to build that broadband infrastructure. So a few states ultimately made it through the hoops. But then the administrations changed and the initiative got scrapped, as far as

(01:26:29):
I'm aware, which is very unfortunate, because if it had worked,
if it had achieved its intended purpose, maybe there would
be more internet coverage in those remote areas.

Speaker 2 (01:26:41):
And as for LTE, I believe that stands for long term evolution. That is four G, but it's like an enhanced version of four G, so it's actually better than, like, three G or normal four G, which is nice. Yeah, I do agree that not all places have access to the fastest and best cellular tower networks. And also, five G

(01:27:05):
is significantly more secure than four G, too, and four G LTE. So I do agree that is a significant problem. And I mean, I think it's just because most of it's just private infrastructure, right, and people want to create infrastructure where they're going to receive the most amount of money or where there's the most usage. So yeah,

(01:27:25):
that is a problem for sure. And yeah, long term evolution.

Speaker 1 (01:27:28):
Yeah, yes. And Art Ramon has several comments. He writes that the smartphone is worthless with an LTE connection; so even though it's an enhanced version of four G, for some reason today's smartphones don't take well to it, and it's hard to even load a page with it, but maybe texting is good, Art Ramon writes. He also notes

(01:27:52):
that people in some of these wealthy areas don't want
the towers or they want to make them look like
ugly trees. There's this obsession with all-natural appearances,
and it's very anachronistic as well, because some of these
people don't want any trees to be thinned to reduce
the wildfire risk, and for decades they enforced a prohibition

(01:28:16):
on people picking up dead leaves or pine needles from
their property until the folly of that was ultimately recognized.
Because their idea of the natural is just whatever is
untouched by man. They don't realize even the Native Americans
within out the trees in that area, and before the

(01:28:36):
twentieth century there were older trees, to be sure, but
these old growth trees were spaced out a bit more
to reduce the likelihood of a conflagration. So I think
this mindset, especially among wealthy homeowners in these areas, that
only the quote natural is good, needs to be challenged,

(01:29:00):
and it needs to be challenged with regard to five G; there are a lot of myths and misrepresentations about five G as well.

Speaker 2 (01:29:08):
On the LTE thing, where someone was saying, like, the smartphone is, like, worthless with LTE, I think that's also not just LTE's fault. I think it's also that websites are jamming in so much more JavaScript. They're jamming in, like, fifty different analytics trackers, you know, Facebook trackers, Google Analytics trackers, Pinterest trackers and all, you know, to see, okay,

(01:29:32):
how are my ads doing?

Speaker 4 (01:29:33):
Right?

Speaker 2 (01:29:33):
They have, you know, Google ad trackers, right, to place the ads on the websites, and they also have, like, the Meta Pixel, the Pinterest pixel, right. Those are basically saying, okay, how are my ads performing on my website? So they have, like, you know, fifty different trackers on the websites. They're loading it, stuffing it with tons of JavaScript, most of it unnecessary. And yeah, that's also going to

(01:29:54):
increase the amount of data that you have to pull, not only from the original website itself, but also the subprocessors and, like, you know, third party websites like Google Analytics, Pinterest, Facebook, Meta, Google. So even if you're using, like, the most private and secure browser setup or whatever, and you visit, you know, a website and it's connected to Google

(01:30:15):
Analytics, you know, even if you don't have a Google account and you go to a non-Google website, it's still going to connect to Google if it's using Google Analytics or Google Ads or something. But I think that definitely increases the amount of data you have to download from a website. So I think that's another concern.
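To illustrate the mechanism being described, here is a small, generic TypeScript sketch of what a typical third-party analytics snippet does when a page loads. The tracker URL is a placeholder, not any specific vendor's real endpoint.

```typescript
// Sketch of a generic third-party analytics snippet: it injects a script from the
// tracker's domain, so the visitor's browser contacts that domain on every page view,
// whether or not the visitor has an account with that company.
function injectTracker(trackerUrl: string): void {
  const script = document.createElement("script");
  script.async = true;
  script.src = trackerUrl; // placeholder URL standing in for a real tag endpoint
  document.head.appendChild(script);
  // Each injected tag typically also fires follow-up requests carrying page and device metadata.
}

injectTracker("https://analytics.example.com/tag.js");
```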

Speaker 1 (01:30:33):
Yes, And it's interesting, Josh. So I have had now
over three decades of experience of turning on computers and
getting them to start, and we've had massive improvements in
computer processing power since the nineteen nineties. Computers do not
start faster now than they did in the nineteen nineties.

(01:30:55):
I can tell you that from personal experience. So why is that? I hypothesize it's because of all of these additional programs that have been introduced, running behind the scenes, relaying information. Some of it, frankly, is bloatware. And I'm sure that on websites a lot of it is bloatware as well, because those sites that have still endured

(01:31:19):
since the nineteen nineties or early two thousands, they were
pretty light. I remember you could just create an HTML page,
maybe include a JPEG image file that doesn't take up
that much memory, and the page would run very well.
And even today it would run very well. So if
you had an Internet or a version of the Internet

(01:31:41):
comprised of those pages, no pop ups, no animations, just
text and pictures, perhaps it would run a lot more smoothly,
and it would run on LTE as well.

Speaker 2 (01:31:52):
Hundred percent. Yeah, it would probably run on two G or three G. Yeah, with the computer starting up, it's because, yeah, with Windows, and I do not recommend using Windows, all sorts of advertising spyware beacons are installed by default on Windows. Even on, like, Windows, I think they install Candy Crush, right, like, you know, the little mobile phone game that you used to play, you

(01:32:14):
know, when you were a kid. I think they have, like, Candy Crush on Windows. You know, if you have that starting up, and all this other, you know, analytics and tracking, and even things to, like, validate, hey, is your product key for Windows a valid product key, it's got to receive and send encrypted data to Windows key servers. So yeah, and especially, like,

(01:32:37):
even on Macs, right, you've got Apple iCloud, you've got Apple. If you're in, like, a beta, you're forced to use Apple's diagnostics system, all sorts of things, all sorts of bloatware, all sorts of spyware. You know, part of it is the company wants to make sure, okay, are there any crashes on the device, is the software version running well for the devices?

(01:32:58):
But also a lot of it's data collection, right? I mean, why is Windows not really cracking down on people, you know, using the different key servers? Like, I think they publicly said that people can use different key servers, aka getting free versions of Windows, and they're not cracking down on it. They've also given people free upgrades to Windows ten. You know why? Because

(01:33:22):
with Windows, you're the product, basically, right? If you're going to a website, Windows is going to collect some information about that, right? They want to know what websites have you been to, which ones, why did you go to it, where did you go, you know, what's your location. So yeah, they collect a lot of that data. You can go through all their

(01:33:43):
privacy policies and stuff. They collect all the data, they sell it to third party companies, because they make money off of personalized and targeted ads. So yeah, it's definitely because of additional bloatware and stuff. You could start up, like, a Windows seven system, right, it'd probably be faster than, like, a Windows ten system. Windows eleven, I think, has improved a lot of, like, the performance enhancements, but it still has all sorts of Microsoft spyware and bloatware installed. I think

(01:34:08):
phone operating systems are pretty good. Those boot up pretty fast; it's not like a lot of trackers load up on, like, the bootloader itself. But, you know, still, especially if you have, like, not a Pixel-like operating system or not, like, an iOS operating system, but, like, a Samsung One UI or Huawei's operating system,

(01:34:28):
like any of those third party operating systems on Android, those load, like, a boatload of malware, or not malware, excuse me, spyware and other tracking, like, scripts and stuff when you boot them up.

Speaker 1 (01:34:41):
So yeah. Yes, indeed. And John H. writes that the boot programs are huge compared to the old DOS systems, and I remember DOS as well, and they kind of neutralize the enhanced speed effects of faster RAM and CPUs. Actually, I am a Windows seven holdout, and I think

(01:35:02):
you've made a good case for me to remain one, though I don't think I could port it onto any new PC. But honestly, in terms of the key functionalities that I am looking for, it seems Windows seven is as good as Windows ten or eleven, if not better. It has certain programs that Windows has deprecated in later versions.

(01:35:27):
And also, it just doesn't have all of the pop-ups that a Windows ten or eleven system throws at you at the start. I do not like pop-ups. Just as I want to be in control of my data, I want to be in control of my actions on the computer. Now, it's okay to have a portion of

(01:35:49):
the screen that's dedicated to alerts that makes suggestions, like, you should really take a look at this, but it should not interfere with your workflow.

Speaker 2 (01:35:58):
Sure, so, yeah. Here's also what I'd like you to know: if you're interested in Linux, there's a lot of great benefits of using Linux over using Windows seven, including with regard to compatibility. If you're using Windows seven, basically all the software will work on Linux, either through just native support, or using things like WINE,

(01:36:19):
which stands for Wine Is Not an Emulator, or using Crossover, which is like a proprietary version. So yeah, I mean, if you have any questions about Linux, just let me know. I recommend that over Windows seven, obviously, because there is still tracking in Windows seven, but Windows ten, Windows eleven, I do not recommend at all. Even macOS, you

(01:36:39):
can fix macOS a little bit by going into the privacy settings and stuff, using, like, a hardware firewall which kind of stops a bunch of those built-in trackers from pinging and sending data to Apple servers. But I do not recommend Windows ten or eleven. Yeah. And also, with the websites, I will say

(01:37:01):
one good thing I've been seeing is a lot of websites and a lot of programs have been wanting to use a thing called Markdown, which is like a text formatting language. Things like Obsidian, like other types of note taking systems, have been using that. So I have been seeing a lot more usage of some of those more open and minimal protocols, which is

(01:37:23):
kind of cool.

Speaker 1 (01:37:25):
Yes, very interesting, thank you. And your observations on Linux are duly noted. So we have some comments, first of all from Daniel Tweet. He mentions the Dead Media Project that was initially proposed by science fiction writer Bruce Sterling in nineteen ninety five as a compilation of obsolete and forgotten communications. So we need to make sure that there

(01:37:49):
is an initiative to preserve information in all of those
obsolete formats that are not easy to read anymore, and
to maintain continuity of that information. Jennifer Hughes writes she thinks that some of the smart jewelry, and being able to project more and interact with telehaptics, will lead actual

(01:38:12):
phones to fade. And you talked about smart jewelry as well.
And in regard to foldable computers being developed now, she
notes that Jacques Fresco of the Venus Project talked about
that a long time ago. Now, Jacques Fresco lived to

(01:38:33):
be one hundred and two, and he was active throughout
much of the twentieth century. But yes, he was a
very forward thinking futurist. So I was curious about the
brain computer interface that you have, Josh. It looks like
a non invasive BCI that you can put on. What

(01:38:56):
are some of its functions?

Speaker 2 (01:38:59):
Sure. So it uses a public, open source JavaScript SDK, which stands for Software Development Kit, so it can basically take all the brain data from all the different channels on here, and you can basically make things with it. There's a few cool things.
So there's one where it can do, like, a thing where it'll auto-play different types of Spotify music based

(01:39:22):
on, like, your brain waves, if you're trying to get into a deep focus when you're working. There's, like, a dedicated program that they developed that'll do that. But also, I think the biggest thing is just the public JavaScript SDK, where you can build anything. Right, you get the data from the neural interface, and you can do whatever you want with it. Right, it's free game. They're not locking down the data. It

(01:39:44):
uses JavaScript; whatever you can do with JavaScript, you can do. So I think that's, like, the biggest thing. For example, I did something in North Dakota where I could literally just think of certain specific things, like, you know, eating an apple or something, or drinking water, right, and I can map those to different characters, like, you know, on a keyboard

(01:40:06):
or something, and it would actually type it into ChatGPT and then click send without me having touched anything. I can just think; it'll type it by itself, it'll send it, and then I can get that response. That can be seen as, like, a very early cybernetic information system for future interfaces, where you don't have to do all this fancy

(01:40:28):
JavaScript stuff, and, you know, it'd be kind of built in by default using, like, a JavaScript applet or something. So I think the open Software Development Kit, I think that's, like, the biggest thing about it.
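As a rough illustration of the kind of mapping described here, below is a hedged TypeScript sketch that turns classified "thoughts" into characters and accumulates them into a message. The event shape, the labels, and the confidence threshold are all hypothetical stand-ins; the actual JavaScript SDK for the headset has its own names and data format, so this only shows the general idea.

```typescript
// Hypothetical sketch: map classified mental commands to characters and build a message.
type Prediction = { label: string; confidence: number };

// Mapping from trained mental commands to characters (illustrative only).
const thoughtToKey: Record<string, string> = {
  "imagine-eating-an-apple": "a",
  "imagine-drinking-water": "b",
  "imagine-clenching-fist": "\n", // treat this one as "send"
};

let message = "";

function handlePrediction(p: Prediction): void {
  if (p.confidence < 0.8) return; // ignore uncertain classifications
  const key = thoughtToKey[p.label];
  if (!key) return;
  if (key === "\n") {
    console.log("Sending to chat:", message); // here one would call the chat application
    message = "";
  } else {
    message += key;
  }
}

// Simulated stream of predictions standing in for the real SDK subscription.
const demo: Prediction[] = [
  { label: "imagine-eating-an-apple", confidence: 0.92 },
  { label: "imagine-drinking-water", confidence: 0.95 },
  { label: "imagine-clenching-fist", confidence: 0.9 },
];
demo.forEach(handlePrediction);
```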

Speaker 1 (01:40:42):
So, speaking of software development kits, Daniel Tweet has this comment: USTP is the Software Development Kit for free futures. Thank you, Daniel. But now I'm curious, Josh, when do you think this capability of essentially communicating through a brain computer interface will become available? Is it a matter of

(01:41:05):
a few years, or more like a few decades?

Speaker 2 (01:41:08):
It depends on what we're defining as available, right? I mean, it's technically available now, right? You've got to do it in a lot of hacky ways, a lot of coding and a lot of programming and stuff, using several different programs and systems working together. It really depends on how we're defining it being available. If we were to say someone was able to walk into a store and purchase some

(01:41:29):
type of system for, say, five hundred bucks, with the current rate of, like, AI, and, you know, taking into account the current rate of agentic AI and its implications on manufacturing economics and stuff, oh gosh, it's twenty twenty five, I would probably say maybe twenty twenty eight,

(01:41:52):
twenty twenty nine, maybe something like that could be possible, whether you're buying it online, you know, at that price point, or something would be built in. I would say maybe twenty twenty eight, twenty twenty nine could be feasible. It's really difficult to kind of predict that, though.

Speaker 1 (01:42:11):
That's quite soon. And imagine a scenario like this. It may be a bit fanciful, but some entrepreneurs, including Elon Musk, have talked about essentially making humanoid robots available for humans to purchase. Zoltan Istvan has even made a campaign

(01:42:31):
promise to give every household in California a robot to
help do the chores. But what if in the year
twenty twenty nine, you're out on a walk. You have
your humanoid robot with you. You have a brain computer interface,
and you think a particular thought, and the robot has
an AI maybe a large language model inside it to

(01:42:54):
help it to communicate. You think a thought and the
robot interprets it and speaks it using a robotic voice.
So you're not actually talking. The robot is talking, but
you're directing the robot to speak with your mind.

Speaker 2 (01:43:10):
That's kind of cool. I mean, that's kind of like what we already see with, like, you know, digital agentic AI, and that's basically what that is: physical agentic AI with, like, a cybernetic communications medium to communicate with the robot. I mean, we see how fast digital agentic AI is progressing. I mean, people can already make a website just by typing in natural language.

(01:43:33):
You can do thousands, tens of thousands of lines of code, you know, in seconds, just by typing a few words into something like lovable dot dev or Cursor, I think, or bolt dot new. You can just go to those websites, type in a few sentences, and it'll make a full website. Something that used to take tens

(01:43:54):
of thousands of dollars and, you know, tens or maybe twenty programmers, that might have taken months, and you can do it, you know, even on, like, a foldable. You can even just lie in your bed, use a foldable, and type it in and make a website that works, back end, front end, everything. Right? So we're already seeing how fast

(01:44:14):
this technology is progressing. And now we're starting to get into the physical realm of agentic AI, with humanoid robots being developed by Tesla and other manufacturers. So given how fast digital agentic AI is progressing, think about how fast the physical AI will progress once those humanoid robotics get to an affordable price point. Especially since we've

(01:44:38):
already seen, like, kind of, quote unquote, the price collapse of website development. And you can do it for, you know, five dollars and make, like, a fully functional, not just a website, like a landing page or something, but a full web application, right, where you have user accounts, you can have, like, a directory list of stuff and people liking and engaging and having posts and stuff, for like five dollars,

(01:45:02):
and, you know, be done within hours. I mean, we've already seen that, you know, on, like, the news, right, software developers getting laid off and turning to things like, you know, blue collar work as a more viable alternative, because of the price collapse of that specific skill of programming and web development.

Speaker 1 (01:45:24):
Yes, and it is quite remarkable to observe what has been happening. You yourself, Josh, have created a number of
websites remarkably quickly, and I do wonder sometimes what tools
you've used to help you in that, because it doesn't

(01:45:44):
seem to be the same kind of traditional web development approach.
Of course, I remember the days of HTML based web pages,
where either one would use a very rudimentary kind of
web editor. Microsoft used to have a product called FrontPage,
which was fairly intuitive, but I actually taught myself a

(01:46:06):
little bit of HTML code to go a bit beyond that.
And then there were these open source at least hopefully
open source content engines like WordPress, and WordPress really is
the best one of those. But you still have to
create each page, one page at a time, so I

(01:46:28):
think you probably use some tools to expedite website creation
beyond what a typical WordPress user would do.

Speaker 2 (01:46:37):
Sure, sure, sure, So I would say, yeah, I mean
WordPress is open source.

Speaker 1 (01:46:41):
So there.

Speaker 2 (01:46:41):
Oh, there's two things. There's WordPress dot com and then there's WordPress dot org. Both of those use the same underlying open source CMS, which stands for Content Management System. WordPress dot com is like the hosted slash managed version, like the enterprise version. Yeah, WordPress is open source. Also, with regard to WordPress, I would say look into Ghost,

(01:47:03):
Ghost CMS, G-H-O-S-T CMS. That's actually like a more privacy-preserving and lightweight version of WordPress, and also much more security-focused too, because WordPress has a lot of security vulnerabilities, especially if you start installing third party extensions and plug-ins. So yeah, look into Ghost CMS, especially if it's for, like, a blogging website; it is

(01:47:25):
actually really good. It's new, it's getting tons of updates. It recently just got an update called Ghost six point zero, and that actually allows the blog posts that you make on Ghost to be sent to a thing called the Fediverse, which is kind of like the decentralized areas of communication. So if you've heard

(01:47:45):
of Mastodon, it runs on the same protocol as Mastodon. So, say you publish a post on Ghost CMS on your blog, you can go on Mastodon and see that post published, and it shows, you know, your little blog like a profile and stuff. People can follow it and stuff. So I would actually look into Ghost CMS too as an alternative to WordPress. For websites, I use Framer. So for websites,

(01:48:09):
any of, like, the public websites, I don't use AI to, like, code it or, you know, make it and stuff. I manually do it. But, you know, I'm still interfacing with a CMS, a content management system. So it's like a drag and drop website builder, right? You get all the site elements on the left, you get the different types of elements. You can add images, you can add all sorts of things. You can use preset themes and

(01:48:29):
kind of mix and match different themes. So it's kind of an intermediary between actually, you know, having Visual Studio Code in front of you and typing out all the things and having, like, a local, you know, Apache web server and posting it to that and refreshing it every five minutes, and then something like Lovable dot dev or bolt dot new, which is a thing

(01:48:50):
called vibe coding software, which, you know, is just where you type in natural language and it creates a website for you. So yeah, Framer and Ghost CMS.

Speaker 1 (01:49:01):
Yes, thank you. That is very helpful. Now, Daniel Tweet
has an interesting comment that I would like to highlight.
He writes: to my mind, a main benefit of enhancing one's personal privacy is a kind of distancing effect from random crazies that might do you violence. And this is
an important point. I think in most cases, for most people,

(01:49:22):
if you disclose something personal about yourself, they're going to
be fairly understanding, or even if they don't approve, they're
not going to do anything. But there are these occasional
random crazies, as Daniel points out, who really could get

(01:49:43):
triggered by this information for whatever reason. They may be
online trolls, they may have mental issues, whatever it may be.
But without privacy, without individuals having that ability to decide
stun their risk tolerance, what do they want to disclose,
what do they want to keep to themselves, people are

(01:50:05):
vulnerable to these kinds of attacks, and I don't see
any other way besides stronger privacy protection to really defend
against that, But what are your thoughts?

Speaker 2 (01:50:16):
Yeah, one hundred percent agree, one hundred percent agree. I mean, you always see it with, like, YouTubers, right? If a YouTuber has an Alexa or a Google Home, which is now called Nest Home, and I don't recommend getting those devices, they are very privacy-invasive, but, you know, you've seen people using a thing called text to speech, which is where someone donates and they type in text and it's read out loud on the

(01:50:38):
streamer's computer, and, you know, say that Alexa picks it up and accidentally leaks their home address. You've seen it countless times, or, you know, people call raids on them, or people will show up in front of their house and record and do, like, you know, weird stuff. So yeah, one hundred percent, especially if you want to protect yourself against crazies or people, you know,

(01:51:00):
outside of that standard deviation of common sense, just craziness. Yeah, definitely stronger privacy, knowing who to trust with your data, who to trust with certain pieces of information, I think that is important. Your best defense is a preventative defense.

Speaker 1 (01:51:21):
Yes, indeed. And as an example, Daniel notes that Mark David Chapman's hatred for John Lennon simmered since his nineteen sixty six "more popular than Jesus" comment, and John Lennon was murdered in nineteen eighty. And Mike Lusine points out
that swatting is another tactic that is used because of

(01:51:45):
insufficient privacy safeguards. Like, this has been done by online trolls: if they don't like some commentator, whatever that person's political orientation, they might pretend that there's, like, a hostage incident at their house and call the police, and then the police show up with SWAT teams. I think there

(01:52:06):
needs to be a public policy response to this. There
needs to be some way of not going in with
full force at first in those kinds of situations, because
a SWAT raid, irrespective of whether it's a false alarm,
could be deadly because of standard operating procedures. So I
think policy needs to adjust to protect people against these

(01:52:29):
kinds of hoaxes, very deadly hoaxes. But anyway, we are
coming to the end of our virtual Enlightenment Salon, and
I would like to invite you, Josh, to offer any
concluding remarks in the minute that we have left any
other points that you would like to convey to viewers
of our salon.

Speaker 2 (01:52:50):
Yeah, I think with respect to the swatting public policy, maybe an efficient idea is for local police departments to keep a record of specific streamers and personalities, people that might have a higher likelihood of being targeted for swatting attacks, and then, if there is an attack called in on them, to check, okay, are they streaming? If

(01:53:11):
they are, see if there's anything weird going on, and maybe just wait outside or something until the end of the stream. If they're not, then do whatever the normal operating procedure is. And the other thing is, please, please be a proponent of privacy. If you're following, like, the Clippy movement, set your profile picture to Clippy, which is, like, the old Microsoft Office assistant. I mean, privacy is, like, you know,

(01:53:35):
privacy is like a muscle. If you don't use it, if you don't protect it, if you don't defend it, if you don't advocate for it, you're going to lose it. And it's being attacked every day, especially nowadays with all these regulations and legislations getting passed. If you have any other questions, go ahead and scan right here, send me a message on LinkedIn, send me a connection request on LinkedIn,

(01:53:55):
use my contact form, send me any questions you have, and I'll answer them. And then, if you do want to join our Discord server for transhumanism, you can scan right here. I added a new privacy and security channel, which you can access by going into the roles channel and then clicking Privacy and Security, so we can kind of, you know, do our best in our effort to approach

(01:54:19):
this new technology and technological enhancement from a privacy-preserving standpoint.

Speaker 1 (01:54:25):
So yeah, thank you. Yes, thank you very much, Josh. And yes, Clippy is well known to many of us. It's interesting that he is getting a second lease on life; his longevity is being prolonged by this privacy movement, and privacy is important to our longevity going forward. So please

(01:54:48):
take what Josh has said seriously and look for ways to enhance your own privacy, and hopefully, as a result, we will all live long and prosper.