Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Uh, welcome to It Could Happen Here, a podcast about things falling apart, how to deal with that, and hopefully how to take care of yourself and your people. Today we have a returning guest, Karl Kasarda from InRange TV. Now, Karl, every time you and I have chatted on a show together, it has been about firearms, which is obviously your passion
(00:28):
and specialty, well, one of your specialties. But today we're not talking about guns at all. I mean, maybe here and there, but today we're talking about the thing that has been your career for most of your working life, fair to say. You want to walk through your background here? Because we're going to be talking about information security and sort of the future of threats that are going to be
(00:50):
coming at us throughout the next few years of our lives. Obviously, this year in particular there have been a bunch of stories about, like, Russian attacks on digital infrastructure and vice versa. And that's pretty much been something on everybody's back burner since we got the Internet, usually by way of questionable films with Sandra Bullock.
(01:11):
I think that was The Net, right? The Net, The Net, yes, exactly. Where they somehow hacked a car in nineteen ninety eight or something. You've got to do that when you're flying through cyberspace with your VR helmet on and your gloves, right? Yeah. But yeah, you want to walk everyone through what your actual background is in this industry first? Yeah, totally. So if
(01:31):
anyone watches InRange, or has watched it for a long time, you'll see this reflected in some of my content, because I do deal with some of this intermittently on the channel, and it's definitely influenced how I approach my work there with social media and all that. So, way back when, I was one of those kids in the hacker space. I grew up trying to make computers and technology do what they weren't designed to do, and learning to make them do things
(01:52):
they shouldn't have done, for my own interests or others around me. Not in any really negative way, just a deep curiosity about how this stuff works, and being part of the early online community. We're talking pre-Internet, where you had an acoustic coupler modem and you would dial in. Like WarGames. Yeah, literally set your handset into the coupler. I was on boards like that way back when. We never should have
(02:14):
gone past those days. Doing things wirelessly was such a mistake. Like, I'm so pissed off that when I sit down to research, I'm not jacking into a gigantic box. It makes me livid. Shadowrun promised me that I was going to be using one hand to shoot at the approaching corporate security guards and the other hand on my keyboard
(02:36):
that I wear around my neck, that I plug into the wall to hack buildings. Well, hey, maybe someday we'll have neurological implants or wetware implants brought to us by Monsanto. The DRM will just get shut off in our own rooms, right. From their mouth to God's ears, Karl. Absolutely, who doesn't want that? Who doesn't want my neural tissue tied directly to a corporation? Fuck yes. But anyway, so I grew up
(03:00):
in that space, and back then it naturally turned into a career. It wasn't like now. Nowadays you pretty much have to go get a bunch of certificates and a college degree to even start looking at an infosec career. But back then, if you kind of had, like, skillz with a Z at the end, you could get a job. I ended up doing help desk at this one company, and they noticed
(03:21):
that that's where my interests were, and I ended up becoming their information security architect over a couple of years. That turned into a multiple-decade career, pretty much culminating in working at a Tier 1 Internet backbone provider doing subsea fiber optic routing, networking, DDoS mitigation, and botnet control, search and destroy. So it really turned into a wide career, not only
(03:43):
backbone Internet, like when I started off, but encryption, firewalls, application layer controls across the board for multiple corporations. So it was a weird and interesting space. I don't really do that much anymore, except on the side, but I've had a pretty exciting career with it. So I think probably a good place to start, just in general, because folks are always interested in this. What do
(04:05):
you recommend? People ask, what should I be doing to protect myself as I force my head under the constant stream of sewer water that is social media these days? Well, yeah, you know, the simplest thing. Everything in infosec is always controversial; any recommendation you make, someone's gonna be like, but otherwise, or anyways, or there's a better solution,
(04:28):
and there always is a better solution. But the realistic thing is, when you talk to the average person, the average person isn't gonna sit there and hack a Linux box to have a better social media experience. That's not realistic. So the best thing anyone can do, the simplest best thing, is to get one of the trusted password managers. There's a number of them out there. I'm not going to recommend an individual one right now, because any one I
(04:50):
recommend, someone's gonna go, but there's another one. But there are a few of them out there. Having a password manager, and having a unique, difficult, complex password for every account you log into on the Internet, is the number one thing you can do as an individual to protect your interests. Because if you're logging in with the same password, monkey, to Facebook, Twitter, and your bank account,
(05:12):
that is a disaster waiting to happen. So the first thing you can do: a password manager, with passwords you yourself can't remember. I let the password manager generate, like, twenty-four-character-long alphanumeric crypto nonsense. You put a gun to my head and say, what's the password to your bank? I don't know. I can't give it. I have no idea. So that right there is the first thing any basic individual can do
(05:33):
to protect themselves on the Internet. That, uh, is totally sensible. Um, I'm not great with password managers, but I never know what my passwords are, and they're all different, and so my life is this constant stream of needing to figure out what my password was, failing, and resetting it. But it does mean that I change passwords regularly. Right.
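Carl's twenty-four-character crypto nonsense is easy to produce programmatically. A minimal sketch in Python, using the standard library's `secrets` module (the exact symbol set here is an illustrative assumption; a real password manager generates and stores these for you):

```python
import secrets
import string

# Letters, digits, and a few symbols. The symbol set is illustrative.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def generate_password(length: int = 24) -> str:
    """Generate a random password using the OS's secure random source."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())  # different every run, nothing you could memorize
```

The point is that `secrets`, unlike the `random` module, is cryptographically secure, and the result is nothing a human could remember, which is exactly the idea.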
But what's so great about password managers is you can have
(05:55):
passwords that no human could ever remember, and you can have unique ones per website. Every website you log into can be unique. And by having them in this database that's properly encrypted with a key phrase, or even two-factor, at that point it means you can literally just cut and paste your passwords into things you don't yourself know. And depending on your privacy levels, you can do that locally, with local solutions,
(06:17):
with files, like, on your own machine. But frankly, a couple of the cloud-based solutions, as much as the cloud freaks people out, are the better option, because it will work on your phone, it'll work on your laptop, it'll work on everything, everywhere. That makes total sense, um,
I think another good thing to get into while we're on this subject, since we just started talking about passwords, and obviously it is important to keep those secure.
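On that "database properly encrypted with a key phrase" point: a vault's encryption key is typically derived from the master passphrase with a deliberately slow key-derivation function. A rough sketch of the idea using Python's standard library (the function name and iteration count are illustrative; real managers use tuned parameters, often Argon2 rather than PBKDF2):

```python
import hashlib
import os

def derive_vault_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Derive a 32-byte key from a master passphrase via PBKDF2-HMAC-SHA256.

    The high iteration count makes brute-forcing the passphrase expensive;
    the random per-vault salt defeats precomputed lookup tables.
    """
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)  # stored in the clear alongside the encrypted vault
key = derive_vault_key("correct horse battery staple", salt)
print(len(key))  # 32
```

The derived key, never the passphrase itself, is what actually encrypts the password database, which is why a strong master passphrase is the one secret you do have to remember.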
(06:40):
I think one thing folks, especially people who are activists, who may foresee or have engaged in things that are legally questionable, don't think about enough is social media networking. By which I mean, it is possible to find your other social media by, like, knowing,
(07:00):
you know, like, having the same name on Twitter and on Instagram and stuff. Having social media that can be tracked across accounts. Most people would be surprised at how easy it is to do that. And at Bellingcat, a huge amount of the tracking, tracking Nazis, even, like, a ton of the work, which I did not do, but my colleagues did, to dox Russian
(07:23):
secret service agents and stuff, was like, oh, we found them at, you know, somebody's, their boss's wedding. They're tagged in this thing on VK, and from that we were able to find their account on this other site, and from that, now we have this map of everywhere they've been for the last three weeks, and we can build this social map of their entire life. Yeah, no.
(07:46):
By just literally existing in modern space, you're constantly leaking some form of metadata. You are always leaking metadata, and the more of yourself you allow to exist in the world, the more that's the case. So you've also got to think about what the threat is and what the risk is, right. There's the risk of the individual having a parasocial relationship with the Internet, like I do as a content creator. That's one thing.
(08:07):
There's always someone that wants to delve into your private life. But that's a very different risk than a nation state actor, right. Those are two different things. And when it comes to a nation state actor, quite honestly, unless you're real good, and I've been doing it for a long time, the individual, bluntly, is kind of fucked. As a general rule, your best security as an individual in that situation is
(08:29):
the anonymity of the crowd. But also, most people who are threatened by the state in that situation are not being threatened by the federal government. They may be attending protests and not want the Louisville Police to put together that they're in an affinity group with certain people. And something you can do for that is make sure that, if you have a
(08:51):
personal account that's under your name, with your friends, that account shouldn't be liking and sharing things from, like, a political account that you have, or from the account of a group that you're a part of, or something like that. Just try to look at your digital footprint from the outside and think: is it possible to connect me to people I don't want to be publicly connected to through this? And the
(09:13):
minute you've breached that connection once, it's gone forever. Right, this is forever. Yes. This is the same thing as with phones. Like, someone will have their regular phone, which, by the way, all these smartphones are just surveillance devices in our pockets, right. Let's say you go get a burner, because you don't want to be connected to the device that you normally use, on a level that's one step above the regular individual level. If you ever have those two devices emanating
(09:35):
at the same time, they're now connected, in a way that, let's say, the authorities can associate them together. Because of triangulation, and seeing a burner phone and your phone coming from the same house, you've breached all the privacy you would have had from your burner phone, for example. Now, Karl, do you have much to say on the subject of, because I know one thing I have seen people do, people who are, you know, having conversations that they're concerned about,
(09:57):
is put phones in Faraday bags. And I've heard things about how reliable Faraday bags and such actually are for stopping signals. Do you have much to say on that matter? My experience with that is that not all bags that you can just buy off the Internet are made equally. So what you want to do is test it, and you can only test it to a certain degree. But the really simple tests are: you put it in the bag and you try to dial the darn thing, or
(10:19):
use any WiFi connections to it. And that's a simple test. Now, is it as good as not having the thing on you at all? Of course not. Leaving it somewhere else is always the best answer. But, in my opinion, a properly built Faraday box or cage or bag that you've put some testing into is a pretty reliable solution. And, you know, there
(10:40):
is a problem that you might encounter, or that I have heard people talk about. It's like, well, in order to have kind of a private conversation, we drove to a specific location, left our phones off in the car, and then went on a walk. And the problem with that is that now you have both just driven to a location with those phones, and those phones are associated with each other. Right, right. Well, so
(11:01):
first of all, you've got to think of a world where all of this metadata is being collected at all times. So these phones and their associations in physical proximity to one another are stored somewhere at all times, whether or not it's going to be accessed by the powers that be when they want it to be. It's all there: my phone next to your phone, next to that guy's phone. Those associations all exist. They're all talking to the same
(11:23):
cell phone towers in the same area, giving them not only GPS coordinates but triangulation data, which, by the way, if you go way back to the hacker Kevin Mitnick, that stuff was going on back then; before they had GPS, they used triangulation data to get him. So that stuff is all still happening, and those associations occur. In regards to saying, I turned my phone off: how do you know that's off? With most of these modern phones, what does
(11:46):
off mean? And yeah, okay, pull the battery, maybe. But even then, I would not trust any of these devices in regards to them quote-unquote being off, especially things like phones that have non-removable batteries. Off is more like sleep. It is, right. Yeah, I mean, I think one of the worst things that's happened for
(12:08):
personal security is the end of the phone where you can remove the battery. Being unable to actually cut power to it without, you know, disassembling it is a real issue. One could argue that there's a much, much more insidious reason they did that, or one could argue that it was just a matter of design and comfort. It's hard to say. It
(12:28):
doesn't really matter if it was insidious or not, really. Kind of a por que no los dos situation, right? Totally. So, well, now that we're talking about phones, here's another thing that's been near and dear, and I think you've seen some posts from me about this. Everybody really likes the convenience of things like biometrics: thumb authentication, fingerprint ID, facial identification. And here's the reality of that. We know this already, and this exists in
(12:50):
legal space already. But the reality is that you can be coerced to provide biometric data against your will. So if your phone is authenticated with a fingerprint ID or your face ID, they can pretty much say, you must give us your thumb to unlock this phone. Or, for that matter, they could frankly hold the phone in front of your face, in certain circumstances even against your will, and it will unlock the device. And
(13:12):
that is considered not a violation of your rights. Whereas, for example, if you had a long, strong password on the phone, they cannot coerce you to give that up, because that would be a violation of your rights under the Fifth Amendment, which is interesting. Um, so. But at the same time, one could also argue that in certain circumstances where there are a lot of cameras, not necessarily watching everything you do, you could also consider
(13:34):
that passphrases could be dangerous, like, say, in an airport, because all those cameras could see you punching in your passcode. So it's a matter of if, when, and where, right. What's the right solution at which time? But I would say that if you were going to be in a place that was contentious, it is almost always better to make sure you do not allow for any biometric authentication on the device. Yes. I never, like,
(13:56):
never turn it on. Ideally you have never even turned on facial recognition on your phone, even if you later deactivated it. I used to be in tech journalism, right, so obviously I'm not an expert on any of this. But, like, the worst thing in terms of my personal comfort with devices was when they were
(14:18):
like, everything's gonna read faces and fingerprints now. I don't love that, but, you know, it's inevitable, right. And in the past I did use a fingerprint unlock, earlier in my life, and I do not have any devices that unlock that way anymore. But you do miss the convenience, right. You miss it when you need to get to your
(14:39):
phone quickly and you can't do it. But I don't even let my phone have just, like, a four-character password anymore. It's eight characters for me. It's a little bit of a pain in the ass, but it comes with fewer risks. And one of the things that's challenging is every individual has to look at what their threat profile is, right. So, for example, a soccer mom driving her kids
(15:01):
to school and stuff, she might frankly be really well off with biometric authentication on her phone, because if she didn't use that, maybe she wouldn't even use a proper four-character passphrase. And if she's not concerned about being at a protest, for example, and having some authoritarian take her phone away from her and authenticate to it, maybe she doesn't need to worry about that.
But for a lot of us, in the world we live in, that's a different risk profile. Right, we've got
(15:23):
to think about what our risks are as individuals and what makes sense. So if your passphrase is going to be one-two-three-four, versus using a thumbprint ID, most people would be better off with the thumbprint ID. But for someone like myself? No, it's not a good idea. Yeah.
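The difference between a weak passcode and eight mixed characters is easy to put numbers on. A quick back-of-the-envelope comparison in Python (this assumes uniformly random secrets, so it's an upper bound; human-chosen codes are weaker):

```python
import math

def keyspace_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a uniformly random code of the given length."""
    return length * math.log2(alphabet_size)

pin_bits = keyspace_bits(10, 4)   # four digits: 10**4 possible codes
pw_bits = keyspace_bits(62, 8)    # eight upper/lower/digit characters: 62**8
print(round(pin_bits, 1), round(pw_bits, 1))  # 13.3 47.6
```

Roughly thirteen bits versus nearly forty-eight: about twenty billion times as many possibilities for an attacker to grind through.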
And I think that kind of brings us to, uh, probably the last part of this, which is, do you
(15:45):
have specific advice on, like, VPNs? Obviously I recommend everybody use Signal, just for messages in general, but especially for stuff that needs to be secure. Number one, first rule of any kind of this sort of security: don't ever put anything on your phone that's legally questionable, if you can avoid it,
(16:09):
like, conversationally, right. Do not send it over a phone if it's something you would not be able to survive having read to you in a courtroom. Yeah. So, for the audience, a lot of the audience may not know what Signal even is, right. So Signal is a text messaging alternative. For example, on your phone you've got regular texts, or if you've got an iPhone, you've got iMessage. Signal is an
(16:31):
end-to-end encrypted solution that you install as an app. And because it's end-to-end encryption, it means that what passes over the wire is, in theory, not decryptable by the parties that are passing the data packets in the middle. So there's no man-in-the-middle decryption, right. For example, iMessage is theoretically encrypted end to end, but Apple ultimately has the cryptographic keys. So, well,
(16:53):
they might say one thing, but there is nothing really preventing them from being a man in the middle and being able to read the message in transit from A to B. But if the keys are stored on your device, which are then protected with your passphrase or whatever your authentication mechanism is, and those keys are not archived or kept by some hierarchical man-in-the-middle authority, if it's
(17:14):
done right, which Signal does pretty well, it means that your data in transit is probably not decryptable, and that's why Signal is a good solution, and it's a good one for the average person. Install the app. It works just like text messaging, but you can have a pretty good level of confidence that the data you're passing is not being decrypted or caught in transmission or
(17:36):
in the path. So I would say, get Signal.
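The end-to-end property described here can be illustrated with a toy one-time pad: if only the two endpoints hold the key, a relay in the middle carries bytes it cannot read. This is purely a sketch of the concept; Signal's real protocol (the Double Ratchet, with fresh keys per message) is far more sophisticated:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """Toy one-time pad: XOR data with an equal-length secret key.

    XOR is its own inverse, so the same function encrypts and decrypts.
    """
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))      # held only by the two endpoints
on_the_wire = otp_xor(message, key)          # all the middle relay ever sees
assert otp_xor(on_the_wire, key) == message  # the far endpoint recovers it
```

The trust-model contrast in the conversation maps directly onto this sketch: with iMessage, the middle party could hold a copy of the key; with Signal, it does not.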
It's your best bet, right. Like, and again, as I said, you know, you don't want to ever say
(17:56):
anything over a phone that is something that can get you in trouble. But also, life is life, and that's not always realistic for people in certain situations. So again, Signal is your best bet. Nothing is perfect. And again, if you're putting it on your phone, there's a number of things that could go wrong every single time you do that. But that's one of the better
(18:18):
things that you could do. And then, of course, we talk about VPNs. Yeah. So, VPNs. I'm just gonna go with the basics, because I don't necessarily know the level of knowledge of the people listening. A VPN is a virtual private network. You connect to this virtual private network, and it passes your data through an encrypted tunnel to an exit point
(18:38):
somewhere else on the Internet, in theory masking the source and origin of your requests. So, for example, let's say you were looking up something on the Internet that you didn't necessarily want people to know you're looking up. Yeah, like, let's say you're researching the truth about the assassination of President John F. Kennedy by Bernard Montgomery Sanders. And
(18:59):
you know that the NSA is looking for truth seekers who are finding out the reality of that situation. You don't necessarily want them to know that you have become pilled, right. So if you were to do this from your computer at home, what would happen, for people that don't know how this all works, is you would be coming from an IP address that's associated with the account that you're connecting through,
(19:21):
whether it's Verizon or Comcast or whatever. And you go and search up that truth, and the NSA finds you with a keyword search for JFK and the truth. And because of that keyword search, they go to Comcast or to Verizon and say, hey, we are requesting you tell us who did this search. They will give them essentially a legal request for information,
(19:43):
and then Comcast or Verizon will provide the NSA the IP address and account of the person that did that. What a VPN does is, you connect to the VPN service first, and the connection from your machine to the VPN service is encrypted. Now, does the VPN service know your IP address? Yes. But when you actually type in that information or go to the
(20:04):
Internet to request that data, it actually goes through the VPN's private tunneling network and egresses from somewhere else on the Internet, thus masking your actual IP address and, in theory, your origin. Now, that's not bulletproof, but what it means is that if someone, say the NSA, wanted to know who's
(20:25):
doing this truth search, they would find an IP address that actually came out of, let's say, Joe's VPN service, and they would have to go to Joe's VPN service and go, we noticed this emanated from your network; who did this? At that point, you have to trust Joe's VPN service not to disclose their account information about you.
(20:47):
So what you've done is you've changed the model. We know the telecoms will communicate with the government or whoever if they need to; they always have. You don't necessarily know if Joe's VPN service will. You've changed your trust model from your telecom to your VPN service. So if you're gonna pick a VPN, you have to do a little bit of research to know that it's a trustworthy
(21:08):
resource that won't just give you up at the lightest form of interrogation. Yeah, and again, nothing is perfect, and often, like, we did find out, what was it, last year, that one popular VPN was, like, run by the feds. Yeah, that's not an impossible thing. I know a lot of folks, particularly journalists, use Proton, which is, I think, based
(21:31):
in Switzerland, and you will get given up if the Swiss government is angry at you. Right, you brought up a very good point. Services that exist outside of CONUS, the continental US, mean that they are under a different legal jurisdiction than ones that exist wholly within CONUS. So as a result, if something from
(21:52):
the United States government comes as a request to the Swiss company, there's a much higher chance the Swiss company would be like, we don't really care about your requests. That's worth considering. Also, think about this; this actually works in reverse, and I don't want to get too deep into this, but when you're working at a Tier 1 Internet backbone provider, you should know that sometimes traffic strangely gets pushed offshore and then back
(22:15):
to the United States for analysis that would normally be, let's say, not necessarily constitutionally legal in the United States. So there's a lot of shenanigans going on. Yeah, and again, like, I think Proton is generally a pretty good service. I've had no problems with it. But we should
(22:37):
is no perfect solution. The only perfect method of digital
security is not putting things on the Internet or like
through you know, the mobile networks and stuff like that. Is,
if it stays between you and someone else, Um, that
is your best bet of it not being you know,
intercepted or something. A conversation that you have in the
woods without phones anywhere near you is the most secure
(22:59):
kind of conversation. Let me second Proton. I agree it's a good service. There are others out there; we're not trying to pick one in particular or pick against anyone in particular. There's a bunch that work. Yeah. Another thing that you need to consider in this sort of thing is also what you're dealing with. So, for example, I put up a post a while back, because there was a bunch of stuff going on in Ukraine, with people posting photos that gave up their locations and
(23:21):
had things happen. I mean, that has been happening for a decade in that war, well, almost a decade, as long as it's been going on. And I posted something about it, and one of the recommendations I made on there was a contentious one, but I'm gonna back it up in a minute: I mentioned Tor, the onion relay. So Tor essentially was originally created as a way
(23:42):
to deal with the quote-unquote dark web, and to also relay traffic in a way that masks its origins, very much like a VPN service. So what it is, is there are these onion relay nodes all over the Internet, and when you connect to the onion network, your traffic bounces through three, four, five, six, seven of these nodes; you can sort of dictate what you want, depending on
(24:04):
the client you have. So let's say you connect to an onion router network node in Arizona and then you egress somewhere in France, and you've jumped through six nodes in the process. Well, one of the things that's a well-known fact is that a number of these onion relay routing nodes are owned by nation state actors, whether it's the United States or others. So one
(24:25):
of the things I got taken to task for, and I want to explain this, is people were like, well, that's a compromised network, it isn't useful. Actually, it is, because what you're trying to do may matter. If you're trying to mask the origin of your data source or your upload or your search for a short duration of time, this will still help
(24:45):
you. You jump through six nodes; they've got to trace back through six nodes to figure out the origin of the person connecting to the relay network, and that's assuming there was a compromised node in the process. So, if you're passing data through a compromised node, does that mean the data in transit is safe? No. But is the anonymity of the origin of the poster
(25:08):
safer for a longer duration of time? Yes. So these things get really complex real fast. And this is again one of the best things you can do, because there's no single perfect solution: stacking. So, not just going through Tor, but also Tor into a VPN at the same time.
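The node-hopping works by layered encryption: the client wraps its message once per relay, and each relay can peel off exactly one layer, so no single node sees both who you are and what you sent. A toy sketch, with XOR pads standing in for the real per-hop ciphers Tor negotiates:

```python
import secrets

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Add or remove one 'encryption' layer (toy XOR pad)."""
    return bytes(d ^ k for d, k in zip(data, key))

def onion_wrap(message: bytes, hop_keys: list) -> bytes:
    """Wrap a message in one layer per relay, the exit node's layer innermost."""
    wrapped = message
    for key in reversed(hop_keys):
        wrapped = xor_layer(wrapped, key)
    return wrapped

message = b"fetch example.org"
hop_keys = [secrets.token_bytes(len(message)) for _ in range(3)]

cell = onion_wrap(message, hop_keys)  # what the first relay receives
for key in hop_keys:                  # each relay peels one layer in turn
    cell = xor_layer(cell, key)
assert cell == message                # plaintext appears only past the last hop
```

This is why tracing a Tor user means working backward through every node in the circuit, which is the friction being described here.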
And I think one of the better ways to think about security is kind of the way Sebastian Junger
(25:30):
describes how insurgent war works, which is that it's all about creating friction for anybody trying to spy on your shit. There's no perfect answer, but the more things you can make a pain in the ass, the better your odds that you will not have an issue. That's all you can do: make it potentially more annoying and more difficult for whoever might be looking.
(25:53):
Right. The more friction you can create, broadly speaking, the more secure you're going to be. Absolutely. Now, another thing to think about, and we're getting kind of deep in the weeds here; this is above and beyond the average person, right. The average person: get a password manager, don't use your same password everywhere, don't use biometrics unless you pretty much have to, and move on with your life. But once you're beyond the average person,
(26:14):
this is what we're talking about now. So if you have a computer and you use it as your normal day-to-day operating system, talking to your friends, doing dot dot dot, but then also need to do something else a little more privacy-inclined, you should not trust that device. At that point, your web browser may have all sorts of cookies and metadata and storage in it that, even if you're going
(26:35):
through a VPN, still may be able to reveal your identity, as well as MAC addresses and other stuff. So if you really want to get pretty far into the weeds with this, you have to do something like use an ephemeral operating system install that has no legacy data on it. One example of that, it's Linux-based, is called Tails. You essentially use it like a live USB drive. You boot off of that only,
(26:59):
or you use a machine dedicated for this, and you burn the OS down every time you're done. Because there's no legacy information or data that can be pulled out of your web browser or your cookies or your MAC address information that can be associated with you, regardless of whether you've done everything right to mask your IP address of origin. God, that's the hot girl shit, um,
(27:20):
when you're doing that kind of stuff. And again, I think up through most of this, it's been kind of like people being like, that's too much, and people being like, okay, yep, this is exactly what I already am doing or need to be doing. Probably very few people need to be concerned about that sort of thing, but, um, you know, it
(27:41):
is, I know. Like, again, I worked at Bellingcat. I had a number of colleagues who were, like, personal enemies of the Russian state, who had to do stuff like this. And it's, you know, paranoia, I mean. And here's the thing about going above and beyond: so again, like, if you're a normal person, you probably don't need to be, you know, stacking a VPN, you know,
(28:03):
getting Signal and all this stuff. But also, why not, right? Like, there's no harm in the additional security. It is a little bit frustrating. But here's one of the things I think people don't often think about enough: you're not engaging in that kind of security stuff purely because there's a threat now, but in part because you don't know what the future is going to bring. And one of
(28:25):
the things that I would point out for that is
a lot of people right now have been having, for years, conversations about a thing that may soon legally be murder on a federal level, you know, um, abortion, right?
And so it is possible that overnight an awful lot
of conversations a bunch of people have had legally will
(28:46):
suddenly be very illegal conversations. And then you may be
glad that you took greater care with your personal
security prior to that point. Yeah, I mean, like, so think of... I mean, I'm not a person that menstruates, but a menstruation tracking app is very useful to a lot of people who do. And those tracking apps now, that metadata in there, at some point could be extremely
(29:06):
dangerous or incriminating, excuse me, to
someone who otherwise was doing nothing more than trying to
maintain their natural health. And so that is a really
dangerous concept. So at this point, I mean, within the
United States, I hate to say this, those apps are
probably dangerous to the individual because that data could be
(29:26):
easily used by a government resource to, uh, to do
something bad to someone who's done nothing wrong. So I
think we should move on. I mean, at this point, I think we've covered the bases, the advice you can responsibly give someone in a podcast, and folks should be able to... Let me throw one thing out real quickly. So you mentioned, like, for example, you don't necessarily have the risk
(29:47):
vector that requires using a VPN or Signal. But let me
say this: way back when, gosh, when I was doing crypto work decades ago... You mean cryptography, and not... we should specify these days. Excuse me, cryptography, encryption. Yeah, yeah. I had the opportunity to work with Phil Zimmermann of PGP, and actually PGP, Pretty Good Privacy, which was one
(30:09):
of the fundamental, uh, security projects way back when, was actually written in response to human rights violations. He wrote it
because people were doing research and, like, warlords were getting
their laptops taken away and then finding out who spoke
to them and getting people killed. So PGP was like
this human rights thing right from the beginning. And cryptography
back when I was young and naive, I always thought
(30:29):
to myself, this is what we need. This is the
future: when everyone gets proper crypto, we'll blind the government, we'll blind the corporations. We're gonna have this crypto-anarchist
future where the government and corporations can't get us. And
the reality is most of that got subverted. And
the truth is cryptography is too hard for most people
to use, and as a result, we don't. But here's
what I will say. The more people that do something
(30:51):
simple, like use Signal or use a VPN just to browse the Internet, not because they're doing anything nefarious, just because they're, like, privacy conscious... Yeah, because it normalizes it. And that means that the person that's using
it because they need to for like, let's say, to
protect human rights doesn't stick out like a needle in
the haystack because everybody's already doing something sane in the
(31:14):
first place. Normalizing proper privacy and cryptography is better for everyone. Yes, yes,
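[Editor's note: Karl's normalization point can be made concrete with a toy calculation. The population numbers below are invented; the idea they illustrate is the "anonymity set", meaning how large a crowd a vulnerable user gets to hide in once ordinary people adopt the same tools.]

```python
def anonymity_set(adopters: int) -> int:
    """If 'uses encryption' is the only signal an observer has, the
    suspect list is simply everyone who uses it."""
    return adopters

def suspicion(adopters: int) -> float:
    """Chance that any one flagged user is the person being hunted,
    assuming one vulnerable user hides among the adopters."""
    return 1 / adopters if adopters else 0.0

# Out of 10,000 users, only 5 use Signal or a VPN: the suspect list
# is five names long, and each name draws heavy suspicion.
print(anonymity_set(5), suspicion(5))  # 5 0.2
# 9,000 ordinary people adopt the same tools "just because": the same
# vulnerable user now hides in a crowd of 9,000.
print(anonymity_set(9_000), suspicion(9_000))
```

The numbers are made up, but the shape of the argument is exactly what's said above: every casual adopter shrinks the watcher's signal.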
absolutely agreed. This is a nice segue because you were
just talking about the past and how beautiful and bright
(31:36):
it seemed. Um, let's talk about what you see as
kind of the future of info security threats. Well, I mean,
so there's so many levels to that. First of all,
if we're talking nation state level, I personally strongly believe
that all of the big players have already compromised everyone's network.
Everybody's got everybody: they've got us, China's got us, we've got China.
(31:59):
Anybody right now could go in and pretty much fuck up the grid on someone else like that. And
there's... yeah, and that's not actually the worst; at least that's safer than other possibilities, because there is the level of mutually assured destruction there, where it's like, yeah, man,
Russia could take down the grid, but like that wouldn't
be good for them, and vice versa. You know, Yeah, no, true.
(32:19):
So the reality is, though, everybody's in everybody's network; those days are over. Um, when it comes to the individual... and I'm gonna hedge with the audience here: there might be people in the audience who feel differently, and it still doesn't mean that we don't try. So one of the
things I want to say is you're gonna hear some
skepticism here because I've been doing this career for a
long time and I've seen things go wrong more than right,
(32:39):
and so in that regard, this is gonna sound kind
of cynical. But when it comes to the idea of
individual privacy, in my opinion, with the exception of when
you're taking a very active effort in something very specific
that you want to keep private because that's something you're
working on personally, the reality is individual privacy is dead
and gone, and we're just starting to smell that corpse. Um.
(33:04):
Whether it is credit card data transactions, your cell phone history,
your phone numbers, what you've done on the internet, what
you've done on social media or not done on social media,
whether you have an account on Facebook or not, doesn't
even matter. The metadata and the trail you're leaving behind
you is all aggregated, all of it behind big data corporations,
all of it compromised, all of it searchable. Even stuff
(33:26):
the government has on you has been sold to large corporations.
Because I can tell you that some of the data
that they kept for, like, let's say the DMV or MVD, they decided to sell it off
to a corporation and they themselves access it through a
third party when doing research on you. So all of
that big data, there's a law of physics. The more
you aggregate, the more will get compromised. Um, geez, I'm sorry,
(33:52):
that's the truth. No, no, no, I mean, yeah... there's this frustration, because I can remember the days when the privacy hounds, and I don't say that as a negative term, were, like, warning everybody: hey, you don't want to be aggregating
(34:12):
all of these different social media things together. Hey you
don't want to be using all of these services. Hey
there's actually some, like, real downsides to all of what's happening.
Like part of why things are so cheap on Amazon
is, you know, that your data there is one of the assets that they have. And, um, those
people were absolutely right, and they lost harder than anyone
has ever lost at anything. So, like, when I
(34:35):
was back there at that company doing all that cryptography work,
we were trying to give crypto like to the average
general population of the Internet. I had this, like I said,
this naive view of like the future that was gonna
be this place where we're gonna have the Internet where
everyone was connected, and it was gonna be not only
would we have personal privacy through cryptography, but we would
be able to transfer information to one another in a
way that would make the shenanigans impossible. Well, to some
(34:58):
degree that's been true, we've seen some of that, but
to another degree, we also have Snowden dropping the bomb, the revelations about what the government has done to the individual and how they've broken the law with all of our privacy and data. And what came of that? A man in exile in Russia, and pretty much fucking nothing. Yeah, right, nothing.
(35:19):
And, um, I was sitting at a DEF CON presentation where
General Alexander was on the screen talking about what they
weren't doing while Snowden was dropping revelations proving him to
be lying. And nothing comes of it, right, nothing really
comes of it. And one of the things that's so real is this.
And so whether it's the tribal level, your neighbors across
(35:41):
the street or the internet tribe, we as a people
in the aggregate are always willing to give up our
rights to something bigger for convenience. And we've done that
and it's called Facebook and Twitter and social media and
in the process, what was going to be an amazing
resource has become the trap. Uh, it's such a... it's...
(36:05):
because, you know... you know, Garrison, my friend who is much younger than me, um, has grown up with the Internet being what it is now, right? Like this kind of like nightmare trap, you know, that's sucking us all in, this like giant squid that has us in its tentacles. Um. And it's... I sometimes, like, dissociate talking with them about
(36:27):
certain Internet things because in my heart it's still the
promised land. Yeah, I wish... I guess my... I wish I felt that way. It doesn't feel that way to me anymore, to be honest. I mean, it's not, right? Like, and I mean that in, like, sort of... I have this, I don't know. I've never entirely been able to, like, let go of the vision of, like, oh, it could have been... There's so
(36:48):
many things that could have been. Well, it's like, you know,
it's like all technology, anything can be weaponized, right? Right. Like an AR-15 can be used for good
or for evil. A knife can be used to make
a beautiful meal or to commit a murder. And the
Internet is technology and it has been weaponized. It's been
weaponized against us. But at the same time, if we
just turn a blind eye to it and then not
(37:08):
learn how to use this technology to our advantage, we're
allowing them to do that unabated. And that's where like
the kind of hacker mindset comes from, which is like,
how do I make this thing do what I want it to do for me, while not letting someone else do it for them? And unless we take control of the
technology for ourselves, like I said earlier, normalizing using signal
and even basic VPN and cryptography, then we're just giving
(37:31):
it up. We're not even making it a challenge. We're
just like, here, you go have it. And uh, that's
something that I think is more important as a community, maybe, as people grow up on the Internet, versus seeing it become something, like I saw it become something. Maybe either, A, they'll just accept, which I hope isn't the case, that the reality is privacy is dead, or maybe they'll approach the Internet differently than, say, someone my age did.
(37:54):
Where, frankly, we kind of messed up: we didn't realize that primrose path was actually a trap. And that's, like... that was a mistake, and maybe we can kind of, like, evolve beyond that. But, like you're asking, where
is infosec going now? I don't have good news for that. Like, when I first started working in
the career, it really felt like a great thing. We
were doing important stuff. We were doing the DDoS mitigation.
(38:17):
We were going into hospitals and making sure that insulin
pumps weren't compromised as a DDoS host. Believe it or not, hospitals are infosec nightmares. And we were doing stuff that
felt good. And then later in the career I realized,
wait a minute, I'm not doing anything to secure anybody's
personal information or make the Internet safer. I was just
protecting some corporate coffer. And the reality was that the
(38:40):
private information that we were supposedly protecting... the debate would turn into costs: what's more expensive, losing the data or the lawsuit for losing the data? Literally, those were the conversations in corporations, and those are the conversations that corporations have now about every one of us and our personal information. Now, when you think about
(39:02):
because... so, obviously, I was in a different field. But when I was doing a lot of the research on terrorism that I was doing, I had these things that were, like, sort of: this kind of attack is going to happen. It's something I feel very strongly about. Like drones. There's going
to be like a mass killing of civilians not in
a war zone by a civilian weaponized drone at some
(39:24):
point in the not too distant future. It's going to happen. It's going to be done. It's absolutely an inevitability. Um, that kind of stuff. What are you, when you think about kind of the digital equivalents of that,
Like what are you looking towards? Well, I agree with
you about the drone. Like, you can see... God, yes, you plot the dots and you know what's going to occur, right? It's not
(39:46):
possible to avoid. We unleashed that out of the cage
and it's going to happen. Um. Quite honestly, I think
we're seeing it already. We're seeing the level of privacy invasion that I don't think people even know
has happened. Like I know some of us realize that
we talked about it, we rant about it, but like,
I don't think people realize the level of the incursion
(40:09):
that has occurred to the point where all of this
data aggregated to the point they know what toilet paper
you prefer to buy. Like I'm talking like people like
Facebook knowing that. Um. Or the size of the corporate oligarchy that controls the Internet, whether it's the likes of Alphabet Corp, Facebook, Apple; Microsoft's becoming a smaller player, weirdly.
(40:30):
but when you think about those big names, they kind
of, like, control everything, every piece of data about you and everything you move and say. And I think... what's the end of that? I don't think
we got to the end game of that, but I
don't know how we roll it back. And that's the thing.
So what's the prediction? My prediction is it's gonna get
(40:52):
worse and we're gonna get to the point where there
isn't room to move without that surveillance tracking you and
like, so, for example, you think of things like sci-fi, Minority Report: you walk through the mall and there's facial ID happening everywhere you go, with targeted advertising at the mall. Oh, that's coming. I guarantee that's coming. And
(41:13):
all of that's happening already, and that facial recognition stuff
that's going on is happening currently; we're just not that aware of it happening. The cop cars are driving down
the road, and every license plate is being read with the cameras, being OCR'd, optical character recognition, and that's coming back, and they're tracking every car they drive by
(41:34):
on the highway even though there's not a GPS unit
on your car. The ability to not be tracked will
soon be impossible. How's that? Yeah. I mean, allegedly, when
I was younger, there were, like, certain stupid petty crimes I would commit just because, like, people will not be
(41:56):
able to do this in the future, and I have
a moral responsibility to steal the bulbs from in front
of this bar and throw them in my threads. Like, one day that will be a thing that people can't do without getting caught, and so, like, I just had to. You know, there are, like, some bright spots,
because I think you're absolutely right: on, like, a broader scale, there's no turning back the clock for
(42:16):
stuff like facial recognition and how fucked up it's going to get. There are states, like where I live in Oregon, where, like, they have passed laws that are just like:
public facial recognition is not a thing that is legal
in this state. Um. And I definitely support more attempts
like that, because again, anything you can do to stymie them, to reduce the spread of the grid, to
reduce the profitability of these things, even though it's again
(42:39):
overall a doomed cause. Right. Um, yeah, I don't know.
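[Editor's note: a sketch of the license plate aggregation Carl described a moment ago. The plates, places, and times below are invented. The point is that each individual camera hit is trivial, but grouped by plate, the hits become a per-car location diary, which is the ALPR (automatic license plate reader) concern in a nutshell.]

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical ALPR hits from cruisers and fixed cameras: each record
# is just (plate, place, time), harmless looking on its own.
sightings = [
    ("ABC123", "5th & Main", datetime(2022, 6, 1, 8, 55)),
    ("XYZ789", "Hwy 26 mile 4", datetime(2022, 6, 1, 9, 10)),
    ("ABC123", "clinic garage", datetime(2022, 6, 1, 9, 20)),
    ("ABC123", "5th & Main", datetime(2022, 6, 1, 17, 5)),
]

# Aggregated and sorted, the same records become a location history.
history = defaultdict(list)
for plate, place, when in sightings:
    history[plate].append((when, place))

for plate, visits in sorted(history.items()):
    visits.sort()  # chronological order per plate
    print(plate, "->", [place for _, place in visits])
# ABC123 -> ['5th & Main', 'clinic garage', '5th & Main']
# XYZ789 -> ['Hwy 26 mile 4']
```

No single camera needs to follow anyone; the grouping step is the surveillance.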
I mean I obviously I think that that's a good law,
but I don't know that laws stop corporations when the
corporations have more power than law. Yes, of course. Um.
And it's like... I mean, obviously, you can ban it for police to use and stuff, which does something, to the extent that, you know, they follow the law,
(43:02):
but, um, none of this is... I don't know. Like, that's one of the things that makes me most depressed about the future: the thought that, and this is not like a major issue, I guess, but the space for kids to just, like, fuck around and do dumb shit when they're nineteen is going to get so much smaller. I mean, I would say... I mean,
(43:23):
I think the thing is like, as a natural human being,
whether you're doing anything wrong, even if you're not doing
anything wrong, the need to feel like you have a private space, that's yours or your private community's space... I'm not even talking about wrong or right here. We're just talking about just that feeling that, at this moment, this is my space where I'm not being watched. That is a natural,
(43:45):
healthy need of the human organism. Um. Interesting. Uh, yeah, but no, it's a human need.
And I think we're gonna find those spaces become smaller and smaller. And I think, when you said, what's your prediction,
I hate to say it, but I think the prediction
is it will become impossible to not be tracked. Now,
(44:07):
the bright side of that... maybe, maybe there's a bright side. Maybe at some point,
when that's the reality, it could somehow also affect the
people that are powerful and the people that are small.
And we all realize that humans are humans and therefore
the failings that sometimes we have as all human beings
(44:27):
we just kind of acknowledge and be like, oh, yeah,
of course that's just what people do. Like maybe we
just realize people are people. But the idea that there's
never going to be a space to not get tracked,
I don't know; to me, I find that darkly disturbing. It is disturbing. I do think, kind of to pivot off of what you were saying, the other aspect of that that is more positive is that all of this stuff,
(44:49):
all of this surveillance shit, um, or at least not all,
but quite a bit of it is you know, in
a way, it's like a knife fight. There's no way
that both parties don't get cut, and you know, the
ones wielding the knife might get cut less, but they're
still going to get cut. And part of what that
means in this situation is that the prevalence of all
(45:10):
of these different ways to surveil and track also allows
us to track in the same way that like police
law enforcement watches people through their phones, but also a
hell of a lot of cops are getting filmed doing
fucked up shit right now. Now, again, the balance of
the cuts, I don't think, is going to work
(45:31):
out in our favor, but it's not going to be
nothing on them either. And you're right, I think there are some things that we will learn in the future about the people in power in the world that it wouldn't have been possible for us to learn in the past, or may not be possible even right now. And if we learn that about people in power, then they can't
weaponize it as much against the people that aren't in power. Right. Yeah, yeah,
(45:53):
you know one thing that I'm because I'm thinking a
lot about the fact that a bunch of folks in
the reproductive healthcare industry have pointed out that right wingers
have started using drones, uh, to follow people home from, like, Planned Parenthoods, and followed them to their cars,
to build databases of the people who are going to
places to potentially like do that kind of reproductive healthcare
(46:16):
that these folks don't think should exist. Um. The other
side of it, though, is that, um, it is also
possible to surveil them, um, and it will be possible
to track the people doing that sort of thing, and
it will be possible to do that in terms of
like legal accountability, and it will be possible to do
that for the people who embrace uh questionably legal tactics
(46:39):
for for frustrating those efforts um or illegal tactics for
frustrating those efforts. They have access to the same technology.
Um. And again, it is a knife that
will cut everybody. Um And I guess that's better than
just one person getting cut in this situation. That's the concern I have, right? I agree with that. Like,
(46:59):
that technology, it's a weapon, and it's weaponized in all directions, depending on whether you use it for good or bad. And so this is the same place
I come to when it comes to the gun control argument.
I mean, we can... no, no, no, it's the same problem, right?
because if we allow only one side to have all
of the control and power and understanding of the technology,
(47:22):
then we ourselves are at a huge deficit. We
cannot defend ourselves or fight back. So when it comes
to this kind of data and technology, knowing the basic
fundamentals of what you can do to protect yourself, understand
the reality of what the surveillance state or corporation is,
and then doing your best to not make it easy
for them is at least one step forward. But if
we don't own this technology, if we don't own the tech,
(47:45):
someone else will and they will use it against us.
It's as simple as that. And, like, there's super simple stuff... like, I was gonna bring this up: you can't see video because it's a podcast, but, like, there's these cool glasses that are called Reflectacles that I'm showing you, Robert, and they look like regular sunglasses, but when you put them on, they reflect IR light and actually mess with cameras in a way
(48:08):
that turns your face into a ball of light. So you can wear these... they're called Reflectacles.
You can wear them and just walk around the mall
and all the cameras get blown out by your glasses. Like, doing that just because you can, it's kind of fun. That's the hot shit. That's the shit
I was promised. That, at least, does exist. It's
not everything I had hoped it would be in terms
(48:31):
of its ability, but it is like that kind of
stuff rules and I will be picking up a pair
of those. Um, well, we should probably close out. I did want to note, because I mentioned this: um, I got something a little wrong when I was talking about the facial recognition ban. Um. It is an ordinance in the City of Portland itself. Um, it's the first
city that has done this, and it prohibits the use
(48:51):
of public facial recognition technology by all private businesses in
the city. Um. So that is the scope of the ban that exists in Portland. I recommend
looking it up. It is the kind of thing that
I would support everyone pushing forward in their city. UM.
Because again, the more holes you can make in this thing,
the better. Yeah. I don't want to put that down.
(49:11):
That's a good thing. But the challenge of this is
just, like I mentioned earlier, the data moving out of the confines and back: the minute photos, like, I take my iPhone and scan the crowd and then put that picture up on the internet, not under their jurisdiction, and all that facial recognition happens on every face in that photo. Yep. And that is, again... well, we'll do another
episode at some point about things that you can do
(49:33):
to just, like, get there. That's a whole different bag
of tricks. Um. But this has been really useful and
really valuable. Carl, do you want to plug anything before
we roll out here? Not much, just my normal thing
if you're interested in this kind of content, but with
a more firearms-oriented thing. You can find me at InRange dot tv. But you'll also find some information
security stuff there as well. I cover that intermittently when
(49:53):
it applies to both topics. So if you... even if you disagree, but appreciate my approach to this,
come check me out. I appreciate it. Awesome. Check out Carl,
check out in range tv, and continue to listen to
podcasts because the only thing that will save us is podcasts.
(50:13):
Well, I think the same, right? But good for business.
It could happen here is a production of cool Zone Media.
For more podcasts from cool Zone Media, visit our website
cool zone media dot com, or check us out on
the I Heart Radio app, Apple Podcasts, or wherever you
listen to podcasts. You can find sources for It could
(50:34):
Happen here, updated monthly at cool zone media dot com
slash sources. Thanks for listening.