Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Craig (00:08):
All right, welcome to another podcast. I'm joined by Blake Ray. Hello, lots of exciting stuff happening today. Any particular topic you want to start with, or do you want me to start off?
Blake (00:21):
Yeah, go ahead and kick it off. You do a pretty good job with that.
Craig (00:24):
Okay, no shortage of headlines. One of the ones I wanted to bring up was the Okta breach. Did you see that? Where there were stolen access tokens from Okta's support unit? So I guess my comment there is that my first reaction is it's
(00:44):
similar to SolarWinds, right, it's a vendor that's targeted, that has a lot of potential customers that hackers can gain access to, so it's one of those. Be careful with your vendors, but it's hard to protect yourself. I mean, if you've put Okta in your practice and your business,
(01:07):
I guess it goes back to our layered model, right, Blake? I mean, you can't trust one layer. So if you've got multiple layers, then one layer fails and the rest of the system is still working. I guess that's the takeaway from this one. What about you?
Blake (01:26):
Yeah, I mean, you definitely can't put all your eggs in one basket. I haven't read the news on that. I was kind of really interested in some of the former NSA employees that were leaking classified data to Russia.
Craig (01:41):
Yeah, so that's almost like a repeat of the whole Edward Snowden thing, right? We could talk about that in a second, or another time. There's also the Microsoft bug with Active Directory.
Blake (01:59):
That one I heard about. I haven't formally read about it, but I heard about it.
Craig (02:04):
Yeah, same, I haven't studied it either, I just saw it in the headlines. And then, on a different topic, I saw a news article where a coalition of 41 states is banding together to sue Meta, Facebook and Instagram, for harm to kids. Social media harm.
Blake (02:25):
Wow, what country? Oh, you said states, 41 states?
Craig (02:31):
In our country, yep, and including the District of Columbia, are filing lawsuits alleging Meta has intentionally built its products with addictive features that harm young users of its Facebook and Instagram services.
Blake (02:47):
Wow. I think in the next anywhere from six to 10 years, social media is going to change. It's kind of this revolving door, I guess that's the proper analogy. But here in Vegas is a good example: like, when a new
(03:08):
hotel opens up, people just flock to the new one. And so if you look at Twitter with their rebrand, you know, X kind of brought a lot of new users back to the platform. Obviously, when MySpace was around and then Facebook came out, right, people jumped ship. So what is the next ship?
(03:28):
You know, the fact is Facebook is not a great platform, it's not a great platform for the users, it's not a great platform for privacy. So what's the next ship, you know? So I'm just a big proponent that something next will come in, you
(03:51):
know, yeah.
Craig (03:53):
I think, as more people start to realize that with a free product and platform, you're the data point, right? So it's free for a reason, the old adage, you get what you pay for, right? So I mean, what was the last, what was it, Cambridge Analytica, back several years ago now, at the election?
(04:16):
What was that, like five, six years ago now?
Blake (04:19):
Yeah, 2018, I think. Yeah.
Craig (04:22):
So I mean, at that point, over 5,000 data points existed on all of us, and that was five years ago, right? So, okay, just think of all of the tracking that's happening since then. I mean, your phone location services, everything wants to have that turned on to know where you are, right. And then Facebook.
(04:45):
In my opinion, Meta is particularly intrusive with always listening. I'll have a conversation with my wife and then all of a sudden she's got ads about the conversation that we had. I try to avoid social wherever I can, but I could only imagine
(05:05):
how kids are brought up on it, and I think it's challenging. I don't know the grounds for the lawsuit, but I don't support social media for children, that's for sure. I do think that it's, in my opinion, more harmful than good. I understand that there may be some, I read this a long time ago,
(05:26):
where the founders never really intended for it to spiral this way. I think there may be some truth in that, but I also feel like, if that was entirely true, then why are they putting all this privacy leakage in place, and surveillance in place, right? So I mean, I do think that there might be strength to the case, and I
(05:47):
guess the defenders are going to say, well, it's easy for me to keep in touch with my friends, and things like that. Yeah, I mean, I think there's debate around that too. I mean, do you really need 10,000 friends, and are they really your friends?
Blake (06:02):
I think it pushes people further away, because now, instead of having those direct communications where somebody will text you and say, hey, I was at the beach, here's the fish I caught, right? Boom, text message. You post it on Facebook and then that text never happens, right? So it's like, oh yeah, I saw it. And then it's also really weird when you have a conversation
(06:25):
with somebody and you're trying to fill them in. I don't know, you've probably had this happen. Maybe not, since you're not active on social media very much.
Craig (06:34):
No, I try to avoid it like a ten-foot pole.
Blake (06:37):
Yeah, well, I've had conversations with people, and some people out there listening have probably had the same thing. I'm like, oh yeah, this is what I did. And they're like, oh yeah, I know, I saw it on social media.
Craig (06:48):
So it's like, if you post it all on social media, then there's nothing to talk about when you're face to face or on the phone.
Blake (06:57):
And then, I guess, the minimum age requirement, I wasn't quite sure how to look it up, but for Facebook it's 13 years old. Like, I don't even know if I even agree with that, you know what I mean? Obviously you have children. What age are you going to let them get on social media?
(07:17):
Hopefully, never.
Craig (07:20):
I mean, my wife and I are already discussing the importance of a phone, so we chose to give them watches, and they're locked down: they can only receive phone calls from who we put in their contact list, and they can only receive text messages from who
(07:43):
we put in the contact list, and it's a pretty restrictive platform. But it works, and it's good for, like, if they want to go to their friend's house and have a bit of freedom, we can text them and say, hey, look, come home for dinner or whatever. But the challenge is, you know, peer pressure. Their other friends don't have that. They have the real deal. They have the iPhone, or they have the Apple Watch, or they have both
(08:05):
, and there are apps and there are ways to restrict some of those platforms, but they're still best effort and they're not perfect, right? So it's challenging. I don't know what we're going to do. It's hard, because you can only shield them for so long. Then it's like, well, I don't know, we haven't made a final
(08:26):
decision on it, but the watch works okay for the moment. It's challenging in social situations, though, because some of the kids will start, like, a group chat or something like that, and then they're left out of it. So it's a little socially awkward that way. So they might have to resort to email, or their school email,
(08:48):
but then they're like the one-off, and then nobody really wants to follow that one-off thing because it's an extra step. It's not as easy. I haven't heard a request for a particular social app. I've heard a request, obviously, for a phone, a full-featured phone, but I don't know.
(09:08):
I feel for parents, I don't know. We have some friends that have chosen to do the iPhone or whatever, and then they do these surveillance apps where it's just constant policing of where are they, what are they putting on the phone, what are they allowed to install, you know what I
(09:29):
mean?
So it's like, but then at that point, it's like, what's the point? I mean, if you're going to lock it down anyway. So I don't know, I'm undecided on it. When I was growing up, social media didn't really
(09:50):
exist very much. Technology was pretty embraced, so it was the support of technology and the tinkering with technology and new stuff. But now it's so broad-stroke and there's so much stuff out there. And now, this kind of is a good segue to it: did you see the DJI headline?
Blake (10:15):
No, but I have DJI products, so now I'm curious.
Craig (10:18):
Yeah, so DJI is obviously manufactured in China, and now they are at the front of the headlines, with bills in the White House about banning it.
Blake (10:33):
Yeah, they've been talking about that for a while because, obviously, when you're flying a drone, you have location services turned on, and essentially what happens is there's a recording of each flight, and so it records like a temporary preview of the flight and uploads it to the DJI server, right? So you're flying aerials over these locations, right, and
(11:00):
nobody knows what they're filming, you know. I mean, it's hard, you know. But they've been talking about that for a really long time, and I'm surprised to see that it's finally happening now.
Craig (11:17):
Well, I don't know if it's definitely going to get passed. I just know I saw it in the headlines again, and it just got me thinking, and we talked about this before. How do you really know what's in your devices? You know, we've got laptops, we've got desktops. You know, parts come from China and different countries.
(11:39):
How do you know for sure that there's no surveillance in those? How do we know that? You know, the iPhone, it's made by their partner Foxconn.
Blake (11:51):
I think they switched. Well, it used to be.
Craig (11:55):
But my point is, I think it's still made and manufactured there. It's designed in California, but it's made and manufactured in China, right?
Blake (12:03):
Uh, I think it's... is it Taiwan?
Craig (12:07):
No, I think it's China. I don't know, unless they changed it. Let's see. Hello, quick Google search.
Blake (12:14):
Yeah, Taiwan. Taiwan Semiconductor.
Craig (12:17):
OK, but it says the iPhone is assembled in China, Vietnam and India.
Blake (12:25):
Yeah, I mean, there's still no good... like, this is the type of manufacturing that is probably too dirty to happen on US soil, right? Maybe not? I don't know if it's too dirty.
Craig (12:43):
I think it's quote-unquote too expensive.
Blake (12:47):
Well, some of the materials that have to come together to produce a phone, like some of the semiconductor aspects, some of these rare earth materials, they're likely not even found here in the US, which is the reason why, like, for example... I don't know, I mean.
Craig (13:04):
This article I pulled up is saying that there's basically a supply chain of dozens of countries just to make an iPhone, of all the different parts and pieces. Right, that's crazy to me, I mean.
But my point is, you know, we talk about this in security and compliance, and I'll call this the vendor relationships. Each vendor we bring into the ecosystem has to be vetted
(13:27):
tested and we need documentation, policies, procedures, you know
, attestation, evidence, so yougot over a dozen things that
make up we'll just call it 12.
We've got 12 differentcompanies that make up an iPhone
.
Iphone's got a huge marketshare.
I'm sure iPhone is in severalgovernment agencies.
(13:50):
So you know, it boils down to: how do we know that all these different countries that are putting their parts in the phone are not putting those parts to work against us?
Blake (14:03):
Yeah, you know. Yeah, I mean, I think our supply chain has always been a huge issue, like for me personally. And so, obviously, when I was in Europe, I would see Xiaomi, for example, huge. I mean, they make, I'm not going to lie, I feel like they make great products, but the fact is you don't see like a Xiaomi
(14:29):
store. You can go and see Xiaomi, there's Xiaomi retail locations abroad, in almost every other country. I've never even heard of that company. Yeah, they're a huge phone manufacturer and they make super high quality Android devices.
Craig (14:45):
Yeah, and don't get me wrong, I'm not saying, you know, everybody's guilty either. I'm not saying DJI is bad, you know, I think their products are cool. But I think the question that's raised is: where's the evidence? It's almost like you're guilty until proven innocent, right, instead of the other way around.
(15:06):
My point, though, is that I think, if there is more of a trustless mentality, maybe the solution for a company like Apple or DJI or these kinds of companies that are bringing products overseas to our country, maybe it's third-party testing
(15:27):
, maybe it's evidence of unbiased surveillance and deconstruction. Like iFixit, they disassemble everything, right, and these websites like Ars Technica, all these different ones. But still, I mean, how far does that go? Like, what, are they gonna get into the chips? Right, you know, they're gonna disassemble it, but they're not.
(15:48):
You know what I mean? Like.
So it's like, I think the bottom line here, and we can go down this for a long time, and I know our time is short today, but I think the bottom line is, again, we're forced to live in a trustless society. You can't trust any of it.
So I think that if you choose to use a mobile device or you choose to use a computer, you just gotta be really careful of where it's coming
(16:12):
from. Again, we talked about this before: what software is on it, and what layers are you putting in place to make sure that you do the best you can to limit it? Like,
I'll give you an example. One of the layers that we talked about is keystroke encryption, as well as anti-screen-scraper
(16:34):
technology, and I told you it was really disturbing how many programs the software intercepted. That basically said, hey, Adobe wants access to your microphone and camera right now. Well, I'm reading a PDF, so it's like, you know.
So my point is that that could be an effective layer. You
(16:57):
don't need to give Adobe your camera and your microphone when you're reading something. You know what I mean? So maybe we just need to work harder at putting more of these safeguards and layers in place to protect ourselves. And I've heard many times that, like the Air Force, on a base, for example, you can't even bring an iPhone or Droid in. You have to have it locked up in the trunk of your car.
(17:19):
So it's kind of like, you know, I think we all should have a Faraday bag, you know. And it's like, how far do you take this stuff? But it's kind of comical, and I laugh about it, because we're forced to do this stuff if you want to protect yourself.
Blake (17:35):
Yeah, earlier we had a ticket that came in, and it was for Microsoft Word. They weren't able to open Microsoft Word because Malwarebytes was blocking it, with the excuse that an exploit was happening.
(17:55):
And so I had to go in and essentially, like, dumb down, you know, Malwarebytes to allow somebody to use Microsoft Word and PowerPoint. And it's like, really, what information are you requesting to use?
Craig (18:13):
You know, that reminds me of, remember the macro virus? That might have been before your time. One of the early viruses exploited the macro functionality of Word and things like that. Macros were, back then, a form of automation to do things
(18:35):
faster, but nobody really used them.
So my point is that Microsoft, Adobe, all these big vendors keep adding tons and tons of features, Apple included. They just put all this stuff on, and then they turn it all on. So it's like, I think they should take the approach that a lot of Linux distributions do: turn it all off, then go through
(18:56):
the list and figure out what you want to turn on, if you're in a secure area, or you want to be, or you're going to be in an audited or regulated area. Maybe that's the better approach.
I mean, like, so instead of going through all the work, and we call this process security hardening, there should be different distributions of, hey, look, this is the version, the DoD version.
(19:19):
They used to do this with STIGs and other things, but I think we moved away from that for whatever reason. The lines got blurry, and then Microsoft went down the rabbit hole of GCC High and then commercial products, but there's still no clear, like, hey, buy this from Microsoft, it's already hardened from the factory.
(19:39):
There's no, like... and the same thing. Apple did a good job, I think, recently, with Lockdown Mode on the iPhone, which I've tested for a while. But I feel like there should be more of that, and I feel like we, as a company in cyber and compliance, should be better supported by our vendors, from their perspective, to
(20:01):
help us do the job, because it ultimately goes back to them anyway. Like, we're going to be asking them in GCC High environments, hey, where's your evidence for this? You know what I mean?
So it's like, why not just have it ready to go? And I know Amazon has GovCloud, but my point is that I feel, with our vendors, there should be more work and heavy lifting
(20:22):
done by that side of the equation.
Blake (20:26):
Yeah, I mean, permissions and applications... yeah, I'm not a huge fan of how these phones are configured now. Even computers, you know, where everything that's put on there can have free rein, right, over your data, whether it's whatever bloatware, malware or not malware, bloatware, freeware,
(20:51):
whatever, shareware.
Craig (20:54):
I think the takeaway here, since we've only got a couple minutes, is: start now. Go through your computer, go through your mobile devices, go through your stuff, and just either turn off what you're not using or uninstall it. I mean, the more stuff you've got on there and the more stuff you have turned on, the easier of a target you are. So think about it as more things that you're turning off
(21:17):
and more things that you're uninstalling and not using. Think about it as enforcement layers to protect yourself. So the more stuff you remove and the more stuff you just get rid of that you don't need, the better, the more unhackable you become.
Blake (21:33):
You know, like those Chinese games?
Craig (21:36):
Well, it could be games from anywhere, I mean, but yeah, all sorts of games. If you're not using a certain electronic device on your network, get rid of it.
Blake (21:46):
You know, yeah, that's a good end note right there. Yeah.
Craig (21:53):
All right, guys. Well, I think that's a good wrap for this one. We've definitely got lots more to talk about, so we'll record more soon.
Blake (22:01):
Thank you, see you guys on the next one. Take care.