Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
How's it going, David? It's great to finally get you on the podcast. I think that we've been planning this thing for quite a while, you know, back in 2024. And now you're the... you're the first episode of 2025, because I burned myself out and had to take like six weeks off.
Speaker 2 (00:18):
I am honored. [inaudible] feelings about
(00:45):
that, actually. And so there's so many people who bill themselves as cybersecurity experts, you're probably having some on the show, and some of them, when you drill down on why they think they are, it's because they took a bunch of Microsoft certification classes, right? So that's virtually useless in any real-world scenario, because
(01:08):
the really bad stuff is stuff that people have never seen before, and no amount of Microsoft licensing or certification is gonna prepare you for some wacky denial-of-service attack that nobody's ever seen. That's just basically having a brain and being able to think through it. So the best cybersecurity people are people who are not
(01:30):
formally trained to be cybersecurity people.
Speaker 1 (01:32):
Yeah, yeah, no, that's very true. You know, it's always interesting when I bring people on, and you know, I try and find fantastic guests, you know, like overly qualified, very experienced people like yourself, and every once in a while this happened. This happened, you know, maybe in the middle of last year,
(01:56):
right, where someone came on and we started... I started to get a little bit technical, because I'm technical, you know. Like, I'm in the weeds, you know. I wake up and I get, you know, into log files and I'm figuring out what's going on. I'm reverse engineering systems and whatnot, like, all day long, and I look up and it's 6 PM, you know, that sort of thing. And we
(02:17):
started getting a little bit technical, and I immediately reached their technical, quote-unquote, expertise limit, and then I pushed a little bit farther, and come to find out, oh, you're not technical at all, you kind of stumbled your way into this thing, and someone promoted you early, and that's
(02:38):
what happened, right? Which is frustrating for me.
Speaker 2 (02:42):
I deal with this all the time, given the kind of things I do and where I live. I've been asked by some VC firms and other firms to do vetting of people, and about 80% of the people who call themselves cybersecurity are actually lawyers. And no, I'm serious. They're people at a law firm who were involved in one case
(03:07):
involving some kind of aspect of cybersecurity, and now they're an expert. It's like, you know, I go into CVS and buy a bottle of aspirin and now I'm a doctor. Yeah, it doesn't help our profession, because it downplays... it makes it look like it's easy to be this.
Speaker 1 (03:25):
Yeah, yeah. I mean, you know, when someone is trying to get into cybersecurity, right, and they're reaching out to me, they're asking for advice and whatnot, really the very first thing... And some people that I've had on, I said this to, and they, like, took offense to it and, like, were appalled by how I approach it.
(03:46):
But I try to convince people to not get into security. Right? Because if I can convince you, just through words, to not get into cybersecurity, you're not going to be successful in this field. Right? Because you have to have a curiosity that cannot be itched. Right? And you need to be the expert, like, not just an expert
(04:06):
in security. Like, you need to know networking pretty well. You need to know system design pretty well. You need to know, you know, the different processes and services that are talking behind the scenes on your Windows device, right, what they're linked up to and where they're actually configured, and all that sort of stuff.
Speaker 2 (04:24):
So, I've been doing this literally for half a century, in some form or fashion, with computers. When I look at a problem, I think of it at multiple levels, and sometimes it's literally at the bit level, and I'm thinking, okay, what's on the heap, what's in the stack, what are the bits?
(04:45):
And without even articulating it, this helps me do things. Like, I repaired a remote-control fireplace the other night, because I just knew what was wrong with it without having ever touched the thing. And this is a skill set that I'm sure you have. I have it. Many people don't. You cannot teach this, and it's some kind of weird survival
(05:10):
trait that I don't think people recognize for what it is. But you can throw anything at me that's a computer-related thing, and I can figure out what's wrong with it in a couple of minutes.
Speaker 1 (05:21):
Yeah. Well, David, you know, we kind of just dove into the deep end here without, you know, you talking about your background. So why don't we backpedal a little bit and talk about, you know, how you got into IT? What was that start like? What intrigued you, right, about the field that kind of propelled
(05:43):
you into this world that you're in now?
Speaker 2 (05:46):
Okay, well, I mean, we have to go back a ways for this. So I mean, I went to high school in Pennsylvania in the mid-70s, early 70s even, and we actually had a computer. It was an IBM 1130 or something like that, and it had punch cards, if you've ever used those, and you had to mark-sense the cards and then you run them through.
(06:06):
And if you did assembly programming, which is what you normally had to do, there were 16 switches on the CPU, and you would configure the switches for a binary number and you'd hit the button, and that was one machine instruction. And then you would do that for the entire program you just wrote. So that's like days to do that. So I was intrigued by it, and then I sort of let it go for a
(06:30):
while.
I got a degree in philosophy, taught some symbolic logic and other things, and then, through a bizarre set of circumstances, I ended up being an intelligence agent for a number of years. I had to go in the military, I did, and because of my test scores they trained me to be a cryptographer.
(06:51):
They sent me to Russian school for a couple of years, and I ended up on submarines. So I did that during the height of the Cold War, and it was fun. I actually had a really good time. It's not like anybody got killed, you know. You didn't have to worry about bombs blowing up a Jeep or something. The Cold War was just a very different kind of thing. But computers... that was the first time I'd seen computers play a part in
(07:14):
the real world, because submarines at the time were heavily computerized, I mean, given what was there at the time. So the computers were called UYK-20s, U-Y-K, and I think it was a DEC computer. And they used to send us out with, this is... I always thought it was hilarious. They used to send us out with a repair kit, and it was a big brown plastic case, and if you opened it, the only thing it had
(07:37):
in it was a rubber mallet. And they said, you'll never fix anything. Just start whacking the crap out of the boards and something will go back in place. And you know what? It did. I had to do it like three times. So I mean, that's kind of the early days. I did some other work. I went to NSA, I worked on the Cosmonaut program
(08:00):
and then I got some more degrees from UMBC in computer science with a math concentration, did grad work at Hopkins. And then I was at a crossroads, because I either stayed as a professional intelligence agent or I went into this fledgling world of computers in the early 80s, and it was a no-brainer for me. I mean, I knew where things were
(08:21):
going, and I got out and started programming in a number of different languages, and within a few years I was designing systems. And then I ended up running research at Booz Allen and Hamilton, the consulting firm. IBM hired me to be the chief scientist for
(08:42):
the Internet Information Group, which is all their Internet-related software, basically. Not networking, but anything above that. And that was pretty cool, actually. At the time I'd never played in the big leagues like that before. I got a lot of job offers. This was like 95, 96. And I decided...
(09:03):
The one I wanted was this little company in Herndon, Virginia. It was an 8(a) firm called Network Solutions, and the only thing they had going for them is they had a locked contract with the National Science Foundation to basically run the internet. It was called a cooperative agreement, so they
(09:25):
ran the whole domain name system, all the root servers, TCP/IP allocation for North America, and the CDPD, the cellular data network. So I came in as CTO, and then I ended up running all that. So that was pretty cool. And I got to deal with crisis after crisis, because from 96, 97
(09:48):
on, that's basically the dot-com bubble. So all of a sudden people actually gave a damn what was going on in the internet, and up until then they didn't. It was a curiosity. In the early 90s it was like labs, and by the end of the 90s it was billions, tens of billions. So I went through that.
(10:10):
My company went public, I did an IPO, a couple secondaries, and I was running all this stuff during Y2K, and I was on President Clinton's task force representing the internet during Y2K, and that's a whole story right there. I didn't like where some things were headed. I left and started writing books on privacy and wrote for a couple of magazines.
(10:31):
Nobody cared really that much about it at the time. It was some kind of, like, weird conservative thing, and the liberals didn't want to have anything to do with it, because privacy seemed to run smack into First Amendment issues, and so my natural constituency were people I didn't actually want to deal with. So then I got into some other things.
(10:53):
Story's almost over here. Sorry, this is taking so long. (No worries.) I did politics. I was a CTO for Senator Bayh when he ran for president, for two years. That was actually a paid gig in Arkansas. And then I was the head of security for General Wesley Clark when he ran for president, and so I got some other
(11:16):
exposure. And at this point I was pretty cynical about almost everything. And the thing I was cynical about was that the people who should understand what was going on did not understand what was going on, and this was a huge... I mean, I knew where things were going. I mean, what we're seeing today with cybersecurity, for instance,
(11:36):
you know, and data breaches, I mean, the writing was on the wall for that 20 years ago, and it's now... anyway, we can talk about that. So I started traveling the world. I hit 85 countries in a couple of years, and then I came back and I started working with very early blockchain companies, all
(11:56):
in Europe, because none of them wanted to work in the United States, because they were terrified of the Securities and Exchange Commission, especially when they're doing ICOs for tokens. I mean, it's still not clear how US tax law treats that stuff. So I worked with a number of those companies, and I'm still
(12:17):
working with a couple, and then I got into post-quantum encryption. So now I'm doing sort of Web3 non-centric security with post-quantum encryption. So that's kind of a long story. (Wow, that is.)
Speaker 1 (12:33):
I mean, that's really fascinating, you know, just where this field has taken you. I mean, did you ever think that you would, you know, be on President Clinton's task force? (No, no.) Like, starting all those years ago, did you ever have that in mind as that even being
(12:55):
a possibility?
Speaker 2 (12:56):
So the truth is, I was a single parent, I was raising five children, I couldn't even afford daycare, and I'm sitting on all this stock in a company that might go public. So that was very good motivation for me. And finally, when the stock did do well, I mean, I didn't get rich, but my kids all went to college and I bought a Porsche.
(13:19):
Speaker 1 (13:21):
So what kind of Porsche?
Speaker 2 (13:22):
I got a 911. Okay, which I'm now feeling really embarrassed about, because I sold it and bought a Tesla, and I really love... Yeah, I know, I love, love my Tesla, and I just feel, I feel like such...
Speaker 1 (13:42):
We bought two Teslas in 2024, and I love them. I absolutely love the car. I recently bought myself a Model X, and I've wanted that car since it was announced, right? Like, I just love everything about it. But I couldn't imagine selling a Porsche, even for a Tesla. I would just, like, have both.
Speaker 2 (14:04):
Well, you know, here... I live in the city. Parking spaces are at a premium. I actually have two spaces, which is like two more than most people have. So when my wife and I bought this house, that was one of the reasons we bought it. But we have an SUV too, and the Porsche was just sucking up money, and every time something happened, it was thousands of dollars
(14:29):
. Oh yeah, I mean, everything. You know? Cigarette lighter, three thousand dollars. Yep. So I got tired of paying it. The dealer here sucks, and they could never get parts, especially during the pandemic. So anyway, that's why I did it. But I got to drive it... I got to drive it for 20-some years.
(14:50):
It's an incredible machine.
Speaker 1 (14:53):
When you're dating? Oh yeah. Yeah, I, um, I just sold my Audi S5, and it was my first sports car. (That's a good car.) What a fun vehicle. But when it breaks, man, when something goes wrong on that car, it's... I mean, like you said, I just got to the point where I
(15:16):
assumed, you know, I'm going in for an oil change, and I assume they're going to find three thousand dollars' worth of stuff that's broken that I don't even know about. (Oh, I'm sure.)
Speaker 2 (15:28):
So, you know, going off that, going back to the other thing I said: when I was a kid growing up, this is, like, you know, the pre-psychedelic era, going through the Beatles and all that. So my friends who were good with mechanical stuff were highly in demand. Women liked them, guys liked them.
(15:49):
They could change your spark plugs. They didn't have to go into the gas station. They would go, yeah, it's your timing, and they would get in there and they would fix it. They could fix TV sets, they could fix washing machines. Guess what? You can't fix a goddamn thing today. So now it's the person with the skills that I was just talking
(16:11):
about. It's the person who... I used to go to dinner parties with people a lot younger than me, and I would have an iPod and a CD, and I would say, hey, I'll give 20 bucks to anybody who can take the songs off this CD and put it in this iPod. Nobody ever knew how to do that. And to this day, when I deal
(16:32):
with politicians and multi-hundred-millionaire VIPs, they don't know how to do anything either. And they all have, like, nephews, like eight-year-old, nine-year-old nephews, and they do the work for them. Like printers. Like, configuring a printer is still way too hard. Way too hard, yeah. And it should be easy, and if you're
(16:56):
lucky it will be, and if it doesn't configure in the first two minutes, you're in for a bumpy ride.
Speaker 1 (17:03):
I hate printers. I really do. I really hate having them. I only have one because my wife is an early childhood teacher, and so she has to print a whole lot. So we have a very robust printer, and it just... you know, it doesn't work very, like, fluently with a Windows PC and MacBook
(17:25):
laptops, and you have to reinstall the driver all the time, and it's so, so dumb. But, so, go ahead... Yeah, I was going to ask you what your time was like at the NSA. You know, I've had other people on from various agencies, CIA, NSA, DIA, and they all tell me roughly the same thing.
(17:49):
And I have a good friend who's in the military, and he said that if I ever do make it into the NSA, that first month, when you're being read into 90% or 80% of what you need to be read into and whatnot, the capability side of it
(18:10):
is kind of just gonna blow your mind, right? Like, you wouldn't even realize, oh, you can use that for this thing over here, right? Well, I'm wondering, did you have that same kind of experience back then? Because you were really, I mean, at the beginning of this
(18:31):
digital era, right? I mean, it didn't really even start. You were at the very foundation of it. Was that experience true for you as well? Or what was that like?
Speaker 2 (18:42):
Well, when I got to NSA, it was the early 80s, and they had a couple of supercomputers, like really expensive ones, Crays, Cray-2s is what they were, and we didn't have access. Nobody had access to them. So they had, like, PCs. They were like 83, 86 machines or something. So if I wanted software, I had to write it.
(19:03):
So people used to come to me, and I would write a Turbo Pascal program to do some intelligence thing, because you couldn't bring stuff in from the outside, and so that was kind of fun. And when I was in the submarines, I had some of the deepest security clearances you can get. I mean, things that are still classified
(19:24):
and only, like, 30 people in the world could read the material. There is stuff like that. But in the end, in CIA, what that means typically is it means they have an asset, a human asset, like Putin's hairdresser. So Putin's hair... I'm just making this up. I hope... if he gets
(19:45):
killed tomorrow, I'm going to feel really bad. But so let's say Putin's hairdresser gets turned. You know, happens all the time. So that would be very, very carefully protected, because they're going to shoot him in the head if they find out. For NSA, it's almost identical to what hacking is.
(20:07):
In fact, now it is hacking. It's, like, you know... it's basically zero days. Before there was even a term "zero day," NSA was looking for zero days. They were looking for defects, bugs, some kind of malfunction in any mechanical or electronic device that they could turn into
(20:28):
an acquisition thing. So that's why, and this stuff's not classified anymore, I think, but that's why they were doing things like bouncing laser beams off windows, so you could hear what was being said in the room, and there's crazier things than that. And so that's the secret. The secrets in NSA were mostly that kind of stuff,
(20:51):
and then a bunch of stuff that's boring to most people, like what frequency a satellite downlinks on. You know what I mean? Most people couldn't really give a damn, and wouldn't even understand if you told them. But if the Russians got it, it would be a big deal, right?
Speaker 1 (21:12):
Yeah, that's really fascinating. You know, you talk about having that clearance, and, you know, only 30 people in the world are even allowed to read that document. I always wonder how, like, the level up from that even works, right? Because, I'm just trying to think of, you know,
(21:34):
least-privileged permissions, right? From my perspective, if I want to give someone else access to a system, or whatever it might be, right, it doesn't matter the sensitivity of that system, I have to have access to that system, right? In some way, shape, or form, I have to have that access. And so it's just interesting to me how agencies deal with
(21:57):
that, because obviously you don't want everyone knowing, you know, nuclear secrets, or, you know, whatever that might be, and you have to really tightly control that information. It's just fascinating for me to, you know, think about how you would do it, even with a physical, you know, person, right? Like, how do you control that? How do you monitor
(22:19):
what they're doing, and that sort of thing?
Speaker 2 (22:36):
Ten years from now, nobody's going to be doing that. Maybe six or seven years, nobody's going to be doing that. And the reason is because both defense and offense are going to shift over to AI-driven systems, because they move much faster than human beings. So if an AI is running some kind of denial-of-service attack or some kind of penetration hit on your network, it can make, like,
(22:58):
a million hits on every single address in your network, just like that. So no human being will even see it coming, let alone stop it. So you need to have some kind of AI-driven defensive system on the other end. And that's one of the reasons I'm working with a company or two that's doing Web3 decentralized stuff,
(23:19):
because I think the biggest damage that's been done in security in the last 20 years is deferring things to centralized companies, and that's where all the breaches happen. They're service providers. I mean, you know, Equifax and SolarWinds. You look at any of those, it's never the company with the name on it, they're not responsible.
(23:42):
It's some idiot third party that they hired to do credit card processing or something, and they got hacked. And then it happened with AWS too. So I mean, that's the hole. So in the future, when it moves into an AI-driven system, those holes will go away.
Speaker 1 (24:02):
Yeah, you know, I always talk about planning for the future on the podcast, and you kind of seem like someone that thinks into the future, right? And then you start working towards it immediately, because it's like, hey, if we're going into a post-quantum world, like we are,
(24:23):
I need to be experienced with it, I need to have some level of expertise with it, otherwise in 10 years I'm going to be obsolete and I won't be able to do anything, right? How do you determine, you know, where things are going, where to spend your time, what to really focus on? Because, you know, for myself, right, 10 years ago I knew I
(24:46):
wanted to get into cloud security, right? And now I've been in cloud security for a while, and now I'm shifting gears, getting a PhD in how to secure satellite communications in a post-quantum world (That's a good one.) using zero-trust principles, right?
Speaker 2 (25:03):
Good, yeah.
Speaker 1 (25:06):
So I'm also someone that looks towards the future and then acts on it and says, well, what's going to challenge me, right? What's going to make me grow? And those are typically the most rewarding, probably most arduous tasks, right? How do you approach it?
Speaker 2 (25:21):
Well, I have some old friends who are very, very senior tech people, and sometimes we talk. I just had a long call with an old friend of mine yesterday who used to be the chief scientist at Amazon in the early days. In fact, I had a fellowship there at the time. And we had this futuristic talk, and we both were kind of laughing about it,
(25:44):
because we both see very similar things coming, two years, five years, 10 years. I mean, there's nothing we can do about it. And I found a long time ago that if you invest in the future, you will go broke so fast, because I tried this, because I always saw what was coming, and I was almost always right.
(26:06):
But you can't... just because you know something. Like 3D printing. I saw that coming years before it happened. So when the 3D printing companies came up, I said, oh, I'm going to buy stock in this stuff. Well, guess what? I was right about the industry, wrong about the companies. And I mean, that's the kind of stuff that happens. But I think futurism, that's another word... I mean, I sometimes
(26:31):
call myself that, but many people who call themselves futurists are frauds. I mean, just flat out. They're like televangelists, like that level of fraud. And when you talk to many of these people, they have a marketing background. They're not people like you and I
(26:52):
that could, you know, in a pinch, dig into a router and try and figure out what's going on. I haven't done that stuff in years, but I could still do it. They're not like that. And, you know, that goes back again to the theme that I didn't know I had here, but these skills are changing and they're going to be less useful.
(27:14):
Like, I tell you... you were saying about cybersecurity, and you give people, like, a test question to see if they're serious. I try to talk people out of going into computer science, and I've been doing that for seven or eight years. I often give talks at grad schools, and they get angry, usually because they're, like, you know, one year away from getting their
(27:35):
doctorate in computer science. The argument I have is, computer science today and tomorrow will be mostly algorithm development, and there are only so many algorithms that you need people to develop, and it's a very small subset of the number of people running around today with
(27:57):
graduate degrees in computer science. So most people who call themselves computer people or technologists, they're kind of, you know, not to be offensive, but they're kind of webmasters. You know, they put up a website, they know how to do some Java, JavaScript. I mean, they know what JSON is, maybe. I mean, they know stuff, but it's very, very narrow.
(28:19):
It's not the way things used to be, where you had to know all of this stuff. It's like the mechanic guy who could do your spark plugs. It wasn't just General Motors. He had to work with Fords and Chryslers and whatever else, because the principles were general.
Speaker 1 (28:36):
So, yeah, that is a really good point. You know, I don't even know what they would get a PhD in computer science for, like, what does that even look like? Because in your bachelor's you're learning, you know, the bits, and, you know, hexadecimal, you're learning C++
(28:58):
and all that sort of stuff. And I didn't get my bachelor's in that area. I actually got my bachelor's in criminal justice and, you know, wanted to go the federal agency route, and I kind of stumbled into it and found it to be a lot more interesting in some ways. But what does that even look like for a PhD in computer
(29:21):
science?
Speaker 2 (29:22):
Yeah, I think your point's a good one. I never thought about that. Basically, everything you need to know about computer science you can get as an undergraduate, right? That's kind of what you're saying, and that's absolutely true. The stuff that paid off for me in the long run was stuff like knowing how to build a compiler. So I took a couple of grad-level classes in that, and I did
(29:45):
build compilers, but they were, like, natural language compilers. So you can apply that technology to many other things, if you understand what that technology is, and that kind of thing. Like, I was a Lisp programmer for a while, if you know anything about Lisp. So Lisp was the language for AI for many years.
(30:06):
But it's a crazy programming style. It's all recursion. I mean, all of it, that's what it does. So you have to understand recursion or you cannot possibly program in it. So those programmers are pretty much gone now, but that was a skill I had to learn from school.
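A minimal sketch (not from the episode) of the recursion-everywhere style described above, written in Python rather than Lisp; the function names `total` and `my_map` are illustrative, but the shape, no loops, always recursing on the rest of the list, is exactly Lisp's car/cdr idiom:

```python
# Lisp-style, recursion-only list processing: every function handles
# the empty list as the base case, then combines the head of the list
# with a recursive call on the tail.

def total(xs):
    # Roughly (if (null xs) 0 (+ (car xs) (total (cdr xs)))) in Lisp terms.
    if not xs:
        return 0
    return xs[0] + total(xs[1:])

def my_map(f, xs):
    # Rebuild the list by applying f to the head, recursing on the tail.
    if not xs:
        return []
    return [f(xs[0])] + my_map(f, xs[1:])

print(total([1, 2, 3, 4]))                 # 10
print(my_map(lambda x: x * x, [1, 2, 3]))  # [1, 4, 9]
```

The point is the habit of mind: once the base case and the recursive step are right, the whole computation follows, which is why, as noted above, you cannot program in Lisp at all without internalizing recursion.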
Speaker 1 (30:29):
Huh, I guess it makes sense. Yeah, just thinking about it from an education perspective, right, it makes sense to get that undergrad degree in computer science, if you're going to go down that path and whatnot, and then it probably makes more sense to get, you know, these onesie-twosie classes on developing core
(30:52):
technology types, right, rather than even going down the path of getting a master's, like a full master's. Just get those courses, get that skill, and then build from there. You know, because those are skills that really, you know, you can build off of, right, and it'll transform into something else where you're using it with AI and building a model.
Speaker 2 (31:14):
Well, every once in a while, because of the kind of stuff I do, I run into hybrid people now. I mean younger, typically. Like, people that have an undergraduate degree in computer science and then they get a law degree. I've run into half a dozen doctors who started off as IT people and then they went to medical school.
(31:35):
And these people are killers, because they can do stuff none of their colleagues can do. So when they get out into that world, the legal world, the medical world, everybody relies on them for anything that looks like a computer, and I'm talking, like, litigation. Is the hospital going to buy a new $150 million automated
(31:57):
surgical robot arm? Well, let's ask Joe, because he's got the computer science degree. Although you said you didn't, but even so. So I mean, I think that's very powerful. I don't see the
(32:18):
specialization requirements anymore.
Speaker 1 (32:19):
Yeah, that's actually very true. You know, I think this is kind of how I approach it. You know, when I was getting started, I wanted to learn as much as I possibly could about everything. There wasn't a specific technology that I wanted to focus on, or a specific domain, or anything like that. And so I got experience, you know, with WAFs, right, and then regular
(32:39):
firewalls, and EDR systems. I have experience with all of the big EDR systems, you know, when a lot of people will say, I only know CrowdStrike, or I only know X EDR, right. I have a full spectrum of experience across almost every single domain in security. And then I went through and I decided, okay, I'm going to
(33:03):
specialize in cloud security. And now I'm kind of taking a step back and I'm upskilling, right, on the PhD side, with post-quantum encryption on satellites, two things that have so many different facets to them that I've never touched before, right. While I'm also going back in my career and getting more broad, getting more generalized, and
(33:27):
specializing in a few niche areas, but still building, you know, a stronger... I say building a stronger overall experience, right. Because, you know, something I'll learn in network security, or with a WAF, or whatever it might be, will benefit me in vulnerability management, and it'll benefit me
(33:48):
in other areas.
Speaker 2 (33:50):
Knowing the concepts will pay off throughout your entire lifetime, far more than memorizing tables or something like that. Understanding the concepts is really, really important, and I think that gives people survivability in the marketplace. So, you know, something else to consider. When I went to college, and my first degree was in the 70s,
(34:14):
late 70s, there was no computer science degree. You couldn't get one. You had to get a math degree. Carnegie Mellon, I've got a couple of friends who went there. They got math degrees, and then they ended up being computer programmers, but that's all they could do. So we change job titles, especially in this country, every seven or eight or 10 years.
(34:35):
Look at what a lot of people do now in LA and New York and Chicago. If you go to a room of millennials, you know, at a bar or something, and say, what do you do? I guarantee you at least a third of them have professions that I may not even know what they are, and they did not exist 10 years ago. I'm an SEO specialist. Okay, well, what is that?
(34:58):
I mean, I know what it is. I'm exaggerating, but most people my age wouldn't, and it's because the professions have changed. So you want to be futuristic for a while? Put your hat on and think five, six, seven, 10 years. What kind of professions are we going to look at? Well, I bet a lot of them are going to have the word AI in
(35:20):
them, and they're not going to be building AIs. They're going to be training AIs, or they're going to be servants to AIs. So when the AI needs, like, a cup of coffee or something, metaphorically, that's what you'll do, because they don't need us to do anything like this. They need us to feed them data, but they've already eaten all
(35:42):
the data.
OpenAI announced, I think a week or two ago, that they've now looked at every single piece of data that they could possibly look at, and they're now building systems that generate false data that they can use for training the rest of the systems. Sounds goofy, but what that is, is those are machines that are now training
(36:04):
themselves.
I mean, look at programming. I mean, OpenAI, like the ChatGPT stuff. I'm sure you've tried to write programs with it. Everybody has. Oh yeah, they're not bad. Yeah, I mean, they're not the most clever thing I've ever seen, but they work, they compile, and they do the thing they're supposed to do.
So, you know, we're just... and then, you know, not to get, you know, too spiritual here or
(36:28):
anything, but you take that idea of technology, and then you put drones and robots and Tesla. My Tesla has a summon feature that I am terrified to use. I tried it once in the middle of DC, and we were like two blocks away, and I hit the button, and then it comes barreling
(36:49):
down Connecticut Avenue with nobody at the wheel. And this isn't Waymo, this is, like, a car driving itself with no real particular direction in mind. So when you start looking at that, I mean, what are we looking at? We're kind of looking at Skynet.
Speaker 1 (37:06):
Yeah, yeah, that's a really good point. Where do you think AI security fits into the development of AI? I know that we talked about that offensive and defensive component, but when we're talking about models, it's a little bit different, right? Because you almost have to, you
(37:30):
know, it's like you have to monitor what the model is consuming.
And, you know, this is the thing, I don't know how else to explain it, right? You wouldn't want the model, right, to look at Nazi Germany and say, that is good, that's something that we want to
(37:51):
propagate, but you don't want to keep that information from the model, right? And so you get into a weird dilemma.
Speaker 2 (38:02):
Do you remember, there was an AI that Microsoft did about 10 years ago that they had to pull off the market because it became racist?
Speaker 1 (38:11):
Yeah, do you know what, I'm just looking it up because I'm here. Yeah, T-A-Y.
Speaker 2 (38:16):
So it did exactly what you're saying. So they fed it a bunch of stuff, and they didn't really constrain what it ate on the websites, and it hit a bunch of white supremacist sites, and then basically it was saying Sieg Heil, and it was saying, like, a whole bunch of anti-Semitic stuff, and it used the N-word with people in casual
(38:39):
conversations.
So Microsoft shot it in the head and they never revived it again.
Wow. That's not programming. That's not programming. That is its interpretation of the data that was input to it.
Speaker 1 (38:54):
Right, right. So that's kind of where I think AI security comes into play, right, where it's kind of more about monitoring what the model is consuming and trying to figure out. See, I always view it, and people at NVIDIA argue with me on this, as like an AI model hierarchical system where
(39:18):
you have an overarching AI model that you want people to consume and interact with and whatnot, and then that AI model is fed off of other models that are looking at specific topics. So it's almost like that model gets specialized into a certain area, like maybe world history or European history or, you know
(39:40):
, sports, right, the finance industry. And when that model reaches a certain level of maturity, it starts feeding that upper-level model that information to query, to interact with, for users like us to start querying it and building different things from. I think that might be the only way to do it.
(40:00):
But again, you know, NVIDIA, those geniuses over there, they argue with me that that's not a great way.
Speaker 2 (40:07):
Yeah, I would argue that too, actually. I mean, part of the problem here. Here's another weird comment that I don't think a lot of people make. We have hit, for the first time in the computer lifetimes, we have hit the point where you can no longer backchain why a computer had an answer. We could always do that before.
(40:28):
It might take a while, but if somebody said, goddammit, why did the computer do that, why did it shoot down that airplane, you know, a week later somebody's going to tell you why. That day is gone. With generative AI systems, it is completely impossible to backchain those guys and to get
(40:48):
like a stack dump and find out exactly why they did what they did. I mean, that's pervasive across this industry. So as that continues, you know, you're going to... well, that was like the racist bot at Microsoft. They knew what it was, because they went back and looked at the websites that it was looking at.
(41:09):
But in the future, there are already so many of them, how would you know? And it's not like they have to look at whitesupremacist.com to pull it in. They can just go to Twitter, or X, or any of a number of other ones, and they can find all of that crap in free speech forums.
(41:30):
So, from a cybersecurity viewpoint, to go back to your question, I don't think you can look at the input, I think you have to look at the output. So I think cybersecurity for AIs is going to be like, it's like you have an attack dog, and it's been trained, and you're walking around with it on a leash to make sure it doesn't
(41:51):
bite anybody.
And I think that is what AI is going to be like, because you won't know why it's doing it, and there will not be a human-understandable correlation of causality between reading this post on X and deciding to use the N-word in a forum. You just won't know. So you just have to wait until it screws up, and then you have
(42:15):
to roll up a newspaper and hit it in the nose. Huh.
Speaker 1 (42:20):
That's... it's fascinating. I feel like we could go for another hour just talking about any of the 10 topics we just dove into. You know, unfortunately, we're almost at the end of our time, but I really want to dive into the stuff that you're working on now, right? So you were discussing building or
(42:41):
working on, you know, Web3 and post-quantum. So talk to me a little bit about that, because I don't want to butcher it, and this can get pretty complex.
Speaker 2 (42:49):
No, actually, I wanted to do that. So I'm working with a company called Naoris. It started in Portugal, but it's a global company, and it's a Web3 company, and the founder developed some really cool security approaches where you have these little lightweight processes that can be very quickly ported to any
(43:09):
device, you know, routers, computers, whatever, and even IoT devices. And then, when there is a possible attack, or some suspicious-looking, you know, packets start coming in, instead of going out to, like, Microsoft or CrowdStrike or something and saying, is this okay?
(43:30):
What it does is, it has a blockchain attached to it, and it has a vote, not just its computers but other networks that are in, like, the big meta network. So some computer, you know, in Berlin will vote on this based on their profiles, you know, like virus profiles, right?
(43:51):
So there'll be that kind of thing, and it works really, really well.
And the demo we've been using is, we have a robot arm and we hit it with an attack, put a virus on it. When we do it again with our system running, it deflects the virus and won't accept it as input. And so now we've added
(44:13):
post-quantum onto that.
So the attractive part about this system, to me at least, is it's decentralized. So if you're a company and you buy a system like this and you run it and something goes wrong, it's your IT guy's fault, it's not Microsoft's, and I think that's very empowering.
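For readers curious how a decentralized "vote on suspicious input" might look, here is a minimal toy sketch. To be clear, Naoris's actual protocol is not described in the episode; the node names, the majority threshold, and the reject-by-default rule below are all illustrative assumptions, not their implementation.

```python
# Toy sketch: distributed nodes vote on whether incoming traffic is
# benign; the input is accepted only if a majority agrees. Node names,
# the threshold, and the default-reject rule are illustrative only.

def quorum_verdict(votes: dict[str, bool], threshold: float = 0.5) -> bool:
    """Accept the input only if more than `threshold` of the voting
    nodes consider it benign (True)."""
    if not votes:
        return False  # no quorum reachable -> reject by default
    benign = sum(votes.values())
    return benign / len(votes) > threshold

# Two of three hypothetical nodes flag the packet -> rejected.
votes = {"berlin-node": False, "lisbon-node": False, "nyc-node": True}
print(quorum_verdict(votes))  # False
```

The design point the speaker emphasizes is that no single vendor is the oracle: the verdict emerges from independent networks comparing profiles.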
(44:33):
That's interesting.
Speaker 1 (44:35):
We spent the whole time talking about how that control or that empowerment is going away from us and more towards technology, or these thousand-pound gorillas in the industry, and that's interesting, how it's bringing it back, how it's bringing that ownership back to us, almost, in some ways.
Speaker 2 (45:03):
Well, I think, I mean, I'm kind of an anarchist at heart, really. You would never know from my background, but whenever I see things getting too institutionalized, it gets my hackles up.
And the government never bothered me, because I'd been in the government, and the government's fundamentally incompetent, no matter who's president, and they can't really do things. They say they're going to do things, but it takes them, like, a decade to do almost anything.
(45:24):
The thing to worry about is guys like Zuckerberg, you know, and those people, the billionaires that are not stupid, that have lots of assets. Elon Musk is probably a better example, because he'll do almost anything, potentially, if it suits his interest. I worry about those. So the more we take our technology out of these people's
(45:48):
hands, the better off we are.
Speaker 1 (45:50):
Yeah, well, it also enables us to maintain our own privacy, right, which has been something that, you know, doesn't really exist.
Speaker 2 (46:01):
I wrote a couple of books on this. I wrote a book called Privacy Lost, still on Amazon. It's been there for 14 years. I predicted a lot of this stuff, and I think that the trick to privacy is you have to accept the idea that the old definition of privacy is irrelevant. Privacy is not binary, it's not, and baby boomers talk about it,
(46:24):
and Gen X people. They go, oh, I lost my privacy, oh, I got my privacy back. It's not virginity, it's not like that. It's not binary. Privacy, it's like uptime on a network, you know, 99.9999. It's like four nines or three nines or two nines. That's what privacy is, and you have to expend a certain amount
(46:45):
of energy and time and money to achieve each granularity level of privacy. But people don't want to spend that money, because they think they're entitled to it anyway. So that's going to be a problem too.
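The "nines" analogy can be made concrete with standard uptime arithmetic. The mapping to privacy is the speaker's analogy; the yearly period and the numbers below are ordinary availability math, not figures from the episode.

```python
# "Nines" of availability: each extra nine shrinks the permitted
# exposure window tenfold, which is why each level costs more effort.

def allowed_downtime(nines: int, period_hours: float = 24 * 365) -> float:
    """Hours per year of downtime (or 'exposure') allowed at a given
    number of nines."""
    availability = 1 - 10 ** -nines  # e.g. 3 nines -> 0.999
    return period_hours * (1 - availability)

for n in (2, 3, 4):
    print(f"{n} nines -> {allowed_downtime(n):.2f} hours/year")
# 2 nines -> 87.60, 3 nines -> 8.76, 4 nines -> 0.88
```

The tenfold jump between each level is the point: going from two nines to four nines of "privacy" is a hundred times less exposure, and correspondingly more expensive.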
Speaker 1 (47:01):
Yeah, yeah, that is definitely going to be a problem.
I feel like, to some extent, that whole debate, that whole talk, has almost kind of been pushed to the back burner in some ways. You know, I always remember the first time I went to Germany for a study abroad in college. When I was on the
(47:23):
plane, it had just broke from the Snowden leaks that we were spying on Germany, right? So when I get off this plane, I have a connecting flight to make it to Berlin. I'm in Germany, I'm in Dusseldorf, right, and I'm going through customs, and this guy doesn't want to stamp my passport, and I'm sitting here like, hey, man, I have a flight in 20 minutes.
(47:44):
I have to run across an airport in Germany. I don't know where I'm going to catch this flight, and I hopefully get the right one, right? And so I started to argue with him, and eventually, because there were no TVs around me or anything like that, he eventually just stamped it. His boss came over and stamped it, and by the time I get to my
(48:05):
gate, I sit down for five minutes, and I see "America spying on Germany since whatever year," and I was like, that's not good for me, because now I just came here and I yelled at that guy, and they're probably looking at me a different way now. But I mean, of course we were.
Speaker 2 (48:23):
Everybody spies on everybody. And, you know, it's like this TikTok thing, which is absolutely ludicrous. I mean, it's not ludicrous to think that TikTok is gathering personal information. It's ludicrous to think they aren't, and in fact, I would be shocked if they weren't doing that. And guess what? I bet Meta does it, and Instagram and Facebook, and I
(48:47):
bet Elon Musk does it with X, and I bet, you know, every one of these social media platforms does it. Microsoft does it. Surely you've noticed this, but the software you used to buy, like Microsoft Office or Adobe Photoshop, they have switched to these serialized license models, which require a lot more information
(49:10):
from you, and so not only do they want the money, they also want the information.
Speaker 1 (49:16):
Yeah, well, these products, you know, they can be free to some extent because we're the product. You know, they're taking our data and they're selling it to whatever broker, and, you know, it's a mess. And I don't know how we come back from this perspective without having something like Web3, you know, widely deployed,
without having something likeWeb3, you know, widely deployed,
(49:39):
widely accepted and, you know,building from there.
Speaker 2 (49:44):
That's why I'm interested in Web3. I mean, my basic meta thought on this is, I think individuals need to be armed with cyber weapons. Like, when I was at Network Solutions, I was running a thing called the internet, which is the DNS system and other stuff, and I had to defend
(50:04):
the first, as far as I know, institutional denial-of-service attacks. Big ones. No one had ever seen one before, and they were really stupid, and anybody today could have stopped it. They were just, like, smurfing on some broadcast address. But we had to decide what to do, because there was no precedent
(50:25):
and no policy.
And I made the decision: let's find out the IP address and let's, like, blow them up out of the water. And we did, and that was my approach. If somebody did that to us, I would find out what network they were at, and I would blow their network out of the water. And then I didn't have to worry about smurf attacks.
(50:46):
So I don't even think you can do that now, but yeah.
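For context on the smurf attack mentioned above: the attacker spoofs the victim's source address on ICMP echo requests sent to a network's broadcast address, so every live host on that segment replies to the victim at once. A rough back-of-the-envelope sketch of the amplification (the traffic rate and host count below are illustrative, not from the episode):

```python
# Smurf amplification, back of the envelope: one spoofed echo request
# to a broadcast address draws one reply from every responding host,
# all aimed at the spoofed (victim) source address.

def smurf_amplification(request_bps: float, responding_hosts: int) -> float:
    """Estimate reply traffic hitting the victim, assuming each host on
    the broadcast segment answers the spoofed request once."""
    return request_bps * responding_hosts

# A 10 kbps stream of spoofed pings to a /24 with 200 live hosts
# becomes roughly 2 Mbps of echo replies aimed at the victim.
print(smurf_amplification(10_000, 200))  # 2000000
```

This is also why the attack is easy to stop today: routers stopped forwarding directed broadcasts, and ingress filtering drops the spoofed packets at the edge.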
Speaker 1 (50:51):
It's like Battleship. Yeah, it's like playing Battleship.
Speaker 2 (50:54):
It absolutely is, and that's how things are with the Chinese and the US right now too, right?
Speaker 1 (51:01):
Well, David, you know, we're at the top of our time. I try to be very conscious of my guests' time. You know, when I say it's an hour, it's an hour. But I mean, this conversation has been very fascinating, very engaging, and I definitely want to have you back on.
Speaker 2 (51:22):
This is a fantastic time. Yeah, well, thank you. I've enjoyed it too, and I think the stuff you... I looked at some of your other ones that you've had guests on, I mean, the stuff you're doing is really relevant right now.
Speaker 1 (51:29):
Yeah, I try to be. You know, I don't want to put out, like, dated information. I want the podcast to actually, you know, have value and show value to my listeners.
Well, you know, David, before I let you go, how about you tell my audience, you know, where they could find you, where they could find your company that you're doing this great work with, if they wanted to learn more. The company is called
(52:00):
Naoris.
Speaker 2 (52:00):
It's a Portuguese word, it's N-A-O-R-I-S dot com, naoris.com. You can find me at DavidHoltzman.com, or GlobalPOV.com is another website I use, and my email address is on there, and if anybody wants to reach out, I'm pretty accessible.
Speaker 1 (52:14):
Awesome, awesome. Well, thanks, David, for coming on. I'm definitely going to have to have you back on. And, you know, thanks, everyone, for listening. I hope you really enjoyed this episode, and more to come in 2025. Thanks.