Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Stephen King (00:00):
And welcome to another episode of The Incongruent.
Stephen King (00:04):
My name is Stephen King, and I am joined by Lovell, a former student of Professor Steve.
Stephen King (00:12):
Right. So, Lovell, who did we speak to today?
Lovell Menezes (00:16):
So today we spoke to Cordell Robinson. Cordell Robinson is in the cybersecurity space. He is a veteran who served in the Navy, and he's doing a lot of interesting things in the cybersecurity space. We had an incredible conversation with him, and he goes in depth on all these topics.
Stephen King (00:38):
Yeah, we even challenged him to be the new president of the United States, or at least to be part of the governing body and to write new constitutions. We talked about how governments need to change the way they run themselves very rapidly because of AI and its security implications. We're really, really grateful to everyone. If you
(00:58):
like it, if you like our content, please do like the podcast, please comment, share it, tell your friends, and leave us a nice review. But in the meantime, hold on to your hats, because this is the scariest podcast we're having this season. It's on security issues related to AI. And if we're ready, Lovell, are you ready?
(01:19):
Okay, yes, you are. He's nodding his head on mute. So here we go.
Lovell Menezes (01:27):
Good morning, good afternoon, good evening. Today on The Incongruent, we're joined by an extraordinary figure in the cybersecurity space: Cordell Robinson, CEO of Brownstone Consulting, with a rare blend of military discipline, legal training, and deep technical expertise. He's built one of the most innovative veteran- and minority-owned cybersecurity firms in the United States.
(01:50):
His career began in the Department of Defense and continued in federal civilian agencies like the National Weather Service, under the Department of Commerce, where he helped redefine cybersecurity compliance and governance. What truly sets him apart is his ability to translate complex regulations into clear, repeatable systems that
(02:12):
executives and technical teams alike can understand. Outside his professional work, he leads the Shaping Futures Foundation and is now driving automation and AI adoption to make cybersecurity accessible, not just to experts, but to the decision makers shaping innovation. So, Cordell, yours is one of the most multidisciplinary profiles we've seen (02:33): military, technical, legal. Could you walk us through your life journey and how you got to where you are today?
Cordell Robinson (02:43):
Sure, definitely. So it's been very interesting and a lot of fun. When I graduated high school, I went to undergrad and got my bachelor's in computer science and electrical engineering. I wasn't ready for the world yet, so I joined the United States Navy and went into naval intelligence. I spent some time in Diego Garcia, which is in the middle
(03:04):
of the Indian Ocean, then went to flight school, and then ended up in Rota, Spain, where I did ops over the Adriatic Sea during the Bosnia-Herzegovina conflict. After that I got out and moved to Washington, D.C., the nation's capital here in the United States, and went to Georgetown Law. While I was in law school, I was working as a software
(03:26):
engineer for the Department of the Army, so I learned a lot there. And then, when I graduated, I thought, well, I don't want to be a software engineer anymore. It was cool, but I wanted to do something that was going to be needed for many, many, many
(03:50):
years and was going to pay very well. So I got assigned some cybersecurity duties there. And then once I moved over to the Weather Service, and then to Commerce proper, I was able to utilize my legal training on top of my technical training.
(04:11):
So it's been a great journey.
Lovell Menezes (04:13):
Wow. It's very interesting how you made so many transitions, and how you started off in the Navy. And today, to my knowledge, Brownstone Consulting is described as a veteran- and minority-owned cybersecurity services firm. So what does that identity mean in practice, and what
(04:33):
problems do you solve with that identity?
Cordell Robinson (04:40):
So, what does that identity mean? It shows that veterans, former military, are multifaceted, because we've traveled the world, experienced the world, and experienced so many different people and cultures, because we come together from all over the country to serve our country together. And so you get to know so many different people.
(05:01):
There's great training in the military, great exposure and experience. So having a veteran as a business owner is extremely important. A lot of organizations really like to see veterans running businesses after we've served our country, because we have such a broad knowledge, and not just academic knowledge,
(05:24):
but lots of knowledge of the world and of how to deal with and communicate with all different types of people on so many different levels, because we've learned so much. And to me, being African American, or Jamaican African American, I'm considered a minority here in the United States.
(05:47):
Sometimes there can be some challenges, but I think a lot of it is in the grit that you have and what you put into it. So I think it's great to be able to have that designation, especially the veteran designation, and to have served my country, so that
(06:10):
I can bring the values of my culture and my military experience to the corporate world.
Lovell Menezes (06:18):
Yeah, I think it's absolutely great as well, you know, to share your identity with all of us. It's just wonderful, the story that you've shared with us so far. But could you go into more detail about Brownstone Consulting and how you tie
(06:38):
all of this together as a firm, in terms of the services that you all provide?
Cordell Robinson (06:44):
Sure. So when I was working at the Department of Commerce, I helped a company win a substantial amount of money. Another colleague of mine and I were both working on some projects together, helping win business together, and we said, you know, once we're done, let's start our own firm and see what
(07:08):
happens. So we did our homework. We started our own firm back in 2010. And it was a rough start, because we had to learn a lot and navigate through many things, but we did know how to build business. And that was the thing: we knew how to bring in the revenue. So that really helped us move along.
(07:28):
And what I wanted the core values of Brownstone to be is: your go-to cybersecurity services firm for compliance on every level. So we always stay on top of the latest and greatest technologies. We're always taking training. Even as an executive, I still take not just
(07:51):
executive leadership training, but also different types of technology training. I'm taking AI classes every single month to understand AI inside and out, and especially AI security training, AI governance. So it's really important for Brownstone, and for all of my employees, making sure that they are properly trained and they
(08:11):
understand the latest and greatest, so that we can serve our clients and give them the highest quality and best services possible. I want to make sure that Brownstone's reputation globally is as one of the companies you can go to that's really going to help educate your organization, whether it's a company, whether
(08:32):
it's academia, whether it's government, and that we can really bring some insights to you. So I built an ecosystem. I myself have trained basically a cybersecurity army. I held classes at my home every single weekend,
(08:52):
for hours and hours, for around four or five years. That was right before COVID, and then during COVID we took advantage of that time period and really ramped up the training. I really drilled into the trainings, then brought people into my company and gave them positions,
(09:13):
and my company has a reputation that when I provide people to different clients and contracts, either government or commercial, we're considered the best of breed. We always get the highest praises and accolades, because I make sure everyone takes pride
(09:34):
in their work, everyone is very knowledgeable, and we do our due diligence. We make sure that when we walk away from a client, they not only have a more secure environment, but they're also well educated. So it's very important for Brownstone.
Stephen King (09:49):
Now, as a small business owner myself, I'm having to wear the hat of IT security person as well as everything else. And this is not something I've ever wanted to do, but I'm forced to do it because of compliance with certain contracts with suppliers and with my clients. And the sheer simplicity of some of these security tips
(10:13):
is striking. For example, I have to keep an asset list now, which I've never done before. They talk about how you have to have a policy in case there is a risk, in case there is a breakdown, and we say, well, I'll deal with it when I deal with it. So these are the simple things that I have to deal with as a small business owner, but you deal with so much bigger
(10:36):
problems. I mean, right now you mentioned AI. As far as I can see on all my feeds, these AI agents are a massive security risk. Is that something that you're discovering? What are you tackling at the larger end? What's the equivalent of a simple asset list for someone who's got all these agents and AI tools, and
(11:00):
people bringing their own tools into the office? How are you handling that? What are you talking to people about?
Cordell Robinson (11:06):
So the first thing that we do, well, it's kind of twofold. The priority is making sure that people are properly trained, because the biggest threat to AI is humans. But then when it comes to assets, asset management is extremely important. I've found that in most organizations, whether small or large, the asset management is not there.
(11:30):
Either they don't have it, or it is very, very inaccurate. So I say, okay, if you're going to bring a new technology in, have your asset management list. Have a process so that as soon as you bring it in, you add it. Don't wait, because then you'll forget, and then it's there, and then you scan your environment, and then you have all these
(11:52):
things.
I've been to huge organizations, scanned their environment, and they're like, I didn't know I had this on my network. And I'm like, well, then how would you know that somebody is sitting in your environment collecting your data? So making sure that the inventory is right is one of the biggest issues that I've seen at some very important places. I say, do a touch inventory maybe
(12:17):
once a year, or every two years, where you literally touch every single piece of software and hardware that you have and make sure you have your line items. And if you haven't implemented that process, then go ahead: every single time someone brings something in, add it to the list. And don't allow people to just bring things into your
(12:39):
environment. Put some measures in place so that they just can't install software. Block them from installing software and only let certain people install software, and even they still have to fill out a form to install that software, and it has to go through an approval process. Even if you're the only approver, at least you know. Then you can check it against your
(12:59):
asset list and say, okay, they want to bring this in. Sure, they can: I'm going to add it, or whoever you assign to add it, let's add it right now. Or no. And then you check to make sure. Well, one, you put measures in so that they can't do it, but if they get around it and they do it, and you scan and you see it on there, then you make sure that you handle things
(13:22):
accordingly.
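As a sketch of the intake process Cordell describes (request via a form, approve, add to the asset list immediately, then reconcile scans against the list), here is a minimal illustration. The class, field names, and tool names are hypothetical examples, not anything Brownstone actually uses:

```python
from dataclasses import dataclass, field


@dataclass
class AssetInventory:
    """Toy asset list with a request/approve intake step, as described above."""
    approved: set = field(default_factory=set)   # software allowed in the environment
    pending: dict = field(default_factory=dict)  # name -> requester awaiting approval

    def request_install(self, name, requester):
        """Nobody installs directly; every new tool is requested first."""
        self.pending[name] = requester

    def approve(self, name):
        """An approver signs off; the asset goes on the list right away, not later."""
        if name not in self.pending:
            raise ValueError(f"{name} was never requested")
        del self.pending[name]
        self.approved.add(name)

    def reconcile(self, scan_results):
        """Compare a network scan against the list: anything unknown gets flagged."""
        return set(scan_results) - self.approved


inv = AssetInventory()
inv.request_install("notetaker-app", requester="alice")
inv.approve("notetaker-app")

# A later scan turns up something nobody requested or approved.
unknown = inv.reconcile({"notetaker-app", "mystery-agent"})
print(unknown)  # {'mystery-agent'}
```

The point is the last step: a scan only tells you something is wrong when there is an accurate list to reconcile it against.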
Stephen King (13:22):
So we can do that for humans, right? If a human brings in a piece of software and downloads it, then certainly, and there can be accountability. But these new agents, we don't even know what they use. I assume that when I ask ChatGPT, or I ask Claude or something like this, to do
(13:47):
something, like, I asked it to create a scatter chart the other day, and it started programming Python, et cetera, et cetera. How does a company take the benefit of these agents when the agent is going to operate autonomously and might not have these guardrails? Is that a problem, or is that something a good IT
(14:08):
manager will solve?
Cordell Robinson (14:10):
It's a problem. So I tell people, don't rush into putting these things into your production environment, where they're going to touch everything. See if you have a system or a laptop that is not connected to your environment. Utilize it outside
(14:32):
your environment, work with those agents, make sure it's secure, do whatever you need to do, make things efficient, or whatever work you're doing. Scan it, make sure it's clean and secure and that you don't have any bad agents, and then bring it into your environment. I know it's an extra step, but it is a more secure way, because
(14:53):
so many companies just say, oh, I'm going to use ChatGPT. No, don't just do that, because you don't know that data, you don't know where that data's going, and you don't know who has put agents on that data to look into your environment. You don't know, right? And just utilizing it because it's easy, it's simple,
(15:14):
it's making things work faster for you, doesn't mean it's going to help you when it comes to the security of your environment. So I say do your due diligence first. Like, for me, I literally have a laptop that I use AI on that's connected to a separate network. I do everything there, and then I do a scan, and then I bring
(15:35):
it into my network.
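The staged rollout Cordell describes (work with the tool on a disconnected machine, scan it, and only then let it touch the real network) can be sketched as a small state machine that refuses to skip stages. The stage names and the `Tool` class are illustrative assumptions, not a real product:

```python
# Ordered stages a new AI tool must pass through, one at a time.
STAGES = ["quarantined", "tested-offline", "scanned-clean", "in-production"]


class Tool:
    def __init__(self, name):
        self.name = name
        self.stage = "quarantined"  # every new tool starts isolated

    def advance(self, to):
        """Allow moving exactly one stage forward; no jumping straight to production."""
        current = STAGES.index(self.stage)
        target = STAGES.index(to)
        if target != current + 1:
            raise RuntimeError(f"{self.name}: cannot jump from {self.stage!r} to {to!r}")
        self.stage = to


agent = Tool("note-taker")
agent.advance("tested-offline")  # exercised on the disconnected laptop
agent.advance("scanned-clean")   # the scan came back clean
agent.advance("in-production")   # only now does it touch the real network
print(agent.stage)  # in-production
```

It is an extra step, as he says, but encoding it this way makes skipping the step impossible rather than merely discouraged.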
Stephen King (15:37):
And what are you actually looking for? Sorry, I'm just going to ask: what do you look for? Or what do you find? I have heard that these note takers and these calendars can be tricked into sending messages out. You attach your, I won't name a brand, but they get attached to your calendar, and then someone can send a
(16:00):
message to the note taker to send them the minutes of the meeting or something like that. That's something I've heard. What do you usually find? What are the common problems that you find when you're digging deep into this?
Cordell Robinson (16:14):
So that's one of the problems, and that's a big one, because the note takers are so great. They are really good, because you don't have to have someone taking minutes or notes; you just have the note taker, and it's going to do it at the end of the call, all organized for you. It's such a wonderful tool, and everyone wants to use it, right?
(16:36):
But like you said, people can attach things to it and get into your environment. So I say do an audit of the note taker before you decide to bring it into your environment. Reach out to the company. There's Fathom, there's Gemini, and
(16:57):
there are several others; Zoom has note takers. So reach out to the company and ask them: what is your security policy? How are you securing this if I'm going to put it in my environment? And they'll tell you. And a lot of people don't do that.
Lovell Menezes (17:14):
I think it's great to ask such questions. It's something that had never come to my mind when it comes to security and privacy with AI. And you mentioned wanting to make cybersecurity accessible, not just to experts, but to decision makers as well.
(17:34):
So, in your opinion, what do you think are the biggest barriers keeping executives from understanding or acting on cyber risk?
Cordell Robinson (17:45):
I think the biggest issue is that there are a lot of executives, especially some of the older executives, who are not tech savvy right now. The technologies that they've learned are from maybe the 90s or the early 2000s. And because they've been in leadership for so long, they have not kept up with the latest technologies. So they're apprehensive to even learn the new technology.
(18:09):
So they just use the easy things, like ChatGPT or the note takers, and that's it. And I say, sit down with your technical folks, have one-on-one conversations, and learn a few things about how the technology is being used in your environment, so you as an executive know what's going on. Because if something happens, the leader of the company is
(18:31):
called to the carpet, not that IT person. So they should understand that they're going to be held accountable. And if I know that I'm going to be held accountable, I want to learn what's going on technologically in my environment. And I want to understand the technology, and not just know, okay, I have this list of tools in my environment.
(18:52):
Okay, that's great. But how do these tools work? Are these tools secure? Is there governance wrapped around these tools? Do I understand the governance wrapped around these tools? Do I understand the technology? And you don't have to understand it, as a leader and executive, as much as the technologists and AI engineers in your
(19:12):
company, but understand it enough to have those conversations, and enough that you can embrace it and not run away from it. Because, for some reason, some executives are like, well, I'll just leave it to the younger folks; they understand it,
(19:36):
they can handle it; it's just too complicated for me. It's not. I mean, we're going to learn until we die, right? So let's continue on. It not only makes your professional life easier, it'll make your personal life easier as well, by understanding technology, because we live in a very techy world, and it's
(19:57):
becoming more and more filled with technology. And if you don't keep up, then you're going to be left behind. Even when you retire, you still have to understand technology, because everything is on a smartphone, a laptop, or a tablet. It's just there. I mean, when you have to book travel, are you going to
(20:18):
spend hours on the phone talking to someone, or are you just going to go on your phone and quickly book? I mean, what are you going to do?
Stephen King (20:25):
And here in the UK, if you want to get health service, it's all done by automation now. You have to go through that before you speak to a doctor. So, as you rightly say, the older you get, the more you need to have this technical savviness. Lovell, what have we got next?
Lovell Menezes (20:43):
It's a follow-up on what you were talking about. I think it's absolutely necessary for all of us to keep learning about AI and to keep adapting to these new changes, because, as you said, all of this is here to stay, and we are always going to be using these technologies. So, in the world of cybersecurity, do you see AI
(21:04):
being a big threat, or do you see AI being an ally for the world of cybersecurity?
Cordell Robinson (21:13):
Right now it's a big threat. And it's a big threat not because of AI; it's a big threat because of humans, unfortunately. I think AI is probably one of the greatest inventions of our modern times. But unfortunately, what I'm seeing globally is that we humans are all running and racing towards it without doing any due
(21:35):
diligence. And it hasn't caused huge issues just yet, but I see them coming around the corner. So by 2030, or even right before 2030, we're going to have some major problems if we humans don't get educated on it. And that's the biggest thing: to make sure that all of
(21:57):
us humans, everyone, no matter how young or old, get educated on AI, and mainly on the security of it.
Stephen King (22:07):
We've had the millennium bug. I remember the millennium bug. You remember the millennium bug? Yes. Lovell wasn't born. No, I do not know the millennium bug. So what is the equivalent? What are we looking at? What do you predict as being the millennium bug moment with AI security?
Cordell Robinson (22:24):
The millennium bug moment is going to be a lot of power grid outages due to AI. Yeah, it's coming. Okay. That's good.
Lovell Menezes (22:36):
And, you know, just to go on from our previous point, what ethical boundaries do you think we need when embedding AI into security operations, especially in government or defense sectors?
Cordell Robinson (22:55):
So I think the boundaries we need are processes and procedures that everybody follows and understands clearly. So write them in simple terms, so that executives and non-executives, technical and non-technical, understand them across the board, instead of writing them only for the technical, and then enforce those policies and procedures.
(23:17):
Make sure they actually work for your environment, whatever environment it is, because the academic environment is very different from the government environment, which is very different from the corporate environment, which is very different from the medical environment, and then there's manufacturing; there are so many different environments. And creating those things is one of the things my company
(23:37):
does. Those policies and procedures are so important, and enforcing them is so important, because, one, it educates people, and when you're forced to actually follow them, you get into a culture where it becomes second nature. Not only do you adopt those policies and
(23:57):
procedures professionally, but you begin to adopt them in your personal life, because personally, AI can either be great or do lots of damage to you financially.
Stephen King (24:12):
Well, I've already subscribed to so many different tools, so I don't know whether that's what you mean. We just saw, what did we see today? Sam Altman has issued something about this. When Sora came out, it raised a whole lot of things,
(24:33):
and then they withdrew some of the chatbot-type functionality, and now he has said he's going to treat adults like adults. I see that. I don't know whether that's got to do with security, but that's definitely an ethical concern. We've had a previous speaker talk about vision computing, and about how in vision computing they no longer need to
(24:56):
take biometrics to detect who you are. They can detect from your gaze, from your physique, and they can measure intent. Now, you're in the US, where you have freedom of thought, or First Amendment rights, but now, with the use of
(25:17):
AI and security devices, we can actually measure your intent. So philosophically, that fits into this particular question, and into the next one about the constitution. How does this affect personal freedoms, when I don't need to quiz you on your thoughts, I don't need to quiz
(25:38):
you on your political beliefs? I can synthetically determine your intent by the way you look, by the way you walk, by the place you are, by your historical movements. How should we be rewriting the constitution of a government to protect citizens?
(25:59):
And what might be your first clauses if this was to come into effect? I've made you president.
Cordell Robinson (26:05):
You've made me the first Jamaican-American president there. So, one: yes, it is very dangerous. But I think we need to begin, we should have already, a while ago, but we really need to begin writing amendments into our constitutions across the world. Whatever constitution it is, and especially the American
(26:27):
constitution, we need to start writing technology amendments into our constitutions, especially now with AI. If we don't write technological amendments into our constitution, then we're going to have major problems. And it can possibly create lots of wars.
(26:47):
More wars. We already have enough, but even more wars, and civil wars at that, because of it. So I think we should definitely be writing amendments into the constitution, because in America we have freedom of speech, which is the First Amendment, and the right to bear arms, which is, I think, the Second Amendment. And in America, you know, they
(27:08):
love guns. So, okay, now you have AI, and you're carrying a gun, and with the intent, what does that look like, right? And you could be the most harmless person, because there are people that carry guns just for their safety, because they live in certain parts of the country.
(27:29):
And so the AI could be very biased. So writing those laws into place in the constitution about bias, and making sure that the technology code is written so that it won't be biased, because a lot of the code being written is biased either way, right? And making sure that by law they cannot write biases into
(27:51):
the code, and if they do, there are going to be violations. Especially within the US Code, they need to write it, and then they need to make it law in the Constitution, which our Congress, our legislative branch, basically approves. And one of the things is, you know, our Congress is older, and when we get a younger Congress, which is probably
(28:13):
going to happen in the next few years, I believe that's going to be one of the big topics. And I live in DC; I know that I'm probably going to be going to Capitol Hill and talking with a lot of Congress members and Senators about technology and making sure that we really revamp our constitution. And I think we should start a global effort to do that as
(28:36):
AI gets more advanced, but we need to move rather fast, because AI is moving at lightning speed.
Stephen King (28:42):
I think, Lovell, I'm right in thinking the UAE has drones; the police force has drones already. And so when you have this kind of environment and you're plugging it into data, it's going to make decisions based on whatever data is historic, and it's not necessarily deliberate bias,
(29:03):
it's just data bias that it has taken aboard, and I think that's really terrifying. Let's go to something more positive, Lovell.
Lovell Menezes (29:11):
Cordell, firstly, I love the fact that you have a very positive outlook on wanting to implement technology and make these amendments, and that takes us to our next question. We would love to learn more about the Shaping Futures Foundation and how mentorship and community work connect to your professional mission.
Cordell Robinson (29:33):
Oh, awesome. Yeah, so I started the Shaping Futures Foundation, which is my foundation. I have an orphanage in Arusha, Tanzania. You can go to www.shapingf.org. What I do is, I have young kids at the orphanage, and we give them not just an academic education; we teach
(29:54):
them life skills, including technology, so that they can be prepared to compete globally. So it's so important to educate and train our youth. These kids are between six and eight years old, so they're very young, but they're learning the most advanced technology, which is going to help them. And I want, of course, to expand and grow over time,
(30:16):
because it's so important to educate the youth. I think that beyond me just being in technology, as a technologist and an executive, I want to make sure that I give back and that I train the future generations. So it was really important to me to start the foundation and to really put education at the forefront.
(30:36):
And I am working, over the next few years, on actually building schools throughout Africa and the United States that are going to be STEM-based and skills-based, so that these kids learn these skills at a young age. And by the time they're
(30:57):
younger adults, they'll have all of these strong skills. They'll be able to either go out into the workforce or go to university and get more advanced training, and they're going to be some of the top leaders in the world, by starting that training at a young age. So I'm really happy with the work and all of the people who work with the foundation. It's been so rewarding, and I'm looking forward to
(31:20):
many years of hard work and really changing lives.
Stephen King (31:26):
That's amazing. I think we're going to finish there, on that very positive note. And I really thank you so much. I'm really glad that we had this opportunity to speak. It's been two months or so in the making, and it really has been worth it. I've really, really enjoyed speaking with you. I always enjoy talking about security and AI. It's one of the most important topics.
(31:47):
Lovell, would you like to close us out?
Lovell Menezes (31:51):
So, once again, thank you so much, Cordell, for taking the time out and sharing your wonderful insights with us. It's great to see that you're so passionate about the cybersecurity space and that you want to ensure that privacy and security go hand in hand in this field.
(32:13):
We look forward to having more great conversations with you, and I'm sure your insights will be of great value to everyone listening to this podcast.
Cordell Robinson (32:23):
Great. Thank you so much for having me. It's been such a pleasure, and you two are absolutely amazing. Thank you.