Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Camille Stewart Gloucester is a strategist, attorney, and executive whose
cross-cutting perspective on complex technology, cyber, national security,
and foreign policy issues has landed her in significant roles
at leading government and private sector organizations, including the White House,
the Department of Homeland Security, and Google. Camille builds global cybersecurity, privacy,
(00:22):
emerging technology, AI, and election security and integrity programs in complex
environments for large companies and government agencies. And she is
our guest today. This is the Black Information Network Daily
Podcast, and I'm your host, Ramses Ja, and I am
Q Ward.
Speaker 2 (00:43):
All right. So, Camille Stewart Gloucester.
Speaker 1 (00:45):
That feels so formal since we've talked and we have
a bit of a rapport now, but welcome to the show.
It's so exciting to finally have you on. These are
some conversations we've been looking forward to having, and you
are just the person to have them with. So you know,
first off, how's your day going so far?
Speaker 2 (01:02):
How you feeling?
Speaker 3 (01:03):
First thanks for having me. I'm feeling great. It's a
what day is it? Wednesday?
Speaker 4 (01:07):
And it's beautiful and we're about to have our election,
so I'm ready for that.
Speaker 2 (01:13):
Yeah, I absolutely think that we're ready for that first up.
Speaker 1 (01:17):
You know, I talked a bit about your background, but
I didn't go into any detail. Again, one of the
things that we like to do here is make sure
that our listeners know who we are talking to.
Speaker 2 (01:27):
So tell us a little bit about who you are.
Speaker 1 (01:31):
We start our stories at the beginning, so you know
maybe where you were born, how you grew up, what
motivated you to get into the field that you're in,
and sort of what led you to today's conversation.
Speaker 4 (01:40):
Yeah, so my parents immigrated here from Jamaica. I grew
up in Ohio. My dad is a computer scientist and
my mom's a college professor and a nurse. And so
I grew up knowing that I wanted to do something
in the tech space, but I also wanted to be
a lawyer, and so I used to make them sign
(02:01):
contracts when they made me a promise. And how that manifested itself through my
education and career was realizing that what I liked about the
law was complex challenges and the intersection of society and
those complex challenges, and what better place to do that
than technology?
Speaker 3 (02:19):
And so I found myself working on cybersecurity issues.
Speaker 4 (02:22):
And really pulling together my technical acumen with my legal
background to kind of live at the intersection of people, technology,
society policy.
Speaker 5 (02:34):
So you don't have to really depend on hypotheticals, given
your experience with election integrity in twenty sixteen and in
twenty twenty. In your opinion, or in your experience rather,
what are the most pressing cybersecurity threats facing this twenty
twenty four presidential election, and how should the candidates address them?
Speaker 4 (02:57):
Yeah, Well, first and foremost, there has been a lot
of work done since twenty sixteen to build resilience in
our election infrastructure. So it is unlikely that a foreign
adversary is going to be able to tamper with our
elections such that it changes the results.
Speaker 3 (03:14):
And I want people to feel comfortable in that.
Speaker 4 (03:16):
There has been a huge investment in remediating any of
the vulnerabilities in our election infrastructure. That said, China, Russia, and
Iran are all trying to confuse everybody. They would love
to promote and sow confusion and discord and erode the
(03:38):
trust in our elections by releasing things. I mean, recently there
was a video that claimed that there were, you know,
fake ballots and things like that. I mean, that is
probably the biggest threat, right? Any attacks on election infrastructure
are pretty distributed: every state, every county. The elections are
(03:59):
run at such a local level that those differences across
localities mean that there is no one uniform attack that
could stop our elections, that could keep us from counting
the ballots or from figuring out who was elected.
But that trying to sow confusion is the biggest threat,
(04:20):
and there's been a lot of great work done at
the state level and at the federal level to try
to combat that.
Speaker 1 (04:26):
So my question is, how do you know that it's
China or that it's Russia or you know, Iran or
whatever foreign entity. How do you know that it's like
a foreign agent that's doing the interfering and not just
people who are conspiracy theorists or whatever. That's part A. And then
(04:48):
the second part is what should people How can people
themselves vet the information that they're consuming to know whether
or not there's an attempt at influencing their opinion
on the election. So how do you know where it's
coming from? And then how do people themselves determine whether
(05:09):
or not they're getting good information?
Speaker 4 (05:11):
Yeah, you know, people always talk about how the best way
to lie is to be as close to the truth as possible,
Speaker 3 (05:17):
to have as much truth in there as possible.
Speaker 4 (05:21):
So the best way to sow discord, to make people
uncomfortable, is to leverage existing narratives. Maybe they aren't as
far spread as they appear to be because of the
manipulation that you're doing, or maybe they start with the
thread of something that's true but exacerbate that. So I
(05:43):
mean Russia, China, Iran, they're all trying to exacerbate existing tensions: racism, sexism, xenophobia,
all of the things, the social unrest that underpins our
society and makes people uncomfortable and feel things. But
it's the scale, it's the impact, it's how they're being
weaponized against a candidate or against the process, the undermining
(06:10):
of trust, the erosion of your confidence in how things
are playing out, not just your discomfort with people's isms.
And I mean the way that we determine, or the
way that the government determines, that it's Russia, China, or Iran is
that there are a number of tactics and things that are
used commonly by these actors, and they will do a
(06:32):
lot of work on the back end through their intelligence
sources to determine that it is Russia, China, Iran, but
they are pulling on threads of things that are happening.
There are also threat actors, or malicious actors, that are
domestic that might be exacerbating tensions and further polarizing narratives.
It's not isolated to those actors, but those actors do
(06:55):
benefit from sowing discord amongst the US electorate to make
people decide not to vote or to try to move
the votes in a given direction. That requires us
to do some due diligence and to be thoughtful about
the information that we are consuming. And there are a
(07:18):
number of resources. Most of your secretaries of state, or
whoever's running your elections at a state level and at
a local level have websites that put out the truth
about how to vote, where to vote, when to vote,
all of those things, which is a lot of the
information that tends to be manipulated. But also the
Cybersecurity and Infrastructure Security Agency, CISA, that's part of the federal
(07:42):
government has a whole website about election misinformation, and you
can go there and see if something's true or not.
And it behooves us all to be thoughtful about that
and to leverage those resources, so that for some
claim that you heard about election interference or random people
who don't exist voting in someplace, you can see if that's actually
(08:04):
true and if that's really happening. They'll post information
about whether those claims are legitimate so that you can
check your sources, so that you can help debunk some
of this misinformation and disinformation that's spreading.
Speaker 5 (08:18):
That is really interesting, because you said something about kind
of hiding the lie amongst mostly truths. I think when
some common folks, i.e. Ramses and myself, hear things like
cybersecurity and things like that, we think Mission: Impossible espionage movies,
(08:39):
when it could be something as simple as a website
with a different domain other than dot gov that mirrors
what a person's normal "I will vote" type of website
would look like and just has enough information to look
and seem real. How should campaigns and or candidates combat
(08:59):
that in a way that reaches their normal voter, who isn't
really tapped into, you know, bad actors? How could
they prioritize the right type of communication to steer those
who are seeking the right information. Because Ramses and I
know not everyone is actually seeking the right information. Some
people tap into that polarity and amplify it. But there
(09:23):
are people who are looking for accurate, vetted out information.
So candidates and campaigns that are trying to reach those
voters and make sure they have access to the right
policy information, the right voting information amongst these bad actors
that are not always Ethan Hunt level spies. Sometimes it's
(09:44):
just something as simple as, as you already, you know,
communicated, putting enough truth around the lie, enough, you know, of
what people would typically be looking for, and hiding a
little bit of misinformation in there. So what are some
things that campaigns and candidates can do to try to
help their voters or just interested parties sift through that
(10:09):
kind of nonsense.
Speaker 4 (10:10):
Sometimes. Yeah, you've seen a lot of it: multimodal communication,
going to people where they are. Despite all the controversy
around TikTok, you see that VP Harris is on TikTok.
She is talking to people on the platforms and in
the programs that they listen to. She's been on more
podcasts about, you know, women's issues, or just to reach
(10:34):
different communities to make sure that her narrative, the facts
about her campaign, her policy proposals, where people can find
out additional information about her and her candidacy can be found,
and you'll see that at the state and local levels
as well, as folks are going to where people are,
whether that's town halls, whether that's having big speeches in
(10:56):
communities that need to hear the policy platform and the
truth about information. It still is incumbent upon people
to make sure that they're watching her social media platform
versus someone else's, but being available in the places where
people are seeking their information is a really big tool.
And then also protecting those sources of information, like making
(11:19):
sure that you have the security on your social media
accounts so that they're not easily hacked, so that someone
is not usurping your voice and corrupting your narrative. That's
a really important thing, and there has been a lot
of investment by companies and campaigns to support that kind
of resilience and security for campaigns and candidates, and so
(11:42):
that coming together of the broader community is also really important.
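The lookalike-site point from the question, a voting page on a domain other than dot gov, can be sketched as a tiny check. This is a hypothetical helper for illustration, not an official tool; some legitimate state and local election sites do not use .gov, so a negative result means "verify further," not "fake."

```python
from urllib.parse import urlparse

def is_dot_gov(url: str) -> bool:
    """Illustrative sketch: does this URL's host actually sit under .gov?

    "vote.gov" passes; "vote.gov.example.net" does not, because its
    registered suffix is ".net" even though "gov" appears in the name.
    """
    host = (urlparse(url).hostname or "").lower()
    return host.endswith(".gov")

print(is_dot_gov("https://vote.gov/register"))        # True
print(is_dot_gov("https://vote.gov.example.net"))     # False
```

A real checker would also compare against a published list of official state election sites, since this suffix test alone cannot clear a legitimate non-.gov site.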
Speaker 2 (11:47):
So you mentioned TikTok, and that brings me to a thought.
Speaker 3 (11:52):
It always does.
Speaker 1 (11:54):
Yeah, so, you know, I've been inundated with
election coverage; that's what we do here. And so I
vaguely recall seeing a headline, and I don't even know
how true it is, but TikTok is kind of on
its last leg here in the States and it's going away,
(12:16):
and you know, there was a lot of concern about
the capacity, I suppose, for TikTok's parent company to
harvest data on citizens in the United States. Just for
those who are like me and didn't get a chance
(12:36):
to, like, get beyond the headline: talk to us a
bit about what's going on there, whether or not we
were really, like really, at risk for
Speaker 2 (12:45):
the data harvesting or whatever the concerns
Speaker 1 (12:47):
were, and, I suppose, how that impacts, you know,
I guess, government policy insofar as social media platforms are
concerned moving forward.
Speaker 4 (12:59):
Yeah. So China has set up a legal and regulatory
landscape that allows them to get access to companies that
operate within their country, and so with ByteDance being a
Chinese company, China has the capacity to say, I need
access to all of this information. ByteDance claims that that
(13:21):
is not happening and that the Chinese government does not
have access to the personal information of its users. Whether
or not that is true, the capacity, the legal support
for them to make such a request exists, so the
US government has determined that that is a national security threat.
(13:43):
If you think about the best ways to perpetuate a
cyber attack, it is by using attributes about you as a
Speaker 3 (13:53):
person, or your behavior.
Speaker 4 (13:55):
So you might use your mother's maiden name as your
security question for your bank account. I bet you
talked about your mom on your social media, and that
you might have said her maiden name. There
are different ways that that level of access could expose you.
So whether it's a data point like that or something
(14:16):
just about your clicking habits that could indicate how I
could wage a social engineering attack against you that might
get you to click on this negative thing that I
want you to click on. And so the concern is that
the Chinese government could potentially gain access to
a system that you have access to, or information about
you, that just benefits their long term ends. And so
(14:40):
that's the national security concern there, because just think how
many people in your family work at a government agency
or large organization, whether or not they have access
to sensitive information. Once you get access to the information
networks in that company, in that organization, you can move
around and gain access to things that you really shouldn't
(15:00):
be able to see, and so that national security threat
was determined to be too great to allow TikTok to
operate under Chinese ownership. So ByteDance must divest of
TikTok and have some US ownership, or else TikTok gets
banned early next year.
Speaker 3 (15:19):
And so that's the road that we're on.
Speaker 4 (15:21):
Will ByteDance find some US ownership for TikTok to
be able to create the separation that the US government
has determined is necessary, or will the ban move forward?
And so the next president will have a real decision
to make about whether or not to enforce that and
how to drive forward that initiative, and whether or not
(15:45):
that does color what we do on social media in general.
And I mean, it has to, because once we
do that to China, they will do the same to
our companies.
Speaker 2 (15:56):
Right.
Speaker 4 (15:56):
It sets a precedent that there is a level of
access that will become untenable, unappreciated, by different countries. And
so we find ourselves in this push and pull with
companies as they try to service clients and users around
the world. And so we are on the cusp of
(16:16):
a very interesting dynamic that is about to unfold across
social media, particularly between the US and China, but really in general.
Speaker 5 (16:23):
That is so interesting, because to the novice or
naive user, something like TikTok could seem so benign with
regard to national security, But then you realize just how
much you share. Ramses and I talk about our mom,
my mom, who's adopted him and loves him more, and
asks where her baby is when I get home. But
(16:44):
she's talking about him.
Speaker 2 (16:45):
But I don't want to sound too hurt on this call.
Speaker 5 (16:51):
We talk about her all the time.
Speaker 2 (16:52):
And when you.
Speaker 5 (16:53):
Said that, I'm like, Wow, that is such a clever
and unique point of view that I wouldn't have ever considered.
Now you have worked extensively with regard to policy on
national security, how should our government prioritize the protection of
(17:15):
our broader digital infrastructure? In what ways can we further,
as you said, strengthen that infrastructure moving forward?
Speaker 2 (17:24):
Yeah.
Speaker 4 (17:25):
So the office that I worked at in the White House
in the Biden administration is called the Office of the
National Cyber Director. It is a new office in the
White House with the aim to drive towards a vision
of a digital ecosystem that is secure, resilient, defensible, and equitable.
(17:48):
And that office has a real opportunity to kind of
align all of the authorities across the federal government, make
sure everybody has the priorities, resources, and mandate to be able
to execute against the priorities set, but also kind of
deconflict things. And that organization is a real opportunity for
(18:12):
the federal government to lean into what we want the
future to look like. And that's what the National Cybersecurity
Strategy does. It outlines an affirmative vision for what we
want the digital environment to look like. It should reflect
all of us, it should align to our values. It
should be safe. We should want to be able to
put our kids on the internet and not be afraid
of that.
Speaker 3 (18:32):
There should be the right protections in place.
Speaker 4 (18:35):
And so an investment in an office like that, I
think is a really important evolution in how our country
has organized itself on issues of technology and security and privacy.
That outlining of where we want to be, and then making
the investments to get us there, making the policy choices
(18:56):
to get us there, is really, really important. And a
big part of that is how do we invest in making
sure that people understand technology in a way that allows
them to reap the benefits of jobs in technology, but
also to protect themselves and their family. One of the
things that I led at the White House was writing
this National Cyber Workforce Strategy, and what it said essentially was like,
(19:20):
the people are at the center of this.
Speaker 3 (19:21):
I mean, technology shows up in the lives of
Speaker 4 (19:23):
people, and your use of it determines its viability, how
safe it is, how safe it's not, all of these things,
and so there are some skills that everybody should have:
not just digital literacy, which is like being able to
turn it on and off, but computational literacy, which is
understanding the choices you're making, and digital resilience, which is
(19:45):
being able to adapt to changing technology. And that focus.
That focus on making sure that everybody has the skills
to be able to thrive in a digital environment, be
able to work in a digital environment, be able to
protect and provide for their family is a hallmark of
where our government is going and an investment that must
(20:05):
continue as we transition administrations.
Speaker 1 (20:09):
Now, that brings me to the thought that there might be
some populations, some facets of our country, that are particularly vulnerable,
or at least more vulnerable than other facets of the population,
and they might be vulnerable to specific types of things,
(20:32):
you know; poor people come to mind, and historically marginalized
people come to mind. Talk to us about, you know,
I see you nodding in the affirmative. Talk to us
a little bit about that. Who might be more susceptible to,
you know, cybersecurity risks, and what should those communities do?
Speaker 2 (20:56):
Really, what should the government be doing to ensure that
there's a bit more resiliency built into the society insofar
as these vulnerable populations are concerned?
Speaker 4 (21:07):
Yeah, so one of the things that the government is
doing is really investing in supporting, at a state and local
level, the resilience of our education system, our healthcare systems, all
of this critical infrastructure that supports our everyday lives. Because
to your point, yes, there are things that individuals can do,
and I'll talk about those in a second, but shifting
(21:30):
the burden from individuals to bigger organizations that have the
resources and the wherewithal to be able to advance security
and resilience is what we need to do. And so
the government has taken on that onus, but also is
charging big companies like your favorite providers, the Apples, the Googles,
cloud providers, all of those companies with making sure that
(21:51):
security is a part of the design, that they are
privacy forward, so that you don't have to do as
much thinking about
Speaker 3 (21:59):
it, because you don't.
Speaker 4 (22:00):
You shouldn't have to be a cybersecurity expert to be
able to navigate the technology you use every day. It
should be incumbent upon them to do that work for you. Now,
there are still things that you need to do, just
like you're not a physical security expert. You still make
choices every day about how to protect you and your family. Like,
for example, you might walk down an alley because you
(22:23):
know karate, and I'm not walking down it because I
Speaker 2 (22:25):
don't know karate. Right, I'm with you.
Speaker 4 (22:30):
But that choice was informed by your skills, your background,
all of these things, and mine as well, but we
came to different outcomes. That's what I want
for people in the digital world as well. I want
you to feel enough agency and autonomy that you're like,
hmm, I need to put a deadbolt on my door,
which is like, you know, two factor authentication, because I
don't need nobody coming in the front door. That should
(22:52):
not be so easy for you to do. I'm going
to use a password manager because I don't want to
have to think of a bunch of new passwords. But
it's really important for me to have a secure password
or a more complex password, which is like having a
lock on your front door. I want people to make
those little tweaks. Those small things actually remediate eighty percent
(23:13):
of the threats against you. And if you leave the
last twenty to the professionals, we would be in a
much better place. But everybody needs to do those small
things like updating their software, using two factor authentication, having
secure passwords. Those little things actually make you a lot safer,
make the ecosystem a lot safer, and then the big
(23:34):
horrible threats are what national security and cybersecurity practitioners can go after.
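The two factor authentication she compares to a deadbolt usually boils down to a small, open algorithm. As a rough sketch, not any vendor's actual code, here is the TOTP scheme from RFC 6238 that authenticator apps implement; the base32 secret below is the RFC's published test value, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Minimal TOTP (RFC 6238) using HMAC-SHA1, as authenticator apps do."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The code changes every `step` seconds; both sides derive the same counter.
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation from RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at time 59s.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59, digits=8))  # -> 94287082
```

Because the server and your phone share only the secret and the clock, intercepting one six-digit code is nearly useless to an attacker thirty seconds later, which is why this "deadbolt" blocks so many account-takeover attempts.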
Speaker 5 (23:40):
Remind me later to tell you a story about me
seeing some ninjas walking down the street at night.
Speaker 2 (23:46):
It's you at.
Speaker 5 (23:49):
A karate, so I got to make sure I'll tell
you that later.
Speaker 2 (23:52):
We want to recruit them, haha.
Speaker 5 (23:57):
I want to switch gears a little bit. You would
think something as straightforward as cybersecurity, you
would think the reciprocity and the bipartisan support for strengthening
that infrastructure would be present. You would also think, with
(24:17):
regards to careers in that space, that diversity in thought,
diversity in approach, and diversity in the way that people
think when they hear the word diversity in DEI would
be important to everyone. That's not an issue that just
affects me or you; that's something that the entire
country should be able to get behind. You have personally
(24:41):
been a champion for diversity in your field through your
work with Share the Mic in Cyber. If you were
having a conversation with, let's say, our current VP, Madam
Vice President Harris, what are some ways that she
could, or any candidate could, address the need for
more diversity in the cybersecurity workforce?
Speaker 4 (25:03):
Oh, great question, because I've come to emphasize the need
for diversity in the cyber workforce and the tech space in general,
not because it's the right thing to do or a
moral imperative or anything like that, but because in doing the work,
it is incumbent upon us to bring a plethora of
experiences and perspectives to actually understand the threats and the opportunities.
(25:29):
And that's how I would encourage her to explain it. So,
for example, I use this one a lot: WhatsApp. When
it was created, it was, and it still is, an
encrypted messaging app that people use to communicate back and forth.
A lot of people use it while they travel, but
in immigrant communities it has become a lifeline to your family.
(25:50):
When your mama is back in wherever, that's how you
talk to her, your aunties, your cousins, whoever. And it
had become a hub for modern day chain letters. And
in the heart of twenty twenty, when we were dealing
with the pandemic and an election, a lot of pandemic
related misinformation and election related misinformation flowed through WhatsApp because
(26:14):
people put a lot of stock in what their auntie said,
what their cousins said, in a different way. So when your
auntie told you to go boil that bush root tea because
it's going to cure COVID or prevent you from getting it,
you decided to boil that bush root tea, right, and you
thought you could. You didn't need to go get it
back and whatever.
Speaker 2 (26:31):
Right, we don't wait, Yeah, exactly.
Speaker 4 (26:34):
A very different way that information moved through communities because
of that immigration, that immigrant tie. And WhatsApp
had to come to realize that people were putting stock in
that information in a different way, and made some changes
in what you see. So now when you're on WhatsApp,
if it's something that's forwarded, it shows you that it's
been forwarded a number of times. It shows you that
(26:54):
that information wasn't organically created by your auntie; that's not
her homemade bush root tea. And so those little tweaks
in usability, and in the way information shows itself
to you, created a lot more security for individuals and
a lot more information security for individuals.
Speaker 3 (27:13):
But it required.
Speaker 4 (27:15):
somebody who had an understanding, or whose lived experience it was,
to be a part of an immigrant community, to say, yo,
when my auntie sends X, that means something, or I
might download that image because my auntie sent it to me,
my cousin sent it to me.
Speaker 3 (27:30):
And those.
Speaker 4 (27:32):
Displays of how technology shows up in the lives of
people and how culture and background and all of the
things about us intersect with the technology and then change
the nature of its use, change the nature of how
we think about it. That is why we need diversity
and how we think about how to combat threats, because
you're not going to identify that threat if it is
not your lived experience, or you're less likely to, or
(27:54):
it'll take longer, if it is not your lived experience
to have gone through that. And so the best proxy for
understanding the multiple ways that technology shows up in the
lives of people is diversity, and so getting more people
on a team finding other ways to bring other perspectives
into how you think about building out ecosystems or remediating
threats is the best way for us to be resilient
(28:17):
and secure.
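The WhatsApp change described above, surfacing how far a message has traveled, is essentially a counter carried along with the message. A minimal sketch of the idea follows; this is not WhatsApp's actual implementation, and the threshold of five is an invented example value.

```python
from dataclasses import dataclass, replace

FORWARD_LIMIT = 5  # invented threshold for the "forwarded many times" label

@dataclass(frozen=True)
class Message:
    text: str
    forward_count: int = 0  # hops since the message was originally composed

def forward(msg: Message) -> Message:
    # Forwarding keeps the text but increments the hop counter,
    # so provenance information survives each share.
    return replace(msg, forward_count=msg.forward_count + 1)

def provenance_label(msg: Message) -> str:
    if msg.forward_count >= FORWARD_LIMIT:
        return "Forwarded many times"
    if msg.forward_count > 0:
        return "Forwarded"
    return ""  # organically composed, no label

msg = Message("Auntie's home remedy")
for _ in range(6):
    msg = forward(msg)
print(provenance_label(msg))  # -> Forwarded many times
```

The design point is that the label is derived from metadata the platform already tracks, so the reader learns "this is not your auntie's original message" without the platform having to judge the content itself.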
Speaker 6 (28:19):
If my voice didn't matter, people wouldn't be trying so
hard to silence me, and if my vote didn't matter,
they wouldn't work so hard to take it away. So
you know why I'm voting this November because I know
they don't want me to.
Speaker 7 (28:31):
Your voice is powerful, your voice matters. Don't let your
voice be silenced. To register, confirm your voting status, or
get information about voting in your area, visit vote dot gov.
That's vote dot gov. A message from the Perception Institute
and the Black Information Network.
Speaker 1 (28:49):
We are here today with Camille Stewart Gloucester discussing election
security and integrity in the twenty twenty four presidential race.
All right, so a bit earlier you mentioned how information
flowed on WhatsApp and how you know your aunt might
communicate with you, and you're more likely to listen to
(29:10):
your aunt and that introduced a new thought, and so
we're gonna pivot just a bit.
Speaker 2 (29:23):
There's been a lot made of younger.
Speaker 1 (29:27):
People on social media apps and how it affects their
self image, how it affects how they relate to people,
it affects how they engage in school, so forth, and
so on. Right, and there have been elected officials, perhaps
(29:50):
even agencies who have tried to step in to create
some measures and metrics so that we can track how
social media affects younger people and limit the negative implications
of social media on younger people's lives. Right, this is
(30:13):
more of an opinion, and you're in a better position
to have an opinion on this than most folks. How
much involvement do you think that the government should have
in a decision like that?
Speaker 2 (30:29):
And do children?
Speaker 1 (30:31):
You know, because we talked earlier about TikTok. TikTok is,
I'd say, associated more with younger people. Do they really pose
a national security risk?
Speaker 2 (30:42):
And should they?
Speaker 1 (30:44):
I don't know, my question's in there somewhere, but I
kind of want to get your thoughts on this, essentially.
Should the government be stepping
in and saying, oh, you have to be fourteen or
older to have an Instagram account, that sort of thing?
And what risks do you think, at least on a
national stage, do children pose to this country?
Speaker 3 (31:07):
Yeah. I think that it has
Speaker 4 (31:11):
long been a responsibility of our governments to protect our
children in different ways, and to impose additional restrictions to
protect the innocence of children, who are unable to make a
well informed decision about certain environments or engagements that they
are in. That's why there's child protective services. That's why
(31:32):
there's all of these things.
Speaker 2 (31:33):
Okay, yeah, that's fair.
Speaker 4 (31:35):
How it happens, I don't know if they're best suited
to do that. And I think much of what you've
seen is a holding to account of tech companies and
a requiring of them to create the tools necessary to equip
parents to be able to implement an additional layer of protection,
(31:56):
but also to think critically about what they are serving
up to children. So I think the most recent company that
made an announcement was Instagram, about how they've got, like,
teen profiles and children's profiles that kind of limit the
types of things that come to kids. And I think
that's an important advent, driven by this tech company, that
is hopefully informed by the stats and information they're seeing
(32:18):
about how children are being manipulated, targeted, all of these things,
and those are very real things. I worked at Google
as well, and you know, all of the data the
developers and users can get to be able to target
different demographics and send information to them is a very
real thing. And it's one thing for an adult to
be able to discern content. It's another thing for a
(32:40):
kid to be flooded with them. And what we're seeing
is things like their self esteem is being impacted, and
they're comparing themselves to these like manipulated images of women
and grown ups and thinking that their bodies need to
look a certain way. So I think holding organizations to
account for that and then creating the tool for parents
(33:00):
to go the rest of the way is important. Then,
vis-a-vis TikTok and the national security ban: I mean,
getting all this information about the people who will one
day lead our country, China has harvested a lot of
information, and when computers hit AI and they are able to
crunch all of that data, they can use it to whatever end,
whatever their desired end is.
Speaker 3 (33:21):
That's fair. Okay.
Speaker 4 (33:23):
It will be important. It might not be important today, but even that, I
would venture to say, is not true, because if I
can gain access to your child's iPad and get on
your Wi-Fi, I can get on your computer, and
then I can get whatever I was trying to get.
So I wouldn't create as much distance between
a child's use and a malicious action, but it could
(33:43):
just be harvested for later. So there's a range there,
and you'd be surprised how many non-children are using TikTok.
I mean, I don't use TikTok, but companies are using TikTok.
Brands are using TikTok.
Speaker 1 (33:56):
We use TikTok. So you know, that's where you go. Well, I mean, we use it for now.
Speaker 5 (34:04):
You know, it's very interesting having this conversation with you,
because bad actors tend to be callous with regards to
who they harm to reach their end. So using children
as a Trojan horse was something that I didn't even consider
that once you said it, it seemed so incredibly obvious.
You're going to start to notice a bit of a
(34:25):
theme with our questions because in a time of hyper partisanship,
things like national cybersecurity should be straightforward, something that affects
and impacts all of us in a way that you
think we would really come together on this one. Right,
(34:49):
you've had the benefit of working in both private and
public sectors. Is there incentive for the private and public
sector to come together to combat challenges in the
cybersecurity space? And if again you're advising the next president
to try to strengthen these partnerships, what are some strategies
(35:11):
or do you think there is such a thing as
an effective strategy to get everyone involved, even our tech companies,
with making sure that our country is protected from outside
threats with regards to our national security.
Speaker 4 (35:25):
Yeah, I mean public-private partnerships, or public-private collaboration, is essential. The government has one set of data and a view. They've got intelligence sources, they've got all of those things. But private companies, especially tech companies, are on the front lines of use, and so they get a different viewpoint. And when all of those things come together
(35:45):
along with civil society, which gets to do a lot of big thinking about what the future could look like, with all of that, we have better policy that is actionable and informed by the technology and doesn't create as many unintended consequences.
Speaker 3 (36:01):
The shift that.
Speaker 4 (36:02):
Was made in the creation of the office that I
was talking about was really centering public private partnerships, public
private collaboration as part of the policy making process, and
you see that across the federal government. That work must continue.
To your point about cybersecurity not being a partisan issue, or that it shouldn't be a partisan issue: traditionally, it hasn't been. Traditionally, cybersecurity is a space where, whether you
(36:25):
are talking to a Republican administration or a democratic administration,
you will see similar prioritization. Now, the stack rank might
move and shift based on the environment or based on resources,
and based on their priorities, but overarchingly, the direction we need to move in and the work that needs to be
(36:46):
done will be the same. What we saw after the twenty sixteen election, moving into a Trump presidency, was a desire to just rip out things that President Obama had done and had his name on, even things in the space of cybersecurity and technology that traditionally had a lot of bipartisan support or were nonpartisan issues. And so
(37:07):
what we're on the cusp of is another situation like that, where an issue that traditionally and continually tends to be pretty nonpartisan and have a lot of support (although, you know, like I said, things might be prioritized differently based on administrations) could be politicized, and different aspects of
(37:28):
it have become politicized from a more longstanding perspective, like the election context. Right, securing election infrastructure should not be a political or partisan issue.
Speaker 3 (37:41):
We should all want that.
Speaker 4 (37:42):
We want people to vote, We want people to get
out and be able to exercise their civic duty. But
there have been claims of manipulating speech and content moderation
and other things waged against agencies and entities and governments
that are trying to promote that kind of resilience in
(38:02):
election narratives and election infrastructure. That's really problematic because really,
at the base of it, it should just be that
we are trying to let people vote and trying to
let people get the information they need to do that effectively.
So the next administration must prioritize a focus on cybersecurity,
must prioritize the continued work on election integrity and resilience
(38:26):
and securing our elections, and combating nation states that are trying to confuse and sow distrust amongst us about the narratives around our election. That work must continue. How that will continue will really depend on which administration wins. VP Harris has made very clear, in the work that she
(38:46):
has done as VP and then in her policy platforms, that that will be a focus and that work will continue. I was at her twenty twenty campaign speech and she talked about cybersecurity even then. That has long been an issue for her, and that's something that, as a nation that is on the forefront of innovation, is a hallmark of who we are. That should be a priority.
(39:09):
And so that's what we need from the future, from
the next administration.
Speaker 1 (39:15):
Okay, I like that, and I know that we've had
you for a good amount of time, so.
Speaker 2 (39:24):
We won't linger. But we're here.
Speaker 1 (39:27):
Now, and I want to ask you. You mentioned the
next administration, which we're going to find that out in
the coming days. But the truth of the matter is
that Vice President Harris and the former president neither of them.
Speaker 7 (39:45):
No.
Speaker 2 (39:47):
I mean, most people in general don't know as much
as you do.
Speaker 1 (39:52):
About cybersecurity, about technologies, and you know, like just people's
digital profiles and how vulnerable we are and what we
can do to offset some of those vulnerabilities.
Speaker 2 (40:06):
You are the expert. You mentioned.
Speaker 1 (40:12):
Whether or not one candidate gets into office or the
other one that will determine kind of the strategy of
this country moving forward. Right, So, I'm going to hit
you with a hypothetical. Since we've established that neither of
them know as much as you. Let's pretend that you
(40:36):
are the person who gets to decide. Let's say a
genie comes out of a bottle and grants you three
wishes or whatever, a wish, whatever it is. Talk to
us about the version of our national security, of our
(40:59):
cyber profile, our digital profile is a country. Talk to
us about what the things you would like to see,
like a like a best case scenario, a wish list
of secure, positive, accurate information and again profiles for us.
(41:26):
What does that look like for you in your opinion,
for the rest of the country, if you got to choose.
Speaker 4 (41:32):
Yeah. So first, let me just say something about that. We are about to find out the candidate. If we
do not find out who the person is on election day,
please do not be worried. It might take a few
days to count ballots to make sure that we have
(41:53):
an accurate count. This election is frighteningly close, and so I just want to take this moment to remind people that
if on November fifth, we do not immediately know who
the person is. That is not a sign of trouble.
That is not a sign of election interference. People got
to count the votes, We got to figure out what's
going on. So first I just want to take that
(42:15):
moment to say that. Now, for a next administration, the
things that I think that they should prioritize to protect
you as individuals, to set an affirmative vision on technology,
to keep us as a leader on innovation, we need
to lean in on AI.
Speaker 3 (42:31):
We did some great work while I was there on.
Speaker 4 (42:33):
The AI EO, and there was just a national security memo, and I'm talking about national security uses. But we need
to figure out the regulatory landscape. So a next administration
would continue that work, continue to invest in people being
trained not only to work in the federal government, but
to work in the private sector on AI, cybersecurity and
(42:54):
other technical fields. There's so much money, resources, and economic
opportunity there. Plus, if communities that are traditionally not represented
are in them, we are more secure because we have
a better understanding of use. We would be at the
forefront of enabling AI innovation and regulation and thinking about.
Speaker 3 (43:12):
What that looks like.
Speaker 4 (43:13):
We would continue to invest in the cybersecurity of our
critical infrastructure. I want to make sure your hospital can
service you. I want to make sure your kids' school
is not taken down by a ransomware attack. I want
to make sure that your bank is resilient. And so
that investment in making sure that our critical infrastructure is
secure is important. And then also securing all of these
(43:37):
clean energy transition initiatives that we have. It's really important
to make sure that the energy grid that we set
up to support electric vehicles is secure, and that we're
thinking about that interplay so as we make a transition
to a next administration. I mean, there are a lot
of things to prioritize, but those would be at the
(43:57):
top of my hit list to get started.
Speaker 5 (44:01):
So you know this, Ramses, and I know this. Our
work does not stop November fifth. That is not the
finish line. That might even be a jump-off point; that might be when we need to really start getting to work.
Speaker 2 (44:18):
Post election.
Speaker 5 (44:19):
What's next for you and to our listeners who want
to learn more, be more involved, support you? How do
they keep up with you? How do we keep up
with you? Websites? Social media?
Speaker 2 (44:31):
Please?
Speaker 4 (44:32):
Yeah, I mean you can follow me at Camille Stewart Gloucester dot com, or at Camille Esq on.
Speaker 3 (44:41):
Instagram or X, but I don't know, I don't use that that much anymore.
Speaker 2 (44:49):
We feel you, trust me.
Speaker 4 (44:50):
Yes, so, but you can find me Google me. But
for me, one of my priorities right now is we're
doing a lot of work on global governance of artificial intelligence,
and I want to make sure that global governance actually
includes the global majority countries in Africa, the Caribbean, Latin America.
I hosted an event with forty world leaders last week
(45:13):
on that very topic to make sure a summit that's
happening in France to advance global governance included more of
those countries.
Speaker 3 (45:21):
My work will be.
Speaker 4 (45:22):
Focused on all of the things that I outlined for
an administration to do, doing it from my perch outside of government through my company, and making sure that the work continues, and holding our leaders to account, whether they're leaders in
the private sector or the public sector. So yes, please
follow me. I will be putting out information translating these
issues for folks to make sure that you understand how
(45:45):
people are working on your behalf and can hold them accountable.
Speaker 1 (45:48):
Well, I will tell you this, You've definitely translated a
lot for me. I you know, I think I'm reasonably
you know, what's the word, reasonably technologically sophisticated.
Speaker 2 (46:05):
I'm not a hacker or anything like that.
Speaker 1 (46:07):
But you know, I get a new iPhone, I can
you know, set it up by myself. I don't need
an iPhone Master. Yeah, you know what I'm saying, like
and all that. You know, obviously I work in studios
and all this. You know, we're up there all this
sort of stuff, and you know, reasonably, you know, sophisticated.
But you know, as far as the translator goes, you know,
there's there's levels. You know, I'm reminded in conversations like
(46:29):
these because you know, it's it's very easy for us.
Speaker 2 (46:33):
To be vulnerable.
Speaker 1 (46:34):
I don't really have anything I think any hackers will
be interested in.
Speaker 2 (46:37):
But you know, when you when you lay it out,
I beg to differ.
Speaker 3 (46:41):
Well, I was gonna say.
Speaker 4 (46:43):
They are very... Most of these criminals are criminals of opportunity. Just like you drop your wallet and somebody grabs it and they're like, shoot,
I got some cash.
Speaker 3 (46:51):
It's the digital equivalent.
Speaker 4 (46:52):
They just they go try to get whatever you got,
whether it's five dollars or fifteen hundred dollars, we'll.
Speaker 2 (46:58):
Shoot and you try dollars that bad. It's all good,
come on getting it.
Speaker 1 (47:07):
In any event, you know, I do appreciate you taking
the time to share this with us to as you mentioned,
translate this not just to us but to our listeners.
Speaker 2 (47:17):
But I also appreciate you.
Speaker 1 (47:20):
Being the voice in the room. I don't want to speak for you, but I'm gonna speak for you. I
would assume that in a lot of these rooms, you're
the only voice that.
Speaker 2 (47:36):
Looks like you.
Speaker 1 (47:38):
You're the only person who has the perspective of a
black woman. You can bring perspective into spaces that would
just as easily overlook you. Now, I'm not saying these
are bad people, but it's just hard to consider everyone's
perspective if people are trying to do whatever, and so
(48:01):
being present that really matters. And I want to say
this before we let you go. I know that it's
not simply just you being present. I know that there
was a whole lifetime of work that it took
of perhaps you often being the only person in the room,
to get to the point where you're like affecting change
(48:24):
on a national level. And yet and still you mentioned
going to France, and you know, making sure that the
global majority was represented. You might be on CNN after
this show, you might be on ABC News, but this
is the show where we can salute that part of
your story and highlight how significant that is to us.
(48:46):
That matters, to our listeners, that matters, and so thank
you would be insufficient in this moment. We have to
actually acknowledge that you are protecting us in ways that
maybe you know, maybe you don't, but it's up to
us to be able to say that.
Speaker 2 (49:05):
And so, me and Q.
Speaker 1 (49:07):
We had this conversation already, so you know we feel this way collectively. So we wanted to take a moment
to say thank you for doing what you're doing for us.
Thank you for your resiliency on the journey that I'm confident you had to take to get to where you are.
(49:28):
Now, you can bring us into the room with you, you can bring, you know... Q mentions his daughter quite a
bit on this show and how there are heroes like
you that exist, and you know, this path that we're on,
we get to not only learn about heroes like you,
but we get to actually have conversations and learn from
(49:49):
heroes like you, and that matters, and we are the
ones who have to, and honestly get to. It's an
honor to get to be able to acknowledge that. So again,
thank you isn't enough, But unfortunately I'll run out of
words eventually, so I will summarize the rest of what
(50:09):
I could say, what we could say by saying thank
you for all that you've done and all that you're doing,
that that really matters to us.
Speaker 3 (50:18):
I really appreciate that.
Speaker 4 (50:20):
I'm humbled by the acknowledgment and grateful to have the
opportunity to show up for our daughters and our sons
and all of you in these rooms. Because you're right,
I am often the only but it is my honor
and my pleasure to show up and make sure your
voices are heard.
Speaker 2 (50:37):
We'll take it once again.
Speaker 1 (50:39):
Today's guest is Camille Stewart Gloucester: strategist, attorney, executive, and
expert on global cybersecurity, privacy, emerging technology, and election security
and integrity. This has been a production of the Black
Information Network. Today's show was produced by Brandy Urban. Have
(50:59):
some thoughts you'd like to share? Use the red.
Speaker 2 (51:01):
Microphone talkback feature on the iHeartRadio app.
Speaker 1 (51:04):
While you're there, be sure to hit subscribe and download
all of our episodes.
Speaker 2 (51:08):
I'm your host Ramses Ja on all social media.
Speaker 5 (51:11):
I am Qward on all social media as well.
Speaker 1 (51:14):
And join us tomorrow as we share our news with
our voice from our perspective right here on the Black
Information Network Daily Podcast.