
January 7, 2026 • 40 mins

Whether you’re chronically online, or only log on to post a quarterly dump, it’s important that you know where and how the data you’re sharing on the internet is being used. Oftentimes, we accept, we say yes, and we check a box on the screen just to get to what we were looking for, but exercising choice in those moments is a crucial way to protect your identity and take control of your digital footprint.

Here to talk with us today is Camille Stewart Gloster, an attorney and strategist working at the intersection of technology, cybersecurity, national security, and foreign policy. Camille has advised top leaders in both government and major companies like Google on cybersecurity practices, and I’m excited to have her on today to talk about how we can begin to protect ourselves from the risks that come with existing in digital spaces.

About the Podcast

The Therapy for Black Girls Podcast is a weekly conversation with Dr. Joy Harden Bradford, a licensed psychologist in Atlanta, Georgia, about all things mental health, personal development, and all the small decisions we can make to become the best possible versions of ourselves.

Resources & Announcements

If you'd like to take the info from this episode a step further, we invite you to join us on Patreon for the 5-Day Digital Declutter Challenge, designed to help you clean up, reset, and redefine your digital life.

When you join, you’ll get:

  • A free Digital Identity Audit Worksheet
  • Daily declutter prompts to reduce digital overwhelm
  • Community conversation and support
  • A Live Sunday Night Check-In where we’ll work through the worksheet together and reflect in community

If you’re ready to start the year feeling lighter, clearer, and more intentional online, this is your next step. 👉🏾 Join us on Patreon
🗓️ Starts January 7th, 2026


Where to Find Our Guest

Website: https://camillestewartgloster.com/

Stay Connected

Is there a topic you'd like covered on the podcast? Submit it at therapyforblackgirls.com/mailbox.

If you're looking for a therapist in your area, check out the directory at https://www.therapyforblackgirls.com/directory.

Grab your copy of our guided affirmation and other TBG Merch at therapyforblackgirls.com/shop.

The hashtag for the podcast is #TBGinSession.


Make sure to follow us on social media:

Instagram: @therapyforblackgirls

Facebook: @therapyforblackgirls


Our Production Team

Executive Producers: Dennison Bradford & Gabrielle Collins

Director of Podcast & Digital Content: Ellice Ellis

Producers: Tyree Rush & Ndeye Thioubou 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:11):
Welcome to the Therapy for Black Girls Podcast, a weekly
conversation about mental health, personal development, and all the small
decisions we can make to become the best possible versions
of ourselves. I'm your host, Doctor Joy Harden Bradford,
a licensed psychologist in Atlanta, Georgia. For more information or

(00:33):
to find a therapist in your area, visit our website
at Therapy for Blackgirls dot com. While I hope you
love listening to and learning from the podcast, it is
not meant to be a substitute for a relationship with
a licensed mental health professional. Hey y'all, Happy New Year,

(00:58):
and thank you so much for joining me for
session four forty five of the Therapy for Black Girls podcast,
and the first installment of our twenty twenty six
January Jumpstart series. We'll get right into our conversation after
a word from our sponsors. Whether you're chronically online or

(01:22):
only log in to post a quarterly dump, it's important
that you know where and how the data you're sharing
on the internet is being used. Oftentimes we accept, we
say yes, and we check a box on the screen
just to get to what we're looking for. But exercising
choice in those moments is a crucial way to protect
your identity and take control of your digital footprint. Here

(01:44):
to talk with us today is Camille Stewart Gloster, an
attorney and strategist working at the intersection of technology, cybersecurity,
national security, and foreign policy. Camille has advised top leaders
in both government and major companies like Google
on cybersecurity practices, and I'm excited to have her with

(02:05):
us today to talk about how we can begin to
protect ourselves from the risks that come with existing in
digital spaces. If something resonates with you while enjoying our conversation,
please share it with us on social media using the
hashtag TBG in Session, or join us over in our
Patreon to talk more about the episode. You can join
us at community dot therapy for Blackgirls dot com. Here's

(02:28):
our conversation. Thank you so much for joining us today, Camille.

Speaker 2 (02:41):
I am so excited to be here, Thanks for having me.

Speaker 1 (02:44):
So we're very excited to chat with you about a
topic that I think is on a lot of people's
minds in terms of our digital footprint. So the January
Jumpstart series is really all about like a metamorphosis and
becoming the next version of yourself. And as someone who
has been online for some time, how do you think
about your own digital presence and do you feel like

(03:05):
the story that it's telling is the story that you
still want to tell?

Speaker 2 (03:09):
Yes, I'm very thoughtful about how I show up online.
Particularly once I became a parent, it changed my entire outlook.
So to begin with, I've always been very intentional. I've
worked in cybersecurity my entire career, and often cybersecurity professionals
are not very public facing. But I want to change
the face of cybersecurity. I want to make these issues

(03:31):
dinner table conversation. Quite frankly, I feel like we alienate
people by having cybersecurity conversations at this super technical or
theoretical level that people feel alienated from. And so I've
made it kind of my mission to empower people in
and through technology, whether that's getting folks into cybersecurity or
tech in general, or just having them be a bit

(03:53):
more intentional about how they navigate the space. And so
I try to lead by example. I'm thoughtful about whether
something is public or private and what I say, and
particularly as things change with AI, folks need to be
really thoughtful about how they show up.

Speaker 1 (04:08):
So what was it about becoming a parent that you
felt made you rethink how you showed up online.

Speaker 2 (04:14):
We're lucky because when we came online, there was kind
of a moment of realization. I was in college, late
college when Facebook came out, and you know that people
might see what you posted, and you had some air
of privacy about you. And I've watched subsequent generations, particularly
ones that are on technology from the day they're born,

(04:37):
really be exposed, whether by choice or by force by
their parents, and the consequences of that. Having been at
a social media company and looking at it through this
big tech lens, whether I was at Google or even
thinking about it from a public policy perspective, from a
national security perspective, and then just that individual security perspective.

(04:57):
And one of the things I appreciated about my journey
navigating technology is I got to make choices. Even if
there were mistakes, they were my mistakes to have made,
and I was able to make them at a point
where I was informed enough and able to do the
learning to decide how I wanted to show up. And
we've taken that decision away from our children. In large part,

(05:18):
we are putting them online before they even understand what
that means, and many of them wake up fifteen years later,
twenty years later and don't love the kind of exposure
that they've had. So I want to gift that to
my child. But also there are some real security and
privacy concerns inherent in making your child very public in

(05:39):
the ways that are happening right now. It's no longer
you know, just sharing on Facebook that's just you and
your family friends, or you and your cousins. It is
wide reaching, and with AI, putting an emoji over their
face doesn't even help. So I want to be intentional
about giving her the safety and security she deserves, but
also gifting her the ability to choose how she shows up.

Speaker 1 (06:01):
Yeah, so you mentioned that you've been in lots of
very cool places, some that require very tight levels of security.
So you've been in the highest levels of government. You
mentioned that you worked with Google. How have you been
intentional about cultivating your digital presence and what have you
learned from working in those kinds of spaces that you
think is also important for like the general community to know.

Speaker 2 (06:20):
Be intentional, be thoughtful, take the time to think about
what you want to say about yourself and who you are.
I often hear people talk about it as like your personal brand.
That's fine, your personal brand, your professional brand, however you
want to think about it. What are you saying with
every account that you set up? Does it tell a
cohesive story about who you are in the appropriate context.

(06:43):
We're moving to a world with AI browsers really soon,
and I think a lot of people are really excited
by that, and it's a really cool invention. But if
you think about what that means for how people search
the internet. You're no longer putting in a search and
getting a list of results. You will get a curated
narrative about yourself. So as you think about how you
show up online, that should be the frame through which

(07:04):
you think about it. If someone else were curating who
you are, if they are to pull from all the
available sources and tell a story about you, what story
is that? And so as I engage with every platform
and choose whether it is a personal engagement, so just
for a closed network of people or a professional one,
I make sure that it aligns with me, my values,

(07:25):
the things that I want to say publicly, and ensure
that I'm willing to stand behind anything that I post.

Speaker 1 (07:31):
Yeah, so a very good reminder for folks to kind
of think twice about what you're sharing.

Speaker 2 (07:35):
Right, it never goes away.

Speaker 1 (07:37):
Yeah Yeah, And I definitely want to hear more about
the AI browsers. So we've started some conversations here on
the podcast around AI and what to be on the
lookout for, how it impacts our communities. But it feels
like every time we have a conversation, like there's thousands
of new things that we didn't even know since the
last conversation. So this is the first time I'm hearing
AI browsers, So I definitely want to get into that

(07:59):
with you. But I also want to talk about you know,
like I think that there is far more online about
each of us than many of us even recognize. So
can you talk about like what the general average person's
digital footprint actually looks like?

Speaker 2 (08:13):
That's a great question. So your digital footprint is likely
all of your social media and it is a culmination
of data you've input into a bunch of random sites.
I don't know if you've ever googled your name, but
you should if you have not, and what you'll likely
see is maybe a website if you have one, some

(08:35):
posts from college or from high school where they're talking
about an accomplishment or achievement, anywhere you've ever been featured
on the internet or someone else mentioned you. Some of
that might be curated, and some of that might be
an ancillary message, but it creates kind of a story
about you. But there's also a bunch of data from

(08:55):
whether data leaks or from sites with for or malicious
intent that are kind of aggregating your data. So you've
probably seen those like online directories that pull together your
address and your email and your phone number and then
try to connect you to your cousin and your mom.
And those are directories that we all probably hate. You're like,

(09:20):
oh man, how did they get my address? And that
is probably from some random something or from some data leak.
There are tools that can help you clean that up.
Those things are unfortunately unavoidable. You'll find them about every person.
But there are tools like DeleteMe and Kanary, with
a K, that can help you scrape the internet for email,

(09:42):
social security number, phone number, any sensitive data about you
that you do not want to appear on these websites,
and do the work to help you get them pulled down.
And that's something I recommend for everyone in a world
where people get canceled and when people's attention is turned
on you, it could mean getting doxxed, which means people

(10:03):
finding you, or swatted, having a SWAT team show up
at your house. What you want to be able to
do is protect your physical space as much as you're
protecting your digital space, and so using sites like that
to kind of clean up your footprint, the things that
go beyond your ability to control, like you didn't put
them up there. You should retake that control and get

(10:26):
that content down. So that's what your digital footprint looks like,
the stuff that you intentionally put up there, like social media,
your wedding website, all the random things, and then a
bevy of aggregated data from random sites and random breaches
that tell a story about you, sometimes correct and sometimes incorrect.

Speaker 1 (10:44):
More from our conversation after the break. So, Camille, you know,
I know occasionally I will get an email or like
an actual physical letter from some company, maybe like an
insurance company or something, and like they actually inform you, like, hey,

(11:05):
there was this breach, yes, you know, and you get
maybe three months of like data protection or something, but
it doesn't feel like every company does that, right? Like,
are there instances where your data may have been leaked
and you didn't even know about it?

Speaker 2 (11:19):
Oh yes, because how they choose to notify you depends
on some legal requirements that are not consistently applied. They're
very different by state, by country, all of those things,
but also a lot of it's kind of voluntary. Responsible actors,
companies that really want to do right by their users,
will make sure that they understand, will provide some kind

(11:41):
of resources to help scrape that data, but many don't.
So you'll hear about it on a news report, or
you won't hear about it at all, depending on scale
and scope. So that's why you have to be on
the lookout for irregular activity on your bank accounts, your
name popping up. I actually recommend that everyone set up
a Google Alert, or there's a site called Talkwalker

(12:02):
where you can set up an alert about yourself just
to kind of see what's floating around the internet with your
maybe your business name, your address, anything like that, so
you can see as those things start to pop up
online and do something to take it down.
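For readers who want to check this for themselves, Have I Been Pwned (haveibeenpwned.com) maintains a searchable index of known breaches. Here is a minimal Python sketch against its v3 breached-account API; it assumes you have subscribed for an HIBP API key (that endpoint requires one), and the key value and user-agent string below are hypothetical placeholders, not real credentials.

# Minimal sketch: list known breaches that include an email address,
# via the Have I Been Pwned v3 API. Assumes an HIBP API key; the
# key and user-agent values below are placeholders.
import json
import urllib.error
import urllib.parse
import urllib.request

HIBP_API_KEY = "your-hibp-api-key"  # hypothetical placeholder

def breaches_for(email: str) -> list[str]:
    """Return the names of known breaches that include this address."""
    url = ("https://haveibeenpwned.com/api/v3/breachedaccount/"
           + urllib.parse.quote(email) + "?truncateResponse=true")
    req = urllib.request.Request(url, headers={
        "hibp-api-key": HIBP_API_KEY,
        "user-agent": "footprint-check",  # the API rejects requests without a user agent
    })
    try:
        with urllib.request.urlopen(req) as resp:
            return [b["Name"] for b in json.load(resp)]
    except urllib.error.HTTPError as err:
        if err.code == 404:  # 404 means no known breach includes this address
            return []
        raise

if __name__ == "__main__":
    for name in breaches_for("you@example.com"):
        print("Found in breach:", name)

A one-off lookup like this complements the alerts Camille describes: an alert tells you when something new appears with your name on it, while a breach check tells you what may already be out there.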

Speaker 1 (12:16):
Unfortunately, I feel like there have been several cases of
like high profile and maybe some not even so high
profile Black women specifically who have gotten doxxed, right,
because of maybe something they said online or maybe activist activity.
Can you talk about why it's even more important for
black women, beyond privacy concerns, to be mindful of their
digital footprint.

Speaker 2 (12:37):
Yeah, we are in a divisive time where our comments
are weaponized against us, where DEI, diversity, equity, inclusion, I
actually like to say it to make people actually claim
what the words are, is being vilified and denied, and
people are using that to push hate. And what often
happens is the loudest voices in the room pushing conversations

(13:02):
of equity and equality, or the most marginalized groups, often
feel the brunt of that. And so what we find
is a lot of Black women are standing up and
speaking out because they feel the brunt of the pain.
We saw six hundred thousand Black women laid
off in recent months. We've seen, to your point, a
number of attacks on political figures. That kind of targeting

(13:28):
is endemic to the kind of natural leadership roles black
women tend to take on, particularly in the pursuit of
equity and human rights. It's far reaching: that might be
equity in the tech sector, that might be equity in healthcare,
it could be anything. But with the amount of visibility
each person gets with social media and with the Internet,

(13:49):
and with all of the new tools at our disposal,
a lot of us become targets when we didn't anticipate it.
You know, you might think political figures and journalists take
that on as part of the job because they're under
public scrutiny. But we all have a bit of public
scrutiny now, and so we all have the potential to
be subject to that. That shouldn't make us silence ourselves,
but it should make us be intentional about using our

(14:11):
levers of power to protect our presence online and the
way people get access to us, whether that's digital or physical.

Speaker 1 (14:18):
And you mentioned services that you could sign up for
like DeleteMe and Kanary. Are there other additional layers that
you would use, because those are like after something has happened, right,
Are there things that you can do to be preventative
in terms of protecting your data online.

Speaker 2 (14:33):
Yes, so those are both proactive and reactive in the
sense that you'll find data that's online before somebody acts
on it, hopefully. But some really important things are really
small behavior changes that might seem like a little bit
of an annoyance to add them to your routine, but
they're going to make a huge difference. First, turn on
multi-factor authentication on everything. I'm sure you see notifications

(14:57):
about it too, if you've been forced by some apps to
do it. That is really important because when someone gets
a hold of your password because it was leaked in
a data breach, they still can't get into your account
because they need to have the other credential to get in.
That's like saying I'm not gonna put a dead
bolt on my house because I have a standard-issue

(15:20):
like door handle lock that everybody has like a universal
key to. I mean, technically, I guess it could keep
somebody out, but it's not keeping everybody out, So let's
arm ourselves. Update your software. Most of those pushes include
some kind of security patch. Use a password manager, and
stop reusing old passwords from college, like that one password

(15:41):
that's the password to everything has got to go, because
what that means is when they get access to your whatever,
they have access to so many things, and they're gonna
test every site that you use to see if that
password can be used there as well. The other thing
you shouldn't do is authenticate into something with something else.
So you'll often see use your Google, use your Facebook

(16:04):
to get into this other site. I don't recommend that.
It's so easy, but when you do that, you make
this kind of connection that we're talking about, particularly
if it's done without connecting it to multi-factor authentication.
So they get into your Google account, they've gotten into
all of these other sites and can potentially even lock
you out of your Gmail account. Keep your work and

(16:25):
personal accounts separate, and consider using one of the privacy
focused browsers like DuckDuckGo, and block the trackers.
I know the cookie requests are a little annoying when
you get them, but deny them. The less access you
give for them to be tracking you when you're off
their site, the better. You are going to expose information

(16:49):
By virtue of using a site, you know you're exchanging
access for some kind of service, but you don't have
to give them more than what they need. And the
last major one I'll talk about, because we all use
our phones so much, is to be thoughtful about the
permissions you give each app and review your apps routinely. So,
for example, in the middle of the pandemic, everybody was

(17:10):
using it. There was this app that kind of let
you play games together and almost have like a video chat,
and it asked for your contacts. It asked for access
to like your flashlight, all kinds of random things. Why
do you need my contacts for me to initiate a
call with a group of friends. You don't really in

(17:32):
most cases. So I denied that, and then I denied
access to my flashlight because I don't understand why you
need that. And I was thoughtful about each thing that
I declined, because the app is going to work if
you only accept the things you need, which was camera
and audio. Right, that's all I needed to do a

(17:53):
video chat. But what we started to see is your
contacts are like a roadmap through your history. Your ex
boyfriend or girlfriend is in there, your cousins are in there,
your parents are in there. You think that's not a
lot of data about you, but that is a huge
amount of data, particularly when you think about that next
to all those sites that are trying to make connections
between you and all the people that you know and

(18:14):
put all of your data out there. You're giving them
so much access to yourself. So just be thoughtful about
what you say yes and no to, and try to
like allow access only when I'm in the app, or
allow once for some of these apps. Small steps can
really make a big difference in how much access an
app or a potential malicious actor has to you in

(18:35):
the future.
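On the password point specifically, you can test whether a password has already appeared in breach dumps without ever sending the password itself anywhere. Have I Been Pwned's Pwned Passwords endpoint uses k-anonymity: you send only the first five characters of the password's SHA-1 hash and do the matching locally. A minimal sketch, assuming nothing beyond the Python standard library:

# Minimal sketch: check a password against the Pwned Passwords API.
# Only the first five hex characters of the SHA-1 hash leave your machine.
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    """Return how many times this password appears in known breach data."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    # The API returns every hash suffix sharing our 5-character prefix,
    # so the service never learns which password we checked.
    url = "https://api.pwnedpasswords.com/range/" + prefix
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    hits = pwned_count("password123")  # the classic reused-since-college password
    print(f"Seen in breaches {hits:,} times" if hits else "Not found in known breaches")

A hit doesn't mean your account was breached, only that the password itself is circulating, which is reason enough to retire it and let a password manager generate a unique replacement.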

Speaker 1 (18:37):
Know, Krill, Even as I'm listening to you talk about
all these things, I'm starting to feel a little overwhelmed
by like all of the things, not in
a bad way, but like all of these things are
so connected, right, and I'm thinking, you know, like even
like your newspapers will ask you to connect through like
your Google account or through your Facebook account, and it's like,
surely whoever is in charge of it at these companies

(18:59):
knows that that's not the safest way, but that's
like the path of least resistance, right, to get you
to sign up. But there are all these things that
you have to do and really be on top of
to try to make sure that you're protecting your data,
which I think can feel overwhelming, like how do I
stay on top of this? And so my next question
is around like the psychological impact of trying to stay

(19:20):
on top of data and privacy, and even just like
the digital clutter, like all the pictures we have and
all the apps we have, can you talk about just
the impact that has on our mental health and how
it impacts us psychologically.

Speaker 2 (19:31):
I'm so glad you brought that up because I know
that people feel overwhelmed by all this conversation about technology,
and I want people to see them as small steps
that have big impact. Right. Turning on multi factor authentication
is going to help even if you have used one
app to authenticate into a bunch of different things. Like
even if you have to use your Google or your

(19:51):
Facebook to get into a bunch of other apps, if
your Google has multi-factor authentication, and then those apps do too,
you've created a bunch of resilience that you didn't have before.
So I want people to think about that. But you know,
in my line of work, I do get to see
all of the things that happen, and there is a
huge mental toll not only on practitioners who work in
the space, but everyone. What I challenge people to do

(20:14):
is think about this as a version of your physical safety.
In your physical safety, you trust the police, the fire
department to secure your neighborhood, to put out fires, to
do the big heavy lifting, and then you take your
sphere of influence your home and you put on a
dead bolt, you turn on an alarm system, you close

(20:35):
the windows, you close your garage, you take the precautions
you think are necessary to protect yourself, and those two
things in concert help create a safer ecosystem around you.
I want people to be intentional about that, and I
think that helps relieve the mental load quite a bit.
But if you are a person who's very vocal online

(20:55):
and you are speaking out against all of the injustices
in the world for marginalized communities and speaking your truth
in a world where your truth is often a discrete perspective,
you are likely to be thinking about the potential harm
that could come from that, whether it is doxxing like

(21:15):
we talked about, or getting swatted, or just hateful messages online.
There is a lot of harm in even just reading
the comments. So I recognize that the space can be
a harmful manifestation of the best and worst of what
we are as people and flood you with that best

(21:35):
and worst. Right, it used to be that you'd have
to be able to come find me physically to say
your negative or positive comments, and so the scale of
that was negligible, even though it could still be harmful. Now,
every troll, every random can say something to you about
the work that you're doing or about your personhood, and
they often do make it very personal. And so one

(21:56):
of my pieces of advice for people is to really
find opportunities to separate yourself, to be in community in a
physical sense, disconnect from your device. Also, with a lot
of the information that's flooding us these days, I try
to do more of a pull than a push, meaning
I don't want notifications to flood me all the time
about what's happening in the world or what's happening on

(22:18):
my post. I'm going to go with intention to kind
of pull that information out of the app. Let me
find out who's been commenting on my xyz post, let
me see what's going on in the news, and that
kind of behavior really does help to preserve your mental health.
One of the things we're seeing is this desire for community.
You're seeing it with AI chatbots that have become boyfriends, girlfriends, companions,

(22:40):
best friends, and that's really worrisome because their inclination is
to reinforce and to promote like a positive perspective on
anything you say, even your darkest intentions. And so I
also just want to encourage people to remember that technology
is just technology. It is not human connection, it is not
the source of counseling. It is not your best friend.

(23:03):
It's not going to give you real advice. It can
be a thought partner, but it is still just a
piece of technology that is only as good as the
data that's put into it, and only as good as
the coding that built it. And so those reminders usually
are helpful for folks as they put this in context
and hopefully stay grounded in their real lives so that

(23:23):
the mental health aspects can be mitigated.

Speaker 1 (23:26):
So you mentioned AI several times. I echo your concerns,
especially as a you know, as a psychologist around you
know some of the suicides we've seen linked to AI, unfortunately,
and you know the ways that people are really using
AI to mimic human connection. Can you talk about some
of the ways that AI makes this whole conversation around
like data privacy even more nuanced than what we should

(23:50):
be paying attention to.

Speaker 2 (23:52):
Oh yes. I mean, AI as the newest technological underpinning is
fundamentally changing our society, how people connect, where people connect,
how people gather information. So I mentioned earlier AI browsers,
and what that is is your ChatGPT, your Claude,
your favorite Gemini being embedded into a browser and kind

(24:13):
of those AI overviews you're starting to get now when
you do a Google search are the breadth of what
you see when you are searching for something. Think of Perplexity,
but times ten because it's your whole browser and you'll
eventually link back to a website. But what you see
first is some curated answer based on how that model
reads all of the information about you or about that

(24:34):
topic online. That is a real change in how we
consume information. It becomes even more important for all of
us to be discerners of truth and to understand how
to research and to think about information integrity. Who made this?
When did they make this? Why did they make this?
Can I validate it anywhere else? Is that just one

(24:55):
source saying this, one outlet? And so it'll also mean
more work from us. We want to be rooted in truth,
rooted in information that is fact based, and that's a hard,
dynamic shift and means also on the privacy and security side,
some new avenues for people to get access to you,
for people to understand who you are, for people to

(25:15):
form an opinion about you that might catalyze them to action. So,
for example, you're doing all of this work, and if
an AI browser or just an AI model in general
provides a synopsis of your work that's not favorable to someone,
if they're unwilling to do the work to figure out
exactly your perspective and your point of view, that might

(25:36):
make you the subject of their ire, the subject of their
bad intentions, and then have them act out in accordance.
So there's definitely a heightened need to practice these privacy
and security behaviors, and I hope folks do see them
as small because they will provide a lot of far
reaching protection for you. And be thoughtful about how you
use AI. AI doesn't have to do everything for you. Don't

(25:58):
forget how to read, write, think for yourself. The thing
that your job loves the most is your ability to systems-think,
your deep understanding of your area of expertise, and the
societal context, the cultural context. All of those things are
things AI can't mimic, So don't lose that in the
pursuit of leveraging AI to be more efficient or more effective.

Speaker 1 (26:24):
So earlier, Camille, you talked about even with your little one,
like even putting an emoji, which I think is a
popular way that parents, like, will share cute pictures or
cute things, and they add an emoji, thinking like that
that's actually protecting. But it sounds like that is not
even enough, especially with the advent of AI. Can
you say more about that?

Speaker 2 (26:42):
Yeah, And I'm guilty of this too. At first I
was like, oh my god, I could just put emojis on
my baby's face? No, no, no, AI has figured out
how to scrub that clean, and now people still have
the face of your child. And what you've seen in
a lot of unfortunate child sexual abuse material is predators
leveraging your photos to create really disgusting imagery or they

(27:06):
get off on things that you just wouldn't expect. I
really hope we're past the place of people posting their
kids in the bathtub, but that is a highly searched
category on some of these disgusting child pornography sites. But
even without that, even if it's just your child on
a playground, there's the potential. And don't get me wrong.
I recognize the value of community and sharing your family
and all of those things. So I'm not making a

(27:28):
judgment call on people who have decided that they do
want to put their children on social media. I just
actually want people to be intentional about it. What are
you posting? What context around your child are you posting?
Is that account private? Is it public? How old is
that child? There are ways to mitigate the risk even
if you still want to engage in sharing your child's wins, successes,

(27:49):
evolution with the people you trust and hold dear. I
completely support that, but there's a real risk out there,
and so I hope parents just think intentionally. Actually, we
put out a toolkit at Digitalfluency dot tech for
parents and educators to help them have conversations about technology
around AI and to just align their technology use to

(28:11):
their values. What you often find in toolkits is like
a turn this on, turn this off, and this is
more of a how do you make a decision that
aligns with your family values? With sleepovers, there are some
families that are like, okay, you go to whoever's house,
and there are some that are like, you are never
going to go on a sleepover, and most people fall
somewhere in the middle, whether that falls closer to one

(28:32):
side or the other, and they do that, they make
that choice based on trust, based on their values, based
on a whole host of factors. You can do that
with your tech use too. Does my child get to
dive right in and use everything or do they use nothing?
It's probably not either of the two. It's probably somewhere
in the middle, and it'll probably move based on their age.

(28:52):
And I think that's the way that we should all
be thinking about it. Just a little intentionality would provide
a lot of protection for us and our families.

Speaker 1 (29:01):
More from our conversation after the break. So, I think
when a lot of us think about like, Okay, I
want to take some steps to clean up my digital footprint,
we often start with social media. And I'd love for

(29:21):
you to talk through like the distinction between deleting and
like deactivating yourself on social media sites. Are those the
same thing? And what happens to your data? Let's say
if you do delete your account?

Speaker 2 (29:34):
So, it depends on the site. They usually tell you in
their privacy policy if deleting means that they will actually
delete the data that they are holding about your profile.
Sometimes they do, sometimes they don't. You disabling your account
does not mean deleting the data. They're two separate actions, so
you've got to do both if that's what your goal is.

(29:55):
You know, there's a third category, I would say, and
that's just kind of making it really private and being
thoughtful about who you let view your account and like
who you let into your circle. I think that's probably
where most people can and should fall. I think deleting
your social media account is a great opportunity to leverage

(30:15):
your buying power to speak to your values. So, for example,
you'll see a lot of people jump off of Instagram
and Facebook because they feel like the way meta has
evolved doesn't align with their values, and I'm sure you'll
start to see that in the AI space as well. Now,
that can be tough because if everybody's on Instagram or

(30:36):
everybody's on TikTok, it's kind of tough to not be
in the mix, and so then how do you moderate
your behavior accordingly? But I do think that contemplating deleting
or disconnecting, even if it's just for a temporary detox,
is a tool at your disposal to think about how
you are speaking to these companies about what it is

(30:57):
that you value and how you'd like them to show
up in your life.

Speaker 1 (31:00):
You know, Camille, I think that there's often this pressure
and tension that exists, especially for Black women who are entrepreneurs
and creatives. So much of visibility is tied to like
your next gig or partnerships or speaking engagements. Right, how
do you balance the need for privacy and protecting your
mental health with the very real benefits that often come

(31:22):
with being visible online?

Speaker 2 (31:24):
Yeah, that is about being intentional with what you are doing.
Your professional persona or your professional pursuits require you
to be visible and do that well. But that doesn't
mean that you have to film in your home or
talk about your home life, or talk about where you

(31:44):
went on vacation. There can be boundaries, and so my
recommendation to people is not to stay offline. As you
can see, I'm online. I think it is a great
opportunity for connection, to get business, to meet new community,
to understand how the things that you're passionate about align
to other issues that are important in the moment just

(32:06):
for entertainment and recreation. And so there are a number
of reasons to engage online, do that thoughtfully. If it
is about business only or business accounts, keep it business.
If you are creating a professional-personal persona, kind of
this like hybrid, what are the boundaries on that. Is
it that your family isn't a part of the content?

(32:26):
Is it that your child isn't a part or children
aren't a part of the content? Is it that you
want to protect your family your parents? Really be thoughtful
about what those boundaries are and then go for it.
But just know also what you've put out there and
then react accordingly. So, for example, if you decide that
you've got this business personal hybrid persona that you want

(32:46):
out in the world, because a big part of your
professional pursuits is your personal professional brand, the information you
share as part of that should not become your password.
That shouldn't connect back to the ways that you seek
to protect yourself and your family. So just be thoughtful
about those things and you should be fine.

Speaker 1 (33:05):
You know, Camille, I feel like being online, there are
so many things I learned that I just never would
have thought about, you know, like seeing people on TikTok unfortunately,
being able to guess somebody's location based on like the
angle the sun comes into their living room, or oh
I stayed at this hotel and now I see someone
else and then like sharing that information in the comments section, right, Like,

(33:27):
I think that there's just so much of that happening
that you know, sometimes I think happens mindlessly, but it
does impact our safety. Are there other things that you'd
like to call attention to in terms of digital privacy
or cybersecurity that we haven't touched on that you think
are important for people to know?

Speaker 2 (33:43):
I would just say, complementing that, nothing is foolproof,
and the technology and the circumstances change all the time.
Like I said, I thought putting an emoji over your
child's face might be okay for a time, and then
realize it wasn't, so you adapt. But if you complement
that with some of those security behaviors that I've talked about,
like using 2FA and having a really good password manager,

(34:04):
updating your software, you'll build in a lot of protection
for you and your family. If you find yourself the
subject of some kind of online abuse or harassment, there's
a good book called How to Be a Woman Online:
Surviving Abuse and Harassment, and How to Fight Back by
Nina Jankowicz. There are also a number of folks online

(34:25):
that talk about these issues all day. You don't have
to do the heavy lifting. Let me and other people
look at the privacy policies for websites, look at the trends,
and highlight for you areas where you should or shouldn't
start to engage. Also, reject the instinct to be
a first adopter. I don't know how many of you

(34:46):
have heard about Mattel embedding AI into some of their toys.
They've been thinking about that. Please do not give your
child an AI toy without having understood exactly what information it's
going to collect and what it's going to do for
the child, how it's going to affect their learning. You
don't have to be the first adopter on these things.
I would say your best bet actually is to let

(35:10):
it roll out, see how people use it, see where
some of the harms pop up, and then adjust your
use accordingly. That'll really help you as you navigate the
space and just stay connected. Just be thoughtful and adapt
your behavior as things come up.

Speaker 1 (35:24):
I was not aware that Mattel was embedding AI in
toys. That's new news to me, but it does
bring up a great question. You've already mentioned your daughter.
I also have two sons, and thinking about this is
a whole new thing now I have to talk about
with the kids, right? The schools are already talking about AI,
they're using it. What kinds of things do you think

(35:44):
as parents and caregivers are important to talk to our
kids about in terms of AI and the way that
it's evolving in our society.

Speaker 2 (35:52):
First and foremost is to have conversations about it. This
should be a dinner table conversation, talking about the latest tools,
talking about your fears and concerns, talking about how
you're navigating it. Talk to your child about, oh, I
use this chatbot for a title for my article, but
I won't use it to write the article. And here's why.

(36:13):
Talk to them about the kinds of information they get
and how reliable or not reliable it is, the limitations.
That conversation piece goes a long, long way. Your kids
are creating their values and their boundaries based on what
they hear from you and your candor about your fears,
your concerns, and you figuring it out is going to
be a big part of them wrestling with and understanding this.

(36:35):
That inclination to be skeptical about a technology, skeptical about
when you get a random email. Continue to reinforce that
with your children. Tell them to lean into that; those
instincts about things that are genuine versus not will be
a skill that they need long term. But don't keep
them away from technology. Moderate it based on their age.

(36:57):
I actually wrote a series about this where we talk
about by age group, what are some of the things
that you should be thinking about, What are some of
the tools at your disposal, some of the resources that
explain well what your child's capacity is. Because the goal
is to both make them digitally resilient and savvy and
able to navigate the latest technology, because that will be

(37:19):
an increasing part of their future, particularly as they transition
to being workers and employees and thinking about their future.
But they also need to recognize that there is a
world of human connection, and anchoring their use in the
things, the values and the connections and dynamics, that
they hold dear. And that's a tough balance if you're

(37:40):
not having constant conversation. So as you think about your children,
think clearly about what the limits should be on how
much TV, on how much they use the computer, what apps,
where are the places where you should be imposing a
limit and how are you checking in to make sure
that you're understanding how these tools evolve. You're gonna have

(38:01):
to be a little bit active on that, just like
you would be on where they're going to go after
school or who they're interacting with in real life. This
is going to take our active engagement as parents, because
we'll be learning as they're learning, but our lived experience
around how harm can manifest itself and how much access
people should and shouldn't have to your kids based on
their age is going to be integral to them figuring

(38:23):
this out, to you both figuring it out.

Speaker 1 (38:26):
This has been so helpful, Camille. Thank you so much
for sharing all this information with us. Where can we
stay connected with you? What is your website as well
as any social media channels you'd like to share?

Speaker 2 (38:35):
Yes, so you can find me at
Camille Stewart Gloster dot com. You can also find me
at Camille Esq on Instagram, and I also have
a Substack called Command Line with Camille. That's where you'll
find the articles I talked about with age appropriate, like
alignment on technology use. You'll find articles about the changing

(38:58):
AI ecosystem and cybersecurity, and that is Substack dot com slash
Camille ESQ. That's usually where you can find me in
most places, and I look forward to connecting with you all.
Let me do the heavy lifting on what's changing, when
and how and how you secure yourself, and then you
do the light work of implementing it.

Speaker 1 (39:19):
Perfect. We'll be sure to include all of that in the show notes.
Thank you so much for spending some time with us today.
I appreciate it.

Speaker 2 (39:23):
My pleasure. Thank you for having me.

Speaker 1 (39:29):
I'm so happy Camille was able to join us for
today's episode. To learn more about her and the work
that she's doing, be sure to visit the show notes
at Therapy for Blackgirls dot com slash Session four forty five,
and don't forget to text two of your girls right
now and tell them to check out the episode. Did
you know that you could leave us a voicemail with
your questions and suggestions for the podcast. If you have

(39:51):
books you think we should read, or movies we should watch,
or topics you think we should discuss, drop us a
message at Memo dot fm slash Therapy for Black Girls
and let us know what's on your mind. We just
might feature it on the podcast. If you're looking for
a therapist in your area, visit our therapist directory at
Therapy for Blackgirls dot com slash directory. Don't forget to

(40:11):
follow us on Instagram at Therapy for Black Girls and
come on over and join us in our Patreon community
for exclusive updates, behind the scenes content, and much more.
You can join us at community dot Therapy for Blackgirls
dot com. This episode was produced by Ellice Ellis, Ndeye Thioubou,
and Tyree Rush. Editing was done by Dennison Bradford. Thank

(40:32):
y'all so much for joining me again this week. I
look forward to continuing this conversation with you all real soon.
Take good care,