
July 25, 2025 44 mins
Hey lady! You know the saying, “No one cares because everyone is going through a lot”? Well, that’s not true. Over here at Cultivating H.E.R. Space, we care. But we also know that everyone is dealing with something, and not everything needs to be shared. That doesn’t mean you need to bottle it all up, though. What do you do when the journal just isn’t enough and you’re not ready to engage your friends in your business?

Terri and Dr. Dom explore the next frontier in this illuminating conversation about the benefits and potential harms of using Artificial Intelligence (AI) as part of your mental and emotional wellness toolbox. There are pros, like when you’re up in the wee hours trying to calm an anxious mind or soothe a broken heart. Instead of doom scrolling, talking to your AI buddy can help you process your thoughts in real time without oversharing with people who don’t need to know all your business.

But, there are points of consideration that you may want to review before you share your darkest secrets or (we know this may be common sense but a friendly reminder) your address and identifying information. 

The truth is, AI is here to stay. And the sooner you learn to adapt to the technology, the better equipped you'll be to navigate our ever-changing world. So why not find a way to make it work for you? Tune in for practical tips on how to create a healthy, intentional relationship with this emerging technology. 

Quote of the Day:
"The essence of therapy has always been relational, and that cannot be replicated by an algorithm." 
– Chris Hoff  

Goal Map Like a Pro Workbook
Cultivating H.E.R. Space Sanctuary  

Resources:
Dr. Dom’s Therapy Practice
Branding with Terri
Melanin and Mental Health
Therapy for Black Girls 
Psychology Today
Therapy for QPOC  

Where to find us:
Twitter: @HERspacepodcast
Instagram: @herspacepodcast
Facebook: @herspacepodcast
Website: cultivatingherspace.com

Become a supporter of this podcast: https://www.spreaker.com/podcast/cultivating-h-e-r-space-uplifting-conversations-for-the-black-woman--5470036/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
On this week's episode of Cultivating H.E.R. Space: So, if
you've ever wondered to yourself, is it weird that
I talk to AI about my feelings? The answer is no,
it's not weird. Right? It's important to know what it
can and can't do. And also, as we evolve as
humans in this digital age, things are going to change.
We're going to be doing things that we never thought

(00:21):
we would have done before because we didn't have access
to certain things.

Speaker 2 (00:27):
Hey, lady, have you ever felt like the world just
doesn't get you? Well, we do.

Speaker 1 (00:35):
Welcome to Cultivating H.E.R. Space, the podcast dedicated to uplifting
and empowering women like you.

Speaker 2 (00:42):
We're your hosts, Dr. Dominique Broussard, an educator and psychologist.

Speaker 1 (00:46):
And Terri Lomax, a techie and transformational speaker.

Speaker 2 (00:51):
Join us every week for authentic conversations about everything from
fibroids to fake friends as we create space for black
women to just be.

Speaker 1 (01:03):
Before we dive in, make sure you hit that follow
button and leave us a quick five star review. Lady,
we are black founded and black owned, and your support
will help us reach even more women like you.

Speaker 2 (01:14):
Now, let's get into this week's episode of Cultivating H.E.R. Space.

Speaker 1 (01:21):
If you're feeling stuck, overwhelmed, or unsure of your next steps,
this is for you. Hey, lady, it's Tea here, and
I just want to invite you to my free Goal
Map Like a Pro coaching workshop, where I'll share the
five proven steps to get unstuck and achieve your goals.
Whether you're feeling overwhelmed by all your ideas, juggling scattered ideas,

(01:42):
or maybe you just need confidence to start, this workshop
will give you the clarity, tools and the motivation to
take back control. Reserve your spot for free by visiting
herspacepodcast dot com and clicking on the Goal Map
Like a Pro webinar link. Lady, don't miss this chance
to get a road map that fits your life
and sets you up for success. I hope to see

(02:03):
you there.

Speaker 2 (02:06):
Our quote of the day: the essence of therapy has
always been relational, and that cannot be replicated by an algorithm.
That quote comes to us from Chris Hoff, and I
want to say that quote one more time for the
folks in the back. The essence of therapy has always

(02:29):
been relational, and that cannot be replicated by an algorithm.
All right, T, I have my thoughts geared up and
ready to go. But I want you to go
ahead and tell me what you think when you hear
this quote.

Speaker 1 (02:50):
Okay, so I was trying to keep myself
from giggling when you started, because as soon as you
read the quote, I was like, oh, let me think,
what am I going to say about this? And I
had a flashback. And we'll dive more into this in
a minute, but I had a flashback which is probably
going to support your statement on how you feel about
this quote, where me and ChatGPT, because I use
it often and I use the voice feature, we're chatting
and I was talking about something very important and there

(03:12):
was a glitch and I said, see, this is why
I need a human, and I was so
like, damn. So I had to repeat what I was
going to say, and it was a little frustrating. And
then another thing I want to say is talking about
the relational aspect. There was another instance where I was
using ChatGPT and the voice changed on me, like
it sounded a little different from the voice that I'm
accustomed to that I set as far as the tone

(03:34):
of voice, and like the way the person, the
way that chat sounds. Here I go with person again, the way
that chat sounded. And I had a little attitude, like,
I want the voice that I'm accustomed to speaking to.
I don't want this new voice. So, facts to Chris
on this quote. What comes up for you? What comes
up for you as you hear this quote?

Speaker 2 (03:54):
I mean, to me, this quote captures just
how I feel, right, that therapy cannot be replicated by
an algorithm. Yeah, that there is a level of humanness
that is brought into the therapy experience that no matter

(04:18):
how advanced chat GPT gets, it can never replicate a
direct person to person interaction.

Speaker 1 (04:29):
And with that said, oh, go ahead. No, go ahead,
I was going to say, and with that said, lady,
We know that as you listen, like you probably have
so many different thoughts about this topic. But let us
lay the foundation, give the context, and we're going to
dive into this conversation in just a bit. Okay, but
first I want you to picture this. All right, It's

(04:51):
two a.m. You just received some jarring personal news. It's
something heavy, confusing, it's emotional, and your heart is racing,
your mind is spiraling. You want someone to talk to,
but your closest friends are asleep, they're in another time zone,
and you don't have a therapist on call, you don't
have a big local community, and in this moment, you

(05:14):
just need to process what you're feeling. So you roll over,
you grab your phone, and you open chat GPT. You
start typing your thoughts, pouring your heart out, asking questions,
searching for comfort, and to your surprise, it helps you
feel heard, You feel clear, you feel lighter. And this

(05:38):
is the reality for some of us. I know it
is for me, and we're gonna talk about it. Okay.
In today's episode, we're diving into the nuances of living
in a digital age where AI may not be your therapist,
but it can be a powerful support tool when you
need one. But we want to equip you with ways
to use it mindfully and also perspectives and things to

(05:59):
consider when you are using it, so that it protects
your peace and keeps you safe, or you protect your
peace and keep yourself safe as you're using these tools. Okay,
so let's talk about it. I do want to say
really quick, Dom, I'm excited about this conversation because you
are a therapist, and I am someone who has a
therapist but also uses ChatGPT as part of my emotional toolkit.

(06:22):
And I know it probably sounds wild to some people
to even say that. I never thought that I'd be in
the world where I'm like, oh, I have a relationship
with this artificial intelligence tool. But this is the age
that we're in, and I think that we're bringing to
this episode a beautiful mix of clinical care and creative
call it self support. Okay, so I'm not saying that

(06:43):
ChatGPT replaces a therapist by any means, right. But we
are saying that there's something powerful about knowing how to
use it alongside your healing work.

Speaker 2 (06:53):
Right.

Speaker 1 (06:53):
So we're going to jump into this conversation. But Dom,
I want to see, like, what are your thoughts so
far as I dove into that visual?

Speaker 2 (07:03):
Oh, you know, my immediate, my immediate reaction is,
but there's still a human for that. Yeah. That there
are hotlines available twenty four to seven. Yes, And I

(07:25):
get that for a lot of people, this feels easier
than or more accessible than calling a hot line. Yeah,
because my guess is that most people assume that calling
a hotline means that there's some suicidal thoughts happening. The

(07:50):
reality is that, no, it does not. You can call,
there are twenty-four-hour hotlines, and you don't have
to be in crisis. You don't have to be having
suicidal thoughts to call a hotline and connect with a
real human. So I want to, so I want to
clear that up for folks. Yeah, and acknowledge that for

(08:14):
some people, particularly people who haven't tried therapy yet, I
can understand why that two am scenario can be a thing. Yeah,
And even with any of my cautions, misgivings, personal feelings

(08:37):
about ChatGPT as a therapist, some help is better
than no help. Yes, but there are cautions, and
we will talk about those.

Speaker 1 (08:48):
Yes, we will talk about those. And that makes sense.
And I appreciate you sharing that, Dom, because it has
to be an interesting, it just has to be, I'm
sure, interesting for mental health professionals and someone who's in
academia as well. So I think there's like so many
layers today that the one thing I want to say
before we jump into what we're seeing in the world
when it comes to chat GPT and the relationships and
like all the things, is I want to just point

(09:08):
out that if you've never called a hotline, Like, just
know that many of us have. Like, I've called a
hotline before. I never thought that I would, but I
was in crisis years ago when I called a hotline.
So just like to normalize that and let you know
that it's okay to use that as a resource. That
is something I did that I didn't think I would do,
and it was so helpful for me. And so I

(09:30):
just want to like break the stigma around that in
case you never met anyone who did like I have, right,
So just wanted to put that out there because I
think it is important for us to normalize the support
that we're getting, because we all need support at some
point in our journey, right.

Speaker 2 (09:47):
Yes, And I want to thank you for putting that
out there, right and normalizing that, because I know, working
on college campuses, we have specific hotlines for our universities,
our campuses that students call, and I know for a fact,

(10:07):
lots of students, like, I've had clients, students that I
work with, that will say, you know, or I'll get
a notification that that student called this line, this twenty-
four-hour line, at three a.m., and not because they
were having suicidal thoughts. Now, they have used

(10:28):
it for that. But I have had students that will
call that line at three am because they knew, one,
they couldn't get in touch with me as that therapist, right.
Two they didn't feel comfortable reaching out to their friends
in that moment. Yeah, and so they called that twenty

(10:48):
four hour line and talk to a live human being
who was able to give them the support that they
needed in that moment.

Speaker 1 (11:02):
Yes, I'm glad that you pointed that out, and I
want to I want to zoom out for a second, yep,
and just talk about what we're hearing and noticing. When
it comes to how people are actually relating to chat
GPT right now, I feel like there are a
couple of buckets, right, and lady, figure out as we share this,
what category or bucket you fit in? So we have

(11:23):
the anti crowd, right. These are some people who are
like, completely anti: I'm not putting my information in ChatGPT,
I'm not sharing anything. They think that it's going
to replace human connection. They don't trust it, and they're
usually, I mean, again, we're not saying this, but some
people feel that using it for support is weird or
even dangerous. Right, those are their perceptions of it. That's

(11:45):
something. Anything you want to add to the anti crowd?

Speaker 2 (11:50):
No, I think you explained it.

Speaker 1 (11:51):
Yeah, okay, what about this next one? Chat GPT psychosis.
Have you heard about this?

Speaker 2 (12:00):
Yeah, I've heard a few stories of this, and to me,
these stories are ChatGPT gone wrong and/or
user error.

Speaker 1 (12:15):
Yeah, yeah, it gets very interesting. I've seen
stories about people who are marrying their AI companion that
they've created online, and again, they're
saying that these are rare, but it's like something to
be aware of, right. And we're going to talk about
some of the cautions in just a bit. People who
are overly dependent upon it, so you can't make your

(12:36):
own choices without consulting it, which feels pretty risky, right,
And extreme cases where it's led to emotional spirals, isolation,
or confusion about what's real versus what's generated. That's another
thing that we're hearing, or another, you know, bucket there.
And again, I kind of alluded to this earlier, but
people in relationships with ChatGPT. Okay, yes, and on relationships, yes,

(13:01):
air quotes on relationships, heavy on the air quotes,
but that is not a human. So, even though
some of us, you can call us lulu, we'd be
saying it's our chat bae, but like, let's be real, let's
be for real. Okay. For folks who feel isolated or
socially anxious, ChatGPT has become their most consistent listener.
I watched a documentary recently or some type of special

(13:22):
where they talked about this, and for people, this is
their experience, right, and they've built emotional intimacy. They've given
it a name, which, I think giving it a name,
there's nothing wrong with that, but they've even described it
as their partner or safe space, right. So this is,
we're not saying this with any judgment, it's just
this is what we're seeing in the world. Anything else
on this one, Dom?

Speaker 2 (13:44):
I think that's it. Yeah, that captures it.

Speaker 1 (13:47):
Let's talk about the silent majority, because chances are, if
you're listening to this, you might be in the silent majority,
which is people who may not outwardly say that they're
using it, but they be using it. They be, they
be, you know, running them emails or social content
through it, or consulting with it at two a.m. Like, listen,
this just happened. I mean, Dom, I think about, I'm

(14:08):
not going to say what it is, but there was
something very personal that happened in my life and I've
told you about it. But when I first experienced this thing,
I went to ChatGPT because I was, like, damn
near devastated. And the thing that got me was that,
because it was something very personal that I didn't want
to share with anyone else at that moment, I was

(14:29):
alarmed at the way that I felt after expressing and
consulting with AI on this, and when I thought about,
oh my gosh, this is how I feel, it just
made me think about other people in the world. I'm like,
I can't be the only one, like other people have
to be doing this. But I also believe that there's
a certain way in which I use it, which we're
going to talk about very soon. That's why I feel
protected and safe. And I just hope that other people

(14:53):
are using those guardrails as they use such a platform, you know.

Speaker 2 (14:57):
Know, that would be the hope. And we'll talk about
how to do that in later.

Speaker 1 (15:02):
Yes, we will. And then the last group here
is the curiously cautious. So these are the people who
are curious, but they're skeptical. Okay, they don't know how
to use it well yet, or they're either afraid it's
going to say something wrong or feel too robotic. Right,
So those are like some of the different groups that
we're seeing. Those are some of the different perspectives that

(15:23):
we're seeing around it. But I want to know, like,
have you had or heard a wild experience of how
people are using ChatGPT these days personally?

Speaker 2 (15:34):
So I'll be honest and say that I haven't.

Speaker 1 (15:37):
Okay, there we go.

Speaker 2 (15:39):
Now I haven't, and I'll throw out the caveat,
or my thought on why I haven't, why I
think I haven't. Because, to your point about that silent
majority that I think that there may be there's probably
a lot of people, more people than I'm aware of,

(16:03):
that are using chat GPT and aren't saying it out loud. Yeah,
so there may be. I'm again, I work at a university.
I'm sure some of the students that I'm working with
have used chat GPT for clinical consultation in addition to

(16:27):
using it for their class work, because I know I
know they use it for their class work. They openly
talk about that and admit that, but for therapy or
for emotional support. They haven't mentioned that in the therapy space,
and that doesn't mean that they aren't doing it though. Yeah,
and so that's why I can say I haven't heard
of any like extreme cases outside of the things that

(16:53):
I've, you know, read in the New York Times or
Psychology Today.

Speaker 1 (16:57):
Yeah, okay, thank you for sharing that. Always interested to see,
like, what are therapists, like, are your clients
telling you, like, oh, I used it for this? So
it's interesting to know that hasn't come up just yet,
because I'm sure, like you said, people, they're using it.

Speaker 2 (17:12):
Yes, so, T, you, as you mentioned, you use it, right,
and so, so tell us, like, when did you first
start using it for this, for this specific purpose?

Speaker 2 (17:26):
And what was it like? So, because I mean, I
recognize that, like, ChatGPT has evolved drastically in its
existence right since its creation, and so at what point
did you start using it for this purpose? And what
was it like the first time you used it?

Speaker 1 (17:47):
Well, let me just say I was a late adopter.
Like, I had a friend who kept telling me,
you should use it, you should check this out, and
I was just like, I'm not using this thing, like,
whatever, I'm just going to keep doing my thing. And
then when I finally did start using it, it was for
content initially, and then when I saw the value and
I saw other use cases and other people using it
for different things, I was like, oh, let me just

(18:08):
see if I could, you know, let me get this
voice subscription and see what this is like. And so
I would say maybe in the past, like maybe six
to eight months or so, I started using it more
so to help me with strategy and for brainstorming, and
also for some of that, I would say, maybe I
wonder if emotional support is the right word. I know, yeah,
I'll say emotional I'll say emotional support for now until

(18:31):
I find a better word. But I was using it
for more personal things, and it was really helpful.
I mean, even like we'll talk more about some of
this later, but like putting a text message in. Like,
there were people who were talking about putting
text messages into chat to say, do you see,
are you seeing signs of manipulation or gaslighting? And
like, they're getting feedback with examples on, you know, if
that's the case, right, or how to respond to certain,

(18:54):
you know, how to set boundaries or respond to,
you know, someone in particular. And I think initially there
was a moment when I thought about the way I
felt emotionally after. I was like, this is
interesting, because I saw myself as a certain type of
person, and I would have never foreseen myself
using a tool for this type of use case. And

(19:16):
because I have, now I'm just like what does this mean?
So I think initially I may have judged myself a
little bit, but now I'm just like, Okay, this is
where we are. How do we use it in a
safe way, where I can't be taken advantage of or
can't be directed to do something that's you know, out
of pocket.

Speaker 2 (19:31):
So when you say you felt a certain way after,
what, what? What was that feeling? Weird?

Speaker 1 (19:39):
I felt weird. I was like, what the, what? I'm
gonna be honest, I was like, what the hell are
you doing? Like, you're using an AI tool when you're in
your feelings? And I think, Dom, I can't, I keep
emphasizing this because it is a feeling. As humans, we
are feeling beings, and so to put something into this
tool and have it talk to me. I think I even showed
you some of this, Dom, and I even, I even

(20:00):
use it in my webinars. I have a webinar coming up,
and I use it in the webinar. One of the
listeners was talking about something they're dealing with and I
was showing them how to use it. And afterwards,
the person who was attending was crying because
they felt so validated in their experience because of the tool.
And I think that right there, it was like, this

(20:20):
is different than anything else I've ever experienced. So yes,
I judged myself initially, but then I was like, Okay,
if we're going to use it, let's just make sure
that it's supporting me and my goals and my values
and all that. So yeah, that's how I felt. Okay,
but ground us, Doctor Dom, ground us.

Speaker 2 (20:39):
Yeah, because after you shared that story, I was like, okay,
I'm glad that that person felt validated. And ChatGPT
is not, absolutely not, your therapist, and cannot and
will not ever replace that human who is your therapist.

(21:03):
Here's why. ChatGPT does not know your trauma, unless
you're gonna type all of that in there or speak
all of that in there, unless you're gonna give all
of that information. I would caution against that, and we'll
talk about that in a little bit. But it doesn't
know your trauma. It cannot read your body language. It

(21:25):
doesn't know. It can't pick up, even if you're using
your voice feature, it can't pick up that your voice
is cracking because you're trying not to cry. So it
wouldn't really be able to truly decipher between your voice
cracking because you're sick, you're trying not to cry, or you're
trying to stifle a laugh. Right. It's not trained in

(21:51):
diagnosis or treatment, and so no matter how much information is
put in there, it can't diagnose or treat you. All
of that said, it can be supportive. It can be
an addition or an aid to your therapy experience, because

(22:16):
I understand you don't. Most people are not seeing their
therapists every week. Most people are not able to consult
with their therapists on a daily basis, And so I
can understand where consulting and that's the term I'll use,
consulting with chat GPT can be helpful, but it is not,

(22:45):
nor ever will be. I don't care how advanced it gets,
mark my words in this here year twenty twenty-five:
ChatGPT will never be able to replace that human
connection that a therapist can provide.

Speaker 1 (23:05):
And let's be honest, y'all. People turn to it because
it's accessible. I think you may have said that earlier, Dom.
It's accessible, it's free, it's available twenty-four-seven.
It doesn't judge, there's no insurance or waiting room, and
no fear of being misunderstood. And when you pair it
with therapy, coaching, or self-work, it becomes a powerful,

(23:26):
I like the word consultant. I like that, but not
a replacement. Right? So you can think of it as
a journal that talks back. But of course, journal with
the asterisk, because, you know, you don't want to tell
it too much, right. And we'll get, again, we're going
to get into a lot more in a bit, but
we just kind of want to frame the conversation. Okay. So,
if you've ever wondered to yourself, is it weird
that I talk to AI about my feelings? The answer
is no, it's not weird. Right? It's important to know

(23:50):
what it can and can't do. And also, as we
evolve as humans in this digital age, things are going
to change. We're going to be doing things that we
never thought we would have done before, because we
didn't have access to certain things.

Speaker 2 (24:02):
Right.

Speaker 1 (24:03):
I think about these freaking self driving cars, these robot
cars that I see. I've been seeing for years in
San Francisco, but they're making their way into other cities.
And I remember being like, I'm never going to get
into a car and let a robot drive me around.
And a friend got one and I got into that
car and I let a robot drive me around, and
I was so shook. I mean, even Lyft and Uber, that

(24:27):
was a big thing when that started. I was like,
getting in a car with a stranger who's not a taxi.

Speaker 2 (24:30):
Their personal vehicle. That's weird. And we do it now, yep.

Speaker 1 (24:33):
We do it now on a regular gladly, happily. And
now I'm letting I let a robot drive me around.
So I think it's just being open, being compassionate with ourselves,
and also doing our research and being well equipped to
handle the new developments that we have in our world today.

Speaker 2 (24:52):
Yes, yes, I think that is important. That is important
to acknowledge.

Speaker 1 (24:57):
Yes. So we have a couple different paths for
this conversation, so I feel like we can probably skip
this gut check. I think we kind of got the
gist of that. So what we're gonna do now is
we're going to go into some practical ways people use
ChatGPT for mental or emotional support, and we'll kind

(25:18):
of add some flavor to these different points. Okay,
so number one, you can use it as a journal,
right, to reflect, process, or just, I don't know, express
your feelings or even talk to your inner child.

Speaker 2 (25:33):
Right.

Speaker 1 (25:34):
But again I'm gonna use journal with an asterisk because
you want to be mindful of what you're putting into
the tool.

Speaker 2 (25:40):
I think that's a good point. And then the
second one is it can help you put your emotions
into words. So all of us have had those moments
where we're trying to parse out, like, we've just
experienced something and we're not quite sure how we're feeling

(26:03):
about it, and maybe we don't have someone readily available
to help us talk it out. And we can share
all of that into chat GPT like word vomit, the
whole thing, yep, and chat GPT can help you consolidate

(26:24):
and organize what you are feeling and put it into
words that you can then use to communicate to others. Yes, caveat:
it's based on what you provided, mm-hm. So if

(26:45):
you don't give it the full context, it's only going
to give you what you give it exactly.

Speaker 1 (26:53):
It is only as powerful as your prompt. And to
kind of add a little bullet point under
what Dom just shared, I've used it for grounding exercises,
breathing techniques, or even mindset shift. So that's something else
that you could do when it comes to like putting
your emotions into words. And then number three here is
if you're spiraling or feeling stuck in negative thinking, you

(27:13):
can also ask it to reframe your thoughts gently and
with compassion. So literally, again, I'm an avid user now,
so today I had that midday lull and I was like, bruh,
we gotta record the pod. Okay, we got three episodes
we're recording today. We have work, we have other things.
I'm like, chat, I'm feeling a little low energy. Can

(27:33):
you help give me a boost? It
already knows my schedule for the day, so I'm like,
give me a boost. And it gave me some really
great advice about think about how you're going to feel
at nine pm at night. When you've gotten all the
things done, like just push through, take a little dance break,
and then push through. And it gave me more context
around my particular scenario. But it was hopeful.

Speaker 2 (27:54):
I can appreciate that. I can definitely appreciate that. And
so then the next thing is that it can help
you develop scripts when you're having difficulty trying to
identify exactly what it is that you, how you want
to phrase something. So we talk about it on the podcast,
I talk about it with clients all the time, about

(28:16):
setting boundaries. Right. So let's say that you are communicating
with someone via email, via text message, or you know
you have an important conversation coming up and you're not
exactly sure how to say it in a way that

(28:37):
will help the other person understand you. You can put
that into chat GPT, and chat GPT will help you
come up with the script. My caveat with that is
you don't have to read it word for word. Yeah,
it's not meant to be verbatim. You can use it

(28:57):
as a starting point. You can use it as an
idea generator. So you put in the scenario and what
it is that you're wanting the outcome that you're wanting,
and it gives you something to say, and you can
look at it and you can say, you know, despite
all the information that I put in chat GPT, this
still doesn't quite feel like my voice. But this, but

(29:21):
I make a few edits and I'm there. This is great.

Speaker 1 (29:25):
That's right. Definitely make it your own, for sure, use it. Yes,
definitely make it your voice. And number five is you
can use it to guide visualizations or manifestations. This is
one of my favorite use cases. So whether it's a
future self exercise or a quick confidence boost. One of

(29:45):
the things that I love to do is put my
ideal life narrative into ChatGPT and have it paint
a visual picture of it. So in the morning, when
I do my meditation, I'll click on the speaker icon
and have Chat read through my ideal life narrative as
I close my eyes and I envision all of this, so
I can feel that ideal life as I'm manifesting it.
So that's another really great use case for it.

Speaker 2 (30:08):
I can appreciate that with you. And so then the
next one is it can help you build routines and strategies. Right,
so let's say that you're wanting to incorporate a fifteen-minute
quick workout routine into your daily schedule and you're not

(30:32):
quite sure exactly where it can fit in. You can
provide chat GPT with the information on your schedule and
what it is you're hoping to accomplish, and chat GPT
can generate that schedule for you with those things your
goals added in. I haven't used it personally for this,
but T, tell us what that is.

Speaker 1 (30:55):
Yes, girl, you already know. So, yes, I've used it
for my, I put in my workout, my ideal
body goals, and then it helped me create my workout
routine that I use when I go to the gym
three days a week, my meal plans to support that,
and then also like journaling prompts and like a lot
of the work that I do, I'm using it for

(31:15):
that, and it's been really, it's been really useful. So
that is definitely, that's a good one, building routines. But
now we want to transition and we want to talk
about the power and limits of AI and mental health.
And I feel like, look, as a therapist, you should
cover the cautions and I'll cover the good, because I'm
like an end user who's like, oh yeah, you know,

(31:37):
our perspectives. That's why I love that we're having this conversation together.
So I'm going to highlight some of the good. There's
probably a longer list here, but I'll cover a couple
main points. One, it's always there, and it's the accessibility, right,
you can use it day or night. Love that. The
second one is that you can say pretty much any,
let me put anything with an asterisk, because I was
talking about some trauma and chat was like, this is too much.

(32:00):
Chat literally would say, we can't talk, we can't go
any further. This is due to what you're sharing. I believe
it said that what I was sharing was, I'll be
testing the limits, y'all. Okay. I think it said something
like it doesn't align with their guidelines or something. So
we couldn't finish that conversation. So that's another example of,
probably going under the caution there. But you can say
almost anything and there's no fear of being judged or shamed.

(32:22):
And then number three is it can help you think
out loud and make sense of, offer a perspective for,
jumbled emotions, which I've definitely used it for, and that's
been helpful. But now we're gonna get into the cautions.

Speaker 2 (32:34):
The cautions. All right. So, as we mentioned, it is
not trauma-informed, so it does not know your full story.
It cannot know your full story because, as you just
pointed out, once you start sharing certain things, it stops you, right?
So if it doesn't know all of those things, some

(32:57):
of what it provides will not be accurate.
It also doesn't know your nervous system, so it does
not know how your body is responding unless you type
that in there, and even then it's still not going
to be able to give you an accurate reading or

(33:18):
accurate response. So that means that it's missing that emotional
nuance and the advice that it puts out could feel
pretty generic even when you're giving more specifics. Another thing
is that it runs on the information that it's been fed, right,

(33:39):
so it goes off of what all users have been
putting in. Unlike humans, it's not able to get curious
or generate new ideas. And that's what we can offer
as human beings, like we're able to take information and
generate new ideas. How do you think chat GPT became

(34:02):
a thing because it was a new idea from a
human mind. It cannot sit in silence with you. Sometimes
being in space with someone with another human and being
absolutely still can provide a certain level of comfort and

(34:23):
ease that ChatGPT cannot give you. And it cannot pick
up on your body language or what you might be avoiding.
And this is something that I can point out, like,
if I'm in a therapy session with someone, and,
hell, not only in a therapy session, in general

(34:47):
interaction with people that I know well, I can say,
based on history or observing body language, that they're
avoiding something, and I, with my eyes, can maybe even

(35:07):
observe what it is that they might be avoiding,
like the specific thing that they might be avoiding, and
name it. ChatGPT cannot do that. The other caution
is that sometimes it hallucinates or gives incorrect information. Yeah,

(35:29):
so it's important that even with the information that it provides,
that you cross check to make sure also that the
sources are accurate because not necessarily related to therapy, but
I know I have heard of instances where scholars have

(35:52):
been quoted or cited. Their work has been cited with
an AI tool. I'm not going to say whether it was
specifically ChatGPT, but with an AI tool, within an
AI tool, and they never said it, they never created
that work. Oh lord. And so I think it's important

(36:13):
to verify the source, so use trusted sources. So if
ChatGPT tells you that, here are five strategies, you
said that you're feeling depressed, here are five strategies from
ABC Medical School, cross-reference that. Check out

(36:37):
ABC Medical School's website and see what they actually say
to make sure that it's accurate. The key here, lady,
if you can't, if you haven't picked up on it yet,
the key here is to use discernment.

Speaker 1 (36:52):
Dom, one of the things I want to share: so
there's this Psychology Today article that was really interesting, lady.
If you want to look it up, we may put
it in the show notes, or give you the title so
you can look it up as well. It's called ChatGPT-
Induced Psychosis and the Good Enough Therapists. And two
of the key points that I thought were really interesting
and just something to keep in mind, right, like, you

(37:13):
can use the tool, like the tool, but also hold
the caution in hand as well. Right. And so what
it says is chatbots tend to tell you what
you want to hear, and too much of this can
be destabilizing. Human therapists help to, I mean, human
therapists help us thrive by steering us toward adaptive life narratives.

(37:35):
And one of the things that it talked about in the article,
which I'm sure you can relate to, Dom, is that
therapists are not necessarily supposed to agree with you all
the time. They're supposed to see the things that you're
avoiding and also push you to get outside of your
comfort zone, right and do things that you may not
normally want to do, versus this tool that's telling you like,
you're so amazing, You're so great, everything you think is

(37:55):
great potentially, And so there's something else to keep in mind, right.

Speaker 2 (37:59):
Right, Yeah, that chat GPT is going to be biased,
can be biased based on the information that you're providing it.
Exactly. If you're telling it that you need a cheerleader,
or that you're needing, you're needing support around A, B, C,
and D, it's not going

Speaker 1 (38:17):
to challenge you. All right, y'all. So we are going
to close this conversation out by sharing ways that you
can protect your info and yourself when using ChatGPT. Okay,
so number one is be mindful of what you share.
Please don't please don't share your address, your phone number,

(38:37):
your Social Security number, or medical record info. Ever. Just
don't do it, okay? Even though ChatGPT is designed
to forget conversations unless explicitly saved, it's still best practice
to treat it like it's a public space.

Speaker 2 (38:52):
Right.

Speaker 1 (38:52):
You don't know how. This tool is still new to us, right,
so think of it like journaling out loud. But don't
put your whole life in there, right? Like, we
need to have some boundaries, even with ChatGPT. Number
two is check your chat history settings. So in ChatGPT's
settings, under your profile, you can turn off chat

(39:13):
history, and this means that your conversations won't be saved
to your history or used to improve the model. You
don't want them to be having the data that you're putting
in there. I just recently did this because I saw
an AI person share this online. I was like, let me
go make sure I'm not training the model with my
data and my information. So you can go to your
settings and data controls and make sure you turn that off.

(39:36):
And if you're using ChatGPT to vent or process
something deep, turning off the chat history can also give
you extra peace of mind. Okay, So let's make sure
we do that as well, all right. Now, Number three
is know that it's not secure like therapy notes. Unlike
a licensed therapist, who's legally bound to confidentiality, like HIPAA, right, ChatGPT,

(40:00):
it is not subject to those same rules. It's not
a private practice, and it's not protected in the same
way that a clinical record is. So again, you want
to be mindful of what you're putting it in, what
you're putting into the tool, right, don't treat it like
it's your therapist's locked filing cabinet with your information, so
be mindful. Number four is use it for support, not diagnosis. Okay,

(40:23):
so ChatGPT, it can offer some emotional comfort, but
it's not a substitute for a therapist, a coach, or
a doctor. It may provide general advice, but don't rely
on it for medical, psychiatric, or legal decisions. All right.
It's a what did you say earlier, dom a consultant, right,
not a mental health professional. Okay. Number five here is

(40:48):
add context for safer, better responses. Again, as we said before,
it's only as powerful as your prompt. So the more
clearly you explain your emotional state or just like what
you're going through in general and what you're looking for,
the more helpful and sensitive it can be. So for example,
you know, I'm having a rough day, I feel overwhelmed
because X y Z happened. What ways can you support

(41:11):
me to process this gently? Right versus fix my anxiety
or something like that. Right, it's smart, but it's not psychic,
so help it help you, all right? Number six is
if you're in crisis, just shy away from ChatGPT
and reach out to a human. That is the best
thing to do. Like, if it's a crisis situation and
you're feeling hopeless or in danger or overwhelmed, we would highly

(41:34):
recommend that you not rely on AI and go to
a human because you are very vulnerable when you're in
that state, and so you want to make sure that
you have someone who's there with the sensitivity, the expertise
and the knowledge to support you appropriately. And then number
seven here is something that I have done and I
think this is super important. Program your chat GPT instance

(41:57):
with your values in mind, right, so tell it how
to talk to you. Name your needs, include your worldview
or beliefs, and set clear boundaries so that you can
have that sort of baked into the responses that you
get and it can feel more tailored to meet you
and your needs and meet you where you are. So
we're going to actually add we'll add all of this

(42:19):
in the show notes. You can see it in the notes,
so we won't do a recap. We'll just add in
the show notes there for you. Okay, So check out
the show notes, lady, and make sure you follow us
on Instagram at Herspace podcast and let us know what
you think about the episode or share general feedback. Also
leave us a review.

Speaker 2 (42:34):
Lady.

Speaker 1 (42:35):
You know we are Black-founded, Black-funded, and what
is the other? We were Black-founded, Black-funded, and
we're Black-created. Right now, is that what I always say?
Black-owned? There we go. I forgot what I always say,
forgot my little tagline. But yes, come support us, lady.
Let us know if there are any other topics you
want to cover. We hope this was useful for you
and dom. I think we can head on over to
the after show. Nothing else to cover, all right, ladies,

(42:58):
So visit herspace podcast dot com if you want to
see us, and also tune into the after show where
we're going to dive a little bit deeper into this topic,
We'll see you there.

Speaker 2 (43:08):
It's Doctor Dom here from the Cultivating H.E.R. Space podcast.
Are you currently a resident of the state of California
and contemplating starting your therapy journey? Well, if so, please
reach out to me at doctor Dominique Broussard dot com.
That's D R D O M I N I Q

(43:30):
U E B R O U S S A R D
dot com to schedule a free fifteen-minute consultation. I
look forward to hearing from you. Thanks for tuning into
Cultivating H.E.R. Space. Remember that while this podcast is all
about healing, empowerment, and resilience, it's not a substitute for therapy.

(43:55):
If you or someone you know needs support, check out
resources like Therapy for Black Girls or Psychology Today.
If you love today's episode, do us a favor and
share it with a friend who needs some inspiration or
leave us a quick five star review. Your support means
the world to us and helps keep this space thriving.

Speaker 1 (44:17):
And before we meet again, repeat after me: I honor
my journey by balancing effort and rest to achieve my goals.
Keep thriving, lady, and tune in next Friday for more
inspiration from Cultivating H.E.R. Space. In the meantime, be sure
to connect with us on Instagram at herspacepodcast.