Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome, friends, to the Telewellness Hub podcast, where we are going to dive into the latest trends, challenges and innovations in mental health and wellness.
I'm Marni Hamilton, licensed professional counselor based in Texas, and today we are talking about a major shift in the industry that I think is important to talk about.
I've taken a break from recording just because I've been
(00:22):
trying to educate myself on this topic.
I have been concerned, I've been excited about it when thinking about Telewellness Hub and how we can utilize this for the better of humanity, but the topic is AI-powered mental health support.
AI-powered mental health support.
(00:43):
This is causing a big shift in our industry, in many industries, and I think it's important to talk about AI in terms of mental health.
So AI-driven chatbots are personalized support systems that are gaining popularity.
I've been spending the week doing a lot of research on this
(01:05):
topic, and I've been really considering, though: what does this mean, these AI-driven chatbots?
What do they mean in terms of real providers, real clients and the future of mental health care?
While AI offers accessibility and efficiency, I mean, I've used
(01:29):
all types of AI support systems, even integrated within this podcast platform through Buzzsprout.
Now AI can look for transcriptions, for summaries of podcasts.
I mean, AI is everywhere, and it can make things a lot easier.
(01:49):
I think it's important to also look at the risks.
The technology is evolving so quickly, and it makes things so efficient and easy, that it can be easy to overlook some of the risks.
And for me, as a licensed professional counselor who's been fully licensed in this industry since 2011, and has
(02:11):
also worked as a supervisor, with an LPC supervisor certification, and supervised interns while they become independently practicing providers, I can't help but really think about how important it is that we, as professionals working in this space, really
(02:34):
sit down and take a look at what AI really is and what it means for the future of our clients, those individuals seeking care.
AI isn't the solution to every mental health challenge, and I want to encourage all of you, both providers and clients, to
(02:56):
really think critically about the role of AI in mental health, and why real human-driven care is still irreplaceable, particularly with something as important as our mental health.
Let's get started.
I have a lot of notes, so today you'll see me reading a lot.
(03:18):
So what is AI-powered mental health support?
AI in mental health care is growing at an unprecedented rate.
So platforms like Woebot, Wysa (I think I'm saying this correctly) and Replika use artificial intelligence to simulate therapeutic conversations.
Some AI systems analyze user data and suggest coping
(03:42):
strategies, and others even attempt to predict a mental health crisis.
And the great thing is, these tools are available 24/7, making mental health support more accessible than ever before.
I myself know that feeling of wishing I could talk to a therapist in that moment at 1 am, when I was a single mom, and
(04:06):
wondering, when it's the day before my actual therapy appointment, feeling like, okay, well, what am I going to talk about?
So I think a lot of people have had that experience, where the idea of an on-demand therapeutic support system is incredible.
So that's one of the great things about AI.
(04:27):
When it comes to AI in this space, some facts: a 2023 study published in The Lancet Digital Health found that AI chatbots reduced symptoms of depression and anxiety in 30% of users after just four weeks.
I did not do a deep dive into the power of this research, so
(04:51):
this is, again, a big disclaimer to really critically think about all this.
Statistics is everything, and just because something is published in research doesn't necessarily mean that it can be generalized to all populations.
You really have to look at the individual variables.
But I thought that was an interesting thing when I looked at the abstract.
Additionally, according to the World Economic Forum, AI-driven
(05:15):
mental health apps saw a 500% increase in usage between 2020 and 2024.
So the demand is there, without a doubt.
And at first glance, this sounds promising, right?
(05:35):
Like, who wouldn't want affordable, instant, around-the-clock support?
Like, I myself would be like, yes, sign me up.
But as we dig deeper, the reality is a lot more complicated.
So I don't want this to be one-sided; of course, I'm biased.
I'm a mental health professional.
I see the value in my human therapist, but I want to empower
(05:56):
people.
So, if you are going to be using an AI mental health platform, I want to share what clients should consider before using AI mental health platforms.
So, if you're someone seeking mental health support, AI-driven tools might seem like an easy, low-cost and on-demand option, but here are some things I would encourage everyone to really
(06:17):
consider before using them.
Number one: AI is not a replacement for professional therapy.
So, while AI can offer general coping strategies, it cannot diagnose, it cannot personalize, and it can't adapt to specific emotional and mental health needs the way a human provider can, right?
Same thing goes with a dentist or a medical provider, because
(06:42):
my husband unfortunately had to go to the emergency room and had a bunch of lab work done, and there are AI apps that can scan and read lab work.
There's amazing technology going on in the medical field, which mental health is part of, right?
We are a clinical field, and it still cannot replace the human
(07:05):
eyes of someone taking the whole picture in, making decisions and collaborating with the patient.
So that's something really important to think about: that it cannot do those things the way a human provider can.
A fact: a study from JAMA Psychiatry (some call it "the JAMA")
(07:26):
found that 68% of users felt AI chatbots provided mechanical or generic responses to their issues, making them feel unheard.
So you can Google that, 2024, JAMA Psychiatry, look for AI, and you can dive deep into that research.
(07:49):
So, number one: AI is not a replacement for professional therapy.
Number two: AI can give inaccurate or even harmful advice.
I have used ChatGPT a ton, but I know that it's given me false information, right, and they have a disclaimer that you need to double-check things on your own.
And I think it's
(08:09):
important, especially when it comes to mental health, that you are aware of the advice or recommendations (because a human therapist doesn't necessarily give advice; they give insight or recommendations): the fact is that AI can give inaccurate or even harmful advice.
So AI chatbots are really only as good as the data that they
(08:33):
are trained on, and I am learning so much while I'm building Telewellness Hub about the programming, the language programming, that is involved in AI.
Thanks to the brilliant product and technology officer at Telewellness Hub, I've been really inspired to dive deep into AI and learn about this industry, because it's in every
(09:00):
field, and so it's been really interesting to see how AI is trained.
It's even like positive reinforcement, right, like how you would give a treat to a dog.
It's a similar thing.
So AI chatbots are actually only as good as the data they are trained on, and they can misinterpret a user's (a potential
(09:20):
client's) input and provide responses that are misleading or, even worse, inappropriate.
So there's an example: in 2023, a chatbot was designed for mental health support, and it actually suggested harmful coping mechanisms to users.
I won't say the platform, but it led to major public concern
(09:43):
and public shutdowns.
So when you hear this, I mean, the benefits of AI are there, but when it just takes that one life (every life is
(10:04):
important), that one life that could have received inappropriate, damaging insight, you know it's not worth it.
Okay, number three: privacy concerns and data security risks.
Many AI mental health platforms collect sensitive user data.
I cannot emphasize this enough.
(10:25):
Big tech companies, they're wonderful in their mission,
(10:50):
but many times they are not created or founded by mental health providers, and ultimately it becomes a business of collecting and selling data.
So when you have sensitive user data from these clients, many AI mental health platforms are collecting it, and so, without strict regulations, and sometimes without clients knowing, this has happened on many major mental health platforms.
And I'm talking big tech here, I know.
(11:11):
I'm representing Telewellness Hub; we are a mom-and-pop shop.
Okay, I'm talking big, big, like millions, billions platforms.
Your private conversations could be at risk of being recorded, of being used for marketing, of being used
(11:38):
for research and, even worse (it has happened), of being sold to third-party companies.
And I, as a mental health provider, am one that believes that my clients' sensitive information is certainly not to be sold.
Their sensitive information is protected by HIPAA.
It is my duty, through my license, through my years of
(12:05):
education and licensure and continuing education hours that I pay for and invest in, to do everything in my power to protect my clients and their information.
So, to me, this is one of the biggest concerns, and a New York Times report in 2024 revealed that some mental health apps were selling their user data anonymously to third-party
(12:29):
companies, raising serious ethical concerns.
I don't necessarily want to throw any companies under the bus, but you can take a look: New York Times, 2024, mental health apps selling data, just so that you're informed and you're aware.
Another concern, in terms of what clients should consider before
(12:51):
using AI mental health platforms, is the risk of self-diagnosis and over-reliance on AI.
So some users may start to rely solely on AI for their mental health needs instead of seeking professional help when they truly need it.
And AI doesn't replace the accountability and therapeutic
(13:11):
relationship that comes from working with a licensed professional who understands you as a whole person.
There is nothing like that therapeutic relationship: someone that, you know,
(13:38):
understands your story, your values, your challenges, is there to cheer you on, to help you reflect, to consider alternatives.
We deeply care about our clients, deeply, deeply, deeply.
Human care.
And I know that some people have had some hurt done by therapy.
I've heard some of that.
(13:59):
I understand that there are challenges in that, but there are so many amazing providers out there that really do everything, dedicate their lives to their clients' well-being, and that's their livelihood, that's everything they've worked so hard for.
So something to think about.
(14:19):
When it comes to this, it's important.
If you don't find the right therapist (because it's happened to me, I didn't feel like it was the right fit), I knew that I deserved to find the right fit, and I kept searching until I found the right fit.
You deserve to find the right fit.
And, side note, with this revamp of Telewellness Hub, my
(14:44):
goal is to make it even easier, at first glance, to know if that provider is the right fit.
So more to come on that at a separate time.
But so, four things that clients should consider before using an AI mental health platform: AI is not a replacement for professional therapy; AI can give inaccurate or even harmful advice; privacy concerns and data
(15:06):
security risks abound; and the risk of self-diagnosis and over-reliance on AI is there.
So take all that and just consider it, right?
Again, this is meant to empower clients and providers, and for you to just take those things into
(15:29):
consideration, for critical thinking: is an AI-powered platform the best for me?
And maybe it is, but just having those things in mind when doing so is important.
Now, for mental health professionals: AI also greatly impacts us, and it presents both opportunities
(15:54):
and significant concerns.
I'm going to talk about three major ones: job displacement fears, ethical dilemmas in AI-assisted care and the need for a community-driven response.
These are all things that I'm seeing in my communities.
These are things that I feel myself, and I think that if we can empower ourselves by understanding AI more and really
(16:18):
being able to critically think about it, it will help us in the long run.
So, when it comes to AI replacing us: essentially, AI is often marketed as a cost-effective alternative to
(16:39):
therapy, and I understand that this can make it even harder for providers to compete with free or low-cost AI solutions.
A fact, a little research fact: a survey from the American Psychological Association (the APA, go APA) in 2024 found that 42% of mental health professionals felt threatened by the rise of AI in
(17:03):
therapy.
So mental health providers out there are in fight-or-flight about AI.
Let me tell you, there is a lot of concern out there: will we even exist as a job?
Will we be replaced?
So, just like many other industries, we are worried about that too.
And something to take into consideration is that most mental
(17:24):
health providers go into huge debt in order to have that credential after our name.
We go to graduate school, we complete thousands of hours to be credentialed, we take tests, exams.
Every year, we have to do continuing education units, which cost money and take time,
(17:50):
just to make sure we're doing the right thing for our clients.
And we have overhead.
That may be a whole other conversation, a different episode for providers, but it is hard for us.
It is hard for us.
And reimbursement rates.
So people might think, oh, you can bill insurance, you can make a lot of money.
The fact is, we know the whole health system is broken with
(18:13):
insurance in the United States, and actually our reimbursement rates haven't changed significantly in decades, so they haven't caught up to inflation.
So it is tough out there, even if you're accepting private insurance, and you could do private pay.
(18:33):
But that is a challenge also, right, because it has its own unique challenges.
We'll say that and, again, maybe this needs to be its own separate episode.
But this fear for mental health professionals is very real, and I think there's an opportunity, though, for therapists to become
(18:54):
empowered by AI, when we come together.
Now, also, when AI is marketed as cost-effective, as an alternative to therapy, by these platforms, you have to remember some of these platforms have, like, millions of dollars to support their marketing.
Most mental health providers that you see do not have
(19:17):
millions of dollars.
I don't want to speak for everybody, but I know for myself and, I think, most providers out there, we are trying to figure out how to market on a budget, because we're spending our money on continuing education and other things that
(19:37):
enhance the experience for our clients, right?
Whether it's the telehealth platform, our office space, an additional certification or training to help a client just really break through barriers and meet their goals.
So the job displacement fear is real.
So, when you support your mental health provider, when you
(19:59):
choose a mental health provider, a human one, and it's their private practice, let's say they're a solo practitioner, you're really empowering small business, and we forget that.
You know, ultimately, they're a small business, and so if
(20:21):
supporting small businesses is important to you, I encourage you to support a small mental health business, and support your mental health providers if you're seeking mental health support.
Okay, enough of that.
Number two: ethical dilemmas exist in AI-assisted care.
So some providers are being asked to integrate AI into their
(20:41):
practice.
Sometimes it happens slowly, like through, maybe, a software they're using to keep track of notes for their clients.
Sometimes it happens when, maybe, sessions are being
(21:02):
listened to through certain integrations and being transcribed so that they can be turned into notes easily.
There are different things going on with AI, but without clear guidelines on its ethical use, they may be at risk of liability, and mental health providers already carry liability insurance, and it's just something that we can have
(21:22):
concerns about.
Right, we are concerned about our clients' confidentiality, and there are many HIPAA-compliant AI platforms, but it's still something to think about.
So if you have a mental health provider who is old school, and they have a piece of paper and they're writing their notes on a
(21:44):
piece of paper, and they're filing their notes in, like, a folder, just know that it doesn't mean they're too old school and not innovative and won't know the latest and greatest techniques to support your mental health.
Just know that actually a lot of mental health providers are turning to that in an effort to really protect their clients and
(22:06):
to protect their own liability, and to do the most ethically responsible thing that also supports our mental health industry.
So just a little note for clients listening out there: just know that there's a shift into that, right?
So, just like there's a shift into kind of going back to old-school,
(22:29):
simple times, of making your own sourdough, of, like, homesteading, keeping things simple, there's also a shift like that in our field.
And a third topic I wanted to talk about, in terms of the impact of AI on mental health providers, is the need for a community-driven response.
(22:49):
So instead of resisting AI entirely, because I don't think that's the answer (AI is here, whether we like it or not), providers must stay informed.
I want to advocate that we, as mental health providers, advocate for ethical implementation, and I would love to invite all mental health providers to help shape how AI
(23:14):
is used in mental health care.
So this next section is going to be a little bit for providers specifically.
So if you're our regular listener who wants to know a little bit more of an insider's view on providers, just keep on listening.
If not, I appreciate you listening and taking these things into
(23:38):
consideration.
I think it's important, as a consumer of mental health support systems, to know these things, to know what's happening in our industry, to know that there are big shifts, because it directly impacts your care.
So I'm going to encourage you to listen, but just know that, behind the scenes, there are a lot of mental health
(23:59):
providers trying to figure out how we can best serve you through this shift.
(25:08):
Why independent providers and clients need a supportive community is my last little segment here on this topic.
As we continue to see AI expand in mental health, it's more important than ever for both clients and providers to have a space where they can navigate these changes together.
(25:29):
I want to share about some platforms, like Telewellness Hub, but also the Liberatory Wellness Network, TherapyDen (I'm using my phone right now to make sure I have these right), Latinx Therapy, therapist networks.
These platforms are designed to support independent providers
(25:52):
and ensure that clients have access to ethical, human-first care.
I will share a list of some platforms that I know are specific to supporting independent providers and human-first care in the show notes, so take a look there if you're looking to join one of these directories or seek a
(26:14):
provider through there.
So for clients, it's really essential to find real licensed professionals who can provide personalized and confidential support, in my opinion, rather than relying solely on AI-driven tools.
I think there's a great space for AI-driven tools to support the work, and I've even had
(26:34):
clients talk about what they entered into ChatGPT for, like journaling topic ideas, and we talk about it in session.
Together we can talk about the tools of AI and utilize those tools as a resource for our sessions.
So I'm not saying it's bad, I'm just saying it's something to
(26:56):
really think critically about and, I think, something that you can bring and navigate with a therapist, in a perfect world.
I realize, like I said earlier, that our health system in the United States is facing a lot of challenges.
So I understand that sometimes that's not an option,
(27:17):
and that's also why Telewellness Hub is hoping to break those barriers and those challenges to access, by providing clients with providers who also have resources available, whether it's books, digital downloads, podcast episodes, YouTube channel videos.
(27:39):
So, yeah, there are a lot of people out there who want to make mental health accessible, and I think, for some, I understand that an AI chatbot might be the best option at the time, on your way to finding the ideal provider.
For providers, having a professional network is key to staying ahead of industry shifts,
(27:59):
advocating for ethical care and ensuring that independent providers have a strong voice in the future of mental health services.
I want to talk about one network, the National Alliance of Mental Health Professionals, an incredible platform that I
(28:19):
have recently joined, that I love, and (let's see how many people are in this group) has over 8,000 people, and it's just people having a dialogue about AI, about big tech corporations and venture capitalists taking over our space.
So just something to think about for providers.
(28:40):
Maybe take a look at some ofthese organizations or even
locally, see who's talking aboutthis and joining in At
Telewellness Hub, I will saywe're creating a space, a
digital space, a community whereproviders can really examine AI
and other emerging industrytrends together, along with our
amazing technology team, and mygoal is to have your voice.
(29:07):
Help shape the future ofTelewellness Hub and help make
decisions when it comes to AIwith our directory and platform,
so that we can have an ethical,human-driven and effective
mental health solution forproviders and clients alike.
(29:28):
So I welcome everyone to join the conversation, to shape the future of mental health together.
Before I wrap up, I just want to take a moment to thank everyone who has been listening, who has provided feedback on
(29:49):
Telewellness Hub, the platform telewellnesshub.com, since the day I first launched on a WordPress site and then, you know, made updates and changes, and especially with our previous launch.
I was so excited about it, and it was amazing to see the feedback and the signups, and I got some really great insights that have
(30:12):
helped shape and refine a platform that truly serves providers and clients in a more meaningful way.
So I am so thankful for everyone who's taken the time to give feedback, because that is so helpful for shaping the future of mental health, not just Telewellness Hub, but
(30:32):
mental health.
We're currently rebuilding, finalizing our custom web application.
Our goal is to ensure that independent providers have a space where they can offer their expertise, connect with others in the field and truly make an impact: offer their therapy services, their workshops, showcase their media content,
(30:56):
their products for sale, and really offer clients the opportunity to do wellness, mental health wellness, their way.
So getting help isn't always like scheduling an appointment right away, right?
It's navigating who are some experts in the field, and Telewellness Hub is here to highlight the experts in this
(31:18):
field, which are human, independent providers.
So if you're a provider and you're looking for a community that values ethical, client-first care without the burnout, I invite you to stay connected.
Please share your thoughts, be a part of this evolving conversation.
We're building a channel within Telewellness Hub for all
(31:39):
providers to share their thoughts, to vote on features, to help shape the directory itself, so that it's ours as a collective.
I'm a big believer that together we can really create a difference and a huge impact.
So if you're a provider or a client who wants to better understand the role of AI in mental health and ensure that
(32:01):
real, human-first care remains a priority, please subscribe to this podcast and join our growing community.
Let's work together to navigate these shifts and ensure that technology supports, rather than replaces, real mental health care.
I'm going to have more topics on this in the future.
I'm going to be inviting some experts in this field, in AI and
(32:24):
research and media, media specialists, technology specialists, community specialists.
So please subscribe, stay tuned, because there's some good stuff coming.
Thank you for tuning in, and I'll see you next time.