
December 16, 2024 67 mins


In this engaging third-round discussion, Joey Laswell, The FIRE Social Worker, reconnects with Dr. Marina Badillo-Diaz, The AI Social Worker, to explore the rapidly evolving intersection of Artificial Intelligence and Social Work.

We discuss:

* The ethical considerations and potential of AI in social work practice.
* How AI tools like ChatGPT and HIPAA-compliant solutions (Bastion GPT, Uphill) are helping practitioners reduce burnout and streamline administrative tasks.
* The impact of AI on social work education, research, and workforce readiness.
* Real-world applications, from chatbot therapy assistants to tools for analyzing client trends.
* International perspectives on AI adoption in social work, including Dr. Badillo-Diaz’s upcoming Canadian symposium.
* The exciting potential of creating a social work tech collective to stay ahead of the curve.

Joey also shares his journey using AI to develop creative projects like The Social Worker, a serialized graphic novel, and innovative tools for financial counseling.

Key Topics Covered:
⏱️ [Timestamps]
00:00 - Introduction and catching up
04:35 - AI tools for social workers: Practical applications and ethical concerns
15:20 - Addressing skepticism and excitement around AI in social work
25:10 - AI in education: Student perspectives and ethical boundaries
35:45 - Emerging AI tools for mental health and clinical practice
48:00 - The creative use of AI: Joey’s Social Worker project
54:20 - Building the “AI Avengers” and fostering community among social workers

Links Mentioned:

The AI Social Worker Website: www.aisocialworker.com
Joey’s FIRE Social Worker: https://www.facebook.com/profile.php?id=61559229661247
Featured Tools: Bastion GPT, YouMore, Uphill
Don’t forget to like, comment, and subscribe for more discussions on social work, financial empowerment, and technology!

#SocialWork #ArtificialIntelligence #AIEthics #FIRESocialWorker #MentalHealth #TechForGood #AIandSocialWork #AISocialWorker

Support the show

Please join me on my different platforms and follow along on my journey towards FIRE.

https://laswell.veteran.cards/


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
All right, hey everybody, this is Joey Laswell.
I am the FIRE Social Worker, and this is round three of our discussion with the AI Social Worker.
We have Dr. Badillo-Diaz.
Thanks, Joey, it's great to be here.
It's been almost four months since we last talked, so

(00:24):
basically one quarter of the year has gone by.
I know you've been super busy since the last time we talked.
I've been pretty busy with a lot of projects and things too.
But, you know, I just thought it would be a good time to jump back into the conversation with you, kind of get an idea of

(00:46):
where you think things are headed with AI and social work in general, and, you know, just kind of explain and explore different topics about AI and social work.

Speaker 2 (00:53):
Yeah, no, super excited to be here.
It's always nice to chat and hear about the amazing things that you're doing.
I'm so excited to share what's been going on in my realm with the AI Social Worker, and I just feel really fortunate to be here with you today.

Speaker 1 (01:06):
Awesome.
Yeah, thank you, same here.
Okay, so we were talking a little bit prior to the stream, just doing a little bit of catching up.
But for the people that are listening on the podcast and watching this after the fact, can you give us a little rundown of what's been going on with you the last few months?

Speaker 2 (01:25):
Yeah, so it's been a busy semester.
As you know, I'm teaching at a couple of universities, so I've been really grateful for those opportunities and have just been diving into supporting students.
But it's been really exciting to bring more AI into the classroom.
So that is something, again, as an educator, I've been really

(01:46):
intentional about: not only how I can weave it into my coursework and my assignments, but also really engaging with students, our MSW and even the doctoral students that I teach as well, on the really important topic of how AI is going to be impacting the landscape of our field.
So that's some of the work that I've done.

(02:08):
Since the last time that we spoke, at the end of August, I launched my new website, aisocialworker.com.
I've had over 3,000 views, which I'm super pumped about, and I'm just so happy that it's free resources for social workers on how to use AI.
I have a whole free prompt library with over 30

(02:28):
prompts that can support social workers in different tasks using AI.
So that's been really exciting to launch.
And I've also done a couple of trainings as well.
You know, I come from the realm of school social work and education, so I've done some trainings for some state chapters of school social workers, some school districts, and it's just been really exciting to engage social

(02:50):
workers in the field to talk about this technology.
Some are really excited; they kind of knew about ChatGPT, but we got into some really great discussions in the training sessions about how to take their AI practice to the next level.
And those who are really new to AI, who have never tried it before, were excited to be able to try these new tools and talk

(03:10):
about how they can incorporate this into their practice.

Speaker 1 (03:13):
Yeah, that's awesome.
So what has the general feedback been like? Has it been mostly positive, or kind of pessimistic, cynical, nervous? How has the feedback been for you?

Speaker 2 (03:26):
Yeah, all of the above. And I think initially there's, you know, hesitation, concern, and rightfully so, I think.
As social workers, we are trained to think about things critically.
We're trying to think about how this is going to impact our communities, our marginalized communities.
What does this mean for our ethics, right?
So I think these are really important concerns to have, and I've had social workers kind of share

(03:47):
concerns like, what if we over-rely on these tools?
Is it going to take parts of my job, and am I still going to be able to do quality work?
And then there's some that are like, I just have a lot of technological gaps; I'm really confused about how this technology even works, and I think there's kind of this sort of novice unknown.
And then there's some that are really concerned about the ethics and the data privacy, and, as we should be

(04:12):
concerned about those things as well too.
But there have been some trainings where it's like, oh man, I'm retired, I wish we had these tools; I could have made my job so much easier.
And I think that's been such an exciting part of that work and those conversations, to really empower and uplift, because I know burnout is so real for so many of our social workers out

(04:34):
there.
So to really hear people walk away saying, I'm invigorated by doing my job, I'm so excited to try these tools to really help me in my everyday job tasks and make it easier, that, to me, is exactly why I do these trainings.

Speaker 1 (04:49):
Yeah, yeah, that's awesome. And, you know, I think we've talked about it before: we do need to keep an eye on being cautious and learning how to use these tools appropriately, but then also we don't want to get left behind.
So I think having, like I said, these discussions to really

(05:12):
put it into the ethos and say, okay, look, it's happening, it's here, it's advancing even more now since we've been chatting.
So it's not going anywhere anytime soon.
If anything, it's just going to start accelerating more and more.

Speaker 2 (05:25):
Right, yeah. And, you know, ChatGPT just had its birthday, right? It's two years old now.
So, you know, it's still in its infancy stage.
But, like you said, we have to adapt, we have to pivot, and this is also something I say in my trainings too.
You can choose whether or not you want to use these tools, right? That is totally up to you, your sense of agency.
But you can't just not knowabout these tools and not know

(05:46):
how these tools work, because they're going to be impacting you.
They're going to be impacting your clients and the communities that you're servicing, especially from a mental health perspective.
We know there are some not-so-great sides, or some dark sides, to this technology, with deepfakes and misinformation.
So this can certainly harm our clients.
So we have to be prepared to have those conversations, supports, and interventions in practice.

(06:09):
But I think also there's a huge opportunity: if we're working with clients who need support with workforce readiness and job readiness, these tools can help them with writing their resumes or writing cover letters.
Or, if you want to use it as a tool alongside your clinical practice, you can; there are therapeutic chatbots out there.
So there's a wide

(06:29):
range here.
This technology can be used for good, but we also have to know how it can be used not for good, tech for bad, as well.

Speaker 1 (06:35):
So we also have to be raising those concerns.
Yeah, and I think it goes back to how social media is a platform.
It's not good or bad inherently; it is just a resource and a tool that you can use how you choose to.
And then I think we've talked before about training, trying to train your algorithms for positive versus negative, and that

(06:58):
you'd be surprised how quickly your algorithms can shift.
You know, and that's AI.
So, once again, you have to go with the flow, learn how to manifest these things, and also take things sometimes with a grain of salt, because, as we've probably both observed, not all AI is great with

(07:21):
outputs, but, you know, it's always learning, it's always growing.
So, yeah, I'm curious from the school side of things: have you gotten any feedback from the student side? Like, what are they thinking about AI and using it for school?
Do they feel like it's cheating? Is it cheating? Like, you know,

(07:42):
what are some of the students saying about AI and stuff?

Speaker 2 (07:45):
Yeah, no, another realistic sort of conversation we have to grapple with as educators, and also in preparing our social workers for the field.
So I'm getting a wide range: from some saying, you know, we're not talking about this enough in our classes and we need to be talking about this more; if we're going to be social work professionals in the field and this is going to be part of our work, shouldn't we be talking about

(08:07):
this more, getting more trainings in the classroom?
To some students saying, you know, I know students are using it and they're writing their papers with it, and I think that's wrong; I don't want to be caught plagiarizing, and what does that sort of mean?
And so, in the conversations that I've led in the classroom with my students, my perspective is: as a field, we have to support you with workforce readiness.

(08:29):
Right, we have to have these conversations about what the different tools out there are, how they can be supporting your everyday job, and to ask how they are impacting your clients.
And really thinking about ethical use too. Like, as a student, what is ethical?
How can you use AI to support you in your assignments?
What are the red lines that are considered plagiarism?
What is permissible use?

(08:51):
And this also goes back to a larger conversation, outside just the students and really in the hands of school administration and course coordinators: our curriculums sort of have to be overhauled now if we're going to really meaningfully introduce this technology into our curriculums.
And I think students are definitely curious.

(09:11):
Students are using it, some of them not telling their professors how they're using it, right?
So they are. But I'm of the mindset of: let's use it, let's reimagine our assignments to include this technology, so you really support workforce readiness.
And I think this is a great opportunity to pivot, like you said, and be of the times and be adaptive this time, yeah.

Speaker 1 (09:35):
Yeah, that's interesting, because, you know, I don't really engage with too much youth at this point.
So, yeah, I was just genuinely curious how that generation is kind of using it, perceiving it.
Because, I mean, there is a social side of this whole AI

(09:56):
stuff.
You know, like people are forming attachments to their chatbots, and, you know, we sadly had cases, like that young person that got really, really attached to their AI.
So that is something that I didn't think we would

(10:17):
have to be considering, you know, this whole concept of that.
But I mean, some people might be more susceptible to forming these types of relationships with their chatbots.

Speaker 2 (10:30):
Yeah, no, I'm so glad you brought that up.
Because, again, as we think from the practitioner lens, as social workers we work in communities with vulnerable youth, youth that have different diversity and neurodiverse needs.
You know, maybe these chatbots might be an access point, an opportunity to engage in social experiences, but then, on the flip side, like you said, it could really lead to a

(10:52):
dark place, or a place of even more isolation, more depression.
We did hear of that case of a youth that did die by suicide because of a chatbot.
I think it's also opening up more doors for, say, legislation and compliance.
You know, because with Character AI, that chatbot, you could be as young as 13 and getting access to the platform.
So now we just start thinking about whether we want, as a society,

(11:14):
to put safeguards in place to ensure that our most vulnerable, especially children, are protected, right? What should their access be, and where should the limits be?
We know that the adolescent brain is certainly vulnerable and malleable, right?
And we know that they're also more impulsive, right?
These are all things that are kind of swirling, and I

(11:35):
think there's also more opportunity to do more research, and I think that's really critical.
We know legislation and government compliance take time, and it's not moving fast enough.
So I know parents have been advocating, you know, pushing for increasing the age limit of, say, a platform like Character AI,

(11:59):
and I know recently Australia is, you know, banning social media use for 16 and under.
So I think this is going to be catching on, as it should.
And that's also part of my trainings as well, to talk with other social workers about this, because if it's not coming up in your practice, it's going to be. Yeah, it's coming.

Speaker 1 (12:19):
Yeah, absolutely, yeah. That's really interesting.
And in comparison to social media, there is that new push for, okay, maybe we need to monitor or limit or restrict it for young people and young minds; we should do the same

(12:41):
thing for AI.
But it's also the Wild West right now, so there are still so many unknowns, and it's happening so fast that we can almost not keep up with it.

Speaker 2 (12:53):
Yeah, and I think the place where I stand is: legislation and having safeguards in place are important, and at the same time, I think our future world, for our young people, is going to be a world with artificial intelligence.
It will be a world with this technology to help you do your jobs and everyday life tasks.
So I think completely eliminating it in some capacity

(13:16):
isn't doing it justice either, if we want to really teach our young people to responsibly use AI.
And, like you said, it's a Wild West right now.
This is so new that us educators aren't even fully aware or educated on these tools, and yet we're also responsible for teaching this next generation.
So it's just going to take time, it's going to take research,

(13:38):
it's going to take advocacy, it's going to take a village, really, to get to a place where we have the compliance, the legality, the education, and the tools all lined up to really have an ethical, tech-for-good space for our society.

Speaker 1 (13:56):
Yeah, that's good. That's good stuff. You mentioned research, and I wanted to highlight, or at least acknowledge, that you recently got approved for some research, right, in this realm. So tell us a little bit about that.

Speaker 2 (14:15):
Yeah, so, super excited. A couple of my colleagues and I have submitted for an IRB; it's now approved, and we are really interested in AI and school social work practice.
As I mentioned, that's my area of social work that I've been working in.
It's also an area we've been talking about, with youth and education in schools.

(14:35):
I think there's just a huge need for school social work practitioners to understand this technology, and so we're really looking forward, in our research, to getting a better understanding of how school social workers are using this in the field.
What are the gaps? What are the training needs?
We really want to investigate what's going on in the field, so I'm really excited to be embarking on that in the new year.

Speaker 1 (14:56):
That's awesome, yeah. And so, for those who don't know what an IRB is, can you give us a little bit of backstory on how that process plays out for research?

Speaker 2 (15:05):
Yes.
So, as we know, we need to have ethics in place, because unfortunately in human history we've seen lots of abuses, and marginalized communities being completely taken advantage of for the sake of research and science.
So IRBs, the institutional review boards, are really a safeguard in place to ensure that all research is done

(15:27):
ethically and humanely, and it is a process, as it should be, right?
We should not be taking research lightly.
It should be a whole process where there's a whole committee that reviews it, and you get approval that way, through the institutional review board.
Sometimes it can take time, edits, and revision, but it's really just part of upholding our ethics and values in

(15:51):
scientific research.

Speaker 1 (15:54):
Yeah, I just wanted to highlight that, because I think a lot of people underestimate what it really takes to, for one, just get approved for research, and then to actually go through it and do it.
And then there are all the iterations in the process, and, you know, that's why I think people should take research more seriously, or understand those elements of it, because there's a lot that goes into it.
It's not just some two-week project, you know;

(16:16):
this is possibly years in the making, you know.

Speaker 2 (16:19):
Yes, no, and that's a great point too, and I think
part of that is you know, the isthe ethics and the safeguarding

(16:40):
.

Speaker 1 (16:40):
It shouldn't be an easy process, and at the same time, we do want to make it accessible so people can get their research approved and out there.
But it is really needed, for sure. Yes, yeah.
So, I guess, in general: do you have a basic premise, like the abstract, already kind of figured out, or are you still working on that?

Speaker 2 (16:55):
Yeah, no, I think we're still... it's still definitely in process.
We definitely are interested, though, in understanding how school social workers are using this technology in their respective roles.

Speaker 2 (17:10):
And again, I think it's just an opportunity to then say, okay,
this is what's actually happening in the field. Because we're also professionals, social workers; we want to do something.
It's not just to say, oh, we have this research. Now it's like, let's make recommendations of what we need to do next.
And I would imagine a big part of that is going to be around, you know, training, around compliance: what kind of needs we have as a profession to really ensure

(17:33):
that we're using it safely and responsibly.
So I'm excited.
I know these things take time, right; that was one of the things I realized in my doctoral studies.
I was like, oh, it takes a long time to get an article submitted and published. This is going to take a while.
So I know we're in the very, very beginning stages now, but get back to me in a year and we'll probably have a little bit more done, and hopefully we'll have

(17:56):
submitted to a journal by that point.
But that's been one of the eye-opening experiences of being on the research side: how long this process takes, right?

Speaker 1 (18:07):
Yeah, but that's cool. I mean, you're branded as the AI Social Worker, but you're actually doing the legwork.
You know, you've been at the forefront of this, I feel like, and I think you're going to have the mantle of the AI Social Worker for a while.

(18:28):
So I think that's really cool, and I'm just along for the ride with you at this point. I'm enjoying it, I'm learning, I'm also growing and developing and trying to use it for my own good.

Speaker 2 (18:51):
Yeah, and I appreciate you saying that, because it's the same thing for me too, this journey that I started in January.
You know, I started with AI prior to that; it's been about a year and a half that I've been submerged in this AI world.
But I launched the brand, the blog, in the first week of January 2024.
Now we're into mid-December, and it's been such a journey. Just since I launched, like you said, I'm learning

(19:13):
alongside all of us in the field.
New things are happening, new platforms, new tech.
So even since the last time we spoke, I've been diving into more of the tools that are out there.
I came across a tool called Bastion GPT. And, you know, us as social workers, and rightfully so, we are really concerned about data privacy and HIPAA, and this is actually a

(19:35):
HIPAA-compliant tool that uses ChatGPT technology, and I was like, this is awesome, right?
So for those social workers out there that are interested in using these tools, or are using these tools and really want that extra safeguard in place, Bastion GPT might be a great option for you.
It's not free; I do want to acknowledge that it's not free, but it's, you know

(19:59):
, $20 a month.
There's a free trial where you can check out the tool, but again, I think it just adds an extra layer of HIPAA-compliance security that I think a lot of social workers appreciate.
Some other tools that I've been, you know, looking more into

(20:20):
, doing some collaborations with and research on my own, is YouMore.
That's a platform that uses AI technology to create CBT worksheets and homework assignments for clients.

Speaker 1 (20:29):
So that's a good one.
I like that. I'm going to check it out for sure, yes.

Speaker 2 (20:35):
We should be having a webinar next week on Monday, so I'll definitely send that along to you, Joey, and to your listeners too, if they want to check out that webinar next week that I'll be doing.

(20:46):
Another one is Uphill, and Uphill is another AI tool
for mental health practitioners, not just social workers.
It's a tool where you can record your therapy sessions and it will transcribe them for you.
It also has those data protections, you know, security and HIPAA compliance.
And another cool thing about the tool is that it

(21:08):
analyzes your clinical session data, so you can get trends on the client's mood, different words that they shared in your session.
It's just another way to start really deeply thinking about the clinical work that you're doing, and, you know, create more interventions from there, or notice the trends so you can

(21:29):
offer different interventions or insights.
So I just think, what a great way to get more objective sort of analytics from your sessions.
Because, you know, humans, we make errors, we have biases.
But if you have a tool like AI recording the session and collecting some of this data, you might get interesting viewpoints that maybe you've not considered before.

Speaker 1 (21:49):
Yeah, Interesting stuff, right, yeah, that is with
these businesses.
Like you have private practice.
You've got big organizationslike how difficult is it for a
company to, or an organizationto, just to try out one of these

(22:12):
AI tools?
You know, because I feel liketools are out there now but
maybe businesses, organizationsare kind of timid about trying
it out.

Speaker 2 (22:20):
Maybe no, and I think you bring up a really good, a
really good point.
I know one of my students atthe, you know, the university
had shared like, oh, like myagency has, you know, paid for
this like tool that we haveaccess to but honestly, it's so
far few in between that.
I'm hearing agencies, you know,purchase and again, I think a
part of it is it's new.

(22:41):
Anytime new technology comes into an organization, it's a new training, and it can create a lot of concern and discourse.
I can kind of see the hesitancy, but I always go back to your point, Joey: we have to be adaptive with the times.
This can make your employees feel less burnout and burden,

(23:02):
especially on the administrative pieces.
You know, I worked in outpatient mental health when I started as a social worker back in 2011.
Right, I was seeing I wasscheduling 11 clients a day.
Sometimes, in my practice, wewould have weekly conversations

(23:28):
and supervision about myproductivity.
Productivity was also somethingI talked about at the agency,
because and then, on top ofproductivity, not only how many
clients are you scheduling thatyou're seeing, but you have to
get your notes in on time.
So that's like a 40 hourturnaround 24 hour turnaround
and that's like a recipe forburnout right Like it's too hard
to sustain.

(23:48):
So what an opportunity if wecould incorporate some of these
tools with you know, clientconsent, because that's you know
, we can you know this could bea real opportunity to support
the caseload and workload when Iknow so many of my colleagues
in outpatient mental health youknow are struggling and I think
this would be one solution.

(24:08):
I'm not saying the onlysolution is some of the burden,
but I do know there areclinicians in private practice
who are using it because youdon't have to go through the
agency, right your own entity,but they're finding a lot of
positive use.
I did a training, I want to say, last month for mental health
clinician practitioners and theywere walking me through like

(24:29):
how they had the consentconversation with their clients.
And yes, there've been a fewthat have, you know, some
concerns, but majority were opento this process.

Speaker 1 (24:38):
Yeah, I mean, if anything, it's the modern equivalent of having your sessions videotaped for training and whatnot.
Some people will agree to that, you know, as long as they know it's confidential and it's not going to get disseminated.
But yeah, I'm just thinking about the amount of hours, man-hours, saved.

(24:59):
Not having to write out all those case notes.
And I think people, especially practitioners, know, like you said, it also leads to burnout; it's stressful, it's time-sensitive.
And, you know, not saying that we use AI to do all that work, like you said, but it's a much more efficient

(25:20):
way of basically just analyzing data, and then you can basically add and subtract as you see fit.
And, I mean, I just feel like that's a potential game changer in the industry, because think about how much backlog that would clear up, you know.

(25:41):
Clinicians would be less stressed.
They would also have more time, and maybe they could decompress more.
You know, I guess my mind is swirling with all these possibilities, but it's going to come down to, once again, people consenting to it.
But I think, yeah, if they were able

(26:01):
to explain it in a way that they see the benefit.
Because I think the other thing that could be incorporated, and I think we've talked about it before, is having the case notes basically shared between client and practitioner.
So that way, if you have homework, you know, and I've been guilty of that too: therapy homework is great in

(26:24):
session, but as soon as you leave that door, it kind of goes to the wayside a little bit, you know.

Speaker 2 (26:29):
So yeah.

Speaker 1 (26:31):
I think that would be really helpful for both client and practitioner.

Speaker 2 (26:35):
I absolutely agree.
I think, again, the opportunity is just the willingness to be open, the willingness to pivot, and the willingness to also have these important conversations and trainings.
And, you know, the other thing, in speaking to other mental health practitioners who are using these tools: you still have to review the note.
You still have to make sure it says everything that you are

(26:55):
intending, that you'd like the note to say.
You want to make sure it's confidential, that there's no information in there that you don't want the insurance company to know.
But what a time-saving tool.
And, you know, I've been a supervisor; I know, from a management point of view, it's so easy to get

(27:15):
snowballed with needing to get all these notes completed.
And I just think, what an opportunity to support, even from the business and management side, if you have a tool that could really help your employees.
Something to consider and think about.

Speaker 1 (27:31):
Yeah, absolutely. So, any of you private practitioners, if you use this already: I think there are quite a few tools out there that are HIPAA compliant and, like you said, practitioners are using them. So let us know how it's working for you, give us some feedback, and maybe we can convince other people to do the same, or maybe they'll convince them not

(27:53):
to. But I feel like, yeah, it's at least something that we could, you

(28:13):
know, learn from other practitioners, from other social workers who might be using it in the field. And then, speaking of using it, I had ChatGPT do a nice little recap and basically create a whole outline of things that we could talk about, and I was like, that took all of a few seconds. Well, it did take a little bit of time, but I mean, just

(28:41):
the amount of time that saved me. And energy, mental energy, I think, is really one of those things that, if people can quantify it a little bit more, they might be more on board with using AI for their own personal needs. And I think it's starting to break into the discussion, the culture of sorts, you know, like it is starting to become more mainstream. People are talking more about their ChatGPT and things like

(29:04):
that, you know. Yes, yeah, it's really interesting. Even the NASW had a call for conference proposals.

Speaker 2 (29:12):
AI is one of the topics, so I know they're looking for proposals out there. I definitely put my proposal in, but I definitely call on other social workers in the field to really showcase this technology and engage in important conversations around it too. And I really do think there needs to be, if there

(29:32):
isn't already.

Speaker 1 (29:33):
I mean, I guess there probably are, but actual tech companies don't necessarily know what it is to be a social worker. So if any of these tech billionaires want to hire us as

(30:13):
consultants, that would be great. I would not turn that down. I have a very modest fee, and, you know, very reasonable.

Speaker 2 (30:21):
So plug it, you know, we got to put it out there.

Speaker 1 (30:23):
Yeah, got to manifest that, you know.

Speaker 2 (30:26):
Yes, yes, I love that, and I absolutely agree. I mean, just starting from tech design, I think the social work perspective is so needed and, I think, such a valuable component, because we may not have the same tech expertise as the coders, the engineers, the IT or the business side or the branding, but we have the ethics and the human

(30:48):
side of understanding, and even just that whole ecological systems and person-in-environment framework that we know, and also our social justice lens, our ethics lens. I think we bring so much of that into this discourse and conversation. These are things that these tech companies are not really thinking about, and I think that's an opportunity for our field to really be at the forefront of

(31:10):
this work.

Speaker 1 (31:11):
Yeah, yeah, definitely. And thinking about other social workers in the field who are interested in or advocating for AI, I did actually try to get Ernesto Bejarano, the AI Social Work Mentor. I just want to give a shout-out to him.

(31:32):
It was kind of last minute, but I asked him to jump on the stream and he couldn't; he had obligations. But I definitely want to pick his brain, because I know he's been doing a lot of really cool stuff. He's got the Social Work AI Mentor; I'll put a link to that in the show notes. But yeah, I mean, there are people, there are social workers, that are doing really interesting things in the

(31:54):
field.
I just wanted to give a shout-out to Ernesto. Hope to have him on the show someday. So thank you for that. And do you have any other colleagues that are doing anything similar or interesting in the field of AI?

Speaker 2 (32:10):
Yeah. So I just want to definitely also put in a plug for Ernesto's tool. Anytime I do trainings, anytime I speak to my students, I'm always saying, hey, there's actually another social worker in the field who is doing this work. Right, it's not some tech company, it's one of us, and I think that's super, super exciting. And I know his tool is used in a number of countries around the

(32:31):
world.
I mean, that's so exciting, and I definitely want to give him a lot of props and shout-outs for the incredible work that he's doing. I know there's also a social worker at Uphill who's an integral part of the tool there, and the name's escaping me, but it's going to come to me and I will definitely email it to you after to make sure. And I know

(32:53):
we'll both be presenting separate topics, but also jointly, at a conference in the spring. So that's super exciting, that there's even just another social worker in the realm of these tech tools, bringing our perspective into the role and into the work as well. I know there are researchers out there who are really passionate and doing some great work with

(33:16):
AI, and I certainly want to uplift Dr. Melanie Sage, who's been doing research in AI; Dr. Desmond Patton, who was doing this work years before even AI and ChatGPT; also Johanna Creswell Báez, who spoke at our NASW conference.

(33:37):
She led a moderated panel on AI and social work. So I know there are a number of social workers out there who are doing this work, and I know we're going to be getting more people into this field and into this intersection, because it will be growing. I've already even had a doctoral student say, I'm really interested in AI and social work practice.

(33:59):
So I think this is going to be more of a wave, where there are going to be more of us coming down that pipeline.

Speaker 1 (34:05):
Yeah, yeah. And I think the last time, we did talk about forming some kind of Avengers-style mashup, or even a collective or something. Obviously that kind of fell by the wayside; we got busy with everything else that we're doing and working on. But I do think we're in the early stages of either creating our own tech company for AI social work, or we're in

(34:30):
the beginning stages of just creating our own community.

Speaker 2 (34:35):
Yes, definitely, definitely, because, again, I think there's such power in sharing resources, sharing ideas, sharing best practices, sharing articles that are being published, or blogs, or even just hearing how others are doing it. Like, I think about school districts, how they're implementing AI for their school social workers, or how agencies are,

(34:55):
you know, bringing in this technology. So what a place to have, like, a platform to share that knowledge. Be in community, share the struggles and the challenges or the concerns. I think that's a great thing to do. It's always been on my mind. Let's manifest that for 2025. Let's do it. There you go.

Speaker 1 (35:12):
I like it, start off with a charge, you know. And I mean, it sounds like you're probably gonna have a big 2025 coming up, so maybe we need to do this in stages and come up with a smart game plan. But yeah, definitely 2025. It's gonna be happening for sure.

Speaker 2 (35:33):
It could be as simple as a Slack, or a Facebook group, or even just a Google listserv. I think there are different ways that we can engage in community. Like I said, it's just making it happen and manifesting it, doing the next steps and logistics. I'm game, I'm ready.

Speaker 1 (35:52):
All right. Well, anyone in the FIRE Social Worker community and the AI Social Worker communities, if you guys are interested in something like that, let us know. I'm definitely open: email me, DM me, whatever, shoot a comment on the stream, and we'll figure something out. Because, like I said, this is the future, and

(36:13):
I think we're just in that mindset where we're willing to step in maybe a little bit more boldly than other people might. But at the same time, like we've said, it's not going away, and I think getting ahead of the curve just makes more sense to

(36:33):
me. That's just how I view it. But there are also genuinely useful things that could come from AI, and probably already have. So I'm also trying to have that eternal-optimist kind of perspective: okay, we can use this for genuine good. That's the other thing that I'm passionate about.

Speaker 2 (36:54):
Yeah, no, I'm so with you. And I know we kind of talked about how I've been using AI since our last session, but I wanted to ask you: how are you using AI since our last session?

Speaker 1 (37:04):
Because you use it in a couple of ways too, right?

Speaker 2 (37:07):
So I know I use it in some ways, but it's always great to be in community and learning from each other.

Speaker 1 (37:15):
Yeah, well, so I've had a lot of different things that I've been messing around with. But I did create a chatbot, basically, for financial counseling. My vision was: a lot of the initial onboarding for financial counseling is kind of tedious.
(37:36):
You're getting information and then you're having to compile it, just like any intake. So I was thinking, what if there's a way to create a smoother onboarding process for financial counseling, using a chatbot to not just ask questions for the data, because it could.

(37:57):
It could do the data collection and compile it into something, really, and I've tested it; it works. But then I'm also incorporating the financial social work principles and mindset into the chatbot. So I've been feeding it financial social work and financial trauma and things like that. So I've been building that, playing with

(38:19):
that for a while.
Well, you know, it's just one of those things where I see a problem, because I was having that problem with taking on clients and trying to do all that, and I was like, there's gotta be a better way. So I just messed around with ChatGPT and created my own custom GPT, and I'm still working on it, so it's not

(38:43):
it's not long, but it's goingto be much.

Speaker 2 (38:47):
I love that. Such an innovative way, and just thinking about how to really meet your consumers and clients, right? Being able to get that information, and then use that information too, but it's also just an easier way for you to gather it, so it's time-saving for you. I love it.

Speaker 1 (39:05):
That's awesome, really cool. Yeah, thank you. So that's one thing that I've been working on off and on. And then I just recently published, I don't know, we've talked about it before a little bit, but almost like a weekly thing, or however long it's

(39:44):
going to take, but basically a story, a comic-book type of story. And this is purely just for fun, nothing crazy, just a creative itch that I had to scratch. It's going to be called The Social Worker, at least for now, and the premise is basically a social worker who

(40:07):
gets superpowers and then fights bad guys with empathy and compassion, and, instead of hurting people, tries to actually cure and fix people with their superpower.

Speaker 2 (40:20):
So creative, right? And I think it just kind of speaks to how much it opens up the door for creativity when you have access to tools like these. I'm making an assumption, I don't know if you're a graphic artist who could do this artwork by hand, but think about it, now you don't even need to.

(40:40):
You have access to.

Speaker 1 (40:49):
I know. And yeah, it's not perfect, but I just generated a cover, a cover page, you know, cover art, and I was like, that's amazing. I just don't have that ability, and I know people do, but for it to just come out of all the prompts and all the information that I fed it, it was like it went from my brain to the page, you know. So, yeah, it's just been really cool experimenting with

(41:12):
that from a creative side. But ultimately, I still use it a lot as kind of a personal assistant, you know, brainstorming. I've actually been going through a little bit of my ChatGPT history, and I realized that, yeah, I talk to my ChatGPT

(41:34):
quite a bit, even more so than previously. But it's because it's so integrated now; it has a lot of my backstory. And, this would be an interesting prompt if you're interested, but I actually had it do like a year

(41:54):
in review for me, and it took all the memories that it had and compiled, granted it wasn't a full year, but it was like a month-by-month recap of the projects that I started and things that I've been working on. So, I don't know, it's just kind of a cool, like, year

(42:15):
in review kind of thing. Yes, no, I love that.
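The month-by-month recap Joey describes is, underneath, just grouping dated notes by month. A rough sketch of that idea, with invented example data (ChatGPT does this over its stored memories; here it's plain Python over a list):

```python
# Hypothetical sketch of the "year in review" idea: given dated memory
# snippets, compile a month-by-month recap of projects mentioned.
# The entries below are invented examples, not real memory data.
from collections import defaultdict
from datetime import date

memories = [
    (date(2024, 3, 5), "Started the financial counseling chatbot"),
    (date(2024, 3, 20), "Outlined a comic-book story"),
    (date(2024, 7, 11), "Published the first issue"),
]

def month_by_month(entries):
    """Group (date, note) pairs into an ordered month -> notes mapping."""
    recap = defaultdict(list)
    for when, note in sorted(entries):
        recap[when.strftime("%Y-%m")].append(note)
    return dict(recap)

if __name__ == "__main__":
    for month, notes in month_by_month(memories).items():
        print(month, "-", "; ".join(notes))
```

With an LLM, the same grouping happens implicitly from a prompt like "using everything you remember about me, write a month-by-month recap of my year"; the sketch just shows the structure of the output.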

Speaker 2 (42:18):
Please definitely share that prompt. I would love to do that for even just my own work, how I've been using ChatGPT for the year. And it's interesting, because not too long ago I had a colleague of mine who's in the clinical field share this really interesting prompt, and I'll also send that to you too,

(42:38):
about how ChatGPT can analyze you on a deeper level, kind of say things about you that you haven't necessarily shared with it, but it can. Right?

Speaker 1 (42:43):
I think I've heard of that prompt, yeah. Mind-blowing.

Speaker 2 (42:46):
I was like, whoa, this is kind of really interesting, what this shares about you. And it was interesting because it brought up some of my work-life balance needs, and that I'm working a lot, all these things. I'm like, how does it know? Then again, you're helping me do my job.

Speaker 1 (43:04):
Yeah, yeah, it's crazy, it's crazy, it is.

Speaker 2 (43:07):
But it's just super interesting, and, like you said, as a personal assistant. Even, you know, a few weeks ago my siblings and I got together; my brother really enjoys baking, and we enjoy cooking.

(43:27):
So we asked ChatGPT to create, like, an olive oil chocolate cake recipe, and we tried it. So good.

Speaker 1 (43:37):
Yeah.

Speaker 2 (43:38):
And I know my brother just moved to a new house; he was asking me questions on, like, design, or what table size you should get to maximize the room. I mean, there are just so many things that I didn't even think about, how you could use it in your everyday life to ask it questions, and it shall give you an answer. Right, yeah, yeah.

Speaker 1 (43:57):
I actually used it a couple of weeks ago, or about a month ago, just kind of on a whim. I was reading stories with my son, and I was like, let's see what ChatGPT can come up with. I just gave it a prompt; I said, you know, to create a story, a fun story for my four-year-old son.

Speaker 2 (44:17):
Yeah.

Speaker 1 (44:17):
It was a very basic prompt, and I was like, oh, this is going to be really crazy. But as I was reading the story, I was like, this is actually kind of a good, interesting, fun story for a kid, you know. It was just crazy. So, little things like that that I just never would have imagined using an AI

(44:41):
for.
And yeah, I mean, you can search Google, but I think ChatGPT is just kind of taking the place of Google in a lot of ways for more nuanced analysis. Instead of just searching for a topic, you put it in your ChatGPT, and it'll tell

(45:02):
you a lot. It's been really helpful for me just to learn new things or flesh out some details that I didn't initially think about.

Speaker 2 (45:12):
Yeah, no, absolutely. I don't know if it's on the free version yet, so I should definitely double-check that too, but I know ChatGPT is now giving you the sources where this information has

Speaker 1 (45:25):
come from, so just like Perplexity.

Speaker 2 (45:27):
It will give you the link. I know on the app on my phone you can click on the source of the information, where it came from. And I was getting some concerns, and I know I've gotten this question in trainings: where is this information coming from?

Speaker 1 (45:40):
Like, how do I know what it knows? I'm like, no, that's a really good critical question to ask. Yeah.

Speaker 2 (45:47):
Yeah. And I know Perplexity has been doing that for some time now, where you can ask questions and Perplexity will give you exactly the sources, the websites where the information is coming from. I think it pushes for transparency, but also, for us as consumers of information, we know where to actually get the source of that information, so we can continue our research and cross-check and

(46:09):
reference.
So I think these technologies are just going to get better over time.

Speaker 1 (46:14):
Yeah, that's actually a good point, because I've been hearing things about how ChatGPT could also be spreading misinformation or disinformation without it actually, quote unquote, knowing. It's just pulling data from the internet or from its archives or whatever, and since

(46:35):
it's human-generated input, and humans are flawed, there's going to be some flawed information in ChatGPT. So that's one of those things where you've got to take it with a grain of salt sometimes. But, like you're saying, they're trying to get it to where it has some validity. You know, like Wikipedia. Wikipedia is user-generated, but they
(46:56):
have the facts, they have the proof and the references, so that adds an air of validity to it.

Speaker 2 (47:06):
Yep, agreed.

Speaker 1 (47:07):
I like that they're doing that.

Speaker 2 (47:08):
Yeah, yeah.
It's definitely some important work. And, as you mentioned, this technology is going to continue to advance, continue to come out with new features, and I'm sure that comes up the next time we chat, hopefully not in a few months, right?

Speaker 1 (47:26):
Yeah, absolutely.

Speaker 2 (47:28):
Always growing and changing. And I think it kind of goes back to that community piece: if we are in community, in a learning space, we can continue to share these things with each other, be most up to date with the most relevant things. I'm trying to post as much as I can on LinkedIn. I know the term this semester sometimes got the best of me, right, trying to get everything done.

(47:49):
But I really feel a sense of importance to share free resources, share the most up-to-date knowledge with each other, because it's only going to better advance our work in our field.

Speaker 1 (48:01):
Absolutely, yeah. And I think there's some genuine concern; there are jobs being taken away right this second by AI and these chatbots and things like that. But it's kind of one of those things, too, where things will change, it's going to evolve,

(48:22):
just like the horse and buggy: when the automobile came along, you had people that were like, I like my horse and buggy, but then this new thing is even better, like objectively better. But there's that awkward transition period where, basically, some people are on board early,

(48:42):
early adopters, other people are kind of in the middle, and then other people are just completely oblivious and they want to keep it that way.
And I think that's just going to make things more challenging for them, because, you know, it's coming, it's here.

(49:03):
So it's just one of those things where, as we've said, you've got to be more proactive about it and try to learn it. Not just put your head in the sand; really try to learn it, use it for good, and see what it can do. You'd be surprised.

Speaker 2 (49:18):
Yeah, I'm glad you brought up the concern around the loss of jobs. I think the flip side of that is there will also be new jobs created because of this technology. It's definitely going to have an impact on the workforce. And, interestingly enough, I was doing some research, I did a presentation on this, you

(49:38):
know, a couple weeks ago at CUNY. There's a real gap right now: employers want to hire employees that have the understanding and skills to be able to use AI, and yet there's a skills gap, where university students are saying they're graduating
(49:59):
with I don't think I even havethe skills, or we didn't learn
about this, and there's a realgap.
And I think that really kind ofspeaks to the urgent need, how
this is really is going to shiftthe workforce, but even for
social work practice.
I think we're a profession where, yes, I think there's aspects
of our job that can be done withAI, like administrative tasks.

(50:21):
Right, like the recording of our sessions and creating progress notes. But can a chatbot really replace the art and the connection that we have in therapy? There's been research out for a couple of years now where they've been using these chatbot tools, therapeutic
they've been using these chatbottools and and therapeutic

(50:43):
chatbots like Wysa, and there have been a couple of others, like Woebot, where research has shown, I think their overall message is, it can't replace therapy; it can be used in conjunction, in tandem. There's opportunity for accessibility, so that if you are on a waiting list or you have no other options accessible right now,

(51:04):
it could be a temporary sort of support. But because of the human connection aspect, the training, the power of the relationship, because we know the research shows that the most defining component of therapeutic work is not what technique I've used, it's really the relationship that we

(51:25):
have, I don't think the chatbot could truly replace that. But there are opportunities to use it in tandem, and that's sort of what the research is showing too, though there's still going to be much more research. I know you come from the world of veterans; I think you're a veteran yourself. The VA is also in contract and research right now, creating an AI tool, I think it's called Battle Buddy, where

(51:45):
they're still doing research on it, but it's a way to speak to a chatbot to gain resources. The algorithms can kind of pick up concerns around suicidality to give additional support. So I think it's just their way of, for a historically marginalized community, providing community resources.

(52:05):
I think about just the entry point of what a chatbot can do, because if it can give you an entry point into feeling more comfortable asking for help, getting services, could that also lead you to have a connection with a social worker or other mental health

Speaker 1 (52:18):
professional? Right, right, yeah.
And, like we've said before too, the way these chatbots are trained is very important. I mean, AIs, these large language models, depending on how well they're trained, they can be very, very accurate.

(52:40):
Like, was it Watson, the AI that beat Jeopardy? From what I was following up on a little bit, they've been incorporating Watson, and they've actually been using it in the medical field too.

Speaker 2 (52:58):
I was just going to say, yeah, it's taken the medical licensing exam and passed, and it's done really well, yeah.

Speaker 1 (53:06):
And it's able to detect cancer at a higher, more effective rate than humans. It's just one of those things where, yes, we have to accept that there might be something wrong or whatever. But if the statistics say that it has, like, a 90% success rate versus a human at 60%, that

(53:30):
to me says, okay, we need to look at this.
And maybe it could be an AI-assisted human. That's basically what we're doing here, where it's AI-assisted decision-making or writing or creation. So I think it's one of those things that it's going to

(53:50):
be. We just have to use it for good, train it properly and ethically, like we've said before, and just try to keep it reined in, and not let it exterminate us all, as everyone wants to think is going to happen. But I think, if it hasn't

(54:13):
happened yet, I don't think it's going to happen. But hopefully, since we've been so nice to the artificial intelligence, we will be saved in the future robot apocalypse. But that's another story altogether, I guess.

Speaker 2 (54:29):
Yes.

Speaker 1 (54:32):
Well, I think we've pretty much covered a good amount of stuff. Honestly, from the ChatGPT-created outline, we've covered a lot. One thing that we didn't touch on, and we could talk briefly about it, is the international
But, uh, you know,international, the international

(54:54):
um AI conversation.
You know, like you're, you dohave international ties, right,
are you?
Are you engaging with any othercountries with this stuff?

Speaker 2 (55:04):
Yeah, great question. So, yes, actually, our neighbor to the north, Canada. Sidebar: I'm actually also a dual citizen of Canada. I did my undergrad at the University of Western Ontario, and I'm actually currently in

(55:24):
Ontario right now.
Oh, nice. My parents live here, and so, really, I'm super excited: in April I've been invited to speak at a symposium, a national symposium for Canadian social work, in Montreal, to really engage in this same topic of AI and social work.

(55:45):
So I'm really excited to be in community here with my fellow Canadians. I'll be presenting in April, and I know that's been at least one of the lenses of the work that I've done on an international scale, but I hope to do more. I know it's just the neighbor to the north, but to still do more
(56:05):
of that engagement with other places as well. So that may be another goal for 2025.

Speaker 1 (56:13):
Yeah, there you go.

Speaker 2 (56:14):
Maybe to build more. It's also really cool looking at the data for my website. It's been really exciting to see other countries, like Japan, Ethiopia, India, Dubai, well, the United Arab Emirates, just seeing other countries around the world that have been checking out my website.

(56:35):
So it might lead to other connections internationally. If you are out there and want to connect, please, by all means, I would love to chat at a university, do a training, or even just talk and network too. I think there's a lot more work to be done, even on an international stage.
I did connect with a social worker from South Africa who's
(56:58):
done some work writing on AI and social work practice in their country too, so I know there are other countries engaging in this work. So, shout-out, yeah.

Speaker 1 (57:10):
Yeah, I mean, it's one of those things that's going to be as ubiquitous as social media is now, so other countries are going to have to deal with it in their own way. I'm pretty sure the UK is going to look at it way differently than we do, and that's fine; you make it appropriate to whatever country and nationality you have.

(57:32):
So, yeah, that's really cool. I'm hoping to hear some more good news about those connections in the future. So let's see, what's our homework? We have to create the AI Avengers.

Speaker 2 (57:48):
Yes, and I've got some ideas for my next steps. I have a small listserv from my website, and I would love to survey the listserv about a platform. Maybe we can choose between Slack, doing a Google group, or maybe Facebook, but if there's another option that you think of, that might be a great platform.

(58:09):
We can even ask ChatGPT what's a great platform.

Speaker 1 (58:12):
I think Discord could be a good option; I've been using that a little bit lately. But yeah, we'll have to brainstorm a little bit. And if you guys have any feedback or suggestions for how we can create the Avengers, and if you want to be a part of it, obviously let us know too, because, at this point,

(58:33):
the more the merrier. The hive mind that we're going to start to build together, I think, could be a very powerful thing. And one thing that a lot of people might not realize is that you can actually share ChatGPT, like you could share it with people and collaborate, and they also have a team

(58:55):
feature; Team is like 25 bucks a month, so you can collaborate. I am curious about this $200-a-month ChatGPT Pro, that, obviously, I can't rationalize spending that money on. Genuinely curious to see, what is it?

(59:18):
What do you get, what benefits, you know?

Speaker 2 (59:21):
I know, it's piqued my curiosity. I don't know if I could financially commit to it right now. Maybe there might be an opportunity in the near future, but it definitely has piqued my interest also.

Speaker 1 (59:32):
Yeah.

Speaker 2 (59:33):
What can it do?
Yeah, yeah.

Speaker 1 (59:36):
OpenAI, if you're listening, we would definitely support you guys.

Speaker 2 (59:41):
Yes.

Speaker 1 (59:42):
Be on the team, whatever, manifest that. But yeah, even the Plus plan has been 20 bucks a month, and honestly, it's totally worth it to me. I am nervous that they're gonna eventually raise the price for that tier, but that's just one of those things. Yeah, we

(01:00:06):
will see. And then what do they do with all that data? What if it goes belly up? Then where does that data go, you know?
So that is something to think about too.

Speaker 2 (01:00:15):
Yes, these are important questions, and I think it goes back to what I always say: you want to think twice about what data you're putting in and what you're asking. Certainly, if you use it in your professional work as a social worker, never put in anything identifiable. But again, if you have the paid version, you can put some privacy settings on it.

(01:00:38):
So your information's not going into the main public data. But it's just always about being really mindful and thinking about it. Yeah, absolutely. Well, let's see if there's anything else that we missed.

Speaker 1 (01:00:54):
We got your projects. Oh, the YouTube channel, were you still going to do that, or are you still working on it, or is that on hold?

Speaker 2 (01:01:01):
That's something else I manifested into the universe that just did not happen.

Speaker 1 (01:01:04):
This time it just didn't.

Speaker 2 (01:01:06):
No, but I will have marketing interns in the spring.
So maybe it might happen, you know. It's still something that's always on the back burner, that I'd really like to get launched. So we'll see.

Speaker 1 (01:01:17):
We'll see if 2025 makes it happen. Oh, but speaking of launches, you are going to be working on another guidebook, is that right?
Yes, yes.

Speaker 2 (01:01:26):
So I did one, and it's free on my website right now at theaisocialworker.com; you can download a free 83-page guidebook for incorporating AI into your social work practice. But my vision for the guidebook 2.0 is not so much micro, mezzo,

(01:01:52):
macro, but actually taking the 30 most prevalent social work jobs and creating guides for each of those 30 most prominent professions in our social work field, with different prompts and different ways to use it.

Speaker 1 (01:02:11):
In that respect, yeah, that's awesome.
Yeah, I love that idea. And, you know, just also a plug for the field of social work: anybody listening that might be interested, if you've got a big heart and you like helping people, social work has so, so many flavors. I just learned about a new one.

(01:02:32):
I think it's like sports social work or something like that.
Yeah, yeah. And something was trending on LinkedIn; it was like, let's create the alphabet of social work, and it had all the different types of social work jobs.

Speaker 2 (01:02:48):
Yeah, different types of social work jobs.

Speaker 1 (01:02:50):
And so, yeah, just to plug where I am right now: I've recently transitioned to becoming a therapist, which is something that a lot of people don't know you can do as a social worker. But before that I was in the hospital system, and I'm also on the side doing

(01:03:11):
financial coaching with financial social work.
So yeah, there are just so many options. It can almost be overwhelming. Honestly, I do remember in grad school when they were like, all right, what do you want to do? What do you want to specialize in? I was like, I don't know, there are so many things, you know.

Speaker 2 (01:03:27):
Yeah, and I think that's really what I love about this field so much: how broad it is, how we can be in so many different areas.
And I think, too, as you evolve in your career, just like you shared in your story: you worked in hospitals, you've worked in this sector, you're now doing financial planning, you have a business. I think there are just so many ways that you can continue to

(01:03:50):
evolve your career.
So I just find it's never boring. It's something that always continues to challenge me and excites me. I've met so many incredible people, and I think also about the impact that you can have, from work with direct clients to community work and organizational work. It's just been so fulfilling to be in this field. And, as you

(01:04:11):
know, I love working with students, and I really try to impart that wisdom: you're really getting a degree that supports you in doing social work practice at the micro, mezzo, and macro levels. What a degree to have, which is very unique, unlike other professions. So yeah.

Speaker 1 (01:04:30):
Yeah, social work. This is absolutely team social work all the way. Yes, yes. Well, let's see. I think, man, we covered so much. And obviously I've been using AI for, you know,

(01:04:50):
making clips and posting; all the stuff that I'd have to do manually, I just couldn't do, you know.
And it's not always perfect. Sometimes it'll cut at a weird spot and cut off a little too early, but overall it does a really good job for the most part, you know. Yeah, like I said, it's just a tool, and

(01:05:14):
you can use it for making your life easier or just for something fun.
But yeah, I'll be using some tools to cut the footage, get some cool little clips going, and then we'll push it out into the ether. But yeah, any last-minute ideas, concepts, things that you wanted to highlight or touch base on, or anything like that?
anything like that.

Speaker 2 (01:05:45):
No, you know, I think we again, as always, had a great conversation.
I just love being in learning community with you, learning how you're using ChatGPT and these tools. And thank you again for having a platform to share and continue to raise awareness and, again, just education around these tools too, so we can use them ethically and responsibly. I look forward to our next chat, which will hopefully be soon.

(01:06:06):
Yeah, yeah, absolutely.

Speaker 1 (01:06:08):
You know, as I've been doing these podcasts and interviews, I've realized that people are busy these days, and it's kind of hard to get people on the same page. Just us doing this right now is actually kind of threading the needle; I'm surprised we were able to pull this off. But yeah,

(01:06:29):
it's been really fun and engaging, and I learn something new every time. So, yeah, just grateful to be here and to have people like you on to have these kinds of conversations. And hopefully we'll do some more, keep collaborating, keep in touch. So follow along with theaisocialworker.com.

(01:06:52):
Right, and The FIRE Social Worker is for me. Yeah, that's pretty much it. It's been a great conversation. Please check out theaisocialworker.com. And expect to see some really cool things; we're already seeing cool things, but even more cool things in 2025.

(01:07:15):
So, thank you, Dr. Badillo-Diaz.
I guess that's going to be it for this stream. Thanks to everybody who joined in and watched, and will watch later on down the road. So thank you everybody, and have a great night. Yes, take care. All right, you too.