
April 30, 2024 · 61 mins

Join us for the third and final installment of our conversation series, hosted by our Chief Strategy Officer, Jennifer Jones. We’re wrapping up with a crucial discussion on Technology and Child Welfare, featuring Takkeem Morgan, Executive Director of Mosaic Parent Hub.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_01 (00:12):
Good morning. Hello. My name is Jennifer Jones, and I'm the Chief Strategy Officer at Prevent Child Abuse America. I'm extremely excited to have Takkeem Morgan here with me today. I met Takkeem several years ago when we were both working on the Thriving Families, Safer Children initiative.

(00:34):
And I think we quickly found out that we are incredibly passionate about prevention in particular. So Takkeem is the founder and executive director of Mosaic Parent Hub. He's a data-driven marketing expert who has been recruited by the national nonprofit Foster America to serve as a strategic

(00:56):
marketing consultant for the Indiana Department of Child Services. In his role, he's using his personal experiences as a former foster youth, along with his expertise, to develop new and effective strategies for retaining and recruiting foster parents. He's got a passion for building partnerships across the private

(01:17):
sector to address issues impacting marginalized communities. And he's really making a significant impact on the foster care and child welfare systems. Most recently, Takkeem has joined the board of directors for Foster America. And you can learn more about Takkeem's work by visiting his website at www.takeemorgan.com.

(01:41):
So thank you, Takkeem, for joining us today and spending time with me. I always love our conversations. And today we get to have a conversation about some of the amazing work that you're doing, and some of the challenges that we're facing in terms of technology and innovation. I'm just really excited to dive in. But before we jump into our discussion, I just want to see if

(02:04):
there's anything you want to start off by sharing or reflecting on before we dig in.

SPEAKER_03 (02:10):
Well, as you said, Jennifer, it's always a pleasure to talk to you. You know, I always walk away feeling a little bit more informed, a little bit more connected, a little bit more strategic. You're an awesome strategic thinker. And so, yes, it's always a pleasure. In terms of, you know,

(02:31):
just excitement, I'm very excited when I look out at the macro trends that are taking place in human services and child welfare in general. There's a lot of excitement because I can really see that there seems to be a mental model shift. And so, you know, we can talk about it in detail as we

(02:52):
go through this. But that's one of the things that's really exciting me: I think there is change happening at a level that allows for significant systemic change as we go forward. So that's exciting.

SPEAKER_01 (03:07):
I love that. And, as you may know, Takkeem, and we'll talk a little bit about this in our conversation, Prevent Child Abuse America launched our theory of change last August to a lot of buzz and excitement. And one of our key strategies, but also one of the key leverage points we believe we need to engage in, is

(03:29):
this idea around shifting mindsets and transforming the narrative. And so as you talk about mental models, right, that's so important if we're actually going to create systemic, meaningful change. And so I'm excited to hear that that's a trend you're seeing. I think it's so important. We've seen it with a lot of other movements in

(03:52):
our history, that that mindset shift is really when you hit the tipping point and are able to make true, meaningful change. So we can certainly dig into that more. Takkeem, you and I both have a passion for innovation and creativity. I think that's the beauty of our sector: the ability to really think about social issues and root causes and

(04:15):
systemic issues, and then use our strategic thinking about how we can be innovative and creative in addressing those complex issues that families are facing today. And so we really have the opportunity to elevate this idea of innovation in the social services sector. You've been talking about it in your work, and we've been talking about it in spaces like Thriving Families.

(04:36):
So I would really love to hear from you, Takkeem, about your perspective around innovation in prevention in particular, but certainly feel free to share about innovation in other spaces as well.

SPEAKER_03 (04:49):
Yeah, so when I think about innovation in human services and social services, I think what is compelling is the acknowledgement that the system as constructed is not designed to address the problems and the challenges that we are facing,

(05:12):
that the system was designed to address a perceived problem that is not the one we're actually facing. And so because we have a scenario where a system is in place and operating and designed for a different set of problems,

(05:32):
then we have to quickly both identify the actual problems, the ones that make sense to solve, that we really want to solve, and that we can get positive outcomes on, and the solutions for addressing those problems. We have to do those things at the same time. And I think when you have a situation like that, that's

(05:56):
when innovation becomes very valuable, right? Because you have to do things differently and you have to do them quickly. And you essentially have to try and fail fast in order to come up with solutions that would allow us to move forward on the challenges that we know we're actually facing, not the ones we

(06:16):
thought we were facing. And to be more specific, I think child welfare, and much of human services, was in part designed for exceptions, right? We thought that perhaps we have exceptional parents that aren't, you know, loving their children and aren't, you know,

(06:38):
stepping up to the plate. Perhaps we have exceptional communities that have a problem here or there. But what we're finding is we're not dealing with exceptions, we're dealing with systemic challenges, and we need systemic solutions. And the ones that are in place don't fit the bill.

(06:59):
And so we need to innovate. And that's what's exciting. Rapid change and new sets of problems require an innovation mindset.

SPEAKER_01 (07:09):
I love that. And as I was talking about our theory of change earlier, our whole approach was around adaptive strategy, right? It was an adaptive strategy process, which is the idea that we need to be flexible, we need to be nimble, we need to be able to respond to the ever-evolving world around us.

(07:29):
And I think you just alluded to that with this idea around rapid feedback, rapid cycle, rapid testing, right? These systems are complex and complicated. And we can't create something where we wait five or ten years. We can't do like we did in the past, where

(07:51):
we create these strategies, we wait for RCTs, randomized controlled trials, and five or ten years down the road we decide it doesn't work, or it shows that it doesn't work. We just don't have that luxury anymore, right? We need to be innovative, we need to be creative, we need to be adaptable. I also love what you said about systems, that these systems

(08:11):
weren't designed, right? Or they were designed to be able to do that.

SPEAKER_03 (08:15):
The challenges that we face, right? That's exactly right.

SPEAKER_01 (08:19):
That's exactly right. And so the child welfare system in particular was designed to keep kids safe, right, from physical abuse in particular. And we know the history around the orphan trains and all the things that led up to our current child welfare system. But it wasn't designed to

(08:42):
address things like poverty, right? Or prevention. And we know now that anywhere between 60 and 70 percent or more of the kids coming into the child welfare system are there due to neglect, right? Due to issues related to poverty.

(09:02):
And so how do we now change, right? How do we now address the fact that poverty and racism and the things we know about are resulting in, or contributing to, the current child welfare system and the kids and families that are coming to the attention of that system?

(09:23):
We now have to change, right? We have to think differently and we have to be innovative and creative. I love that. I love how you're thinking about that, and why, and the systemic challenges that need systemic solutions, right? I want to talk a little bit about technology, right?

(09:43):
And in your bio, you say you're a data-driven marketing expert. You and I have had conversations about technology, and we see that, right? We see the advances and enhancements of technology playing out in our daily lives. We saw it during the pandemic in particular, where we had to rely

(10:04):
on technology in different ways.

SPEAKER_03 (10:06):
Um, and we had to use many things, yep.

SPEAKER_01 (10:08):
Right, and to get resources and supports and communications to families. So, how else? I know you're really thinking about this, and I love that. I mean, you talk about innovation, right? You're really thinking about how we can be more innovative when it comes to technology. So tell me a little bit about how else you're seeing families use and rely upon technology today.

SPEAKER_03 (10:31):
Yeah, so I think that at a very basic level, we realize that families are nuanced, right? Families in general are very nuanced. Each family has a different configuration of challenges that they're facing. And so whenever we commit to actually serving those

(10:55):
families and identifying and trying to meet those family needs, we inevitably have to bring a myriad of resources to the table for those families. And, you know, just by virtue of the world that we live in, the best way to do that is usually through some type of technology, some type of intermediary that allows for

(11:19):
multiple resources to intersect. And if nothing else, we're looking to be able to track and measure the impact of those resources. And so it becomes easier to do that at scale when we leverage technology. So you ask, what types of technology?

(11:41):
I think apps are huge, right? So you all have the Healthy Families initiative. One of the things I noticed as I was doing research here in the state is that there is an app that at least the service providers use to manage those engagements, right?

(12:01):
There's an app that shows how many mothers, how many young mothers, are in the program, how many are being served, what their level of progress is. So that's a prime example. This is a young mother, right, trying to be prepared for her future. And we're saying, look, if you don't understand how to manage some of the challenges using technology,

(12:25):
then you're probably not going to be very prepared for the future that we see on the horizon. And so right from the beginning, we say, look, let's leverage the technology. Findhelp is a very popular tool, right? Where if you just look at the approach and the

(12:45):
name, it's very generic. It's saying: listen, when you need help, you need a lot of different resources. How about we aggregate those resources in an easy-to-navigate space that's digital, and then we can communicate with you about whether it meets your needs or not, and be able to continuously improve.

(13:08):
You know, there are professionals with lived experience all across the sector who are coming up with technical solutions to challenges that are out there. And the truth of the matter is, not every solution requires technology, not at all. The challenge that we have in child welfare, though, is that

(13:29):
we generally are technology averse. We're usually slow to adopt technology. And here's the thing: when you're dealing with what some people have called surveillance, when you're trying to find, seek out, and identify bad actors, that's

(13:52):
basically a one-to-many. It's one state, one agency, that's going out and trying to take this view of families. So that's one way to solve a problem. But if you shift and you say, what do you need, and you allow families to opt in, well, then now you have to have the

(14:14):
capability to serve them in a different way, right? Because the service on the surveillance side comes from you going out and deciding who to serve and when. But if you actually look out there and you see what the challenges are, poverty, right? And if poverty is being conflated with neglect, then maybe you need to allow parents to opt

(14:37):
in. Maybe you need to allow families to opt in. And when you allow them to opt in, you're going to need to be able to manage that information flow and that engagement differently, usually requiring some type of technology. At the very least, you're going to be a lot more efficient if you're able to leverage technology. And here's the other thing. I'm speaking a lot about

(14:59):
technology, but when you talk about innovation, it's not just technology; sometimes it's process, right? So the shift from a "me to you" model to a "you to us" model, that shift from push to pull, maybe, can be an innovation. Like the way that we're going to meet needs. You're probably aware there was an initiative

(15:21):
recently launched to help families, and it's all about families saying what they need and responding to those needs.

SPEAKER_01 (15:28):
Which seems so simple, yet we don't do that often enough, right? Listen to what families need or want or say. So... go ahead. You were going to say something.

SPEAKER_03 (15:43):
I was going to say, that's exactly it. You know, my career started in the private sector, right? I completed my master's and I went into corporate strategic marketing, and I operated on the business technology side of things.

(16:03):
And the interesting thing over there, just as a marketing expert: our golden rule is to go to the customer and ask, how's it going? What do you need? How's what we made working for you? What changes would you like? You know, we call it consumer preferences.

(16:24):
It's the rule that consumer preferences change, and they change often. And so you always have to be in touch with the consumer (I'm talking obviously about the private sector side) to know how they're feeling and how what they need changes as they become more educated, as their needs become more sophisticated.

(16:46):
So when I made the transition into the public sector, it was surprising to me that there was not a mechanism for a feedback loop, not a mechanism to constantly engage with the end user and allow the end user to help you co-design the solution, ultimately. Because we know that the best technologies, Google, Apple, you

(17:08):
know, Amazon, all of these, the best ones have really close relationships with their end users, and they're constantly improving and perfecting that service or that product. And obviously we're the public sector, but when we talk about process, right, and we talk

(17:29):
about systems, this is something we could learn, right? That constant feedback, a system for constant feedback and continuous improvement, and making that just as high a priority. Because the reason they use it in the private sector is that it delivers excellence.

(17:49):
And obviously, if you deliver excellence, you can deliver profits, right? But in the public sector, if you deliver excellence, you get outcomes, right? You get the outcomes that you want, consistently and reliably.

SPEAKER_01 (18:04):
Well, it's this idea that, you know, how ludicrous would it be to go out to market with a brand new product, like an iPhone, for example, or, you know, Google. I actually participated in a Google research study, right? I went into their offices and they said, how do you feel about this? I think it was actually an app about travel.

(18:26):
But it's that idea that you wouldn't go out to market without actually testing it first with your end users. And we don't embrace that at all, right? I mean, in our sector we're constantly developing strategies and solutions and programs and interventions

(18:47):
without ever talking to people. And then we wonder why it doesn't work when we go out and launch it, right? And so I think that's so critical. How do we bake that into who we are, into the DNA of our organization, so that it's just part of who we are and what we do, just like it is for

(19:08):
Google or Apple? I love that. I think we need to do more of it. We need to figure out how to do it better. You brought up two things that I want to respond to. One is that we're technology averse. And, you know, I worked in state government; you're doing that right now with your consulting.

(19:29):
And I always had this perception, or imagination, that state government, that government, should be leading the way in terms of technology, right? They have the resources, they have the people, the experts, right? You have all the knowledge, right?

(19:49):
And the need. And the need, for sure. And so I always thought that we would see technology coming out of government that was far more advanced than what we see in our own individual lives, but that's not the case, right? You go into these governments or organizations and

(20:12):
the technology is, you know, 20 years old in some cases, right? And so how do we get better about using technology? How do we stay innovative? How do we stay in front of the trends in terms of technology? You also mentioned apps, and I think it's interesting

(20:33):
because you never really think about how much you rely on apps, right, during your every day. I have my travel app, I have my weather app, I have Shazam when I'm listening to music and I want to know the song, right? I use Headspace for meditation. I know you do meditation as well, and I want to hopefully

(20:53):
get to that. But it has become part of our everyday life. And so how do we use it to our advantage? How do we design solutions that incorporate some of this cutting-edge thinking, in terms of technology and solutions for families?

(21:17):
But we know there are concerns, right? There are definitely some concerns, some challenges, about relying on technology to do our work. And so I want to ask you that. What are some of the concerns or challenges you see? How do we ethically resolve those issues? And we're going to get to AI here in a minute, but I think AI

(21:39):
brings some of those ethical challenges to the forefront. But talk to me a little bit about the concerns or challenges that you're seeing around technology and our work, and how do we think about ethically resolving those?

SPEAKER_03 (21:55):
Well, I think privacy and bias are right up there at the top of the concerns. Because we are in such a data-rich time period and environment that you can almost use data to paint

(22:16):
really any picture, to get a clear view of things that maybe others don't want you to have a view of, right? That's a real concern: if you have too much data about me and around me, then all of a sudden I can't be safe.

(22:36):
I can't be a private person. I have to be a public person, which means that I have to be subject to all the facts as well as the falsehoods and the perceptions, right? Now everyone can paint perceptions on me, and that could cause problems.

(22:57):
When I say me, that's me as an individual, me as a community, right? Me as a stereotype, you know, because we still have those dynamics at play too. So privacy is an issue, and bias is an issue, right? Because of the historical exclusion of various people, various communities, and various

(23:21):
groups, if they're not included in data and they're not included in technology, and technology becomes the status quo, then perhaps the status quo is the exclusion of their ideas, thoughts, cultures, and needs. And so those concerns are very much flying high when we talk

(23:42):
about adoption. And often, because government usually has challenges when it comes down to this equity question and this service question, they're like, we don't need another thing that can cause us problems. And so that's where you see

(24:03):
some of the opportunities with technology and innovation bumping right up against the cultural norms within the child welfare and human services government space. And so those are the things that we'll have to just be very mindful of and monitor. I know you said we're going to talk about technology a little bit.

(24:24):
At some point, the opportunity outweighs the risk, right? The risk of, hey, is this just another thing that's going to make us more biased? Well, is there an opportunity? I wrote a paper a little while ago about what if you had the folks who have the need for the technology involved

(24:47):
in the design and development and continuous improvement, continuous evaluation, and you just made that a norm. So now you're giving them the power of this resource. It's almost like having a safe way to pipe the electricity into the house, right? As opposed to saying, no, we've got to

(25:08):
ration it out, you've got to come down to the government to get access to the things you need electricity for, like refrigerators and microwaves. No, we say, you know what, we'll figure out a safe way to give you access to this power. You know what I mean? And I think it's the same here. And, you know, electricity obviously is dangerous, right? You put electricity in the house, houses can burn down.

(25:29):
I mean, it happens all the time, but the benefits outweigh the risk. And so we pipe that electricity down to the house. And I think it's the same with some of the new emerging technologies, especially AI. I think it is really that powerful.

SPEAKER_01 (25:42):
Yeah, I love that example, because in order for us to get ahead, as we do in our world, in our society, right? We talked about innovation. You have to think about the risk, but you have to also think about the opportunity. And if we didn't think about the opportunity in some of the things that we've created, like electricity, we

(26:05):
wouldn't have the luxury of being able to turn on the lights in our house, right? And so I think that's a really important point. I think oftentimes, at least in our sector, and I'm sure other sectors as well, we tend to be more risk averse. And we tend to, you know, spend a little bit more time,

(26:26):
and rightfully so in some cases, right? But how do we get at that, right? How do we think about the opportunity? I also love the point you made about government having challenges with equity. I think that's a nice way of saying it, right? But I think that in all of the things that we talk about related to our work and child welfare and technology,

(26:50):
equity is a huge part of what we need to think about, what we need to make sure is at the top of our mind, right? That we're always viewing this through, in particular, a race equity lens. And so I love that you brought it up. One thing I also want to add before we go to AI, because I

(27:12):
definitely want to go to AI next, is social media, right? We didn't have that, or at least I didn't; I'm a little older than you, Takkeem. I didn't have that when I was in college or a kid, right? And, you know, sometimes I'm very grateful for that. But I have a lot of family members that have kids,

(27:32):
and they're like, no posting of my child on social media, don't show the faces, all of those sorts of things. And for me, I'm like, oh, well, it would be great to be able to share a picture of my niece or nephew or whatever. But it's a real issue that, you know, I think parents

(27:54):
today have to be concerned about with social media, and how someone can actually steal a child's photo and voice and use it for bad. So I think there are some of those risks as well that parents have to be concerned with and

(28:15):
think about. So I would love for you to address that at any point, but let's go to AI for a minute. And the other thing that I was going to say... yeah, I'll say a word about that.

SPEAKER_03 (28:28):
I think so. I think the danger is real, and it's been proven; there have been several stories written on some of the awful things that can happen when people play with this new technology. But I think a lot of this concern is concern that takes

(28:48):
place at the beginning of any new technology, when people are experimenting with it. I think it's great for it to come out now and for us to raise those concerns, because then we can develop solutions and develop protocols and rules. Now, we know people will break rules no matter what. You can have all the best rules in

(29:11):
the world, but some people are still going to break them. So the fact that people are breaking the rules or doing nefarious things is not, I don't think, enough to say let's run away from this thing. But I think it does tell us, hey, here are the things you want to be aware of, and here's how you want to design solutions so that we prevent and don't allow for this type of behavior,

(29:34):
or prevent it as much as possible. I think that's the thing about that. But the truth is, once you start to show some of the actual value of some of these tools, then I think you're going to see people using them for the things that get them the most value. I mean, you

(29:55):
know, you'll have one or two people doing this, but most people will probably end up using it to benefit themselves, to increase their quality of life, if you make it such.

SPEAKER_01 (30:06):
Yeah, I think that's right. I mean, there are always going to be people who use it for bad, right? And not for good. But again, your point about the opportunity outweighing the risk. We've got to think about that and use it in that way.

(30:26):
So I want to talk a little bit about AI. You've been writing about this, you've been doing work around AI. Again, I love your innovative spirit and your thinking about how we can use this in better ways for our work. So tell me a little bit about what you're thinking, what your perspective is on the impact of AI and how we

(30:47):
can use it to better support families. I know you've done some of this in your thinking and in your work, and I would love for you to share that with us.

SPEAKER_03 (30:55):
Yeah, thank you. I think the thing that makes AI so exciting is the way that it intersects with some of the macro trends that we see in human services and child welfare in particular. From a macro standpoint, I think you see a shift from, as I

(31:16):
said, surveillance and monitoring to allowing families to opt in to services, and taking more of a service posture, a customer service posture, where we want to make access easy and we want to make prevention the rule, where we're not waiting for crisis in
(31:38):
order to help because we know atcrisis it's most expensive
usually right so if we're reallygoing to do this and do this uh
economically efficient thenwe're really gonna try to
leverage all the tools that wecan to lower the cost and really
prevent you know crisis fromfrom um taking place.
And so I think when you look atuh the capabilities of AI and

(32:02):
its rapid pace of improvement, and the myriad of use cases for its strengths, the applicability of its strengths, then it coincides with the change perfectly. And what I mean is that when you shift from a surveillance, you
(32:23):
know or monitoring model to moreof a customer service model you
inevitably have to be able to uhconnect with assess and be able
to reach more people at one timeright and so the language models
one of the obvious and just sortof uh you know immediate

(32:44):
benefits is that it's processinglarge amounts of information in
relatively short periods of timeso just that kind of high level
basic functionality means thatokay so we could potentially
understand the feedback of 5,000families in their nuanced
experience of a solution uh in away that we couldn't before

(33:06):
right that we can query a onemodel one system on 5,000
different experiences and if wewanted to we could say list each
one of them and it would havethe capability of listing 5,000
you know or we can say aggregatethem or you know um and uh I
think that capability just opensup a world of possibilities um

(33:29):
you know and I think we're kindof definitely just scratching a
surface of understanding okayhow does that allow us to scale
effective solutions you knowyeah I think it's like with any
you know with any new technologyit's like understanding the
breath like it's kind of crazyright that you can you know I

(33:49):
use AI um on occasion you knowand it brings up everything
right it just like brings upeverything you ever wanted to
know.
I mean, it writes papers for you, right? The capacity is endless and tremendous. And let me just say something about that writing-papers thing. Really, with the model that we see that's public, I'm probably

(34:14):
underselling it a little bit butthe the main thing that you see
taking place yeah is the modelbeing able to effectively
manipulate the English languagedown to the letter right it
understands how each lettercombination occurs I mean it
literally so the the model thatwe see out here is doesn't think

(34:35):
at all it only guess itliterally is like a mathematical
model right and if you thinkabout 26 letters in the alphabet
right there's a limited numberof words you can make up words
that sound phonetically correctbut there's a limited number of
words right what if you can makea machine that could tell you

(34:58):
every combination of those 26letters and those limited number
of words such that if youstarted a sentence it could give
you the probability of the thecomplete sentence the variation
of complete sentences right ifyou started it right and it and
it's a calculation machine andthat's all you're seeing right
there.
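[Editor's note: the "calculation machine" description above can be made concrete with a toy sketch, added here as an illustration rather than taken from the conversation. Given the start of a sentence, a model built purely from counts assigns probabilities to the next word, with no understanding involved. Real large language models operate over learned token statistics at vastly greater scale, but the flavor is the same.]

```python
from collections import defaultdict

# A tiny "calculation machine": count which word follows which in a corpus,
# then turn those counts into next-word probabilities.
corpus = [
    "families need support",
    "families need resources",
    "families deserve support",
]

counts = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1

def next_word_probabilities(prev):
    """P(next word | previous word), computed straight from the counts."""
    total = sum(counts[prev].values())
    return {w: c / total for w, c in counts[prev].items()}

# Start a phrase with "need" and the machine reports what may follow.
print(next_word_probabilities("need"))  # {'support': 0.5, 'resources': 0.5}
```

Nothing in this table "thinks"; it only reports how often each continuation occurred, which is exactly the guessing-by-probability point being made above.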
But that same calculation machine, that same capability,

(35:22):
you can put in things like statistical information about the effectiveness of, you know, a universal basic income program, and query it the same way. And that's even more limited: there's a limited number of outcomes that have taken place within this UBI experiment.

(35:42):
But now you can query that full data set at lightning speed, you know, if you were to run your assessment through this AI, right, through this language model.
You know, so now, you know, there's challenges, right, because it's new, right? But you can just think about what that

(36:03):
power could represent, like how quickly you could begin to make decisions and adjust once you know that you can rely on the calculation, the assessment, right? Yeah, after you get past that hurdle, you're like, whoa, we can do amazing things here. Yeah, and I think this is where, and I want to get to this

(36:23):
in a second, I think this is where equity comes in, right? But let me ask you this: do you think that there might ever be a moment in time where we would use AI for making case decisions in child welfare? You know, I think that when you say making case decisions, right, so what if needs never

(36:47):
became a case? Like, what if needs never reached the point of a case, right? Now, they may need to, right? But here's the thing: my interest in AI is way upstream, and so that's why I give you that counter. I don't think that it's necessary for us to use the best technology we have to make case

(37:10):
decisions. I think case decisions, I mean, you know, when we look at them on the whole, they're just fraught with problems. The, you know, primary assumptions often are off, you know what I'm saying? Yeah. Because it's a human making a decision, a judgment, right,

(37:30):
about a particular family based on your lens in this world, of which equity is a big part. But I love that you're like, I don't even want to think about that; I only want to think about the prevention space. But it's an interesting question, right? It's an interesting question. Like, in today's child welfare world, we still have a child welfare system, right? We haven't gotten

(37:54):
rid of it, we haven't transformed it, it's still here. And it's curious, like, would there ever come a time, or are there folks thinking about, well, we could actually use this technology to decide whether or not a family even comes into the child welfare system? So I think there's some interesting... Well, what I will say is that it has been used that

(38:18):
way.
So, predictive... you know, Allegheny County had a famous case, predictive analytics, you know, where they used predictive analytics, which, you know, some of it is very close to the functionality, at least the, you know, kind of calculation functionality, of the new AI

(38:39):
models, the new large language models. But, you know, I just think that we're not there, where we want any robot making intimate decisions for us, you know what I mean? I don't think we're there now, you know. Robots can help us get what we need, we would love that, you know, carry

(39:01):
more stuff for us, like, absolutely, you know what I mean? Fix things, you know what I mean? Give, you know, give forewarnings about things, look around corners for us, like, that's all great. But, you know, deciding whether or not to take my liberty, or, you know, take my child, I don't think that that's what we're asking for our robots to do, you know

(39:24):
what I'm, you know what I'm saying? Yeah, and I think those are part of the things, those are part of the conversations that are being had out there, right? It's like, what are the consequences, the negative consequences, of some of this technology, right? It's like, here, I live in Madison, as you know, and on campus they have those little robots that

(39:45):
are driving around campus all the time, and it's crazy when you drive down campus and there's these little things, and in those little things are food that people order, and it gets picked up at the restaurant and it drives to the dorm or, you know... And it's like this little robot, right?

SPEAKER_01 (40:05):
Those are the kind of things I think, you know,
that you're like, that's a great idea, right?
It's like, I can order food and this little robot delivers it to my door.
But, you know, those are the kind of technology things, I think, that, you know, we all appreciate, and we know, and this is what I wanted to ask you next, is that we have to be aware of

(40:26):
some of these things, right?
We have to be aware of some of the consequences.
We have to be aware of the equity issues.

SPEAKER_03 (40:31):
So, so tell me a little bit, or talk to me a little bit, about what are some of the challenges that you're seeing, barriers, concerns, that you're hearing about or seeing when we think about AI? Yeah, I think, you know, one of the biggest challenges and biggest barriers is the data that's used to train the large language models, right? You

(40:56):
know, when we talk about equity, you know, we know that there was not an eye towards equity in the initial training of the models, and we've done extensive research. I work with Chapin Hall's AI working group, and we do a lot of review of current research on the use of AI, and,

(41:21):
like I said, the issues of bias and equity are way up, because they've tested these models, and they've done things like facial recognition, and they struggle to recognize minority faces, right? So you've got these, you know, amazing technologists, like, look, we can recognize your face, we can associate things, but not if you're, you know, not if you're

(41:46):
too dark, right? Not if you're too, uh, ethnic. It's the same thing with gender, right? So when we asked the models in the past, and I don't know that they've even improved on this, but when you asked the models in the past, you know, give me some characteristics of this person, and you gave this person a traditionally female, or woman's,

(42:09):
name, then the model, you know, gives you stereotypes and bias, right? And the same thing for ethnicity, same thing for race, you know, and religion in some cases as well. And so, you know, these sorts of things are the result of the training data for the models, which, you know, was likely

(42:33):
popular data, so data that's available on the internet. And we know, you know, there's a digital divide; there's just so many people that aren't on the internet. The internet, in many ways, is still a luxury to many. You know, we've closed that gap in the US somewhat, but there's still a lot of folks that aren't there. And,

(42:56):
beyond that, a lot of folks use the internet to consume, but when it comes down to contributing, you know, their perspective, their ideas, their, you know, viewpoints, there's not a, you know, there's not an equitable amount of information about everyone, particularly cultures, that are on there. And so these models have been essentially designed to be

(43:19):
biased, or, you know... Yeah, they've been designed to be biased, and so you see a lot of scrambling to correct that, which is important, and it's doable. That's the other great thing about the models. I mean, you think about last year... In artificial intelligence, language models have been around; I mean, you probably can remember websites

(43:42):
where you, you know, speak to a chatbot, and it has a pre-selected number of answers and multiple-choice questions and things like that.
You know, that's an artificial intelligence, and that's a language model as well, but if you see, it's all pre-populated; the questions and answers are pre-populated. It's just kind of pulling from, like, a menu, a spreadsheet. Yeah, that,

(44:06):
yeah, menu of answers. Whereas the new model is much more sophisticated: it's in real time, having an interaction, yeah, you know. It's just a lot more powerful, and it's moving a lot faster. The models, though, require a lot of energy. That's the other concern, right? Because of the amount of

(44:28):
information that they're processing, they require a lot of energy, and so as you proliferate the use of these tools, you're actually going to increase computing power dramatically, the use of computing power. And that's kind of like the area that a lot of folks don't talk about a lot, but everyone that is

(44:50):
developing is well aware that we need to increase our ability to efficiently, you know, use and leverage computing power. And when I say efficiently: without destroying the globe, you know, without burning too much energy, and, right, right, you know, overheating, because they really

(45:11):
do require... Yeah, our next conversation can be about climate change and the environment, because there's so much of it, right? Right. Yeah, but that is a real concern with the computing power, with AI, the servers, and, you know, managing that, so they're trying to do that efficiently. Yeah, it's something that you don't really think about, but it's, again, the concerns and the consequences and the things that we have to at least weigh out, again, this

(45:34):
idea around opportunities outweighing the risks. Like, what are the risks, what do we understand them to be, and then how do we, you know, how do we address those? I want to shift just slightly, and we have talked about this a little bit.

SPEAKER_01 (45:48):
So you have shared your story around being a youth aging out of care, a former foster youth, and you've talked about the importance and the impacts of centering families in decision making. You've talked about co-design; you've talked about, you know, making sure we're talking to the end user, right? So how do we, you know, I think this is a

(46:10):
conversation we have a lot in our work: how do we do that in a meaningful way, so it's not just gathering feedback, but it's actually co-design, it's actually sharing power? And then, how can an organization like PCA America be a national leader in this space around centering families?

SPEAKER_03 (46:31):
Yeah, and I think the way that you center families is that you make them a central part of the system, of the change mechanism. I'm really excited about this new organization that myself and

(46:54):
about 50 other professionals are launching, called Lead, and it's a professional network for people working in human services that have firsthand experience with systems, with the systems that they're trying to impact. And the whole idea is that there's an intrinsic motivation with individuals who

(47:17):
have navigated these systems, if they've committed themselves to improving those systems. There's an intrinsic motivation, and that intrinsic motivation is of significant value to systems change and to, you know, future systems, because it's the sort of thing that every good employer wants:

(47:40):
they want an employee that's motivated internally, that's not motivated by the incentives that they're given, but motivated by the opportunity to make a difference in the world. And I think that when we look at some of the mega trends, some of the paradigm shifts, within

(48:03):
the child welfare space, and a rapid change, I think it would be important to incorporate some of those professionals that have firsthand experience, that have a commitment, an intrinsic motivation, in that change process. And they're gonna bring their families and their communities into the fold, if by no other reason than the fact that

(48:27):
they're connected to them. That's what they know, and that's part of the experience they bring to the table: the experience with their community and with their families.
I think that's one way to do it.
I think the other way to do it is to center solutions in the place where the challenges are, right? Camden

(48:48):
and New Jersey have done this very well when they rolled out their early systems of care.
I know they've had some challenges as of late, but when they initially rolled out their systems of care, they were getting a lot of responsiveness from communities, because they placed access points in communities, and they allowed the community to essentially own the experience, the

(49:11):
customer experience, of those places, meaning that they didn't have to look like government buildings; they could look like community centers and community places, places where people can feel comfortable. And then they had services co-located in those places, right? And I think that that's another way to do it: to just, you know, be committed to

(49:34):
partnering, right, in a way where you're actually sharing the value. By sharing the value, some of that value of the solution has to get left in the community, with those individuals, right? Because then that's their sustenance as well; that's how they sustain their part of the partnership, is that some of that value has to be left in that community. And so

(49:58):
I think that those are two great ways to do it: to just bring on and support professionals with lived experience.
So, you know, PCAA could certainly support and partner with an organization like Lead, right?
Yeah, that's developing and training professionals.
For instance, one of the things that Lead wants to do is workforce readiness, right? So if you all have requirements and you have, you know, workforce needs, working

(50:23):
with Lead to understand what those needs are, and then developing some type of mechanisms for making people more likely to be prepared for those opportunities.
Not that it's a guarantee, but they're more likely, right? Here's what PCAA is about, here's how a person with lived experience

(50:45):
might intersect with PCAA, right? Just that sort of education would make it a lot easier for someone to go from, say, working at another nonprofit to working for PCAA.

SPEAKER_01 (50:58):
Yeah, I love that, and congrats on Lead. I'm excited to follow that, and certainly very excited to see how we can partner with you all, with Lead, as we think about centering families in our decision making and, again, making it part of our DNA, our organizational DNA. Takeem, one

(51:20):
more question I want to ask, if you don't mind, before we wrap up. So I saw, again, just on social media, we're friends on Facebook, I saw that you're starting to do art again, and you told a story about that on social media, about your art, and I

(51:40):
found it to be very profound and touching. And you said in the story you had realized it was the first time you didn't have a home.
And so, if you don't mind, share a little bit about that, and then I would love to hear what your favorite art medium is, and when we'll be able to buy your artwork in mainstream

(52:02):
stores.

SPEAKER_03 (52:03):
Yeah, thanks for that question; it's a great question.
So yeah, it's true.
So I aged out of care when I was 18 years old, as I was entering college. I didn't know that there was the possibility, that I was basically entitled, to support

(52:25):
while I was in college. I didn't realize that, you know, aging out was an option, and, you know, it was really kind of miscommunicated to me.
I thought that that was what had to happen: because I was going to college, I was no longer in care.
So I didn't get the option of getting that added support once

(52:46):
I was in school. And what that meant is that, you know, my dorm was my home, right? Because I was no longer in a foster home; I was aged out; they were no longer receiving any type of benefits for me.
I no longer had a place to live at, you know, my former foster home, which is where, you know, that's where I went to college from. And so I had a portfolio of work that I had

(53:11):
done since the time I initially went into care; actually, it was the portfolio of the time period that I was in care.
I started it at my first kinship placement, with my art teacher, Marcy Morris, and I had built up this portfolio all the way through the end of high school, and it was very precious to me, you know. It basically was the development of

(53:35):
me as an artist, because it's my early stuff, and then some of my bigger, more detailed, more intricate pieces.
And when I... I stashed it, I tried to stash it, right, because I really didn't have a place at this home; it was somebody else's home, and they had a family and all that, and so I

(53:56):
tried to stash it, and I was really, I was self-conscious about it.
So I didn't say to someone, hey, I'm gonna put this up, because I kind of knew that, you know, it wasn't that type of situation, right? They likely were not going to be able to, you know, hold and preserve my artwork, and you

(54:18):
know, who knows, maybe they could have; maybe if I wasn't so self-conscious, they might have. Nonetheless, I tried to hide it, and then I came back, and it was just gone.
And, you know, it's like my worst nightmare. I was just like, oh no, it's gone, you know. And when I realized that it was gone, it was a confirmation. I guess I kind of already knew that: this is

(54:39):
not your house; that's why you need to hide it somewhere. But yeah, when I came back and it wasn't there, and then I asked around, and no one... it was not even a thought, you know what I mean? They didn't know where it was; they had no recollection of moving it. And we had a great relationship; it wasn't like I was treated like a stepchild or

(55:00):
anything. It just was kind of a fact, you know. And so, yeah, I realized, I was like, man, you don't have a home, you know. And I had to just kind of come to grips with that, you know. And I decided I wasn't going to do art, because, you know, for a person that doesn't have a home, art is just too much, you know. It's too much to manage, you know, because if you carry it

(55:22):
around, it's gonna get all crumpled up. And so I gave it up. Yeah, I remember really making a decision around that, because I was like, all right, I'm not gonna do any more art, you know. And one of the things I did say, though, is, I was like, I'm gonna do my art in a different way; I'm gonna live it out, you know. That's one thing I did say to myself: I'm

(55:43):
just gonna be creative in the way that I live, in the way I go about life.
And I think I've kind of maintained that. I do get a little serious sometimes, but I think I've kind of maintained that a little bit. But yeah, I came back to it recently, and it's interesting, because I went and saw a Salvador Dalí show; it's an immersive experience.

(56:03):
And as I was looking through Dalí's work, immediately, as soon as the show started, I had a rush of energy that put me back into my art world. And actually, the way that he went about his pieces and his design, I related to it in many ways, because he

(56:24):
wasn't trying to be real, in a sense; he was trying to express how he was feeling, and his emotions, and I realized that that's, you know, the type of artist I was as well: I wasn't drawing for perfection, I was drawing for expression, and I was painting for expression.
And so, creatively, you know, when you're painting to

(56:47):
express, there's a real strong creative component that, you know, doesn't have to be grounded in reality, which is liberating from an art standpoint.
But that's what I realized about this pain: that it brought me back to the artwork.
And so I immediately conceived of two, three pieces that were all focused on pain.
And so that's the series that I'm, uh, two pieces into right

(57:12):
now, and it's this series that's focused on pain.
And although it's focused on pain, it's not a sad series, you know, because what I realized is a lot of my life has been shaped by pain.
But, you know, I think that by embracing pain and understanding the roots of pain, you inevitably, or you have the

(57:34):
potential to, unlock a path to joy, right? Because you understand what it is that brings you pain, and you understand why, you understand, you know, what it is about your humanity that makes this painful, and then, you know, you can chart a path towards joy. At least that's what I do.
That's what I've done.
I realized, though, once I acknowledged that pain, I realized,

(57:55):
okay, this is why you play golf, and why you have so much fun with your son and, you know, encourage imagination and still play with action figures, you know, and things like that: because the seriousness of life, the problems and the challenges, have caused pain, and some of those need to get dealt with, and we'll deal with them, but we'll preserve space for joy, so as not

(58:18):
to be consumed by pain.

SPEAKER_01 (58:20):
So, I love that, I love that, Takeem. Well, first off, I love that you've found your way back to art, and I so look forward to watching that, when you're ready to share it.
I think, you know, just the profoundness of that story... People that haven't been in the system, or been

(58:43):
touched by the system, just have no understanding, right, of, like... That's something that was really, you know, profound for you in the moment.
And that's something that, you know, people who haven't been part of the system don't think about.
Like, I can keep my stuff at my folks' house, and I did when I

(59:04):
was in college, and I didn't have to worry about that.
And, you know, that's something that you just don't think about.
And so I really appreciated you sharing that with us, and I love that you're back doing your artwork.
And if you ever get a chance, go to the Dalí Museum in St.
Petersburg.
It's pretty powerful, in St.

(59:24):
Petersburg, Florida, so if you're a Dalí fan, you'll love that.

SPEAKER_03 (59:28):
Check it out.

SPEAKER_01 (59:29):
Yeah, so, so go check it out.
So, Takeem, I just want to say, again, thank you so much for being here.
Thank you for sharing your story.
Thank you for dedicating your life to working to ensure children and families have what they need to thrive.
We need you, we need Lead, we need all of the folks that you talked

(59:50):
about who have that intrinsic motivation, to keep us real and to also keep this movement alive.
I appreciate you, I appreciate our partnership, and I look forward to all of the work that we get to do together as we both move along in our space, in this work.
I also just want to give a big shout out and thank you to all

(01:00:12):
of our partners in prevention.
We believe at PCA America that prevention is possible and that everyone has a role to play in prevention.
I encourage you to visit preventchildabuse.org to learn more about our theory of change, access any additional resources, or make a donation to support children and families through

(01:00:32):
Prevent Child Abuse America, because we know that together we can prevent child abuse, because childhood lasts a lifetime.
So thank you again, Takeem.
We look forward to seeing you again soon, and have a great rest of your day.

SPEAKER_03 (01:00:48):
You're very welcome, Jennifer.
It was a pleasure.
Thank you.

SPEAKER_00 (01:00:53):
Thanks for tuning in to this exclusive episode.
The 2024 CAP Month series can be streamed on our CAP Month page, preventchildabuse.org, slash CAP Month 2024, and wherever you listen to your podcasts.
You can find more information at preventchildabuse.org and on our social media channels.

(01:01:13):
Remember, prevention is possible, and together we can prevent child abuse, America, because childhood lasts a lifetime.