
August 29, 2024 • 48 mins

In this episode, we sit down with guest Emily Hicks-Rotella, Data & Technology Strategist at Make Tech Work For You.

You will gain insights into:

  • How to understand and develop a data culture at your nonprofit
  • Key principles for adult learning around tech and data
  • How to approach program evaluation and data with humility and courage
  • Practical ways to develop a data tech role at your organization

and more.

Emily Hicks-Rotella | LinkedIn

Make Tech Work For You


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
All right, welcome to Making Data Matter, where we have conversations about data and leadership

(00:05):
at mission-driven organizations with practical insights into that intersection between the
mission, the strategy, and the data. And I'm your host, Sawyer Nyquist.
And I'm your co-host, Troy Duhik. And today we're joined by guest,
Emily Hicks-Rotella. Emily, welcome to the show.
Thank you so much. And Emily, for people just meeting
you for the first time, give us a little background, who you are and what you do.

(00:28):
Sure. So I usually start my background way back saying that I was in college for music,
theater, and literature and was actively avoiding data technology. But through a series of events,
got introduced to it and was really intrigued by the capabilities of these things to move

(00:48):
organizations more efficiently and effectively towards their very large, important missions.
And I had to learn everything on the job and through Googling and late hours and redoing my
work. So I really feel that it is possible for anyone to learn data and technology, especially
for social justice work. And that's what I'm focused on now. So I am a consultant with many

(01:11):
different organizations across all areas of the public sector to help develop data and technology
culture at organizations, especially small to mid-sized ones where they don't have necessarily
all the resources and funding to focus on that. But we're finding the ways to get it to them.
And okay, I want to start this conversation a bit of an unconventional place because Emily,

(01:36):
we'll get to data in a second, but Emily, I love emailing you because when I send you an email,
I get an unconventional response. I get an out of office, not an out of office. I get an automated
response that says, first of all, the subject line reminds me to take five deep breaths today,
which usually I haven't, and it lets me slow down. But then your automatic reply says something else

(01:57):
and it says you check email once a day. So before we get into data world, tell me just a little bit
about how do you approach your inbox and the way you work and approach technology?
Yeah, well, this is about data world actually. So I think that if you're a person that's
working on the data team or the tech team, a lot of urgent messages come your way. There's a lot

(02:20):
of urgency. There can be requests, fires to put out, and it can feel, for the person working
in that role, like you're the one who's responsible. It's not that anyone is necessarily
holding the technology responsible; they just think you're the expert, the wizard behind
the magic that makes all this happen. And when something doesn't quite meet expectations,

(02:43):
that urgency really gets ratcheted up. And I felt that as a person working in this sector,
and because I care, I really care, I want this to be right for the people who I serve.
But that urgency was really leading to burnout and distraction from work and making my responses
to those fires less quality than they would have been if I took a breath and really assessed the

(03:09):
situation from a state of mindfulness and calm, to whatever degree is possible. So part of this email
routine for me is to lower that urgency rate and to let people know that that's going to be my
response. It's going to be a thoughtful quality response for them. But that has spilled out into
other areas of life. And it has been to me a really great vehicle for concentration and lowering

(03:36):
distraction and making sure that when I am focused on a task, I'm really putting as much of my mind
into it as I can. We've all been there getting emails that just you can feel your anxiety or
your blood pressure just kind of rise of like, oh, that's stressful. That's something to fix.
That's happening right now. Somebody's frustrated with me or regardless of what was actually said

(03:58):
in the message, all those stories start to play. And so I love that idea of like, hey, I'm going
to slow down my pace in the way I interact with that and how that can shape not just how you're
responding, Emily, but also how the other person receives the interaction from you. It kind of
permeates a culture of an organization. I'm so glad that you've been able to take some small

(04:18):
deep breathing moments from that. That makes my heart sing. Yeah. I don't email you that often.
And I forget about it when I do email you. I'm like, oh, that's right. I get this beautiful
out of office or automated response from you. It's great. I almost feel like this is going to be the
episode titled yoga and data or something like that. We all better start doing some breathing

(04:38):
exercises right now and just de-escalate all the data problems we have. Data is stressful sometimes.
I have a colleague who runs an organization. They just changed their name to be more aligned
with what their philosophies are. It's called Data Plus Soul. They're a great organization.
I feel that really deeply we need data plus soul and tech plus soul in this work to really make it

(05:04):
mission driven and mission focused. Yeah, that's great. Emily, tell us about some of the work
you've done with education and teaching around data. I've seen you've had some classes and some
different training things that you've done over the past, but around data management. Give us an
idea of what types of topics in data and specifically for nonprofits that you've found
passion or interest in trying to teach and do educational initiatives around.

(05:27):
Okay. This could be a long one. There's so many to pick from. I'm really excited. Everything down to just data
organization. The difference between data analysis for data cleanliness and data governance
versus predictive data analysis. There's so many, but

(05:50):
I'll start again. Something you mentioned at the top was how I got started in a kind of education
based role for this. Teach for America, that was my first nonprofit job and it was my first data
and tech role. It was a lot of job embedded learning, which I highly, highly value and is an
adult learning principle that I think is really useful to remember. I had to learn how to approach

(06:16):
learning, how to use the technology, and how to use the data. At first,
I was just learning from a technical viewpoint: how to press the buttons and how to get straight
answers to questions. I thought maybe there was a straight answer to any question. That led to
hours of redoing poor work and having to ask questions and Googling and not finding my way.

(06:38):
What I was really fortunate to find was a shift in my framework from this technical mindset to an
approach grounded in the core values of the organization. When I started learning data from the
framework of grit, resilience, relationship building, critical thinking, problem solving,

(07:00):
innovation, it opened a whole new world of learning for me. The learning curve was so different.
It was enjoyable. It was actually enjoyable to learn and use data. At that time at Teach for
America, there was this initiative where national team members could develop programs that the
regional teams across the country could buy into with their budgets and get those services that

(07:24):
they thought they needed. I created one to help folks learn how to use their data and tech from
this framework and was traveling about twice a month, maybe for eight months, to all over the
country bringing this learning. It was so powerful, so powerful personally to see people have a shift
to say, oh, I can work with this system. This system's not working against me. Oh, this data is

(07:50):
my data to own, not just somebody sends it to me and I am unquestioning or frustrated with it.
That shift made a huge impact on me and has made me want to continue in that education-based
approach for using data. Now, is that for technical users or non-technical users or how do you even
think about the distinction between the two? Because it sounds like you're kind of helping

(08:13):
everybody or people from all sorts of that across that spectrum interact with data differently.
It is all across the spectrum, but I would say it's more non-tech team users.
Folks on the program staff, on the development staff, who have to use data for their jobs,
there's no opting out of it, but that's not why they got into this work necessarily. They're not

(08:37):
always supported in their professional development to continue to learn how to use it. On the
technical folks' side, often it's a lot more of this coaching around how to interact with users
and how to do storytelling with data, which is also really fun. That shift that I had from

(08:57):
being very technical: I wasn't doing the very technical part very well at the beginning.
For folks who are very technical and doing it very well, sometimes there's
a gap in bringing it back to the human side. The classic role for this is the business analyst
who can talk in the spoken language to users and can talk in technical language.

(09:19):
to developers. It's a really sweet spot. I actually think that that's available to most folks.
I'm doing more technical training to program users and then more of that relationship building,
human side with more technical folks. I wanted to put more flesh on the bones around
that learning journey, knowing we've all been there where we've done something and we look back on it

(09:42):
and say, wow, that was poor work. That just didn't work out the way that we wanted it to.
That just didn't work out the way I wanted it to. Or, wow, everything I've learned now is so much
better than what I implemented over there. I'm wondering, Emily, if you could just give us one
instance of that aha light bulb moment where you looked back on some of your prior work and get a

(10:06):
little more specific with us in terms of what that learning was like and encourage the audience as
it's okay to actually have work that you look back on and say, I've learned so much since I did that.
Give us a little more flesh on what that learning experience is like and how you
observed a moment like that where it's like, I can do that so much better now.

(10:28):
Okay, yes. All right. I'll try to bring up two topics and maybe you can ask if we want to
elaborate on one or the other. The thing that comes to my mind first is a kind of non-technical
one, which is at one organization I was at, I was sort of told, analyze the data, but no matter what,

(10:48):
we always report that we get like 30% in this area. So make sure it's that. We can't go below that.
And that was an experience that I really had to contend with and understand like, what is my role?
What's my voice? Is it that I should question myself? Because obviously the answer is the data

(11:08):
should always point this way, or can I be less biased than that and bring more information in?
And then similarly, I was in a position where a manager was making decisions about what tech
platforms that we should use. And I really, really doubted it and thought we can't go forward with

(11:28):
this, but at the end of the day had to go forward and then sort of sell the whole thing. Both of
those, I think I've learned that I have more voice and everyone has more voice at any level of their
career that if they want to question things like that, they can and should. And then on the side
of like making data errors and being wrong and things like that. Oh boy. Yeah. So I think anyone

(11:51):
who has used a spreadsheet can relate to the fact that it's a little better now, but if you don't
put the filters across all of your column headers, then you're going to sort your data and one column
isn't going to get sorted. And now for a person whose name is Emily Rotella, their email looks like
Michael Fletcher and you're like, what happened? And you send a mail merge out to all those people

(12:15):
to thank them for their donations. So you have specifics, Emily? Oh yeah, it's happened to me
specifically. And it happened to people on my team and people adjacent to me in the work. And I've
seen it happen. Yeah. Many times. And that can be from a database as well. Not just the spreadsheet.
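The partial-sort mishap Emily describes, sorting one column while the rest of the row stays put, can be sketched in a few lines of Python (a hypothetical illustration; the names and addresses are made up, not from the episode):

```python
# A hypothetical sketch of the partial-sort mishap described above.
# Two parallel columns, as in a spreadsheet:
names  = ["Emily Rotella", "Michael Fletcher", "Ana Diaz"]
emails = ["emily@example.org", "michael@example.org", "ana@example.org"]

# Sorting only the names column (what happens when a spreadsheet filter
# doesn't span every column header) leaves the email column untouched...
names_sorted = sorted(names)

# ...so each name is now paired with someone else's email.
mismatched = list(zip(names_sorted, emails))
# mismatched[0] is ("Ana Diaz", "emily@example.org") -- the wrong pairing

# Sorting whole rows together keeps each name with its own email.
rows = sorted(zip(names, emails))
# rows[0] is ("Ana Diaz", "ana@example.org") -- the right pairing
```

A mail merge run off the mismatched list is exactly the wrong-thank-you-email scenario from the episode; the fix is to sort rows as units, never one column in isolation.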

(12:36):
You can kind of misplace one filter in a database. I'm emailing all my non-participants to see if
they want to participate, but actually you're emailing all the wrong people. And I don't know
if we want to think about like, how do you come back from that? That's a whole podcast topic,
I'm sure. But one thing that's important for me personally, having been in-house and on the consulting

(13:01):
side is just to know that my definition of an expert is someone who believes that they can learn
something, not someone who knows everything. And I think there's a sort of toxic trait of thinking
that consultants know everything about technology or about their specific data skills. Like sure,

(13:23):
they know a lot. That's your focus. But everyone makes mistakes and there's new things coming out
all the time. So really we can't expect someone to know everything and we're always learning.
You've mentioned the phrase a couple of times earlier, data culture. And I'm wondering if
that's even related to what you're talking about with learning and not knowing everything or being
able to embrace that sort of perspective on an expert. And how does that relate to data culture

(13:46):
or the way you think about data culture at an organization? Yeah. So there's two big ways for
me that that relates. One is this kind of learning culture in general that organizations I hope would
have and works really well when applied on the data technology side. So learning culture, first off,
assumes that you always have to be learning. That should predicate the assumption that you have to

(14:09):
support that learning through actual dollar and time investment for your employees and that you
can make mistakes and learn from them. You're always learning, and you're going to make mistakes; we learn
from mistakes. So hopefully, in the learning culture, this idea of support for it and
the way that we deal with mistakes should be really central. And the second big pillar of this

(14:31):
learning culture is that it's part of evaluation for your organization. So we talk a lot about
program evaluation. We should not be working in the field if we're not evaluating how we're working.
I mean, I don't want to be hyperbolic about it, but it can be dangerous. It can be harmful to

(14:54):
communities that we're trying to help if we're not actually reviewing what impacts we're having. And
data is such a huge tool for doing that. So I think both learning in the data tech realm, how we learn
there, how we make mistakes there, and using data to learn about ourselves and learn about the work
we're doing. Those two areas are really central to good learning culture at a nonprofit.

(15:18):
Oh, you just opened up like two huge topics for us to explore here. So I want to, can we start with
the learning culture within an organization and support for education and training and just like
nurturing that sort of culture within an organization? What are some examples of how
organizations have done that well? I think I'm going to get really specific

(15:40):
because I have a bit of a go-to example here. So there's an organization in Connecticut called
the Compass Youth Collaborative, youth serving organization in Connecticut. And they have
partnered with another great Connecticut organization called the CT Data Collaborative.
The CT Data Collaborative provides resources, training, education, and structural strategy

(16:03):
work for nonprofits in their data culture. So the things that Compass Youth has put in place
are so central. So they involve all of their stakeholders in their data decisions, and
especially in things like building the stories around their data. They have sort of like this

(16:25):
really good relationship that they've built between the data folks and everyone else on the team.
So there's trust, there's an open door policy. They've built it in: they have weekly data
calls, and it's part of the culture of the organization, part of check-ins and meetings,
the overall org-wide communications. They have really good relationships

(16:50):
with everyone that is a part of their data and they know that. So even the youth that they're
serving can take part in some of that data culture aspect of it. CT Data Collaborative,
as an organization, does youth data walks in the Connecticut areas that they serve.
So they really make it part of everyone's job. And I think that's a big shift that

(17:17):
we are seeing and will continue to see: tech teams integrating more into program teams.
And that really helps with budgeting too, so that we don't have to have donors say,
no, you can't send this to your tech team, your overhead, right? Anytime I work with organizations,
I try, with the grant writers, to say that all the money that would be spent on the

(17:40):
data tech side of this is programmatic money spent. So two things are kind of required,
even in that example: an organization that wants to, and has embraced, learning and recognizes
how holistic and integrated data is to their success; and having partner organizations with
some of that more niche expertise around data that they can invite in and

(18:04):
collaborate well. And there's a shared mindset: we want you to be successful, you want us
to be successful. And so, yeah, that feels like a rare example. It's a beautiful
one of two organizations kind of working alongside each other well, and not everybody has kind of
that place. What are the ways that, so when you're developing data culture, so it's the more

(18:26):
technical skills, maybe on the other side of it of like developing a data tech role or people who
are niche and for smaller organizations or mid-level organizations, when do they staff
somebody who is just focused on technology and data or when does that team need to be more than
one person and multiple people? Or how do you think about that kind of, that kind of role in place?

(18:49):
So many different thoughts, really pretty dependent on the organization and where their place is, and
even the time period that we're in now and what the grant situation looks like. I mean,
we're in an election year right now, so things are really up in the air. But I have a baseline
that I try to stick with, like a line in the sand. Every organization should have this role

(19:12):
and I understand that not every organization is actually going to be able to fulfill that, but
I think if that's the goal, then when new nonprofits are starting, having a data tech role becomes as
central as having a fundraising, a chief of development role or a chief of staff. And I think
to put that message out there is going to be something that drives us towards having that

(19:36):
support for all organizations. If you are using a login database, something like a CRM, an LMS,
something you log into and there's some pretty user interface over a database, I think at that
point you should have someone on your team. It introduces this new level. All of those are going

(19:58):
to be in some way a relational database. It introduces a level of data complexity and
reporting complexity that I think is really well served to have someone on the team.
And the more that you interact with your data, the more you ask from it. So once you understand
that you can really define things well and get good answers, you start asking more questions.

(20:21):
And then it just ratchets from there, which is a good thing. I think that we should go that way.
And so once you have support in place, you can imagine that in your five to 10 year plan,
depending on the goals of the organization and their growth plans, that you would be adding on
more team members. Now I will couch that to say that there's a lot of different variations of

(20:44):
how that can look. I serve as the tech team, part-time tech team for one of the organizations
I work with. It's been five or six years now. So that's an example of having someone not in-house
but in a long-term contracted role. That's what I want to see for organizations, that this is
built into their budget, it's built into their culture. In-house I think is great because that's

(21:07):
even more embedded. You've got a job description, you've got the responsibilities, people
interacting with them. It's easier to transfer knowledge
and transfer files and ownership of records and things like that when there is a staff change.
But it doesn't necessarily have to look like that. And for a lot of smaller orgs,
they need to figure out a different configuration at first.

(21:29):
And so as you're thinking about this data role, and I'm assuming you're saying once you get to
that, you've got a CRM, LMS, ERP system, something like that, it's a full-time role. What would you
say are those non-negotiable core functions of that particular data tech role? Are you thinking

(21:50):
this is some kind of a, a lot of people might accuse you of saying, well, that's a unicorn in
such a small organization because you're looking for someone who can manage the database, they can
pull the data into some kind of a model so that it can be analyzed and they can build the
visualizations. Next thing you know, you've got a long list of responsibilities that you're

(22:10):
expecting of one person to do for an entire small organization. So how would you put some
guidance around that, some scope, just to limit it to say, these are core functionalities that
you kind of have to hit on and others are a little more negotiable for each organization.
I don't know. I'm not trying to put you on the spot, but offer a little bit more around that.

(22:34):
What are those core non-negotiable functions of the role?
Yeah. I haven't written this out before, so I can give you some thoughts just having been posed
the question. So they start for me again in the like mindset, non-technical area. So a core
responsibility is going to be curiosity and learning. So knowing that there's always more

(22:59):
to learn and both in the data analytics sense and in even the base of data governance, data
organization, things like that. I do think that some experience and some learning in database
administration and modeling is really helpful. When Teach for America went from spreadsheets to

(23:28):
Salesforce, that was my first introduction to it. And it was a total mindset change to understand
what a relational database behind a UI was. So having been introduced to that is really helpful
and being able to talk to other people about that, being able to translate technical stuff to
non-technical speak, I think is a real... You know what? That's it. That's what I'm going to say is

(23:49):
the most core value that there is. As long as you can bridge that gap and take technical knowledge
that you have and be able to talk to non-technical people about it, I'd say that's the most important
part of it. And there's an organization, a consulting firm called the Build Tank, and they
have a whole curriculum around what I think they call distributed ownership across the

(24:15):
organization. They build different roles so that that job description, as big as it might
be for that one data tech person, is shared across other roles. So the responsibility for the numbers,
for the actual data points, can be distributed to folks. How to pull them from your database,
how to analyze them, that might be on your data analyst or data tech person. But making sure that

(24:41):
number of participants is 30 and that jibes with the reality in the classroom or wherever it is,
that's someone on the program team. And maybe the thinking up of what are the reports for grants
that are needed, that's spread out across the organization. So I think that really helps take
that kind of pressure off, even if the job description is big. I'd be curious to ask both

(25:06):
of you if you've looked at some of these job descriptions recently for data tech roles at
nonprofits from manager or associate level up to CIO. Sometimes they look exactly the same from
the manager level to CIO. I am in shock at how much is being asked on job descriptions of any

(25:27):
level data tech folks. And maybe the expectation is that people will see that and think that
there's a growth opportunity for them. But I wonder if it's actually creating a barrier to entry
because you're saying you should have this, should need this, where are you going to get that?
We don't have to go into this too deeply, but that's partially why I started my

(25:49):
apprenticeship program, because I was given the experience to learn on the job. That was
so valuable for me and made it possible for me to work and get better at this stuff.
And I see a lot of experience being asked of entry level jobs, and I don't know where they're
going to get that. So I said, you can get it with me, you can work on my projects and have that

(26:12):
experience. And then it's also a little bit of the Teach for America Kool-Aid, where you say get
folks into social justice work early and they'll stay in it. And I think that's a great
approach. So partially it's getting folks
to work with nonprofits as some of those first data tech experiences for them and hopefully
stay in those roles.

(26:33):
Right. And limited resources will definitely make those entry level jobs feel like they need to be
ratcheted up on experience because they can't pay for the multiple roles. So it's like, well, we
can't bring someone new into this and teach them everything. We need someone to come in and do work
with them at that higher level. And so it all gets jumbled. But my favorite is when these job

(26:55):
descriptions will list new tech and say, you know, give me five years experience in something that
just came out last year. And it's like, they're so out of touch sometimes with what even is happening
in this space. They don't know how to write the JDs. So it can be a big challenge for nonprofits
when it's an ever changing landscape, the tech is so evolving all the time, and they probably don't

(27:19):
have the resources to train entry level roles in this stuff. They need people with experience to
come in and often take a salary cut from what they get out there in the for profit and big tech
spaces. So these are real tension points that I think everyone in nonprofits and mission-driven
organizations tends to feel: the more technical a role becomes, the harder it is to find those

(27:43):
people to fill those positions. So yeah, great conversation on that, Emily.
You're helping me write the website copy for this apprenticeship program. That's exactly right.
That's why the idea for this is you have someone like myself, who has 10-plus years of experience, is
technical, and can do both sides of it, and can act as the main tech person filling in that

(28:03):
interim in imagine we're creating a role in an organization, right? It's your first data tech
role. So who you're going to staff in it first, first, you're going to staff me at like maybe it's
a part time 60% role. And I'm going to have an apprentice who's learning how to do this stuff
alongside me, but within your organization's context and your data. So let's a year goes by

(28:24):
with that happening. Now I can step back, and this apprentice can come in as a full-time role at a
lower salary than you might imagine, because I'll stay in a support role for them. Then when I
step back fully, all of that funding goes back into that role. And even in this conversation,
it makes sense why, with your background, Emily, curiosity becomes one of the core virtues

(28:47):
of what a data tech person needs, because it's about on-the-job
learning. And it's about having the curiosity to ask questions and to innovate around what's
happening with the technology, but then also with the programs and with the organization
itself. And so yeah, even think about adult learning and learning on the job, like that's

(29:08):
a curiosity thing. It's not "I have lots of technical chops and I can step in," but "I'm
passionate and interested enough to figure it out and to learn." And I will recognize the
challenge. I think we've said that there might not be a person in house on the team to provide
that guidance, right? If you're creating your first tech role, your
manager is the CEO, the chief of staff, or usually the finance person, right, the chief financial

(29:35):
officer. Finance and data and tech seem to have a lot of overlap in the world. So yeah, being able
to also find the resources out of house to support that in house role is a challenge as well.
Tell me about the other large bucket that you opened up earlier that I didn't get to go
to, about program evaluation, or really, are we doing good work? And does it matter?

(29:58):
Or how is it mattering? And obviously data has an intricate role to play in that question.
So even thinking about how the organizations measure their success or evaluate how they're
doing, give me some examples of what that would look like for an organization to start assessing
that or to explore that with data. And I want to piggyback off of that to Emily and just say,

(30:18):
describe the field you mentioned the field and I want to make sure that our audience knows exactly
what you're talking about in your particular context when you're doing that kind of program
evaluation. So I kind of wanted to merge that in there with what Sawyer was asking.
Sure, sure. Okay. So a lot of organizations when they're getting their foundation set up in their

(30:42):
data, it's about definitions and processes, defining what you're trying to do, who you're
serving, and then what data you need to track for that. So I mentioned this earlier, just being able
to count how many participants you have, you have to define what a participant is in each year.
So that set up is part of the field of program evaluation too, just really that foundational

(31:09):
set up of understanding what data points we have, what we want. Then being able to use a tool like
a logic model to define your inputs, outputs, behavior changes, and outcomes that you want to
see in your community and from your work. And then layering data on top of that logic model to say,

(31:33):
for us to know this, what do we need to know to say that we can prove, we can have the evidence
that either, yes, we were correct that these outputs will lead to these outcomes, or no,
that didn't happen as we expected and we need to change course. So that's a kind of high level
view of the field of program evaluation in a nutshell. And the examples I can think of,

(32:00):
just to talk to a few. So if an organization wants to start thinking about this and they haven't
before, I would recommend looking into the book Leap of Reason by Mario Morino. This is an older
book. It's free online, and Mario Morino has this Leap Ambassadors program, all about program

(32:21):
evaluation. It's just a great resource. And they have a lot of videos on their website of organizational
leaders talking about how they've introduced program evaluation and what it's meant to them.
And the book itself talks about some organizations that have found that they weren't having the
outcomes that they anticipated and how they wouldn't have known that and might have kept

(32:42):
operating in what they were doing had they not evaluated what they were doing and what the
outcomes were. There's also a new data tech platform called Sure Impact, S-U-R-E Impact.
This is a program evaluation technology tool geared specifically towards nonprofits collecting
the kind of data they need to report on their programmatic impact. So I think it's maybe five

(33:06):
years old; it came out of another organization that was all about program evaluation research.
That's really one to check out, even just to learn, like, what does it look like to do program
evaluation in a physical sense, in a data sense. You can kind of see from their demos what that
would look like for an organization. And on the culture side of program evaluation, it is not easy

(33:29):
to put your heart and soul into helping other people and then be given the opportunity or the
challenge to have to evaluate, were we doing right? I mean, you were doing right because that was in
your heart and with all the factors that you could, you thought this was the way to go.

(33:50):
But taking that next step of debriefing and evaluating and possibly being faced with really
hard information that no, it wasn't working out and that doesn't mean nothing went well, right?
There can be good that comes out of it, but that's so difficult. And how many
for-profit organizations are willing to do that kind of review of themselves. And for nonprofits,

(34:12):
I think it is, where's the time? You just have to keep going, keep serving, keep pushing out.
But it really is a learning culture aspect. It is so valuable to have program
evaluation, especially through data, be part of that culture because we will serve better.

(34:33):
We will reach our missions sooner and better, or we'll change our missions for the betterment of
the world and our staff members if we do participate in this kind of evaluative work.
Yeah. It's that data plus soul you were talking about where if we're not willing to take a critical
eye to our programs and evaluate, are we actually achieving our missions? And even asking the even

(34:59):
harder question, is our mission the right mission? Did we actually get the right values into that
mission that's serving this particular population? We could be missing the mark and we have to be
able to allow the data to speak into that openly. So these are great, great topics to talk about,

(35:20):
to know that we're hitting what we're really after in these missions that we're aiming towards.
It's amazing to me that the data technology side of this work is sometimes treated like,
oh, that's not direct mission work. It's on the side, it's overhead, whatever.
The learnings that introducing good data tech culture can bring to the overall culture of

(35:42):
the organization to me is so aligned. We talked about how we don't know everything about our technology.
It's changing all the time, so we have to keep on learning and evaluating our programs.
Think about the challenge it is to change your CRM, and how often do organizations change their technology?
They are having to go through that all the time and do this kind of evaluation.

(36:04):
I really think that the more we get everyone in the organization involved in that
day-to-day act of their data and tech work, the more you can zoom it out to what the organization as a whole
could be doing and take key learnings from that. So I think that the data tech side of a nonprofit
has even more to offer than just the analysis and just the ability to report and talk back.

(36:25):
It has these cultural values embedded in it that we can bring to other parts of the organization.
Yeah. One of those values seems to even be humility of we're going to honestly look at the data
and realize and see if we're doing what we thought we were doing and helping the way we thought we
were and being honest enough to admit that we might have been wrong. And that's a scary thing.

(36:47):
Whether it's a new organization, a young organization that has some innovative spirit
to admit they're wrong or for these more legacy or long historical organizations that have just been
charting this mission for decades or centuries and to look at the data and start to evaluate
with enough humility to say, maybe we're not doing the impact we thought we were.

(37:08):
And that sounds like a data culture thing right there of like, we're all embracing the data
because we care enough about these people or these communities or this mission that we'll
evaluate the data well. It's beautifully said.
Emily, I'm curious if you could share a little bit more about the services that you offer or even
like this apprenticeship program. Tell us a little bit more about like, yeah, your consulting firm

(37:30):
and what you do. Sure. So, you know, it's still evolving. I have a young company and I want to be
nimble and evolve with the needs of nonprofit organizations. So we offer a kind of fractional
tech team approach where you don't have the resources to have a tech person on the team,

(37:53):
but you want and understand the value in not doing quote unquote project-based technology
work, which is a whole other podcast topic, I'm sure.
Technology isn't always really project-based in a nonprofit. It's long-term,
in perpetuity kind of work. So organizations that can understand that and know it, we are

(38:18):
very happy to partner in and be that resource for them and become part of their team, right?
It's almost like having an in-house team, and it gets them ready for potentially having an actual
in-house role in the future. So we serve as a fractional tech team for organizations.
We also have a sort of larger service that's our, we call it our Learn, Use, Love service,

(38:41):
which is to help data culture and technology culture go from wherever it's at now to where
it could be super impactful for the organization. And those three areas are our key areas that we
introduce into organizational culture. It's a culture of learning, right? We've talked about

(39:01):
that a lot. Culture of using, which is just to say that your job functions aren't the only way
that you can use the data and technology. There's more to it and you can be partnering on different
projects. So we do a lot more of introducing use across the organization of their tech.
And then I believe that the relationship between humans and technology can make or break use of it

(39:26):
at the organization. And so that's why we include love in this kind of approach. It doesn't mean
you actually have to love, but you have to look at your relationship with technology.
Okay, it's loving technology. Okay, this is our relationship.
Yeah, it's a lot. I personify technology as much as possible.

(39:47):
Okay.

(40:17):
would be sort of like a two to three year commitment,
where at the beginning we say we
wanna get ready in our budget and in our culture
to have our first tech role in-house.
And the pathway that we want that to go on
is to have someone like myself
who is a little more senior in their career
to serve in that sort of fractional,

(40:39):
or in some cases it could be full-time tech role
to get it ready, get it prepared,
and have an apprentice learning alongside,
building relationships, getting the contacts,
building them up into that role.
And then that's the first year.
In the second year, the apprentice takes over
in the full-time role at the organization,
but I stay on as support.

(41:00):
And so the money distribution shifts then
to the apprentice also.
And then in the third year, I can step away completely
in terms of that day-to-day interaction,
but still be a resource.
Even when I also sort of like build quick tech tools
for folks with spreadsheets or Airtable
or some things like that.

(41:21):
And I always offer lifetime support on that
if it's small changes. Big projects,
that's a different question. But I just can't,
I can't leave a project behind fully.
I just never can.
And I'm not ashamed of it anymore.
I'm always gonna be there.
So there's always, I feel like I can't let go
of some connection, but the idea is really
to hand it over to the apprentice.

(41:43):
And at that point, they're the resource on their team.
Yeah, well, we're gonna have to have like a check-in
in like two or three years, Emily,
because I know this apprentice thing isn't new,
but like I wanna see two or three years from now
what that's looked like and hear the stories that come out.
Because I think that's a really compelling model
of slowly helping organizations grow
and mature their culture while not having the budget

(42:04):
right away to make that happen,
but can slowly bring on board
the technical skills and ramp up.
That's beautiful.
Emily, what got you into the nonprofit world?
What matters and why have you stayed in the nonprofit world?
Why does it matter to you?
I have always been really lucky to have opportunity
and resources that I've been able to use

(42:27):
and take advantage of.
And I've from a young age been introduced
to the fact that that's not everybody's life situation
and was taught early on that everyone's situation
is there but for fortune.
Everyone is equal and worthy and deserving
of a good and happy life.
But that that's not the case for everyone

(42:48):
in terms of resources and opportunities.
And so I have always felt that it is something
that we should work towards as a human culture
to have everyone have those resources and opportunities.
And then just more like specifically in career path,
when I first moved to New York,
I was working in book publishing

(43:10):
and it was a fairly easy job
and I had a micromanaging manager
so I didn't have a lot to do.
And I was volunteering around New York City
with a lot of different small nonprofits
and we were making very good local individual impact
in the ways that I was volunteering
and seeing what the organizations were doing

(43:31):
but they had really big missions
like end domestic violence or end poverty.
And I felt like we weren't doing activities
to get towards that mission.
And I had a lot of passion and time and energy
to give. I was in my twenties.
But I didn't see that I had any skill

(43:53):
to bridge that gap between what we're doing
and getting towards these big missions.
And so I ended up going back to school at night.
I had come from a music, theater,
and literature background,
so I went looking for a skill set I didn't have.
I went to business school.
I got introduced there to data and tech

(44:14):
for supply chain logistics,
really kind of dry stuff.
But since I was volunteering
and learning about this, and since I went to a school
that had a really good social entrepreneurship program,
I saw that this was a tool meant to efficiently
and effectively move processes towards their end goal.

(44:36):
And that's what I wanted for social justice
and mission-driven organizations.
And so marrying those two things in a career
felt like the right move.
Well, what a story. Thanks for sharing a little more insight
into your background.
And we're coming up on time here.
So Emily, if someone wanted to reach out to you,
find you online, find out more about what you're doing,

(44:59):
where can they go online to find out about you more?
Yeah, they can go to my website.
It's called maketechworkforyou.com.
And that comes out of the fact
that not everyone needs to be a coder.
You don't need the shiniest technology.
You just need to make tech work
for what you're doing right now.
Yeah, and they can find me on LinkedIn,
Emily Hicks-Rotella.

(45:20):
And I love to talk to folks and hear what's going on.
So anyone that wants to find the time
to just brainstorm, thought partner,
talk about these topics, yeah,
that's my favorite thing to talk about.
And thank you both so much for having this conversation
because you ask really great questions
and you have really great insights.

(45:40):
And I've learned some really interesting ways
to sort of think about this from you.
So I appreciate it.
Super, super technical, practical, tactical episode
and conversation here. Really appreciate it so much.
And Emily, I gotta ask a question.
So if data was to be found in the field,

(46:01):
like we were talking about,
what kind of crop would it be?
If it was found in the field,
what kind of crop is it gonna be?
Oh boy, see, I'm a city person.
I don't have a lot of good nature stuff.
Okay, I'm gonna go in.
I'm gonna go in.
Oh God, this is so terrible.
It's not even a crop.
Data are the redwood trees, okay?
So they are big, right?

(46:25):
Unmoving in some ways, but like always growing,
no matter what, they provide oxygen
and they have a symbiotic relationship
with humans and the earth, right?
We can't live without them
and they can't live without us.
This is going deep.
The oxygen, carbon dioxide.
Yes, I think that's that relationship part.

(46:45):
Maybe there's some idea that data technology
is separate from nature,
separate from the ecosystem of the divine,
and humans are in that.
Some people even think humans are separate from nature
and these mindset bridges, I think,
if we can see ourselves as part of nature

(47:05):
and data technology as part of that natural ecosystem,
I think we'll find the symbiosis among us
to help each other grow in positive ways.
Wow.
That's a lot in that redwood tree.
That was a lot in there.
And I was just thinking this whole time,
I wouldn't know it, but it's gotta be some kind of grain.

(47:26):
It's a basis for bread.
Yeah.
You know what else it could be at some point?
Like, you know that, where is it?
Maybe in Switzerland where they have a grain vault.
So in case of oblivion,
then we can regrow all of our grains.
The storehouse.
Yeah, the storehouse.
I think it'd be really interesting to see

(47:47):
what data points about humanity
or even if this be an interesting thing
as a data culture exercise,
if you were building your grain vault
for your organization,
what data points would you put in it
that you would need to have if everything blew up
that you could start over with?
Or you could just do a time capsule kind of thing.
Like, hey, what data would I throw in a time capsule

(48:07):
for a hundred years down the road
and what would you find in there?
I don't know.
Oh man.
Gotta unpack that stuff in another conversation.
That's good.
Yeah.
All right.
Well, thank you, Emily, for being here
and thanks listeners for joining us
on this episode of Making Data Matter.
Signing off.
Thank you, everybody.