
May 8, 2024 • 47 mins

In this episode, we sit down with guest Joe Squire - Director of Analytics & Data Science at UPMC.

We discuss:

  • How Joe transitioned into data after beginning his career as a nurse.
  • Partnership with business areas to improve and align outcomes.
  • Uses and adoption of AI among data professionals.
  • Ways AI can improve healthcare access and outcomes for patients,

and more.

Joe's LinkedIn


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome to Making Data Matter,

(00:02):
where we have conversations about data and leadership at mission-driven organizations,
with practical insights into the intersection of nonprofit mission, strategy, and data.
I'm your host, Sawyer Nyquist.
I'm your co-host, Troy Dewek.
Today, we're joined by guest, Joe Squire.
Joe, welcome to the show.

(00:22):
Thanks, Sawyer. Thanks, Troy. Glad to be here.
For folks just meeting you for the first time, Joe,
could you share just a little bit about your background and what you do?
Yeah. My background actually came from a nursing background and transitioned into data,
where I worked as a clinical data analyst for multiple years,

(00:43):
and I currently sit as director of analytics and data science for
the Heart and Vascular Institute at UPMC out of Pittsburgh, Pennsylvania.
So very much a background coming from the clinical area, but still working in healthcare now.
One of the things that's always fun about the data world is the backgrounds that people come from.
We were talking about this a little bit off-camera beforehand,

(01:06):
about how Troy's and my backgrounds aren't in data,
and then you came from a nursing background,
which I don't think is a common background for data professionals, right?
No. If there are any nurses listening to this,
which I highly doubt is the case.
But I would say a majority of nurses are just very data illiterate in some senses.

(01:29):
Well, I shouldn't say data, I should say tech illiterate.
So they shy away from working with any raw data.
They're not data illiterate because every day they go out and they're making
interpretations of various data that comes in from the patient,
whether it's labs or physiological data like blood pressures or something.

(01:53):
So they are not data illiterate,
but on the technical side, you run into a lot of nurses who don't want to deal with that kind of work.
So they shy away from it and they dismiss it as soon as they see it.
So I think that's why you don't see many nurses in that role.
But there is a very big need for nurses to transition out of

(02:16):
that clinical care setting and into leveraging data and helping
healthcare organizations become what they strive to be,
and that's providing the best care to the patients at the right time.
So Joe, what got you to cut your teeth on raw data like that?
Did you just have someone send you an Excel spreadsheet and you nerded out one day?

(02:37):
What was that journey like to cut your teeth on that?
Well, it was completely by accident, to be honest with you.
So I was working in a cardiovascular OR, sharing 24-7 call with one other individual.
So we each covered, like, half of the call time of the year.

(03:00):
My wife and I had just had our first kid, and I needed some work-life balance,
and ended up finding a position that at the time I knew dealt with data.
But it was really just a glorified data entry position that needed
clinical knowledge to do.
But as soon as I got into that,

(03:20):
I became bored with the data entry portion of it and asked the question,
okay, we're collecting this data,
what are we doing with it?
How do we use it better?
How do we make a difference with it?
That's where you get your gateway drug of Excel, and you start messing around with it more often,

(03:40):
and then everything snowballs from there.
So that's really my career progression was just the natural curiosity
that stemmed from an accidental move that was out of no other reason
other than wanting to find a better work-life balance.
So one follow-up question to that is many of us probably were cutting our teeth on Excel.

(04:04):
Like you said, that's the gateway drug into the data world.
But at some point,
our thinking changes. There's this journey where you might be in that space of doing
your pivot tables, or you've got some VLOOKUP formulas that you're
using to try to mesh data together, and then at some point,

(04:25):
we're introduced to databases or dimensional data modeling and data warehousing.
So just tell us a little bit about that part of your journey to go from
the basics of data in Excel to starting to think bigger,
broader picture design patterns and how we can
actually make this an enterprise solution for the workplace.

(04:50):
Yeah. I think the biggest thing it comes down to is just the mindset of making data useful.
And that concept guides you and drives you to try to understand basically,
how do we automate things?
How do we make things more efficient?
And that's what leads you from the basics of Excel.

(05:15):
And I'm not saying you have to move out of Excel,
but there's a lot of people that their entire lives are based around the Excel environment.
It's a very powerful environment.
And you can connect directly to databases with it and everything.
But if you're thinking and you want to automate things that may not be as functional in Excel,

(05:38):
then you're going to be looking to move out of that,
looking to see what else is out there.
And so that automation concept is really what drove me to be like,
okay, what else is out here?
What else can we find?
And that's where you start understanding,
okay, this is where the data is coming from.
These are databases.
This is how it's structured and everything.
And you start looking at,

(06:02):
okay, now I can actually plug in, say, a BI tool on top of this.
Now I can automate that report that I had to do manual manipulations on to send out.
And then you start learning your coding languages,
whether it's SQL to bring that out or whether it's Python to do some more advanced manipulation
or advanced analytics.
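The progression Joe describes, from manually reshaping Excel exports to querying the database and scripting the report, might look something like this minimal Python sketch. The database, table, and column names here are all made up for illustration:

```python
import sqlite3
import csv

# Connect to the database directly instead of copy-pasting into Excel.
# (SQLite in memory here just for illustration; the schema is hypothetical.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (dept TEXT, patients INTEGER)")
conn.executemany("INSERT INTO visits VALUES (?, ?)",
                 [("cardiology", 120), ("cardiology", 80), ("surgery", 45)])

# The SQL replaces the manual pivot-table step.
rows = conn.execute(
    "SELECT dept, SUM(patients) AS total FROM visits GROUP BY dept ORDER BY dept"
).fetchall()

# Writing the result to a file replaces the manual save-and-send step.
with open("weekly_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["dept", "total_patients"])
    writer.writerows(rows)

print(rows)  # [('cardiology', 200), ('surgery', 45)]
```

A scheduler such as cron or Task Scheduler would then run a script like this on a cadence, removing the manual step entirely.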

(06:23):
And that's really where you get your...
For me, that's where my curiosity juices get going.
And I'll be honest, ever since the whole generative AI explosion occurred,
it really helps you iterate a lot faster and explore more avenues more quickly

(06:44):
than in the past whenever you were sorting through Stack Overflow or Googling things.
Because now you can just type in, hey, I have this thought,
how would this be possible if you were to do it in Python?
And it would say, these are the basic steps.
There's a skeleton of code.
And then you can just pull it out and start manipulating it

(07:06):
and work on what you're doing a lot quicker.
And this is kind of going down a rabbit hole,
but as someone in a leadership position, that's very helpful
because it exponentially helps your efficiency.
And since your scope of what you're doing is much broader and it's not as technical,

(07:28):
you naturally have to move away from the technical aspect of your work
because if your head's down, you're not looking up and out.
But with gen AI, that efficiency gain really lets you also keep developing
in some capacity to help your team out, whether it's offloading work
or generating new ideas on how to look at the data

(07:50):
or how to bring the data in from a different source.
That's really some of the largest benefits I've seen with that aspect of AI augmentation,
especially as a leader in the data field.
But yeah, coming back to your original question,
that's how you go from Excel to everything else.

(08:13):
Now, OK, so Joe, you opened the AI box.
So we're going to go there, I guess.
So first of all, tell me a little bit about your team
and then even how you've thought about gen AI for how your team works,
or what they're empowered to do with gen AI or ChatGPT,
or what type of tools you've used there.
So yeah, as a leader with team members, what does gen AI look like for you guys?

(08:35):
So my team, we sit within the Heart and Vascular Institute.
So that's a service line within the hospital.
And healthcare, like larger health systems, have these service lines,
which are generally dedicated to specific procedures and disease processes.
Ours happens to be cardiovascular, so heart, the veins and arteries,

(08:59):
those types of procedures and diseases.
And so much of the work that we do is centered around outcomes,
like quality outcomes, research.
We have a pretty big research lean where we help basically use the EHR data

(09:20):
that we have to answer research questions and publish.
And we also support the operational side in some capacity.
In terms of generative AI, UPMC itself is a large Microsoft shop.
So our approved generative AI is Microsoft's, so Copilot.

(09:42):
And currently right now, they're trialing that across the system
with certain individuals. I'm part of that trial.
And Copilot's fine. I think it has a ways to go.
I think that some of the integrations are neat.
I haven't found it extremely useful.
Like it's integrated into Excel, but I haven't really needed to use it a whole lot in there

(10:05):
because I don't work super heavily in there.
It's mostly just simple things.
But in terms of empowering my team, it's difficult.
There are individuals on the team, I would say experts in the field, and they've been doing it for a long time.

(10:26):
And so there's not a whole lot of need for them to use gen AI to do the work that they do,
because they either have scripts that they've used in the past that they can easily modify,
or they know exactly what they're going to do as soon as they go into a script.
And I would also say there's an age factor in there.

(10:48):
And this is generalizing, but from my experience that if somebody's older and they're set in their ways
and they have that expertise, they're less likely to adopt generative AI than somebody who's younger
and who's not as set in their ways or they don't have the extensive experience.
And I would put myself in that bucket with a lot of the individuals.

(11:12):
A lot of the individuals underneath me, their technical chops are much better.
They're much more experienced than I am.
I mean, we have analysts and biostatisticians and some of these individuals have PhDs.
It's just they're much more experienced.
So I am all for using generative AI.

(11:32):
And if anybody ever wants to adopt it, I am more than happy to help them,
like understand it and introduce it to them.
But I'm not going to force it on them, because if you force it on them,
that idea of how can I leverage this to be more efficient kind of goes away,
and they just see it as nonsense.

(11:55):
And that's probably one of the biggest stumbling blocks I see with the analytics world adopting generative AI.
But there is GitHub Copilot,
which is basically like autocorrect and autocomplete for code.

(12:17):
And it's very efficient, because you can start typing your SQL and it will bring up different ideas on what it thinks you're going to do.
And it reduces a lot of the redundancy of copying and pasting certain variables or statements.
One of my analysts really found that extremely helpful, and it upped their efficiency.

(12:40):
And it was a newer analyst, straight out of, I mean, they were still finishing their master's,
but it was basically their first analytics position.
And they really loved it.
And they were very gung ho on the whole gen AI and how I can use it to work into my workflow.
And so, again, it just reiterates that divide that I see in terms of people using it to augment their workflows.

(13:07):
Again, I think it goes back to experience.
And I don't know why it is this way, but people with less experience tend to see it as very helpful, versus those that are older, where it's not necessarily that they don't see it as helpful.
They just don't see the use case for it yet.
And I think maybe that comes down to workflow.

(13:28):
So once you get products out there that inherently have it built into the workflow, like AutoCorrect has been in workflows for years and years and years,
people get used to it and people will adopt it and understand it.
And that understanding is almost the entirety of the battle of adopting new technology.

(13:49):
Recently, I've been testing the Copilot features specifically for Power BI.
And I've found that I like the more developer side of the Copilot tools, where it might enhance my semantic model by auto-populating the descriptions for all the fields.

(14:10):
And I'm like, man, that saves me so much time not having to sit there and type these out, especially for any of the DAX measures and calculated columns.
I'm like, man, if it can pick up some natural language from the code and populate that in there for my report builders, I'm loving it.
But then the other side of that is some of the new copilot features in the Power BI service for the report consumers and for dashboards,

(14:37):
where you could have smart narratives or you could have a conversation with copilot about the visuals they're seeing in the report.
I'm just curious, is that an area that your team has gone yet?
I mean, it's really new on the Power BI side,
and I'm not sure if you're using Power BI, but I'm thinking about those report consumers and where the role of gen

(15:01):
AI fits in for the report consumers. That feels a little more concerning to me than the developer enhancements and efficiencies.
What are your thoughts on that, Joe? So as part of our trial, we don't have Power BI Copilot enabled.
Like, we're not allowed to use it, which I'm very bummed about, because I think I'd probably be using it heavily.

(15:26):
No, I think that is a big white space in the analytics community.
The promise of self-service analytics has been out there for years, and it's largely fallen on its face.

(15:47):
But I think that's an area where gen
AI can kind of meet those expectations, if you model the data correctly. OK, here's the use case:
you send that report out to the executive.
It's 8 a.m. The executive logs into their email.
They look at the report, and they have some top-of-mind questions.
Well, rather than firing an email to so-and-so, who then cascades it to so-and-so and so-and-so until finally it gets to somebody who can answer the question,

(16:15):
you have that ability to do natural language querying, where ideally it would bring back a valid answer
that's not incorrect. But, you know, that's where I see that going.
And I think there's a lot of promise in that, in being able to search across whatever data model you have, or even the entirety of a warehouse, to find answers to questions.

(16:46):
And really, that's what analytics is: empowering the business to make decisions.
And if you can do that in a way that people are better able to find those answers, then we should go for it.
I think that people have been working in that area for a while now.
Like you see, the smart narrative has been around for a while.

(17:09):
But, you know, the underlying models have really changed since it's been out.
And I think I think that there's still work to do.
And obviously, it all depends on the data underneath it. But it's definitely an area that has a lot of promise, in my opinion.
So you talked about data consumers.

(17:32):
Ideally, that's the space where we can grow is that self-serve analytics, they can consume data on their own more effectively than having to send 12 emails to find an answer to the question they saw in a report.
Tell me a little bit about the people who consume data from you. Who are the stakeholders that your team interacts with?
What's their perspective or culture around data? How data literate are they?

(17:55):
Do they love it, or are they resistant? I don't know. What kind of interaction do you have with stakeholders?
What's their role? What do they do?
Yes. I mean, there are a couple of different stakeholders.
The research side's a little different, it's kind of niche, so we won't talk about that. But in terms of the clinic, let's do clinical outcomes.

(18:18):
So we have an interesting model with clinical outcomes, because a lot of it is geared around the clinical registries, which are still largely manually abstracted.
So you have a person who actually goes into the medical record, and they look for certain variables, and those variables are then put into a data collection form.

(18:42):
And basically, the purpose of these registries is to look at specific procedures or disease processes.
And it's usually a national or international registry that then can risk adjust and compare all these sites and how you're doing to each other.
But with those clinical outcomes, the registries in particular, the way we have it structured is we actually have the abstraction outsourced to a third party.

(19:08):
And then for each registry, we have a subject matter expert who does data QA/QI.
And then they also serve as basically like the educator.
So they're the ones that are meeting with different teams, clinicians, administrators to go over the data,

(19:30):
helping people understand the metrics, what goes into the metrics, what levers to pull to change those metrics.
So we were doing data literacy before data literacy was cool.
But that's all right. It's an interesting model.
And even though you have that set up, there's still people that will give you a hard time saying, oh, the data is not right.

(19:54):
This metric isn't right. And so on and so forth.
But the way that I see it is, if somebody is engaging with you that way, they're at least looking at the data,
right? And that means that you're actually making it meaningful for them in some way, whether they find it useful or not.

(20:15):
They're at least looking at it, trying to understand it.
And maybe they're engaging with you in a way that's negative.
But, you know, it's still engagement. It's still trying to understand, trying to make things work to get to that point where we can use that data to make those decisions.
And so no matter how you have it structured, there's always going to be people that don't like what you show them,

(20:40):
whether it's because the data reflects negatively on them, or because it's actually incorrect.
So I think, in terms of consumers and your engagement with them, it's really just a matter of being willing to listen, being willing to eat some humble pie, and being willing to admit that maybe there is a mistake.

(21:02):
Going back to the drawing board, taking a look at it.
But it's also being willing to put your foot down at some point and say, look, this is legitimately correct.
And you need to, you know, kind of partner with us here and figure out how we can change the direction of the boat and make this metric go the right direction.

(21:23):
That's a great conversation. And I love that you've mentioned two things.
You were talking a good bit about data literacy and sort of how those executives and whatever stakeholders you're working with, they do need some level of data literacy.
But you mentioned things like listening well and understanding them.
So there is a role that we as data professionals need to be willing to learn the business literacy in order to bridge that gap.

(21:52):
Otherwise, it feels like an impasse or stalemate in that, well, this is the data.
Well, I don't like the data. And then you ended so well in saying, how do we partner together to turn that ship around to actually make a difference?
The data isn't the end all be all.
It's it's got to lead to something where we're going to make changes based on the data to help this organization or this business.

(22:19):
So I love hearing about success stories.
So, Joe, I'm wondering if you could tell us about one project or something that was like this was a really good success.
There might have been bumps in the road along the way to get there.
But what was it like to see the light bulbs go on, see data being used in a way that whether it was positive or negative, that stakeholder was like, you're right.

(22:44):
Let's get on board. Let's make a change.
So I'm wondering if you have one of those examples where you could get a little more into the weeds and details of what that was like.
Yeah, I don't want to be a wet blanket, but I don't know if I've ever actually experienced that,
sitting in a room telling people all this stuff while all these light bulbs go on.

(23:05):
I don't know.
If there's some, yeah, we may need a video.
Let us know if that's a real thing or not, if you've ever experienced it.
But I would say that with projects that drive meaningful benefit and my perspective is biased because I'm from health care and things generally move slower in health care.

(23:31):
But projects that drive meaningful benefit generally have a long horizon on them.
And this leads to one of the benefits of staying at a company for more than two years: it gives you that perspective,
especially if you're working on projects, of, OK, this is what it takes to deliver a project.

(23:53):
This is what happens after the project's delivered.
This is how you sustain a project.
Along those lines, one of the big things that we did was use these clinical outcomes to basically turn the ship around.
And we did that at our flagship hospital, UPMC Presbyterian, for the cardiac surgery programs.

(24:15):
I'll be candid, the cardiac surgery programs there were floundering.
The outcomes were very poor, and it wasn't just one source of data that generated that change.
There are multiple things that went on.

(24:36):
We started with the clinical outcomes where people said, OK, something's wrong.
We need to make a change.
They actually did a lot of qualitative research to get that small data from interviews with individuals in terms of trying to understand is there a cultural aspect to what's occurring.
And you put all that together at the end and you drive a path forward, using that data to help guide, as well as pairing it with the business stakeholders and their input.

(25:05):
So there were leadership changes that occurred.
There were practice changes that occurred.
And it took a total of probably close to five years, really, to see a significant turnaround, where we went from being in basically the bottom quartile of clinical outcomes for our program across the country to the top 15 percent in the country.

(25:30):
So, you know, it's not always going to be you, the data master, standing up there delivering a grand speech.
It's going to be a partnership.
And that's really the important aspect of it: partnering with individuals. Getting leadership buy-in is huge.
If you don't have leadership buy-in, you should just stop your project, basically. That's my opinion.

(25:52):
You can float it as many times as you want, but if that leader doesn't say, yes, let's do that, I'd say your failure rate is probably going to be like 98 percent.
But yeah, it's a part of getting that leadership buy-in, and using that data in tandem with the business knowledge, and setting clear, definitive goals on what you want to accomplish.

(26:17):
Those are really the big things in terms of having a successful project that is data-driven, and that actually succeeds and accomplishes something at the end.
So, a question around that: you talked about outcomes a few times, like quality outcomes.
And I'm guessing there's a large aspect of that that's quantitative, that's data based.

(26:40):
Can you put more texture on that for me, in terms of how do you measure outcomes, or what is a good outcome?
What is a negative outcome in this world?
Is it like did the patient have a successful recovery or not successful recovery or like what are the types of outcomes that you're measuring?
Because it seemed like those were a linchpin: outcomes are down,

(27:02):
we need to start transitioning to something different and making changes.
So tell me about those outcomes that you were measuring and what that looked like.
Yeah, so there are a lot of different ways. Outcomes can be measured with anything in healthcare.
You would have some type of patient contact, whether that's an office visit or a surgical procedure.

(27:24):
And you can measure obviously an outcome from that.
It'd be the same thing as somebody visiting your website and seeing, OK, did they make it to the cart?
Did they leave stuff in the cart? Basically.
Now, here, this person had a procedure. Did they have a surgical infection?
Did they die? So it's very similar.
We just call them outcomes in healthcare.

(27:47):
So we'll use cardiac surgery as an example.
The national bodies actually risk-adjust everything and, they don't like to say rank you,
but essentially they assign you different quartiles based on your performance.

(28:08):
And so you can see where you're sitting nationally or internationally.
But for cardiac surgery, the predominant ones are obviously mortality.
Did the patient die after the surgery?
And there's certain parameters around that. OK, did they die within 30 days?
If they died within 30 days, or during the hospitalization where the procedure occurred,

(28:30):
then it counts as a surgical mortality, versus if they were discharged and they died at day 45, it's not a surgical mortality.
And, you know, you can get into the whole idea of whether that's accurate or not accurate,
because literally the patient could get hit by a bus at day twenty-nine and that'd be a surgical mortality.

(28:52):
But, you know, chances are probably pretty low.
So, surgical mortality, and then major issues like we talked about: infection, or whether they, you know,
were on the breathing tube for too long, or they developed kidney failure, or they had a stroke, things like that.
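The 30-day rule Joe describes can be written out as a small piece of logic. This is only an illustrative sketch of the definition as stated above, not the actual registry specification:

```python
from datetime import date

def is_surgical_mortality(surgery_date, death_date, discharge_date):
    """Sketch of the rule as described: a death counts as a surgical
    mortality if it occurs within 30 days of surgery, or at any point
    during the hospitalization where the procedure occurred."""
    if death_date is None:
        return False  # patient survived
    within_30_days = (death_date - surgery_date).days <= 30
    # If discharge_date is None, the patient was never discharged,
    # so the death happened during the index hospitalization.
    during_index_stay = discharge_date is None or death_date <= discharge_date
    return within_30_days or during_index_stay

surgery = date(2024, 1, 1)
# Discharged at day 10, died at day 45: not a surgical mortality.
print(is_surgical_mortality(surgery, date(2024, 2, 15), date(2024, 1, 11)))  # False
# Died at day 29 (even if hit by a bus): counts as a surgical mortality.
print(is_surgical_mortality(surgery, date(2024, 1, 30), date(2024, 1, 11)))  # True
```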

(29:14):
And so those are the big outcomes that you measure.
And those are the ones that these governing bodies have risk models for.
These outcomes are run through the models, and then you're stratified based on those risk models.
So that's just an example of some of the outcomes we look at within healthcare.
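The risk adjustment Joe mentions is commonly summarized as an observed-to-expected (O/E) ratio: the model assigns each patient a predicted probability of the outcome, and a site's observed count is compared with the sum of those predictions. A minimal sketch, with made-up numbers:

```python
def observed_to_expected(outcomes, predicted_risks):
    """O/E ratio: observed events divided by the events the risk model
    predicted for this patient mix. Below 1.0 means better than expected.
    (Illustrative only; registries publish their own risk models.)"""
    observed = sum(outcomes)          # actual deaths/complications (0 or 1 each)
    expected = sum(predicted_risks)   # model-predicted probabilities per patient
    return observed / expected

# Hypothetical site: 2 deaths among 5 patients whose model-predicted
# risks sum to 2.5, giving an O/E of 0.8, better than expected.
outcomes = [1, 0, 1, 0, 0]
risks = [0.9, 0.3, 0.8, 0.2, 0.3]
print(round(observed_to_expected(outcomes, risks), 2))  # 0.8
```

Sites can then be placed into quartiles by comparing their O/E ratios, which is roughly the stratification Joe describes.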

(29:35):
But it's not any different than looking at how people interact with a website, like I said, or
how people interact with a product, where you look at what the downstream effects of that interaction are, basically.
Yeah. And you have the advantage, or I suppose what's different, is there are nationally established ways of measuring certain things and measuring outcomes, where some businesses are kind of left on their own: well, how do we want to measure this? What do we want?

(30:03):
That's maybe one of the pros and cons, perhaps, of health care is you have a lot of established standards for what things look like in health care outcomes.
Right. And that also comes to the discussion of, sometimes you'll have people say, well, this metric isn't correct, even though it's a nationally established metric.
It's like, it's not up to us, it's actually established.

(30:24):
But yeah, you run into those issues, and sometimes it'd be nice to be operating in an industry where we could actually tweak this metric and make it more accurate in our eyes.
So yeah, you are in highly regulated environments, and you are held to those standards.
That's one of those things where you don't have much control over those aspects. And there's an aspect of gamesmanship that comes in there.

(30:52):
Right. So making sure that you are doing what you can to make that metric look as good as it can be.
It's not committing fraud, but it's, OK, is there some documentation that needs to be tweaked that would make this metric better, because it just needs to be in there?
And, you know, again, you can get into a whole conversation about administrative burden in healthcare and everything else.

(31:20):
But I wouldn't say it's just limited to healthcare.
Highly regulated environments like banking would be another one where you see some of that stuff going on.
Yeah. With any metric, when there are incentives wrapped around it, you do end up with
gaming, or at least optimizing for the metric, because there are potential incentives or potential opportunities tied to it.

(31:47):
I'm curious about the next if you look ahead, six, 12 months, what are some exciting initiatives you are tackling or you see on the horizon for data or within your hospital or within the space, health care space in general?
Yeah, so I actually have two big projects, one still in the ideation phase and one we're moving forward with. The one we're moving forward with is actually working with our clinical trials group.

(32:13):
So we have a group of research coordinators, and this is part of what they do:
they get these clinical trials, and there are long lists of inclusion and exclusion criteria for who would qualify to be recruited to the trial.
And it's basically like a big logical if statement: so, if they have this, or if they have this and this, and not if they have this.
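That "big logical if statement" can be sketched directly in code. The criteria and field names below are hypothetical, not from any real trial:

```python
def qualifies(patient):
    """Sketch of trial screening as inclusion/exclusion logic.
    All criteria here are invented for illustration."""
    inclusion = (
        patient["age"] >= 18
        and patient["has_heart_failure"]
        and patient["ejection_fraction"] < 40
    )
    exclusion = (
        patient["on_dialysis"]
        or patient["prior_transplant"]
    )
    # Must meet every inclusion criterion and no exclusion criterion.
    return inclusion and not exclusion

candidate = {"age": 64, "has_heart_failure": True, "ejection_fraction": 35,
             "on_dialysis": False, "prior_transplant": False}
print(qualifies(candidate))  # True
```

The NLP product Joe goes on to describe would effectively extract fields like these from free-text charts and then apply logic of this shape at scale, instead of a coordinator reading every chart.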

(32:44):
And so they're manually combing through charts, trying to find those people.
And I was just working on some of the metrics recently,
for some of the trials that we're looking at implementing with what I'll talk about here in a second.
For every patient recruited, it was about 120 to 180 hours of chart review for a single patient.

(33:14):
That's astronomical, right?
Like that's. Yeah, yeah.
So the idea here is to take a company that actually came through our enterprises arm at UPMC,
which basically seed-funds startups.
What this group does is, I think their underlying technology is based around NLP, because it was before the whole gen AI boom.

(33:43):
But they take this NLP technology and they build different models in terms of what it goes out and looks for.
And so what we want to do is take this clinical trials group with the research coordinators,
take this product that this group has, and then let it loose on our data to go out and basically comb through the data to find people based on these inclusion and exclusion criteria.

(34:08):
So rather than spending 120 hours looking for these patients,
they get a pre-selected group of patients brought back, which they can then screen, versus going out into the masses and trying to find these patients.
So the idea is to reduce that amount of screening time and be able to recruit more patients more quickly.
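The "big logical if statement" Joe describes can be sketched in code. This is a purely illustrative sketch: the criteria, field names, and thresholds below are invented for the example, not taken from any real trial protocol, and a real screening product would apply far longer criteria lists over free-text charts rather than clean structured records.

```python
# Hypothetical sketch: one trial's inclusion/exclusion criteria expressed
# as a boolean filter over structured patient records. All fields and
# thresholds are invented for illustration.

def is_candidate(patient: dict) -> bool:
    """Return True if the patient passes inclusion and no exclusion criteria."""
    inclusion = (
        patient["age"] >= 18
        and "heart_failure" in patient["diagnoses"]
        and patient["ejection_fraction"] < 40
    )
    exclusion = (
        "pregnancy" in patient["diagnoses"]
        or "warfarin" in patient["medications"]
    )
    return inclusion and not exclusion

patients = [
    {"age": 64, "diagnoses": {"heart_failure"}, "ejection_fraction": 32,
     "medications": {"lisinopril"}},
    {"age": 70, "diagnoses": {"heart_failure"}, "ejection_fraction": 35,
     "medications": {"warfarin"}},  # ruled out by an exclusion criterion
    {"age": 55, "diagnoses": {"afib"}, "ejection_fraction": 55,
     "medications": set()},         # fails the inclusion criteria
]

# Instead of 120+ hours of manual chart review, the coordinators would
# start from this pre-selected list and confirm eligibility from there.
candidates = [p for p in patients if is_candidate(p)]
print(len(candidates))  # 1
```

The point of the automation is exactly this shape: the model narrows the population down to the small candidate list, and the humans do the final screen.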

(34:30):
And that leads to things like, you know, expanding access for patients that need access to these trials, as well as revenue for us,
because we get industry sponsorship for the patients that we recruit in.
And so it's a huge benefit.
And most people don't really think about clinical trials and matching patients to those trials until you're in a position where you actually need to be matched.

(34:56):
And a good instance of this, that I didn't even really fully understand until recently:
we had a friend of ours whose two-year-old son had a rare cancer, and, you know, when they get diagnosed,
you're sitting in the hospital and you think, well, I only have this one opinion.

(35:16):
I mean, fortunately, they're in Pittsburgh. It's a good opinion.
But there are so many other people out there.
There could be trials out there that could be treating this rare form of cancer.
But how do you actually find it?
The chances are that you need to make a decision fairly quickly, and you need to be able to do it with confidence.

(35:39):
And, I mean, add the stress you're feeling right now, too, as your child is going through something like that. It's a hard position to even make a decision in.
Right, right. It's hard to make a decision, and you don't know if it's the right decision.
And so that concept of being able to expand access to clinical trials, to be able to recruit more people into potentially life-saving treatments, is a really, really important thing, in my opinion.

(36:07):
And, you know, leveraging these analytic capabilities, these people and their products, and being able to do that and see if it actually can deliver is really one of the things I'm excited about this year.
The other thing, which is currently in the ideation phase, deals more with generative AI.
And so this goes back to the clinical registries.

(36:29):
I do have a group underneath me that still manually abstracts for these clinical registries.
And, you know, there's going to be this retirement tsunami coming with this group because they're all in their late 50s or hitting their 60s.
And I can say that because they said that in a meeting to me.
And so the thought is, can we use AI augmentation?

(36:56):
Can we point this at our EHR, tell it where we generally find the different data points, have it go out there, get those data points for the parameters that need to be entered into the questionnaire, and bring them back with context?
So then the abstractors themselves, as they work through, can reinforce that learning.

(37:22):
And essentially: one, they increase their efficiency; two, they're training something that is basically a knowledge capture of what they know and understand; and three, it sets up the person who's going to take over for them with a much less steep learning curve to take on that work after they retire.
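As a rough illustration of the "get the data point and bring it back with context" idea, here is a minimal sketch. It uses a simple regular expression as a stand-in for the NLP or generative model Joe is describing; the registry field, the pattern, and the note text are all invented for the example.

```python
# Illustrative sketch: find a registry data point in free-text chart notes
# and return it with surrounding context, so the abstractor verifies the
# value instead of hunting for it. A regex stands in for the real model.
import re

def extract_with_context(note: str, pattern: str, window: int = 30):
    """Return the first match for a registry field plus nearby text."""
    m = re.search(pattern, note, flags=re.IGNORECASE)
    if m is None:
        return None
    start = max(0, m.start() - window)
    end = min(len(note), m.end() + window)
    return {"value": m.group(1), "context": note[start:end]}

note = "Echo today. LVEF estimated at 35% with mild mitral regurgitation."
result = extract_with_context(note, r"LVEF estimated at (\d+)%")
print(result["value"])  # 35
```

The "context" field is what makes this an augmentation rather than a replacement: the abstractor sees where the value came from and applies their judgment, which is also how that expert knowledge gets captured for whoever takes over the registry.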

(37:48):
And so that's still ideation.
I'm talking with one company about the possibility of that.
But that's super exciting for me, because with these clinical registries, there are a lot of single points of failure, because there's a very deep knowledge expert that sits with each of those registries.

(38:11):
And to lose that knowledge is a huge loss.
And so it's always been like a thorn in my side: how do we prevent that single point of failure?
And this looks to be possibly a promising way of doing that.
Yeah, I love hearing about efficiencies that can get built into organizations like hospitals or health care, because of how that opens up potential for people to get better health care, or for the right people to get access to the right thing, or for it to scale in a way that it couldn't before.

(38:45):
And so those outcomes, and the people's lives that are actually affected by the work of health care, are improved in meaningful, tangible ways through technology efficiencies and data efficiencies in the background.
I was thinking the same thing. As you're talking about outcomes, Joe, I'm thinking there's still a clear outcome here that is different from most of corporate America, where it's just, let's up the profits in our organization.

(39:13):
And there's nothing wrong with that. But there's a unique mission that health care has in the improvement of people's lives and well-being and all of those kinds of outcomes.
And the fact that you do even have those benchmarks shows that there's concern for that aspect of someone's life that often gets missed in consumer goods and other kinds of services that are out there that aren't focused on human well-being.

(39:39):
So I think that's a neat distinctive that certain nonprofit, not-for-profit organizations get to focus on.
And it brings meaning to jobs that may be very energizing and exciting, as I'm sure you even relate to having been an RN and having that boots on the ground experience, caring for patients.

(40:03):
And now you are translating that into your data experience.
I think that's a really neat part of this conversation that I've enjoyed listening to you talk about.
Yeah, and I 100% agree with you, but I would temper that by saying every business is financially driven.
So always have defensive metrics set up and ready to go.

(40:26):
For instance, my biostatisticians and the research group, it's amazing that it still exists, because it's not like we're generating revenue, right?
For a lot of my group, there's no direct revenue generation.
We have a few projects that do generate revenue.
But as far as my research group goes, a big thing is the papers that we publish from that group.

(40:50):
It's not just we're publishing papers.
It's how many papers are we publishing in top quartile journals?
Right now, it's over 80 percent in the last year.
How many people are getting eyes on this in terms of the citations from each paper?
In terms of how many news outlets have picked up some of the publications, how many blog posts, how many mentions on X, you know, formerly known as Twitter.

(41:18):
And, you know, those kinds of metrics are super important, because the bean counters aren't really thinking in that way.
So being able to show them, being able to tell them that story of, hey, look, this is marketing.
That's what this is. You cannot look at this purely as a cost center.

(41:39):
This is something that actually gives meaning, whether it's marketing for the company,
progressing the science or helping clinical outcomes and making things better for the patients because we're using that data to drive improvement.
And so with those kinds of things, you have to be able to talk about that and convey that to those people that come in just looking at the bottom line.

(42:02):
Because, yeah, like at the end of the day, it's a business and you have to be financially profitable.
And we just saw this: UPMC laid off over a thousand people last week because we're in financial trouble.
And because of that, when those discussions happen, you want those defensive metrics ready. You want to be at the table, able to say, look,

(42:24):
this is what the team is capable of doing, this is what they've done, this is why they need to stay.
And that is regardless of the altruistic goals of your company that will ultimately be a discussion at some point in most careers.
Like, how do we keep this company financially viable?

(42:45):
And you need to look at your services and what you're offering, what makes sense and what doesn't make sense.
And you need to be honest with yourself, too. Like, is what we're doing really worth keeping here?
And so there's those are tough decisions and those are decisions that people that move into leadership will naturally have to make.
But if you're an individual contributor, like you should be thinking about, OK,

(43:08):
how do I measure what our team is doing and the output that we're putting out there and make sure that the person leading your team,
either they're tracking that or you get them thinking about it?
Because if someone comes to you and you don't have any way of saying, here's our productivity, here's what we're doing for the work, I'm sorry.

(43:29):
You're not going to get much sympathy from people that want to cut services.
Joe, this has been fun. And we're coming a little bit close on time.
But a couple of questions before we wrap up.
Tell me something you do for fun outside of work. Who is Joe behind the director of analytics and data science?
You know, I'm a robot. I don't do anything fun.

(43:53):
All data people are just robots. Yeah.
I'm actually Derek, you know.
So I have a wife and three kids: eight, six, and three.
And so a lot of my time is spent with them.

(44:14):
And so, you know, like going to soccer games or going ice skating, things like that.
And, you know, for fun, most recently, I've been training for a half marathon.
So that's been some of the stuff that I do for fun.
I think it's debatable whether half marathons are fun or not, but that's fun for you.

(44:38):
Yeah. I mean, those are the big things that I do.
My hobbies vary. It all depends on the day of the week.
I was really big into DIY for a couple of years, into winemaking for a while.
You know, like different random stuff that you just pick up and you start doing.
And I think it's probably one of the big reasons why I got into data: it lets you marry that creativity with the logical side of the brain.

(45:08):
And that's also one of the reasons why I couldn't go into the arts, right?
Like, I'm not right-brained enough to just go crazy.
I have that left brain pulling me back constantly.
And data gives me a nice balance with that, where you can be creative but also use the logical side of your brain as well.
Excellent. Excellent. For sure. I think Troy and I both feel the same way about the data world.

(45:31):
So before we wrap up, tell listeners where they can find out more about you online.
I know you do a lot of work around helping people get into data, and you have some content around that.
So, yeah, if people want to find out more about you or look up some of the content you put out,
where should they go? Yeah, I mean, I really just focus on LinkedIn.
I haven't really fostered any other social outlets.

(45:55):
But if you just look for Joe Squire on LinkedIn, you can find me.
And that's where a lot of the work that I'm doing is.
The original content was more on helping people get into and upskill in data.
And I still put some of that basic technical stuff out there.
But it's shifted a little more toward data leadership and those concepts.

(46:22):
And it'll shift more. I'm going back for my master's in data science.
So I think it'll be more on the advanced analytics side in the near term here.
Good stuff. Excellent. Well, Joe, thanks so much for your time today and for joining us.
One last question before you go. I've got to get this in there.
I'm a huge dad joke fan. So I just have to ask you, what's the richest kind of heir?

(46:47):
The richest kind. I don't know.
Billionaire. Oh, painful, painful.
All right. Thank you, Joe. Joe, thanks so much.
Pleasure having you here. All right, folks, that's it for today.

(47:11):
Thanks for joining us on Making Data Matter. I'm Sawyer Nyquist and this was Joe Squire and Troy Dueck.
Thanks so much, everybody. We'll see you next time.