Steve Rummel is a Senior Internal Audit Analytics Manager at CVS Health.

In this episode, Steve explains how he helps his audit team use data.


About this podcast
The podcast for performance auditors and internal auditors that use (or want to use) data.
Hosted by Conor McGarrity and Yusuf Moolla.
Produced by Risk Insights (riskinsights.com.au).


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Narrator (00:18):
You're listening to The Assurance Show.
The podcast for performance auditors and internal auditors that focuses on data and risk. Your hosts are Conor McGarrity and Yusuf Moolla.

Yusuf Moolla (00:19):
Today, we have Steve Rummel joining us.
A Senior Analytics Manager, Internal Audit, with CVS Health. We're not going to try to explain your background because you can do that best. So it would be good if we could kick off with a little story about where you've come from and where you are now.

Steve Rummel (00:34):
So I am right now about 16 kilometers west of Chicago, where I've lived for most of the last two decades after graduating university. My undergrad is in Finance. I'm not going to tell you what year that's from, but you'll probably figure it out if you do the math and take notes carefully during the session. I started out as a Financial Analyst and an Accountant, worked there for a couple of years. Went into consulting.

(00:55):
Started to do IT audit right around the time Sarbanes-Oxley heated up. For those of your listeners who might not be familiar with that, here in the States we had some really notable public fraud in the early 2000s that resulted in a large global accounting firm, Arthur Andersen, going under. Part of their group rebranded themselves as Protiviti and spun off as an IT audit service provider. I worked for them for about seven years.

(01:17):
During that time, I started doing a lot of analytical work. And again, like you and a lot of your listeners, I was a nerd from day one. So I've been a computer person, and decided that was just not sexy enough. I had to do accounting and finance as well. Along the way I picked up a CPA. During my time at Protiviti, I started doing a lot of analytical work using ACL, the best audit analytics software package in 1987.

(01:38):
And worked on that for about 10 years. Eventually, found my way to better tools and have been basically doing some variety of that ever since. And for the last 10 years or so, I've spent most of my time helping organizations ramp up analytical functions within their internal audit departments. So everything from figuring out what tools to use, what kind of

(01:59):
people to hire, writing job descriptions, working on very detailed... I don't want to call it a vision statement because that sounds a little too generic, but basically sitting down and helping organizations figure out, all right, hey, what does data analytics mean in an internal audit capacity anyway? What's the difference between data science and analytics and just ETL?

(02:19):
What can we as an organization do? What is the best way to ensure that when we ramp up this function within the organization, it will stand the best chance of being successful? And I've been doing that for about the last 10 years on and off. Right now, I am at CVS. We have 10 or 15 different business units within my internal audit shop. There are about 200 auditors in the shop.

(02:40):
I work specifically within the prescription benefits management vertical. So we are over all of the functions within the company that actually make sure that people who come into our pharmacies, or submit a prescription to one of our mail order pharmacies or specialty pharmacies, get that prescription in a timely manner. That they get what they're supposed to get when they're supposed to get it.

(03:01):
Which sounds really simple, but there's actually a lot there.

Yusuf Moolla (03:03):
What is it that drew you into the use of data
within audit in particular?
Because it's quite a niche area.

Steve Rummel (03:09):
Well, I think problem solving, right?
When you start out working in internal audit, especially if you're working in professional services, you don't get called into an organization when things are going really well and everything's just swimming along. You get called in because there's a problem. Either the client doesn't have capacity or there's some weird thorny thing that they're grappling with and they can't

(03:29):
solve.
And obviously, accounting and finance, they all run on top of some kind of enterprise accounting and reporting package. There are all kinds of other applications, from Microsoft Excel all the way up to SAP. There's a lot of moving pieces. There's a lot of data moving through organizations, and things can go wrong. And there is a good living to be made helping organizations

(03:52):
figure out where the problems are, breaking down that problem into solvable chunks, and then saying: here's the problem, let's figure out how we can implement a solution that will work for you and that will hold up over time. That sounds really nice and notional. It was basically my boss going, what the hell is going on here? Figure it out. So you go and you sit down with IT. You sit down with the application owner, you sit down with accounting, and you say, okay, where are things just

(04:14):
falling apart?
Where do you see the wheels coming off this process? And when they tell you, you start digging in and just doing that root cause analysis. And pretty soon you've dug your way all the way down to IT, into the database administrator, into the data lake and whatever else. And it is actually a lot of fun when you solve a problem like that. Human beings are natural sort of problem solvers. So when you can go in and say, hey, you know, I figured out this thorny thing.

(04:34):
Here's what we need to do.
It's fun.
And then you do that often enough, and pretty soon you've done it enough that people think that you have some credibility in that space. So they call you more often. And then you have to train minions on how to do it, because otherwise you're going to spend all your time doing it.

Conor McGarrity (04:48):
Just to put it in context, Steve's company,
CVS, is a Fortune 5 company.
Just behind Apple, which is number four. So Steve, over the years you've helped establish or grow various analytics functions within internal audit. What are some of the common challenges or issues that you've encountered, particularly in the early years of setting up those

(05:08):
practices?

Steve Rummel (05:09):
Probably one of the biggest is that technology has been changing so much in the last 50 years, but that only accelerates, right? And it's a cliche to say it, but it's true. When I attended university in the mid 1990s, the internet was brand new. Napster was still the big thing, you know. And you start pushing that forward. You look at the tools and things that have come about since then. Web development back then was, hey, you fire up the text

(05:31):
editor, you write HTML, maybe CSS, but that was it. And now we've had several generations of really robust tools. Keeping track of that is a full-time job in and of itself. A lot of the people who are now in senior leadership roles in organizations, especially large organizations where you're spending your career working your way through these things, they are not necessarily the kind of people who stay really

(05:52):
on top of these things.
I made a joke about ACL before being the best software tool of 1987. There are so many more tools now. You start with ACL and SAS, but then over time there are other tools that came out for analytical things. So you've got Python, which is one that I use now, Tableau for visualization, Dataiku for GUI-centered analytics.

(06:14):
Alteryx, another sort of GUI software. Dataiku is more web-based. Alteryx is a fat application. You gentlemen were telling me about an open source application that you used. So there's a lot of stuff out there. There are a lot of vendors out there pushing these things. And a lot of the senior leadership who are, right now, they are the Chief Audit Executives. They're making decisions about, hey, I heard this great thing about this, we should put together an analytical function.

(06:35):
They don't know. They need to be well-advised. And there is a whole cottage industry of people out there who are willing to advise them. A lot of them are vendors. I'm sure a lot of your listeners have sat through vendor presentations where, oh, I've got this black box product, you plug it in and you feed in your data at one end and you get nothing but beautiful, clean, crisp... That's garbage.

(06:55):
That's not true.
There are people who get paid a lot of money to sell so many yards of blue sky to senior leaders. And that's not to disparage the products, necessarily. They are selling good products, but the leaders need to have somebody on their side to understand not just the products, but the process. Analytics is not a product. It's a process. It's a way of thinking about your organization, the business

(07:17):
process you're looking at, and what you want to get out of it. You have a business process that, in the case of audit, you want to audit. How can you get some assurance that the process is working as it's supposed to? And to do that, you need to think about not what product can I stick on this thing, but what am I looking for? A professor of mine once called internal audit the practical application of common sense.

(07:38):
You look at a process, you say, where are the risks here? Where are the hazards? Let me draw some things on a whiteboard and draw out this process and say, hey, here's a failure point. Here's a failure point. Here's a failure point or a potential failure point. Let's think about this in an organized and systematic manner. And then once you have some idea of what you think the issues could be, then you start saying, all right, what's the best tool for the job to solve this?

(07:58):
Do we need to monitor something?
Do we need some kind of a reconciliation tool, whatever it may be? So that's a challenge, because you've got leaders who have limited experience with the technology, but they know their business and their process inside and out. And they don't want to be sitting through multiple tech demos. They want somebody to give them the answer. And the answer is never going to be just a product. It's going to be: this is a process that, when we put this in

(08:21):
place, and we execute it in a consistent manner and to quality, you're going to get that additional assurance that you want in your audits. And they have to understand the benefits of it. And they have to understand the limits of it. There are some audits and some business areas that you can't automate. Flip side, there are some that you can automate that people get really afraid of. So you've also got the HR dynamic.

(08:42):
Oh my gosh, if you automate all of this stuff, where's my job going to go? And so you get people who are resistant to that as well. The way I typically pitch this, I shouldn't say pitch, I think it's true: auditors don't want to be bored. Nobody wants to be bored at work. Auditors want to be doing interesting work. But most of audit is boring. When you sit down and you're putting together a set of work papers, the process of conducting an audit from start

(09:02):
to finish might be some interesting meetings, might be some interesting stuff. You learn about the business along the way. But the bulk of it is going to be just scutwork, really dull stuff. If you can automate that and get people to focus on the stuff that they need to be focused on, one, you're going to get better results, because they are going to be, again, focused on the problem at hand.

(09:22):
Second, you won't be wasting their time. If you give somebody X number of hours to do an audit, and those hours are spent actually using their brains to think about what they're supposed to be doing and about the risks that they're supposed to be illuminating and quantifying, you're going to get a lot better results. If you can go the whole hog and turn that into some sort of a continuous process so that all the boring stuff never has to be

(09:42):
done by a person again.
You're giving your auditors a great experience, right? Now they're like, hey, I learned this thing. It was really hard to learn, but now I don't have to do the boring stuff anymore. I can think about this. One of the hardest things in audit to find is something like collusive fraud. That would be ideal. But it takes a ton of legwork that you can't automate; you can't build an automated fraud detector.

(10:04):
You can build a universe of small applications and analytics that can point you in the direction. That's what people should be doing. They should not be spending their time doing boring, automatable things. So you asked, hey, what are the things that are challenging, or the issues that come up, when you're trying to ramp up a function like this? It's getting past those organizational hurdles where people are worried about their jobs. It's getting past the leadership's lack of familiarity

(10:26):
with the tools.
It's getting grassroots engagement. I think that's the other piece. You need executive buy-in. You need them to understand what's possible. You need to understand what they want to do. You need to understand what their vision for this thing is. Do they want a be-all and end-all super analytical function, and to knock it out? Or are they just looking to automate some things?
Or are they looking to get, you always hear the phrase, quick

(10:48):
wins?
There are a lot of quick wins to get in the analytics space if you don't have anything to start with. It's really easy to jump in and say, hey, I'm just going to save you the hassle of wrangling 400 gigabytes of data, because it's just a pain. You can't do it in Excel and your staff are tearing their hair out. But figure out what their vision is, what they want to do, and then put together a project plan to get there. And you want to have a combination of sort of visionary

(11:10):
things that you put in there, but you also want to have the very nuts and bolts. All right, hey, I'm going to be involved in every planning meeting. I'm going to be involved in annual audit planning. And as I build a team, that team is going to be involved in those things. If I come into an organization, I'm going to start by, one, helping leadership figure out what they want to do, where they want to go. And I'm also going to start talking to the team right away, because if you don't get the team to buy in, if you don't get

(11:32):
the managers and their staff to buy in, you will fail. You will just not be able to do it. You have to have that grassroots level of support and you have to have the executive support. The only other piece is time. It takes time for this, especially in a larger organization. You can get measurable results right away. Hey, this went a lot faster. This went a lot better. But it takes a long time to actually

(11:52):
put a fundamental process in place that is going to manifest itself in the audit plan in terms of, okay, how many audits did we do that we used analytics on?
How much time did we save?

Yusuf Moolla (12:01):
You were talking there about the key challenges
being largely managerial, as opposed to needing to find the right technology or find the right data, et cetera. The other thing that we see, and that you alluded to, was that there's fear amongst auditors of directly using data themselves. The natural reaction would be to just toss that over the fence and get Steve to do it as opposed to having to do it

(12:23):
ourselves.
How have you overcome that?

Steve Rummel (12:25):
I don't know if it's a fear of data or just a
fear of new things.
And sadly, when I've run into it, the only way past that has been training. I spend a lot of my time now training staff, training seniors, training managers, training executives, to understand enough of what they need to know so that they can

(12:46):
intelligently assess audit analytics in whatever role they have. When you're talking staff and seniors, if you have a staff or a senior person, you're talking, ideally, typically, somewhere less than 10 years out of university in most cases. There are a lot of exceptions. I've run into much older staff and seniors. But when you're running into somebody a few years out of university and they're really that afraid, you have to ask

(13:08):
yourself why.
I haven't seen anything consistent there, but a lot of it is just training. Sit them down and just walk them through. And I think the best way to put it would be: any of this stuff, any of these tools that we have, any of the techniques that we have, they're discoverable. Audit is not rocket surgery. Audit is not even data science. I have a master's degree in data science. I've only scratched the surface of my first graduate school

(13:31):
class.
Because accounting and finance are fundamentally, in an audit context, really simple. Unless you're doing some kind of really specialized audit, like a model validation audit for a hedge fund or something like that, which is a whole separate conversation that has nothing to do with the vast majority of audit. Most audit is pretty straightforward. So it's: talk to the person about it.
What do you do?
What are you trying to do?

(13:52):
Explain your audit to me.
Tell me what you're doing.
They tell you, and you say, okay, what are the challenges you face? What's the biggest pain trying to do this audit? I get this Excel file that's 400 megabytes. And okay, I can help you with the logistics of that. So it's trying to find out what the person's pain point is and solving that. Because even somebody who's really afraid, paranoically afraid, they still appreciate the effort to help them do what

(14:14):
they want to do.
So you have to meet people where they are and talk to them about things that they understand. And if that means take it very slow and just, if it's okay with you, we're just going to start right here and we're going to get about this far and that's it. But for the most part, what I see a lot of is that it's not so much that they're afraid. The frame of reference is totally different. I'll give you an example. Typically, auditors think in terms of, I'm going to take the sample and I'm going to test it.

(14:36):
And if they want to get fancy, they'll say, we're going to take a statistically significant sample and then we're going to test it. And I say, okay, what's a statistically significant sample of this thing? It's 60. Okay. Where'd you come up with that? That's what the public auditor told me it was. Okay. How do they know? I don't know, it's 60, it's like a magic number. And then, if they get really fancy, they'll go to a website and say, hey, I went to this website that calculates the... I'm

(14:57):
not sure what a confidence interval is, but I know that this is the value I plug in there. Right. So you sit down and you start talking to them about it and say, okay, there are two things at play here. One, negotiated numbers. You have a partner at a public firm who says they're not going to do a sample of 360, even though that would be a statistically significant sample if we wanted a 99% confidence interval. So fine, we'll take 60. There's a lot of that that goes on. There is a lot of negotiation in audit.

(15:19):
So it's understanding where those pieces fit in. And then two, going back to the other side and saying, okay, look, if you want to do detailed testing of a sample of 60, that's fine. But one, here's what statistical significance means. And we actually just did something recently, a big exercise at CVS, where we put together some training to explain to our teams what does statistically significant even

(15:40):
mean.
And we walked through three pages of a really detailed example saying, okay, if you do this, this is how you get a statistically significant sample. And by the way, there are ways that you can change the way you sample, where you get the sample but then you start testing and you say, okay, if I can go through a certain percentage of it and find no issues, can I cut off testing early?
So we start talking about ways to really make people
(16:02):
understand what the process is.
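
As an illustration of the sample-size mechanics being described here, the following is a minimal, hypothetical Python sketch of how an attribute sample size is commonly derived from a confidence level and margin of error. The function name and default parameters are assumptions for illustration, not the training material CVS built.

```python
# Hypothetical sketch: where a "statistically significant" attribute sample
# size comes from, so it isn't just a magic number like 60.
from math import ceil
from statistics import NormalDist

def attribute_sample_size(population: int,
                          confidence: float = 0.95,
                          margin_of_error: float = 0.05,
                          expected_error_rate: float = 0.5) -> int:
    """Items to test to estimate an error rate within +/- margin_of_error."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)      # two-sided z-score
    p = expected_error_rate                                  # 0.5 is the most conservative guess
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)     # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                     # finite population correction
    return ceil(n)

# Tightening the confidence level or the margin pushes the sample well past 60;
# relaxing them (or the expected error rate) is the negotiation described above.
print(attribute_sample_size(50_000, confidence=0.99, margin_of_error=0.05))
print(attribute_sample_size(50_000, confidence=0.90, margin_of_error=0.10, expected_error_rate=0.05))
```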
But the better case, and this is the use case that people have been using forever. I got this from ACL, and this is the one insight that I love about them: can you test a hundred percent of the population? If you can do attribute testing on a hundred percent of a population and come up with a true error rate for the population, bam, you're done. And not only that, if I tell somebody, hey, I can give you a

(16:23):
Jupyter notebook that's going to have everything from the data import to the validation. All the things you need to tick the boxes and say, yes, we have complete and accurate data. We tested the population. This is how we tested it. Bam. Here's the error rate for these parameters that we tested. They're going to be happy as a clam.
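
The notebook deliverable described here, testing an attribute across 100% of the population instead of a sample, might look something like the simplified pandas sketch below. The file name, column names, and the 14-day fill threshold are invented for illustration; they are not CVS's actual data or criteria.

```python
# Hypothetical full-population attribute test: import, completeness checks,
# test every record, report the true error rate (no sampling involved).
import pandas as pd

population = pd.read_csv("claims_extract.csv")   # illustrative extract from a system of record

# Completeness and accuracy checks that support the "complete and accurate data" tick-box.
assert population["claim_id"].is_unique, "duplicate claim IDs in extract"
assert population["fill_date"].notna().all(), "missing fill dates in extract"

# Attribute under test: was each prescription filled within an assumed 14-day window?
days_to_fill = (
    pd.to_datetime(population["fill_date"]) - pd.to_datetime(population["approval_date"])
).dt.days
exceptions = population[days_to_fill > 14]

error_rate = len(exceptions) / len(population)   # true error rate for the whole population
print(f"Tested {len(population):,} records; exception rate {error_rate:.2%}")
exceptions.to_csv("exceptions_for_follow_up.csv", index=False)
```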
And once they do that, then they start to see, they start to think, not in terms of what does an audit look like in my little

(16:43):
world, but they actually start to think about the audit as it should be thought of: as a means of clearing away the undergrowth and getting to assurance, getting to the actual risks at play. How much have we explained? We have just explained everything about these three things by doing this test.
So it's a change of focus.
It's really easy for us auditors, and I'm guilty of this myself.

(17:03):
I think we all are. You have an audit plan and we all make our checklists. Okay, here's the audit plan, which is a big checklist. And now I'm going to pull the audit out and I'll just start on the first one. I'm going to start checking it off. Okay, did this. I did my kickoff deck and my kickoff memo. I had the meetings. I've got to work my way down. And you get into that mode because we have deadlines, because we have time
pressure.
And because you have to do quality work, there's a lot of

(17:26):
stuff you just have to do.
So you get really good at working your way through a checklist. In audit, that's the career path. The better you can crank your way through a checklist, the better you're going to do, because you're going to be doing good work on your time budget. So it's getting people to stop thinking about that. And that goes back to executive support as well, because analytics will save you time, but you do have to invest time to do

(17:47):
it.
And sometimes analytics doesn't work. You may say, hey, I'm going to look at this data, give me a week with it. You play with it, you stretch it. You bend it eight ways from Sunday and you might come up with something interesting, or you might not. And you might find that you can't do the testing that the audit team wants to do. You can't do it in an automated fashion. You can certainly help them understand the context of the

(18:07):
business process and the data, but you may not be able to give them that home run thing. And that's something you have to be willing to do. And something that leadership has to be willing to accept: there's not always going to be a DA play. I go to all the prep meetings for every audit that kicks off, and one of the deliverables I have to my audit teams is, I will give you either an analytical plan for what I think we should do based on our conversation.

(18:27):
So it's not my plan, it's our plan. But I'm going to tell you technically how we'll do it. Or I'm going to write a paragraph saying there is no DA play here, and here's why. And I will list out in English why we're not doing any kind of analytics, because again, my leadership, they're like, you're a data guy and you're telling me you're not going to do any analysis. Yeah, there's no play here. And here's why. And you explain it in plain English, and that
takes time.

(18:48):
Doing something simply and clearly actually takes a lot more time than just writing paragraphs and paragraphs. Writing a nice, short, concise audit report is a hallmark of a great audit manager to my mind, but it takes time to do it and
it takes talent to do it.

Conor McGarrity (19:23):
I'm just going back to something you said there
where, with the real hairy stuff, you guys like to get the data analytics team involved as opposed to the auditors themselves. So they kick over some of the more complex analyses to you guys. Is that a sign of success then for you guys, when your services, Steve, are oversubscribed? And if that is the case, how do you then prioritize your team's

(19:44):
ability to assist the auditors on their projects?

Steve Rummel (19:48):
That's an excellent question.
I have yet to get to the point where we were oversubscribed, because again, it takes time to ramp up. So if you have a 200-person audit shop... In CVS, we have an actual dedicated data analytics function. They have a lot more high-end hardware and some applications that I know how to use, but I don't have direct access to,
because I have a laptop.

(20:09):
I can run Python on there.
I can run Alteryx on there.
I can run Tableau on there.
But the gerbils in my little Dell are not going to be able to run some kind of clustered computing application. I don't have that. So we have a national team. Their role is a little different than mine. They have their own sort of analytical plan that they put together. So my primary driver is I serve the PBM audit plan, and secondary

(20:31):
to that, I serve the rest of the department's audit plan. So I'm the service bureau. I go out and I work with the auditors. The national team, they will do that if they are asked. So if I get into something where I've got like four terabytes of data and I can't do anything with it:
I know what I want to do.
I can't do it.
Can you guys handle this for me?
We'll figure it out.
Yes.
Okay.
So we'll work with that.
But to your question, what do you do when you're oversubscribed and how do you prioritize?

(20:51):
Every large organization I've worked for, the audit plans, they span a year. So if you do some planning correctly, assuming you do some planning correctly, you don't really have to worry too much about that happening, because you've looked at the audit plan ahead of time and said, okay, we know we've got some audits coming up that are going to use our system of record and are going to take up a ton of data. So we try to space things out anyway, because we don't want to

(21:13):
be banging on the same data provider all the time. That's a really good way to lose friends in audit, to always nag the same people. If you're always auditing the same business process, or God forbid, you're always asking the same team, because you have one data team that supports multiple business processes and all. And they're like, why are you always calling me? Because we're auditing all the people you serve.

(21:34):
So what we try to do is we try to rationalize. We rationalize our data requests. So if we know that we're going to be auditing from the same data source, and we have several large ones, but if we know, hey, we're going to be hitting this one a lot, we'll reach out at the beginning of the year. We'll tell them, hey, this is what we're doing. So we really try to avoid those bottlenecks. Typically the bottlenecks are not with our capacity. It's either trying to get IT to schedule stuff.

(21:57):
Because again, rule number one of audit, especially IT audit: don't mess with production. Do not mess with production. So we're respectful of that. We try to have an SLA, a service level agreement, with our auditees and IT, so we can avoid that. But how do we prioritize? There are times where we will sit there and say, okay, which one's more important? Fortunately, I don't have to make that decision too often

(22:18):
because when we do our audit plan, we do what we call a risk-based audit plan. It's a hybrid. Many of our risks are based on what the business leaders of the different business units tell us. When we do interviews in Q4 of the previous year, we'll say, okay, well, you know, what keeps you awake at night? What are the new things on the horizon? And we also look out at the literature and landscape and say, okay, what's coming in from outside the organization that we

(22:40):
need to look at.
Are there legislative changes?
Are there other cyber threatsthere?
Of course.
Yeah.
But we have a ranking of our audits, and it's not hard and fast or cast in stone. But we know which audits are like, okay, these five have got to get done. And they take priority. And then these others, yeah, if something has to slip, we're going to pick this one over this one. Then we have some that are right at the bottom of our sort of

(23:00):
modeling approach, and we accept this going in. At the beginning of the year, we know that not every audit we put on the plan is going to get done. Because chances are something's going to happen during the course of the year where we're going to have to add an audit to the audit plan and rejigger everything. So what are we comfortable dropping off the audit plan? And we have a whole change management process for our audit plan. That's another thing, actually. Having good processes in place that you can integrate the data

(23:23):
analytics process into is another key thing. If you're going to make a change to the audit plan, do you have a defined change process where you say, okay, we're proposing a change to the audit plan based on this, that, and the other thing. And here are our rules for changing it. And this is the risk assessment. Because it makes the conversations with leadership easier when you're like, hey, we said we were going to do this thing, now we're not going to do it. And when we go to our audit committee and have to explain why we're swapping stuff out, we need to be able to explain that.

(23:45):
To your point, I have not seen us get so oversubscribed. And again, I feel like it's because audits are not necessarily... there are a lot of manual components to them. And there's also a technology win already. So at CVS, and other places I've been to, we already have functions where we use robotic process automation in multiple areas. A lot of the stuff that would normally take

(24:06):
up a lot of time, not that it doesn't still, there are still auditors doing stuff, but we've already started to automate away the really... I don't even call it analytics, the pure RPA stuff. I don't think anybody would ever call RPA analytics, but it's deploying technology to automate things, to save time, and to save your staff from drinking on the job out of sheer boredom. So we don't really have too many problems with that, you know.

(24:28):
We're all busy.
We're all crazy busy, but we have never had a case that I've seen where we had to say, yeah, we just can't support this because we're just up against the wall. We can usually move things around or just shift deadlines around a little bit. And I think that's a real key thing too, when you're talking to leadership and you start saying things like, hey, if I need to move a deadline by a week or two, or a month even, or

(24:48):
hey, I'm going to take an audit that was supposed to release at the end of Q2, and I'm going to push it into the end of Q3. As long as you can make a case for it and your management is reasonable, they're like, okay, I get it. You're going to do everything you said you're going to do this year, but you have to move things around. All right, that's fine. And when we go to our audit committee meetings, I don't get to go, but when our CAE goes to the audit committee meeting and says, hey, we're still doing all this stuff, we had to move this one around.

(25:08):
We had to move that one around.
Yeah.
There's always going to be a certain amount of, what percentage of the audit plan is done as of the end of Q2? And if it's not 50%, if it's 40%, then it's like, well, we're behind now. You're not behind. We've got a whole backlog right here, and two weeks from now, they're all going to be done. Yeah. So to your point, it's a lot of planning and just trying to be organized and consistent and have processes and actually have trust. We have to be trustworthy stewards of the resources that

(25:30):
we shepherd and the processes that we audit, and our audit process in and of itself. So that when we say we're going to do something and that we need the extra two weeks to do it, we actually can justify it.

Yusuf Moolla (25:40):
You mentioned RPA there.
Just curious what the span of work that would involve the use of robotic process automation would be. Is it primarily in the compliance areas and SOX areas, as you were talking about before, where there's that work that just has to be done every year, as opposed to those that would roll through an audit plan?

Steve Rummel (25:58):
Right now, I am not the expert on that.
I talk to the RPA team a lot.
In fact, one of our data analysts on the national team, he's also their RPA guru who wears a couple of hats. And he's also a frighteningly intelligent young man. So I deal with him as much as I can. We use it for compliance testing. We use it for SOX. We also use it for pretty much any kind of a process that is repetitive, that we know we're going to have to do.

(26:18):
If there's a process that involves gathering information by logging into some kind of a web-based portal and then going through a whole bunch of menus and then downloading a bunch of PDF files or Word documents or something like that into a repository, so that an auditor can go through them and not have to waste the time of picking their way through a web interface to get stuff. Or even better, take the output of it and pipe it into some kind

(26:40):
of an OCR application that's going to then open the things up and extract the information that we want and stick it into a lookup table, something like that.
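
A hedged sketch of the download-then-OCR pattern just described: take documents a bot has already dropped into a folder, OCR them, and land one field per document in a lookup table. The libraries (pdf2image, pytesseract, pandas), the folder name, and the invoice-number pattern are assumptions for illustration, not the tooling CVS uses, and the OCR step needs Tesseract and Poppler installed locally.

```python
# Hypothetical post-download step: OCR each PDF and build a lookup table
# so auditors query a CSV instead of clicking through a web portal.
import re
from pathlib import Path

import pandas as pd
import pytesseract                        # Tesseract OCR wrapper
from pdf2image import convert_from_path   # renders PDF pages as images

rows = []
for pdf in sorted(Path("portal_downloads").glob("*.pdf")):
    # OCR every page of the document and concatenate the text.
    text = "".join(pytesseract.image_to_string(page) for page in convert_from_path(str(pdf)))
    match = re.search(r"Invoice\s*#?\s*(\d+)", text)   # assumed field of interest
    rows.append({"file": pdf.name, "invoice_number": match.group(1) if match else None})

pd.DataFrame(rows).to_csv("ocr_lookup_table.csv", index=False)
```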
It's kind of across the board.
I know that there is, and has been, a concerted effort to use it for compliance and SOX, but we are looking to use those tools in any way, shape or form we can to support audits just in general. I don't like to draw distinctions between, okay,

(27:01):
You've got compliance audit.
You've got SOX.
You've got IT audit.
It's a business process.
If you want to really audit correctly, you're auditing in alignment with the way a business runs. Ideally not just the business segment, but the way the business segment interacts with other business segments. Like where I am, we have an IT audit shop and they focus on the typical IT audit things. But where I am now, unlike places I've been before, where

(27:23):
if you said, we have concerns about segregation of duties on our database cluster, then that's IT audit, that goes to the IT auditors. With us, if it's ours and we own it, it's PBM. We're going to look at it. So if that means I'm going to be doing SoD testing, which would normally be in IT, we're going to do it. Period. So we deploy tools wherever they're needed, for whatever they're needed for. That experience has been unique to CVS.

(27:45):
Every other place I've worked in my career, they had very defined cases. Like RPA: every place else I'd been, RPA was pretty much a SOX thing or a compliance thing. But again, RPA is actually relatively new. It's not brand new, but it's only been big for about the last 7 to 10 years. And I feel like there are a lot of organizations that don't really use it yet. Partly because it takes a lot of time and, frankly, money to roll

(28:06):
the thing out.
It wouldn't surprise me to see it get a lot bigger, because once you get the things set up, and if you do it right, you can save a lot of money. And again, people stop drinking on the job.

Conor McGarrity (28:14):
CVS actually stands for Consumer Value Stores, which is excellent. And your company objective is helping people on their path to better health. So one of the things we try and talk about in this show is aligning internal audit with the objectives of the company, in which, you know, internal audit is focusing on how data analytics can contribute to the purpose of an organization. Are you able to tell us about one or two of your projects that

(28:37):
you've been involved in at CVS or elsewhere that have really contributed to that consumer health, or where the end user actually gets the benefit of work done by internal audit?

Steve Rummel (28:47):
So I can tell you this.
I came to CVS precisely because of that. Our CEO, he just retired a couple of months ago. About a decade ago, he decided to stop selling tobacco products out of any of our retail pharmacies, because he said it made no sense to sell tobacco products and smoking cessation and cancer drugs from the same store. That was when CVS actually came onto my radar, besides being a store

(29:07):
that I used to pass when I drove to and from home. So yeah, CVS is very much that. People take that seriously.
We take it seriously.
I take it seriously.
It's one of the reasons I'm veryglad to be there.
I would like to tell you stories, stories about specific things. But unfortunately, I can't. I will tell you this, however. We have pharmacies that deal literally with any drug that you would ever need. So everything from aspirin all the way up to cancer

(29:30):
treatment drugs and esoteric drugs. And we have a multitude of programs and plans in place where our goal is to make sure that every one of our customers, no matter what plan they're a part of, what employee group they're part of... Like, American healthcare is disastrously, horrifically complicated. I get to see how the sausage is made, and it is every bit as complicated as you can imagine.

(29:52):
And I only see the drug provision. So from the time a prescription comes in to the time it actually gets filled, everything that happens in there. So that is my entire universe all day, every day of the week. We go to great pains to pay particular attention to where there are things that impact people's health dramatically. So we have an entire group that does nothing but look at

(30:13):
specialty drugs.
So God forbid you have cancer or some exotic disease where you need a drug that costs a hundred thousand dollars a week. We pay attention to that stuff. And we are, as a team, very cognizant of the needs of our customers. When we do our audit planning for the year, we have extensive, and sometimes violent but constructive, fireworky

(30:37):
conversations about things that we feel strongly about. And when we put the audit plan together, that is the entire focus. What are we doing to help our customers on their journey to better health? If you were to imagine for yourself what a risk map would look like for an organization that deals with that kind of thing, I can assure you that ours looks exactly like you

(30:57):
would hope it would.

Yusuf Moolla (30:58):
A lot has happened over the last 20 years, as
you've spoken about, in terms of mindset, in terms of capability, uptake, the tools and technologies that are available, the techniques that are being used, the number of people that are involved. What are you planning for over the next three to five years, say? What do you see coming down the line that you think will have a major impact on your job and the way in which auditors generally

(31:22):
use data?

Steve Rummel (31:23):
I think you're going to see a generational
change.
I don't know about three years, but as the individuals who are my age and younger move into more senior leadership positions, we have a better understanding of the tools and techniques that are available, and of data, and frankly a much more flexible outlook on the way organizations run. Let's say, there've been a lot of changes just in

(31:43):
organizational dynamics as well.
50 years ago, you started at a company and you stayed there for your whole career. And these days that doesn't happen anymore. So there's a lot of stuff happening outside that definitely impacts the career paths of human beings, and that impacts the way businesses work. We're all a lot more flexible. And even large companies like ours, we are having to respond to all these

(32:03):
changes.
So the changes that you're going to see are not going to be technological, because that's just a constant. I think what you're going to see is a lot more robust and mature use of AI. The writing's on the wall. It's been there for a while. It's getting there, especially now that we have companies like Google and Facebook, who've been around long enough that the senior people in industry have seen their fancy

(32:24):
data tools.
They don't understand them, but they know that they work. So I think you're going to see a lot more of these tools migrate their way into businesses and start speeding things up, making things much more efficient. They already have done that in finance. They've done that in search and in marketing and advertising. You're going to start to see that push down into other areas of the business, as it should. I'm hoping that what you'll see is all of that getting paired

(32:47):
with solid project management and risk assessment, which is where we, as auditors, can come in. Because implementing fancy technology without a good plan is a recipe for disaster. And in the past you could get away with it. Now, at the speed things go, you can't. And the magnitude of error you can make by not executing an analytics project well, the damage just goes way up.

(33:08):
But I think what most of us are going to see is a cultural change in senior management, as current senior management, who came up before the internet was even a big thing or right when it was happening... Once they leave, a lot of the people who grew up on this, who just live and breathe it, are going to start moving into roles. They're going to be much more focused on their teams and on tools and getting the job done, and they're going to be less

(33:30):
focused on organizational politics. So it's a cultural change. I think you're going to see more of that. I hope you'll see more of that. In my career, in the places that I've been where I was able to be successful, it was largely because the organization was willing to allow teams to do their jobs well, and to be a service bureau to the organization. So rather than the traditional, I've built a kingdom, I've got this nice pyramid of people below me.

(33:51):
It's more, I have these little pockets of capability and competence: I know the IT people, and I know the audit people, and I know people who deal in this thing. And we all get along and we all know each other, and it's a much more free-flowing and collaborative environment. I can only judge from what I've seen, but from a purely subjective standpoint, that's what I see. And that's what I'm hoping will work.

Yusuf Moolla (34:12):
Steve Rummel from CVS Health.
Thank you very much for joining us.

Steve Rummel (34:16):
Thanks.
Take care.

Conor McGarrity (34:17):
Thanks Steve.
