
May 9, 2021 · 25 mins

Michael DeCero is an Internal Audit Analytics Manager at TDS, a telecommunications company. 

In this episode, Michael explains how he helps his audit team use data. 

About this podcast
The podcast for performance auditors and internal auditors that use (or want to use) data.
Hosted by Conor McGarrity and Yusuf Moolla.
Produced by Risk Insights (riskinsights.com.au).


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Narrator (00:07):
You're listening to The Assurance Show, the podcast for performance auditors and internal auditors that focuses on data and risk. Your hosts are Conor McGarrity and Yusuf Moolla.

Yusuf (00:19):
Today we have Michael DeCero from TDS. Michael is an Internal Audit Analytics Manager at TDS, where he's been for over 12 years now, I think it is. Is that right?

Michael (00:30):
Yeah. I've been with internal audit for 12 years, and specifically in the data analytics space in internal audit for the past three years.

Yusuf (00:36):
What brought you to audit, and to where you are right now?

Michael (00:38):
TDS stands for Telephone and Data Systems. We're a telecommunications company based in the Midwest in the United States, but we have got markets across the country. Wireline and wireless service for all your telecommunications-related needs, cable companies as well, and hosted managed services. What drew me to internal audit, honestly, was I graduated college with my undergrad in Accounting back in 2009.

(01:01):
So that was right during the recession. It was really challenging to get a job. And I'm like, I've got to do whatever I can to get one. I was just telling the story to an intern that's in our group right now, and I said, I was applying to everything. I just wanted to make sure I was making decent money somewhere. And if you recall, this was after Sarbanes-Oxley became a thing in 2002. Accountants and auditors were really in high demand.

(01:22):
And that's what drew me to it. The reason I've been in internal audit for 12 years is I've gotten really lucky in working with TDS. Not only is it rewarding work, working for a telecommunications company, but they really take care of me too. So that's the biggest reason I've been sticking around in this field for so long.

Yusuf (01:36):
You've been in the data area for the last three years or so. What was it about data that excited you or drew you into that?

Michael (01:44):
That's a good question. Early in my career, I was trying to identify: what's life after internal audit? I was kinda thinking about, if I want to continue to move up the corporate ladder, especially in internal audit, they expect you to go into the business and get some experience there. If you ever want to become a director or a VP, or Chief Audit Executive, they expect you to have some business experience.

(02:05):
I was working with my leaders on what that next step looks like for me, and the word that kept coming back was accounting. And I said, I'm not interested. I did that in my undergrad. And I started reading current trends, whether that's The Economist or other magazines out there, other published articles. I follow a lot of good stuff on LinkedIn. And I'm just starting to get a sense that automation,

(02:27):
artificial intelligence is a threat to the workforce in finance, especially accounting. So I wanted to make sure that I kind of stayed ahead of that curve. And I said, I need to differentiate myself a little bit. I want to get a sense for what data analytics really means. I don't want to just say it as a buzzword, because I often think that it is just said as a buzzword.

(02:47):
People don't really understand what it means or what it entails. So my company, my department, set up a new data analytics function within internal audit, and I stepped into that role.

Yusuf (02:56):
Okay.
And what's your experience beenover the last three years?
How has your view of the use ofdata changed or your experience
with data changed over thattime?

Michael (03:07):
It's a great question. How has it changed over the time? I'd say there are some things that have stayed the same and some things that have changed dramatically. Something that has stayed the same over the past three years: it's just as challenging to get our hands on good, reliable data as it was three years ago. Identifying where the connections are that we need to get data, and ensuring that we're getting what everyone, kind of a

(03:28):
buzzword if you will, in this field calls the single source of truth. How do we agree with our stakeholders? What is that proper set of data that we all agree upon as complete and accurate, that represents the data that we're looking for? And then agree upon what we should be seeing. What are the thresholds where something is good or bad? What is the threshold where something's considered an exception or not?

(03:49):
That is all extremely challenging and continues to be, and probably will be for a while. I feel like all companies are trying to get more centralized and formalized with their data needs. Meaning, how do we store the data in a coherent, cohesive manner? How do we govern said data so that we understand things like "What is the single source of truth?" And we're making

(04:11):
progress. So although it's still a challenge, companies are starting to realize that if they really want to get true value out of their data, they need to have good processes in place, good warehousing, and good maintenance to ensure that they can rely on said data and it has the right availability for people. And of course, we've got to make sure we have it all secured,

(04:31):
too.

Conor (04:32):
You spoke there about some of the challenges that were data-focused in the three years that you were trying to set up this function within internal audit. Can you tell us a little bit about some of the people challenges, or some of the changes that needed to be made, maybe around the people working in internal audit, but also within the business, when you were trying to set this up?

Michael (04:50):
Great. I'll talk first about the people. So when I first took this role three years ago, I was just an individual contributor. I was just kind of the sole data analytics guy on our team, but as needs grew, we needed to grow the team in order to fulfill those needs. So I hired a few individuals. Now we've got a team of about, I'd say, two and a half. So a couple of full-timers and an intern.

(05:10):
And we've had to be creative and strategic in our headcount and resourcing needs. Resources and headcounts just don't grow on trees. Luckily, I've worked with my superiors and I've had a lot of support to convert some of the operational and financial audit positions that have been vacated over the past three years, not all of them, to our data analytics

(05:31):
function. As far as process challenges go, and this does include people: you know, it's changed. I'd say the vast majority of individuals out in the business world, especially in finance, I feel are a little fearful of anything data, really. I don't necessarily think it's just that they're concerned about losing their jobs to automation or artificial intelligence.

(05:51):
I think it's more just the heavy mountain of knowledge and skills that you need in order to really do this type of work. I think that that might be a little misguided. Don't let fear stop you. That's been a big hurdle: trying to convince people, hey, give us the time and space, and we can really do some great things with the data that we have in our hands. Instead of us selecting a sample of 30-odd

(06:13):
objects to test and report on out of a population of hundreds of thousands, if not millions, maybe we can, in fact, test the entire population and get much better insights into that data, where our stakeholders will appreciate and respect and act upon our findings better.
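
As a hypothetical illustration of that full-population idea (a minimal sketch, not the TDS team's actual code; the file and column names are assumptions), a pandas test can flag every exception across the whole population instead of a sample of 30:

    import pandas as pd

    # Load the full population instead of drawing a sample.
    expenses = pd.read_csv("expense_claims.csv")  # hypothetical flat file

    # Flag every record that breaches the agreed-upon thresholds,
    # e.g. claims over the approval limit or missing an approver.
    exceptions = expenses[
        (expenses["amount"] > expenses["approval_limit"])
        | (expenses["approver_id"].isna())
    ]

    print(f"Tested {len(expenses):,} records; "
          f"{len(exceptions):,} exceptions "
          f"({len(exceptions) / len(expenses):.2%} of the population).")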

Conor (06:29):
So you mentioned there that in setting up your team, you were able to convert a few of the positions that were vacated. It'd be really interesting to understand: what was your compelling argument to be able to get those positions enabled within your team?

Michael (06:43):
Yeah. Like I said, I got great support from up top, but, you know, I had to sell kind of what we are going to be using these resources for. And the big thing, although we didn't have hard numbers in our kind of business case, if you will, is we got close to an estimate of how many hours we were expecting to save with some automation features that we're building out. So Sarbanes-Oxley, for example, here in the States: at least for

(07:06):
our team, internal audit spends a third of our audit plan every year on compliance activities specifically for SOX. That's a repeatable process, and that's done every month or quarter, every year. So we've started to build some workflows that will save the audit team a lot of time. And so, hey, all right: instead of an auditor that's going to come in and do this

(07:27):
every quarter, maybe we can build a workflow that will do it for us, or at least do, you know, up to 50% of the work for us. And showing that in order to really realize those gains, we need another resource, because we have so many other requests coming in for ad hoc reports or analytics for the various audits that are in process. And we broke it down in saying, this is how much time we're

(07:50):
spending on audit project requests, which is a vast majority of our time. I'd say 80% of our time right now. We're only dedicating about 10 to 20% to SOX automation activities. Now, if we can get another headcount here, we can really start realizing this automation strategy. And that's kind of been the selling point.
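
As a hedged sketch of the kind of repeatable workflow being described (the control, file names, and fields here are hypothetical, not TDS's actual SOX tests), a quarterly control test could be scripted once and re-run on each new extract:

    import pandas as pd

    def terminated_users_with_access(user_file: str) -> pd.DataFrame:
        """Hypothetical quarterly control: flag terminated employees
        who still hold system access in the delivered extract."""
        users = pd.read_csv(user_file)
        return users[(users["status"] == "terminated") & (users["has_access"] == "Y")]

    # Re-run the same test each quarter instead of performing it manually.
    failures = terminated_users_with_access("q1_user_access.csv")
    failures.to_csv("q1_control_exceptions.csv", index=False)
    print(f"{len(failures)} exceptions for auditor follow-up.")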

Yusuf (08:06):
With the team that you're on, and the use of data across audits: is it that everybody comes to the central team to enable that part of their audits to be done, or do some of the teams actually use data directly themselves?

Michael (08:20):
It's a combination of both. I'd say the majority is the former. If I had to estimate, it's probably 85% of the time they're coming to us with a specific request. We need you to run some analytics, if you will. Build some reports for us. Dashboards, et cetera. Or build a workflow to get to some sort of outcome. That's the majority of the time.

(08:41):
One thing we do that I'm really proud of: we use Tableau a lot for our dashboarding. And that gives us an opportunity to have a dashboard that our operational and financial teams can use. So once we coordinate with them and say, here's the data that we have, here's the outcomes we're expecting, here's kind of the visuals we need to help us identify

(09:02):
exceptions, or something that looks strange, or that we want to do further research on. Or maybe it's indeed to help us select our risk-based sample. We're able to build those Tableau dashboards that they can kind of filter and search on their own, to see if there is in fact some, you know, what I call monkey business going on in the data. That's been a real success, because in any data analytics shop,

(09:23):
I think you want to promote data literacy in your department, whether that's internal audit or anywhere. And that's really, I think, driven a lot of that. As the data analytics function, I think we've done a really good job trying to boost that within our internal audit department. So we've hosted some SQL trainings with some select operational and financial auditors, just to give them a little bit more understanding and ideas of other skills and tools that are

(09:45):
out there to get them what they need.

Narrator (09:47):
The Assurance Show is produced by Risk Insights. We work with performance auditors and internal auditors, delivering audits, helping audit teams use data, and coaching auditors to improve their data skills. You can find out more about our work at datainaudit.com. Now back to the conversation.

Yusuf (10:07):
What do you find to be the key challenge in working with other audit teams on scoping and planning their audits?

Michael (10:15):
It's a great question. Our team is specifically involved in: what can and should a data analytics team give you all? And what I have found to be probably the biggest challenge is agreeing upon a clear enough expected outcome for our team to deliver on. If we were in program development, it's requirements.

(10:36):
How do we land on and agree upon the requirements that my team is delivering on for any given audit project? I think that's just the nature of this. People who are not involved in data, maybe day-to-day, have a hard time articulating exactly what they want, and knowing what they want. So it's always going to be a conversation that needs to be had. And indeed, it's probably multiple conversations.

(10:57):
But that's been my big push: I am not comfortable agreeing to any sort of timelines or deliverables until we have a documented understanding of what those requirements are for any audit project. And it has been going well. We have some kind of common questions that govern this type of activity. What are you expecting? Do you want an Excel output, or do you want a Tableau dashboard?

(11:18):
What is the definition of an exception? Another really important one is: what are the key attributes of the data set? So when we're getting a table of, whatever, a million records or rows, and we have up to, I don't know, I've seen up to 50, a hundred, 200 columns. Okay, I am seeing column X as the one that we really want to base our

(11:39):
analysis on. We need to agree with the business that we understand it correctly. And I've been working on getting my master's in data science, more specifically in artificial intelligence now. But whenever you're doing data science work, you're going to spend probably 70 to 80% of your time understanding your data, and that's vital in any role. I don't care if it's internal audit or advanced data science.

(12:01):
You need to spend that time to understand your data and know, okay, we are going to base this algorithm, this program, this model on column X. We need to know what column X, or whatever column it is, means. And we need to agree with our business partners that we have our understanding correct. Until you do that, you really can't move on to your testing.

Yusuf (12:22):
When internal audit teams come to you with an audit that needs to be done, how much effort do you have to put into explaining what is possible, before you can get further into exactly what is then required?

Michael (12:33):
Great question. It depends. I'll give two examples. For something that might be straightforward: we're doing a procurement project right now. We also did an expense reimbursement project earlier this year. You could come to some pretty clearly defined expectations. We want a dashboard of spend by vendor. We want a dashboard of spend by purchase order approver.

(12:53):
That's easy. We kind of understand what they're trying to do. They're trying to get a better summarized view of the data so that they can understand it. Make sample selections. Drive questions, et cetera. But those are pretty simple. A recent example today that we came across: my team ran a sentiment analysis on some survey responses that we sent out. And I think that this is a great practice.

(13:15):
This, we're going to try and implement for as many audit projects as we can. As you're probably familiar, right? You have your planning meetings and fieldwork, and you're having discussions with various stakeholders and process owners to understand their process and figure out what the risks are. Well, for the procurement project that we're running, we actually sent a survey out to various individuals; I think we

(13:36):
sent it to maybe over 200 people, I believe, that are involved in the procurement process in the business. And we had a series of both closed-ended and open-ended questions. Multiple choice, and then just free-range questions. The closed-ended questions are easy to report on. How many people selected A, B, C, or D? You could report on that and get a sense for how things are

(13:56):
looking. One of the questions we asked was: did you participate in the training that was given over the past, I don't know, X amount of years? Because the company said that indeed they gave the training to everybody. And now we're able to report whether people actually took it. Pretty straightforward, quite easy. Where we're learning now is we did some sentiment analysis on the open-ended questions that are just kind of free range.

(14:17):
And if you've ever taken surveys like that in the past, you get responses, and there's usually an algorithm that kind of determines: is it positive, negative, or neutral? And when we started looking through our results, we saw that we had a lot of errors, I'll just say. I don't know if you want to call them false positives or what have you, but we were saying something was positive and really it was more, not even neutral, maybe negative. Our model didn't quite predict it correctly.

(14:39):
And so now we're saying, well, what we really need to do is kind of have a data science approach, where we select a sample of that population and train it first and say, hey, let me run it through my model. Let's see what the results are. Are they right? Are they wrong? Okay. For the ones that are wrong, let's mark them as: hey model, you did this wrong.

(14:59):
Try again. This is where machine learning gets kind of involved, in training your model to ingest said data, and then testing it to see: is it viable? Is it giving you the results that you're expecting? And then, and only then, once you kind of understand that, can you move on to giving it other data to deal with that kind of friction. And that takes time, you know. We kind of did a real quick turnaround, and now we need to

(15:21):
take a step back and be like, all right, if we do this in the future, we need to take much more time. We turned this around in like a day or two. We need a week or two to train the data, test it, and get to a more accurate model.
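
As a minimal sketch of the train-on-a-labeled-sample approach described here (illustrative only, not the intern's actual model; scikit-learn, the responses, and the labels are all assumptions):

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # Hypothetical survey responses, hand-labeled by the audit team after
    # reviewing the first pass ("hey model, you did this wrong").
    responses = [
        "The new process works well for our team",
        "Approvals take far too long and nobody follows up",
        "Training was fine, nothing special",
        "I never received the procurement training at all",
        "Great support from the procurement group",
        "The system is confusing and slows us down",
    ]
    labels = ["positive", "negative", "neutral", "negative", "positive", "negative"]

    # Hold some labeled responses back to test whether the model is viable.
    X_train, X_test, y_train, y_test = train_test_split(
        responses, labels, test_size=0.33, random_state=0
    )

    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)

    # Mispredicted cases get relabeled, added to the training set, and the
    # model is retrained on the next iteration.
    print(classification_report(y_test, model.predict(X_test), zero_division=0))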

Yusuf (15:31):
That's something that we're seeing more of in a number of areas, and particularly where you have false positives that you want to eliminate, but also other areas. And it can really be very useful. But like you say, it takes a lot of time to make sure that you're selecting the right features and selecting the right algorithm to do the learning and the prediction, et cetera.

Michael (15:50):
Yeah.
This is a really interestingtopic because I've been thinking
about with us auditors, we wantkind of like black and white.
Is this an exception?
Yes or no.
And we want a hundred percentconfidence in that answer.
Well, we may want to have somesort of language in our audit
report that says we did this andwe have, say, 90% confidence in

(16:10):
these numbers.
You can run statistical scoreson your models and get, what
you'd call, the adjusted Rsquared to see, you know, how
well is my model performing?
What is the error rate?
You know, those types of things.
We're not there yet, but we'reconsidering how do we articulate
this in an audit sense wherewe're not saying it's an
absolute, but we still fulfillthose audit obligations to say

(16:31):
what our conclusions are.
So it's kind of a hard jugglingact.
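
As a hedged illustration of quantifying that confidence (note this uses simple accuracy and a confusion matrix for a classifier, rather than the adjusted R-squared mentioned above, which applies to regression models; the labels below are made up):

    from sklearn.metrics import accuracy_score, confusion_matrix

    # Hypothetical: auditor-verified labels vs. what the model predicted.
    actual    = ["pos", "neg", "neg", "neu", "pos", "neg", "neu", "pos"]
    predicted = ["pos", "neg", "pos", "neu", "pos", "neg", "neg", "pos"]

    acc = accuracy_score(actual, predicted)
    print(f"Model agreement with auditor review: {acc:.0%}")
    print(confusion_matrix(actual, predicted, labels=["pos", "neu", "neg"]))

    # An audit report could then hedge its conclusion, e.g. "we classified
    # all responses with an estimated 75% agreement rate against manual review."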

Yusuf (16:34):
What would you say the next step in that particular area would be for you? So, going from where that outcome was to where you'd like it to be, to be able to properly produce a result and explain it.

Michael (16:47):
Our current intern, she wrote this sentiment analysis model. And if you asked her, she's probably frustrated, because really she wanted to spend more time to get to a more accurate, dependable model. But at the same time, we had to balance the needs of our audit project team, to get them what they needed in the timeline that they had. But to me it's like, no, no, this is good.

(17:07):
Like, all right, maybe we didn't get exactly what we wanted for this particular project. But now we have this model that we can use for future projects. I just asked her today: hey, spend some time over the next month training up the model and testing it, so that we can have that in our back pocket. What I've been finding in this work is that scale kind of starts to

(17:28):
become exponential. When we run this program for one project, we learn from it and are able to apply it to another project. Now, if we do another survey for a project, we can easily run the responses through that same program. And we at least know what our kind of confidence rating is. And we just keep learning from there. We keep adjusting.

(17:48):
The more surveys we send through it, the smarter the model will be. And that way, you know, we're kind of mitigating that. We're actually making progress and getting more reliable results. It's going to take time. It's going to take iterations to get there.

Conor (18:00):
One of the important things there, in that approach, is that there's a real experimental mindset, or almost some allowance or acknowledgement that you won't get things right every time, or the first time. And it might take a little bit of time to actually start delivering on the results. Can you talk to us a little bit about the importance of, or how you went about getting, that leadership buy-in to allow that

(18:21):
experimentation to happen?

Michael (18:23):
I think a lot of it, again: I got lucky. I think we have pretty good tone at the top, you know, my VP and Director. The tone at the top has been: data analytics needs to be deployed for every audit project. That's really been the message from the top, and that's paved the way for us. For anything in audit, anything in the business world, you need that tone at the top.

(18:44):
How did he get that sold to his superiors? I think that people are open to new ideas. I think here, I've had this change happen slowly, yes, but people do want to explore: what are the current trends in the marketplace? And so that's our culture here. That's probably the most important factor in our success so far.

Yusuf (19:03):
You spoke about key challenges.
What do you see as the keyopportunities for you and your
team and the broader internalaudit team over the next few
years in using data?

Michael (19:12):
Oh man, there are a lot of opportunities here. So this is what's really exciting to me: everyone that I've talked to, across many channels, many industries, many functions, many companies. Everyone seems to be on this wave of trying to get a better handle on all of their data sources, and trying to streamline

(19:32):
and warehouse that data in a cohesive manner. That's where I think everyone has opportunity, and namely our data analytics function within our department. Right now, we need to work with our audit teams to get data; we're usually getting flat files to do a lot of this analysis. And that's probably fine. If we were connecting directly to some databases or

(19:53):
data warehouses to run automated workflows, you run the risk that you might miss something with your maintenance, and something could go wrong and give you an inaccurate result. If you kind of have this control in place that says, deliver said report to me on a cadence, and we all agree that this is the report that we all treat as a single-source-of-truth type thing, we can rely on our change management controls to say,

(20:12):
if this report does change, we know about it, so that we can adjust our workflows accordingly. That's actually a pretty sound process, where we can maintain our workflows and programs pretty well. Now, when it comes to how that applies to our entire department: we need to have a better sense for what is that system of

(20:32):
inventory, or inventory of systems, that the enterprise uses.
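
A hedged sketch of that kind of change-management check (the file name and agreed columns are hypothetical, not TDS's actual control): before an automated workflow runs, verify the cadence-delivered report still matches the agreed layout.

    import pandas as pd

    # The layout everyone agreed on as the "single source of truth" report.
    EXPECTED_COLUMNS = ["invoice_id", "vendor", "amount", "approved_by"]

    def report_layout_unchanged(path: str) -> bool:
        """Read only the header row and compare it to the agreed layout."""
        header = pd.read_csv(path, nrows=0)
        return list(header.columns) == EXPECTED_COLUMNS

    if not report_layout_unchanged("monthly_ap_report.csv"):
        raise RuntimeError("Report layout changed; review the workflow before running analytics.")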

Conor (20:37):
You're obviously very passionate and feel strongly about using data, not just in internal audit, but going forward in your own professional life. Is there a community of practice, or a group of like-minded individuals, that you tap into to learn from, or share experiences, or anything like that?

Michael (20:54):
It's interesting you ask that, because I just got invited on LinkedIn. Somebody reached out to me last week about this program that sets up these random video discussions with other data-interested people around the world. I keep in touch with a lot of people that I have known in previous roles, specifically in internal audit, operational and

(21:14):
financial roles, and they've moved on to other organizations or other departments. And I just keep in touch with them, because honestly, everyone I've talked to, whether they're in IT, finance, marketing: everyone is starting to learn how to use data, because they're like, oh, we can actually get a lot of value out of this. So I find a lot of value in just keeping up with people that I've worked with in the past. Hey, what are you doing with data these days?

(21:35):
And then a lot of it is supplemented by my Master's program. I'm taking discrete mathematics right now. And as a class, we're having debates on things as abstract as "Was mathematics invented or discovered?" And I like that kind of stuff. So there are people out there that are really interested in this stuff, and I consider myself to be one.

Conor (21:52):
What do the next five years hold for you?

Michael (21:54):
Over the next five years, I do plan to get some sort of role within the business, where again, I think that if I ever want to come back to audit and move up the chain, I need to get that experience in the business. And that's kind of where I see my next step: maybe getting into some sort of data engineering role, to get more hands-on experience of managing these databases to get said insights. How do you balance a structured database for structured data,

(22:16):
compared to unstructured data? How do you balance that with how much it would cost? How do you have all these different systems talk to each other? That's a really interesting challenge to me. And I think it's a question that everyone's kind of asking right now, and I want to get in on that work. After five years, I'll be done with my artificial intelligence degree. And then who knows? Who knows what five years from now is going to look like?

(22:36):
Because things seem to change pretty rapidly these days.

Conor (22:39):
So as a data-focused internal auditor, what are some of the key things you've learned or developed in that realm that are going to hold you in good stead in the future?

Michael (22:49):
What internal audit, specifically, has really taught me is: how do you bring together the people that know the business and the people that know the data, and get them to agree on something? That has been a huge challenge. And I wonder, why is that? And I think it's because the people that are really good with the data don't always think about it. They're so passionate about it. They so much enjoy just looking at a data set and saying, oh, I

(23:12):
could do it this way, or I could do it that way. They often don't think, like, what's really going to drive the bottom line here? They're not thinking of what the investment and dollars are. They're just thinking about what seems cool. And I've kind of learned to take that step back and start with the customer need first. That's just something from an old Steve Jobs interview. He said, when they were first starting up, they were

(23:33):
like, oh, we can offer this really cool product because it does X. And they struggled at first because they weren't listening to their customers. Let's start with that and work backwards from there. Start with what the customers want, and then build the technology according to that.

Conor (23:45):
That's good advice.

Yusuf (23:46):
If you had to choose one project that you've been involved in that used data within audit and that produced a good result, what would you say the key success factor for that would have been?

Michael (23:59):
Glad you asked. I think it is kind of what I just said: understanding your customer's needs and aligning on what those expectations are. That's been the biggest success factor so far. And I mentioned my passion about data governance in our pre-call. We're running a data governance project right now. So, okay, what are all the tables out in our databases that we really care about, that have sensitive data: customer private

(24:21):
information, financially relevant information. How do we want to agree upon that with our stakeholders? What are all the tables that we indeed want to look at? Things like: is the records retention policy being followed? Are there records out in our databases that are 10, 20, 30 years old? How many records are that old? How many are in non-compliance with some of our other policies?

(24:42):
Getting all of that defined, all those requirements defined and agreed upon, is vital.
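
As a hypothetical sketch of that retention check (the table and column names are assumptions, and sqlite3 stands in for whatever enterprise database the real project would query):

    import sqlite3

    # Count records older than the retention policy allows, bucketed by decade.
    QUERY = """
    SELECT (strftime('%Y', 'now') - strftime('%Y', created_date)) / 10 AS decades_old,
           COUNT(*) AS record_count
    FROM customer_records
    WHERE created_date < date('now', '-10 years')
    GROUP BY decades_old
    ORDER BY decades_old;
    """

    with sqlite3.connect("warehouse.db") as conn:
        for decades_old, record_count in conn.execute(QUERY):
            print(f"{decades_old * 10}+ years old: {record_count:,} records")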

Yusuf (24:48):
What's the best way to find you and connect with you, Michael?

Michael (24:51):
Find me on LinkedIn. That's probably the best spot, and I'm pretty active on there, so I'm on there usually every day. Because that's where I've found the most of, as you asked before, those like-minded individuals. And I follow a lot of data science channels on there. There's one channel in particular, this is for any nerds out there like me, called Towards Data Science. A lot of technology channels out there are very high level and

(25:14):
can be a little trendy and buzzwordy, and I don't get a lot of value out of those. So I really try and find the technical ones that talk about the approach and process some companies are using for whatever type of analysis. And that's one that I get a lot of value from. There's a lot of others out there, too. But for me personally, just find me on LinkedIn.

Yusuf (25:33):
We'll put that up in the show notes. Michael, thank you for joining us today. Lots of good insights there.

Michael (25:39):
My honor. Really appreciate talking to you guys today.

Narrator (25:41):
If you enjoyed this podcast, please share with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com. The link is in the show notes.