
May 17, 2025 • 56 mins

In this episode of 'Cybersecurity Today', host Jim Love is joined by panelists Laura Payne from White Tuque and David Shipley from Beauceron Security to review significant cybersecurity events over the past month. The discussion covers various impactful stories such as the disappearance of a professor, a data breach at Hertz, and government officials using a commercial app during a conflict. They dive deep into the ransomware attack on PowerSchool and its implications for K-12 schools in North America. The conversation also highlights the vulnerability of critical infrastructures, including the food supply chain and the importance of robust cybersecurity measures. Finally, the panel touches upon the progression towards post-quantum encryption by major tech companies like AWS and Google, signaling advancements in securing future technologies.

00:00 Introduction and Panelist Welcome
00:20 Major Cybersecurity Incidents of the Month
02:04 PowerSchool Data Breach Analysis
04:11 Ransomware and Double Extortion Tactics
12:20 4chan Security Breach and Its Implications
16:31 Hertz Data Loss and Retail Cybersecurity
17:44 Critical Infrastructure and Cyber Regulation
27:03 The Importance of CVE Database
27:54 Debate on Vulnerability Scoring
30:17 Open Source Software and Geopolitical Risks
31:43 The Evolution and Challenges of Open Source
37:17 The Need for Software Regulation
46:50 Signal Gate and Compliance Issues
54:08 Post-Quantum Cryptography
56:10 Conclusion and Final Thoughts


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome to Cybersecurity Today, the month in review.
Joining us today are Laura Payne from White Tuque.
Hello, Laura.
Hey, Jim.
Great to be back.
Great to have you back.
And David Shipley, from Beauceron Security.
Welcome David.
Thanks for having me again.
Okay, panelists, you know the game, and I'm sure most of our audience does.
On the show, we focus on the stories that had the greatest impact over the past month.

(00:24):
We try to do a deeper dive into them, and this month started so calmly.
A professor and his wife disappeared, maybe, and everybody was wondering what was happening to them.
Hertz lost a trove of data, an amazing amount. All those things that you tell the people so that you can get insurance as well as a car, and all of that: where you're staying, where you're going.

(00:45):
They misplaced all of that in a hack.
But that was just another day in paradise, 'cause then things got really weird.
Government officials were using a commercial app in the middle of an American conflict.
They're attacking a group of rebels, the Houthis, I think they're called.

(01:06):
And these guys are chatting away on Signal about this, with locations and data.
Nothing wrong with that.
And then, I'll just add one more: the use of emojis in a way that we've never seen.
Bam. Fire, pow, American flag, when the bombs were dropping.
Oh yeah. That to me was 21st century warfare right there.

(01:28):
I think they were probably selling tickets to this.
They didn't have to sell tickets, 'cause they had a journalist that had been invited on the call.
That was just really, that was really special.
And then the government, using a commercial app, and they go after Chris Krebs and say, let's attack Chris Krebs, the guy who used to run CISA. And that was the start of the month.

(01:48):
And I have to say it got more exciting after that.
The past few weeks have been in overdrive.
I've seen some of your notes as they come across about how we're gonna narrow this down, but we have to find enough stories to get through an hour.
Why don't we start with easy ones?
I wanna take a victory lap on PowerSchool first.
Yes, please.
On this show, the patent-pending finger wag of stop paying cyber extortionists

(02:15):
was had, because what did we say?
We said you can't trust criminals.
And what have they done?
They didn't delete the data, and now they're going for a double payment.
'Cause why not?
Why not get two scoops of tasty data-extraction ice cream?
So rewind and give us the story.
Not everybody remembers David Shipley's lessons on this, so let's

(02:37):
give him a little history first.
So PowerSchool is one of the largest software-as-a-service providers to K-12 schools, if not globally, definitely in North America.
They were hit, and hit badly, with a data extraction, not your typical ransomware, but simply the other half of the double extortion, via insecure parts of their support infrastructure. And they paid an undetermined sum.

(03:00):
We can assume it's likely in the millions, 'cause that's the way these kinds of things go.
And then they sent the communications on to the regular school districts, which caused all kinds of really interesting conversations about how sensitive is this data.
In some cases highly sensitive, in terms of maybe the bus pickup locations and other issues or notes that teachers may have had about students, depending on the school district, how long

(03:21):
they've used the platform, et cetera.
It was an unholy notification mess, and it affected school districts in Canada from the west to the east. You name it.
And of course one of the defenses that we've seen trotted out in these data exfiltrations. We saw this in Canada at LifeLabs years ago.
Man, this feels like a lifetime ago.
But a long time ago, somebody else tried to do this and pay

(03:41):
this and then tell you it's okay.
They deleted it.
And I believe, if I remember the story correctly, and Laura and Jim, correct me if I'm wrong, they got proof, they got a video, and they showed us on video: they deleted the data.
Okay. There's a video deleting the data.
Yeah.
And now it turns out they're targeting individual school districts for a double dip of the data extraction, which is as uncouth in

(04:05):
cyber as it is with nachos, right?
The single dip.
But it's so funny, because ages ago people used to say that cyber criminals had great help desks.
They helped you through paying the ransom.
They made sure that they gave you a decrypt key, because they were in business.
When it became a franchise and anybody could be out there, it got dirty. There

(04:28):
is no longer any rule of, you're paying the ransom, as PowerSchool did.
But now, as you've said, these guys are going around from school to school, telling the schools, we have your data.
And you pointed out some of the data that's in there, the data on children who are vulnerable. It just goes on and on.

(04:51):
If you're a parent, your heart just sinks.
So what do you do about this?
Will PowerSchool survive? This is gonna be a big question at this point.
But the second one is, how do these local school boards deal with this?
And I think it comes down to, back to, we've learned, hopefully, right?

(05:14):
It's an education system.
So here's your education from the criminals.
Fool me once, shame on me, or shame on you, but fool me twice, shame on me.
Yeah. Back to me, you don't wanna get that wrong.
The first one was on you.
The second one's on me. There's no point in paying this ransom.
They're just gonna be back for more.
You gotta figure out what you're gonna do about it.
The horse is outta the barn.

(05:36):
And yeah, it might mean that PowerSchool is not in your docket anymore and you gotta figure out something better.
Or maybe we're back to paper in the meantime. This is not the kind of thing where it's a click and pay and solve, right?
Money doesn't solve this problem.
Could we have possibly gotten to the point where this might be the straw that broke the camel's back, and people might stop paying ransoms?

(05:59):
These people might stop paying ransoms, like the decision makers here, but I don't think there's any universal, oh, we're gonna stop paying ransoms.
Because it will depend on situations. It'll depend on the knowledge of the person in that seat making that decision at that time.
Do they consult somebody who's going to give them good guidance to know

(06:20):
what their options are and what the likely risks are if they pay?
I'm sure there are still some, I will call them, with a heavy dose of salt, noble cyber crime groups that have good reputations, but they are fewer and farther between, right?
So if you're going to pay the ransom, you just have to know.

(06:40):
It might give you a spell of time to deal with something, but it's not your end game.
You can't delete the data once it's gone.
If they've got it, they've got it.
And I think there's a distinction here. Like, the use case of a hospital paying a ransom to get systems back operational as quickly as possible remains a deeply ethically and morally problematic space.

(07:03):
But when we're talking very specifically, it's not about your operational IT infrastructure, it's about data that was exfiltrated.
There is no value.
And I think the only way we're gonna change this is if the courts say we don't care if you pay to get them to pinky swear they deleted it.
That doesn't matter.
It's untrustworthy, unreliable. There's no definitive way to prove it.

(07:24):
Your penalty is for the loss of custody, not for everything you try to do after the fact.
You devalue the incentive to try and pay this, 'cause part of the reason they pay this is so that they can go around and lower their eventual class action lawsuit losses.
I don't think they genuinely care as much or really, in their heart of hearts, believe the criminals are trustworthy actors that are gonna delete the data.
We gotta get rid of that out of the equation.

(07:46):
And I think this is the first step to getting out of the pay-the-ransom mess that we're in. But it's the definition of crazy, doing the same thing over and over again and expecting a different result.
Are we just looking at this through rose-colored glasses?
You're saying the whole game might be just keeping their liability down, so it doesn't matter.

(08:07):
Do we know how much they paid?
Anybody know how much they actually paid?
Millions?
I don't think we know. Yeah.
Yeah. We don't know.
And they may have negotiated very hard, and the ransomware gang came back.
Maybe they did go really big and pay a large ransom.
And so they figure there's more from the well to draw from.
I don't know.
And I don't know what decisions the payers made going into that.

(08:28):
I wouldn't wanna be in their shoes, but I think at the end of the day this just does go to prove that yeah, you can't get the data out of the ecosystem once it's out there.
A leak is a leak.
They might be in for a bit of a surprise.
They might be looking at public data and going, oh, the Toronto School Board. Oh, they've got $80 million in budget.
The fact is they don't have $8 in money, as a matter of fact.

(08:50):
Yeah.
They owe money.
They're in a deficit, so they're not gonna be able to come up with a lot of cash.
And I don't think the parents are gonna go for a bake sale on this one, no.
And the insurer may have covered that first payment, but they're not gonna be up for a round two.
The impact on this was $60 million.
All the reporting I've seen has said the payout was in the quote-unquote millions.

(09:11):
For context, PowerSchool has reportedly 18,000 clients covering 75% of K-12 schools across North America.
And up to 60 million people are in that system.
So on the general rule of 10 cents to a dollar per impacted user, the going market rate for extortion, that's anywhere from, what, six to 60 million?

(09:35):
Potentially paid out.
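To make that back-of-envelope math explicit, here is a minimal sketch of the range being quoted; the per-user rates are the rough rule of thumb mentioned above, not a confirmed figure:

```python
# Rough extortion-payout estimate from the discussion:
# roughly $0.10 to $1.00 per impacted user, times ~60 million users.
impacted_users = 60_000_000
low_rate, high_rate = 0.10, 1.00  # dollars per impacted user (rule of thumb)

low_estimate = impacted_users * low_rate    # $6,000,000
high_estimate = impacted_users * high_rate  # $60,000,000
print(f"Estimated payout range: ${low_estimate:,.0f} to ${high_estimate:,.0f}")
```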
It sounds crazy to be talking about tens of millions of dollars, but we had an as-yet-unknown Fortune 500 company pay something like 47 or 50 million dollars last year to a ransom.
It was the single biggest ransom take.
So yeah, I guess that's where we're at.
I think the one thing I wanna make a point about as well is that the end of the first era of international, transnational ransomware came with the

(09:58):
big takedowns, the LockBit takedown, the other gang takedowns, et cetera.
The groups kind of self-destructing on their end.
And that was the end of the easy days of this crime.
One of the things that I had predicted was that their tactics would get more vicious, but that the industry itself would get more resilient.
So it'd be harder to take down, but they would also, in turn, become more ruthless.

(10:20):
This was the natural, Darwinian evolution of this crime.
So what we're seeing now with the re-extortion and the retargeting of healthcare post-pandemic and other things, this is cyber crime, ransomware 3.0, and they are going for the throat.
It's interesting to see the physics, for every action there's an equal and opposite reaction, play

(10:41):
out in the cyber battle between the good guys and the bad guys.
For some, that's an excuse to do nothing and to say we were better off just having this as a, basically, international IT tax that we were paying to Russian and other cyber crime gangs.
And at least we had X amount, but now it's even worse.
I think we have to get through to the other side of this, whether we have the will

(11:01):
to get through to the other side of this.
And keep in mind, all this is happening as the Counter Ransomware Initiative and other really good US-led initiatives are all for naught now.
There's a giant sucking sound, and that is the vacuum of previous, incredibly valuable US-led leadership on this global issue.
And I don't think anyone else has got the heft to keep that charge going.

(11:25):
So I think the good old days and the nasty days are also combining.
2025. So much fun.
And we'll get to it later. CISA is deader than we think.
I do wanna add one clarifying point on this story, 'cause I think we glossed over it.
It's that in this particular story, we had the third party who was breached.

(11:45):
And they paid the ransom.
The attackers are now coming back not to the same third party, they're coming back to the users of the third party.
The victims.
Yeah. To the victims.
It's not the case where the boards paid out the ransom in the first place.
They are now being asked to pay a ransom.
And who knows, maybe the attackers were hacked on the back end and the

(12:06):
data, maybe they really did delete the data, but somebody else grabbed it first.
You don't think they're really good at security?
Like, LockBit was hacked this month.
And 4chan, and you know it. Yeah, of course.
The honor among thieves these days.
Yeah.
Let's twist around to that, 'cause you brought up the 4chan story and you were gonna cover that.
That's another story of wrong gone wrong.

(12:28):
I don't know how else to put it.
Yeah.
It's a little refreshing to see that bad security affects the bad guys too.
And 4chan, I guess, is not objectively bad in the sense that they didn't start out to be full of evil crap. But that's where they are.
And shockingly, what they are is a counterculture

(12:50):
message site, that's how they started, yeah.
And they just allowed whoever, whatever communities, to coalesce on their site and to exchange information and opinions, and less information and more opinions maybe.
But it got pretty sketchy toward the end.
But yeah, it is pretty full of things that are undesirable from a moral standpoint.

(13:12):
And, not surprisingly, it became more and more difficult for them to engage vendors to provide the hardware and the updates that were needed for their infrastructure.
Also not shockingly, a lot of people weren't willing to pay to advertise.
So there go their revenue streams.
Here's another shocker.
People who aren't, maybe, on the right side of morals and ethics don't like to pay for services.

(13:32):
So they weren't gonna pay for a subscription to 4chan.
So your revenue model doesn't really work.
And so, you know, this is a foreseeable circumstance of a very popular site that has no revenue and is known for being on the wrong side of a lot of things really not being able to sustain itself.

(13:52):
It's back online.
It will probably go the way many things do, where it devolves, causes splinter sites to crop up in its vacuum, and something else will probably grow out of there.
I hope this is the beginning of the end of that particularly large community.
The site collapsed, but they brought it up.
It's gasping, at least. Last I've heard, they can't bring it all back up.

(14:14):
They've lost a lot of data.
We won't miss them.
One of the largest purveyors of deepfake pornography services, MrDeepFakes, was taken down as a result of US legislation now coming after non-consensual intimate image

(14:34):
deepfakes as well as real non-consensual intimate image distribution, which is fantastic.
But on a sad note for Canada, the individual that has been determined to be responsible, or a key individual, is a Canadian citizen. CBC and a whole bunch of other media worked together to piece together this person's identity through opsec leaks and actually confronted the individual about it.

(14:55):
In terms of, we think about AlphaBay, it was an expat Canadian who was running what was at the time one of the world's largest dark web narcotics marketplaces, and other things on it.
Not exactly that moment of Canadian pride with some of these clowns.
The biggest clowns out there have been coming from up here.
At a time when we want to excel in digital industries, this is not

(15:16):
exactly what we want to be known for.
Is that not the industry we were aiming to excel in?
Not quite.
Yeah.
We wanna focus on a different clientele, I think.
But what's interesting is to see these groups like LockBit in the middle of a fight and things that are happening.
And this gets back into the interesting sort of politics and drama that happens within these environments.
Like, we've seen these core ransomware infrastructure providers stiff their

(15:39):
affiliates, and then we've seen infighting, like when the latest expansion of the war in Ukraine happened in 2022, which wasn't the start. It goes back to 2014.
But in 2022, the most recent actual, official Russian invasion of that area, we had gangs that fell apart that actually were mixed Ukrainian and Russian.
And that leaked all of their internal chats, or the playbooks.

(16:02):
This is where we learned about the HR structure, the payment schemes, the feedback from project managers for your particular ransomware group.
So we're seeing this play out, and it is nice to see the self-destruction. But the one thing I will say to describe 4chan, 'cause we're post May the fourth: if there ever was a Mos Eisley, that den of scum and villainy referred to by Obi-Wan Kenobi, 4chan was it. It was the Mos Eisley of the

(16:26):
internet, the den of scum and villainy.
And you just love to see it burn to the ground.
So wouldn't we rather talk about Hertz?
No. Talking about starting the month out, they just lost a whole pile of our data.
And we saw the impact of it with the Co-op and Marks and Spencer.
Not everybody knows who the Co-op is in the UK, but it's large. I forget how many thousands of grocery stores they have, but they

(16:49):
cover a vast swath of the UK, or England. I can't tell the difference between the UK and England. But Marks and Spencer also got hit, and people were starting to wonder whether the new pursuit of ransomware and hacking was going to be retail.
This has been a big hit for these stores, though.

(17:11):
There was a story just today, and it's come up. The stores at the Co-op, they can't stock the shelves.
They're rationing milk.
Rationing has been announced, and all that entails. "Never you mind," I said to her, "you put on a kettle, we'll have a nice cup of boiling hot water." Beyond the Fringe fans will like that one.

(17:32):
Nobody else will figure it out.
But the issue is, that just wiped out two major grocers.
How much more of our infrastructure that we don't think about is vulnerable at this point?
Yeah, and this is something I've thought about in the context of the discussion around critical infrastructure and, in general, the discussion of how do you rank

(17:53):
the criticality of different devices, different systems, different industries.
And unfortunately, I do frequently come back to the conclusion that yes, some are more quote-unquote critical than others, but everything is so integrated now that the difference between the importance of protecting less critical and more

(18:16):
critical is just aligning, right?
Because the things that are so critical, we think about typically the power grid and water resources and things like that.
Now we think, okay, food security.
Transportation and logistics, which makes all of that happen.
And then all the suppliers that go into that, whether it's professional services,

(18:38):
which typically are smaller organizations, and all of these supporting organizations, there really isn't anything left at the end of the day that's not in the circle of supporting a critical service.
And so it just reinforces that you can't just protect a few things and the rest of it'll be acceptable.
Everything needs to have a reasonable level of security.

(19:02):
Not that it'll be perfect, but that the risk, or the impact when there is an incident, is a smaller impact.
That it's more contained.
That it's more manageable.
And I think a couple of different things.
It's stunning to me that Marks and Spencer did not see what was happening in Canada, both what happened to Empire/Sobeys and what happened to the drug chain

(19:25):
London Drugs, and not say, okay, we need an incident plan. We need to invest in this area.
It's stupefying that we saw ransomware rise.
We saw it happen.
We saw the impacts in very specific retail food security verticals.
And we still failed to act.

(19:47):
Isn't that fascinating?
A realistic level of security. What's your RLS?
And then, what alternatively is your realistic level of resilience?
Can you go back to pen and paper?
Can you stock the shelves so that people can get milk?
Absent regulation, it's very clear the private sector's North

(20:09):
Star of shareholder value will dominate the risk conversation to the point where they're ill-prepared for this.
The evidence could not be clearer.
I feel like Matlock resting his case at the end of the episode.
Ah, here's the proof.
This is why we need regulation.
And yet, as I raised the point on LinkedIn this week, Prime Minister Carney is gonna be delivering his Speech from the Throne here in Canada, with the King coming.
This is super exciting for us.

(20:30):
I hope in that speech cyber regulation for critical infrastructure, at least what they were trying to do with Bill C-26, gets some kind of nod of intention and gets back on the legislative radar.
But what I will say is that even what we were trying to do before it failed here in Canada didn't even contemplate dealing with food distribution.

(20:53):
Food security is not critical infrastructure in this country from a federally resourced mandate.
And it means the provinces are off to pick up whatever they can on this.
And the good news is some provinces are. I'm aware of conversations happening within provinces to go, Ottawa is not coming to our rescue, we're gonna have to figure this out.
For American listeners, what you have to understand is that CISA is gutted.

(21:16):
Your globally leading, Trump-created critical infrastructure security agency, which was a success.
And President Trump deserves credit for creating CISA.
Absolutely.
Term one, that was one of the wins.
Here I am actually giving appropriate credit for where it needs to be on that.

(21:38):
So there's lots of people that were involved in the creation and formation of that policy, but it was under his administration.
Now, Trump two is literally butchering one of the successes of Trump one and turning around saying the states are gonna be responsible for security.
And there are some states, New York, California, Texas. Sure.
But are you kidding me?
The rest of the states are being left to their own devices.

(22:01):
This is bad news bears.
And CISA was known as the coordinator for critical infrastructure security and resilience.
And it's where, I think, I don't know the total system yet, where you reported a lot of things to. Yeah, no, they did great work.
Jen Easterly was advancing so many amazing things.

(22:24):
The most recent past CISA director.
She was building sort of consensus on really deep, long-term issues like software quality, and holding the software supply chain accountable to creating secure software, secure coding technologies evolving from the things that

(22:44):
we know are inherently risky, that we're making it too easy to create these CVEs.
These are big, meaty, substantial national-level, international issues, and all that's gone.
It's worse than gone.
For those of us who remember, Chris Krebs was the head of CISA.
He was fired for honestly saying that elections were safe.
So the head of CISA learned not to speak truth to power, even

(23:08):
if it's the right thing to do.
They've learned that. It's worse than that.
I can't disclose totally why I know all of this stuff, but as I was telling you guys before the show, we've been doing some interviews with whistleblowers, and CISA is non-existent.
It is a political front right now.
It is a puppet, and it is overruled by political decisions coming from

(23:30):
God knows where. It is no longer trusted, and that is a crime.
And again, I'm not getting into American politics. You've given Trump a nod.
I don't want to talk about Trump or anybody else, but the fact is we have lost one of the best assets in the world at leading the fight on cyber crime.

(23:53):
And Laura probably can speak with more intelligence than I about MITRE and the CVE database.
We had an asteroid-almost-hitting-the-planet moment for the software reliability and resiliency, automation and vulnerability scanning environment.
Previous to the current administration, it was a very fragile budget approval process, which, shame on those for not getting it figured out beforehand.

(24:17):
So let's put appropriate blame where it belongs.
The fact that it was so brittle to begin with was not good, that it was so heavily depended upon and yet so brittle.
Then it almost came completely unglued.
And I don't know what your thoughts are from your perspective.
Yeah.
There were a couple things, and I think your comment is very on point, that it sounded, in some of the

(24:38):
write-ups around this, it actually wasn't unusual for it to come down to the 23rd hour for approval to come through.
But what was different was, in previous administrations it was, yeah, we're getting close to the wire, but we're pretty sure it's coming.
There was confidence that the job was gonna get done.
There was not confidence this time that it would actually get done.

(24:58):
And what that shows is, yeah, there was that laissez-faire nature around it.
Maybe not by everybody, but by enough people that it was, yeah, business as usual, we just get her done.
Instead of saying no, we need these things done in a timely fashion, so that there are checks and balances that can reasonably be put in place, and so that if

(25:18):
the funding isn't going to come through, there's enough time to do something else.
And that's really what's been missing.
But this is not new for the US government.
The number of times that things have been pushed through, or they've gone into, basically, the government freezes because they haven't finished passing their budget for the year.
This is just another symptom of the larger issue that

(25:40):
has been building for quite a long time, and it really highlights the fact that any one organization that is such a linchpin in our system of security, if there is any trouble in that organization, it has such a huge potential

(26:01):
for impact across so many other places.
And whether that's government or whether it's nonprofit or a collaboration of governments, whatever it is, our lesson learned out of this is to take a better look at the warning signs as they're coming, and try to build, as a community, that resilience, so that yeah, any organization can fail.

(26:21):
So whatever organizations we're putting these key services into, we need to have a better way of staying on top of what's going on there so we can course correct before it becomes a crisis.
Absolutely.
And I think, Laura, you're talking about the CVE system, right?
And in particular, in April, it was the funding.
Sure.
Again, not everybody is a security junkie

(26:42):
jumping into this. The CVE system is a critical piece of how, like CISA, it's a critical piece of how we manage any sort of hacks or problems.
Zero days.
All of those things.
Yeah.
Do you wanna just give us a brief explanation? A report? A view?
Yeah. Give it, yeah.
The quick primer.
So CVE is the Common Vulnerabilities and Exposures database.

(27:03):
And it is a mass collection of all the known, published, vetted, confirmed vulnerabilities in software.
Doesn't mean it's the full universe of software or vulnerabilities, because we know there are people who found them and they haven't reported them.
But it is the biggest single central source for vulnerability knowledge.
All of the major vulnerability scanning players and technologies

(27:28):
bring CVE data in from this source as part of their ecosystem.
It is the most robust, as robust as anything is in this day and age, collaboration point for those vulnerabilities.
So to lose it, or to have it not be effectively maintained, and it requires staffing as well.
It's not just about the technology of the database, but there are real

(27:49):
people attached to reviewing the submissions, scoring and writing them.
And there's lots of people who have lots of debate about the efficacy of the scoring.
That's neither here nor there, because the most important thing is that there are actually people working together to try and make the effort to get these things in a place where everybody can then contribute and use

(28:13):
the information that comes out of it.
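As a concrete illustration of how that central source gets consumed, here is a minimal sketch of pulling a single record from NVD's public CVE 2.0 API; the response field names shown are assumptions about the API's JSON shape, so check the current NVD documentation before relying on them:

```python
# Minimal sketch: fetch one CVE record from the public NVD 2.0 API and print
# its description and CVSS base score. Field names are best-effort assumptions.
import json
import urllib.request

CVE_ID = "CVE-2021-44228"  # example: Log4Shell
url = f"https://services.nvd.nist.gov/rest/json/cves/2.0?cveId={CVE_ID}"

with urllib.request.urlopen(url, timeout=30) as resp:
    data = json.loads(resp.read())

cve = data["vulnerabilities"][0]["cve"]
description = next(d["value"] for d in cve["descriptions"] if d["lang"] == "en")
metrics = cve.get("metrics", {}).get("cvssMetricV31", [])
score = metrics[0]["cvssData"]["baseScore"] if metrics else None

print(CVE_ID)
print("Description:", description[:200])
print("CVSS v3.1 base score:", score)
```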
That would've been a huge loss.
It would've put new findings at a standstill for being reviewed.
It would've meant things that were in progress weren't finishing, getting through.
And then the older findings do get brought back forward and updated from time to time as things change.
So all that new context information wouldn't have been being published. It

(28:35):
would've been a huge win for the bad guys to have this lack of communication in the ecosystem for the defenders. I can't imagine anybody who listens to the show hears the words CVE-2019-whatever, and they'll know that instead of having to look in 16 or 60 or 600 databases across God knows where to find

(28:56):
out, is there something out there?
There is one database, archived, normalized to the best possible.
And I hear the criticism of the scores.
But whether it's an eight or an 8.25 or a 9.13, it doesn't matter.
It's bad. And if it's a three, it's not as bad as a nine.

(29:18):
It's not perfect, but it has accuracy.
But the one thing critics ignore about it is that it moved us from low, medium, high, or low, moderate, high, whatever way you want to put it, which is even less granular, to something a little bit more specific, with a little bit more rationale, harder for organizations to argue,

(29:40):
oh, it's not that big of a deal.
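For reference, the qualitative bands the panel is contrasting with the numeric scores map like this under the published CVSS v3.x severity scale; a minimal sketch of that mapping:

```python
# CVSS v3.x qualitative severity ratings (per the CVSS v3 specification):
# 0.0 None, 0.1-3.9 Low, 4.0-6.9 Medium, 7.0-8.9 High, 9.0-10.0 Critical.
def cvss_severity(base_score: float) -> str:
    if base_score == 0.0:
        return "None"
    if base_score <= 3.9:
        return "Low"
    if base_score <= 6.9:
        return "Medium"
    if base_score <= 8.9:
        return "High"
    return "Critical"

for score in (3.0, 8.0, 8.25, 9.13):
    print(score, cvss_severity(score))  # 3.0 Low, 8.0/8.25 High, 9.13 Critical
```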
There's a BeyondTrust report I just came across.
Apparently they've been doing this for a number of years, where they look at the total number of CVEs that come outta Microsoft per year and how many turn out to be critical versus the total volume.
And it enables that kind of analysis, which is our only kind of barometer of software quality.
And so if we don't have that kind of data to work with, we really don't

(30:03):
know how the trajectory is going.
And that's just one example.
But I think we rely so much in our modern digital economy on so many things that are brittle and fragile.
Like, they're just one bad actor away.
And I think, Jim, you posted this in the prep notes today.
It was the Russian-hosted open source software that's used, oh God,

(30:24):
yeah, everywhere.
And everyone's like, oh, all of a sudden this is a problem.
And remember, the open source community are, okay, let's, hang on.
Let's make a point of going back and giving the audience the context.
They don't know what we're talking about at this point.
The story I posted was, a Russian group is the source of a module that is in practically everything open source.

(30:46):
It's a Go package, and everybody will know, if you follow Linux, that Go is where everybody wants to go, from C++ to this new language Go, which is supposed to be more secure, except for one particular module that is supported only by Russia. Everybody who supports this module is Russian.
And the head of the organization that supports it is under sanctions.

(31:09):
And what it does is it supplies the interpretation and retrieval of JSON.
Now, you don't have to be a smart programmer to know JSON is used in a lot of places, isn't it?
Yeah.
So that is what has been discovered, and we've all woken up going, wait a minute.

(31:30):
This guy, and no, he's not named Vladimir, is supporting almost all of our open source libraries for this function.
That's a bad thing.
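If you want to see which third-party modules a Go project pulls in and where they are hosted, which is the kind of provenance question this story raises, here is a minimal sketch; it only parses the require lines of a go.mod file, so it is an illustrative starting point rather than a full supply chain audit:

```python
# Minimal sketch: group a Go project's direct dependencies by hosting domain,
# as a first pass at asking "where does this code actually come from?"
# A real audit would also walk go.sum, transitive deps, and maintainer data.
import re
from collections import defaultdict
from pathlib import Path

def deps_by_host(gomod_path: str = "go.mod") -> dict[str, list[str]]:
    text = Path(gomod_path).read_text()
    # Matches indented require-block lines like "github.com/foo/bar v1.2.3".
    modules = re.findall(r"^\s+([\w./-]+)\s+v[\w.+-]+", text, flags=re.MULTILINE)
    grouped: dict[str, list[str]] = defaultdict(list)
    for mod in modules:
        host = mod.split("/")[0]  # e.g. github.com, golang.org, gopkg.in
        grouped[host].append(mod)
    return dict(grouped)

if __name__ == "__main__":
    for host, mods in sorted(deps_by_host().items()):
        print(f"{host}: {len(mods)} module(s)")
        for m in mods:
            print("  ", m)
```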
But I wanna paint a broader picture for a second.
I was like, okay, why did this happen?
Remember that at the dawn of the age of open source, and I just wanna give a shout out here, these are some of the best of us as humans.
They were thinking about how we genuinely collaborate for the collective

(31:54):
benefit of everybody on the internet.
This was non-commercially focused.
Let's make better software together.
Let's advance technology together.
And I love that.
Like, this is the Ben and Jerry's ice cream of the internet, right?
And these are still the internet hippies.
And I love the internet hippies, right?
It was fantastic.
And at the time this was happening, we were heading towards that Fukuyama

(32:16):
end-of-history view, that the Cold War was over, with its unending prosperity.
Democracy was on the march thanks to the power of the internet, and authoritarianism was done, and we're gonna have unending economic growth, yada yada.
Looking back on the nineties is so sweet now.
Like, you look back with a nostalgia that I think people in the eighties were looking at the fifties with. So now I get it.

(32:38):
Not to say the fifties were great for everybody.
They weren't.
Nostalgia is a funny thing. But the open source community, that's how you ended up with people from Russia doing cool, key things.
Because before Vladimir Putin's rise, we thought we were gonna have a more normal relationship globally with all kinds of places, including Russia.
And now, geopolitically, this has changed.

(32:58):
And I think this whole idea of a fractured internet, of this unwinding of globalization, this is a canary in the coal mine of what that means for the open source movement.
And what I can tell you is that if the open source movement genuinely is as dead as globalization is for the next 30 years, software costs.

(33:20):
Everything's about to get expensive.
Everything's about to get real, real expensive if everyone has to do their own coding from the ground up.
And I'm not saying that we can't. Yeah.
And before everybody gets on their high horse and goes, oh, open source, we should get rid of it, just in case anybody gets on their sort of, well, open source, I always knew that was bad: can you say supply chain hack,

(33:41):
boys and girls? We're reusing modules in commercial software.
This is not just, what we've got is we've got an explosion of software without any ability to track its provenance or where it came from.
That's at the heart of it.
And I wanna challenge that a little bit.
There's the ability to do it.

(34:02):
That's established.
It's just whether people do it, and whether they put the right checks and balances all the way up the chain.
And, depending on the limitations of how you're scanning, right?
Maybe you go three layers deep.
You go six layers deep.
Do you go seven layers deep?
Do you go 10 layers deep?
And this kind of goes back to the theme of the critical infrastructure, right?

(34:22):
It's all connected.
You can't say, six layers deep, that was good enough.
You need to go all the way back.
And then protecting that chain, the open source community.
I don't think it's dead, and I don't think the movement is dead, but I think, like all things that we've learned along the way with communication innovation, and you could say the same thing about the printing press, right?
People thought it would be like this great boost for democracy, which it

(34:45):
was, but it was also a great boost for authoritarian dictatorship too, right?
It's just a hammer, right?
What you use it for is what makes the difference, and whether it's good or bad.
So I think open source is gonna be the same, where people will have to get more serious about checking the provenance. And if you're a software supplier, you need to be more serious about the provenance of all

(35:09):
of the modules and libraries you've used.
The smaller the code chunks that you can look at, the more likely it is that you'll actually understand what's going on in it and be able to make sure that it's good and less open to security issues.
So I'll put that out there.
And David's gonna hate this one, but this is a really good use of AI. You

(35:31):
can hate the code that AI generates, but there's nothing better for finding out that no code is ever documented, or documented well, than running AI against it, saying, what does all this stuff do?
I can see it.
I can't read code the way it can read, and it can tell me what it's doing.
So we have tools, and you've pointed out quite well, Laura, we've got the tools.

(35:55):
We know the issue.
We just gotta do it.
That becomes part of the due diligence, or part of the quality, of making any software program or taking in any software to your organization: understanding who it hangs out with. Because tangible things, people can see and feel a little bit better, right?

(36:16):
We have regulations around how bridges are built.
We have strong provenance as far as how the calculations are done.
We have a licensing regime around the people who are allowed to sign off on the designs.
We have rules around inspection.
The materials that go into it are sourced, and people understand the full supply

(36:37):
chain up to the screws and the rivets, where those things are coming from, and the material composition within them.
And we want all of that because we drive on them and put millions of tons of weight on them, X number of meters up in the air.
I'm gonna use meters, 'cause I'm Canadian, though I wanted to use something else.

(36:57):
And we have these expectations of things, but it's so geographically limited, and the people who use a bridge are within a confined space.
And then we look at systems that we've now extended to billions of users on the planet, and there are none of these things in place.
And then we wonder why we have the problems we have.
And you're hitting on something really important, and it's this: we

(37:21):
still live in an era, and maybe in the next monthly review I'll remember the exact research study, but I came across this just before the show, and it said something like, organizations with more security tools are less secure, because they keep just buying the stuff, but they don't actually properly implement it.
They probably don't actually fundamentally change how their business is operating to be more secure and resilient.

(37:42):
Going back to the Marks and Spencer example, right?
Like, I'm sure there's a lot of vendors that are gonna sell them their next generation firewall or their next generation AI-powered MDR, full-on 14-year-old-David-version eye roll.
But they're not fundamentally gonna fix their ability to have a good, tested incident response plan, or to review

(38:04):
their supply chain for their software and their systems and put in place an SBOM. And the thing about security is we always go for the sugar-rush, quick-calorie junk food solution.
We don't do the hard work, from the field to the table, that's actually required for a proper, nutritious security meal.

(38:25):
You can tell that I'm very much focused on food lately.
It's interesting to see where we are, right?
Like, we've got an internet that we walked into that was what the nerds did, and it wasn't critical to our everyday life, to it now being essential to sustaining the life and wellbeing of 8 billion people on the planet.
But we still treat it like it's a plaything, like it's just a fun thing.

(38:45):
We think about how smart tractors in Canada are solving the labor shortage on farms across the country by having the things go and use AI and run themselves to the field, because we don't have enough people to staff them or wanting to do the job, to do that manual labor at the rate we're willing to pay for our food.
We depend on this stuff, but we don't protect it.

(39:06):
You said a curious word, you said we, and I don't wanna make it seem like it's the people who are out there working in cybersecurity that are the problem at one point or another.
The reason why you can tell where the rivet on a bridge came from, or, if there's an outbreak of poisoning from E. coli in Minden, Ontario, where

(39:26):
I live, they can trace where the lettuce came from, is not because the lettuce farmers wanted it, and it's not because the rivet makers wanted it.
It's because there is legislation and regulation that says thou shalt provide a chain that tells me where any piece came from.
There are

(39:47):
standards that must be held, from engineering and other places, that say, you must know this.
We've got to step up.
Finally, 50 years into our world, and maybe, I don't know, the first virus I saw was in the 1980s, so let's say 45 years into our world of cybersecurity, and start to treat this like it is a profession, like engineers or anyone

(40:11):
else who have to obey certain rules. And that goes against the grain right now of this deregulation thing.
But I gotta tell you, every time you fly in an airplane, you want to know that there was regulation.
Trust me.
And listen, Jim, I wanna give a shout out to AI.
I mentioned this on the other show, but the AI that does the collision avoidance,

(40:32):
the machine learning, simple logic-based, structured automation that avoids humans potentially making human error to turn a bad situation into a tragic one.
Yes.
Laura, you were about to say something relevant.
Wait, I was, Ooh.
Oh, that's a shot fired.
No, just to wrap up the thinking and what you were saying, and

(40:54):
in particular a shout out to engineering.
There's debates around how the term software engineer has been used and abused over the course of time, and it is a sore point for regulated engineers who actually had to do some properly qualified training in ethics and uphold certain promises whenever they sign off on something.

(41:15):
I think there are a lot of people who call themselves software engineers who, if they had to put their name to sign off on the code that's published, and then it's known, if there is a failure in their code that causes a mass issue with availability or security or whatever other issues it might come down to, they are personally on the hook to be sued.

(41:36):
That would change a lot of perspectives around how people feel about the quality of the code being put out there.
'Cause at the end of the day, if there's not a person on the hook, whether it's within the corporation or an engineer, the responsibility just gets distributed and people feel like they're safe.
I'm a consultant.
I spent like 20 or 30 years of my world being a consultant.

(41:59):
I am a certified management consultant.
I hold a designation that is duly authorized by the province I live in, and I'm licensed in terms of being able to do this.
I have a code of ethics I have to adhere to.
I can be called before a discipline committee in the same way an actuary can or a lawyer can.

(42:21):
And I agree, though: if you're gonna call yourself a software engineer, and not just a guy who writes code or a lady who writes code, then you have to step up to have that profession be regulated.
But again, we also need regulation, and we need those two pieces.
And they're missing in this industry.
And in almost no other industry.

(42:41):
But also, there are no rules to uphold somebody's professional standard right now when it comes to code.
Yeah, absolutely.
But the other part is that, with this level of diligence and approach, the work that Tanya Janca from Semgrep does on educating developers to create secure code requires investment by companies to actually do it, to slow down their build

(43:03):
cycles, to do the validation checks, to create cultures where developers can actually be assertive and say, no, this is an unsafe way to do that.
That is gonna come at a cost, and it is going to significantly increase the cost of cheap software.
We live in the era of cheap software.
It's incredibly valuable to us.

(43:24):
It does amazing things, but we don't pay what it's worth.
And our willingness to pay to have secure, quality software is questionable.
That'll be the interesting thing to see, is that I don't think the market has ever sent that signal.
I think we're in an era of deregulation now in the world's largest marketplace.

(43:44):
So the pressure is not there.
The Europeans are overwhelmed already trying to hold everyone to a higher standard on data privacy, AI, and security.
Software quality may already be a bridge too far for them.
That leaves us with a lot of questions.
But I'm gonna put this to you, and not to disagree with you, David, but I'm just gonna say: would we really miss half the crap that gets into software?

(44:08):
Do we really need a Word program that has 50 million lines of code in it?
I'm not so sure that I would really miss some of this stuff.
As a matter of fact, I've been saying, could you slow down the development a little bit?
'Cause I have to use this stuff and I like it.
I like to be able to use it regularly. Lemme put it another way.
Imagine we went back to the days where you had to pay for your

(44:30):
annual Windows upgrade, right?
And now we're in an era where you get basically almost a new version of Windows every six months, with significant feature enhancements, and it costs you nothing.
But would we be willing to pay 150 bucks each for the next six-month upgrade of our OS?
We used to, but now we expect it.

(44:51):
To have Windows that had half the vulnerabilities, would we be willing to wait twice as long and pay more for software that's also secure?
And that's the interesting challenge.
Look at this madness around the AI arms race, right?
It's gotta be first, it's gotta be best.
We gotta just release this as fast as we can, regardless of the harm it can

(45:14):
cause, from deep fakes to hallucinations to all kinds of other chaos.
The environmental side.
We just, we gotta go, we gotta go, we gotta go.
We're repeating the same mistake with AI writ large, and the insanity around that, that we did with software.
Either you let somebody else develop it or you do.
And that's the world we're in.

(45:35):
We're in a competitive world, and without, I think we need software minimalism, right?
Like, we need some Marie Kondo in here maybe.
Yeah, that'd be good.
But we do need, and nobody in AI has been willing to participate in any form of regulation.
No.
In fact, and regulation is not necessarily bad if it's there to protect you.

(45:56):
We went through that with NASA.
When NASA first started launching people up into space, you were rated by having no mistakes.
After a while it became, you were rated on being faster, and we had the great disaster where we lost seven astronauts. And so we go back and forth in this continuum. But right now, the issue before us here is not probably world

(46:20):
hunger, it's, we're in a world where we can't do this any longer with software.
We're at a place where stuff's gonna start crumbling if we don't address the security in software.
And pro, pro, I kept saying Providence. Provenance in software, where it comes from.
And those are two things I think we can wrap up on there.

(46:41):
No, I wanna get in one more story.
Oh, sorry.
Yep.
Laura, do you wanna say that?
Okay.
I want to, I don't wanna leave. We can't leave without this story.
The story that won't stop.
And we all remember, back at the early part of the month, the big story was a bunch of people in the Department of Defense were watching an invasion, going against Houthi rebels. And they were merely putting this stuff out on their phones

(47:04):
and in a commercial app called Signal.
But of course it's encrypted, so we're okay.
All kinds of problems with that.
And we've had great scandals, but this story just will not go away.
So in the middle of all this, we not only had the story of Signal, we not only had the story of how it got there.

(47:25):
I don't think anybody's actually been fired over this.
Mike Waltz, no longer the National Security Advisor.
And he's now the UN ambassador.
Fired, transferred, demoted.
I don't know.
Put him where he can't do any harm, right?
But not the guy that actually disclosed the timings of the airstrikes, that would be Pete Hegseth. He's got a lot of time on tape saying people that do

(47:45):
exactly what he did should be in jail.
Yeah.
But that's American politics.
The "what about him" is American politics.
That's just, that's their game.
That's how it's played.
You can't do anything about it.
But what is frightening to me is that there's a structure by which, and that's broken down, by which these people are saying this stuff was put on their machines.

(48:06):
So how could this be?
I just wanna back up for a second.
At first we were all calling this Signalgate, and credit to Signal.
They just rolled with it, had fun with it.
They did a very cheeky software update that I thought was hilarious.
But it turns out it wasn't Signal.
It was this thing called TM Signal, made by a company called TeleMessage, which took the Signal open source code and made a third-

(48:28):
party version of it that could then communicate with other Signal clients.
Now, here's the delicious irony.
In some perverse way, what we've now learned is that Mike Waltz and others were using TeleMessage.
The reason you use TeleMessage is that it complies, complies with the archival record keeping required by the US federal government.

(48:48):
So the deep and twisting delicious irony that someone probably went to somebody and said, we wanna use Signal, and they're like, the only way to do it in compliance with the legislation is to use this Israeli-made tech.
Okay, great.
I get to do what I want and I'm compliant.
Great.
You notice what I'm saying?

(49:08):
User need, compliance, and massively insecure are all in the same paragraph.
Yeah.
Because somebody came along and hacked TeleMessage in 15 minutes.
But it's better.
It's better than just the vulnerabilities.
So first of all, a shout out to all the hacker nerds who just dive-bombed on this.
You can tell I spent the weekend learning World War II stuff and

(49:29):
watching movies about the Pacific, but literally, who dive-bombed the carrier that was TeleMessage to the point where they turned themselves off.
They found out that the open source code had hard-coded creds.
Oh my God, please just stop.
And then their infrastructure was vulnerable.
They got into that.
But the chef's kiss to the person that did the process flow and captured the

(49:50):
video and screenshots to show that this app would take end-to-end encrypted messages from the Signal protocol, take a plain text copy, and save it, including to destinations like your friendly neighborhood Gmail account.
Sweet.
It's gonna be really interesting seeing what more comes out of this.

(50:12):
I don't think anybody has ever actually admitted yet, even though we know which account the person was added by, I don't know whether that individual ever actually admitted to adding the Atlantic's account.
So how much access did some actors maybe have beforehand?
Maybe it wasn't the individual and their sloppiness.
Perhaps this was actually malicious.
I don't know.

(50:32):
I'll put it out there.
So the story that I heard about this, and this is one of my David-time patented cheap shots at AI that Jim loves so much, but apparently an Apple autofill thing came in from a media request from this journalist, and it had the same name as another contact on this assistant's phone.
And it was like, hey, do you wanna update so-and-so's contact information?

(50:54):
And he said yes.
And so that was the person he intended to add to the chat, not the Atlantic reporter.
So I like that story 'cause it fulfills my particular zealotry.
But I can see Jim, well before you do the ya-burn.
Apple has been prosecuted for not being able, not being allowed, to call that stuff that they put out AI. They've been asked legally to remove this, saying that

(51:19):
this is, you can't call this AI.
Anyway, back to this.
All these people's phone numbers are out there on the internet.
Any one of their devices, they're using personal devices, is not only probably hacked, I will say it is hacked.
But once again, and we can make this political or not, but we can also go back to our own worlds and say, every time an executive can overrule

(51:44):
your security procedures and not take personal responsibility for it.
I'm not saying CEOs or senior executives shouldn't be able to run a company, but they should have to go back to the security person and say, I will take personal responsibility if we change this.
And I'll tell you, we've been dumping on CISOs, and we've been saying we wanna take CISOs to court and we want to prosecute them.

(52:08):
Let's start prosecuting some of these people who come in and say, no, you're just a hack.
You're just an IT person.
You don't know anything.
Let them take responsibility fully and say, I will now personally take responsibility for the fact that if we as a corporation get hacked, it was my decision. And this is where I get back to it.
You can do it nicer than that, but just going back to your executives and

(52:30):
saying, what risk are you willing to run?
And I would just reinforce this: blind compliance creates insecurity.
You need to look at your compliance mandates and say, is this the way to comply safely and securely?
Just because it says it complies doesn't mean it's gonna do

(52:52):
it in a safe, smart way.
That's not to say that compliance isn't important.
Laws, regulations, contracts matter.
So I'm not slamming compliance.
I'm just saying don't misinterpret the software being compliant with the software being secure.
Yeah.
Your user requirements are not the same as your security requirements.

(53:13):
I think the really important point out of this whole story, for sure, is that at the end of the day, all of the tools, all of the controls, all of the good things we put in place, they're all able to be worked around by a user making a mistake.
It could be intentional or it could be completely by error, but the

(53:37):
user who has permission to take an action, like adding a person to the chat, if they add a person to the chat that wasn't supposed to be there, there is no security technology that's likely to catch that. "Are you sure?" is the best we have as a security control for that problem.
So we have to wrap.

(53:57):
That's the best way to add it.
Yeah, we have to wrap up, but I wanna leave with at least one positive story, because we've been in the morass of all the things that are going wrong.
Laura, you dropped in a note about the fact that we're actually doing some post-quantum work now.
So, for the April roundup, AWS has now released their updated libraries,

(54:18):
or their updated services, for ACM and Secrets Manager, with the ML-KEM post-quantum algorithm.
The important takeaway from that is, if you use AWS and you use those services for encryption, you have access to post-quantum algorithms that you should be looking at bringing into your usage.

(54:39):
Google had announced in February, for their KMS, that they were supporting some PQC as well.
My challenge with the previous conversations about quantum readiness was that there was nothing tangible for the majority of people to do.
There were certainly lots of tangible things for people on the math side to do, and on the coding side to do once we had the algorithms established,

(55:01):
but there wasn't anything for kind of the majority of people.
'Cause I do not recommend rolling your own crypto.
But now there are things to do.
So keep paying attention. These are the first items really coming out, being ready to go, and there will be more.
Get your inventory up to date, know where you're using cryptography, where you need to be ready to update to modern libraries, and get her done.
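As a starting point for the inventory step mentioned here, below is a minimal sketch that scans a codebase for common cryptographic imports and algorithm names. The keyword list is an illustrative assumption, not a complete catalogue, and a real inventory would also cover TLS configurations, certificates, and third-party services:

```python
# Minimal sketch: flag source files that reference common crypto libraries or
# algorithms, as a first pass at a "where do we use cryptography?" inventory
# ahead of a post-quantum migration. The keyword list is illustrative only.
import re
from pathlib import Path

CRYPTO_HINTS = [
    "rsa", "ecdsa", "ecdh", "x25519", "ed25519", "aes", "sha256",
    "hazmat", "pycryptodome", "openssl", "tls", "ml-kem", "kyber",
]
PATTERN = re.compile("|".join(re.escape(k) for k in CRYPTO_HINTS), re.IGNORECASE)

def crypto_inventory(root: str = ".") -> dict[str, list[str]]:
    findings: dict[str, list[str]] = {}
    for path in Path(root).rglob("*.py"):
        text = path.read_text(errors="ignore")
        hits = sorted({m.group(0).lower() for m in PATTERN.finditer(text)})
        if hits:
            findings[str(path)] = hits
    return findings

if __name__ == "__main__":
    for file, hits in crypto_inventory().items():
        print(file, "->", ", ".join(hits))
```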

(55:23):
Yeah.
The other thing I'll just add is that this opportunity that you just mentioned, the inventory, is also the opportunity to say, do we need this data anymore?
Is this system one we want to keep?
If we're gonna talk about software and data minimalism, our, our Marie, what was her name again?
Marie Kondo. Yeah.
While we're improving security, we can clean our house.
And wouldn't that be nice for everybody?

(55:45):
Doesn't mean you have to get rid of everything, but it has to have joy and purpose in your business.
So prepare yourself for quantum. And this is a wonderful time, and this is the positive side of competition, because if AWS announces it, everyone else will have to.
So there, there's our positive side of competition in this industry.
And the second is we get to declutter.

(56:06):
Absolutely. Organizers, rejoice.
Thank you, Laura Payne, and thank you very much, David Shipley. This has been a great piece, and as usual I'll have the editing of it to do, but I think we've had a great discussion.
I hope the audience has enjoyed it as much as I have.
And if you're out there listening, wherever you're listening, I never know when people are tuning into this, whether it's on a Saturday morning

(56:28):
or sometime through the weekend, whenever you take in long podcasts.
Thanks a lot for joining us on this.
And have a coffee, relax, and think about cybersecurity.
I'm your host, Jim Love.
Thanks for listening.