
July 24, 2025 · 40 mins

What if everything you worked for—your savings, your home, even your identity—could be stolen without a weapon or a warning?

In this chilling yet empowering episode of Chatter That Matters, recorded live at the Toronto Hunt, I moderate a critical conversation on the growing threat of cybercrime—and how we can fight back.

Joining me are three remarkable minds from the frontlines of this invisible war:

Margot Denomme, former Crown Attorney, author and advocate for youth digital safety, compares giving a child a smartphone to handing them car keys without a manual. She’s on a mission to delay smartphone use, reform digital education, and protect self-esteem before it’s shattered by comparison culture.

Milos Stojadinovic, RBC’s cybersecurity expert, explains how cybercrime has become a multi-trillion-dollar global industry, complete with 24/7 malware support and AI-powered deepfakes. His warning: the identity crisis is just beginning.

Detective Sergeant Colin Organ, head of York Region’s Financial Crimes Unit, shares heartbreaking stories of victims who lost everything to scams—and exposes a justice system struggling to respond to these crimes as anything more than “non-violent.”

Why this episode matters:

Cybercrime is no longer fringe—it’s global, organized, and relentless. As AI accelerates the threat, we must take action to protect our families, finances, and future.

You’ll learn:

  • How to build habits that shield you from scams

  • Why criminals are always a step ahead—and how to close the gap

  • What parents must know before giving their child a phone

  • Why outdated laws leave us all more vulnerable

  • Whether deepfakes and AI put humanity itself at risk

Subscribe to Chatter That Matters and take the first step toward cyber resilience. Because in this fight, awareness isn’t just power—it’s protection.


To buy Margot Denomme's brilliant book, The Family Smartphone Guide: https://www.amazon.ca/Family-Smartphone-Guide-Navigate-Smartphones/dp/0992034043

To read articles from RBC on cybercrime and prevention:

https://www.rbcroyalbank.com/en-ca/my-money-matters/topic/cyber-security/



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:01):
The domain of attack is boundless. It can happen from anywhere. Because of the
interconnectedness of the Internet. We go through life
almost every moment with some element of risk and reward.
We cross the street, there's a risk that we're going to get hit. We get
on an airplane, there's a risk of a crash. I mean, it's just a
constant trade off. But more often than not, we do our homework. We look both

(00:23):
ways on the street and we navigate life.
But when it comes to the subject tonight of cybercrime,
I'm not exaggerating it, but I would say to us that if we're not prepared,
if we haven't got countering strategies, it's not a matter of if,
it's a matter of when. Providing access to the world through these
smartphones, through social media, without any boundaries or

(00:47):
direction. Some of the most vulnerable are
seniors, young adults. But I'd also just say the level of
sophistication happening now when they can take a
still picture of your granddaughter or grandson and clone their
voice and have that person arrive to you on the screen and start talking to
you. I mean, we're at an unprecedented level of crime.

(01:09):
This is not someone that's going in person with a balaclava on
into a bank. They're committing it in other countries out of our jurisdiction
where we can't reach. And that's what we're going to talk about tonight.
Not so much to strike fear, but my hope is that we take away
from tonight raised awareness and, more importantly, walk away
with some smart countering strategies that we can use

(01:31):
not only to protect our own assets, but the people that
obviously we hold very close to us. So let's bring up the panel.
Hi, it's Tony Chapman. Thank you for listening to Chatter That Matters, presented
by RBC. If you can, please subscribe to the podcast.
And ratings and reviews? Well, they're always welcome and they're always

(01:54):
appreciated.
On a special episode of Chatter That Matters: Margot Denomme, a former Crown
counsel with over 20 years in criminal law; Milos
Stojadinovic, our cyber expert from RBC;
and Detective Sergeant Colin Organ, a seasoned law
enforcement officer specializing in cybercrime investigation.

(02:17):
Colin, we were talking earlier. I know you're very senior in the
police force, but just give us a sense of what's happening.
Is this truly a wave that's growing almost at a tsunami level in
terms of these kinds of crimes against humanity? Yeah, I would definitely agree with
what you're saying from the police perspective of what we see. And again, I'm
with York Region. But although we can be confined kind of

(02:39):
geographically, anything with cybercrime is far from
that. So that's what they're doing to try to defeat it. But we see the
cases going up and up and up and more people victimized,
very vulnerable people that are affected by this. And to the
point where, you know, seniors losing their homes, being kicked out of their
homes, it's taken over because they've been scammed or put trust into

(03:02):
someone that they shouldn't. So can you paint a picture of a story so we
can personify it, personalize it of a senior that kind
of went from thinking that their retirement was a beautifully paved
speedway and then overnight their life turns upside down. It was
a scam not known to them. They signed some paperwork,
you know, on a door to door scam that they thought nothing of it,

(03:25):
you know, over a quick check for their water heater. And they didn't
realize they were signing loan documents and mortgages. They had
their home, beautiful house in Richmond Hill, all paid off.
Unbeknownst to them, there was about $2 million
in liens put onto their house, and that ultimately
led to them losing their house. We were working on trying to find

(03:47):
them emergency housing through our
resources, through the community and so forth. And that's someone that thought they were
set for retirement. Their house was paid off, and it's just horrible
to see someone go through that. You want to just wave the magic wand and say this
isn't fair. And the reality was, everything we tried, they still lost
their home. That's an entire life affected. Maybe

(04:09):
I'm just caught up in, you know, the CSIs and the Hollywood folklore, but
I would say in the past there was a certain amount of different
crimes. But is it fair to say that with this type of crime that is
changing, it's getting more sophisticated, is getting more complex. Just when you think you've figured
out their model, they're reinventing it. That's a tough one for me to
admit. The reality is they're always

(04:31):
a step or two ahead. Those involved in these crimes
are some of the hardest working people you would ever meet. If they used it
for good, they could do extraordinary things. But they don't.
They can adapt, they overcome obstacles. The second
we come up with a strategy to stop it, they
learn how we stopped it and they change before we even know it. So

(04:53):
it's always that catch up. You briefly touched on the AI and
stuff like that. It just makes it even harder for us. And you
said, this is not someone that's going in person with a
balaclava on into a bank, where they're leaving fingerprints
and so forth. A lot of these, and I could talk about one where
it's happening, they're being committed in other countries out of our jurisdiction

(05:15):
where we can't reach. So it's definitely difficult. We were
chatting before and you talked about the math. You said how many
thousand cases and how many officers that are available. It
sounds to me almost like that fairytale story of the little Dutch boy trying to
stick his finger in the dike. We spoke of it. I mean, I'm in charge
of our Financial Crimes Unit. To put it in perspective, for all of

(05:36):
York Region, I've got five teams with about
25 to 30 officers and investigators. And
last year we were over 7,000 cases. And that's not
to talk about the dollar value. That's the amount of cases that you're trying to
deal with on a level of volume where you just have
to do the best with which ones you can. So it's definitely

(05:58):
overwhelming for sure, for the officers, but for the victims as well.
You know, the reality is 7,000 cases, it's very
important to that person. But it might take a couple weeks, it might take a
long time for an investigator to get a hold even to find out, you know,
we can't recover the money, which can be devastating for people. But it's a reality
that we have to deal with. And how is AI and technology

(06:20):
serving to your advantage versus just simply to the criminal's advantage?
We're trying to catch up with that. It's difficult from the policing
aspect, the sensitivity of the information that we keep or the
access we have. There are a lot of hoops we have to go
through in order to even introduce AI and so forth
to work for us. Because right now it's who do we want to have access

(06:42):
to some stuff if that was subject to a data breach or anything?
So it's things that we're constantly looking at. Right now, I know we
have a number of organizational projects that are trying to utilize
AI. We're bringing in a program now where we have students,
the new young minds to kind of work with us and try to
figure out how we can kind of overcome this or come up with some different

(07:04):
programs to try to combat it. A big part of the narrative through the whole
election was this sense of catch and release. We even had unions,
probably for the first time, police unions supporting the conservatives. These
crimes against humanity, when you are successful in capturing
somebody. Do the courts view them truly as a crime against
humanity? Or is it more, that was just a one-off?

(07:26):
We understand the constraints with the courts, but to
be honest and upfront, it is frustrating. I come
from a background with human trafficking and robbery squad
and stuff like that. We even see violent offenders being released.
And then coming to the financial crime world, you do see that the
reality of it's a, you know, it's categorized as a nonviolent crime.

(07:50):
When in reality, you know, my investigators are the ones
constantly on the phone, email, dealing with someone. How do
you deal with someone that thought they were financially set and now they're
in peril, their housing is at risk, you know, everything
there is a large impact and I'm not going to sugarcoat it
either. We've unfortunately had victims that have, you know,

(08:11):
because of the financial strain, it's led to self harm, it's led to
suicide, disappointment with families and everything.
So it's that side that we're trying constantly to
have the courts be cognizant of: this is not just a non-violent
offense. This has a long-standing impact on victims. I've also
read that when it comes to seniors, they're often so

(08:33):
embarrassed that they don't want to share it with the police or even their family.
Absolutely. Like, that is 100% true. Generally,
with anyone that's caught up in this, they estimate only 5
to 10% is actually reported, and they had, I think,
108,000 reports last year. So to put that in
perspective, and it is a shameful thing, especially for someone that

(08:56):
now may have really financially impacted their family
because, you know, they got caught up in it, they're forced into quick
decisions. These criminals know what they're doing. They know how to pressure you into
situations. There is that level of shame that people think, I can't believe I
fell for this. Now let's hear from the brilliant mind of
Milos Stojadinovic, our cybersecurity

(09:17):
expert from RBC. Just give us a
sense. I know you're quite a modest guy, but you're very
senior. I would argue RBC is one of the best in the
world at trying to counter this. So just give us a sense of what you
do for a living and then I can ask you a couple questions. My background
is largely in what we call adversary emulation or perhaps a little

(09:38):
bit more informally offensive security. So I have built the
majority of my career breaking into governments, organizations,
defense contractors, you name it, and using my skills of offense to inform
how we should design adequate and appropriate defense. So by attacking
organizations and institutes, we understand where are their gaps, where are their
weaknesses in their environment, across people, process and technology and

(10:01):
implementing that change. On the other side, I have a threat hunting team which
sets the hypothesis that there are threats inside of our environment. There are threats that
are targeting our clients. They actively try to hunt for those threats inside of our
environment and in our digital channels. And lastly, I have a threat modeling team that
integrates the thought process of security into the products and the solutions that we
build. So we don't think about security as an afterthought. We think about it as

(10:22):
something that's part of the journey. Do you ever think how much you could have
made on the other side? So I'm not suggesting
this, I'm just curious because, I mean, in my demonic mind, I'm going, I'm sure
they're paying you a lot of money, but you're really good at what you do.
Yeah, I mean, you sound like my little brother because you ask me that question
all the time. But look, the reality is that ethics is a large part
of what we do. Right? So in my role, I really have one thing.

(10:46):
If you strip all my technical skill sets away, I have integrity. If I compromise
my integrity, I obviously can't work in this industry. Not to mention there's top secret
clearances and all the other stuff that happens. But I have a deep desire to
want to help people and protect people. I don't believe in victimizing people. I've seen
friends and family around me become victims of cyber crime or even traditional crime.
Like most people in life, I've been a victim of crime as well. So it's

(11:08):
just, it's not in me. Right. It's just one of those things I think you're
born to do or not. And what got you into this? Because it is, I
would say, arguably, a fairly young field, but it has grown
in sophistication almost exponentially. It's just sort of how
my brain works. I am the perpetual asker of why.
When I was a kid, I used to like take apart watches and clocks and

(11:29):
stuff and put them back together. I just have an innate curiosity for how things
work. And I find that when I know how things work, I can then
hypothesize and think about how could you potentially make this do something it's not
supposed to do. So for me, it's almost like a second nature thought process.
And that I think is the hardest thing to teach when I look for people
in this role. Like, I can teach technical skill set, but natural

(11:50):
curiosity to understand how things work, or to take a look at a box
and find a different way around a problem, that's the hardest skill set
that we have to find. So, very young age, I got really interested
into hacking stuff when I was a kid. Thankfully, didn't get into too much trouble.
It really kind of intersected with this becoming a really popular industry
because I almost went into physics, but cyber security was just turning into a thing.

(12:11):
And I went, wait, I could take this, like, computer skill set I have about
breaking stuff and actually do something with it, and the rest kind of
took care of itself. So paint a picture for us of what is really
going on out there, because obviously RBC and all of
your clients are all massive targets. So just open our
minds to the seriousness of the threat. I kind of like to sort of summarize

(12:32):
it as the industrialization of cybercrime, right? So 15 years ago or so, if
you wanted to target a company or target, you know, someone of wealth, you had
to, like, know all this stuff about how to target them, right? You had to
know how to do reconnaissance and how you might get inside of the environment and
how do you persist in that environment and how do you move around it and
how you get access to the data. And that has become commoditized

(12:53):
in many different ways. So these days, if you fancy the
idea of being a ransomware threat actor and holding organizations for ransom,
you don't have to know how to write ransomware; it's sold as a service now
on underground markets. You can get 24/7 support, where you can
literally talk to someone 24 hours a day, seven days a week to get support
for your malware. It's a big business model because what happens is the people who

(13:14):
are authoring the malware, the ransomware, are taking a cut of
anything that their clients effectively
extort with it. And then on the other hand, you have, for example, threat actors
that focus on initial access brokering, so they may, you know,
breach organizations or even, you know, people of wealth, get into their
systems, and then just sell that access to the highest bidder.

(13:36):
So if you don't know how to compromise and you don't also know how to
write ransomware, but you've got maybe a sliver of that skill set if
you have the funds to get started, this is how you can kind of approach
that. And that's why we're seeing the scale grow massively. That's why virtually
all of us don't answer phone calls anymore that aren't from numbers that we know
because we can almost be sure it's going to be a scam. We get text
messages all the time asking us to go on links. It's because all of this

(13:59):
has become commoditized and now you can just pay really pennies on
the dollar to enable this stuff from someone else that offers it as a service. So
when you get that thought process of, like, hackers and cybercriminals are people with
hoodies in their basements: they're not. Real crime
has very much stepped into cybercrime. And this is organized, it's
sophisticated and they're making tons of money. I think we opened up the conversation

(14:21):
with this being a $10.5 trillion problem globally. And
there's also increasing nation state involvement and support in this. Right. It's
not unknown, for example, that if you live in a country that's non-allied with
another country and you target their citizens or their corporations, you have virtually
no chance of facing any consequence. And some countries actually even
encourage that. Is it fair to say? Because the other thing I was reading about is

(14:44):
that some of those leaders actually also take a cut. I don't know that anyone's
admitted to that on paper, but I mean, yeah, that's how most of crime in
general works. Right. And there actually are nation states out there that
are conducting crime for their own gain. Right. DPRK is well known for this.
Right. If anyone remembers the Bangladesh Bank heist, where they stole
something like, I think it was, $900 million through SWIFT.

(15:06):
They ended up only losing about 90 million of it. They ended up laundering the
money out through casinos in Macau. But that was all North Korea, really,
the Democratic People's Republic of Korea. They are using that money
to fund their government. What happens to this money when somebody grabs
it? $10 trillion is a lot of money to launder. I think that's the hard
part, right, is there's steps to this, right. And we won't make it an educational

(15:28):
course on how to move funds and launder them. But
effectively one element is gaining access to funds. The other element is kind
of separating those funds and kind of moving them around and then layering them
into what looks like normal proceeds. But increasingly, a
lot of these funds are moving towards, like, cryptocurrencies, right, that
aren't part of the same standard regulation. But these people can leverage that wealth

(15:50):
to still do what they need to do, purchase the things that they want to
purchase. And that kind of leaves them largely outside of the regulated space that
the rest of the world operates in. How much do you think, and maybe this
is an unfair question, but they talked a lot about the housing market, certainly in
Vancouver was propped up with laundered money. The stock market.
How much do you think of the assets that we take for granted that we're

(16:11):
going to retire on are under threat because the fact is that
they're overpriced. Because this 10 trillion has to move to something
that looks normalized. It's very difficult to determine
authenticity of documentation or origination of paperwork that comes out of
country. Right. And it's increasingly more difficult in the world of
AI where you can take a look at real samples, have AI models build

(16:33):
a lot of this out for you. So I wouldn't be surprised at all to
imagine that some of these funds could be coming from illicit sources and propping up
artificial inflation in areas where it shouldn't exist.
So we're going to go to Margot Denomme. Margot was on my podcast a
while ago, and she was the inspiration for doing this evening, because a

(16:55):
former Crown attorney sees a lot of things in court and that
led her to creating this book and really focusing on
children, especially with what we consider to be smartphones. So Margot,
just first of all, just talk a little bit about your experience as a Crown
attorney and what you saw and what caused so much concern,
especially when it came to Canadian youth. When you start as a

(17:16):
Crown Attorney, you start dealing with youth. I guess
it's where you can cause the least amount of damage when you're starting out.
And I would see them time and time again get in trouble with the law.
You look at their behavior and it's easy to get angry about the
behavior. But when you saw the pre-sentence report, when you saw
the youth and what was happening in their life and

(17:39):
really looked at that, I was never a Crown attorney that thought every child should
have a criminal record. It was, how can we address this in a
societal way and help this child to get on a better track?
Because often that's not their choice. It's poor friendships,
whatnot. There was one common factor and that was self
esteem. I lecture about this now because it really is the

(18:01):
foundation of all of us, but especially children. And
now parents are putting technology in children's hands
before self esteem and a sense of self has even been built.
And then I had my own daughters and started to notice what
they were looking at at social media, all of these images
of people that quite frankly didn't exist. And this was

(18:24):
12 years ago and this isn't going to end well because we
can't have a whole generation of kids comparing themselves to people that
are making them feel bad. So I wrote children's books
and started to go to schools and educate kids about
the dangers of comparing themselves to people in media that didn't exist.
And the lessons were all about the foundation of

(18:46):
self esteem. All I could see were these headlines in the last
five years about children's mental health crisis
over and over and over, and it didn't seem like anything
was happening. So I quickly pivoted from my role
with the Ministry of the Attorney General and I started raising awareness
about digital dangers. When we chatted in the podcast, we were dealing

(19:09):
with Harrison Haynes, who was a young boy, insecure, and kind
of lived on his video games and next thing you know, he was approached by
another male and one thing led to another and it almost destroyed
him. What I loved about you when you joined the podcast, however, is you
said you gotta think of that phone like you think of handing
over keys to a car. Talk to us about that because that metaphor makes

(19:31):
a lot of sense. Because I know parents are pestered at a very early age
to get their kids a phone, very often with very persuasive arguments. You'll know where
I am, mom and dad. But talk to me about what you feel
the dangers are. You mentioned initially they haven't got a sense of self
and self-esteem, but what else do you notice? On the back of
the book, I came across this brilliant quote from the former

(19:53):
US Surgeon General, Dr. Vivek Murthy, that essentially
said giving your child a smartphone without any boundaries or
direction is the same as giving them the keys to the
car without any rules of the road. And his last line is,
it's insane when you think about it. And it really is insane when you think
about it. Cars are wonderful, but even with

(20:15):
cars, they put in seat belts and airbags and
guardrails. But right now we are
saying, Happy 10th birthday. Providing access
to the world through these smartphones, through social media,
without any boundaries or direction. So when you think about
the driver's handbook, and it was intentional that it mimics

(20:37):
the driver's handbook, one, because everybody's familiar with
the look and feel of a driver's handbook, two, we expect
kids to get their license around 16 years old, which
is what we're trying to get this movement towards:
delaying smartphones. I'm noticing that some school systems are doing
it. I think Australia or New Zealand just completely banned phones up

(20:59):
until I think the last year of high school. But you gotta be getting a
massive pushback, because kids' identity is wrapped up in that phone. So how
do you counter that? When they think the world's within arm's reach,
if you're denying them that, they basically feel, I have no social
currency, I can't fit in, I can't laugh at that person's TikTok video. I
mean how do you challenge that? This is where the collective action

(21:22):
comes into play because it has to be all hands on deck.
Jonathan Haidt, I don't know if you've read The Anxious Generation, but he
speaks of four foundational reforms. One, no
smartphones before 14, no social media until
16, phone-free schools, and back to a play-based
childhood. And how you counter that is if

(21:44):
we worked collectively because we all have to be on the same
page, we can turn the tide of the children's mental
health crisis around, he says, within two
years. I think that's incredibly optimistic, to turn this tide
around because there are options. You can get a flip phone. You don't
have to have access to the world. Let's continue this

(22:07):
conversation. Colin, since you joined,
and I think you mentioned it was 20 years you've been on the police force,
you started as a constable, moved your way up to a very senior level. Have
you noticed a change in the civility of the
people you're dealing with? And do you point it to what Margot might be
saying the issue is: the fact that we're wrapped up in the phones as

(22:27):
opposed to humans and humanity. We see it especially
in the world of robberies, home invasions, everything.
Offenders are getting younger and they're getting more and more violent
and you're seeing it. They have access to all this stuff at young
ages. And again it goes back to, you know, the
consequences of it at the time are they getting catch and release and just

(22:50):
reoffending and reoffending and so forth. And there's pressures
obviously with that too knowing, you know, criminal
organizations use youth to kind of carry these out as well.
They know to target them, and they're susceptible, they're very
impressionable and so forth. And Milos going to you on this.
We've become a country where we really believe that other people will

(23:12):
take care of us. The government's going to take care of us. Corporations. My credit
card is going to take care of me if I get in trouble
when I'm traveling. Do you see that from a bank's perspective, that when
there is a crime against humanity, your clients are saying
this is on your watch versus things that they should be doing to watch out.
Sometimes there are scenarios where, you know, we've dealt with clients where we've

(23:33):
tried to intervene and say, hey, this is a scam. And if people are so
far down the mental trap, they're convinced that you're trying to stop them from
gaining some significant amount of money. And then, of course, a week or two later,
it dawns on them the reality. Right. And when we
return, our focus shifts to how we can fight back and protect ourselves in
this scary world of cybercrime. Some questions from the audience

(23:55):
and my takeaways.
Tony Chapman, host of Chatter That Matters. You lock your doors, you
protect your valuables, but how about your online activities?
Cybercrime has become a household word, one of the biggest concerns
facing businesses today. A big shout out to RBC for creating the

(24:18):
Vault, an online cyber safety playbook packed with tips and steps
you can take to counter cyber threats. Visit rbc.com
and download your playbook today, helping you protect your online
activities. Well, that matters to RBC.
I lecture about this now because it really is the foundation of

(24:38):
all of us, but especially children. And
now parents are putting technology in children's hands
before self esteem and a sense of self has even been built.
On this special episode of Chatter That Matters: former Crown
attorney and bestselling author Margot Denomme; Milos
Stojadinovic, our cyber expert from RBC;

(25:01):
and Detective Sergeant Colin Organ, who leads a
team of law enforcement officers specializing in
cybercrime investigations.
So how do we raise the awareness? Like, what can we walk away
with to say that we're at least taking some steps to counter it?

(25:23):
I mean, as you said, you know, we don't maybe answer that spam phone call
anymore and believe that a Russian bride's right around the corner or
the Nigerian prince is going to send you a million. Maybe we're getting a little
better. But I'll go to each one of you to talk about: what do we
need to do to kind of set up our own defense so that we're not
just relying on RBC or relying on

(25:43):
the police force. What can we do? Yeah, a couple of basic things that
I would say that people can follow to try to improve that hygiene.
Don't use simple passwords and don't use the same password for
stuff. Everyone looks at me and goes, yes, Milos, I know,
but I tell you, I have the stats. I see what happens in breaches. People
reuse passwords all the time. They reuse passwords in their personal life and their business

(26:05):
life, which leads personal breaches to turn into business breaches or
vice versa. They reuse patterned passwords so they'll change, like, the
last symbol from, like, an exclamation mark to an at sign to whatever's down the
row because it's easier for us to remember. Creating these unique
items that are hard to break also ensures that you're not victimized down the
line. Yes, rotate them, but pick complex ones. Don't memorize them. Put

(26:28):
them in, like, a password manager or somewhere where you can reference them when you
need to. And then the other element, too, is to stop
and kind of exercise caution when something doesn't feel right.
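The "pick complex ones, don't memorize them" advice above is exactly what a password manager automates under the hood. A minimal sketch of the idea, using only Python's standard `secrets` module; the alphabet and length here are illustrative choices, not anything recommended on the panel:

```python
import secrets
import string

# Illustrative character set; adjust to whatever a given site allows.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password with no human pattern."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One fresh password per account: a breach at one service then
# reveals nothing reusable about any of your other accounts.
site_password = generate_password()
```

The point is less the code than the habit it encodes: randomness comes from the generator and memory comes from the manager, so no guessable pattern, like swapping a trailing exclamation mark for an at sign, survives across accounts.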
Social engineering is a thing that people hear about. And I sat through a talk.
I remember this very vividly. I was in front of, like, 1500 insurance folks, and
I was talking about social engineering. The whole talk was about social engineering. And someone

(26:49):
interrupted me to talk and was like, that's not possible. I would never fall for
that. Like, you can't engineer me to do something I'm not supposed to do. And
my response was, do you have children? Because if you have children, I
guarantee you you've been social engineered. Kids are great at that. That's what they
do to get what they want. And we all do it in innocuous
ways. Stop and think when pressure is being exerted and it feels like an

(27:10):
urgency is being created. Stop. Think about what you need to do. If you're not
sure yourself, have somebody you can phone to get a second opinion. Right? I
do this a lot with my family members, who aren't super tech
savvy: you got something, give me a call, shoot me a text message. I'll
give you a sanity check before you make that decision. And is there an app
that would automatically rotate your passwords? Like, has

(27:31):
technology not come up with some kind of biometrics or something
that protects us? There's always some trade-off, generally. Right. Security and
usability are kind of like a linked dimmer switch. Right. The more secure
something is, the less usable it generally becomes. And vice versa is also
true. So there are technologies that exist. Some sites
and platforms and applications do support it. Right. I would encourage, if any of

(27:54):
you use, like, Gmail or any of the popular mail
providers, use the multifactor authentication or the passkeys, because they are
considerably more secure. There are other alternatives, but
they're taking time to be adopted in a secure way. Just because, you
know, we say, hey, use a passkey or use a certificate, it becomes a question
of storage. Right. Where are you going to put it? Right. If you put it

(28:14):
somewhere where someone else can steal it, it's no safer than having a password.
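The "mathematical challenge" idea behind passkeys can be sketched as challenge-response authentication. Real passkeys use public-key signatures held by a hardware chip; this stdlib HMAC version with a shared secret is only an illustrative stand-in for the core idea that the secret itself is never transmitted:

```python
import hashlib
import hmac
import secrets

# Secret enrolled once with the server. In a real passkey this is a
# private key sealed in a hardware chip that never leaves the device.
SECRET = secrets.token_bytes(32)

def issue_challenge() -> bytes:
    # A fresh random challenge prevents replaying an old response.
    return secrets.token_bytes(16)

def respond(secret: bytes, challenge: bytes) -> bytes:
    # Prove possession of the secret without ever transmitting it.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(secret: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = issue_challenge()
print(verify(SECRET, challenge, respond(SECRET, challenge)))  # True
```

An eavesdropper who captures the challenge and response learns nothing reusable, because the next login uses a fresh challenge.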
So we're just now getting to the point where we have computers that have hardware
that's designed to kind of keep this stuff on a chip that makes it extremely
difficult, if not impossible, to extract. And you can effectively
prove that you have some of this material by doing some mathematical
challenges. But it'll take time for things to get up to speed. I'm going to

(28:35):
add one more point, actually, that I think people often overlook. If
I asked everyone... actually, I'm going to do a quick little poll,
if that's okay. If you had to think about
like your most important, like, account that you have digitally, right. How many of you
would say it's your banking account, like your banking client card password? Yeah, almost
everyone. Right. What happens when you forget your banking password

(28:56):
or your password for virtually anything online? You go through that
password recovery form that asks you for your email
and then you get a link sent to your email to reset it. So, yeah,
the most prized credential that you have is your email.
Because your email is often used to reset every other account that
you have linked to your identity. So yes, banking credentials are super important. I'm not

(29:18):
saying they're not. But also think about the protection for your email because that's
often a mechanism that's used to link back. And if you're not savvy with that, a
breach of your email can suddenly lead to someone just programmatically
going through all your messages. Oh, they bank with this bank or they use this
service. I'm going to see if they use the same password on that service or
if they don't, I'm going to reset that password, delete the email before you read

(29:38):
it and gain access to it. Margot, why don't you give
us a little bit of advice that the birthday girls should be thinking
about, or advice for us to be thinking about with our children or grandchildren? You
spoke about how we change this. How do we change the
trajectory, especially amongst young parents and young children?
I really equate it to what Mothers Against Drunk

(29:59):
Driving did to change the culture of drinking and driving.
It took time, but it was a collective effort
and they did it through education and awareness
en masse, which has really caused a significant change
in that culture, when you think about it. I'm thrilled to
announce that I'm working on a curriculum right now with social emotional

(30:22):
learning, grades three, right through 12. We have
to educate young people. I'm not anti technology,
but we have to start to talk about their digital footprint
early. The predators online. When I was writing
the guide, I realized just how insidious it is, from brain
development to mental health to child predators to TikTok

(30:46):
challenges, plus all of the criminal offenses that
impact our children. So when you really see it all in one
book, you say, wow, maybe we should delay. This is the
big challenge about getting the phones out of the schools. It's not so
much the kids. The kids are actually happy once the rules are
in place, because it's a lot of pressure to have a phone. It's the

(31:08):
parents that are calling the office to say, I want to get a hold of
Jimmy and find out what time I pick him up for soccer. How do you
think history will judge the social media
barons? Because there's a handful of them that are really corralling so
much of the wealth in the world. How do you address that as part of
it? Society has to put more pressure on social media
to say, as much as you like monetizing that data that everybody

(31:32):
excretes, that we've got to find a way to push back. It's a massive,
massive concern. When you saw all of those people supporting Trump,
you know, all of the SOS parents that have lost children to
social media and, you know, died by suicide and all,
all of these horrific things that have happened. And they're on Capitol
Hill and they're lobbying for laws, and we aren't

(31:54):
even close. Legislation is never going to keep up with technology.
So to your point, we have to hold big tech accountable, but
big tech, it's so political. I'll add just a couple of points to this because
I think just having some basic conversations with people to
understand that, like, we all kind of know in the real world, there's no such
thing as free lunch. Like, everyone's heard that one before, right? That is doubly

(32:15):
true. On the Internet, there's no such thing as free products. Gmail's not free.
Anything you use, it's not free. You pay for it with your data.
That data is far more lucrative than you spending $4 or
$5 a month, right? So when we talk about kind of tech billionaires and you
know, some of them almost cresting trillionaire status at this point, or at least
at one point they're building algorithms and they're hiring some of the smartest

(32:37):
scientists in the world to solve an optimization problem. The problem is to
get someone to use the platform for as long as possible. The longer they use
the platform, the more ads they're served, the more they interact with, the more you
can monetize their behavioral information as well as
directly monetizing their time. Right. So when you start to think about
it from that perspective of like it's not free, you're paying with your

(32:59):
data and we start to feel the effects of that more,
I think we'll start to think about what this means and how we instrument
laws to protect against this and to regulate what is and isn't okay.
I think a big sort of landmark incident that happened a couple months ago. I'm
not sure if anybody read about this, but GM was found selling some of
their car driving statistics to an insurance company in the States that was starting

(33:22):
to raise rates for people that were perceived to be driving dangerously.
And no one ever read the end user license agreement when they bought their GM
car and got the connected drive, and understood that they actually were
giving GM the right to sell their information to third
parties for their own profit on top of selling them a car. Then that
insurance company buys that data and goes, oh wow, Bob, you drive double the speed

(33:43):
limit all the time and you're driving like a maniac through school zones.
We're not going to insure you or we're going to charge you double the amount.
Suddenly people start to feel it being a little bit more real. So
Colin, my last question, and then I'll open it up and take a couple of questions
from the audience. But you know, if anybody saw Game of
Thrones, they're trying to hold the Wall in the north, and they all start climbing over
that wall, and it just seems like there's no way we're ever going to

(34:07):
counter this. How do you keep the morale of our police
force, which is the bedrock of society? How do you keep the
morale up knowing that exponentially this crime is increasing?
And at the same time we do have this revolving door, you know.
The motivation is in the people that you can help. So even though there is,
you know, overwhelming amount of cases, the ones that you can really

(34:30):
make an impact on and help get money back, or, like we said with that
horrible scenario of someone losing their home, where you can intervene beforehand.
It's those ones that mean a lot to the officers that are
doing it. We're doing this job, you know... we're not doing this to
make money and get rich. It's because we genuinely want to
help people. So that's the big thing with the officers,

(34:53):
you know, my job as a leader is constantly reminding them. It seems
overwhelming, but you guys are making an impact every day that you come here,
every day that you work, and you are making a positive impact on
people, which is what they really want to do.
Let's open it up to three questions. So go ahead.

(35:14):
What's next in cybercrime? With the introduction of AI,
we're probably two to three years away from deep fakes being
commoditized to the point where it's virtually impossible for the average person
to tell that they're not talking to the person on the other end of the
line or on the other end of the phone call or the video call because
it looks like them and it sounds like them. That might sound controversial, but I mean,

(35:38):
like two years ago we had nothing and now we have very convincing stuff. And
if you look at even what's open source, let alone what's available like commercially
or on underground markets, like especially celebrities that have a lot of
video or audio content on them, it's incredibly real.
So we're on the precipice of what I call, or
I consider to be an identity provenance crisis, right? So how do you

(36:00):
prove in three years time that you are who you are when you're speaking to
someone and you're not face to face? And then the question becomes, how do we
ensure the material that we create digitally can be authenticated? Right?
So there is some conversation that's being had about this in hardware
manufacturers of cameras and other companies that are looking at how
do you confirm with cryptographic proof that something was

(36:21):
generated? But we have to remember that misinformation travels faster on
social media than real information does. And in a world where
we can't tell the difference between real and fake, imagine the impacts not
only economically but socially, for being able to spread
misinformation that reaches millions of people that no one can prove is real or
fake through any kind of immediate mechanism. So I think

(36:43):
that's the big thing that's on the horizon with regards to the cybersecurity
challenge in the space of AI. One thing that you can do today is
develop a family safe word, for exactly the reasons that you've just
indicated. It's so important that you have a family safe
word. So if you do get that phone call and it seems a little off,
you can have that word that can keep you safe. That's something you can do

(37:05):
today. With how fast AI is
growing, is humanity at peril? A little off topic, but
it's a great question. As someone who's deeply passionate about like
privacy and the digital world, we often
think about this stuff when it's far too late, right? Like we talk about AI
ethics when that ship has sailed, right? We talk about how we're going to regulate

(37:26):
social media when it's been unregulated forever, right. I don't know that
we're quite there. You know, wouldn't it be great if we lived in a world
where all of our basic necessities were met and we got to pursue our passion
projects and the things that we wanted to work on to enrich our lives and
the lives of others around us? But there's also obviously potentially a
negative of that. And it's, you know, you have to be pretty ignorant to not

(37:46):
realize that there's a lot of wealth being concentrated in spaces that control this technology.
The interesting thing that I'm most concerned about because I'm very involved in AI right
now is our youth without purpose, which is what a job gives
you, without a place to go, socialize,
learn civil skills. If AI takes away those bottom rungs of the
ladder, and then the next rungs, we as a society are going

(38:08):
to have a massive problem on our hands. Because if you study history, civil unrest
leads to anarchy. Another question here. Yeah.
With so much deepfake audio and video out there on the Internet
and in our social media, how do we deal with what's real
and what's fake? You know, it's challenging
because it is a sociological, a mental problem, right?

(38:30):
People feel like, and they're conditioned this way and they're coached
this way: hey, people may try to stop you, they may intercept you, they
don't want you to succeed, they want to keep you in the system. That's often
what we hear being said, right? So it is
a challenge. This is where we're thinking about this digital provenance and content
provenance, right? How do we very, very quickly give somebody a

(38:51):
capability to see: I'm looking at this. Is this really published by RBC, or is
this really RBC's CEO? Or is it really Mark Carney, or whoever else? Because
the level of chaos and manipulation that someone can enact through this
deepfake is tremendous. It's unprecedented, I think.
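The content-provenance idea Milos describes can be sketched as a sign-then-verify flow. Real provenance schemes, such as those hardware and camera makers are discussing, use public-key signatures so anyone can verify; this stdlib HMAC sketch, with a hypothetical publisher key, just shows how any tampering breaks verification:

```python
import hashlib
import hmac

# Hypothetical publisher signing key, for illustration only. Real
# provenance systems use public-key signatures so that verification
# does not require sharing the secret.
PUBLISHER_KEY = b"hypothetical-publisher-key"

def sign_content(content: bytes) -> str:
    # Sign a digest of the content rather than the raw bytes.
    digest = hashlib.sha256(content).digest()
    return hmac.new(PUBLISHER_KEY, digest, hashlib.sha256).hexdigest()

def verify_content(content: bytes, signature: str) -> bool:
    # Recompute and compare in constant time.
    return hmac.compare_digest(sign_content(content), signature)

statement = b"Official statement from the publisher."
tag = sign_content(statement)
print(verify_content(statement, tag))               # True
print(verify_content(b"Tampered statement.", tag))  # False
```

The point of the sketch is the asymmetry: producing a valid signature requires the key, but detecting a forgery requires only a single comparison, which is the kind of immediate check the panel says is missing today.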
Just to wrap this up. I'm so honored that we have people

(39:11):
like this serving our country, because good and
evil, it sounds like evil is starting to run the table. And the only
counter is people like you, who find a way against all odds:
7,000 cases, only 10% reported. Margot, what you're doing
with your career is successful as well. You said, this is going to be my
calling. I'm going to make a dent in it. The fact that you could probably

(39:33):
hack the Pentagon and the fact that you're doing it and saving us, it
gives me hope. But I think the only way we're going to continue to give
them inspiration and motivation is if we listen to what they say in terms
of our passwords, and counter it by talking to our elderly,
talking to our kids, and really having these conversations,
because if we don't lead by example, I think that this thing will

(39:55):
collapse, because there's a lot fewer of them than there are over
there. So a big round of applause for this fantastic panel.
Once again, a special thanks to RBC for supporting Chatter that matters.
It's Tony Chapman. Thanks for listening and let's chat soon.

(40:48):
When you get that thought process of like, hackers and cyber criminals are like people
with hoodies in their basements. They're not.