All Episodes

December 13, 2025 56 mins

Former detective Graeme Simpfendorfer spent 27 years investigating murders, armed robberies and serious crimes. What took him months of work can now take seconds - thanks to pioneering AI technology. From identifying getaway cars instantly to tracking suspects in real-time, AI is revolutionising criminal investigations. Graeme joins Gary Jubelin to share the incredible potential and the serious dangers of this brave new world.

Listen to Graeme Simpfendorfer’s earlier I Catch Killers episodes - part one here and part two here.

Want to hear more from I Catch Killers? Visit news.com.au.

Watch episodes of I Catch Killers on our YouTube channel here

Like the show? Get more at icatchkillers.com.au
Advertising enquiries: newspodcastssold@news.com.au 

Questions for Gary: icatchkillers@news.com.au 

Get in touch with the show by joining our Facebook group, and visiting us on Instagram or Tiktok.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
The public has had a long held fascination with detectives.
Detectives see a side of life the average person is
never exposed to. I spent thirty four years as a cop.
For twenty five of those years, I was catching killers.
That's what I did for a living. I was a
homicide detective. I'm no longer just interviewing bad guys. Instead,
I'm taking the public into the world in which I operated.

(00:23):
The guests I talk to each week have amazing stories
from all sides of the law. The interviews are raw
and honest, just like the people I talk to. Some
of the content and language might be confronting. That's because
no one who comes into contact with crime is left unchanged.
Join me now as I take you into this world.

(00:46):
Besides using the phone to record a conversation with a
person of interest which resulted in me being charged, I've
never been accused of being an early adopter of technology.
But the technology landscape is changing rapidly in the form
of artificial intelligence, and it's going to impact dramatically on crime.

Speaker 2 (01:04):
Now.

Speaker 1 (01:04):
I don't want to be left behind, so that's why
I reached out to former detective and previous guest on
this podcast, Graeme Simpfendorfer, to have a chat. For the
past few years, Graeme has made it his business to
understand AI and see how it can be used to investigate
crime and also prevent crime. Graeme is the CEO of Perigon Investigations

(01:25):
and specializes in using AI in the fight against crime.
Now we can bury our head in the sand and
pretend things are not going to change, but they are.
Have a listened to what Graham has to say. I
think it will become obvious that the utilization of AI
is going to be the biggest change in criminal investigation
since DNA technology came into play in the early nineties.

(01:47):
I found the chat with Graeme fascinating and informative. If
you don't want to be left behind in this brave
new world of AI, have a listen to what he
has to say. Graeme, thanks for coming on I
Catch Killers.

Speaker 2 (02:01):
My pleasure, Gary. It is good to see you.

Speaker 1 (02:03):
But I've got a confession to make: I'm a little
bit nervous, because when I talk technology, the amount of
information I don't know can become blatantly obvious, and I
don't want to come across as a dummy. So I
give you full permission not to laugh at my questions,
but just explain it in the most simple terms so

(02:24):
I can understand it. Because when it comes to technology,
I think if I can understand it, other people can
understand it. Exactly right.

Speaker 2 (02:32):
Yeah, I find the best way to explain it to
others is as if I'm explaining it to my mum,
because she still struggles to turn on the computer. Yeah,
but I think so many people are in that position.
It's changing so rapidly, everyone's finding it a bit hard
to get their heads around.

Speaker 1 (02:45):
So yeah, yeah, well it has changed so rapidly, and
when we talk rapidly, we're talking every sort of six months.
We're having conversations that six months ago you wouldn't even
have comprehended, would you?

Speaker 2 (02:55):
Yeah, exactly right. And I think that's probably one of
the biggest challenges we are facing in all parts
of society: how rapidly it's changing for us that
are stuck in the middle trying to understand it, while
the kids, the next generation, are picking
it up so quickly and bringing it into their employment already.
There's a lot of challenges, but it's also exciting at
the same time.

Speaker 1 (03:13):
Well, I think you need to. I half joke,
because I usually get dragged into technology; I don't embrace
change too readily. But I'm certainly aware of the situation
enough now: you've got to get ahead of this or
you're going to get left

Speaker 2 (03:28):
Behind, exactly right. And look, we've seen a bit of
that over our time, both in the cops and just
in life, where we've seen all these massive changes: the
introduction of computers and emails, how business is done
electronically, tapping your phone to pay. These things were
unheard of twenty or thirty years ago, and now they're just

Speaker 1 (03:46):
Part of everyday living. I remember the days, and
you might too. When I first started, it was manual typewriters,
and then when the electronic typewriters came in, there were
a couple of old detectives asking why they would need
an electronic typewriter. And then these stupid things, computers, came
in, that would never catch on.

Speaker 2 (04:05):
That's right. Yeah, it doesn't feel like it was that
long ago, and probably in the scheme of things it wasn't.
But that highlights, I guess, how rapidly this is changing,
and it's another technological change that's going to happen really
quickly in the next couple to five years,
and we may look back in ten and almost laugh
at some of these early days.

Speaker 1 (04:20):
Yeah. What have you been doing since you've been out
of the police? You spent twenty seven years in the
Victoria Police working as a detective. How long have you
been out, and what have you been doing?

Speaker 2 (04:29):
Yeah, well it was actually four years ago last week,
so that's gone really quickly. Took me a couple of
years to really find myself again, and that identity that
we've probably spoken of many times, but I eventually got
around to starting my own company, Perigon Investigations and Consultancy,
and in that space you're looking at risk and
security for the private sector, but also some private investigations

(04:52):
on that higher end scale of finding people, finding people
that don't want to be found, thanks to some
previous stuff with, obviously, Channel Ten.

Speaker 1 (05:01):
And just for the listeners that don't know, I'm
also sitting opposite a TV star, in part because after
you left the cops, you
were on the show called Hunted.

Speaker 2 (05:13):
Yeah, that's right. Hunted came to our screens a
couple of years ago now, and that's what led me
to part of this work. The people I met
through Hunted are now who I work with and collaborate
with in the private sector. So it's amazing how these
doors open up.

Speaker 1 (05:26):
And the premise of that show: it was a reality
TV show, but it was based on the ability of
law enforcement to track down people, ordinary citizens, that
were on the run.

Speaker 2 (05:37):
Yeah, well, we did that just about day in and
day out in the job, but here regular citizens thought
they'd give it a go, try and avoid us. They had
twenty one days to try and avoid the team of
hunters for the reality show. A lot of fun,
but just amazing.

Speaker 1 (05:50):
Did that upskill you in what was available? People often
assume that law enforcement is at the cutting edge, but you know,
when technology changes, sometimes law enforcement is a little bit behind.

Speaker 2 (06:00):
It sure did. The team that was put together on
the cyber aspect covered everything to do with
cyber, whether it was online, open source, dark web, spoofing
on different machines and trying to pretend, or trick the
people at the other end as to who they were.
It was a whole new world to me. And like
we've just said, I had to learn, and learn really
quickly, and that's obviously led me on to what I now

(06:20):
do in the private sector with Perigon.

Speaker 1 (06:23):
Okay, you've been on the podcast before and we talked
about your career, but just for people that
haven't listened to that episode, just give us a bit
of background on your policing career
in the Victoria Police.

Speaker 2 (06:35):
Yeah, for sure. So, just over twenty seven years in
the end before I called time on that. Went through
all the ranks as usual, up from working the van
in the streets of Melbourne and then finally into the
crime squads, with a little bit of time at the homicide,
robbery and sex offence and child abuse units. In two
thousand and nine I made the move up to the bush, up to

(06:57):
Wodonga on the border with New South Wales. Ran
the detectives' office up there for about the next ten
years, through Black Saturday, Black Summer and all the different
events we had up there, and that's where I
still am. Yeah, I love it.

Speaker 1 (07:09):
Okay. Well, we won't revisit your police career, but if
people want to hear about it, you had
some fascinating cases and a lot of stories to tell,
as career police tend to carry with them.
Another thing that you are doing is you're very much
involved in advocacy for veteran police. Yeah, what motivated
you to do that?

Speaker 2 (07:31):
I think, as I sort of alluded to before, that
first couple of years out, for me, I was struggling,
and it was around those COVID times, or post COVID,
where you're just trying to find your identity again. People
would ask, what do you do for work? And I
struggled to come up with an answer for how I
felt, because, you know, I used to be a police
officer, or... I'm not sure what.

Speaker 1 (07:50):
It's funny, that, because that is your identity, and
I think that's what a lot of people struggle with,
because you go through life and, regardless of everything else
you do in your life, you're identified as a policeman.
That becomes your identity. So I do understand what you're saying.

Speaker 2 (08:05):
Exactly right. So, you know, that journey in that
couple of years was pretty rough, you know, some
highs and lows, but I came out the other side eventually.
With that, I then saw so many mates, and you
would too, that are going through the same thing and
almost the same issues: that loss of identity,
you know, what's my purpose now? Because
your purpose was to serve your community, serve your state,

(08:26):
and you had your team and, I guess, your tribe
around you. And when that's gone, who am I now?
So I saw so many people struggling with that. And
it was around the same time as Police Veterans Victoria
kicked off sort of a new initiative to fill that gap,
because if you didn't go out on WorkCover, or
if you didn't go out physically or mentally injured, well, you
sort of went into a different bucket, a no man's
land, of... you're just gone.

Speaker 1 (08:47):
Yeah, so there's no contact.

Speaker 2 (08:48):
Yeah. So Police Veterans Victoria was starting, and I think
a lot of states are doing a similar thing, so
I started to work with them around helping people that
were getting out. So we now either help those that
are transitioning out, or we help sort of the very
senior people in their later stages of life that are
struggling, and they can connect with us because we understand,
we understand what it was like to work in the job,

(09:11):
and we just help them through life and the simple
things that they might be struggling with: finances, or struggling
with relationships. And we just sit down and have
a coffee with them and have a chat. So I
really enjoy that.

Speaker 1 (09:22):
I think it's worthwhile. And I speak with
a lot of military guys, and you know, they spend
twelve months going through their basic training, turning them
into soldiers and becoming part of the military, and then
when they leave, they're just cut off. Same with police:
we go through the academy and they break you down
and roll you out as a cop. Wouldn't it be

(09:44):
good, and I don't think it would be that expensive, if
in the last couple of months of anyone's career, if
they're going out in their own time, they did a
decompression, basically, where they helped them with the little things
that you need to be across after

Speaker 2 (10:00):
You've left the police. That's right, all those things about
just starting your own business, or what's that next step,
whether it's another full time role or you start your
own business. Just all those little things in life that
are outside the job can be a lot for some people.
So yeah, it's needed. There's a gap there for sure,
and there are some good programs in that space; Beyond
the Badge and the police veterans do

(10:21):
a lot of work, but there's more that can be done,
that's for sure.

Speaker 1 (10:23):
Well, good on you for putting some effort in there.
And I know, in the conversations that I've had with
you, too, you were touched by the murder of the
two police officers down in Victoria recently. You knew one
of them very well.

Speaker 2 (10:35):
Yeah, well, that's the area I worked in, so,
you know, the Wangaratta area, we would work
with them almost weekly, if not daily, working alongside
some of those guys. It was very sad to
watch, and that's part of that, you know, grief
process as well. I had a
real sort of personal struggle with it: I'm out, but
I want to help, but I've had to step

(10:55):
away, so how can I help in other ways? And
it was just sort of a few messages here and
there to people saying, you know, if you need to
have a chat, I'm here; do your best, but look
after yourself as well. So that one was really close
to home, and will be for obviously a long

Speaker 1 (11:07):
Time. Because that's the type of situation, especially in a
small community in a rural area, that would hit very hard.
Everyone knows the people.

Speaker 2 (11:15):
Yeah, exactly. Everyone knew someone, and the people that
responded, right from ambulance to police to the community,
the pubs and just the local businesses that were affected
by that search, still are. Yeah, it's really tough.
And once that noise sort of stops and things sort
of start to die down a little, that's where the
real struggle can really come in, and that's when people's
mental wellbeing needs to become the focus.

Speaker 1 (11:38):
It's just that stage: you're going through the grief, and
then the focus of dealing with it, and then all
of a sudden it goes quiet and you're left to
your own thoughts.

Speaker 2 (11:46):
Yeah, that's a dangerous time. So that's the time I
like to step in a little bit more, get a
bit closer: how are you really going, how are you
doing, because that noise has stopped.

Speaker 1 (11:54):
Okay, all right. Well, AI is the topic of discussion today,
and quite ironically, I've used AI to get a description

Speaker 2 (12:03):
Of what AI is.

Speaker 1 (12:04):
So I'm going to read this out, just for those
that don't know AI; this is the
way AI describes itself. AI, or artificial intelligence, is the
ability of computer systems to perform tasks that typically require
human intelligence, such as learning, problem solving, decision making, and
understanding language. It is a field of computer science that

(12:28):
uses algorithms and data to enable machines to perceive their environment,
learn from experiences, and make decisions or take actions to
achieve goals. AI can be found in everyday applications like
virtual assistants, recommendation systems and advanced web search engines. Okay,
that's AI's definition of AI. There's a lot there, and

(12:51):
my thinking is that's describing what we as humans do
with our brains, exactly.

Speaker 2 (12:56):
Yeah, and that's, I guess, the artificial intelligence piece in AI.
But it's only as good as what it's developed
on. So, as it sort of alluded to, it's
doing large analysis of data, or problems, that the human
brain would take hours and hours and hours,
if not days or weeks, to do, and it'll do
it rapidly, in seconds.

Speaker 1 (13:18):
Well, using a basic example, if we keep it related
to a crime or a court matter: case law, a
particular type of case law and precedent, is very heavily
relied upon in court. What's a statute? What are the
proofs of a certain offence? All that is readily available
through AI, that's right. And

(13:40):
before, it would be a case of, we all had
those old law books that we'd keep in our office,
and have the detectives looking through case law trying to
work out what way to approach an investigation or present
a matter at court. It's incredible what's readily available at
the fingertips.

Speaker 2 (13:57):
That's right, and it's so readily available. But then it's a
matter of also then interpreting that to your case or
your particular set of circumstances. So yeah, it's great to
be able to draw that data in from so many
volumes. The back walls were filled with those books
that we talked of; where would you go to find that?
Whereas the AI can take you straight to the piece that
you need, that piece of legislation or the case law,

(14:20):
and then you, as the human element, can apply that
to your case or your situation. So it just saves
you so much time. And I actually swear by it now;
obviously now that's part of my world.

Speaker 1 (14:30):
Well, like going through the detectives' course, and I
would imagine it would have been very similar for you,
a lot of it was learning, virtually parrot fashion, the
Crimes Act. Yeah, the proofs of the offence, that
type of thing, just drummed into you, so you knew
what it was. Now that's readily available.

Speaker 2 (14:48):
Yeah, no excuse now to miss an element of a
charge or a point of proof to prove a case.
It's all there, ready to go. And I guess that's
part of where AI could be used to assist going
forward in that next step of what does this look
like for investigators?

Speaker 1 (15:04):
So when we talk AI, I think you can
have the conversation on the street now and ninety percent
of people know what you're talking about and have
views in some form. But when has that sort of
come into favour?

Speaker 2 (15:19):
I think in the last couple of years everyone has
really sort of embraced the idea, because it's been everywhere,
whether it's, you know, your ChatGPT to help draft
things or research things. We've seen it within the schools;
I think my sixteen year old daughter will use
ChatGPT now and again for some stuff. But
you know, I think we've really seen, in the last

(15:39):
couple of years, people starting to obviously lean
into it a bit more and scratch a bit deeper
than just saying, oh, AI is everything. AI can
be everything, but actually you need to look at:
what is the problem, or what is the issue, I'm
looking to solve here? What do I want AI for?
That's probably the key point, because, you know, you can't
just say, let's just bring AI into investigations. Well, what part of

(16:02):
an investigation do you want to look at how it
comes into play? Yes. So that's the broader question that
needs a lot more thought. And there's so many elements
to it, yeah, which I'm sure we'll go through.

Speaker 1 (16:12):
Someone brought this up, going back probably twelve months or
so, someone sitting here, when we were talking about how
AI could be used, and there was a case where
the courts rejected it. Because, you know, and I can't
remember the specific offence, but a police officer had to
have reasonable cause to stop and detain a person. So,
what was

(16:36):
acting on my mind at the time? It wasn't facial
identification, but something came up from a computer system, and
the police officer was being cross examined by the barrister,
breaking it down: so you didn't have reasonable cause. And
he broke it down to the point where the officer
said, well, actually I didn't,

(16:56):
but I had it based on that. Then the matter
was thrown out.

Speaker 2 (17:00):
Yeah, well, yeah: at what point did you form your
opinion or your belief? Well, I didn't form

Speaker 1 (17:04):
It; what was it based on? Yeah. So, ChatGPT. I've
embraced it in the past, I'd say, twelve months. But
explain to people what that is.

Speaker 2 (17:15):
Yeah, well, ChatGPT is, I guess, a whole large
system of data that's worldwide. It's not Australian,
so it obviously has a bias towards the Western world,
and not particularly Australia. But you can ask it anything.
You can ask it to write you a script for
a book or a movie, write you a novel,

(17:35):
write you emails, or ask it about the pyramids. It's
just a large data set of all the information, basically,
that draws off the web, or whatever data is actually
put into it. So yeah, it's very broad, but you
can just about ask it to do anything or produce
anything: produce a photo, produce a recording, anything. The
sky's the limit.

Speaker 1 (17:56):
With my reluctance to use it, I pay homage
to my son; he dragged me into the world
of using ChatGPT. I argued that it feels like
it's cheating. Like, say it was to do a report,
a report on something, it might be a report to
a council about something. He's gone, you can do that
on ChatGPT. And I'm like, what are you talking about?

(18:17):
I know how to do reports. And he posed a
hypothetical question, we put it to ChatGPT, and I'm
reading it: Jesus, that's done it better than I could.
So I argued back: yeah, but
that's cheating. Like, I want to do it, I'm capable
of doing it. And he said these words
to me, and it sort of resonated with me and

(18:38):
changed my thinking on it. He said, when you went
to university and did your degree, well, you were relying
on textbooks to get information. And then, as it evolved,
you were relying on the internet to get information to
help with it. Well, this is just the next step;
it's evolved. So if you're not going to embrace technology,
you stick with your textbooks, and good luck

(18:58):
finding them anymore. That is how
it's evolved, isn't it? It's something that we've got to embrace.
There's no ethical or moral issue against using the available technology.

Speaker 2 (19:09):
No, I don't think so, and I think it's time
saving. For me, I embrace it almost every day. And
as you said, it's writing either a report or writing
an email; it could be absolutely anything. But it's that
time saving: yes, you can write it, but it spits
it out to you in a format that you can
just about cut and paste into your email or your
document. But you still fact check it. I think some

(19:31):
of those issues where people have been caught out is
these large reports that might be a couple of hundred pages.
You've got to have that human lens across it, to
check it and put in your own style. I
know how I like to write and draft things, and
it depends on who your audience is. If you know
that person, you add your personal lens, or if
it's to a council, then you add your facts. But you're

(19:52):
just fact checking, and it just saves so much time.
So that report that you probably would have written yourself
may have taken thirty minutes to an hour or so,
just to really get your

Speaker 1 (20:02):
Head around the structure. Because that's the slow part:
once I've got the structure in my head, or I'm
seeing it on paper, yeah, I know what I've got
to address, and that's what that provided. I've
used it and tested it. I was looking at
a particular investigation and looking at the people involved, and
it came back with the wrong names. Like, I knew

(20:23):
the investigator involved, or the detective involved, and I asked
the question, it came back, and I corrected it and
said, well, that person wasn't even involved in that case.
And it came back: oh, we're sorry, we'll correct that.
So there are failings of it.

Speaker 2 (20:37):
Yeah, for sure. Yeah, and that's, I guess, my
main point to those I talk to about this: yeah,
you must fact check it. You must look at it
and have a good read of what it is. You
can't just cut and paste and send; you'll get caught
out. And I think there's already been some pretty large
organisations caught out in that space. We won't talk about
that, but yeah, it's a tool, and I think the more efficiently

(20:59):
and effectively you can use that tool, the more value it adds.

Speaker 1 (21:02):
That's what I'm finding. And you talk about people
getting caught out; I think there was a barrister that
was making closing submissions and citing cases, and it was
identified that those cases don't even exist. So clearly he
or she embraced AI to present a closing argument,

(21:24):
which I'm sure it could do very well. But you
do need to fact check it, don't you? You do.

Speaker 2 (21:29):
Yeah, that's right. Otherwise it's very embarrassing, not to
mention the implications. Yeah, that's pretty embarrassing professionally. But
you've got to fact check; you've got to look at
what it is. For me, it's about saving that time,
not having to worry about the structure. Yeah, particularly
for the larger pieces. So yeah, it's a great benefit
to me day to day.

Speaker 1 (21:48):
Let's pick one aspect of AI: how
it could be used in criminal investigations. A large part
of our work as police officers was locating people. Invariably
you're looking for the offender, but it could be the
victim you're looking for, or even witnesses. Facial identification:
where are we at with that?

Speaker 2 (22:09):
Yeah. So facial recognition technology, or FRT, is well advanced
globally. We're a bit behind the eight ball here in
Australia, and we're finding that's the case across the board.
But it's well advanced. It's actually really
accurate these days, but it's only as good as
what you can put into it. So again, like we've experienced,

(22:29):
you get your CCTV back of your drive by shooting,
or your people that have run away from committing their
murder, and you might have some really ordinary footage. That's
the point where AI can hopefully step in to clean
it up, and then, I guess, run that facial recognition
technology across data sets that are already available. And that's

(22:50):
the gap that needs filling. You remember, in the
days, you'd actually put your circular out around
the police.

Speaker 1 (22:57):
It came up on the front counter of the police
stations, in the detectives' office, and

Speaker 2 (23:02):
I think I only ever had one result, because it
was fresh in that investigator's mind: I put that guy,
yeah, in last week; that's this fellow. But out of
twenty seven years, that's one. So you'd ask people to
do that internally, but then we've seen the amount of
times you would then ask the public, because it's your
last resort: we need to find this person. And putting
that face image

(23:23):
out to the public has its own
inherent issues, doesn't it? So you get everyone ringing

Speaker 1 (23:28):
In with the false sightings.

Speaker 2 (23:30):
Yeah, different things: you know, I know that person. And
you end up with another hundred Crime Stoppers information reports
that you have to go into.

Speaker 1 (23:37):
How accurate is it? Like, obviously, a picture of
my face: if I was seen walking down the main
street of Sydney, does it have the capacity that they
could say, oh, if you're looking for Jubelin, he was
in the main street of Sydney

Speaker 2 (23:49):
Hours ago? One hundred percent, yeah. And we talked about
time saving: it's very accurate. And that's what I
mean by globally, it's there.

Speaker 1 (23:58):
What does it go from? The measurements between the eyes,
the shape of the nose, everything. Yeah, so many different
points of identification.

Speaker 2 (24:06):
It's like when you come
through customs; that's the FRT technology piece. But how
you then put that into investigations is the tricky bit,
because you need to know what you're going to be
comparing against: other systems, whether it's driver's licences or your
corrections photos. They're the data sets you want to compare
against. But there's the ethical aspects and the privacy concerns that

(24:28):
go with that. So that's probably the piece that
still needs to be plugged in here in Australia.
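As an aside for readers curious how the comparison step discussed above actually works: facial recognition systems typically reduce a face to a numeric vector (an "embedding" derived from measurement points like those Graeme and Gary mention) and then measure how similar two vectors are. The sketch below is a toy illustration only, with made-up four-dimensional vectors and an illustrative threshold; real FRT systems use learned embeddings with hundreds of dimensions and carefully tuned thresholds.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, candidate, threshold=0.8):
    """Declare a face 'match' if the embeddings are similar enough.

    The threshold is purely illustrative: real systems tune it to balance
    false matches against missed identifications.
    """
    return cosine_similarity(probe, candidate) >= threshold

# Toy 4-dimensional "embeddings"; production systems use hundreds of dimensions.
suspect = [0.9, 0.1, 0.3, 0.5]
cctv_frame = [0.88, 0.12, 0.29, 0.52]  # near-identical face from footage
stranger = [0.1, 0.9, 0.7, 0.1]        # a different face
```

Searching a data set (driver's licences, corrections photos) then amounts to running `is_match` between the probe image's embedding and every stored embedding, which is why it takes seconds rather than months of manual circulars.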

Speaker 1 (24:34):
We've got to protect people's privacy, yeah, yeah.

Speaker 2 (24:37):
And that's the issue: if we're using some of
these systems that are offshore, you're sending an image of
someone, one of our citizens, who may or
may not be your accused, offshore, overseas. Right?
That's why it needs to be more sovereign based; it
needs to be developed here, for here, for our culture,
for our country.

Speaker 1 (24:55):
And what are the dangers of that? You said we
need a sovereign system, and we need police, or law
enforcement, to have the proper authority.
Like, we don't want law enforcement to know every
movement you make, correct? So how does
that work? Do law enforcement need their own system,

(25:17):
or how do we protect that?

Speaker 2 (25:19):
Well, you just led me straight into what
my business is, so thank you; I feel like the
music should play. Look, there's a number of aspects there.
Number one is ethics, and there are nine principles that
the Australia and New Zealand Policing Advisory Agency have given
to all law enforcement across the country. I'm trying

(25:40):
to recall them all off the top of my head,
but ethics, accountability, and so many different finite points that
are exactly what we're talking about: protecting the community's interests.
But there's also the law enforcement side. The community expects
us to catch your murderers, catch your sexual offenders. So
in that regard, it's got to be built, in my view,

(26:00):
here, for here, because we don't want our data system,
particularly a government data system, being offshore. That raises
so many issues, doesn't it: security, national security, individual security.
If it's built here, for here, we've got more
control of it, so it can be independently audited. You
can actually have a conversation with the people that are
building those large models and develop it for our context.

(26:23):
Because if we're putting in a bias that has an
American or a UK lens, then you may be drawing
in UK case law or looking at American systems or
American footage. You need to plug it into what is
Australian, and purely Australian, but you need the access to
those data sets, and that's where the nervousness is. So
I'm already talking with some police forces, and a company by

(26:44):
the name of Maincode, Dave Lemphers and his
team at Maincode, that are building these massive data centers:
incredible investment, we're talking millions and millions of dollars of
investment, to have the data centers here in Australia, which
is step number one.

Speaker 1 (26:58):
And in the data centers, they'd have all that
information captured, and to access it would be by application?
If we were looking for a person that was wanted,
we would track their phone, but invariably you needed authority
to track the phone. You needed approval from the
court; it needed to be signed off at some level.
Is that the type of safeguard you're talking about? Because

(27:21):
we don't want to track you with your phone now.
But if I'm going, where is that Graeme? I need
to speak to him about this matter, and I put
it out there: okay, he was here three hours ago,
he's here right now, and we go grab you. Is
that the type of privacy that you're concerned could be invaded?

Speaker 2 (27:37):
Yeah, exactly right, and that's why it still needs to
have a process, which I'm sure the eSafety Commissioner
is looking at. It still needs to have a process
of authority that it needs to go through, so you need
an independent auditor to look at it. In our scenario, we're
looking at independent auditors to oversee, in fairness, what
we're trying to build. Not dissimilar to, in Victoria, the
Office of Police Integrity, so the OPI would oversee it all.

Speaker 1 (28:00):
We have the equivalent of that here in New South Wales.

Speaker 2 (28:03):
Yeah. So that's the level of trust that people need
to have in any system that's going to be built.
It's very difficult to have that trust if you're going
to build it offshore. So we build it here,
for here, and independently audited by Australians. But in answer
to your question, I guess in my view, crimes against
the person, so your homicides, your armed robberies, are.

Speaker 1 (28:23):
The things that we should be able to access it for.

Speaker 2 (28:25):
Yeah, they're the things that I think our community would
expect us to be able to act on. But you
ought to be able to plug into these systems. So
in the scenario, if someone's just wandering down Circular Quay
or wandering down Bourke Street, Melbourne, you need to be
able to plug into those systems, and that's where the
collaboration and partnership needs to come in.

Speaker 1 (28:41):
Are they government systems? Because businesses have their CCTV,
the licensed premises, every business has cameras. Are we
talking about gathering all that information or only government authorized ones?

Speaker 2 (28:54):
Yeah, that's a great question, and I guess it depends
on the appetite for how far it goes. Some
are privately run. Obviously, the City of Melbourne runs the
Safe City cameras in Victoria, but Victoria Police already
have access to that live. Again, it's about doing it smarter.
So if you are looking for someone that's just committed
a very serious crime, we need to know where this
person is now. They've just fired a shot through some

(29:16):
building in the middle of Melbourne. Here's the image, we've got it.
It's just going to expedite things. So rather than trying
to look at phone records or a triangulation that can
be a little bit inaccurate, if you can plug that
into that system, then, using facial recognition technology,
it does have the capacity to find them immediately and
then alert us to where that person

Speaker 1 (29:35):
is. How does this work, then, with people going on the run? People often ask me that, you know,
and you've got the alleged shooting by Desmond Freeman,
where it's been months that they've been looking for him. But the type
of technology we're talking about now is only applicable to
the urban environment, and if he's in the bush, then we

(29:58):
haven't got that advantage.

Speaker 2 (30:00):
No, that's right, and that's why I think they've brought
in, and I'm not that close to it, but I think
they've brought in thermal imaging and different things into that area.
But look, I worked up there for so many years,
and you were doing searches for people that were lost
and they wanted to be found, and we couldn't find them.

Speaker 1 (30:12):
The country's rough and, yeah, desolate, it's.

Speaker 2 (30:16):
Just so vast, so's it's a challenging time. But look,
I'm just so glad that they're continuing that search and
continue to fight to bring into justice.

Speaker 1 (30:24):
Oh well, they're putting in all the resources that it
needs. We need to find out what's happened and where
he is, whether he's still with us or if he's not.

Speaker 2 (30:34):
Yeah, but you raise a good point, and people out there
would, you know, say: it's my right to
walk around the city without being tracked by the government.
I guess the lens that I'm trying to put
on it, and hopefully we get to the point, is that the
systems won't be bothered by you. It's looking for the
person that's just done the drive-by shooting, or committed
the murder, or committed the sexual assault. And of course

(30:56):
everyone agrees we want that person caught. That's the piece
that needs to come in here, not tracking everyone's single movement,
and that's the lens of auditing those systems, to make
sure that they're being accessed appropriately. No different to any
of the systems we've gone through in policing, where, you know,
if you access it for the wrong reason, high chance
you'll lose your job.

Speaker 1 (31:15):
Are we in a situation, with the way AI is
rapidly evolving, that realistically the government, Big Brother,
will know where each and every one of us is
at any time?

Speaker 2 (31:28):
Well, if you've committed a serious crime, yeah. So look,
it's hard to avoid. The more urban we are, the more
you're going to go onto.

Speaker 1 (31:36):
Camera, I think, yeah. I look at my day today:
I come here, and my phone would be pinging off towers.
I go and use a credit card at the store;
they know I'm there. My car goes down the road;
there's images of that. It's pretty much where we're at now, isn't it?

Speaker 2 (31:54):
That's right, it's there now, and that's where AI
will expedite that and bring it all together. Like
the definition at the start: taking large sets of
data, large sets of information, and pulling it all together. There's already
AI out there. You know the old case of
an armed robbery or something: you'll go to twenty or
thirty different places and get all their CCTV. You collect
it all because you don't want it to get lost.

(32:16):
Some systems rewrite after a week or a month, and
you don't want to be two months down the track
and go, oh gosh, they actually dumped the gun in
a bin around the corner.

Speaker 1 (32:25):
You want it timely, and it was time consuming
going through all of that.

Speaker 2 (32:30):
So there is AI already available that can take all
of those thirty different businesses' cameras, which might, for
argument's sake, have ten cameras each. That's three hundred cameras
you've got to go through, and all of that time.

Speaker 1 (32:41):
I remember how time consuming that was. We'd go, great,
we've got footage from ten CCTV cameras, but it would
take weeks to go through.

Speaker 2 (32:50):
And who do we give that job to? Or, you know,
you had to go through it all yourself. And that's
the piece, that's the saving of time, so you
can prevent the next armed robbery or the next
sexual assault, because you're straight onto that person's identity much
faster than you would have been if you were trying to
go through it all yourself.

Speaker 1 (33:08):
The argument, and I've heard this argument when we talk fingerprints,
I heard it when DNA evolved, and
people would go, you know, like there was a suggestion
that fingerprints and DNA should be taken from everyone, because
people assume we've got fingerprints of everyone. We don't necessarily

(33:28):
have fingerprints of everyone; they've got to have come before
the system. With DNA, why not get everyone's DNA
on the day that they're born? Why not get everyone's fingerprints?
Everything like that. And people would often say, well, I've
got nothing to hide, I don't care if people know.
I was of that view earlier, maybe throughout my police career;
I didn't give it a lot of thought. Since I've

(33:50):
left the police, and certainly, you know, the battles I've
had with the court system and taking on Big Brother, it
does sort of, it does scare me. And so while
public opinion will say: who cares, if you've
done nothing wrong, you've got nothing to hide,

(34:10):
there is an aspect of an invasion of privacy that
I'm not comfortable with.

Speaker 2 (34:14):
I've got to agree with you on that. I was exactly
the same through my policing career: if you've done nothing wrong,
what have you got to worry about? Yeah. But now,
on the other side of the fence, I think it's
really, it's more important to me to have that security.
Number one, you know, it's my information, it's my
DNA. Do I want it out there? And, as you say, taking
on Big Brother. And we've seen errors in cases through DNA,

(34:35):
We've seen errors in criminal cases. So it's important to
have the integrity of these systems built right from the outset.
If we're looking at AI, it's about how it's built.

Speaker 1 (34:45):
Well, it's interesting, and that's been the theme of your
discussion: the integrity and the ethics involved in it,
because it is something that could be abused. I think
a lot of us felt aggrieved during the
COVID times, where everywhere you went you had the scan-in,
and that frustrated me. I didn't
like it, because the police or the government was saying, look, it

(35:09):
will not be used for anything other than tracking the
COVID cases. I'd say bullshit, because I know if
there was a murder, yeah, I would get access to
that information. If I was looking for you during COVID
and you'd scanned into such and such a place, we'd
get that information.

Speaker 2 (35:25):
Yeah. Well, we had an example of that on the border.
We're on the border with New South Wales, which
was obviously closed, and we had a shooting in central Wodonga.
We got the car really quickly, because there was not
much footage to go through, and we went, oh, we
know what that car is. We put that information out
to the checkpoint and, bang, got him ten minutes after the shooting.
Different to scanning in somewhere. But Big Brother was there

(35:46):
the whole time, and I was very anxious about that
the whole time, and I didn't enjoy the job at
that time. I think a lot of cops went, you know,
I didn't sign up for.

Speaker 1 (35:53):
this. Well, down in Victoria, and I saw it, I
just got out of the police before the lockdowns happened.
And, yeah, I wouldn't have liked to be a police
officer in those times. I think we were forced to
enforce the law, which is our job and what we
sign up to, but I don't think our hearts would
have been in a lot of it. The nine p.m.

Speaker 2 (36:13):
curfews. Curfews, trying to stop kids from playing in parks,
and, oh my god, don't.

Speaker 1 (36:20):
Go to the end of the beach. All right, well, we
could do a whole other podcast, you and I,
whinging about that. But, yeah, to me, it
showed how power corrupts.

Speaker 2 (36:30):
It's a really important piece of the discussion on the development
of AI and what it means. And that's, again, not
to go on about it, but that's why it needs
to be done here, so we can control it.
That was part of my passion for getting involved
in the first place, because I could see a real
need for it. I can actually see how this
can benefit: saving so much time for investigators, quickly identifying

(36:51):
offenders to stop that second offense, but also helping in
the court process. You know, we've got horrendous waits and
delays through court, and there's the lawyer piece on the other side,
for them to understand all the data and information. There is
a real space for it, but it's got to be
developed ethically and morally, and I feel, what better person to,
I guess, develop it than someone that's been through it

(37:13):
and lived it and knows what's needed.

Speaker 1 (37:15):
Hey, guys, it's Gary Jubelin here. Want to get more
out of I Catch Killers? Then you should head over
to our new video feed on Spotify, where you can
watch every episode of I Catch Killers. Just search for
I Catch Killers video in your Spotify app and start
watching today. Give me a sense of how it changes

(37:36):
the way we approach a criminal investigation. And you know,
we're talking, you only left four years ago, so
it's relatively recent. How do you see the potential for
it to change criminal investigations?

Speaker 2 (37:48):
Yeah, I'd probably give you a real basic case. So,
you know, a drive-by shooting or a murder, if
you will, and a getaway car. You get it
on footage. So, you know, the initial action is always
canvassing witnesses, canvassing for CCTV; it's everywhere if it's urban.
So you do get your piece of CCTV that has
the car driving off at speed away from the shooting.

(38:09):
We've seen it on TV a million times, and you
would have sat around the table in the office talking
about: is that a twenty nineteen Holden Commodore? A VS? A VX?
What type of car is that? You need to identify
exactly what it is. AI will be able to do that,
and that's something we're looking at with Peregrine: being
able to build a data set of vehicles. It's a

(38:32):
known quantity. It's not like a fingerprint, where
everyone has one; there's only so many models of vehicles
in the world. So an AI program developed for law
enforcement would identify immediately that that is a twenty nineteen
Holden Commodore, a certain model, the rims are changed, or
whatever unique aspect of that car there is,

(38:53):
and it's one hundred percent accurate.

Speaker 1 (38:54):
See, that in itself would be beneficial. And you just
brought me crashing back to my time in the Hold
Up Squad, when we were always tracking cars, and we would
go to the manufacturer with this is what we've got. It would
take a long time to get that, and it was
always opinion based. But you're saying with AI, the
right system, bang: this is the car you're looking for.

Speaker 2 (39:14):
Well, it's a fixed data set, isn't it? So, you
know, if you're loading in every single car that's ever
been manufactured, it'll find it, and it'll find it quickly.
We know the power of knowing that piece of
information in an investigation as early as possible is critical. And then
we talked before about if you did not know that
vehicle and you go to the media or the public:
can you help us find this car, or we're

(39:36):
looking for this particular car. If you get it wrong,
you can't undo that.

Speaker 1 (39:40):
No. Well, you had to hold back sometimes, too,
because if you put out the wrong one... okay.

Speaker 2 (39:46):
Well, that's a real basic example, and that can
then change your tactics. So if you've quickly identified the car,
if you've quickly identified what make and model it is,
you may then be able to look at the identity
of that vehicle, whether it's the rego, and plug it in.
Then you can track it across wherever it went
from there. So there's a lot that
can change in that simple example, from just that one

(40:06):
piece of knowledge of knowing the make and model.
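The closed-catalogue matching Graeme describes can be illustrated with a toy sketch. The catalogue entries and "feature" values below are invented stand-ins for what a trained vision model would extract from footage; this is a conceptual sketch only, not a description of Peregrine's or any real system:

```python
# Toy illustration of matching a vehicle against a fixed, known catalogue.
# A real system would use a trained vision model; the feature vectors here
# are invented stand-ins (e.g. shape, size and trim scores from footage).
import math

CATALOGUE = {
    "2019 Holden Commodore ZB": (0.82, 0.40, 0.55),
    "2006 Holden Commodore VE": (0.78, 0.44, 0.31),
    "2019 Toyota Camry":        (0.70, 0.52, 0.58),
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(query):
    """Return the catalogue entry most similar to the query features."""
    return max(CATALOGUE, key=lambda name: cosine(CATALOGUE[name], query))

# Hypothetical features extracted from the getaway-car footage.
print(identify((0.81, 0.41, 0.54)))
```

Because the catalogue of manufactured vehicles is a fixed, known quantity, the search space is bounded and a match comes back instantly, which is the point being made about vehicle identification versus open-ended matching problems.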

Speaker 1 (40:08):
Of the getaway car, okay. I can see the
benefits there. And the old adage is:
get the police out on the streets where we need them.
I see in Victoria, I heard them say that
they're bringing in a policy where they're going to have
more public servants doing work that police have done, to
free up the police, because they're struggling. I think
that's a good idea. Maybe we embrace AI in

(40:31):
law enforcement. Quite often, I found in law
enforcement, and you would have seen this too: when technology
changes, invariably we wait a little bit, and then we're
reaching out to the experts outside the police: can you
give us a hand? It'd be good if police got
on the front foot, if law enforcement got on the
front foot.

Speaker 2 (40:51):
Yeah, yeah. And I think the trigger point at
the moment, without telling government or police
what they need to do, and I lived on the other side,
is you can't do it on your own. One, funding: it's
not cheap. And show me a government that's got millions
of dollars to invest in AI for their own system.
I've been at a few conferences lately around the frustrations

(41:11):
that are shared by very senior police, saying: when is
the corporate world going to help? Because it's needed, and
that's, I guess, where we come in. And it's certainly
not cheap. You know, if there are investors out there that want
to come on board and give us a hand, that'd
be great, but you've got to align yourself with the
right people. This isn't about profit. It's about building something
that will work for our community, to make it safer
and, as you say, make the investigator's time more out

(41:35):
on the road rather than behind a computer putting briefs
of evidence together, going through CCTV, trying to identify a vehicle,
or going through phone records. Even those things, you know,
take hours and hours and hours of work to work through.
People think we have that already. The amount of conversations
I have where people go: don't the cops have this? Sorry, no.
It's, you know, funded through state governments and federal governments.

(41:57):
It's a real challenge. So there are some partnerships out there.
I know Monash Uni and the Australian Federal Police have an
amazing program through the AiLECS Lab: Artificial Intelligence for Law
Enforcement and Community Safety. They do an amazing amount of work
around child exploitation and child sexual abuse material online. They're
using AI to help with that, which is fantastic, because

(42:17):
it protects our kids. But there are more pieces to be done,
and that sort of circles all the way around to:
AI can be anything, but you've got to find out
what is the problem you want to solve. And it's
the same in law enforcement. You've got to pick what's
going to be the most effective and efficient way to
save time, which is the cost of cops out on
the street. And I understand why they're employing public servants, because

(42:38):
the endgame is you want to have a more visible

Speaker 1 (42:40):
presence. And at the risk of losing listeners, because it's
so boring, and I know you would have found it boring:
the admin side of policing, even to the point
of rostering. How much time would that take up? Now,
AI like ChatGPT could work out the roster so
simply, so we could free up administrative things like that.

Speaker 2 (43:01):
For sure. And the cost in labor alone is huge.
You know, I'd hate to think how many
cops could be back out on the street.

Speaker 1 (43:08):
Well, handling of exhibits, tracking. And I assume
Victoria would have been the same as New South Wales:
we upskilled on that, because gone were the days of
the old exhibit book and the where-is-that-missing-exhibit type thing. Oh,
that's right, it was in Graeme's. That's right.

Speaker 2 (43:25):
Yeah.

Speaker 1 (43:26):
So all that should in fact free up police to
do what police should be doing, and that's enforcing the
law and investigating crimes when they'd occurred.

Speaker 2 (43:35):
Yeah, exactly. And that's the point. I guess that's
where we are at, this crucial juncture of: do we
lean into it, do we trust it? You've got
to have the confidence, and partner with the corporates you
trust, to build that. And there's a space where law
enforcement is heavily involved in that, for sure, but you've
got to be able to fund it. And, you know,
that's the big piece.

Speaker 1 (43:54):
It's not cheap, and you need those checks and balances.
Like when DNA came in, I think it
was the early nineties, and it was one
of the early cases that DNA was used in. It confused
the hell out of us at the start. It was a horrific murder
of an elderly lady in her house, and she'd been
sexually assaulted. And we had a good suspect, and we

(44:15):
got the suspect's DNA and it didn't match, and
everything else was pointing to: this is the person. Eventually
we got onto another person through the DNA. There
were no admissions from him initially, but the DNA told
us: this is the person you're looking for, because semen
was left at the crime scene. And it blew me

(44:35):
away that, okay, we can charge a
person with murder based on this information. But that's how
things evolve. Isn't that correct?

Speaker 2 (44:44):
And that's what juries tend to expect now, too: where's
the forensic evidence, where's the CCTV? How did we ever
convict anyone before we had all this? We look now,
in twenty twenty five: if the case doesn't have DNA,
doesn't have a fingerprint, doesn't have an eyewitness identification.

Speaker 1 (44:59):
And if it's scientifically based evidence, it's seen as irrefutable. It
carries a lot more weight with the jury. But we've
got to be careful, because, was it Queensland? The laboratories
there were having a lot of problems, and it didn't
stand up to the scrutiny. That's right. And that could
be the case where we fall down with AI,

(45:20):
if we don't make sure we have the proper standards
and regulations.

Speaker 2 (45:21):
That's right, and that's why it's so critical to build it
purpose built.

Speaker 1 (45:27):
Data analysis and evidence gathering: a lot of the work
on the complex investigations. Particularly when I was in homicide,
and you worked homicide, you are overwhelmed with information that
comes in, and that is the real struggle. You have
to sift through what's important and what's not. You
get a good analyst, and what I considered

(45:49):
a good analyst in my time
was someone that could retain information, could read statements and
pick up little pieces: there was a red car
seen in this witness's statement; someone else mentioned a red
car in that street two weeks ago. These little things.
I could see AI just freeing that up. Exactly.

Speaker 2 (46:08):
It's solving problems. It's understanding the data and understanding the information.
And if you're plugging all that information into your case
and into your system, and you want to know: where is
a red car mentioned, every single time, in every witness
statement or every bit of CCTV footage? It won't
just tell you where it is, it can timeline it
for you. So, you know, trying to put the chain
of events and chain of circumstances of your case together,

(46:31):
when you're tracking someone over a matter
of weeks in the lead-up to something, or post
conviction or post offense as well, it'll timeline all
of that for you, and it will do it in seconds. But then you've
got to validate it, and then you've got to check it. But it's removing
those hours and hours and hours. And we're not just talking four or five hours;
as I think we've said before, you know, it's weeks,

(46:51):
if not months, of putting a brief together.
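The cross-referencing and timelining described here can be sketched in a few lines. The witness statements, names, and dates below are invented purely for illustration:

```python
# Minimal sketch of the cross-referencing idea: find every statement that
# mentions a phrase and order the hits into a timeline. All statements,
# witnesses and dates are invented for illustration.
from datetime import date

statements = [
    (date(2025, 3, 1),  "Witness A", "I saw a red car parked outside the shop."),
    (date(2025, 2, 15), "Witness B", "There was a red car in that street."),
    (date(2025, 3, 2),  "Witness C", "A white van drove past around midnight."),
]

def timeline(phrase, docs):
    """Return (date, source) pairs for every document mentioning the phrase."""
    hits = [(when, who) for when, who, text in docs if phrase in text.lower()]
    return sorted(hits)

for when, who in timeline("red car", statements):
    print(when, who)   # Witness B first (earliest), then Witness A
```

A real system would use fuzzy matching and entity extraction rather than an exact substring, but the shape of the task, pull every mention and sort it by time, is the same.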

Speaker 1 (46:53):
I still remember, and it was a hard time
in policing. When I say hard, it was worthwhile. But
it was sitting in an office for about four months
on my own, going through a brief of evidence, reviewing
an old homicide: three kids, it was a serial killer,
it was the Bowraville one. And reading statements that were

(47:15):
taken in the nineties, that weren't entered on computers, so
you couldn't use the computers to find the
information. Literally going through highlighting this, and then having
to keep my head around the whole investigation so I'm
not missing anything. We didn't have an analyst attached to
the investigation; in fact, I was the only one working
on it. So it was literally sitting in a room

(47:35):
for three to four months, making notes and making sure
I didn't miss any information. And I look back at
something like that now, and that was, it was mine. When
I say mine, I mean it was just.

Speaker 2 (47:45):
Hard work. That's a hard slog.

Speaker 1 (47:48):
And that could be literally done in a matter of minutes.

Speaker 2 (47:51):
Correct, one hundred percent it can, and it can actually then
identify where your gaps are or where you need
to be, if you can train it to think like an investigator.
And that's probably what I'm most worried about: that we don't
then get lazy investigators, that we're not twenty years down the
track going: these teams can't think for themselves, they don't
think outside the box. You've still got to have that skill.

Speaker 1 (48:10):
You brought up something, and this is my silly mind
ticking over. I learned how to do affidavits,
I learned how to type up a fact sheet, I learned
how to take a witness statement. Those were the skills
I believe I had as a detective. They were hard
earned skills. It didn't come naturally; it was trial and
error and learning and experience, learning every day. Now,

(48:33):
to do a set of facts, you could, in fact,
just put in brief information and ask the computer system,
the AI, to present a set of facts for the local court,
or a set of facts for the district court, or:
this is going to the Supreme Court, a comprehensive set
of facts with reference to where the material came from.

Speaker 2 (48:53):
Just blame the pro forma, thanks Gary.

Speaker 1 (48:55):
Sorry, I didn't think of that. But that
is so difficult to do and learn. Now, if I
read a set of facts that came from AI, I could,
because of my experience, look and go: no, that's not right,
we don't need that. But if people don't learn that
skill to start with, what you've just said is a

(49:18):
concern. And that's not just, we're talking law enforcement here,
we're talking detectives, but it's a wider world.
It's across the board, isn't it?

Speaker 2 (49:25):
Barristers, lawyers, you know, it's everyone across the board. But
I feel the benefits far outweigh the negatives, and
I think that's an important piece of moving through to
that next stage of embracing it. But make sure we
don't lose sight of how we then skill our people
to fact check properly, to make sure that it's accurate,
because it will cross reference. So you get one hundred

(49:48):
statements from your homicide brief, and you say to it: homicide brief,
give me those facts and circumstances for the court. It'll
cross reference everything for you, and your exhibits, and timeline
it for you, and put it in the language that the
courts want, that the barristers want. Defense can better understand
the case and your prosecutors can better understand it. So
it should be, in theory, expediting all of those concerns,

(50:08):
and your case conferences are very clear as to what
the points in issue are, and everyone can agree, rather
than going through volumes and volumes, or wheeling trolleys' worth
of evidence into the court. It definitely has its place,
but we need to make sure that people still keep
their skills in, as you said, understanding what the courts want,

(50:29):
what the investigators need, and what the law is. Well.

Speaker 1 (50:32):
To give an example, I'm just thinking, when you were
saying that: it wasn't unusual to get a twenty page
statement in a homicide investigation.

Speaker 2 (50:42):
That sounds like a lot.

Speaker 1 (50:43):
But if you're noting everything down in the finite detail,
or you do an ERISP interview, an electronic interview,
that might take three or four hours. So there's a
lot of information there, a lot of work. And when
I was working on the investigations, I was doing the summary. But
when I was leading the investigations, I was getting people

(51:04):
to summarize them. It might be a fifty page transcript:
summarize this in a very condensed version, highlighting the key points,
all the areas that I, the person
running the investigation, might need to pull from that statement
or interview. That can be done, can it?

Speaker 2 (51:23):
Yeah. And I can see a space, maybe not now but
looking further, where your AI is actually
there with you in the interview room, and you've got
all your data sets there with you. As you're asking
the questions of the accused, it can tell you: hey,
you're missing this piece.

Speaker 1 (51:37):
And it could do it live. And I think the
technology is even there to do that now, isn't it.

Speaker 2 (51:43):
Yeah. It's just, again, building it so that it's secure,
it's safe; the integrity piece is part of it. But
you don't want to walk out of an interview room
and go: I really forgot to ask that question.

Speaker 1 (51:53):
Well, the skill of a great detective was to sit down.
I always thought this was one of the most valuable
skills a detective had: the ability to extract information
from a witness. And if you had AI sitting there, and you're
my witness, I'm getting the statement, and sometimes
you've been up for twenty four hours, yeah, you're tired
and getting the statement, and then you come back and

(52:15):
you've.

Speaker 2 (52:16):
Forgot that it's wrong.

Speaker 1 (52:17):
You could have AI going: you haven't covered off on
when he last spoke to the victim. Something as
simple as that.

Speaker 2 (52:23):
That's right, yeah, exactly right. There's a place
for it, and you don't want to have to be
going back to reinterview or take another statement. You don't want
to retraumatize the victims by going back, because every time
you turn up on someone's doorstep: I'm back again, what
do they want now? The anxiety, everything. So you're there
once, you're there thorough, and the AI is looking at
your entire case. And, as you said, a very skilled

(52:44):
investigator knew their whole case, knew everything in and out.
But nowadays, how many cases are they working on at once?
There's a human element that can forget things, and that's
where AI has its real appetite: to assist an investigator.
It's another tool, and I think if we look at
it as another tool, it's going to be a great
benefit to the community.

Speaker 1 (53:03):
Do you think law enforcement will embrace it?

Speaker 2 (53:08):
I hope so. I really hope so. I think it'll
free up time. But essentially the purpose is to identify
offenders quicker than we have been in the past.
You know, who is this person? Do we use facial
recognition technology? Do we use data sets of phone records,
or statement taking, to help us better understand the evidence and

(53:28):
the case?

Speaker 1 (53:29):
Well, to me, what you're describing and what AI brings
to it, it'd be like if I hand picked the
world's best criminal investigation team and they're at my disposal.

Speaker 2 (53:39):
That's the best way to look at it.

Speaker 1 (53:40):
Yeah, yeah. I'm thinking, with all that, of some of
the investigations. We were even having a chat the
other day, and I think we both carried the
trauma from those late nights when a murder's happened,
or a serious crime's happened, and you need a listening
device or something put in place, and you're sitting there
at two o'clock in the morning typing up an affidavit
that's going to be read by a Supreme Court judge,

(54:01):
and how exhausting that was, getting it in the
terminology that's appropriate. And you could actually blow the application
if the judge gets cranky because you haven't laid it
out properly. That could be done in a tenth

Speaker 2 (54:14):
of the time, easily. In some cases, literally seconds for
it to process all of the evidence that's in the
system already and put out an affidavit in the right format.
Then you've just got to fact check it. You've got to
have someone awake enough to make sure that it's fact checked.

Speaker 1 (54:28):
You keep coming back to the fact check. That's going
to be the failing, isn't it? Because I would imagine
one time something slips through the system and everyone goes: well,
this is why we can't use AI. There's going to
be a reluctance, and police will hide behind that, or
law enforcement will hide behind: well, you can't trust that, because
someone submitted the facts and had the wrong name or
something as stupid as that. That's where we've got to

(54:50):
make sure you need someone actually still putting a
signature to it. You can't just go: oh well, that
was the computer, it's not our fault.

Speaker 2 (54:58):
Yeah. And that's why we've got to work with law
enforcement agencies to develop the policies and procedures that sit with
the AI. Don't compete against it; let's work with it
and have your policies and procedures that talk to it:
this is what I expect from you. So your supervisor
does the fact checking. We're doing that now.

Speaker 1 (55:12):
So we had to, like, I'd have a junior
officer prepare an affidavit and I would have to sign
it off before it was forwarded. And that was me
fact-checking that the person knew what they were doing
and it was truthful and accurate.

Speaker 2 (55:25):
That's right. You just haven't got the poor young constable
doing four or five hours on an affidavit.

Speaker 1 (55:31):
We'll take a break here. When we come back, we'll
break it down and talk about some investigations and how
that could actually help. I'd like to look at the
unsolved homicides, the cold cases too. In my mind, it's sort
of ticking over how it could play a part
in that. But it is a brave new world.

Speaker 2 (55:49):
Isn't it. It certainly is Gary, And I think.

Speaker 1 (55:52):
I might be able to do myself out of a job,
wouldn't I? Like, as in hosting a podcast. Like nothing's
off the table, is it? No?

Speaker 2 (55:59):
That's right. That's where everyone's a bit anxious and a
bit nervous about what that future looks like, replacing jobs.

Speaker 1 (56:05):
Well, I want to talk about court too,
because I think that it's going to be a big
change in court, not just the role of solicitors, and
whether it be defence or prosecution, but magistrates and judges
as well. Yeah, interesting. Okay, we'll be back shortly.