Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:21):
Welcome to the
Digital Forensics Now podcast.
Today is Thursday, April 10th, the year 2025.
And my name is Alexis Brignoni, aka Briggs, and I'm accompanied by my co-host, the Cellebrite C2C Mentor of the Year 2025
(00:42):
people.
The Pied Piper of Albany, but with bears instead of mice.
The anti-lodging long-distance driver.
The unique, the only one.
The mold has been broken after she was made.
(01:03):
Heather Charpentier! Music is by Shane Ivers and can be found at silvermansound.com.
People.
Hello everybody.
I couldn't find the play button for the music, but here we are.
Oh my god, I think that's the longest intro you've ever given
(01:24):
me. Look, it's a lot of things happening.
I know people are dying to know all about all these things, so we can tackle them in order.
Look, look, people.
Heather was the main character of the last like two weeks.
Speaker 2 (01:37):
Like she was the main
character.
Speaker 1 (01:38):
energy.
The main storyline was going through her.
So if you're attentive to the show of life, she was the main character for the last two weeks, so tell us.
Speaker 2 (01:48):
I don't know, I don't
know about all that.
Speaker 1 (01:51):
It's true.
It's true, people.
It's true.
What do you want to start with, sir?
Speaker 2 (01:55):
It's been a busy few
weeks for sure, thank you.
I see everybody writingcongrats in the comments.
Thank you very much.
Speaker 1 (02:08):
Exactly.
Well deserved, extremely well deserved.
Yes, the Pied Piper of bears, huh? Yes, yes. And people are like, what?
Speaker 2 (02:13):
I may want to tell the story behind the Pied Piper of the bears, I think.
Yeah, yeah, yeah, well, you tell it, you tell what happened with the bears. What happened?
So, I think everybody knows from a previous show, I love birds, and I have the Bird Buddy bird camera, and so I feed the birds and I get them on the video camera that's built into the bird feeder.
And there comes a time every year in New York that you have
(02:35):
to take your bird feeder down because certain animals come out
of hibernation.
Speaker 1 (02:41):
Yeah Well other than
people.
Speaker 2 (02:44):
Let me just share
what I got in my bird cam.
Hopefully everybody can see it.
Yeah, if you can't, write in the chat, oh my gosh. Um, but yeah, just listening.
Speaker 1 (02:57):
That's not a dog
making a sound, okay no, um.
Speaker 2 (03:01):
So the bear decided
to come out of hibernation and
just hop right onto my birdfeeder for a little snack.
So, the bird feeder is now down and in my house. It will go back out when the hummingbirds arrive.
Speaker 1 (03:22):
Does one of the birds
come in and eat from inside the
house?
Yeah, no.
Speaker 2 (03:24):
No.
Oh yeah, no, no, I'm just gonna hold off on the birds for a little while.
Oh my goodness, that's so scary.
Speaker 1 (03:33):
Let me send a quick shout out to Duck Squash; it's the first time he or she catches the show, so good to have you.
Um, that's crazy, like scary.
That video is scary.
And where's that feeder? Like on a pole, you know, a wall, where do you have it?
Not a pole, right outside my window.
Your window?
Speaker 2 (03:51):
Okay, this is, yeah, don't poke your head out.
No, I didn't want to go outside in the morning.
I think you messaged me: hey, are you headed to the gym? Hell, no, I am not leaving my house.
Um, I stayed at the house and did my thing.
Oh my god, yeah. Well, yeah, lessons learned. Um, yeah, definitely.
Speaker 1 (04:10):
No, no, no.
Now that you learned your lesson with the bear, now I'm gonna introduce to everybody, oh right, I already said it, the Mentor of the Year. And it was fantastic.
Look, we were invited.
I was nominated for something I didn't win, which is great because I could really focus my energies on celebrating Heather.
(04:30):
It was so much fun.
We ate, we got dressed up.
We'll show some pictures of that in a second, but I want to play for you, against her will, totally against her will.
Speaker 2 (04:39):
Completely.
Speaker 1 (04:42):
Her win.
So everybody just enjoy this with me, please, as we comment live on it.
Hopefully everybody can see this.
There's Heather Barnhart from Cellebrite; listen to her speech.
Speaker 3 (04:57):
Being a mentor can be exhausting.
I would not be here today if my first boss did not force me to push buttons and get answers.
I have several mentors; some of you are in this room and you may not even realize it.
Mo, you mentor me and you may not realize it.
I have mentees in this room that I may not realize.
(05:17):
It's all about giving back.
Our jobs are hard, our jobs areexhausting, no one has time to
mentor.
That's what we all say when someone's like, hey, can you
introduce me?
Yeah, and then you end upkeeping communications with that
person.
It is our time.
You can see that you raise thenext generation.
Speaker 1 (05:36):
We're not all going to, just like, get the camera out of my face.
It is ongoing work that you are going to do forever.
Speaker 3 (05:46):
This is why we
created the 101.
The Cellebrite community is a place for you and the younger
generation to meet people, togroom them into being amazing
examiners and investigators.
Speaker 1 (05:57):
So this person has
gone above and beyond and has
done more than you can imagine.
Speaker 3 (06:03):
It is my honor.
Speaker 1 (06:04):
When she said that, I knew who it was at that moment.
Speaker 2 (06:07):
Stop it.
You did not.
Speaker 3 (06:10):
Back in the day.
You know who you are, you usethese.
Speaker 1 (06:14):
They had inside little Faraday bags.
They were bigger back then.
Speaker 3 (06:23):
All right, from one Heather to another. Woo!
Speaker 2 (06:28):
Oh, my God.
Speaker 3 (06:33):
Yeah.
Speaker 1 (06:38):
On fire.
Now, this is the best part.
Speaker 2 (06:42):
Nobody was giving
speeches at this point, yeah.
Speaker 1 (06:43):
I know.
Speaker 2 (06:44):
Nobody was giving
speeches.
Speaker 3 (06:47):
Speech, speech,
speech, speech, speech, speech,
speech, speech, speech, speech.
Oh, I could have killed them.
I'm a co-worker right here.
I work with Heather, guys.
Speaker 2 (07:02):
But thank you, thank
you so much I appreciate it.
I think I learn more every daymentoring people than I'm giving
back, so thank you.
Speaker 1 (07:12):
And then Heather went
with the plate and hit us over
the head with it.
Oh, it was nice.
Speaker 2 (07:20):
I hate you at the end
, yeah.
Let me tell you that I hate you.
Speaker 1 (07:25):
Oh, you did, you did.
You said it a few more times after that, but that made it
worthwhile.
That I hate you was like themost valuable thing I got out of
the whole experience.
Speaker 2 (07:34):
It was great, oh my
gosh.
So yeah, this will be our last podcast, because Alex and I are
no longer friends after he putme on the spot to do a speech
when nobody else was doingspeeches.
Speaker 1 (07:46):
Well, the thing is that she's contractually obligated, so sorry, you know, the lawyers are involved.
Some other podcasts trying to steal her? Not happening.
You know who you are. Not happening.
Speaker 2 (07:58):
So that was great.
Thank you for sharing that andembarrassing me.
However, I want to show some pictures from the conference because it was a great time.
We hung out with old friends, met some new friends that we had never met before.
Speaker 1 (08:14):
And look how sharp, I mean, Heather's just killing it.
Speaker 2 (08:18):
So this is my
coworker, Kevin Lyback, who
nominated me for the Mentor of the Year and decided to join with Alex and scream "Speech!" while I was on the stage.
Speaker 1 (08:29):
Look, the first time I met him was that day, and now he's my brother.
Like, he's like the best. Definitely the best, yeah.
Speaker 2 (08:38):
The one that I called
a jerk as I walked up on the
stage.
Everybody knows Ronan.
He's the life of the party at Cellebrite, for sure. At every party.
Yeah, at every party, you're right.
The Binary Hick, Josh
Speaker 3 (08:55):
Hickman, the man. Got to hang out with Josh.
Yeah.
Speaker 2 (09:00):
JP from France, who
works for Cellebrite and does
some really good research.
If you haven't checked out hisblogs or webinars, check them
out.
It's awesome.
Speaker 1 (09:09):
Some jerk there on
your other side Some jerk.
Speaker 2 (09:11):
Yeah, I'm not even
going to acknowledge the other
guy in the picture.
Speaker 1 (09:13):
Keep going, keep
going.
Speaker 2 (09:17):
This picture is
actually three Heathers.
We had three Heathers in the area at the same exact time.
Speaker 1 (09:22):
Heather overload.
Speaker 2 (09:23):
Yes, it was. Yep.
Um, besides getting to hang out with old friends and meeting new friends, we got to listen to quite a few good talks, and one of the ones that stood out to me the most, actually, was Ian Whiffin and Heather Mahalik Barnhart.
They were giving a speech about verifying
(09:43):
validation, and I thought it was a really good speech.
Speaker 1 (09:48):
Yeah, Ian, and both of them, I mean, they're fantastic.
Yeah, great. Ian is the best, and Heather too.
Speaker 2 (09:55):
And then, of course,
we got to see Geraldine Bly and
Dan.
Speaker 1 (10:00):
Dan Ogden yeah.
Speaker 2 (10:01):
Yep and K-9 Siri.
You can just see the top of herhead there, but she was up on
the stage too.
They gave a presentation abouta case that they worked on.
It was a really greatpresentation.
Speaker 1 (10:12):
And they're awesome.
I've known them for many years,especially Dan.
I've known Dan from way back when, and what great examiners and investigators. Really good people.
Speaker 2 (10:28):
And then we have
Alexis, myself, and Bill,
the phone wizard himself.
Speaker 1 (10:30):
We were all dressed
up and ready to go to the gala.
Bill is so much fun to be around.
Speaker 2 (10:33):
Yeah, oh, definitely.
He gave a great presentationtoo.
If anybody ever gets the chanceto catch his presentation on
expert witness testimony, it's like top-notch, gives you great pointers and just a really good idea of how to testify as
an expert witness.
We got to meet the one and the only Lionel Notari from
(10:55):
Switzerland, who everybody knowsfor his blogs on the Unified
Logs.
Great, great guy.
What a down-to-earth, fun, just great guy.
Speaker 1 (11:05):
Oh, and that dude, he's like the most sharply well-dressed guy.
It don't matter what time of day, what time of night, I mean what day, weekends, the guy looks always sharp.
I'm like, dude, I'm going to aspire to look like you one day.
Speaker 2 (11:19):
That guy was sharp, yeah. Plus, he's super, super smart.
Oh my God, yeah, definitely.
We were causing trouble, so we got locked up, Alexis and the two Heathers in a jail cell.
Speaker 1 (11:32):
He gave us some time
out.
Speaker 2 (11:34):
Yeah, and then here's
a better shot of K-9 Siri and
Geraldine and Dan and Alexis andmyself at the gala.
The Mentor of the Year picture with Heather and me.
Speaker 1 (11:49):
Oh, that's beautiful.
I love that picture.
Look at those there.
Speaker 2 (11:52):
That's a good one.
And then the keynote speaker was Tim Tebow, and they had all of the people that won an award at the Digital Justice Awards, which they're calling the Justices.
We all got to go back and actually meet Tim Tebow.
So that was really, really cool and, of course, we snuck my nominator, Kevin Lyback, back to see Tim Tebow because he
(12:16):
really wanted to meet him as well.
Speaker 1 (12:18):
So he came back and
Heather was excited to meet a
guy that plays sports ball orwhatever that's called.
Speaker 2 (12:23):
Yeah, no, he gave a
really good speech.
I was excited to meet him because of that. Sports ball, eh?
Speaker 1 (12:28):
You know nothing
about it, yeah.
Speaker 2 (12:30):
So that's just some
of the highlights from the
conference.
It was held in DC last week, and it was a really good conference.
Speaker 1 (12:39):
I agree.
I think hopefully for next year, you know, if folks here get a
chance to attend, please do.
It's a great event.
It was the inaugural event.
I think it was really well run, really well done, and I've gone to dozens and dozens of events like this.
And thank you, Cellebrite, for having us, me and Heather, there
(13:01):
and be part of that event andhopefully we can do that again
in the future.
Speaker 2 (13:07):
Yeah, and they did
announce too there's an early
registration for next year.
It's at a greatly discountedprice, so if anybody is looking
to register for next year, signup.
Speaker 1 (13:17):
Do it, let's just do
it.
Speaker 3 (13:20):
Uh-huh.
Speaker 1 (13:20):
Well, I think we ran out of the "I love me", so we love ourselves.
Speaker 2 (13:27):
Yeah, let's do that.
That's enough of that.
Speaker 1 (13:32):
Let's do something
productive now.
Speaker 2 (13:34):
Well, actually the
next one is kind of a little bit
about me, Sorry.
Speaker 1 (13:39):
Oh, we've got some left, and Heather loves Heather, you know, a little bit left over.
Let's get it done. Yeah, sorry.
Speaker 2 (13:46):
So we were going to have a podcast, prior to this podcast, that I'm going to talk about coming out, but I got busy with some work stuff and we had to postpone until this week.
So there was a podcast called Mobile Forensics: Are You Nerd Enough?
And it was about extracting RAM from phones, put on by Cyber
(14:08):
Social and, um, hosted by Adam Firman from MSAB.
It was Adam Firman, Dave Louder from MSAB, Wendy Wong from MSAB, and myself, and we talked about extracting the RAM data, what you can get from it, the different techniques.
They talked quite a bit about their RAMalyzer and the
(14:29):
techniques that they use for theRAM extractions.
It aired on March 26th but it'sstill available if anybody
wants to check it out and viewit, and I'll put the link in the
show notes afterwards.
Speaker 1 (14:40):
No, it was a good show.
I met Dave and Wendy in Sweden maybe a month ago, a month and a half ago. Really, really sharp people.
I mean, they're developing that capability, and watch the podcast.
It's really a capability you need to consider having in your toolbox, being able to extract RAM from Android devices, and you'll be surprised.
(15:03):
We talked about it before the show, but it's worth saying again: you'll be surprised how much stuff you can get out of it.
When you think that you have all there is, you will be
surprised.
Speaker 2 (15:12):
Yeah, definitely so.
This one's not about me.
Speaker 1 (15:20):
It's close enough.
It's tangentially about you.
Speaker 2 (15:25):
I did want to
highlight my agency, the New
York State Police.
They were honored with aspecial new award at the Magnet
User Summit in Nashville.
It's called the Magnet Agency Impact Award, and it was provided honoring commitment to protecting communities and advancing digital forensics.
I'm just going to throw alittle picture up there.
(15:45):
But we were awarded. I wasn't there.
That's my lieutenant right there in the middle holding the award, and then some other New York State Police Computer Crime Unit employees and some Magnet employees there presenting the award.
Speaker 1 (16:02):
You can see Jad there.
If you're looking at the screen, to your left, that's Jad there.
I forgot the name of the, uh, GrayKey guy to the left of him, so you got some, uh, pretty famous people there, plus the nice people from your state police just, uh, killing it, delivering justice all around the state, the great state of New York.
So congrats to all of you.
Speaker 2 (16:24):
Thank you. And, man, did you see that award?
You could really hurt somebodywith that award.
It's really like pointed andsharp.
Speaker 1 (16:32):
Well, that's why you
give it to law enforcement
people.
We deal with weapons daily.
Speaker 3 (16:36):
We know how to handle
it.
Speaker 2 (16:41):
So we did want to
talk about a couple of new
podcasts that are out thateverybody should check out, um.
So the first one is called OSINT Cocktail, and it is a podcast that delves into the world of open source intelligence and digital investigations.
Um, put on by investigators Kirby Plessas, Kelly Paxton,
(17:05):
Amber Schroader and Cynthia Navarro, and they're talking about pop culture content and giving some investigative feedback on some really, actually, pretty well-known cases and series on some shows.
Speaker 1 (17:23):
There's one on Netflix, but series on some shows, yeah.
I mean, everybody knows Amber, CEO and founder of Paraben.
She's been around as an expert for a long time, so we all love her, and it's a great concept.
I wish I had thought of it first, right?
Because they go into this show, right, and then they discuss the forensics as portrayed on the show, right.
Sometimes it's ridiculous, sometimes there's something to
(17:49):
comment on and then build on.
So it's a great idea, great show.
So go, uh, go check it out.
Speaker 2 (17:53):
The next one is, Hexordia has a podcast now, um, called Truth and Data.
In the first episode they talk about timely data preservation on mobile devices.
Actually, let me back up for a minute.
It's Jessica Hyde, Debbie Garner and Kim Bradley that are putting on this podcast.
Speaker 1 (18:13):
All stars.
All these people are known to you, and if they're not known to
you, you need to go find outwho they are.
I mean all stars.
It's also a great show.
Speaker 2 (18:21):
Yeah, so they have
put out their first episode and
I'm going to actually watch ittonight when we're done, so
don't ruin it for me, I haven't watched it yet.
Speaker 1 (18:29):
I'm not going to
spoil it, don't worry.
Speaker 2 (18:31):
Thank you.
I think I know what the themeis going to be here, though, but
yeah, they talk about the timely preservation of data in mobile devices. Such an important topic.
If you're not getting thosedevices extracted immediately,
you're losing data every minute.
So great topic.
Speaker 1 (18:56):
No, I'm not going to
spoil it, but a good discussion,
policy-wise and technical as well, in regards to what it means to extract data vis-à-vis the preservation concept.
So again, go watch it, it's really good.
Speaker 2 (19:06):
Yeah.
Speaker 1 (19:07):
Oh, and Jess is in
the chat, so thank you for being
here.
As always, she's saying that every episode is a new conversation, really focused on a particular topic.
Like us, we kind of freewheel on what's happening all around, but she and the folks on her podcast narrow down on a topic and run it to ground really well every week.
(19:28):
So go check it out. Perfect.
Speaker 2 (19:32):
Um, let's see what
else we've got here.
So this is another topic wewere going to talk about last
podcast.
It hadn't happened yet, but ithas happened now.
So, um, Amazon was set to remove the "Do Not Send Voice Recordings" setting from its Echo devices, and it did actually
(19:52):
happen on March 28th.
So this change means that allvoice requests will be sent to
Amazon's cloud, even if usershave previously opted out.
So if you have Echo devices, your Alexa device, hey, Alexa,
your voice recordings are nowbeing sent to the cloud.
Mine's unplugged now.
I think I'm done with it.
Speaker 1 (20:14):
Yeah yeah, yeah, I
know, I know.
Speaker 2 (20:17):
Okay, yours is
unplugged too.
She talked in the middle of thenight when I didn't prompt her
anyway, which freaked me out.
Speaker 1 (20:27):
Maybe it was sleep
talking, you know.
Speaker 2 (20:28):
Yeah, yeah, so she's
gone now, but the feature will be replaced with "Don't Save Recordings", which still prevents recordings from being stored,
recordings from being stored,but does not stop them from
being processed in Amazon'scloud.
Speaker 1 (20:42):
Yeah, what the heck
is that supposed to mean?
Speaker 2 (20:45):
They say they're not
saving them.
Speaker 1 (20:47):
They're processed, yeah.
Speaker 2 (20:49):
So they're going to
process them, and are they immediately purged, or what? I'm confused.
Speaker 1 (20:54):
Look, if you process
something, there has to be some
output that goes somewhere.
Speaker 2 (20:58):
It's there, yep.
Speaker 1 (20:59):
Yeah, this is the
thing, right, can you hear me?
Yeah?
Speaker 2 (21:04):
Yes.
Speaker 1 (21:04):
Yeah.
So this is the company that also kind of owns the Ring camera system and all that, and if I'm not mistaken, they got caught with employees kind of looking into the Blink ones, kind of watching cameras and footage and stuff like that.
So they did the thing.
I mean, I'm not saying, I'm not accusing them of anything.
The point I'm making is that, I apologize, yeah, a little
(21:28):
cough there.
Um, what I'm saying is that, as consumers, we need to be smart, and the best predictor of future behavior is past behavior, and how much we value our privacy versus the convenience it gives.
I think, I don't know if you mentioned it, maybe I didn't hear it, the reason they're doing it or grabbing those. For what again, do you?
Speaker 2 (21:44):
You said it. Oh yeah, AI. I didn't say it.
Yeah, no, no, okay, yeah, you got it.
You're saying it right now. Yeah, there, there we go.
Speaker 1 (21:50):
So yeah, to train
their AI.
Like, I mean, you all know our opinions on AI in general, and, like Jess is saying in the chat, the data will persist.
Right, if you share it, it goes somewhere, and I agree with her 100%.
So you got to make an informed decision if the convenience is worth your
(22:13):
privacy, and you know your thoughts on information being
now part of this multi-globalconglomerate.
So just food for thought.
Speaker 2 (22:20):
Yeah.
They do state, though, that they're not going to process visual ID, so facial recognition, in the cloud. So I'm sure that's coming, though, if they're using it to train their
AI and stuff.
Speaker 1 (22:33):
Now look, AI is such a, you know what, we're going to talk about AI a little bit later.
But yeah, now it's just grab data for AI, grab data, grab
data, grab data.
Yeah, I'll say it real quick.
There's a push now for thesecompanies.
They want to make training their AIs, they want to train them with any piece
(22:54):
of data, any book, any media, anything, and consider that fair
use.
And if you're not familiar, fair use means that you can take any copyrighted publication or media and you can make small excerpts for certain purposes, right, without breaking the copyright or the law.
For example, let's say there'san article or a book and I'm
(23:15):
writing an article about thebook.
I can take a little paragraph, not even a paragraph, a few sentences, to make reference in my article about the book.
I can't take the whole book orwhole pages.
Now I'm breaking the copyright.
So I'm doing fair use of that.
It's a tiny bit for anotherpurpose and there's more
legality behind it.
I'm not a lawyer, I just stayed at a Holiday Inn last night, old joke
(23:35):
from the 90s, I know. Um, so that's okay.
But really taking everything, like, let's say, all this material that you worked on, and now you're gonna give it just to these people because they want to train their AI?
I mean again, those are thingsthat are being discussed in
Congress and the courts andwe'll see how that shakes out.
Speaker 2 (23:55):
Yeah, Speaking of AI,
so AI search engines.
There was a study done.
It was covered by Ars Technica and conducted by the Columbia Journalism Review's Tow Center for Digital Journalism.
It found that AI-powered search engines frequently misattribute
(24:16):
news sources with an error rateexceeding 60%.
60%.
Speaker 1 (24:23):
Yikes.
Speaker 2 (24:25):
I don't know about
you, but if I am looking for a
source or something that I am Googling, even, I don't want an error rate of 60%.
Speaker 1 (24:36):
And they don't tell
you the error rate.
And that's one thing I've been complaining a lot about with any AI application.
They just put it in and you don't know what the error rate is.
You have no way of gauging or evaluating the validity and the usefulness of this capability.
The problem with AI is, if you say, well, I'm going to give you sources, okay, are you going to check the sources? What if the
(25:00):
AI misattributes the sources, or just says that it got it from this source and didn't? And you can tell me, well, that would never happen.
Oh, that has happened. Look, the thing has, even AI has made up sources.
Speaker 2 (25:11):
Right, it's already in implementation.
The study goes on to say, so it said, uh, researchers tested AI search tools by providing direct excerpts from real news articles and asking the model to correctly identify the original headline, publisher, publication date and URL.
The models often provided incorrect or fabricated
(25:32):
citations, instead of declining to answer when unsure.
That's the problem.
They don't say I don't know.
They answer with something thatis just not correct.
And I think that goes to otherthings too, not just these news
sources.
Right, I mean, it's anythingyou're asking.
Speaker 1 (25:51):
Yeah, but I mean, and this is the thing, and people don't seem to understand, it's called artificial intelligence.
But even the term, how do I say this?
The term "intelligence" is artificial, as in, it's not intelligent.
The thing is not thinking.
Again, what AI does is probabilistic.
It tries to determine what will be the most likely thing to say
(26:12):
based on this training data, and then throw in a little bit of randomness here and there to make it human-sounding, right.
And that's a problem, and we'll talk about an article in a
second.
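The likeliest-word-plus-randomness process being described can be sketched in a few lines of Python. This is only an illustration of the idea, not how any real model is implemented; the words, the scores, and the "temperature" knob standing in for the randomness are all made up:

```python
import math
import random

def sample_next_token(scores, temperature=1.0):
    # Softmax over the model's raw scores: higher score -> more likely.
    # A low temperature almost always yields the single most likely word;
    # a higher temperature adds the "randomness here and there".
    tokens = list(scores)
    scaled = [scores[t] / temperature for t in tokens]
    peak = max(scaled)  # subtract the max for numerical stability
    weights = [math.exp(s - peak) for s in scaled]
    return random.choices(tokens, weights=weights, k=1)[0]

# Invented scores for the word after "The sky is"
scores = {"blue": 5.0, "clear": 3.0, "falling": 0.5}
cold = sample_next_token(scores, temperature=0.01)  # effectively the argmax
warm = sample_next_token(scores, temperature=2.0)   # noticeably more random
```

Nothing in that procedure checks whether the chosen word is true, which is exactly the point being made: the output is plausible-sounding, not verified.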
If we're taking that as our main research or sourcing of
(26:33):
things, that's a problem.
It shuts down, from my perspective, a part of your brain, because you're offloading that responsibility to the system, and you become not a thinker but an asker.
And the thinking to get the answer, you leave it to the AI, because that's what the AI kind of imposes upon.
(26:53):
You ask me questions, I give you answers.
So an AI prompt engineer just learns how to do questions.
That's important.
But how about thinking?
How about actually thinking andactually researching and
actually doing certain things?
I mean, does that make sense toyou, heather?
Speaker 2 (27:10):
Yeah, it definitely makes sense, and actually that kind of
leads right into the next, thenext topic, which is the slow
collapse of critical thinking inOSINT due to AI.
So there was an article by dutchosintguy.com. Dutch.
Speaker 1 (27:32):
OSINT Guy.
Dutch OSINT Guy.
Speaker 2 (27:33):
Dutch OSINT Guy.
I'm reading it as a URL.
Thank you, Dutch OSINT Guy.
And that article warns that OSINT analysts are increasingly over-relying on generative AI tools like ChatGPT, Copilot and Gemini, and it's eroding critical thinking in the OSINT field completely.
Speaker 1 (28:00):
I want to make some
comments on his article, but
before I do that, I want tohighlight some comments in the
chat, right?
So Brett Shavers again a friendof the show, a personal friend.
We love him, great expert.
He says AI can never be anexpert because he will never
admit being wrong, and I likethis other comment that he makes
we're gonna be the lastgeneration of cognitive thinkers
and it's kind of funny and sadbecause, yeah, we, we actually
(28:23):
might be the last generation ofpeople that actually care enough
to think about things yeah um,so so for the article I made, I
can mention examples of how, inthe field of OSINT OSINT means
open source intelligence youmight need an example.
He gives profile person for aparticular case and you ask the
AI to profile the person and itgives you all these results,
(28:47):
really clean, really nice, aboutthe person different interests.
These results really clean,really nice about the person
different, different interests.
But the ai uh, decides to,let's say the example uh, he
gives omit information fromfar-right forums.
Right, because the ai was nevertrained to look into these
far-right forums, right.
And then, and you understandthis, right, some people, um,
(29:10):
keep different facets of theirlife separate, right, there are
one person over here and anotherperson over there, so an ocean
investigator.
His or her job is to look ateverything that they can, right.
But if you upload it to the AIand the AI doesn't have
visibility into it, what happens?
Well, most likely that analystwill just take that data, put in
(29:33):
a report and, hey, this is good, this person is good to go, and
they have no knowledge of thisother, not destructive, but
tendencies that need to belooked into.
Let's put it that way.
Does that make sense, heather?
Speaker 2 (29:45):
It does.
It makes perfect sense With theOSINT field.
The article goes on to say thatyou need to preserve human
judgment and resist thetemptation to offload thinking
entirely to ai.
We have to resist thetemptation to rely on ai.
It's tough because you caneasily go to any of these ai
(30:07):
sites and just type a questionand it spits back an answer
immediately.
Speaker 1 (30:17):
Yeah, and I don't
believe this is, I mean, this is
my opinion.
I don't believe you can givethe AI all the information
that's been produced by humansor nature and train it to do a
model.
And remember, a model is not areal thing right, it's the best
approximation of something, inthis case, having an
all-encompassing model of theuniverse.
It's impossible.
It's too much information.
(30:37):
So AI is going to make anapproximation as best as they
can of what all the interactionsin the universe might be.
That's why it's a model.
Right, you have a car, a modelcar, it's like the car, but
smaller right, and they do thebest job at it.
Car it's like the car wassmaller right, and they do the
best job at it.
There's a point where there'sdiminishing returns and right
now it's being ubiquitous, it'sbeing thrown at us from all
angles.
(30:58):
The one thing I would say toeverybody is you have to not
only ask questions to the AI,but also question the responses,
and at some point you got torealize that, depending on what
your use case is, maybe thinkingis the better, faster way of
getting to where you need thanasking the AI to then question
(31:20):
the answers, to then verifythose answers.
Right, it's definitely faster.
Speaker 2 (31:24):
I've asked the AI
questions and then read the
answer and been like no, that'sdefinitely not right.
And you tell the AI that's notthe answer and it's like, okay,
let me, let me try again.
I'm sorry, it apologizes.
First I'm sorry and then let metry again.
It's a totally different answer.
I just I mean, which one do youbelieve?
Speaker 1 (31:43):
Well, there's a thing called vibe coding, but what people do is they go and ask ChatGPT or whatever AI about a coding
Is they go and ask a chat, gptor whatever AI about a coding
problem and take the answer?
They just keep, you know,pasting code from the AI until
it works.
It's like they're vibing thecode.
Oh my God.
Speaker 2 (31:59):
I've tried it, I've
definitely tried that.
Speaker 1 (32:02):
How did it work for
you?
It does not work out.
It does not work out well atall.
Speaker 2 (32:06):
So I've done it because I bug you all the time about coding, and I'm like, all right, I'm not bugging him today, I'm going to figure it out on my own.
I'll get going, and I'm like, all right, it's still not working.
I don't know what I did wrong, I don't want to bug you, and I'll just throw it into ChatGPT, because I have ChatGPT.
I'll throw it in there and it'll be wrong.
(32:29):
So anything I had right is nowgone, and it's just.
It's so terrible at that andyou know what, I shouldn't be
relying on it.
How lazy of that is.
Is me right?
How lazy of me does that?
It's just insane.
And I'm doing it because I don'twant to bug you.
Speaker 1 (32:45):
Well, I mean, which is ridiculous. I told you to bug me all the time.
Speaker 2 (32:49):
I know, I know.
Speaker 1 (32:51):
You're not bugging me. But even, people, I mean, I know coding, so you'd say, well, maybe I might use it. But what happens, at least when I use it, is I end up finding the errors. I look at the code, like, well, this is not going to work because of this, so I end up spending time correcting it till it works, right.
Speaker 2 (33:05):
And then I look back and say, if I'd done it from the beginning, I would have been done already.
And part of the problem is I don't know how to do it all yet, because I haven't finished all the classes. I know, I know, but I don't know how to do it, so I can't recognize the things that it's getting wrong like you can, so I'm just going with it, and it's not going to work. It's never going to work.
Speaker 1 (33:25):
Well, and you make a point on that, right. So obviously we ask these questions of the system, you know, hoping that it can answer things that we don't know. And that's the thing, right? Even questioning the answer, if you don't have a broad understanding of the topic, you're never going to know where it's wrong, right? Like in forensics. If you're a really well-versed examiner using AI, and you
(33:47):
can verify some of it, that's fine. But the reality of our field is that most folks don't have the level of expertise to use this tool and verify the outputs correctly, and they're going to be copy-pasting whatever it says, and it might be wrong, and they don't have the ability to detect that it's wrong. So we've got to really think about how we integrate that, if we are going to do that.
Speaker 2 (34:08):
Yes, I completely feel this comment right here from Aaron: hey, ChatGPT, can you please break this code that was working, but I was trying to add one little thing? That's usually what it is with me. I'll take something that I already have working, and I'll find an additional field in the SQLite database that I want to support in one of the Leap artifacts, and I won't exactly
(34:31):
know how to incorporate that. So I'll just ask it to add this one little thing, and then everything I had working is just done. It's just done, so that comment's perfect.
Speaker 1 (34:40):
Perfect. And for the little thing, the AI adds like a hundred lines of code, and it's like, how can you add more code than what I had already?
Speaker 2 (34:49):
No, like, this little thing. No, oh, it's insane. So yeah, I mean, it could be a good tool, it could be helpful, but how helpful is it if it's steering you in the wrong direction?
Speaker 1 (35:04):
Well, I mean, again, my take on that is, if you have a level of expertise on a topic and you want to use the AI to kind of give you an assist in the context of, maybe, something that, based on the data, you can think about, sure, you can do that. But if you're not an expert, I don't want to say expert, knowledgeable on the topic you're looking at, you've got to be careful. You won't be able to verify the output, since you don't have the
(35:25):
knowledge.
Speaker 2 (35:26):
Yeah, well, here's another huge problem that Brett brings up too, though: I asked ChatGPT a question, and its answer was a paragraph from a blog post that I had written years earlier. It was my post, word for word, without attribution. Yeah, so in one of these articles, actually I think in both articles, it addresses that very issue. It's not giving
(35:46):
the publisher the traffic on the website, for one, and it's also not giving them the credit for the article. And honestly, if you asked which source it was, Brett, it probably wouldn't have even known it was you.
Speaker 1 (36:02):
Well, and I've been blowing the horn, or the alarm, on this since last year. Imagine you go to court and you did some work using Windows or Excel, let's say, or Cellebrite, Magnet, MSAB, one of those tools, but you used a cracked version. You didn't pay for it. Right, you didn't pay for it.
(36:23):
And they ask you at court, well, did you upkeep your licenses? Was this a licensed product? You're like, no. What do you think is going to happen? You stole the product to produce your work product. Okay. That being said, imagine using AI. And what about the attribution, the legality of the training data? Was it acquired properly?
(36:44):
Was it copyright protected? Is this tool licensed? Is it modeling data properly? Will that come back to bite you in the behind later, when you use the tool? And, you know, it's not that you used it, but that tool was stealing, for lack of a better word, data from all around to train this model. And this is something that we just put on the system, and we
(37:04):
don't think about the provenance of the training data. There are legalities around it. Like I said before, Brett put this article out. You can reference it, you know, under fair use, but you shouldn't just grab it and put it out there with no attribution, right? So there are some things to think about. And how might that affect our workflow if we don't take time
(37:26):
to understand the provenance of our training data for these AI systems?
Speaker 2 (37:30):
Definitely. Continuing with the AI theme, the National Institute of Standards and Technology recently, I believe last month, issued updated directives to scientists collaborating with the US Artificial Intelligence Safety Institute.
(37:51):
The directives remove references to AI safety, responsible AI, and AI fairness, and instead emphasize reducing ideological bias to promote human flourishing and economic competitiveness. What do you think of that one?
Speaker 1 (38:07):
Look. So my take on that was, on LinkedIn, that we should really look into international standards. Not because they're perfect, but because, let me put it this way, they're more resistant to influence from outside factors. Let me put it that way, right? It's more resistant when you
(38:45):
have a really broad-based process to get to an agreement on what these standards are. Now, they're not perfect, right? Especially in some comments that Brett was making in that conversation, those comment sections on LinkedIn, for example. There's one thing about the technical way of doing something, and then, that technicality of what you did, how is it presented
(39:07):
at court? Is it enough? Is the standard enough to be able to use that data in a court in the US, versus a court in Europe, versus a court in, you know, somewhere in the Pacific, whatever? Right, those are fair things to consider. But again, I will really try to impress upon everybody the need for international standards, to try to look for them.
(39:29):
Because some standards that are really nation-focused are, from what we've seen, more prone to be influenced by outside entities. And the problem with that is, the science is the science, and the only outside influence should be
(39:51):
more science. Science is made better by more science, right? And even when we talk about ethics, ethics is science, right? Empathy is science. We get that. We have scientific ways of understanding it. It has to be science. Hopefully that makes sense for everybody. Look into those international standards.
(40:12):
Let me put here a comment from Jess: sure, there are many bodies that are international, standards based on not only the process but also the people. For example, DFRWS and DFIR Review are international bodies in terms of peer review, and it's so important. A lot of the country or regional bias can only be
(40:34):
canceled out when it's presented at international conferences, international symposiums, international organizations, where everybody can then collaborate and come to an agreement. So that's all I'm going to say about that, because I don't want to get into too much of a pickle here.
Speaker 2 (40:52):
Yeah, that's good, that's good. So let's move on to a recent blog post by Christian Peter, whose work we love. Christian Peter is the creator of UFADE, which we've talked about on the show quite a few times in the past. And actually, before we get into his recent blog post, there were also
(41:13):
recently some updates to UFADE. So if you're using UFADE, go out and get that new version, with the updates and improvements that Christian has implemented. But his blog post discusses how some forensic tools may inadvertently alter or delete unified logs. And if you've
(41:34):
listened to the show at all, or read any of Lionel Notari's work on unified logs, there's some good stuff in there, right? There's some really important data that's in those logs. And he's emphasizing the critical importance of these logs in forensic investigations, and recommends that when the device is presented with consent, meaning that the device is
(41:55):
unlocked or you have the passcode, you extract those unified logs prior to connecting and extracting with your forensic tools.
Speaker 1 (42:06):
Yeah, yeah. I mean, order of volatility is something that changes all the time. I remember, again, the days when you got to the computer, and what do you do with it? If it's on, you would unplug it so you could stop any changes to the system. Just unplug it, and then take it and image it, right? And then we learned that memory is kept, and I say kept,
(42:29):
but it's kept in the RAM, and if you unplug it, that memory is lost, that content is lost, and we changed the order of volatility, how we get those. Same thing with mobile devices. That's changing all the time. For example, now we're getting extractions from Android from that RAM, so will that change our process of acquisition? Those are discussions that are going on.
(42:50):
But this is my big thing with this issue, and I try to be as straight a shooter as I can, so I'll give it to everybody straight. The tools, the tooling, is wiping these logs before we get a chance to acquire them, and this was noted not by the tool makers, but by community members.
(43:12):
Okay, and there is silence, there's radio silence to the community in regards to this issue. And it's up to community folks like us to let you know that if you're using an extraction tool that generates full file system extractions from a device, you need to pull out the Apple Unified Logs, the sysdiagnose
(43:38):
logs.
Speaker 2 (43:40):
Oh, I lost you.
Speaker 1 (43:44):
Testing, testing.
Speaker 2 (43:45):
There we go, I got you.
Speaker 1 (43:46):
Okay, yeah, my own computer went, like, I heard the disconnecting-connecting sound, I freaked out for a second.
Speaker 2 (43:52):
You sound a little bit like you're in a tunnel now.
Speaker 1 (43:55):
Yeah, that sucks. Maybe my main microphone got lost, so that's why.
Speaker 2 (44:01):
Might be. I can hear you now, though, okay.
Speaker 1 (44:04):
Sorry, everybody, I was in mid-rant. So yeah, I got people telling me you couldn't hear, but now you can hear me. All right, so it's a thing. You need to pull those things out.
And again, let me explainsomething real quick here for
people to understand.
We're not speaking as membersof our agency.
We don't represent thegovernment, we represent
ourselves and we're commentingon something that was made known
by a community member to thewhole community and we're going
off of that right Based on thatknowledge.
And if that's the case, thenyou need to do these things
(44:47):
right and it's up to everybodythat's listening to go and try
it out yourself.
Is your tooling, deleting those,get a test phone and extract it
, do a full file system and seewhat you get in the logs, if any
right, and you test it on yourown right.
This is the science of it,right?
(45:07):
So I'm going to make that clear.
We're not here advocating orspeaking for employers or
agencies.
We're just members of thecommunity, and this is what
we're seeing.
This is what people,anecdotally, are telling us.
That being the case, you needto pull those logs out before
you get your full fascismextraction with the tooling From
a community member perspective.
And from a community memberperspective, we would like to
hear from the vendors on this assooner rather than later,
(45:31):
beyond whatever technicalsolution they might provide
which I believe they have toabout the guidance on that order
of volatility, and I would liketo hear justification in
regards to why is this happening?
Because now you've got toolingthat possibly, based on
anecdotal evidence per titleyourself, is deleting data from
(45:53):
the system, and that could beproblematic.
In this field, we're trying todo the opposite.
We're trying to acquire andpreserve.
So if you have a great reasonfor it, that's fantastic, but
whatever your reason is, thepreservation of the data is
higher in that order.
(46:13):
I don't know if you agree withme and I'm kind of going out on
a limb here, but that's my takeas a member of the community and
I want to make that clear toeverybody.
I noticed I never mentioned anymentor by name, because that's
not up to me to do.
Right, you go and test it out.
Go check the articles from.
It was Christian right thatmade the article.
Speaker 2 (46:31):
Yeah, Christian.
Speaker 1 (46:32):
Yeah, christian made
that.
Checked his article out and doyour own testing.
That's what we're here for.
We don't believe just becausewe do some testing and then we
see if that's the case or not.
And if that's the case, then weneed to then try to see how we
mitigate, uh, those, thoseissues, um, hopefully I mean I
know some people get mad at mewell, too bad, so sad, it is
what it is get the logs first.
Speaker 2 (46:54):
You can use Christian's tool, UFADE, to get the logs first.
Speaker 1 (46:59):
Yeah, get them. Get them.
Preservation. Look, listen, there's a podcast where they talk all about preservation and acquisition, and not precisely, maybe, directly on this example, but the importance. What does preservation mean, right? And the legalities around it, and that's something to consider. We are here to preserve data.
(47:22):
Our tools support us on that. And again, tools are great. Blindly trusting tools is the problem. Blindly trusting tools, blindly trusting AI. I don't care who the vendor is. This is not about a particular vendor or set of vendors. This is just the concept, the moral concept, of how we do our work.
(47:42):
That's the moral value, the values that we bring to our work: our due diligence, are we doing the job as we need to do it, and attention to detail, are we making sure we got all the data that we need? And if our tools are failing one of those things, our moral values compel us to speak out and look for mitigating solutions and for permanent solutions.
(48:03):
And I would say reach out to your tool vendors on the topic, because I believe it needed to be addressed yesterday.
Agreed.
Speaker 2 (48:18):
So another LinkedIn post that we saw in the last few weeks is from Jordan Hare, and it's titled Not All Encryption is Created Equal. This article is about encrypted applications like Gallery Vault, and how they often boast about their great security, but really, in the forensic world, it's poor
(48:41):
security. We can recover data through forensic analysis, through extraction. There are often weak PIN codes, poor encryption, and just data that makes these applications vulnerable when we extract and when we parse and analyze. So it's a good article. It specifically talks about
(49:01):
Gallery Vault, which is an Android app designed to hide and encrypt files. We can definitely get into the gallery app, even though it boasts that great security. Yeah, I can think of, like, a whole bunch of other applications that have this really big headline where they're like, we are encrypted end to end, nobody's going to
(49:23):
get your data, and it's not always the case.
Speaker 1 (49:27):
Well, I mean, the weak point, so let me put it this way: in transit, the data is really well protected, right. They usually do a good job, and it's not hard, right. If your app uses a browser, that's taken care of, right, SSL will take care of encrypting your stuff as it moves through the wire. It's not rocket science, right?
(49:52):
If you're really lazy, just embed a browser in your app and use SSL to get the data out. The issues are the endpoints. Endpoint security is the big Achilles heel of security, right? And it could be used for lawful purposes. Law enforcement, we look into endpoint security in order to be able to lawfully
(50:16):
acquire evidence to prosecute and investigate crimes. But also nefariously, bad actors will do that through malware and attack the endpoint, to be able to access the communications or other things for nefarious purposes. Even nation states, as we've seen in the news, do that. So there's an issue there.
Now, since that's the Achilles heel of it, there are apps that
(50:46):
really do a bad implementation of endpoint security within themselves, and this is not the first time I've seen that. There's some research done by this examiner whose handle is, like, a dog thing. I forgot the name, it'll come back to me. If not, I'll put it in the notes so you can check it out. What he did is he went and reverse engineered the APK and found
(51:07):
out that the password to open the encryption of the app was hard-coded in the app. What that means is you can put whatever PIN you want, it doesn't matter. The thing is encrypted with a hard-coded password, which means that if you, Heather, have the app with data, and I have the app with data, and I get into your phone, I can use that hard-coded
(51:29):
password to get into your vault, my vault, any vault. And actually we have that implemented in ALEAPP. We've got two or three different vault apps that use that hard-coded password, and we implemented it, and you can get into them, and it still works. It's been, like, years, and they still work, and they're pretty popular apps. Incidental Chewtoy.
(51:50):
See, Josh to the rescue, man.
Josh is the best. I'm so happy to have you in the chat, man. I love you. I saw Josh a few days ago at the event. I didn't have a chance to spend as much time as I wanted with him because, you know, busy guy, Cellebrite event, rock star at Cellebrite. So I'm hoping to spend way more time with Josh at Techno, because we're going to see him there. So, yeah.
(52:13):
So Josh came to the rescue with the whole thing. So, Incidental Chewtoy, he made that research, and it's implemented in ALEAPP, right? Endpoint security. So if you're an examiner, you've got to broaden your scope of investigation, right? So you might need to look into, how do I reverse engineer an APK? And classes, at least for forensic purposes, classes like
(52:33):
the mobile forensics class that Heather Barnhart gives for SANS, teach you how to do that, how to go in and pull things from those APKs. If you don't know what an APK is, that's the program that runs on your Android device. And you can do the same thing with other mobile device apps, and look into those, and you'll be surprised what you find. Like Kevin is saying, a lot of it is encoding, and people don't
(52:55):
see the difference between encoding and encryption, right? Absolutely. Big difference, big difference. So part of our job is not only hitting the button and saying, well, it was not decoded, I can't get in. No, do some investigation. Take that APK, reverse engineer it, get that Java jar, open it up, and look for strings.
(53:16):
Try to see if you can find something that might be of interest, and you will be surprised what you will find.
Speaker 2 (53:23):
I was just reading on Twitter, or X, or whatever the heck it is. I know I shouldn't be on there anymore, it's a cesspool of garbage now, but I was reading on it. There was somebody talking about, and I don't even know who it was, I don't even know how it got into my damn feed, but somebody talking about how, oh, they'll never get my messages because I use Signal. Yeah, it kind of fits right along with this, but yeah.
Speaker 1 (53:48):
There's a word for that. It's, what's your threat model, right? So if your threat model is, I'm worried that somebody, based on my circumstances, will get my conversation because they're intercepting my communications, well, that's a lower threat level. But if your threat model is, somebody is trying to actively
(54:10):
hack into my phone, like a nation state, well, just saying, I have Signal and that makes it better, that might not be enough. The app is good. There's nothing wrong with the app. It's probably, I think, one of the better implemented communications apps, and the source code is open. People test it and validate it all the time. So I don't want to say that it's bad, it's really good. But your threat model, depending on what the risks are,
(54:31):
has to widen. If you're, let's say, in a country, you're some sort of, let's say, like, resistance, liberty-loving person, and you're trying to bring democracy to your home country, and the government doesn't want you to, it's a made-up example, then your threat model has to be way wider than maybe a person that
(54:54):
doesn't want some snooper listening to or reading their messages. Does that make sense, Heather? More or less?
Heather, more or less.
Speaker 2 (54:59):
Yeah, definitely.
Speaker 1 (55:00):
Yeah, think about what your threat model is, what are the issues that you're confronting, what are your risks, what are your vulnerabilities, and then you can, you know, prepare accordingly and, you know, do whatever you need to do.
Speaker 2 (55:15):
All right. So how about we go over an artifact? I have been planning, like, an artifact of the week when we do the podcast, even though it's not every week, but this week I'm going to show a little bit about Snapchat. So, Snapchat. There are tons of databases that are related to the Snapchat artifact.
(55:36):
I just did a few screenshots here of some of the databases that are related to Snapchat. So when you're doing an investigation and you find a video or an image that's related to Snapchat, what do you do next? Like, how did it get there? Did the user record it? What other information is available about the Snapchat
(55:56):
video? So this video here, you can see the name of the video is a long letter-number combination, and it's got the .decrypted at the end. So I'll say I used Cellebrite, and they support decrypting for this particular artifact. So I have the file name. I don't really have much else for information about this video.
(56:19):
.
So I wanted to know about thisvideo.
I was doing some testing on oneof my test phones and what
other information can I find outabout this Snapchat video?
So the cash controller databaseis where you go find what's
called the column is called anexternal key and that relates to
(56:40):
the Snapchat video.
So I took the name of the videoand searched it in the cache
controller database.
The table is the cache fileclaim and when I searched the
name of the video, I get backthe user ID that's related to
the video, which was my Snapchatuser account, and then a column
(57:01):
called external key.
I take the letter and numbercombination of called external
key.
I take the letter and numbercombination of the external key
and the first thing I do is I doa search across the entire
image.
I wanna see where that video,where that external key, is
hitting and what databases it'shitting in.
So here's just a visual of allof the different databases from
(57:24):
Snapchat that that videoexternal key hits in.
Once I did that, I startedlooking through the databases.
One of the databases related toSnapchat is called SCDB-27.
And it's a SQLite database andinside of that database you can
(57:45):
find created times, you can findsave times, snap capture times.
The Z entry ID is actually thatexternal key.
So that's how I how I locatedit in this database and in this
particular table.
Another table in that samedatabase has information about
whether the video has a locationor not.
(58:06):
This one has a one that was aflag for yes, my video has a
location data.
Has again the capture timecreated time, duration of the
video and other informationrelated to this video.
This video it also hit in thememories asset repository,
(58:30):
sqlite.
I had saved this video as amemory in Snapchat, so that is
showing that it is saved inmemories and then in the gallery
encrypted DB.
That is decrypted by Cellebriteand I believe other tools
decrypt it too.
I'm not 100% sure, but I knowCellebrite decrypts it In the
snap location table is actuallythat location information and I
(58:53):
found it again by taking thatexternal key and searching it in
the database for the video.
And the latitude and longitudeactually comes back to just
about exactly where I was parkedin the parking lot before
headed in to enjoy some sushi ata local sushi restaurant near
me.
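The search-the-whole-image-for-the-external-key step can be approximated with a small helper that brute-forces every table and column of a SQLite file. A sketch, run here against a toy database: the table and column names below are illustrative stand-ins suggested by the discussion, not Snapchat's real schema, which changes between app versions.

```python
import sqlite3

def find_value(con: sqlite3.Connection, needle: str):
    """Scan every column of every table for an exact value and return
    (table, column) pairs where it appears, the manual equivalent of
    searching an extraction for an external key."""
    hits = []
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        cols = [r[1] for r in con.execute(f'PRAGMA table_info("{table}")')]
        for col in cols:
            if con.execute(f'SELECT 1 FROM "{table}" WHERE "{col}" = ? LIMIT 1',
                           (needle,)).fetchone():
                hits.append((table, col))
    return hits

# Toy database shaped loosely like the tables described on the show
# (names are illustrative, not Snapchat's actual schema).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE cache_file_claim (user_id TEXT, external_key TEXT)")
con.execute("CREATE TABLE snap_location (external_key TEXT, lat REAL, lon REAL)")
con.execute("INSERT INTO cache_file_claim VALUES ('my_account', 'AB12-CD34')")
con.execute("INSERT INTO snap_location VALUES ('AB12-CD34', 42.73, -73.69)")
hits = find_value(con, "AB12-CD34")
print(hits)
# → [('cache_file_claim', 'external_key'), ('snap_location', 'external_key')]
```

On a real case you would loop this over every SQLite file in the extraction, which is essentially what the global search in a forensic suite is doing for you.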
Speaker 1 (59:14):
As one does. Get that sushi.
Speaker 2 (59:16):
Definitely, definitely.
Speaker 1 (59:18):
Which we'll be getting a lot of at IACIS, but that's another story for another day.
Speaker 2 (59:24):
We better be. I expect it.
So with this I kind of just wanted to show how you can connect the dots on a Snapchat image or video. When it's parsed by certain tools, you may not see the entire picture. You may have to go into the databases and find some of that information about the video yourself. The video was parsed perfectly.
(59:50):
I could watch it. It's actually me filming the parking lot of the Hanzo Japanese Steakhouse, and I was able to prove a whole bunch of other data about this video. So if it was part of one of my cases, I would have had the exact location that the video was recorded, and I would have had all of the capture times. If it's a memory, if it's shared, you can find that, this one wasn't shared, but all kinds of
(01:00:11):
other information, just by finding that external key in the cache controller database and following the breadcrumbs to all of the other databases related to Snapchat.
Speaker 1 (01:00:25):
Yeah, and you make such a great point for everybody to kind of remember, right. And if you go back to the first slide, the tools will find the media with their media-finding procedures, right. But notice, and this is not a dig on any tool, this happens a lot on all tools, they take that media, and if it comes from a decrypted container or from a database, there is no metadata
(01:00:46):
there, and if you look at it you're like, well, there's no metadata, and it's super important. But that data might be somewhere, and it might be inside a database, which it is, right. And there's no mapping, no position, nothing, but the lats and longs are there as well.
So we have to go that step further and figure out, could
(01:01:09):
this reside somewhere else? And you know this because you know how apps work. You know that some of these apps, especially apps that receive data and send data, will keep track of that media in another location. And we were talking about it before the show. If you receive a piece of media, it's called image one, and you
(01:01:29):
receive another that's called image one, they're going to collide, and one might be deleted, right? So these systems keep track, with individual IDs for every piece of media, and they keep the file name and other metadata in databases, for them to reproduce them for you, avoiding naming conflicts, avoiding all sorts of different issues that might happen if
(01:01:50):
they don't keep track of that information.
I have another example, thatone artifact that I worked on
some time ago, one of the cacheI forgot the name of the cache.
Right now, one of the cachesthat Android Image Cache Manager
.
The Image Cache Manager.
Do the tools find the pictures?
Of course they find it.
They're there, but it doesn'tgive you the context of these
images are related to the appand they were rendered by the
(01:02:14):
app.
And some of these images forexample, one of the gallery
vaults they're all encrypted,but the little copies of the
cache they're there.
You know the contents of thevault because they're copies of
it inside that cache and the apptells you that, not the app.
Sorry, the artifact puts thattogether for you.
Right and again, never just lookat the tool and say, oh, I
(01:02:36):
don't see a timestamp, itdoesn't exist.
Think about the app.
How does the app work?
Is it coming from a decrypteddatabase or decrypted data
source?
And if that's the case, thinkthat that metadata is going to
be managed by the app somewhereelse.
The question is, where is it?
And it might be inside aprotocol of data structure, it
might be inside a SQLitedatabase, it might be JSON
(01:02:58):
inside SQLite, which we've seena lot, right, heather?
Yes, so the idea for you as anexaminer is to have that thought
process, that you're going tostep into that gap where the
tool doesn't show you something,because there might be more
times than not, something toactually obtain that has value
for your case.
Speaker 2 (01:03:16):
Yeah, and this one, I mean, the stuff I need is parsed by the tools. The location is parsed, the video is parsed. I just wanted to find additional data and see if there was anything else that may be helpful in a case. Right, and there is. I may not have correlated that that video was in the locations
(01:03:36):
tab, but going and following it through its entire lifespan in the Snapchat databases will just give you the full picture related to it.
Speaker 1 (01:03:46):
There are some memories, right. What makes a memory a memory?
Speaker 2 (01:03:49):
It was saved. I don't even know how Snapchat really works all the way, but it was saved in my memories section of Snapchat. I actually had hit save at one point to get it there, so it wasn't in my camera roll. It was actually saved into my memories section.
Speaker 1 (01:04:07):
That's important, right?
Yes.
That's important. It tells you about the user interaction, what was happening in the world at the moment that happened. Do the tools highlight that for you? Did the tooling highlight the memories part for you?
Speaker 2 (01:04:21):
I didn't see the memories part, no, but the location and the video I saw.
Speaker 1 (01:04:26):
But that's the point I'm making, right. You dig a little bit, and now you're finding the memories. There was some overt action from the user to put this here as opposed to there. Yeah, it's a made-up example, but it could mean many different things. The thing is, you need to know that it's there to make some determinations about what happened.
Yeah, on older versions of Snapchat, it's not there anymore, but there used to be a plist called stories.plist, and that would actually tell you if the video was shared to the user's stories. That plist isn't there anymore, so I need to do some further testing and investigating to see where that data is stored, because it's still somewhere. Maybe it's in one of the databases I
(01:05:07):
just haven't updated recently, and that's kind of obsolete now in some of the newer versions of Snapchat.
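For anyone who wants to poke at plists like that themselves, Python's plistlib does the parsing. This is only a sketch: the key names below are invented for illustration, since the real stories.plist schema isn't documented here and, as the show notes, the file no longer exists in current Snapchat versions.

```python
import plistlib

# Hypothetical plist content for illustration only; these key names
# are NOT Snapchat's actual schema.
sample = plistlib.dumps({
    "stories": [
        {"media_id": "abc123", "shared_to_story": True},
        {"media_id": "def456", "shared_to_story": False},
    ]
})

# Parse the plist bytes and flag entries shared to the user's story.
data = plistlib.loads(sample)
shared = [s["media_id"] for s in data["stories"] if s["shared_to_story"]]
print(shared)  # ['abc123']
```

On a real extraction you'd open the file with `plistlib.load()` and adapt the keys to whatever the current schema actually holds.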
Speaker 1 (01:05:16):
There's a lot of questions in the chat. I mean, we're out of time, we're five minutes over time, but I just want to make a quick point for the folks asking some questions. If you're really interested in the provenance of images, the Forensic Scooter, Scott, I almost forgot his last name, Koenig, Scott Koenig, yeah, he did an incredible set of articles and artifacts in the
(01:05:36):
LEAPPs, in iLEAPP, that look at the Photos.sqlite database. The Photos.sqlite database keeps track of a lot of metadata about your pictures in general, and one of the fields that I like is that bundle ID field, which tells you where that picture came from, if it came from an app and, if so, what app it came from.
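As a rough illustration of what that lookup can look like, here's a sketch in Python. The table and column names (ZASSET, ZADDITIONALASSETATTRIBUTES, ZIMPORTEDBYBUNDLEIDENTIFIER) follow commonly documented Photos.sqlite layouts, but they vary across iOS versions, so treat them as assumptions to verify against your own extraction; the rows here are mock data built in memory.

```python
import sqlite3

# Build a tiny mock of Photos.sqlite; names mirror commonly seen
# layouts but differ across iOS versions -- verify on real data.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE ZASSET (
    Z_PK INTEGER PRIMARY KEY,
    ZFILENAME TEXT
);
CREATE TABLE ZADDITIONALASSETATTRIBUTES (
    ZASSET INTEGER,               -- foreign key to ZASSET.Z_PK
    ZIMPORTEDBYBUNDLEIDENTIFIER TEXT
);
INSERT INTO ZASSET VALUES (1, 'IMG_0001.HEIC');
INSERT INTO ZADDITIONALASSETATTRIBUTES
    VALUES (1, 'com.toyopagroup.picaboo');  -- illustrative bundle ID
""")

# Join each asset to the bundle ID of the app that imported it.
rows = con.execute("""
    SELECT a.ZFILENAME, aa.ZIMPORTEDBYBUNDLEIDENTIFIER
    FROM ZASSET a
    JOIN ZADDITIONALASSETATTRIBUTES aa ON aa.ZASSET = a.Z_PK
""").fetchall()
for filename, bundle_id in rows:
    print(f"{filename} imported by {bundle_id}")
```

The point is the join, not the exact names: pair the asset row with its attributes row and read off which app brought the media in.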
For the folks asking about provenance, Cellebrite PA has Insights
(01:05:59):
as a module for provenance of images. I forgot the name of it, Media Origin. It's really good. It's probabilistic, in the sense that it makes a determination that you have to verify, but it's good because it highlights where images possibly came from, if they were downloaded, if they came from a particular app, if they were generated by the phone, and
(01:06:22):
again, it adds up different artifacts or indicators to make a determination. Then you can go and validate, right? Verify and validate. Well, let me take that back. You verify that it's correct, and validation is another thing, and I find it to be useful for that. So there are many ways of investigating provenance of
(01:06:42):
media in iOS devices, and there are a lot of articles and tooling that might help you with that.
Speaker 2 (01:06:47):
Yeah, definitely. The one comment asks: can metadata specify if a video came from the gallery of the phone or was made through the app? There are indicators and flags for that. I had a coworker who actually had a case, and he was following the path, and I don't remember exactly how he figured it out,
(01:07:09):
but I'm going to get back to Malik on that, because I know that we had a case that actually involved an image coming from the gallery that was then shared in Snapchat, and there is a way to tell that.
Speaker 1 (01:07:21):
So I'll get back to
you on that.
And Josh, the man, I mean, he clearly states that Android has something similar in the external.db SQLite data store, right? So it has a field called owning package, or something like that, where it also tells you the provenance, where the picture came from. So you get places to look in iOS and Android when that's
(01:07:42):
important for your case.
Speaker 2 (01:07:51):
So let me remove this here. I don't know if we missed any of the comments, but if we did, you guys can reach out to me on LinkedIn if I missed anybody's question. Yeah, so we are at the end. It's been an hour and eight minutes, and we have the meme of the week.
Speaker 1 (01:08:10):
Oh wait, wait, wait,
before we do that.
Yep, Scott is also in the chat. We've got experts in this chat. I'm sorry, this is the smartest digital forensics hour on the planet. Every week we come out because we've got the experts like oozing out of our chat. It warms my heart, right? So Scott, the man, the world expert on Photos.sqlite, because he
(01:08:34):
is, he won't accept it. He's really humble, he's a really nice guy, but he's the world expert on this stuff. He's coming out with some blogs, steps that can be taken to gain confidence in the analysis of iOS device-captured media. So I'm really looking forward to that when it comes out, Scott, and thank you for giving us a heads up that it's coming.
We appreciate it.
Speaker 2 (01:08:53):
Oh, and when we were at the Cellebrite conference, before we go to the meme of the week here, Scott did a presentation on his Photos.sqlite work, and they have a really limited amount of time. I have never seen somebody pack so much information into 45 minutes in my life, because that Photos.sqlite stuff, that's a lot.
(01:09:14):
He has done so many queries for that, and it's a lot to talk about, and he just, you should have heard how fast he was talking, but he got it all in there and it was an excellent, excellent presentation.
Speaker 1 (01:09:27):
Uh, you know, it was well worth it, because you're like, I've got information overload already.
Speaker 2 (01:09:33):
If you ever get the chance to hear about Photos.sqlite live from him, I definitely recommend it. It's an experience. What's that?
Speaker 1 (01:09:42):
It's an experience, yeah, definitely.
Speaker 2 (01:09:44):
It's an experience, yeah, definitely. So we have the meme of the week. You take this one away.
Speaker 1 (01:09:50):
Explain away. So you've got the classical meme of the guy with a girlfriend, and he's looking at another girl that's walking by, and the girlfriend's like, what are you looking at? So the guy looking is labeled "digital forensics examiner", with the quotation marks, and the girlfriend, who's offended because he's looking somewhere else, at another girl, is labeled "copy and paste from the tool report".
(01:10:12):
And the girl that's walking by, that's so attractive to this guy, to the examiner, is labeled "copy and paste from Gen AI, from generative AI". And I made the point on that because I'm really trying, through this time period where AI is being implemented in our tooling and our procedures, to create some
(01:10:32):
awareness that our processes, and the way we do work as individual examiners, need to be revisited, and revisited every so often.
This habit of taking tool output and just putting it in our reports with little verification is bad, but doing the same,
(01:10:53):
applying the same concept to generative AI within our tooling, is going to be way worse. So at the end of the day, the issue is not the tool or the reports, or even Gen AI, right? The issue here is the examiner, us, because at the end of the day, the ones responsible for the acquisition, preservation,
(01:11:13):
parsing, analysis and reporting of that data is us. It's you that are listening. It is the person that's watching this podcast right now. So it's on us to have that propriety, moral propriety, to have that attention to detail and our due diligence, and we cannot offload or outsource that, either to the
(01:11:35):
tool reports or to generative AI. It's up to us, and it's a call for us to really up our level of expertise and look for ways to be effective in the best way possible. Couldn't agree more.
Speaker 2 (01:11:51):
All right, that's all I've got. That's all we've got. So, yeah, nothing else. Nothing else for the good of the order, Heather? I have no more topics, and we've gone past our hour. Not that it matters.
Speaker 1 (01:12:02):
Not that it matters, yeah. We do it every episode anyway, so yeah, we do.
Speaker 2 (01:12:07):
You're right, it's not the longest one. That's true, that's true.
Speaker 1 (01:12:11):
So again, I want to thank all the folks that have been chatting with us, watching live: Jess and Brett and Scott and Kevin, and all the folks that we had a chance to recognize in the chat, Malik making great questions. You made the show interesting to us and interesting to others. So thank you for being here live. And to the folks that watch later and listen later,
(01:12:32):
we also appreciate you for supporting the podcast. Hopefully it brings value to you, and a little bit of humor every now and then. And feel free to reach out to Heather, not me. I'm kidding.
Speaker 2 (01:12:49):
No, honestly, there were a few questions in there that I think I missed. So seriously,
Speaker 1 (01:12:52):
if anybody wants to hit me up in the LinkedIn messages, I will attempt, attempt, to answer your questions. Absolutely, and don't worry, we will collaborate.
So, well, again, thank you, everybody. We love you to pieces, and let's keep investigating, let's keep it going. Yeah, thank you, take care. And everybody, as I put the music on, have a fantastic night. Take care, and then I hit play.
(01:13:14):
We'll be right back. Thank you.