Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome back to the deep dive. Just take a look
at the phone that's probably in your hand or on
the desk right now. It's not just for calls anymore,
is it? It's really a pocket-sized, heavily encrypted vault.
It's got your biometrics, your location history, your private chats, everything.
So for an investigator, that little device is the ultimate prize.
(00:20):
But it's also an armored safe built to fight back.
Speaker 2 (00:23):
That's the key difference, really. In traditional forensics, you're usually
dealing with a hard drive you can, well, clone. With
mobile forensics, you're fighting a live operating system, layers of encryption,
and evidence that's constantly changing or disappearing.
Speaker 1 (00:36):
And that fight, that technological resistance is what we're diving
into today. We're going to unpack where the critical evidence
actually hides, how experts even attempt to get it, and
some of the well surprising legal traps that can completely
sink an investigation. Okay, so let's start with the basics.
The operating systems. We've mostly got Android and iOS and
they are fundamentally different beasts. What is it about their
(00:58):
architecture that makes getting data out so different depending on the device?
Speaker 2 (01:03):
Well, the biggest difference, and it's a huge one, is that
Android is built on top of the Linux kernel, and
for a forensic analyst, that's a double edged sword. It
means you have this incredibly powerful open foundation to work with.
Speaker 1 (01:16):
Right, so under that slick user interface, it's really
just a specialized little Linux machine. So how does an
analyst talk to that kernel? You can't just open a
terminal window on a locked phone.
Speaker 2 (01:27):
You can't. So they rely on something called the Android
Debugging Bridge or ADB. You should think of ADB as
the main forensic doorway. It lets an analyst connect to
the phone and run commands directly, completely bypassing the normal interface.
That is, if they have the right permissions, which usually
means you need to root the phone.
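The ADB workflow described above can be sketched in a few lines of Python. This is a hypothetical dry-run helper, not any vendor's tool: it assumes the `adb` binary is on the PATH and that the device is rooted and already authorized, and by default it only builds the command lines so they can be reviewed before anything touches a device.

```python
# Hypothetical sketch of scripting ADB for evidence collection.
# Paths and the device serial are illustrative assumptions.
import subprocess

def adb_cmd(serial, *args):
    """Build an adb command targeting one specific device."""
    return ["adb", "-s", serial, *args]

def pull_artifacts(serial, run=False):
    """Typical read-only collection steps; dry run unless run=True."""
    cmds = [
        adb_cmd(serial, "shell", "getprop", "ro.build.fingerprint"),
        adb_cmd(serial, "shell", "dmesg"),          # kernel ring buffer
        adb_cmd(serial, "pull", "/sdcard/DCIM", "./evidence/DCIM"),
    ]
    if run:
        return [subprocess.run(c, capture_output=True, text=True) for c in cmds]
    return cmds  # dry run: just show what would be executed

for c in pull_artifacts("emulator-5554"):
    print(" ".join(c))
```

The dry-run default mirrors forensic practice: you document exactly what will be executed before altering a live device.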
Speaker 1 (01:45):
Root it first, gain total control, basically.
Speaker 2 (01:47):
Exactly, superuser access, and once you're in you can run
some really powerful commands, like, for instance, dmesg.
Speaker 1 (01:53):
It's a log, a log of what the operating system
itself has been up to, pretty much?
Speaker 2 (01:59):
It's like the phone's diary. It shows you every driver
that loaded, every piece of hardware that woke up. It
can give you clues about tampering or you know, strange
activity that you wouldn't see otherwise.
Speaker 1 (02:09):
Now Linux and Unix systems are famous for their security,
especially sandboxing. The idea that every app is in its
own little cell, unable to talk to the others. How
does that work and why is it a problem for investigators?
Speaker 2 (02:22):
That isolation comes from something called SELinux, Security-Enhanced Linux.
It enforces what's called mandatory access control. It gives every
single app its own user ID and it blocks them
from interacting. Which is great, right? It's why some random
game you downloaded can't just steal your banking password.
Speaker 1 (02:40):
But I've heard there's a kind of flaw in the design,
something called a covert channel.
Speaker 2 (02:45):
Ah. Yes, that's the trade off. See, while the apps
are in their own sandboxes, they still have to share things.
They might both need access to your text messages, for example,
or your microphone. That's a shared resource.
Speaker 1 (02:55):
So they're not totally isolated?
Speaker 2 (02:58):
They're not, and that shared access creates a covert channel. A malicious
app can't read your banking data directly, but it can, say,
use the microphone for a fraction of a second longer
when you type the number one. A second malicious app,
also listening to the microphone, sees that tiny timing difference
and knows you typed a one. It's a way of
(03:19):
leaking data in tiny, tiny pieces.
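The timing trick described above can be reduced to a deterministic toy. This is not a real attack implementation, just an illustration of the encoding: app A leaks each digit by holding a shared resource a few milliseconds longer, and app B, which can also observe the hold durations, decodes them. The 10 ms baseline is an arbitrary assumption.

```python
# Toy model of a timing covert channel: digits leak as extra-long
# microphone holds. BASE_MS is an assumed baseline hold time.
BASE_MS = 10

def leak(digits):
    """Sender: encode each digit as an extra-long mic hold (ms)."""
    return [BASE_MS + int(d) for d in digits]

def observe(durations):
    """Receiver: recover digits from the timing differences."""
    return "".join(str(ms - BASE_MS) for ms in durations)

pin = "1984"
holds = leak(pin)          # [11, 19, 18, 14] milliseconds
assert observe(holds) == pin
print(holds, "->", observe(holds))
```

Real covert channels are far noisier, but the principle is the same: data crosses the sandbox boundary through a side effect, not through any permitted API.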
Speaker 1 (03:21):
That's incredible. So the flaw is really a consequence of
making the phone usable.
Speaker 2 (03:25):
It's a fundamental conflict between perfect security and functionality.
Speaker 1 (03:30):
And getting these malicious apps on a phone? Is that hard?
Do you need to be some master hacker?
Speaker 2 (03:36):
Not always. See, all Android apps, or APK files, have
to be digitally signed. But here's the crazy part. You
don't need to steal a signature from a big company.
You could just become an official developer yourself, sometimes for
as little as fifty dollars.
Speaker 1 (03:49):
Fifty dollars and you get a trusted certificate?
Speaker 2 (03:51):
Yep, and then your malicious app looks completely legitimate to
any forensic tool that's just checking for a valid signature.
It's a huge blind spot.
Speaker 1 (03:58):
Wow. Okay, so the architecture is, let's say, complicated. Where
does the actual evidence live and how do you get
past the encryption?
Speaker 2 (04:06):
This is what we call the variability problem. There's no
single answer. You've got the SIM card, which might have the
subscriber ID, maybe a few old text messages, but the
real stuff, the photos, the videos, the thousands of messages,
that's all on the internal NAND memory.
Speaker 1 (04:21):
There's no standard for where things are stored on that memory.
Speaker 2 (04:24):
None at all. A text message could be in one
place on a Samsung and a totally different place on
a Google Pixel. You have to know the specific model
inside and out.
Speaker 1 (04:33):
And as soon as you try to access that memory,
you hit the encryption wall.
Speaker 2 (04:36):
You hit a solid brick wall. Full-disk encryption is
standard now, and without the user's PIN or the key,
you are, for the most part, completely stuck. It's especially
true on modern iPhones, where the encryption keys are
physically fused to the processor in what's called the Secure Enclave.
So even if you physically remove the memory chip, all
you have is a chip full of meaningless encrypted garbage.
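The "chip full of garbage" point can be illustrated with a toy cipher. To be clear, this is not real full-disk encryption, which uses AES with hardware-fused keys; it is just a stream-cipher sketch showing that without the exact key, the recovered bytes are meaningless.

```python
# Toy illustration only: XOR the data with a keystream derived from
# the key via SHA-256. Real devices use hardware-backed AES.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Encrypts and decrypts (XOR is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = b"meet at the pier, 9pm"
chip_contents = xor_crypt(b"pin-derived-key", secret)   # what chip-off yields

print(xor_crypt(b"pin-derived-key", chip_contents))  # right key: plaintext
print(xor_crypt(b"wrong-key", chip_contents))        # wrong key: garbage
```

The point of the sketch: pulling `chip_contents` off the board gets you nowhere, because the key material never lived on the memory chip.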
Speaker 1 (05:00):
The key isn't on the chip. Which brings us to another problem,
volatile data RAM. If the phone is on, there's crucial
evidence in its memory. Turn it off and it's gone.
Speaker 2 (05:11):
Forever, and that creates the investigator's dilemma. It's a true
catch twenty-two. Say you seize a phone that's turned on.
If you power it down, you lose everything in RAM.
But if you keep it on, you have to keep
touching the screen to stop it from locking, and every
single touch you make alters the evidence.
Speaker 1 (05:28):
So you're contaminating the crime scene just by trying to
preserve it.
Speaker 2 (05:31):
Exactly. A defense attorney would have a field day with that.
The phone becomes this digital time bomb.
Speaker 1 (05:35):
And what about the opposite end of the spectrum, the
cheap burner phones.
Speaker 2 (05:40):
Oh, feature phones, they can be a total nightmare. Your
expensive sophisticated forensic tools often have no idea what to
do with them. They use proprietary, undocumented systems. There's no ADB,
no standard file system. You're often forced to go back
to square one with much more basic techniques.
Speaker 1 (05:56):
Okay, so since you can't just copy and paste the files,
walk us through the process. What's this pyramid of acquisition
that investigators use?
Speaker 2 (06:03):
Right. So the basic principles of forensics always apply: preserve, acquire, analyze,
and report, and you always acquire the data first. The
pyramid just ranks the acquisition methods by difficulty and reliability.
At the very bottom, level one, is manual acquisition, literally
photographing the phone.
Speaker 1 (06:20):
Which sounds low tech.
Speaker 2 (06:23):
It is. It means you literally take a high resolution photo of
every single screen on the device, every contact, every call
log entry, every single text message.
Speaker 1 (06:33):
So if there are thirty thousand text messages on a phone,
that's thirty thousand pictures. How is that even admissible in court?
You could argue the chain of custody is broken.
Speaker 2 (06:41):
It's a huge challenge. It's slow, expensive, and legally very fragile,
but sometimes it's the only option you have. Moving up.
Level two is a logical acquisition. This is much better.
It's non invasive. A tool connects to the phone and
just asks the operating system for the allocated files, like
through a backup. Forensic tools, for instance, will often just
trick an iPhone into doing a normal iTunes backup because
(07:03):
that's a really reliable way to get call logs and messages.
Speaker 1 (07:05):
Okay, that makes sense. What's next.
Speaker 2 (07:07):
Level three is a file system acquisition. This goes a
bit deeper, trying to get a full copy of the
file structure, maybe even some deleted space. And then at
the top, the peak of the pyramid is level four
physical acquisition. This is the holy grail, a perfect bit
for bit clone of the entire memory chip.
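Whatever level of the pyramid produced the data, standard practice is to hash the acquired image immediately and re-hash it before analysis to prove nothing changed, which is what makes an acquisition defensible in court. A minimal sketch with SHA-256; the image bytes here are a stand-in, not real evidence.

```python
# Chain-of-custody sketch: hash the acquired image at seizure time,
# then verify the hash again before analysis.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

acquired = b"\x00" * 4096          # stand-in for a bit-for-bit image
acquisition_hash = sha256_of(acquired)   # recorded at acquisition

# ...later, in the lab, before analysis:
assert sha256_of(acquired) == acquisition_hash, "image was altered!"
print("integrity verified:", acquisition_hash[:16], "...")
```

If even one bit of the image changed between acquisition and analysis, the hashes would no longer match and the evidence would be challengeable.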
Speaker 1 (07:25):
But you said that's basically blocked now by encryption.
Speaker 2 (07:28):
Non invasive physical acquisition is, yes, which leaves the destructive
methods, things like JTAG, where you solder wires directly
onto the motherboard to talk to the chips, or, even
more extreme, chip-off forensics, where you literally desolder the
memory chip from the board and put it in a
special reader.
Speaker 1 (07:44):
But again, if the phone is encrypted, you've just destroyed
a very expensive device to get a chip full of
useless data.
Speaker 2 (07:49):
That's the modern reality, and sometimes the biggest roadblock is
much simpler: a cracked screen.
Speaker 1 (07:55):
How can a cracked screen stop the whole process?
Speaker 2 (07:58):
Well, all those tools, like Cellebrite or XRY, they need
you to interact with the phone. You have to tap
trust this computer or enter the PIN code. If the
screen is shattered and the touch sensor is broken, you
can't do that. The whole investigation stops until a hardware
specialist can physically replace the screen without wiping the data.
It's a whole other skill set.
Speaker 1 (08:19):
Incredible. Okay, let's shift to how these phones communicate. How
do they talk to cell towers and how can that
be intercepted?
Speaker 2 (08:28):
Well, most modern cell communication is encrypted, so just listening
in is very difficult. But law enforcement can use devices
commonly known as stingrays that essentially pretend to be a
cell tower.
Speaker 1 (08:39):
A fake tower. How does it trick the phone?
Speaker 2 (08:41):
It's actually pretty clever. Your phone is designed to always
connect to the strongest signal available to prevent dropped calls.
The stingray just sits nearby and blasts out a
signal that's stronger than the real towers from Verizon or
AT&T. The phone automatically connects to it, thinking
it's a legitimate tower, and then.
Speaker 1 (08:56):
The stingray can just read all the traffic. A man
in the middle attack.
Speaker 2 (09:00):
A classic man in the middle. It decrypts the traffic,
logs it, and then reencrypts it and passes it on
to the real network. The user has no idea.
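The handover logic the stingray exploits can be reduced to a single line: the phone simply attaches to whichever visible tower is loudest. Signal strengths below are in dBm (closer to zero means stronger) and the values and tower names are made up for illustration.

```python
# Sketch of the attach decision a stingray abuses: pick the tower
# with the strongest received signal. Values are illustrative dBm.
def pick_tower(towers: dict) -> str:
    return max(towers, key=towers.get)

visible = {"real-tower-A": -95, "real-tower-B": -88, "stingray": -60}
print(pick_tower(visible))  # prints "stingray": the loudest wins
```

Because the selection rule is purely signal-strength based and exists to prevent dropped calls, the phone has no built-in way to distinguish a legitimate tower from a louder imposter.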
Speaker 1 (09:08):
And when tracking a device, there are two key numbers involved,
right? The phone's ID and the user's ID.
Speaker 2 (09:14):
Exactly. You have the IMEI, which is the unique serial number
for the physical phone hardware. It never changes. Then you
have the IMSI, which is the identity of the subscriber
that's tied to your SIM card. It's a critical distinction
because someone could be using multiple SIM cards, multiple IMSIs,
in one burner phone with a single IMEI. You need
to track both.
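One practical detail worth adding: IMEIs carry a Luhn check digit, so a number recovered from a device or a tower log can at least be sanity-checked. Below is a standard Luhn implementation; 490154203237518 is the well-known GSM test IMEI, not a real device.

```python
# Luhn check-digit validation, as used for the 15-digit IMEI.
def luhn_valid(number: str) -> bool:
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9          # same as summing the two digits
        total += d
    return total % 10 == 0

print(luhn_valid("490154203237518"))  # True: the GSM test IMEI
print(luhn_valid("490154203237519"))  # False: last digit flipped
```

A failed check means the number was mistyped or fabricated; a passing check of course proves nothing about which physical phone it belongs to.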
Speaker 1 (09:35):
Okay, and this brings us to what might be the
biggest legal land mine of all: the cloud. Forensic tools
can sometimes find cached login details on the phone,
a username, a password. It must be tempting to just use them.
Speaker 2 (09:48):
It's incredibly tempting, and it is probably the single most
dangerous mistake an investigator can make. Even if you have
the username and password right there, you absolutely must not
use them to log into a cloud account. Not
an email, not social media, nothing. Not without a completely
separate court order that is directed at the service provider,
like Google or Apple.
Speaker 1 (10:06):
What happens if they do it anyway?
Speaker 2 (10:08):
It's a direct violation of the Computer Fraud and Abuse Act. You've essentially
just committed a federal hacking crime. All the evidence you
find becomes inadmissible, fruit of the poisonous tree. It can
get the entire case thrown out of court, no
matter what else you found on the phone.
Speaker 1 (10:22):
So the data sitting on the device becomes even more important.
The metadata.
Speaker 2 (10:25):
The metadata is the real treasure. GPS coordinates are baked
into photos all the time, calendar events have locations, but
one of the most overlooked sources is the phone's personal dictionary,
the spell-check history. People type in unique slang, usernames,
even password fragments. They think it's gone, but the phone
(10:46):
remembers it to help with autocorrect. It can be a
gold mine.
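On the GPS-in-photos point: EXIF stores coordinates as degree, minute, second rationals plus an N/S or E/W reference, and converting them to decimal degrees is one line of arithmetic. The coordinate below is a made-up example, not taken from any real photo.

```python
# Convert an EXIF-style DMS coordinate to decimal degrees.
# South and West references flip the sign.
def dms_to_decimal(d, m, s, ref):
    sign = -1 if ref in ("S", "W") else 1
    return sign * (d + m / 60 + s / 3600)

# Hypothetical GPSLatitude/GPSLongitude values from a photo's EXIF:
lat = dms_to_decimal(37, 46, 30.0, "N")
lon = dms_to_decimal(122, 25, 6.0, "W")
print(lat, lon)
```

Once in decimal form, the coordinate can be dropped straight onto a map, which is exactly why geotagged photos are such productive evidence.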
Speaker 1 (10:51):
So this whole deep dive really shows that mobile forensics
is just a brutal landscape. You have all this variation
between operating systems, this nearly unbreakable hardware encryption, and this
whole pyramid of techniques from taking thousands of photos all
the way up to physically ripping chips off a motherboard.
Speaker 2 (11:06):
And we learned that the tech challenges go hand in
hand with these huge legal risks. One wrong move, like
using a cached password, and you could destroy your own case.
Speaker 1 (11:15):
Right, and we talked about how usability choices, like in Android's design,
created some of these security holes. But let me flip
that around. If usability can be a threat to evidence,
why is designing a perfectly secure phone, one that's completely
impossible for anyone to get into, often seen as a
failure of public safety design?
Speaker 2 (11:34):
That's the core of the whole debate, isn't it? If
a device is perfectly, absolutely secure, it means it's inaccessible
even with a legal warrant. It shifts the balance of
power completely. We see that with the Secure Enclave. While
strong encryption protects an individual's privacy, which is vital, a
system with zero possible access for law enforcement, even under
(11:54):
court order, puts total anonymity above justice and accountability, and
that creates a massive societal problem with no easy answer.
Speaker 1 (12:02):
Something to think about, for sure. Thank you for walking
us through this. It's an incredibly complex world.
Speaker 2 (12:07):
My pleasure. It's great to talk about it.
Speaker 1 (12:08):
We'll see you next time on the Deep Dive.