Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to the deep dive. Today, we are taking a
crowbar to one of the most protected digital vaults in
the world.
Speaker 2 (00:09):
The iOS operating system.
Speaker 1 (00:11):
Exactly. This is Apple's mobile fortress, a system that's really built its entire reputation on being, well, impenetrable.
Speaker 2 (00:20):
It really is the gold standard in mobile security. And that's, I mean, that's why it's so critical for anyone in cybersecurity or forensics to really understand how it's built.
Speaker 1 (00:28):
Right. We need to get past the marketing.
Speaker 2 (00:30):
Yeah, we have to look at the actual technical architecture
that makes it so strong, and maybe more importantly, understand
where the vulnerabilities are, you know, the single points of
failure that hackers and investigators are looking for.
Speaker 1 (00:41):
So our mission today is pretty simple. Yeah, we want
to grasp the architecture, understand the hardware defenses, and then
trace the pathways of attack.
Speaker 2 (00:49):
From zero days all the way to like million dollar
forensic tools.
Speaker 1 (00:53):
And that gets us right to the core of it, doesn't it? This paradox you
Speaker 2 (00:56):
mentioned. The catch-22. It's fundamental to understanding iOS because Apple keeps such a tight grip on the ecosystem. You know, every new iPhone runs basically the same kernel, same hardware, same architecture. The security is incredibly effective most of the
Speaker 1 (01:12):
time. But that last one percent? Well, if
Speaker 2 (01:14):
A single vulnerability is found, it's not just a key,
it's a universal master key.
Speaker 1 (01:20):
Because the environment is so predictable.
Speaker 2 (01:21):
Exactly, that exploit becomes highly reproducible across millions and millions
of identical devices. So that standardization gives you amazing security,
but when it breaks, the failure is catastrophic, way more
so than on say the Android ecosystem, which is just
so fragmented.
Speaker 1 (01:38):
Okay, so let's unpack this fortress. Let's use an analogy
to make these layers of defense clear. If we think
of iOS as like a high tech smart city, what
are the layers here? Starting from the ground up?
Speaker 2 (01:48):
Okay, so deep down, you've got the bedrock, the physical infrastructure. That's the Core OS, the Unix-based kernel. Yeah, it's the foundation. It's managing all the essential services, the security frameworks, direct interaction with the radios, Bluetooth, all that low-level stuff.
If you compromise this layer, you basically own the city's
(02:09):
power grid.
Speaker 1 (02:10):
And if the Core OS is the power grid, then what are, like, the city's utilities? That
Speaker 2 (02:14):
Would be the core services. This layer is what talks
to the kernel. It handles all the base communications like
what think of the TCPIP stack for internet access, how
it connects to iCloud. And this is a big one.
The default browser engine WebKit.
Speaker 1 (02:28):
Okay, so we're moving up. What's next?
Speaker 2 (02:29):
The media layer. This is the infrastructure for
more specialized tasks. It handles all the heavy lifting for
things like graphics, video rendering, and all the hardware acceleration
that modern apps need to run smoothly.
Speaker 1 (02:42):
And then, finally, the top layer, the one we actually
see and interact with.
Speaker 2 (02:45):
That's Cocoa Touch. This is the living room, you know,
the user facing environment. It manages the apps themselves, handles
all your taps and swipes, and deals with all the
documents and files inside those apps.
Speaker 1 (02:56):
So why is that specific stacking so important? Is it
to make sure that if I break through one layer,
I immediately hit another wall?
Speaker 2 (03:03):
Precisely. It's all about limiting the blast radius. Okay. And that brings us to the actual defense mechanisms. First up is sandboxing, or as some call it, jailing, the classic isolation method. It's more than that, though. Every single app runs in its own little bubble, its own restrictive environment.
It can't just go and read the memory or files
(03:24):
of another app without very specific permission.
Speaker 1 (03:27):
So it's stopping them from talking to each other, right.
Speaker 2 (03:29):
It's dynamically preventing that inter-process communication. That's why the term jailbreak exists. People are literally trying to find flaws to break out of that isolated jail.
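To make that jail concrete, here is a minimal Swift sketch of sandbox behavior as seen from inside an iOS app. The foreign container path below is purely illustrative, since real container UUIDs are randomized per device.

```swift
import Foundation

let fm = FileManager.default

// Inside the sandbox: writing to the app's own Documents directory succeeds.
let ownFile = fm.urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("note.txt")
do {
    try "hello".write(to: ownFile, atomically: true, encoding: .utf8)
    print("write inside own container: OK")
} catch {
    print("unexpected failure: \(error)")
}

// Outside the sandbox: a hypothetical path into another app's data container.
// The kernel's sandbox policy denies this before the filesystem is even consulted.
let foreignPath = "/var/mobile/Containers/Data/Application/SOME-OTHER-APP/secret.txt"
do {
    _ = try String(contentsOfFile: foreignPath, encoding: .utf8)
    print("read foreign container: OK (should never happen)")
} catch {
    print("denied by the sandbox: \(error.localizedDescription)")
}
```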
Speaker 1 (03:39):
We also hear a lot about ASLR, address space layout randomization. If sandboxing is the jail cell, what's ASLR protecting?
Speaker 2 (03:47):
So ASLR is a huge defense against a really common
type of attack called a buffer overflow. Traditionally, an attacker
finds a predictable address in memory where a program's instructions
are stored, and they can exploit it over and over again.
Speaker 1 (04:01):
Because the addresses are always in.
Speaker 2 (04:02):
The same place exactly ASLR is like a constantly shifting maze.
Every time the program runs, ASLR just shuffles all those
memory addresses around.
Speaker 1 (04:11):
So it doesn't actually stop the attack.
Speaker 2 (04:12):
It doesn't stop the underlying vulnerability, no, but it makes the attack non-reproducible. An exploit might work once by pure luck, but the next time the app runs, that memory address is gone. It's somewhere else.
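You can see that shuffling for yourself with a small Swift sketch: run it twice and compare the printed addresses, which change between launches under ASLR.

```swift
import Foundation

// ASLR demo: each launch, the OS randomizes image, stack, and heap bases,
// so these addresses differ from run to run.

var value = 42                         // lives in the binary's data region
withUnsafePointer(to: &value) { p in
    print("data address: \(p)")        // randomized image slide
}

let heap = UnsafeMutableRawPointer.allocate(byteCount: 64, alignment: 16)
print("heap address: \(heap)")         // randomized heap base
heap.deallocate()
```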
Speaker 1 (04:25):
Ah, So you make the exploit unreliable.
Speaker 2 (04:28):
You make it so unreliable it's almost useless for a large-scale attack. And on top of all of that, of course, you have the data itself protected with, you know, military-grade
Speaker 1 (04:37):
encryption. AES, 256-bit.
Speaker 2 (04:39):
Yep, meeting the FIPS 140-2 compliance standard. So you've got isolation, you've got randomization, you've got bulletproof encryption. Logically, that should be the end of the story.
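For a sense of what that primitive looks like in code, here's a minimal CryptoKit sketch of AES-256 authenticated encryption. To be clear, this is an illustration of the algorithm, not how iOS manages its file keys; on a real device the keys live in hardware, not in app memory.

```swift
import CryptoKit
import Foundation

// AES-256 in GCM mode via Apple's CryptoKit.
let key = SymmetricKey(size: .bits256)            // random 256-bit key
let secret = Data("the plaintext".utf8)

let sealed = try! AES.GCM.seal(secret, using: key)
let ciphertext = sealed.combined!                 // nonce + ciphertext + auth tag

// Without the key, this blob is computationally indistinguishable from noise.
let box = try! AES.GCM.SealedBox(combined: ciphertext)
let opened = try! AES.GCM.open(box, using: key)
print(String(decoding: opened, as: UTF8.self))    // "the plaintext"
```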
Speaker 1 (04:48):
Story, but it's not. Because of that uniformity we talked about,
that commitment to a standardized experience. That's what opens up
this other really high stakes avenue for attack. This is
the catch twenty action.
Speaker 2 (05:00):
It really is. Let's go back to the core services layer for a second. We mentioned WebKit, the browser engine. Well, from a business perspective, code reuse is great, it's cheaper, it's efficient, but for security it can be a nightmare. If a zero-day exploit, an unknown vulnerability, is found in WebKit.
Speaker 1 (05:17):
Then every single app that uses WebKit to show web
content is instantly vulnerable.
Speaker 2 (05:23):
Instantly. That uniformity that gave us all that security now creates this single point of catastrophic failure. Wow. And that's what enables attacks like drive-by downloads. You don't have
to click a bad link or download an attachment. You
just have to visit one malicious website, and because that
shared WebKit component has a flaw, your whole device can
(05:43):
be compromised.
Speaker 1 (05:44):
We've seen this in practice too. There have been some crazy historical examples where these simplified checks just led to fundamental flaws. Tell us about the masquerading attack.
Speaker 2 (05:53):
Oh, that was a fascinating one. It was a security bypass that really exploited a flaw in how Apple managed trust.
Speaker 1 (05:58):
How did it work?
Speaker 2 (05:59):
So when a developer compiles an app, it has an
internal project name. Malicious actors figured out that if they
built their malware using the exact same internal project name
as a legitimate app you already had installed.
Speaker 1 (06:10):
The system would get confused.
Speaker 2 (06:11):
Well, the system only did a very basic check. It
just checked to see if the new code was.
Speaker 1 (06:16):
signed. Just that it was signed, not who signed
Speaker 2 (06:19):
it. Right. It didn't bother to check if the new signature matched the signature of the original legitimate app. As long as the internal names matched and it was signed by someone, the malicious version would just overwrite the original one.
Speaker 1 (06:31):
That's terrifying. So the malware could just run with all
the permissions of.
Speaker 2 (06:35):
The real app and access all its data. The user
would have no idea the code underneath had been completely
swapped out.
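Here's a simplified Swift model of that flawed trust check, not Apple's actual installer logic: the vulnerable path accepts any valid signature when bundle IDs collide, while the fix also demands a matching signer. All names and IDs here are hypothetical.

```swift
// Toy model of the masquerading flaw.
struct AppPackage {
    let bundleID: String        // the "internal project name"
    let signerTeamID: String?   // nil means unsigned
}

var installed = [
    "com.bank.app": AppPackage(bundleID: "com.bank.app", signerTeamID: "BANK123")
]

// Vulnerable check: bundle ID matches an installed app, and it is signed
// by *someone*. The original app gets silently overwritten.
func installVulnerable(_ pkg: AppPackage) -> Bool {
    guard pkg.signerTeamID != nil else { return false }
    installed[pkg.bundleID] = pkg
    return true
}

// Fixed check: the new signature must come from the same team as the
// already-installed app with that bundle ID.
func installFixed(_ pkg: AppPackage) -> Bool {
    guard let team = pkg.signerTeamID else { return false }
    if let existing = installed[pkg.bundleID], existing.signerTeamID != team {
        return false
    }
    installed[pkg.bundleID] = pkg
    return true
}

let malware = AppPackage(bundleID: "com.bank.app", signerTeamID: "EVIL999")
print(installFixed(malware))      // false: signer mismatch, rejected
print(installVulnerable(malware)) // true:  malware now wears the bank app's identity
```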
Speaker 1 (06:41):
These system-level exploits are so powerful they basically become black market commodities. We've all heard of things like the GrayKey box. This is a whole industry.
Speaker 2 (06:53):
It's a constant arms race, and the price tags definitely reflect the difficulty. GrayKey got famous because it was this physical box that could systematically bypass a specific hardware flaw, which forced Apple to patch it. But you see zero-day exploits auctioned online for, I mean, upwards of one hundred thousand dollars, sometimes way more. For law enforcement or state-level actors, you either have to pay
(07:17):
that massive sum for a way in, or the investigation
just hits a dead end.
Speaker 1 (07:22):
Which brings us to the final wall, the hardware wall that
Apple controls. Why is it so incredibly difficult to get
data off a modern locked iPhone even if you somehow
manage to get past all the software layers we just talked
Speaker 2 (07:35):
about. Because the keys themselves are physically isolated. We're talking about two really specialized, low-level hardware components here. First is the secure boot chain.
Speaker 1 (07:44):
Okay, what's that?
Speaker 2 (07:45):
It's this read-only boot ROM that verifies the bootloader and the kernel before the operating system even starts to load.
Speaker 1 (07:52):
So it's like a bouncer at a club checking the
OS's ID before it lets it in.
Speaker 2 (07:56):
That's a great way to put it. It ensures the integrity of the OS load, so you can't just sneak in a malicious kernel at startup. And then for the keys themselves, you have the Secure Enclave. The coprocessor? Exactly. It's basically a separate mini computer on the chip that does one thing and one thing only: cryptographic operations. It
has its own encrypted memory, its own secure power, its
(08:16):
own random number generator. It's completely walled off. So even
Speaker 1 (08:21):
If an attacker gets full control of the main iOS kernel,
they still can't just reach over and grab the master
encryption keys.
Speaker 2 (08:27):
They cannot. That's the whole beauty of the design. The Secure Enclave keeps the keys totally isolated. It prevents all those common forensic methods, like trying to do a memory dump to find the keys, because the keys are never, ever exposed in the main CPU's memory.
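The isolation shows up directly in the API. This Swift sketch uses the Security framework to generate a P-256 key inside the Secure Enclave; the application tag is a placeholder, and it only works on hardware with an enclave, not the simulator. The app receives an opaque handle, never the private key bytes.

```swift
import Foundation
import Security

var cfError: Unmanaged<CFError>?

// Access policy: key usable only when the device is unlocked, never backed up.
let access = SecAccessControlCreateWithFlags(
    kCFAllocatorDefault,
    kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
    .privateKeyUsage,
    &cfError)!

let attributes: [String: Any] = [
    kSecAttrKeyType as String:       kSecAttrKeyTypeECSECPrimeRandom,
    kSecAttrKeySizeInBits as String: 256,  // the enclave does P-256
    kSecAttrTokenID as String:       kSecAttrTokenIDSecureEnclave,
    kSecPrivateKeyAttrs as String: [
        kSecAttrIsPermanent as String:    true,
        kSecAttrApplicationTag as String: Data("com.example.sep-key".utf8), // placeholder tag
        kSecAttrAccessControl as String:  access
    ]
]

if let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &cfError) {
    // Only the public half can ever be exported from the enclave.
    let publicKey = SecKeyCopyPublicKey(privateKey)
    print("enclave-backed key created; public key handle: \(String(describing: publicKey))")
} else {
    print("failed: \(cfError!.takeRetainedValue())")
}
```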
Speaker 1 (08:41):
Keys are physically tied to the chip itself, right, You
can't separate.
Speaker 2 (08:45):
That's right. The device's unique identifiers or UIDs are physically
burned into the silicon during manufacturing. The encryption keys are
then tied directly to those hardware identifiers.
Speaker 1 (08:55):
So older methods like chip-off forensics are useless now? Completely unreliable.
Speaker 2 (09:00):
You can't just desolder the chip and read it anymore. Physical access is no longer the magic bullet it once was.
Speaker 1 (09:06):
So if the Secure Enclave is this unbreachable fortress, the only way in is to convince the device to unlock itself, which brings us back to the passcode and biometrics.
Speaker 2 (09:16):
And that auto-wipe feature. That's the final wall. You get ten failed passcode attempts and, poof, the data is essentially gone forever.
Speaker 1 (09:24):
And what about biometrics, Touch ID, Face ID?
Speaker 2 (09:28):
They seem secure, they're convenient, but consumer biometric systems always, always sacrifice some security for user convenience. How so? It's all about something called the crossover error rate, or CER. That's the point where the rate of false positives, letting a bad guy in, crosses the rate of false negatives, locking the good guy out.
Speaker 1 (09:48):
And manufacturers don't want to lock out the actual user.
Speaker 2 (09:51):
Exactly. They don't want you to get frustrated, so they often lower the security threshold to make it work more reliably, which makes it less strict. And that's what lets determined attackers beat it with, you know, high-quality molds or masks.
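For context on how little an app actually sees, here's the standard LocalAuthentication call in Swift. The matcher and its acceptance threshold, the error-rate trade-off just described, live entirely on Apple's side; the app only gets a yes or no.

```swift
import LocalAuthentication

let context = LAContext()
var authError: NSError?

// Ask whether Face ID / Touch ID is available and enrolled.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &authError) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, error in
        // The app never sees the face or fingerprint data, just the verdict.
        print(success ? "match accepted" : "rejected: \(String(describing: error))")
    }
} else {
    print("biometrics unavailable: \(String(describing: authError))")
}
```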
Speaker 1 (10:02):
So what's a truly secure biometric?
Speaker 2 (10:05):
You'd need something much more complex, like hand geometry, which measures multiple variables at once: all five fingers, the palm structure, even heat signatures. But that's not exactly convenient for a phone.
Speaker 1 (10:17):
We can't really talk about this encryption wall without getting
into the huge power struggle it created. I'm thinking of
the San Bernardino case. Apple made a very specific choice
to stop storing copies of private keys on its own servers.
Speaker 2 (10:31):
They did, and that decision was driven by, one, marketing, to really cement their image as the leader in privacy, and two, to reduce their own legal liability.
Speaker 1 (10:43):
It completely changed the game.
Speaker 2 (10:45):
It did. Once they stopped holding those keys, they genuinely
couldn't unlock an encrypted phone for law enforcement, even if
they had a court order telling them to.
Speaker 1 (10:52):
So you have the most powerful law enforcement agencies in
the world, and they have to spend a fortune because
the company that made the device either can't or won't
open it.
Speaker 2 (11:00):
It's an insane power dynamic. They ended up spending about
a million dollars for a one time exploit. It only
worked on that specific iPhone model running that specific OS version,
and of course that exploit was immediately commoditized and became
this incredibly valuable asset on the global market.
Speaker 1 (11:17):
Okay, let's shift to the practical side. If you're an investigator and you have physical control of one of these devices, what are the first steps? How do you even get it to talk to you?
Speaker 2 (11:27):
Well, you often have to manually force it into a specific state. We rely on modes like recovery mode, which can be activated if the boot process fails, or even more powerfully, DFU mode, device firmware update. And that
Speaker 1 (11:40):
Involves those weird button combinations, right.
Speaker 2 (11:42):
Yes, specific timed sequences, like holding the power and home buttons for exactly ten seconds, then letting one go. You're signaling the device to skip the normal OS boot route and enter a state where it will accept a new firmware image from a computer.
Speaker 1 (11:56):
So once you have that gateway open, there are two
main ways to get data off.
Speaker 2 (11:59):
Two primary methods, yeah, logical acquisition and physical acquisition.
Speaker 1 (12:03):
Logical is the easy one.
Speaker 2 (12:05):
Logical is the easier path, for sure. It basically relies on the device's own protocols, usually by leveraging the iTunes backup process. If the phone is unlocked and you have the PIN, you can get a ton of stuff pretty quickly: app data, the file structure, videos. But you only get what the OS decides to include
Speaker 1 (12:23):
in a backup. It's not a complete picture of the drive.
Speaker 2 (12:26):
Exactly. For a full, true disk image, you need physical acquisition. And this is what's effectively impossible on modern locked, encrypted devices without the
Speaker 1 (12:35):
key. Because even if you get the raw data, it's just gibberish.
Speaker 2 (12:38):
All you have is a massive file of uncrackable AES-256 ciphertext. The amount of computing power you'd need to brute force that is, I mean, it's just not feasible in any useful amount of time.
Speaker 1 (12:49):
So the key is still the absolute bottleneck. Okay, but let's say we do get the key, or we have a recent logical backup. Where do investigators hunt for those really critical digital artifacts?
Speaker 2 (12:58):
We start by looking at the structure, or even the metadata. We look very hard at plist files. What are those? They're XML-formatted property lists. They store all this critical data that the user probably doesn't even know is there, like the device's IMEI and serial number, the last time it was backed up, even network IDs that can link the phone to specific cell towers.
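As a sketch of what that examination looks like, here's Swift code dumping the keys of a plist pulled from an extraction. The path is hypothetical, and every plist has its own schema.

```swift
import Foundation

// PropertyListSerialization reads both XML and binary plists.
let url = URL(fileURLWithPath: "/cases/device01/Info.plist")   // hypothetical path
let data = try! Data(contentsOf: url)

let plist = try! PropertyListSerialization.propertyList(
    from: data, options: [], format: nil)

if let dict = plist as? [String: Any] {
    // Keys often include serial numbers, device names, last-backup dates.
    for (key, value) in dict.sorted(by: { $0.key < $1.key }) {
        print("\(key): \(value)")
    }
}
```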
Speaker 1 (13:17):
And you mentioned time, the ability to spot when someone has tampered with timestamps.
Speaker 2 (13:22):
Yeah, we deal with epoch timestamps, which is just the Unix format, a long count of seconds since 1970. Our tools convert that instantly. But the real forensic gold is in the consistency. What do you mean? If a user tries to manually change a file's time, they might get the seconds or minutes right, but the system records time down to the microsecond or nanosecond. So a file
(13:43):
that was supposedly created at exactly 11:15 and 35.000 seconds? That perfect rounding is a dead giveaway.
Speaker 1 (13:52):
The telltale wobble.
Speaker 2 (13:53):
That's the telltale wobble that suggests tampering. We compare the file system time with the metadata time, and that's where we find the discrepancy.
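In code terms, the check might look like this Swift sketch: convert the epoch value, then flag fractional seconds that are suspiciously perfect. The exact-zero test here is an illustrative heuristic, not a forensic standard.

```swift
import Foundation

// Epoch timestamps count seconds since 1970. System-recorded times carry
// sub-second noise; a fraction of exactly zero hints at a manually set time.
func inspect(_ epoch: Double) {
    let date = Date(timeIntervalSince1970: epoch)
    let fraction = epoch.truncatingRemainder(dividingBy: 1)
    let suspicious = fraction == 0
    print("\(date)  fraction=\(fraction)  suspicious=\(suspicious)")
}

inspect(1_700_000_075.000000)  // perfectly round second: candidate for tampering
inspect(1_700_000_075.382911)  // organic-looking system timestamp
```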
Speaker 1 (14:00):
And then there are the real gold mines, the little data stashes people forget about that reveal their intent.
Speaker 2 (14:04):
Oh yeah, these are the hidden gems. Number one: drafts folders, email or SMS.
Speaker 1 (14:10):
How's that useful?
Speaker 2 (14:11):
People trying to communicate covertly will write a message, save it as a draft, and their accomplice logs into the same account, reads the draft, and deletes it. The message was never technically sent, so there's no carrier log, but the draft often remains on the local device.
Speaker 1 (14:27):
Yeah, okay, what's number two?
Speaker 2 (14:29):
Clipboard data. It's temporary, but people are constantly copying and pasting passwords, crypto keys, sensitive text messages, before deleting them somewhere else. That clipboard can tell you the very last sensitive piece of information they handled.
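On a live, unlocked device, that artifact is a few lines of Swift away (and since iOS 14, reading it pops a paste notification to the user):

```swift
import UIKit

// The general pasteboard holds whatever the user last copied:
// a password, an address, a crypto key.
if let text = UIPasteboard.general.string {
    print("last copied text: \(text)")
} else {
    print("no text currently on the clipboard")
}
```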
Speaker 1 (14:42):
And the ultimate digital diary of intent: the keyboard cache.
Speaker 2 (14:46):
Absolutely, the keyboard cache, or the dynamic text file. This isn't a keylogger recording every single keystroke. Instead, it learns and stores frequently typed uncommon words. Like what? Usernames, unique passwords, specific addresses, private slang or terminology. It reveals the user's intent and their specific authentication details. Even if they've meticulously
(15:08):
cleared their browser history and messages, you will often find the exact keywords or login credentials hiding in plain sight in that cache.
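A simple way to mine a recovered cache file is a strings-style sweep. This Swift sketch harvests printable ASCII runs; the file path is illustrative, since the dynamic text cache's location and format have shifted across iOS versions.

```swift
import Foundation

let path = "/cases/device01/dynamic-text.dat"   // hypothetical extracted file
let bytes = [UInt8](try! Data(contentsOf: URL(fileURLWithPath: path)))

// Collect runs of 4+ printable ASCII bytes, like the classic `strings` tool.
var run: [UInt8] = []
var words: [String] = []
for byte in bytes + [0] {                        // trailing 0 flushes the last run
    if (0x20...0x7E).contains(byte) {
        run.append(byte)
    } else {
        if run.count >= 4 {
            words.append(String(decoding: run, as: UTF8.self))
        }
        run.removeAll()
    }
}
print(words)  // usernames, addresses, the uncommon words the keyboard learned
```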
Speaker 1 (15:15):
This whole deep dive really just underlines that iOS security is this perpetually escalating arms race. Apple improves the hardware, the Secure Enclave, the boot chain, and that just forces exploit developers to spend incredible amounts of time and money to find that one tiny crack in the software armor.
Speaker 2 (15:34):
And that standardization, the very thing that makes the system so tough, also makes every single successful exploit universally effective. I mean, the sheer difficulty of doing a physical acquisition on a locked phone today just highlights the incredible value of strong encryption, and it's why every mistake an investigator makes can be so catastrophic.
Speaker 1 (15:54):
That's right. For professionals in the field, the stakes are just, they're existential. Any mistake with authorization, especially if you're touching cloud data like an iCloud backup without the right warrant, can destroy your credibility and get an entire case thrown out.
Speaker 2 (16:07):
It just takes one mistake.
Speaker 1 (16:09):
So let's leave you with this final thought. A logical acquisition relies on a backup, which is just a finite slice of data and time. What specific digital artifact, one that often reveals authentication details, would you prioritize analyzing immediately if you suspected the device owner had just cleared all their activity to hide their communications? You're hunting for intent, not just a message.
Speaker 2 (16:30):
You would go straight for the dynamic text file, the keyboard cache, because it contains the most recent, most valuable authentication data that the user almost certainly forgot to manually clear. It is the footprint of their most frequently used secrets.