Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to TechStuff, a production from iHeartRadio. Hey there,
and welcome to TechStuff. I'm your host, Jonathan Strickland.
I'm an executive producer with iHeartRadio, and how the tech
are ya? It is a Friday, which means it's time
for a classic episode of TechStuff. This one is
(00:26):
titled Apple versus the FBI, and it originally published on
March second, twenty sixteen. This is sort of an ongoing
struggle between companies like Apple and various agencies because, as
we will hear in this episode, the FBI wanted ways
to be able to access locked iPhones that had been
(00:49):
recovered from, say, a suspect, and Apple is trying to
do its best to maintain consumer confidence by protecting their
data and their devices. We've heard similar things, which I'll
talk about more at the end of this episode, that
have involved other law enforcement agencies and investigative agencies here
(01:14):
in the United States. But first let's listen to this
classic episode from twenty sixteen. So we're gonna talk about
something where we're actually gonna have to work very hard,
for multiple reasons, because we're talking about Apple versus the
FBI, the whole story that's unfolding as we record this, and
to talk about what is behind that case, what are
(01:36):
the implications, what is the FBI's argument, what is Apple's argument?
And in addition to that, we have, of course the
added responsibility of remembering that this is all centered around
a truly awful crime. Yes, absolutely so. What we're
(01:57):
talking about specifically hit mainstream news when Apple
did something that a lot of tech companies have never done.
They issued a blanket statement letter and went public. Yeah,
they went public with a response to an FBI request.
But I guess I'm getting ahead of it. What was
(02:18):
the crime? Well, let's yeah, let's I'll give you the
background on what's happened and we'll build from there. So
first we have to look back on December second, twenty
fifteen in San Bernardino, California. That's in southern California, and
that's when two people, an American citizen and a legal
resident, a husband and wife team, both of Pakistani descent, committed a
(02:45):
mass shooting. It was Syed Rizwan Farook and Tashfeen
Malik who committed this act, and it happened at the
Department of Public Health. They were having an event that
was starting off as a training event and then it
was supposed to transition into a holiday event. And after
(03:06):
the training element, Farook, who was actually at the event,
he was an employee of the Department of Public Health,
he was a food inspector, left, came back with his wife.
They were both heavily armed and they both started firing
into the crowd of people at the Department of Public Health.
Fourteen people were killed, twenty two injured. Very serious crime.
(03:28):
Then Farook and Malik left. They fled the scene. Turned
out they had also left behind an explosive device, which
thankfully did not detonate. Failed to detonate, and they fled
in a vehicle that they had rented a few days before,
(03:49):
and several hours later law enforcement tracked them down. There
was a confrontation, there was a shootout, and then both
of the shooters died as a result of that shootout.
The FBI has stated that they believed, based upon the
evidence they were able to find that the two were
acting on their own, that they were not part of
some sort of terrorist cell in the United States. However,
(04:12):
they can't be absolutely certain of that. That's where the
crux of this issue with Apple is going to come
into play, and the big part of it is they
can't account for just under twenty minutes of
the activity that happened leading up
(04:33):
to the shooting and around the shooting, because they're thinking
that there's a possibility that something that happened within that
time could give them more information and at least allow
them to confirm whether or not they truly acted on
their own or if they were under the direction of
some other group, which could potentially and this is an
FBI argument, which could potentially give the government the ability
(04:57):
to prevent a future attack. And more to that point,
neither of them had been listed in any database as
a potential threat, So that puts extra pressure on the government,
right because citizens say, well, why did this happen? The
government says, we followed procedure and neither of them registered
(05:20):
on any of our lists. There were no red flags, so
we had no way of knowing. And that means there's
actual extra pressure on the FBI to investigate this thoroughly,
partially to show that in fact, everything that could be
done had been done short of taking some extreme step
that none of us want to see, right, the idea
(05:41):
of like, okay, everyone of Pakistani descent has to leave
the country. That's ridiculous. That's returning to
an era of United States history that we do
not want to revisit, a World War Two internment camp situation. Yeah,
we don't want that. We don't want that, because it
is never fair to lump innocent people in because
one of them might be guilty. That's not cool, right,
(06:03):
this is I'm already getting angry and i haven't even
gotten to the part about the apple stuff yet. I'm
going for a slow burned thing myself. That's fair. So
one of the thing that's that all of this is about,
ostensibly any at any rate, is an iPhone. It's specifically
an iPhone five C probably running iOS nine. That's based
(06:23):
upon the various public filings we've seen about this. And
it was county owned. It was owned by the Department
of Public Health and then issued to Farook, who worked
for the Department of Public Health, so it was
county owned. The FBI went to the county and said,
we want your permission to access the contents on this phone.
(06:45):
The county said, of course, you have our permission, you
can access it. Now, if that's all their worked to it,
it'd be fine. Sure, I would just access the phone.
They've already received permission from the phone's owner, and they
could cull through it to look for any evidence that
would lead them to more information about this crime. But
there is a security measure on the phone, and it's
(07:08):
a very simple one. It's on lots of smartphones. It's
a little password. In this case, it's a four or
six digit PIN, which can be alphanumeric if you
activate that option on the six digit one. So it's one
of the two, we don't know which, and without that PIN,
(07:29):
you cannot access the information, because of the way the PIN works.
This is where we get into the tech stuff, Ben. Right.
There are two levels of encryption, or two keys to
this encryption. One of the keys is the PIN. The
other key is hard coded onto the device itself. All right,
it's like those
(07:51):
nineteen eighties movies with the nuclear missiles, where you have
to have two generals put their keys in at the same
time, that sort of thing. So if you don't have the PIN,
it can't combine with the hardware key, and therefore you
cannot unlock the information. The idea here is that Apple
doesn't have the PIN, right, which is cool because it
(08:13):
means that if you or I or you listeners have
an iPhone, it means you can trust Apple. Because they
don't know your PIN, they can't access your phone.
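As a concrete illustration of that two-key idea, here is a minimal Python sketch of entangling a PIN with a device-unique hardware key. This is not Apple's actual Secure Enclave design; the key value, the choice of PBKDF2, and the iteration count are all illustrative assumptions.

    import hashlib

    # Illustrative only: a per-device secret fused into the chip at the
    # factory. Made-up value; on a real phone it never leaves the hardware.
    HARDWARE_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")

    def derive_unlock_key(pin: str, hardware_key: bytes = HARDWARE_KEY) -> bytes:
        # Mixing the PIN with the hardware key means the same PIN yields a
        # different unlock key on every device, and neither ingredient alone
        # is enough to decrypt anything. A slow KDF like PBKDF2 also makes
        # each guess deliberately expensive, in the spirit of the per-attempt
        # delay discussed below.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), hardware_key, 100_000)

    # Same PIN, different hardware key, different unlock key:
    print(derive_unlock_key("1234").hex())
    print(derive_unlock_key("1234", bytes(16)).hex())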
You know, that was the whole point, right? Two points. One,
Apple wanted to make it secure enough so that consumers
would say, I feel good about buying an iPhone because
(08:36):
I know whatever I store on it, whether it's something
that's personal or it's just, you know, no one
else's business, whatever it may be, sure, no one else
is going to have access to that unless I give
them my PIN. It's good for Apple, not just because
consumers are happy, but because Apple then has an out.
If the government comes to Apple and says, hey, we
want you to break into this phone, they can literally
(08:58):
say we cannot do that. It's not that we will not,
it's that we literally physically cannot grant your request because
it's impossible. And also, just to expound on
that point a little bit further, I'm not suggesting
this is their conscious will or intention, but this also removes
possible culpability. Yes, yes, so all of these are very
(09:22):
important points, right, So let's get into a little bit
more of why the FBI is coming to Apple. So
let's say that it's a four digit PIN. Okay, all right,
all numeric, all numeric, simplest version. That means there are
ten thousand combinations that are possible for that PIN. That's it.
You're limited to ten thousand, okay, which is
(09:44):
a lot, but it's not that much if you're going
to do something like a brute force attack. Right. So,
if you do a brute force attack on a normal system,
and there are only ten thousand variations, with a
fast enough computer it's just a matter of minutes, right?
But there are some limitations on the iPhone that make
this harder. Does it freeze if you enter the incorrect PIN? Oh,
(10:06):
It does more than freeze, all right. So first of all,
there is an eighty millisecond delay between when you enter a
PIN and when it gets verified that it is or
is not the correct one, when the key turns, right?
So with that eighty millisecond delay, that doesn't sound like much,
but it means you can't just blast a whole bunch
of numbers through. Also, you have to type it in
on a screen. You can't hook it up to a
(10:28):
computer and just digitally send the various pin combinations to
try and get through. You have to actually tap them
on the screen to do it over and over. So
that means you tap in the number, eighty milliseconds pass,
you get the confirmation or denial. Tap in the next number.
If you hit ten consecutive incorrect PIN entries, the phone
(10:53):
interprets that as saying someone else, who is not the
owner, has gotten possession of this phone. So in order
to protect the owner's information, we're scrambling everything and you
lose all the metadata. Everything is useless. So even if
you got the PIN afterward, you would not be able
to access the stuff you were looking for in the
first place. You would maybe be able to turn on
(11:14):
the phone, but there wouldn't be anything there to find. Yeah,
it would
be like you just accidentally set fire to the file room.
Like, you know, you can't get into the file
room door, and whatever you try and do accidentally sets
a fire inside the file room. You then get the
key to the file room, and then you think, well,
what's the point? You've lost everything. And it's virtually assured,
(11:37):
yeah, that out of those
ten thousand tries, your first ten are going to be wrong. Yeah,
I mean, the odds are pretty much against you, unless you're ridiculously lucky.
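Here is a toy Python model of the lockout policy just described, the roughly eighty millisecond check per attempt plus the wipe after ten consecutive misses. It sketches the rules as stated in the episode, not Apple's actual firmware logic.

    import time

    VERIFY_DELAY_S = 0.080   # the roughly eighty millisecond check per attempt
    MAX_FAILURES = 10        # ten consecutive misses and the data is gone

    class ToyLockedPhone:
        def __init__(self, correct_pin: str):
            self._pin = correct_pin
            self._failures = 0
            self.wiped = False

        def try_pin(self, guess: str) -> bool:
            if self.wiped:
                return False              # the file room has already burned
            time.sleep(VERIFY_DELAY_S)    # per-attempt verification delay
            if guess == self._pin:
                self._failures = 0
                return True
            self._failures += 1
            if self._failures >= MAX_FAILURES:
                self.wiped = True         # key material gone; even the right PIN no longer helps
            return False

    # A naive brute force loses: ten wrong guesses in, everything is gone.
    phone = ToyLockedPhone("7391")
    for n in range(10_000):
        if phone.try_pin(f"{n:04d}"):
            break
    print(phone.wiped)  # True, unless the correct PIN happened to be in the first ten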
So there's that. And of course,
if it were an actual later iPhone,
(11:58):
if it weren't the iPhone five C, they would have
an added problem, which is that with the later models
there's an additional delay after four failed tries. So if
you try four times and fail, it will then give
you a five second delay before you can try the
fifth time. After that, it gives you a fifteen second
delay, and so on. Like, by the time you get to
(12:20):
the ninth try, it's an hour delay. Like, think about
what you're doing, right? But the five C doesn't
have that, so you may have read about that delay;
it does not apply in this particular case. So brute force
won't work even in the most basic PIN situation. But with
the six digit alphanumeric one, I'd imagine the same
(12:40):
rules apply. Yeah, yeah, exactly. But the
possible answers are much greater, because now you've got six
digits, and you've also got the possibility of alpha characters in
there, alphabet characters. I believe so. So, anyway, from
what I hear, if you were to try and use
(13:00):
brute force on a six digit alphanumeric one, it
would take you, with a fast computer that
had been optimized for this, about
six years to break through it. And that's going through
all the different possible combinations, assuming that there's not another
kill switch type deal like there is with the iPhone.
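The arithmetic behind those estimates is easy to check. Here is a back-of-the-envelope sketch in Python, using the eighty millisecond per-guess figure from the episode; reading "alphanumeric" as a thirty-six character alphabet (digits plus lowercase letters) is an assumption.

    DELAY_S = 0.080                      # ~80 ms per guess, per the episode

    four_digit = 10 ** 4                 # 10,000 all-numeric four digit PINs
    print(four_digit * DELAY_S / 60)     # ~13 minutes to try every one

    six_alnum = 36 ** 6                  # ~2.18 billion six character codes
    years = six_alnum * DELAY_S / (3600 * 24 * 365)
    print(round(years, 1))               # ~5.5 years, close to the "about six
                                         # years" quoted, even before the
                                         # escalating delays on later models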
By the way, that's software that Apple has built into
(13:22):
the iPhone, or really firmware that Apple has built into
the iPhone. It's not like it's a fundamental, you know,
quality of all smartphones. Oh right, yeah, yeah, that's
specific to Apple phones. Right. So you've got this problem, right?
You have the FBI. They've gotten permission from the owner
of the phone to access the phone's contents, but the
(13:45):
owner of the phone doesn't know the PIN, because the
owner of the phone had issued it to an employee.
The employee had come up with the PIN. So they
don't know if it's a four digit, they don't know
if it's a six digit. And this feature where after ten tries everything
gets erased? They don't know if that's necessarily active, because
you can turn it off, but all indications point to
(14:09):
it being on. For one, it was on when the
phone was issued to Farook, right, and phone users typically don't
change their defaults. So the FBI wants Apple
to do something particular, and it's something different from what
we've talked about in previous episodes about a backdoor entrance
(14:31):
into a system. It's not quite the same thing. So
you've got this phone, you can't brute force attack it
without risking damaging the contents. Apple cannot access the information
on this phone by design; they did not want
to have that capability. So what the FBI wants Apple
(14:52):
to do is build a new version of iOS just
for this phone, a specific iOS for this phone that
disables the safety features: one, removing what prevents brute force
attacks from happening quickly; two, allowing brute force attacks to
happen by hooking up the phone to a computer so
(15:13):
you don't have to tap those numbers in. Sure. And
three, disabling that kill switch. So what the FBI
wants Apple to do is to build up a brand
new iOS again just for this one phone, and then
install it on the phone. This would all you know,
hypothetically happen on Apple grounds, like at an Apple location,
(15:38):
either at the corporate headquarters or wherever. And then, in theory,
you would then destroy the custom iOS, because
you only needed it for that one phone. Right, because
it was so easy to toss the One Ring, bring
it back to Mordor and destroy it. Right. Yeah, yeah,
(15:59):
you know what, Mount Doom, plenty of backdoors. It's fine,
it's integral to the plot. That was Sauron's
problem, that he did not plug the security vulnerabilities
in Mount Doom. I mean, that's a classic example,
like, you know, right up there with the whole land
war in Asia thing. So the FBI has been very, very
(16:22):
careful in framing this in a way that presents it
as a reasonable request, a one time thing. That's
a big deal, right, saying this is for one phone
and one phone only. You know, we're suggesting
that Apple creates an iOS that's directly tied to a
unique identifier on that phone, meaning even if the iOS
(16:44):
were somehow to leak, sure, it would not be applicable
to any other device on the market, so it could
only work for this one phone.
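Apple devices do carry a unique chip identifier (the ECID), and the proposal amounts to firmware that checks that identifier before doing anything. Here is a hypothetical sketch of the gating logic; the identifier value and the three stub functions are made-up stand-ins for what the real modifications would do.

    TARGET_DEVICE_ID = "0x000DEADBEEF"  # the one phone's unique ID (made up)

    def disable_retry_limits() -> None:
        print("ten-try wipe disabled")            # stand-in

    def allow_wired_pin_entry() -> None:
        print("PIN guesses accepted over cable")  # stand-in

    def remove_attempt_delays() -> None:
        print("inter-attempt delays removed")     # stand-in

    def firmware_main(device_id: str) -> None:
        # The point of the check: even if this image leaked, it would refuse
        # to run on any device other than the one named in the court order.
        if device_id != TARGET_DEVICE_ID:
            raise SystemExit("wrong device, refusing to run")
        disable_retry_limits()
        allow_wired_pin_entry()
        remove_attempt_delays()

    firmware_main("0x000DEADBEEF")  # runs only when the identifier matches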
So that sounds good, in its current form. That's important. They didn't
go so far as to say that; that's the important
part that's left out. But yeah, that was reasonable.
You could argue, saying this is for one case only.
(17:06):
It's an important case. People died as a result, extraordinary,
and we need to have a clear timeline, because we
don't know: is there something planned
for December second, twenty sixteen, right? Or was there another
person involved that we need to get hold of, because
otherwise this could happen again. So they've said, you know,
(17:29):
one time only use, going to destroy it after that
use. Hey, Apple, if you don't want to make
this iOS, that's fine, you don't have to do it.
We'll do it. We'll hire some people to reverse engineer it,
build out an iOS ourselves. Here's the thing, though. Those
iPhones will only verify firmware if there's a special Apple
(17:52):
digital signature attached to the firmware. So, in other words,
if the digital signature, which is unique to Apple, if
that's not in there, it won't be verified by the device,
it won't be loaded on, because Apple famously wants to
make sure that their hardware and software work together. And
(18:12):
that's it. No one else gets to play in that
sandbox. Yeah, it's been the same with their Mac computers forever, right?
Like, you were really meant to run Apple software on
Apple hardware, and never the twain shall part, right? So,
same thing with this iPhone.
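The check being described is, in essence, public-key signature verification: the device trusts a baked-in public key and refuses any firmware whose signature does not verify against it. Below is a conceptual sketch using a generic Ed25519 scheme from the third-party cryptography package; Apple's real chain of trust is different and far more involved.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # Illustration only: in reality the private key stays with Apple, and a
    # corresponding public key is baked into every device at manufacture.
    apple_private_key = ed25519.Ed25519PrivateKey.generate()
    DEVICE_TRUSTED_KEY = apple_private_key.public_key()

    def device_will_install(firmware: bytes, signature: bytes) -> bool:
        # The boot-time gate: no valid Apple signature, no install.
        try:
            DEVICE_TRUSTED_KEY.verify(signature, firmware)
            return True
        except InvalidSignature:
            return False

    fbi_built_image = b"custom iOS without the safety features"
    print(device_will_install(fbi_built_image, b"\x00" * 64))  # False: unsigned
    print(device_will_install(fbi_built_image,
                              apple_private_key.sign(fbi_built_image)))  # True

Which is why the FBI's fallback of building the image itself still ends at Apple's door: without Apple's signature, the device refuses the image.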
So the FBI says, we're not even asking you to give us the digital signature,
(18:34):
which would be disastrous if Apple did that. The whole
point of the signature is to make sure that only
Apple can do this stuff. What they're saying is, we'll
make this iOS, we'll give it to you. Yes, you
will sign off and send it. Right, you give a
little stamp of approval with your little digital signature, and
then we can load it onto this phone. That
way you don't even have to build the code. See,
(18:57):
we're being really reasonable. We'll be back with more with
Apple versus the FBI after these quick messages. So,
here's the other issue. Apple kind of shot itself in
(19:22):
the foot. See this is kind of a workaround already,
this idea of being able to create a new kind
of firmware to work around the security measures while not
affecting any of the underlying data. The reason why that's
possible at all is that Apple has allowed for the
possibility of issuing a firmware update to a phone without
(19:46):
the phone's owner having to accept it. Ah. Yes. See,
if Apple had designed this so that when it pushed
out a firmware update, you, as the user, had to
log into your phone and accept it, there'd be no
way for Apple to do this, because you would already
have to have the PIN in order to accept the update.
(20:08):
So you can't work your way around the PIN, because
even the update to try and do the workaround would
still need the PIN. But that's not the case. Apple
can issue a firmware update without the user's consent.
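To pin down that chicken-and-egg point, here is a tiny, purely illustrative sketch of the two update policies:

    # Two hypothetical update policies. Neither is real Apple code.

    def install_update_requiring_auth(update: bytes, user_entered_pin: bool) -> bool:
        # The safer design: no authenticated user, no install. A PIN-bypass
        # update could never be installed without the very PIN it bypasses.
        return user_entered_pin

    def install_update_silently(update: bytes) -> bool:
        # The design described in the episode: the vendor can push firmware
        # with no user consent, which is the opening that makes the FBI's
        # request technically possible at all.
        return True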
This, by the way, is also a security problem, and not just a
security problem in this particular case. What if someone
at Apple, you know, decided to code it in so
(20:31):
that you could activate the microphone remotely. Yeah, and they
and they shoot this firmware update out and you don't
have the ability to deny it. You might not even
be aware that it happens unless you happen to be
using your phone when the firmware update gets pushed to
your phone. That's an issue. So because Apple can do this,
(20:54):
that gives the FBI the leg up to make this request.
So the FBI is just trying to be as reasonable
as possible in their request while avoiding addressing the
problems that could arise should Apple agree to it. And yeah,
(21:17):
that's the thing. Okay, that's the thing.
No matter how single use this might be, yeah, no
matter how noble or even crucial the cause, right, there
is not a practical way that this would
(21:39):
work without severe repercussions. Yeah, there's a word
I like to use in this case. It's called precedent. Ah.
So once you set a precedent where Apple agrees, acquiesces, surrenders,
is compelled to, however you want to put it, to
agree to the FBI's demands, you can't undo that. That has happened.
(22:05):
And perhaps more importantly, not only has it happened with
the US government, but now other governments that operate where
Apple sells products could come to Apple and say, we
know you can do this because you have done it,
and we know you will do this because you did
do it. So if you want to do business in
our country, you know, the one that starts with Ch
(22:28):
and ends with ina, you will do this for us.
And when you're talking about a government like China's government,
you could see how this could be used to an
abusive extent. Anyone who is identified as a dissident could
be targeted. And China's an enormous market. Right, Apple cannot,
(22:50):
as a publicly traded company turn its back on the
biggest emerging market in the world, not even emerging, it's emerged,
the biggest market in the world. Also, there
are manufacturing bases for Apple in China. That's part of it.
That's already been a security concern. Let's also consider, I mean,
if we're being honest, okay, how do I say
(23:13):
this correctly, Jonathan? Oh? Yes. Okay. While there is no
universally acknowledged proof that corporate espionage projects or operations
coming from China are sponsored by the government, right, there
is widespread certitude that that is the case. Occam's razor, right? Right,
(23:38):
Occam's razor. You look at it and you think, okay,
it is entirely possible that any hackers operating in China, one,
maybe they're not Chinese. Unlikely, but possible. Two, maybe they're
operating from a different country and using proxies to go
through China. But considering the firewall of China, that seems
(23:59):
like an extra headache for those hackers. Three, they could
be Chinese and operating in China,
but not be directed by the Chinese government, in which
case they would still have to use proxies in order
to access that level of infrastructure. It just gets to
a point where you think the simplest explanation is, in fact,
(24:21):
there are these state backed hackers that are doing this
on behalf of, or at the request of, the Chinese government. If
not at the request, at least with the implicit approval.
There we go, at least someone looking the other way.
But even that's not enough; there's assistance involved. Anyway, I'm
derailing us. The point being that if
Apple were to agree to this FBI request, there is
a distinct possibility that it would face not only future
requests from other government agencies as well as the FBI,
but from other countries as well, and that it would
be the wedge that drives open the possibility
of things like these backdoors that Apple and other
companies have been resisting for years. Now, there is
one other case that I think
(25:06):
one other there there is a case that I think
we should clarify here. When we talk about precedent,
there have been precedents in the legal past
wherein Uncle Sam was allowed to compel a company, right,
to do... and this is different, but we do need,
I think we need to differentiate these, because it
(25:27):
is legal in the US to compel a third party,
whether an individual or a corporation, to, how
do they word it, help execute a court order or
something like that. Yeah. So you're talking about the All
Writs Act, right, which is a very old thing, back
to, what, seventeen eighty nine. Okay, the US wasn't even
(25:51):
very much the US yet, right. This is not a smartphone
specific law. There were still occasional battles with the British
going on. Okay, seventeen eighty nine, so all the states
were here yet? Many of them were not here yet. Yes,
seventeen eighty nine is when this was first written
down as part of the Judiciary Act. So specifically, what
down as part of the Judiciary Act. So specifically, what
the All Writs Act allows the government to do is
to compel a third party to accommodate federal orders like
a search warrant. So, in other words, the federal government
can issue a search warrant to law enforcement. The law
enforcement can go to let's say it's an apartment building.
(26:32):
They can go to the manager of the apartment building
and say, we have the search warrant. The All Writs
Act tells us that you have to allow us to
go into this apartment to search it. And then the
apartment manager says all right and lets them in. Now,
this serves a couple of different purposes. It expedites the
work of the federal government in investigations and things of
(26:55):
that nature. And it also provides a protection to those
third parties because the third party is having to comply
with a federal request. And if you are a person
like an apartment manager and your other tenants are coming
to you and saying why are you letting them into
someone's room without their permission, you can say I have
(27:17):
to by law. That protects you as the owner as well,
because it means that you're not
a rat. You know, you're following the law.
You're obeying the law. Right. So, automatically, anything you do
in the assistance of the execution of that court order
is automatically legal. I mean, as long as, yeah,
(27:39):
as long as they're not saying, hey, we need a
search warrant, and you go, sure, hey, while I'm on
the way, do you mind if I steal the car, right?
Or they can't say, hey, we're going to
get a search warrant, do you want to just let us in
now? Right, that would not be cool. But there's a
very important idea that's attached to the US All Writs Act,
(28:01):
which is that, and the Supreme Court has ruled on this,
you cannot rely on the All Writs Act to compel
a third party to action if it creates an unreasonable
burden upon that party. So you might say, well, what's
the unreasonable burden for Apple? I mean, all they're asking
for is a way around this security system, just one time,
(28:26):
just the one time. So there are actually several counter
arguments to this. First, you know, Apple is
saying their programmers may not even know how to make
the code that would allow for this to happen. They're like, listen,
we don't even know that we can build this yet,
so you're asking us to do something that we don't
(28:47):
know we can build. So that's an unreasonable burden, because
it means we have to divert our assets
from projects that they should be working on to trying
to figure out if this thing is possible and, if so,
how to do it. Now, the FBI's argument was essentially, hey,
you're in the business of writing code. There should be
no problem, right. I would desperately need a bleep sound
(29:12):
effect right now. But I call BS. Let's be nice.
I'll call BS on that argument, because to me, that's
the same as coming up to, let's say,
Ben, that you are a cook, yes, and you cook
in an Italian restaurant. You're the Italian restaurant's head cook.
Three Michelin stars. I go up to you and
I say, hey, listen, I'm from the federal government. I
(29:35):
got this executive order, this federal order
for you. Okay, all right, you have to now go
and make a dinner of Peruvian food right now for
thirty people that are in the restaurant. You can do
it, because you cook, right? Exactly. Or, it's like saying, yeah,
(29:55):
it would be a disastrous Peruvian dinner. Or, at the very least,
a looser interpretation, like, for another example, let's say, to
make it even broader, yeah, because I think this
is pretty good too. So, Jonathan, let's say
that you are a doctor. All right. Okay, you're a
doctor, Doctor Strickland. Hey, hey, call me Jonathan,
(30:18):
Doctor Strickland's my dad. Right. So you're an easygoing
doctor, clearly. And
you're an ear, nose, and throat man. Okay, yeah. And
so I come up to you and say, I'm from
the federal government. I have an executive order but not
(30:39):
an appointment. So you're already kind of irritated, right? And
I say, I need you to make
the cure for cancer. Yeah, that's a bit much. Yeah,
I mean, you're a doctor, right? You know about organs
and stuff, cancer affects bodies. Or going to Harley Davidson
and saying, look, you make stuff where wheels
(31:00):
are connected to chassis and they turn and a motor
keeps things going. I need you to make a bus. Yeah,
I mean, you see where it's ridiculous.
You can't argue that, because this company is
in the business of doing this one thing, which, by
the way, is just one part of Apple's business, absolutely,
that they are capable of making this other thing that
(31:22):
happens to fall into that same category. That is ludicrous
on the face of it. So that's argument number one
about it being a burden. Okay. The second burden is
the one that we've already touched on. It sets a precedent.
If Apple can be forced to attack the security of
its own system in this case, it could happen again,
and that would be a disastrous result for consumer,
you know, confidence in Apple's products. Right, that's very bad
you know, confidence in Apple's products. Right, that's very bad
for Apple's bottom line. So if Apple says, look, you
will make us lose millions, if not billions of dollars
in revenue, how is that not an unreasonable burden? How
can you argue that that burden is reasonable? Right, exactly.
And not only that, but then you
(32:04):
get into the foreign agent approach, the foreign state approach,
saying what if this means that China comes to us
and says, because of this other thing that we agreed
to do, we now have to do it in China
all the time, and real human beings are being pursued
and their lives are turned upside down and ruined as
(32:28):
a result of it. And it's all because we have
to comply, because this has already set a precedent that's
an unreasonable burden. And finally, they've even said that it's
a violation of their right to free speech. And the
reason for that is because code has been ruled as
a type of free speech in the past, and if
(32:49):
the government compels Apple to write code that Apple doesn't
believe in, they're being compelled to speak against their own beliefs,
thus a violation of free speech. Now, that argument, most
people are saying it's probably the weakest of them all.
They're styling on it a little, but I gotta admit,
that's pretty awesome style. Well, yeah, that is. I enjoy it.
(33:11):
And, while it's reaching, it's not invalid. And let's
consider that Apple, legally, in this kind of case, is
playing against Uncle Sam on its home territory, right?
And this means that you might often wonder why,
when there are suits or countersuits or legal problems, why
(33:33):
so many cases open with just this laundry list of arguments.
And it's because, you know, if we could go
back to my Italian restaurant, you're just throwing the spaghetti
at the wall. It's a scattergun approach, it absolutely is,
or if you prefer, it's casting a very wide net
because you aren't sure which tactic is necessarily going to
(33:56):
be your best one from the starting gate, so you
want to throw out all of them at once, and
if it's if it's a compelling enough argument, then you
can get things thrown out before they go any further.
And in fact, Apple has said that they're willing to
go all the way to the Supreme Court with this
particular fight. The CEO said that publicly. Yeah, which is interesting,
(34:18):
because, you know, we recently lost a US Supreme
Court justice here in the United States, and I actually
think he probably would have sided with Apple on this
one because of his very strict view of the Constitution,
right, he was an originalist. Antonin Scalia. Yeah. So, going back
(34:38):
to this argument, I'm going to read a quote
quote and I want to see what your reaction is,
because I know what mine was. This is from Congressman
David Jolly of Florida, who said Apple's leadership risks having
blood on their hands. Ben is shaking his head and
looking at me in disdain. Not at me, he's looking
(35:01):
through me. It's absolutely insincere, first off. And usually,
you listeners, whenever you hear people make lurid, imagery
based appeals to emotion, right, or these hyperbolic accusations,
(35:22):
this is the bread and butter of, I'll say
it, the political class. Theater, exactly. That's a great phrase.
And so, to say this in such a way, what
it does is, psychologically, you get an image of somebody
with literal blood on their hands, and then, you
(35:44):
know, they're trying to cast aspersions against Apple, not by
making any point about the arguments Apple is
making, right, but by going instantly to, these people
are murderers, and if we don't do this, why are you
defending murderers? That's essentially, you know, the
(36:05):
argument about, you know, it's one of those
legal arguments you will hear occasionally. This isn't a legal argument,
but it's like the legal arguments you will occasionally hear
where it's clear that the lawyer is trying to appeal
to the jury's sense of emotion. Sure. As opposed to
addressing the facts of the case itself. Right. So,
I agree entirely with you. The first reaction I had was,
(36:26):
I'm offended by that statement. I mean, it's condescending and
it imagines that it imagines that the person that would
be swayed by that is not intelligent enough to read,
and it it deflects the fact that there are two
people who were responsible for that terrible exact attack, two people,
(36:49):
and those two people are the people who are holding
the guns and pulling the trigger and aiming at people.
Those those are the ones who have blood on their hands,
and of course they're they're both dead now, they both
died in the shootout with the law enforced. But the
point being that they're the ones responsible, not Apple. Apple
did nothing in relation to this crime. In fact, Apple
didn't give the fricking phone to Furut that was issued
(37:12):
by the county. Apple just made a device
that has this level of security on it that people wanted.
People wanted that level of security. They want the reassurance
that Apple itself can't access their phones without their permission.
It's a very important cornerstone of security. In fact, if
you look at iOS eight or earlier, Apple could bypass security;
(37:36):
they could access the information on a phone without your PIN,
without you acquiescing and allowing that to happen. But they
specifically changed that with iOS nine. They made it so
that they could not do that, because they said it's
important for consumers to trust the company. And how can
(37:57):
you build trust if you know, in the back of
your mind, this company could at any moment access
my private information that I have not chosen to share
with them? Well, you know, that trust is destroyed in
that case, right? Kind of brings us back to that
unreasonable burden. So these arguments are continuing. I think the
(38:20):
next stage doesn't start till March tenth, and we're
recording this on February twenty sixth. So I am very
hopeful that the government sides with Apple on this, ultimately,
that when this gets to the courts, at any rate, they side
with Apple on this, because if they do not, this
(38:42):
could be like the snowball effect, where
we see more requests of this nature come in. And
once it's been established as precedent, it's much
easier to happen in the future, and it's easier to
see larger requests, like things that go beyond
all, we need you to help us circumvent the security,
(39:03):
and may go into, we need you, we finally are
going to get what we wanted all this time, we
want a direct path, like a doorway that's labeled
FBI, that lets us go straight into the data that
your users are storing on their devices. Which leads to
a horrific situation, because
the FBI is an institution, sure, made up of individuals.
(39:28):
Remember when the Snowden leaks revealed to us the extent
of unethical use of surveillance by the NSA. People would
look up their ex girlfriends or their ex boyfriends. Yeah,
they were just looking up people that they were interested
in with absolutely no oversight. As a matter of fact,
when we talked about the Chinese government looking the other
(39:50):
way for hackers, that was the same thing that occurred
with the NSA. To assume that, for some reason, given
the opportunity, individuals in another law enforcement branch or another
institution would not do the same thing, yeah, is cartoonishly naive.
(40:12):
We will conclude our twenty sixteen discussion about Apple versus
the FBI after this quick break, you know, using the
argument of this is a one time use, that wouldn't
(40:32):
stop the FBI from requesting another one time use, or
another one time use, or even extending that beyond it,
saying all right now, we want a one size fits
all approach to doing the same thing because it's too
much time for us. And don't worry, we'll get a
court order before we do it. We'll make sure that
nobody else gets access to this ability. And you'll know
(40:53):
that the court orders are good, because they'll be classified.
So we'll just inform you when the orders are approved.
I have said it many times that there is no
way to ensure security by enforcing a vulnerability. Yeah, and
I think that's a way to encapsulate it. But there
is a question that I have that I'm sure a
(41:13):
lot of you have as well, ladies and gentlemen, which is,
let's say the worst happens, Yes, okay, worst happens court
rules in favor of the FBI, and Apple says, nah,
we're not going to do it. Well, I mean, if
the court, assuming it goes all the way up
to the Supreme Court, this could end up becoming a
(41:36):
matter of law where it's codified that companies have to
obey that within the United States, which would mean far
reaching implications, not just for Apple, but for all companies,
any tech company, any company really
doing business in the US, not even based here, but
just doing business. So that's a big deal. It's
(42:00):
potentially disastrous for privacy. There are rampant possibilities of misuse. We've
talked about the possibility that if you do create a vulnerability,
someone somewhere is going to try and figure out a
way to also gain access to that vulnerability, right, And
these are not necessarily other law enforcement agencies or intelligence agencies,
(42:23):
or maybe they are intelligence agencies; they just happen to
be intelligence agencies working for a different country. And yeah,
and here's the problem, because we've talked about this before,
man. Legislation, I think we talked about this with
autonomous vehicles before, legislation is almost always outpaced by technological innovation. Yeah, yes, yeah,
but you will almost always see a case where someone
(42:46):
has figured out something really interesting to do with technology,
or perhaps even really scary things they could do with technology,
and there are no laws to cover it because before
that person figured it out, it didn't exist. So you
don't write laws for stuff that doesn't exist. We don't
have a law saying, listen, guys, I just, it's
keeping me up at night. We have got to write
(43:06):
a law about what happens in the case that the
Loch Ness Monster is real, gets out of Scotland, comes over
to New Jersey, and starts to eat people. We need
a law to protect us from this potential catastrophe. Right,
I'm being ridiculous. Yeah. Like, if we're senators, everyone in the
audience and you, Jonathan, myself, and then one of us
(43:26):
walks in and says, guys, I know
that we have some other issues coming up, and we
have to nominate this court justice, and there's an election
coming up. But I think we need to look into
the future and look at the big picture, which is
moon boot theft, right? Because I don't want people shoeless
(43:46):
on the Moon when and if we build a colony there. Yeah,
I think arguing, for example, for robot rights right now
might be a little premature, that kind of thing. Maybe
not forever, but for now. It's definitely good to
think about. But, you know, you make a very astute
point when you say, if we're talking about codifying something,
or codifying a law, then what happens is, once the
(44:10):
Supreme Court rules on something like this and it becomes
a matter of law, it is very,
very difficult to get that kind of ruling. The Supremes
are pretty busy people. Yep, they don't hear every case
that's brought before them. They absolutely don't. But the
thing is, you think it's hard
(44:31):
to get those justices to change the substance of
American jurisprudence? Imagine trying to get them to change it back.
This is like a Pandora's box, Pandora's jar, situation. Yeah,
now, this is not good. You don't want this
to happen, for multiple reasons. Now, all that being said,
(44:51):
our sympathies are with the families of those who were
wounded and killed as a result of this mass shooting.
Absolutely, I feel for them. And if there were
any other way to get to that information that did
not require Apple to be complicit in destroying its own security,
I would be in favor of it. And in fact,
(45:12):
the FBI has taken such pains they got access to
the iCloud backups that this phone creates. The problem being
that the phone didn't have an iCloud backup for the
month leading up to the actual attacks, So there could
be information on the phone that's not on the cloud,
and that's why the FBI wants to get access to that.
I totally understand the reasoning behind it, but two things,
(45:35):
of course, keep me from being completely sympathetic. One is
that the FBI has for years been trying to get
backdoor access to multiple systems. Exactly. Yeah. So
you could argue that perhaps this mass shooting is being
leveraged cynically by the FBI in order to further their goals,
(45:55):
because it's hard to say no to such an emotionally
devastating event. Opportunistically, I would say. Yeah, you know,
I believe it. And this is just my personal opinion
based on again precedent. It is completely within the realm
(46:17):
of not only possibility but plausibility that an institution would
wait for an opportune time to push for this kind
of legislation, like the argument for internet surveillance, based on saying, hey,
we need to protect people, we need to protect you
and your children from inappropriate content, think of the kids. Right. Really,
(46:43):
this is a lot of the same stuff
we heard in the wake of the Patriot Act. Absolutely,
where a lot of people felt the Patriot Act was
a reactionary piece of legislation that was drafted far too
quickly and had reached far, far too
wide for what it was proposing, what everyone claimed
(47:06):
it was all about. Right. And that was a
big mess. This is also potentially a really big mess.
And the Patriot Act, the substance of it had pretty
much been written in advance. Yeah, yeah, which,
like, that used to be a controversial statement,
but now it's acknowledged. Yeah. So this is,
(47:27):
I mean, the fact that the FBI has had this
plan for a while, not this specific implementation, but this
desire to get this workaround access to things. And I mean,
I totally understand their point of view, too. They're trying
to investigate things. It's not like the FBI is necessarily
made up of the Cigarette Smoking Man and all of
(47:48):
his cronies. You know, I don't
mean to disparage them, I don't
want to demonize them. Right. That's not what
I'm trying to get at, either. The FBI's intentions may
in fact be nothing but noble, that they want this
in the effort to investigate, solve crime, prevent crime from happening,
(48:09):
and not in any way that is malevolent. However, the
fact remains, whether their intentions are noble or not, it
opens up this opportunity for people whose intentions are demonstrably
not noble to take advantage of those same opportunities. Well, yeah,
and you know, I'm glad you said that, because I
wanted to say something, to add to
(48:31):
this, something that is rarely said when we talk about
government surveillance or concerns about privacy. Right, one thing that
is rarely said is that law enforcement agencies, law enforcement institutions,
and individuals in the US actually do quite an extraordinary
(48:52):
job compared to a lot of places. If you're fortunate
enough to grow up in a place that has rule
of law, where you can walk down the street in
the dark, or you can say,
you know, whoever your senator or president is, you can go
on the internet and say, I think they stink. Yeah,
I think you're a jerk. Yeah, I think you're a
(49:14):
piece of bologna with shoes. But in other countries,
you know, people get arrested for that, people get imprisoned.
Or erased, yes, like, not just
arrested or imprisoned, but the government in some countries will
take steps to make it seem like that person was
never a person. Exactly. They'll keep the photos, but you
(49:35):
won't be in them. And I say that because it's
a sense of much needed perspective. However, you know, I'm
not demonizing the people who work at the FBI. Institutions,
whether private or public, seek power; they seek further influence.
(49:59):
And it's not because it's some sort of James
Bond supervillain thing. It's not SPECTRE. It's because
it allows for an easier, more efficient
pursuit of whatever the original mission would be. Right. All right,
we got a little bit more to say on this
(50:20):
topic before we get to that, though, Let's take another
quick break. Ben and I both talk about things
here at work where we say, gosh, I wish
we had X, because it would make our lives so
(50:43):
much easier. Well, even if we got X, we would
come up with Y. That would be the next one. Right,
we get X. X is awesome, X is helping us out.
We're like, oh man, it's so good to have X here.
But you know what, yeah, it would be great if
we had Y, because if we had Y, we could
really do our jobs. Well, we get Y, and then,
you know, man, X and Y are working out like
a dream. But boy, if we had Z, can you
imagine the level we'd get to? Now, what we do, Ben,
(51:06):
we make fun audio podcasts, videos, and articles that go
on the internet, and that's awesome. And so really our
capacity to do horrible, horrible harm is fairly limited, I
mean, with respect to our jobs, comparatively. Yeah, yeah. Granted,
if either of us wanted to go outside and just
(51:28):
start throwing Kingo pops popsicles at people, we could go
on a popsicle rampage havoc. But that's that's not job related.
The FBI, the CIA, the NSA, a lot of those
three letter organizations, in pursuit of what they need to
do in order to fulfill their organization's mission, in
(51:51):
some cases will step over lines that we cannot allow
people to cross, because it creates a system that is
at least as dangerous as whatever problem they're trying to solve.
You know, another example of this is the idea
of absolute prevention. Like, you know, there
(52:16):
was the old conversation about torture several years ago, when
it was the ticking time bomb argument, which was, should
torture be legal if there is a criminal in custody
who is suspected of having knowledge of another nine
eleven? Yeah? Yeah, the Jack Bauer from twenty four kind
(52:40):
of argument. Should torture, while reprehensible, be allowed when it
gets results? And this kind of reasoning is dangerous,
and I'm not saying that because of any desire to
see human tragedy, but I'm saying it's dangerous because of
(53:00):
the assumptions it makes. Yes, that a special case will
remain a special case, right? And that perhaps the next case,
which maybe isn't quite so special, like, well, you know,
we've done it before, so what's the deal here? Yeah,
I mean, you were cool last time. What happened? Yeah.
So this is exactly why, I think,
(53:23):
I feel pretty strongly about this, I think I'm on
the right track, that neither you nor I feel that
the FBI should win out in this particular case. I
think this is something where we really need to see
Apple come out on top. I am not a huge
fan of Apple. I don't own a lot of Apple
products. Apple does not sponsor this show. I'm
(53:46):
not getting any money from Apple. If anything, I'm losing
money to Apple, because my wife is a fan. She
wants to get an Apple Watch. But I am not,
I'm not getting anything from Apple. I do think they're
in the right, because I don't want to see
a precedent where a company that creates a secure system
(54:07):
has to be or can be compelled to compromise that security.
It defeats the purpose of the security. And whether it's
this case, which is extraordinary and very emotional, or something
much less impactful for the general public, maybe it's something,
you know, simpler and less dramatic. It doesn't matter. You
(54:32):
cannot You cannot go down that pathway and expect things
to turn out all right. You've got to figure out
other ways to do that kind of investigation. Either Apple
needs to go in a direction where they
can access user data without having to circumvent a security
(54:55):
system like this, which means they have to go backwards,
which really is not a possibility, or Apple and
other companies have to create systems where it really is
impossible for them to access without the consent or the
actions of the owner of the device. I suspect that
(55:16):
every company is rushing to develop that kind of approach
right now, because none of them want to be in
this position. Where Facebook, Google, Microsoft all publicly showed support
for Apple. Apple right now, you know,
Apple hired a dev that worked on Edward Snowden's
(55:37):
favorite messaging app, Signal, actually. Yeah, and I don't know how
much of that is meant to be like a PR move.
But also, you know, the leaked Snowden
papers are out there, and I know I'm harping on them.
They revealed an ugly behind the scenes look at corporate
involvement with government requests for surveillance. You know, so the
(56:01):
average consumer, you, me, Gary Busey, whomever, we have
much less trust in general in these companies because we
have a reason not to trust them. Well, we have
handed over so much of our own personal data. We
(56:26):
trust that the devices that hold that personal data aren't
going to just give that away to whatever entity without
our consent. We trust that that's not going to happen.
When things like this pop up where we start to
question that trust, that's problematic. There's someone else that Apple
(56:48):
has recently hired, Ted Olson. Does that name sound familiar
to you? Ted Olson's a lawyer. So Ted Olson's
going to be representing Apple. Ted Olson's probably best known
for representing George W. Bush in the Bush versus Gore
election fallout. You know, for those of you in the
(57:10):
United States, when Bush was running against Gore, there
was this whole battle about, you know, voting recounts,
and Olson represented George W. Bush on that, and
Bush ended up winning that. So now he's representing Apple
in this particular battle with the FBI. So interesting to
(57:32):
see these kinds of personalities involved in this. And now
I know that with the public perception it's been a
little seesaw-ish, but the general public, I would argue,
the people who are not necessarily paying attention to the
tech sphere, I think a lot of them are siding
with the FBI, because it's a terrorism story. It's a
(57:55):
story about trying to establish as much information about
these shooters as possible. Sure. Do you think so, though?
I do. I think, at least in a
lot of polls that I've seen leading up to today,
the general public tends to side with the FBI because
the FBI has a very emotional story. Apple's story is
(58:15):
much more rationally based, intellectually based, and the FBI story
is pinned on this event, this very emotionally charged event.
I don't know that that's going to continue.
I think people who are savvy in the tech sphere,
I think the majority of them side with Apple. But
(58:38):
it's still, because it still goes back to the
ticking time bomb argument, you know. Yeah, yeah. So this
is one of the stories that we definitely had to cover.
I mean, obviously it's such a huge, it's probably
the biggest story in tech right now as we record this.
And I'm glad that you could join me, Ben, to chat
about this, to kind of give your insight as well,
(59:00):
and to talk about why this is so important not
just from a technology standpoint, but from a philosophical standpoint
and as a matter of law as well. Okay, that concludes
that twenty sixteen episode on Apple versus the FBI. As
I mentioned in the intro, this is an ongoing issue,
(59:21):
and it's not just about accessing physical devices. There have
also been lots of attempts from the FBI and other
agencies to convince companies like Apple to give them backdoor
access to otherwise encrypted forms of communication. As I have
said repeatedly in various episodes, this is a terrible idea.
(59:42):
Anytime you create a purposeful backdoor to an otherwise secure system,
assuming that that is even possible, because it's not always possible.
But if you do that, all you have done is
really introduce a vulnerability. You have created a huge target
for every hacker out there that wants to
(01:00:04):
be able to infiltrate that system, because you have created
a way around the otherwise legitimate way to send
encrypted data back and forth. It is never a good
idea to do that. It just decreases security rather
than increases it. And so this is an ongoing issue
(01:00:25):
between investigative agencies and tech companies. I don't expect we're
going to see it go away anytime soon. There will
always be a struggle, but I sure hope that we
never get to a point where more and more tech
companies begin to introduce these backdoors, because it just makes
everyone less safe in the long run. If you have
(01:00:46):
suggestions for topics I should cover in future episodes of
TechStuff, please reach out to me and let me know.
One way, of course, is to download the iHeartRadio app.
It is free to download, it's free to use. You
navigate over to TechStuff using the search field.
There's a little microphone icon on the TechStuff page.
You can leave me a voice message up to thirty
seconds in length. Let me know what you would like
me to cover, or if you prefer, you can go
(01:01:09):
over to Twitter and send me a message there. The
handle for the show is TechStuff HSW, and I'll
talk to you again really soon. TechStuff is an
iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app,
(01:01:30):
Apple Podcasts, or wherever you listen to your favorite shows.