Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Hey there, and welcome to Tech Stuff. I am your host, Jonathan Strickland. Joining me in the
studio today, he's my friend, he's yours: he's Ben Bowlin. Ben,
welcome back to the show. Hey, Jonathan, thank you so
much for having me on the show. Even though we
(00:27):
know that we're both going to have to work assiduously
to keep this a family friendly show. You've invited me
on to talk about, Yeah, talk about a subject that
is guaranteed to get us, as my grandfather would say,
all het up. We get all het up in here. Yeah,
(00:49):
So we're gonna talk about something where we're actually gonna
have to work very hard for multiple reasons, because we're
talking about Apple versus the FBI, the whole story that's
unfolding as we record this, and to talk about
what is behind that case, what are the implications, what
is the FBI's argument, what is Apple's argument? And in
(01:11):
addition to that, we have, of course the added responsibility
of remembering that this is all centered around a truly
awful crime. Yes, absolutely. So, what we're
talking about specifically hit mainstream news when Apple did something
(01:31):
that a lot of tech companies have never done. They
issued a public letter in response to an FBI request.
But I guess I'm getting ahead of it. What was
the crime? Well, yeah, let me give you the
background on what's happened and we'll build from there.
(01:53):
So first we have to look back on December second,
two thousand fifteen, in San Bernardino, California. That's
in southern California, and that's when, well, an American
citizen and a legal resident, a husband and wife team, both of
Pakistani descent, committed a mass shooting. It was Syed
(02:17):
Rizwan Farook and Tashfeen Malik who committed this act,
and it happened at the Department of Public Health. They
were having an event that was starting off as a
training event and then it was supposed to transition into
a holiday event. And after the training element, Farook, who
was actually at the event, he was an employee
(02:39):
of the Department of Public Health, he was a food inspector, left,
came back with his wife. They were both heavily armed
and they both started firing into the crowd of people
at the Department of Public Health. Fourteen people were killed,
twenty two injured. Very serious crime. Then Farook and
(03:00):
Malik left. They fled the scene. Turned out they had
also left behind an explosive device, which thankfully did not detonate.
It failed to detonate, and they fled in a
vehicle that they had rented a few days before, and
several hours later law enforcement tracked them down. There was
(03:21):
a confrontation, there was a shootout, and then both
of the shooters died as a result of that shootout.
The FBI has stated that they believed, based upon the
evidence they were able to find, that the two were
acting on their own, that they were not part of
some sort of terrorist cell in the United States. However,
they can't be absolutely certain of that. That's where the
(03:44):
crux of this issue with Apple is going to come
into play. And the big part of it is they
can't account for just under twenty minutes of activity
leading up to and around the shooting, because they're
thinking that there's a possibility that something that happened within
(04:08):
that time could give them more information and at least
allow them to confirm whether or not they truly acted
on their own or if they were under the direction
of some other group, which could potentially and this is
an FBI argument, which could potentially give the government the
ability to prevent a future attack. And more to that point,
(04:29):
neither of them had been listed in any database as
a potential threat. So that puts extra pressure on the government,
right because citizens say, well, why did this happen? The
government says, we followed procedure and neither of them registered
on any of our databases, and there were no red flags,
(04:51):
so we had no way of knowing. And that
means there's actual extra pressure on the FBI to investigate
this thoroughly, partially to show that in fact, everything that
could be done had been done short of taking some
extreme step that none of us want to see. Right,
the idea of like, okay, everyone of Pakistani descent has
(05:12):
to leave the country. That's ridiculous. That's
returning to an era of United States history that
we do not want to revisit, like a World War
two internment camp situation. Yeah, we don't want that. We
don't want that because it is never fair to lump
innocent people in for fear that one of them might
be guilty. That's not cool, right? And I'm already
(05:33):
getting angry and I haven't even gotten to the part
about the Apple stuff yet. I'm going for a slow burn,
that's fair. So the one thing that all
of this is about, ostensibly at any rate, is
an iPhone. It's specifically an iPhone five C, probably running
iOS nine. That's based upon the various public filings we've
(05:57):
seen about this, and it was county owned. It
was owned by the Department of Public Health and then
issued to Farook, who worked for the Department of
Public Health, so it was county owned. The FBI went
to the county and said, we want your permission to
access the contents on this phone. The county said, of course,
you have our permission. You can access it. Now. If
(06:18):
that's all there was to it, it would be fine.
They would just access the phone. They've already received permission
from the phone's owner, and they could comb through it
to look for any evidence that would lead them to
more information about this crime. But there is a security
measure on the phone, and it's a very simple one.
It's on lots of smartphones. It's a little password.
(06:42):
In this case, it's a four or six digit
PIN, which can be alphanumeric if you activate that
option on the six digit version. So it's one of the two,
we don't know which, and without that PIN, you cannot
access the information, because of the way the PIN works.
(07:03):
This is where we get into the tech stuff, Ben.
There are two levels of encryption, or two keys
to this encryption. One of the keys is the PIN.
The other key is hard coded onto the device itself.
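To make that two-key idea concrete, here is a loose sketch in Python of how a PIN might be entangled with a device-unique hardware key to derive a single unlock key. This is an illustration of the concept, not Apple's actual algorithm; the key value and iteration count are invented for the example.

```python
import hashlib

def derive_unlock_key(pin: str, hardware_key: bytes) -> bytes:
    """Entangle the user's PIN with a key fused into the device.

    Neither input alone reproduces the unlock key, so even a party
    that knows the hardware key still can't decrypt anything
    without the PIN.
    """
    # PBKDF2 stretches the tiny PIN keyspace; the hardware key acts
    # as a device-unique salt that never leaves the chip.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), hardware_key, 100_000)

hardware_key = bytes(32)  # stand-in for the per-device fused key
assert derive_unlock_key("1234", hardware_key) != derive_unlock_key("1235", hardware_key)
```

The design choice this models is that the derivation happens on the device: the wrong PIN doesn't produce an error so much as a different, useless key.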
All right, it's like if you think about those
nineties movies with the nuclear launch codes, where you
have to have two generals put their keys into the
(07:26):
same lock at the same time, that sort of thing. So if you don't have the PIN,
it can't combine with the hardware key, and therefore you
cannot unlock the information. The idea here is that Apple
doesn't have the PIN, right, which is cool because it
means that if you or I or you listeners have
(07:46):
an iPhone, you can trust Apple, because they
don't know your PIN, right, they can't access your phone.
That was the whole point. Two
points, really. One, Apple wanted to make it secure enough so
that consumers would say, I feel good about buying an
iPhone because I know whatever I store on it, whether
(08:07):
it's something that's personal or it's just you know, it's
no one else's business, whatever it may be, no one
else is going to have access to that unless I
give them my PIN. And two, it's good for Apple, not just
because consumers are happy, but because Apple then has an out:
if the government comes to Apple and says, hey, we
want you to break into this phone, they can literally
say we cannot do that. It's not that we will not,
(08:30):
it's that we literally physically cannot grant your request because
it's impossible. And, just to expound on
that point a little bit further, I'm not suggesting
it's their conscious will or intention, but this also removes
possible culpability. Yes, yes, so all of these are very
(08:51):
important points. Right, So let's get into a little bit
more of why the FBI is coming to Apple. So
let's say that it's a four digit PIN, all right,
all numeric, the simplest version. That means there are ten thousand
combinations that are possible for that PIN. That's it.
You're limited to ten thousand, which is a lot,
(09:14):
but it's not that much if you're going to do
something like a brute force attack. Right. So if you
do a brute force attack on a normal system, and
there are only ten thousand variations, with a fast
enough computer, it's just a matter of minutes. But there are
some limitations on the iPhone that make this harder. Does
it freeze if you enter the incorrect PIN? It does
(09:36):
more than freeze, right. So, first of all, there is
an eighty millisecond delay between when you enter a PIN and
when it gets verified that it is or is not
the correct one. Right, so that eighty millisecond delay
doesn't sound like much, but it means you can't
just blast a whole bunch of numbers through. Also, you
have to type it in on a screen. You can't
(09:56):
hook it up to a computer and just digitally send
the various PIN combinations to try and get through. You
have to actually tap them in on the screen,
over and over. So that means you tap
in the number, eighty milliseconds pass, you get the confirmation
or denial, tap in the next number. If you
(10:17):
hit ten consecutive incorrect PIN entries, the phone interprets that
as saying someone who is not the owner has
gotten possession of this phone. So in order to protect
the owner's information, it scrambles everything, and you lose all
the data. Everything is useless. So even if you got
the PIN afterward, you would not be able to access
(10:39):
the stuff you were looking for in the first place.
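The throttle-and-wipe behavior just described can be sketched as a toy model in Python. It assumes the simple flat 80 ms delay of the iPhone 5c; the class, its names, and the PIN value are all invented for illustration.

```python
import time

class PinGuard:
    """Toy model of the lock screen described above: a fixed 80 ms
    check per guess, and after ten consecutive failures the data
    key is destroyed, wiping the data."""

    MAX_FAILURES = 10

    def __init__(self, correct_pin: str):
        self._pin = correct_pin
        self.failures = 0
        self.wiped = False

    def try_pin(self, guess: str) -> bool:
        if self.wiped:
            return False          # key destroyed; nothing left to unlock
        time.sleep(0.08)          # the 80 ms verification delay
        if guess == self._pin:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.MAX_FAILURES:
            self.wiped = True     # tenth miss: erase the data key
        return False

guard = PinGuard("4951")
for wrong in range(10):           # ten bad guesses...
    guard.try_pin(f"{wrong:04d}")
assert guard.wiped                # ...and the data is gone
assert not guard.try_pin("4951")  # even the right PIN is useless now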
You would maybe be able to turn on the phone,
but there wouldn't be anything there to find. Yeah, it would be
like you just accidentally set fire to the file room.
Like, you can't get into the file
room door, and whatever you try to do accidentally sets
a fire inside the file room. You then get the
key to the file room, and then you think, well,
(11:01):
what's the point? You lost everything. And it's virtually assured,
yeah, that out of those ten
thousand tries, your first ten are going to be wrong. Yeah,
I mean, the odds are pretty much against you,
unless you're ridiculously lucky. So there's that, and of
(11:23):
course, if it were a later iPhone,
if it weren't the iPhone five C, they
would have an added problem, which is that with the
later models there's an additional delay after four failed tries.
So if you try four times and fail, it will
then give you a five second delay before you can
(11:44):
try the fifth time. After that, it gives you a
fifteen second delay, and so on, like by the time you
get to the ninth try, it's an hour delay. Think
about what you're doing, right? But the five C
doesn't have that. So you may have read about that delay;
it does not apply in this particular case. Brute force
won't work in the most basic PIN situation. But in
(12:06):
the six digit alphanumeric, I'd just imagine the same
rules apply? Yeah, yeah, exactly, but the possible
answers are much greater, because now you've got six digits,
and you've also got the possibility of alphabet characters in
there. Alphabet characters? I believe so. So, anyway, from
(12:28):
what I hear, if you were to try and use brute
force on a six digit alphanumeric one,
with a fast computer that
had been optimized for this, it would take you about
six years to break through it. And that's going through
all the different possible combinations. Assuming that there's not another
kill switch type deal like there is with the iPhone.
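The back-of-envelope math behind those figures can be checked directly, assuming the only throttle is the flat 80 ms check (no wipe, no escalating delays) and a six-character passcode drawn from lowercase letters and digits:

```python
DELAY_S = 0.08                      # 80 ms per verification attempt

four_digit_pins = 10 ** 4           # 0000-9999
six_char_alnum  = 36 ** 6           # 6 characters from a-z and 0-9

worst_case_minutes = four_digit_pins * DELAY_S / 60
worst_case_years   = six_char_alnum * DELAY_S / (60 * 60 * 24 * 365)

print(f"4-digit PIN:  {four_digit_pins:,} guesses, ~{worst_case_minutes:.0f} minutes")
print(f"6-char alnum: {six_char_alnum:,} guesses, ~{worst_case_years:.1f} years")
# worst case: roughly 13 minutes for the 4-digit PIN,
# versus about 5.5 years for the 6-character alphanumeric passcode
```

That 5.5-year result is consistent with the "about six years" figure quoted in the discussion; a larger character set (mixed case, symbols) would push it far higher.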
(12:48):
By the way, that's software that Apple has built into
the iPhone, really firmware that Apple has built into the iPhone.
It's not like it's a fundamental quality of
all smartphones, right? Yeah, yeah, that's specific to
Apple phones. Right. So you've got this problem, right, you
have the FBI. They've gotten permission from the owner of
(13:10):
the phone to access the phone's contents. But the
owner of the phone doesn't know the PIN, because the
owner of the phone had issued it to an employee.
The employee had come up with the PIN. So they
don't know if it's a four digit, they don't know
if it's a six digit. They
don't know if this feature where after ten
(13:30):
tries everything gets erased is necessarily
active, because you can turn it off, but all indications
point to it being on. For one, it was on
when the phone was issued to Farook, right. And most phone users
typically don't change their defaults. So the FBI
wants Apple to do something particular, and it's something different
(13:55):
from what we've talked about in previous episodes about a
back door entrance into a system. It's not quite the
same thing. So you've got this phone, you can't brute
force attack it without risking damaging the contents. Apple cannot
access the information on this phone. By design, they did
(14:16):
not want that to have that capability. So what the
FBI wants Apple to do is build a new version
of iOS just for this phone, a specific iOS for
this phone that disables the safety features that would, one,
prevent brute force attacks from happening quickly; two, allow brute
(14:38):
force attacks to happen by hooking up the phone
to a computer, so you don't have to tap those
numbers in; and three, disable that kill switch.
So what the FBI wants Apple to do is to
build up a brand new iOS again just for this
one phone, and then install it on the phone. This
(15:00):
would all, you know, hypothetically happen on Apple grounds, like
at an Apple location, either at the corporate headquarters or
wherever. And then in theory, you would then destroy
the custom iOS, because you only needed it
(15:20):
for that one phone, right. Because it was so easy
to toss the One Ring, bring it back to Mordor,
and destroy it, right? Yeah, yeah, you know, Mount Doom.
Plenty of backdoors. It's fine, it's integral to the plot.
That was Sauron's problem, that he did
not plug the security vulnerabilities in Mount Doom. I mean, that's
(15:43):
a classic example, like, you know, right up there with
the whole land war in Asia thing. So the FBI has
been very very careful in framing this in a way
that presents it as a reasonable request, one time thing.
That's a big deal, right, saying this is
for one phone and one phone only. You know,
(16:04):
we're suggesting that Apple creates an iOS that's directly tied
to a unique identifier on that phone, meaning even if
the iOS were somehow to leak, it would not be
applicable to any other device on the market, So it
could only work for this one phone. So that
sounds good. In its current form, that's important. They
(16:25):
didn't go so far as to say that; that's the
important part that's left out. But yeah, so that
was reasonable. You could argue, saying this is for one
case only. It's an important case. People died as
a result, extraordinary circumstances, and we need to have a clear
timeline, because we don't know: is
there something planned for December second? Right? Or was there
(16:48):
another person involved that we need to get hold
of? Because otherwise this could happen again. So
they've said, you know, one time only use, we're gonna
destroy it after that use. And hey, Apple, if you
don't want to make this iOS, that's fine, you don't
have to do it, we'll do it. We'll hire some
(17:09):
people to reverse engineer it, build out an iOS ourselves.
Here's the thing though, Those iPhones will only verify firmware
if there's a special Apple digital signature attached to the firmware.
So in other words, if the digital signature, which is
unique to Apple, if that's not in there, it won't
(17:31):
be verified by the device, it won't be loaded on
because Apple famously wants to make sure that their hardware
and software works together. And that's it. No one else
gets to play in that playground. Yeah, it's the same
with their Mac computers, forever, right? Like, you were really
meant to run Apple software on Apple hardware, and never
(17:54):
the twain shall part. So same thing with this iPhone.
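In code, that gatekeeping works something like the sketch below. Real iPhones verify an asymmetric signature in the boot chain; the HMAC and the key name here are illustrative stand-ins for the idea that only a secret Apple holds can produce an installable image.

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's closely held signing secret.
APPLE_SIGNING_KEY = b"known-only-to-apple"

def sign_firmware(image: bytes) -> bytes:
    """What only Apple can do: produce a valid signature for an image."""
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """What the phone does: refuse any unsigned or tampered image."""
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"apple-built-firmware"
assert device_accepts(official, sign_firmware(official))      # installs
assert not device_accepts(b"outside-built-firmware",
                          sign_firmware(official))            # rejected
```

This is why a third party reverse-engineering its own iOS build wouldn't help: without the signing secret, the device simply refuses the image.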
So the FBI says, we're not even asking you
to give us the digital signature, which would be disastrous
if Apple did that. The whole point of the signature
is to make sure that only Apple can do this stuff.
What they're saying is, we'll make this iOS, we'll give
(18:14):
it to you. Yes, you will sign off and send
it, right? You give a little stamp of approval with
your little digital signature, and then we can load it
onto this phone. But that way you don't even have
to build the code. See, we're being really reasonable. So
here's the other issue. Apple kind of shot itself
(18:35):
in the foot. See, this is kind of a
workaround already, this idea of being able to create
a new kind of firmware to work around the
security measures while not affecting any of the underlying data.
The reason why that's possible at all is that Apple
has allowed for the possibility of issuing a firmware update
(18:57):
to a phone without the phone's owner having to accept it. Ah, yes. See,
if Apple had designed this so that when it pushed
out a firmware update, you as the user, had to
log into your phone and accept it, there'd be no
way for Apple to do this because you already have
(19:18):
to have the PIN in order to accept the update.
So you can't work your way around the PIN, because
even the update to try and do the workaround would
still need the PIN. But that's not the case. Apple
can issue a firmware update without the user's consent. This,
by the way, is also a security problem. Not just a
(19:38):
security problem in this particular case. But what if someone
at Apple, you know, decided to code it in so
that you could activate the microphone remotely. Yeah, and they
and they shoot this firmware update out and you don't
have the ability to deny it. You might not even
be aware that it happens unless you happen to be
(19:59):
using your phone when the firmware update gets pushed
to your phone. That's an issue. So because Apple can
do this, that gives the FBI the leg up
to make this request. So the FBI is just trying
to be as reasonable as possible in their request while
(20:20):
avoiding addressing the problems that could arise should Apple
agree to it. And yeah, that's the thing. Okay,
that's the thing. No matter how single use
this might be, no matter how noble or even crucial
(20:45):
the cause, right, there is not a practical
way that this would work without severe repercussions. Yeah, there's
a word I like to use in this case.
It's called precedent. So when you set a precedent where
Apple agrees, acquiesces, surrenders, is compelled to however you want
(21:09):
to put it, to the FBI's demands, you can't
undo that; it has happened. And perhaps more importantly, not
only has it happened with the US government, but now other
governments that operate where Apple sells products could come to
Apple and say, we know you can do this because
(21:32):
you have done it, and we know you will do
this because you did do it. So if you want
to do business in our country, you know, the one
that starts with "Ch" and ends with "ina", you will
do this for us. And when you're talking about a
government like China's government, you could see how this could
be used to an abusive extent. Anyone who is
(21:55):
identified as a dissident could be targeted. And China's an
enormous market. Right, Apple cannot, as a publicly traded company,
turn its back on the biggest emerging market in the world.
Not even emerging, it's emerged. The biggest market in the world. Also,
Apple's manufacturing base is in China. That's
(22:18):
part of it. That's already been a security concern. Let's
also consider, I mean, if we're being honest, okay, how
do I say this correctly, Jonathan? Oh, yes. Okay.
While there is no universally acknowledged proof that corporate
espionage projects or operations coming from China are sponsored by
(22:42):
the government, there is widespread certitude that that is the case.
Occam's razor? Right, right, Occam's razor. You look at it
and you think, okay, it is entirely possible that any
hackers operating in China: one, maybe they're not Chinese. Unlikely,
(23:03):
but possible. Two, maybe they're operating from a different country
and using proxies to go through China; but considering the
Great Firewall of China, that seems like it'd be extra headache
for those hackers. Three, they could
be Chinese and operating in China but not be directed
by the Chinese government, in which case they would still
(23:26):
have to use proxies in order to access that level
of infrastructure. It just gets to a point where you
think the simplest explanation is, in fact, there are these
state backed hackers that are doing this on behalf
of, or at the request of, the Chinese government. If not at
the request, at least with the implicit approval, there
(23:46):
we go, at least someone looking the other way. But
even that's not enough; there's assistance involved. Anyway, I'm
derailing us. Well, the point being that if Apple
were to agree to this FBI request, there is a
distinct possibility that it would face not only future requests
from other government agencies as well as the FBI,
but from other countries as well, and that it would
(24:09):
it would be the wedge that drives open the possibility
of things like these back doors that Apple and other
companies have been resisting for years now. There is one
other case that I think we
should clarify here. When we talk about precedent, there
have been precedents in the legal past wherein
(24:29):
Uncle Sam was allowed to compel a company, right, to
do something. And this is different, but
I think we need to differentiate these, because it is
legal in the US to compel a third party, whether
an individual or a corporation, to help, how do
(24:50):
they word it? Like, execute a court order or something like that. Yeah,
So you're talking about the All Writs Act, right,
which is a very old thing, going back to seventeen eighty nine. Okay,
the US wasn't even very much the US right there,
right? Seventeen eighty nine. This is not a smartphone specific law.
There were still occasional battles with the British going on, okay,
(25:14):
seventeen eighty nine, and many states were not here yet. Yeah,
seventeen eighty nine is when this writ was first
written down, as part of the Judiciary Act. So specifically,
what the All Writs Act allows the government to do
is to compel a third party to accommodate federal orders
(25:35):
like a search warrant. So, in other words, the federal
government can issue a search warrant to law enforcement. The
law enforcement can go to let's say it's an apartment building.
They can go to the manager of the apartment building
and say, we have the search warrant. The All Writs
Act tells us that you have to allow us to
go into this apartment to search it. And then the
(25:57):
apartment owner says all right and lets them in. Now,
this serves a couple of different purposes. It expedites
the work of the federal government in investigations
and things of that nature, and it also provides a
protection to those third parties because the third party is
having to comply with a federal request. And if you
(26:18):
are a person like an apartment manager and your other
tenants are coming to you and saying, why are you
letting them into someone's room without their permission? You can
say, I have to by law. That protects you as
the owner as well, because it means that you're not
a rat. You know,
(26:39):
you're following the law, you're obeying the law. Right, so
automatically anything you do in the assistance of the execution
of that court order is automatically legal. I mean, as
long as they're not saying, hey,
we need a search warrant, and you go, sure, hey,
while I'm on the way, do you mind if I steal
a car? Right? Or they can't say, hey,
(27:02):
we're going to get a search warrant, do you want
to just listen in now? That would not be cool. But
there's a very important idea that's attached to this All
Writs Act, which is that, and the Supreme Court has
ruled on this, you cannot rely on the All Writs
Act to compel a third party to action if it
(27:24):
creates an unreasonable burden upon that party. So you might say, well,
what's the unreasonable burden for Apple? I mean, all they're
asking for is a way around this security system, just
one time, just the one time.
So there are actually several counterarguments to this. First,
(27:46):
you know, Apple is saying their programmers may not even
know how to make the code that would allow for
this to happen. They're like, listen, we don't even know
that we can build this yet. So you're asking us
to do something that we don't know we can build.
So that's an unreasonable burden, because it means we have
to divert our assets from projects that they
(28:08):
should be working on to trying to figure out if
this thing is possible and if so, how to do
it. Now, the FBI's argument was essentially, hey, you're in
the business of writing code. There should be no problem.
I would desperately need a bleep sound effect right now,
but I call BS. Let's be nice. I'll call
(28:29):
BS on that argument, because to me, that's the same
as coming up to, let's say, Ben, that
you are a cook, yes, and you cook in an
Italian restaurant. You're the Italian restaurant's head cook.
I go up to you and I say, hey, listen,
I'm from the federal government. I've got this
(28:49):
executive order, this federal order for you. You have
to now go and make a dinner of Peruvian food
right now for thirty people that are in the restaurant.
You can do it, because you cook, right? Exactly. Or
it's like saying, yeah, it would be a disastrous
(29:10):
Peruvian dinner, at the very least a loose interpretation. So like
for another example, let's say, to make it even broader,
because I think this is pretty good too. So, Jonathan,
let's say that you are a doctor. All right, okay,
you're a doctor, Doctor Strickland. Hey, call me Jonathan,
(29:31):
Doctor Strickland's my dad. So you're an easy going doctor, clearly,
and you're an ear, nose, and
throat man. And so I come up to you
and I say, I'm from the federal government. I have
an executive order, but not an appointment. So you're already
(29:53):
kind of irritated. And I say, I
need you to make the cure for cancer. That's a
bit much. Yeah, I mean, you're a doctor, right? You
know about organs and stuff. Cancer affects bodies. Or going
to Harley-Davidson and saying, look, you make
stuff where wheels are connected to chassis and they turn
(30:16):
and a motor keeps things going. I need you to
make a bus. I mean, you see where
it's ridiculous. You can't argue that, because this
company is in the business of doing this one thing,
which by the way, is just one part of Apple's business,
that they are capable of making this other thing that
happens to fall into that same category. That is ludicrous
(30:37):
on the face of it. So that's argument number one
about it being a burden. Okay. The second burden is
the one that we've already touched on. It sets a precedent.
If Apple can be forced to attack the security of
its own system in this case, it can happen again,
and that would be a disastrous result for consumer,
you know, confidence in Apple's products, and very bad for
(31:00):
Apple's bottom line. So if Apple says, look, you will
make us lose millions, if not billions of dollars in revenue,
how is that not an unreasonable burden? How can you
argue that that burden is reasonable? Right.
And not only that, but then you get into the
the foreign agent approach, the foreign state approach, saying what
(31:24):
if this means that China comes to us and says,
because of this other thing that we agreed to do,
we now have to do it in China all the time,
and real human beings are being pursued, and their
lives are turned upside down and ruined as a
result of it. And it's all because we have to
comply because this has already set a precedent. That's an
(31:47):
unreasonable burden. And finally, they've even said that
it's a violation of their right to free speech. And
the reason for that is because code has been ruled
as a type of free speech in the past. And
if the government compels Apple to write code that Apple
doesn't believe in, they're being compelled to speak against their
(32:11):
own beliefs, thus a violation of free speech. Now that argument,
most people are saying it's probably the weakest of all
of their arguments. They're styling on it a little, but I gotta
admit, that's pretty awesome style. Well, yeah, it is.
I enjoy it. And, while it's reaching, it's not invalid.
And let's consider that Apple, legally, in this kind of
(32:33):
case, is playing against Uncle Sam on its home territory, right.
And this means, you might often wonder why,
when there are suits or countersuits or legal problems,
so many cases open with just this laundry list of arguments.
And it's because, you know, if we could
(32:54):
go back to my Italian restaurant, you're just throwing
the spaghetti at the wall. It's a scattergun approach,
or if you prefer, it's casting a very wide net
because you aren't sure which tactic is necessarily going to
be your best one from the starting gate, so you
want to throw out all of them at once. And
(33:14):
if it's a compelling enough argument, then you
can get things thrown out before they go any further.
And in fact, Apple has said that they're willing to
go all the way to the Supreme Court with this
particular fight. The CEO said that publicly. Yeah, which is interesting,
because, you know, we recently lost a U.S.
(33:35):
Supreme Court justice here in the United States, and I
actually think he probably would have sided with Apple on
this one, because of his very strict view of
the Constitution, an originalist, Antonin Scalia. Yeah. So, going back
to this argument of it, I'm going to read a quote,
and I want to see what your reaction is, because
(33:56):
I know what mine was. This is from congressman
David Jolly of Florida, who said Apple's leadership risks having
blood on their hands. Ben is shaking his head and
looking at me in disdain. Not at me, he's looking
through me. It's absolutely insincere, first off. And usually,
(34:19):
you listeners, whenever you hear people make lurid, imagery-based
appeals to emotion, right, or these hyperbolic accusations,
this is the bread and butter of, I'll say it,
(34:40):
the political class. Political theater, exactly. That's a great phrase.
And so, to say this in such a way,
what it does is, psychologically, you get an image of
somebody with literal blood on their hands. And then,
you know, they're trying to cast aspersions against Apple, not
(35:01):
by making any point about the arguments Apple
is making, but by going instantly to, these people
are murderers, why are
you defending murderers? That's essentially, you know,
the argument about, you know, it's one of those
(35:21):
legal arguments you will hear occasionally. This isn't
a legal argument, but it's like the legal arguments you
will occasionally hear where it's clear that the lawyer is
trying to appeal to the jury's sense of emotion as
opposed to addressing the facts of the case itself.
So I agree entirely with you. The first reaction I
had was I'm offended by that statement. I mean, it's condescending,
(35:44):
and it imagines that the person who
would be swayed by that is not intelligent enough to read,
and it deflects the fact that there are two
people who were responsible for that terrible attack, and
those two people are the people who
are holding the guns and pulling the triggers and aiming
(36:05):
at people. Those those are the ones who have blood
on their hands, and of course they're they're both dead now.
They both died in the shootout with the law enforcement.
But the point being that they're the ones responsible, not Apple.
Apple did nothing in relation to this crime. In fact,
Apple didn't even give Faruk the freaking phone; that was
issued by the county. Apple just made a
(36:28):
device that has this level of security on it that
people wanted. People wanted that level of security. They want
the reassurance that Apple itself can't access their phones without
their permission. It's a very important cornerstone of security. In fact,
if you look at iOS eight or earlier, Apple could
bypass security. They could they could access the information on
(36:52):
a phone without your PIN, without you acquiescing and allowing
that to happen. But they specifically change that with iOS nine.
They made it so that they could not do that
because they said, it's important for consumers to trust the company.
And how can you build trust if you know, in
(37:12):
the back of your mind this company could at any
moment access my private information that I have not chosen
to share with them. Uh, well, you know that trust
is destroyed in that case. That kind of brings us
back to that unreasonable burden. So these arguments are continuing.
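The passcode-entangled encryption Jonathan is describing can be sketched loosely in Python. This is not Apple's actual implementation; the choice of PBKDF2, the iteration count, and the simulated device UID are all illustrative assumptions. The point is only that the decryption key depends jointly on the user's passcode and a device-bound secret, so the vendor alone cannot compute it:

```python
import hashlib
import os

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # The key protecting the data is derived from BOTH the user's passcode
    # and a secret unique to the device hardware, so neither the vendor
    # (who never sees the passcode) nor someone with only the passcode
    # (and not the hardware) can compute it. Parameters are stand-ins.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

device_uid = os.urandom(32)          # stand-in for a hardware-fused UID
key_right = derive_key("1234", device_uid)
key_wrong = derive_key("0000", device_uid)
assert key_right != key_wrong        # wrong passcode yields a useless key
```

In a design like this the key is recomputed from the passcode at unlock time and never stored, which is why there is nothing on file for the vendor to hand over.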
Uh, I think the next stage doesn't start until
(37:35):
March 10, um, and we're recording this in February. So
I am very hopeful that the government ultimately sides with
Apple on this, or that, when this gets to the courts,
the courts at any rate side with Apple, because if
they do not, this could be like
(37:57):
the snowball effect, where we see more requests of
this nature come in, and then once it's
been established as precedent, it's much easier for it to happen in
the future, and it's easier to see larger requests,
things that go beyond all we need is for you
to help us circumvent the security, and into we finally
are going to get what
we wanted all this time: we want a direct path,
like a doorway that's labeled FBI, that lets us
go straight into the data that your users are storing
on their devices. Which leads to a horrific situation,
because the FBI is an
(38:38):
institution, sure, but one made up of individuals. Remember when the Snowden
leaks revealed to us the extent of unethical use of
surveillance by the NSA. People would look up
their ex-girlfriends or stalk their ex-boyfriends. Yeah,
they were just looking up people that they were interested
in with absolutely no oversight. As a matter of fact,
when we talked about the Chinese government looking the other
(39:03):
way for hackers, that was the same thing that occurred
with the NSA. To assume that,
for some reason, given the opportunity, individuals in another law
enforcement branch or another institution would not do the same
thing is cartoonishly naive, you know. Using the argument of
(39:26):
this is a one time use, that wouldn't stop the
FBI from requesting another one time use, or another one
time use, or even extending that beyond it, saying all
right now, we want a one size fits all approach
to doing the same thing because it's too much time
for us. And don't worry, we'll get a court order
before we do it. We'll make sure that nobody else
(39:46):
gets access to this ability, and you'll know that the
court order is good because they'll be classified. So we'll
just inform you when the orders are approved. I have
said it many times that there is no way to
ensure security by enforcing a vulnerability. Yeah, and I think
that's an astute way to encapsulate it. But there is
(40:06):
a question that I have that I'm sure a lot
of you have as well, ladies and gentlemen, which is,
let's say the worst happens: the court rules in
favor of the FBI, and Apple says, no, we're not
going to do it. Well, I mean, if the court,
(40:27):
assuming it goes all the way up to the Supreme Court,
this could end up becoming a matter of law where
it's codified that companies have to obey that within the
United States, which would mean far-reaching implications, not just
for Apple, but for all companies, everyone, any tech company,
any company, really, any company doing business in the US,
(40:49):
even if it's not based here, because it's doing business here. So that's a
big deal. It's potentially disastrous for privacy. Uh, there are
ample possibilities of misuse. We've talked about the possibility that
if you do create a vulnerability, someone somewhere is going
to try and figure out a way to also gain
(41:11):
access to that vulnerability. And these are not necessarily other
law agencies or intelligence agencies, or maybe they are intelligence agencies.
They just have to be intelligence agencies working for a
different country. And yeah, and here's the problem, because we've
talked about this before, man, I think we
talked about this with autonomous vehicles before: legislation is almost
(41:34):
always outpaced by technological innovation. Yes, yeah. You will
almost always see a case where someone has figured out
something really interesting to do with technology, or perhaps even
really scary things they could do with technology, and there
are no laws to cover it because before that person
figured it out, it didn't exist. So you don't write
laws for stuff that doesn't exist. We don't have a
(41:55):
law saying, listen, guys, it just keeps
me up at night. We have got to write a
law about what happens in the case that the Loch Ness
Monster is real, gets out of Scotland, comes over to
New Jersey and starts to eat people. We need a
law to protect us from this potential catastrophe. Ridiculous. Yeah,
(42:17):
Like if we're senators, everyone in the audience and you,
Jonathan myself, and then one of us walks in and says, guys,
I know that, uh, I know that we have some
other issues coming up, and we have to nominate a
Supreme Court justice, and there's an election coming up. But I
think we need to look into the future and look
at the big picture, which is moon boot theft right,
(42:40):
because I don't want people shoeless on the moon when
and if we build a colony there. I think arguing,
for example, for robot rights right now might be a
little premature that kind of thing. Maybe not forever, but
for now. It's definitely good to think about. But
you know, you make a very astute point when you say,
if we're talking about codifying something, codifying a law,
(43:03):
then what happens is, once the Supreme Court rules
on something like this and it becomes a matter of law,
it is very, very difficult to get
that kind of ruling revisited. The Supremes are pretty busy people.
They don't hear every case that's brought before them. They absolutely don't.
But the thing is, you think it's hard
(43:25):
to get those justices to
change the substance of American jurisprudence. Imagine trying to get
them to change it back. This is like a Pandora's
box Pandora's jar situation. Yeah, now this is not not good.
You don't want this to happen for multiple reasons. Now,
all that being said, our our sympathies with the families
(43:49):
of those who were wounded and killed as a result
of this mass shooting. Absolutely I feel awful for them.
And if there were any other way to get to
that information that did not require Apple to be complicit
in destroying its own security, I would be in favor
of it. And in fact, the FBI has taken such
pains they got access to the iCloud backups that this
(44:13):
phone creates. The problem being that the phone didn't have
an iCloud backup for the month leading up to
the actual attacks, so there could be information on the
phone that's not on the cloud, and that's why the
FBI wants to get access to that. I totally understand
the reasoning behind it. But two things, of course,
keep me from being completely sympathetic. One is that the
(44:34):
FBI has for years been trying to get backdoor access
to multiple systems. Yeah, so you could argue
that perhaps this mass shooting is being leveraged cynically by
the FBI in order to further their goals, because it's
hard to say no to such an emotionally devastating event.
(44:59):
I would say, I do, I do believe it.
And this is just my personal opinion based on again precedent.
It is completely within the realm of not only possibility
but plausibility that an institution would wait for an opportune
(45:20):
time to make this kind of legislation, like the
argument for internet surveillance based on saying, hey, we need
to protect people, we need to protect you and your children from, um,
inappropriate content. Think of the kids. Really,
this is this is a lot of the same stuff
(45:41):
we heard in the wake of the Patriot Act, where
a lot of people felt the Patriot Act was a
reactionary, uh, piece of legislation that was drafted far too
quickly and had reached far, far too wide
for what it was proposing, for what everyone
(46:01):
claimed it was all about. And, uh, that was a
big mess. This is also potentially a really big mess.
And the Patriot Act, the substance of it had pretty
much been written in advance. Yeah, yeah, which is,
like, that used to be a controversial statement,
but now it's acknowledged. Yeah. So this is, uh,
(46:23):
I mean, the fact that the FBI has had this plan
for a while, not the specific implementation, but this desire
to get this workaround access to things. And I mean,
I totally understand their point of view, too. They're trying
to investigate things. It's not like the FBI is necessarily
made up of the cigarette smoking man and all of
(46:43):
his cronies. You know, I'm not going to...
I don't mean to, uh, disparage them. I don't
want to demonize them, Right, That's not That's not what
I'm trying to get at. Either. The FBI's intentions may
in fact be nothing but noble that they want this
in the effort to investigate, solve crime, prevent crime from happening,
(47:05):
and not in any way that is uh malevolent. However,
the fact remains, whether their intentions are noble or not,
it opens up this opportunity for people whose intentions are
demonstrably not noble to take advantage of those same opportunities. Yeah,
and you know, I'm glad you said that because I
want to say something that I want to add to this,
(47:27):
something that is rarely said when we talk about, um,
government surveillance or concerns about privacy, right. One thing that
is rarely said is that law enforcement agencies, law enforcement institutions,
and individuals in the US actually do quite, um, quite
an extraordinary job compared to a lot of places. If
(47:50):
you're fortunate enough to grow up in a place that
has rule of law, where you can walk down the
street in the dark, or you can say, uh,
you know, whoever your senator or president is, you can go
on the internet and say, I think they stink. I
think you're a jerk. Yeah, I think you're a piece
of bologna with shoes. But in other countries,
(48:13):
we, you know, people get arrested for that, people get imprisoned.
Or erased. Yes, not just arrested or imprisoned,
but the government in some countries will take steps to
make it seem like that person was never a person exactly.
They'll keep the photos, but you won't be in them.
And I say that because it's a sense of
(48:35):
much needed perspective. However, you know, I'm not demonizing the
people who work at the FBI. Institutions, whether private or public,
seek power. They seek, uh, further influence. And that's
(48:55):
not because it's some sort of James Bond supervillain thing.
It's not SPECTRE. It's not SPECTRE. It's because it
allows for an easier, more efficient pursuit
of whatever the original mission would be right, and sometimes
that ends up stepping over a line that should not
(49:17):
be crossed. So yeah, like, Ben and I,
we both talk about things here at work where
we say, gosh, I wish we had X, because it
would make our lives so much easier. Well, even if
we got X, we would come up with Y. That
would be the next one. We get X, and X is awesome,
X is helping us out. We're like, oh man, it's
(49:37):
so good to have X here. Yeah, but it would be
great if we had Y, because if we had Y,
we could really do our jobs. Well, we get Y,
and then, you know, man, X and Y are working
out like a dream. But boy, if we had Z,
can you imagine the level we'd get to? Now, what
we do, Ben, we make fun audio podcasts, videos, and
(49:59):
articles that go on the Internet, and that's awesome and
so really our capacity to do horrible, horrible harm is
fairly limited. I mean, with respect to our jobs, comparatively. Yeah. Granted,
if either of us wanted to go outside and just
start throwing King of Pops popsicles at people, we could go
(50:19):
on a popsicle rampage. But that's not, that's not
job related. The FBI, the CIA, the NSA,
a lot of those three letter organizations, in pursuit of
what they need to do in order to fulfill
their organization's mission, in some cases will step over
(50:41):
lines that we cannot allow people to cross because it
creates a system that is at least as dangerous as
whatever problem they're trying to solve. Another example of this
is the idea of absolute prevention.
(51:03):
Like, you know, there was the old conversation about torture
several years ago, when it was the ticking time bomb
argument, which was, should torture be legal if there is
a criminal in custody who is suspected of having knowledge
of another nine eleven? Yeah, yeah, the Jack Bauer from twenty
(51:27):
four kind of argument. Should torture, while reprehensible, be allowed
when it gets results? And this, this kind of reasoning
is dangerous. I'm not saying that because of any
desire to see human tragedy. I'm saying it's dangerous
because of the assumptions it makes: that a special case
(51:51):
will remain a special case, right, and that perhaps the
next case, which maybe isn't quite so special... Like, well,
you know, we've done it before, so what's the deal here? Yeah,
I mean, what happened? Yeah. So this is, this is
exactly why, and I feel pretty strongly about this,
(52:13):
I think I'm on the right track, that neither you
nor I feels that the FBI should win out in
this particular case. I think this is something where we
really need to see Apple come out on top. I
am not a huge fan of Apple. I don't own
a lot of Apple products. Apple does not
sponsor this show. I'm not getting any money from Apple.
(52:36):
If anything, I'm losing money to Apple because my wife
is a fan, so she wants to get an Apple Watch,
but I am not. I'm not getting anything from Apple. Uh.
I do think they're in the right because I don't
want to see a precedent where a company that creates
a secure system has to be or can be compelled
(52:58):
to compromise that security. It defeats the purpose of the security,
and whether it's this case, which is extraordinary and very emotional,
or something much less impactful for the general public. Maybe
it's something, you know, uh, simpler and less dramatic. It
(53:19):
doesn't matter. You cannot go down that pathway,
uh and expect things to turn out all right. You've
got to figure out other ways to do that kind
of investigation. Either Apple needs to go in a direction
where they can, uh, access user data without
(53:41):
having to circumvent a security system like this, which means
they have to go backwards, which can really is not
a possibility. Or Apple and other companies have to create
systems where it really is impossible for them to access
without the consent or the actions of the owner of
(54:02):
the device. I suspect that every company is rushing to
develop that kind of approach right now, because none of
them want to be in this position. Facebook, Google, Microsoft
all publicly showed support for Apple. Apple right now is,
um, you know, Apple hired a dev that
(54:23):
worked on Edward Snowden's favorite messaging app. Yeah, and, uh,
I don't know how much of that is meant
to be like a pr move, but also they have
um you know, the leaked Snowden papers are out there,
and I know I'm harping on them. They revealed an
ugly behind the scenes look at corporate involvement with government
(54:47):
requests for surveillance. You know, so the average consumer, you, me, uh,
Gary Busey, whomever, we have much less trust in general
in these companies because we have a reason not to
trust them. Well, we have handed over so much
(55:11):
of our own personal data. We trust that the devices
that hold that personal data aren't going to just give
that away to whatever entity, uh, without our consent. We
we trust that that's not going to happen. When things
like this pop up where we start to question that trust,
(55:33):
that's problematic. There's someone else that Apple has recently hired,
Ted Olson. Does that name sound familiar to you? Ted?
Ted Olson's a lawyer. Uh, so, Ted Olson is going
to be representing Apple. Ted Olson's probably best known for
representing George W. Bush in the Bush versus Gore election fallout.
(55:56):
You know, for those of you in the United States
when Bush was running against Gore, there was this
whole battle about, you know, voting recounts, and
Olson represented, uh, George W. Bush on that, and
Bush ended up winning that. So now he's representing Apple
in this particular battle with the FBI. So interesting to
(56:21):
see these kinds of personalities involved in this. And I
know that with the public perception it's been a little
seesaw-ish. But the general public, I would argue, the
people who are not necessarily paying attention to the tech sphere,
I think a lot of them are siding with the
FBI because it's a terrorist story. It's a story about
(56:45):
trying to establish the most information about these these shooters
as possible. Do you think so? I do. I think,
at least in a lot of polls that I've
seen leading up to today, the general public tends to
side with the FBI, because the FBI has a very
emotional story. Apple's story is much more rationally based, intellectually based, um,
(57:09):
and the FBI's story is pinned on this event,
this very emotionally charged event. UM. I don't know that
that's going to continue. I think. I think people who
are savvy in the tech sphere, I think the majority
of them side with Apple. But it's still, because it's
still, it goes back to the ticking time bomb argument.
(57:30):
You know, yeah. Yeah, So this is one of the
stories that we definitely had to cover. I mean, obviously
it's such a huge story, it's probably the biggest story
in tech right now as we record this, um, and
I'm glad that you could join me, Ben, to chat
about this, to kind of give your your insight as well,
and to talk about why this is so important, not
(57:52):
just from a technology standpoint, but just a philosophical standpoint
and in a matter of law as well. So, Ben, where
can people find the other stuff you do? Ah? Yes, uh, well,
if for any audience members who are interested in stories
about Big Brother and overarching government surveillance, you can join
(58:17):
me along with my producers Noel Brown and Matt Frederick
at our show Stuff they don't want you to know,
on which you have made several appearances, Jonathan. I enjoyed
all of them. Oh, thank you. I like it when
we go on the pun tangents, which sadly doesn't happen
too much off air. But oh and side note, folks, Yeah,
(58:40):
I know we didn't make very many jokes today, but
that's because, I think I can say, you know,
you and I are on the same page with this.
This is a very important issue, more important than
it may appear at the time. Yeah, we made
light of a few of the arguments simply because we
scoffed at, uh, the stated intent as opposed
(59:05):
to what we believe is the true intent of them. Yeah,
we've done that a couple of times. But also,
you know, we do acknowledge that this ultimately is being
tied back to a really awful event. I mean, we
can't you know, we can't get around that, and you
have to acknowledge it and be respectful, I think, right.
(59:25):
And plus, uh, plus we think we're funny, so that too,
and we want you to remember that we think we're funny, right,
that's our opinion. We don't expect you to
find us funny. That would be presumptuous, but we are.
We think we're pretty hilarious. You can also find me
with my co host Scott Benjamin, who has appeared on
Tech Stuff as well, on our show Car Stuff, a
(59:47):
show about everything that floats, flies, swims, or drives, Jonathan.
Folks can see us, uh, together even at times on
our shows What the Stuff and Brain Stuff, which answer
everyday science questions: Why does popcorn pop? What is
GPS? And, uh, I know that this is something I
recommend every time if there's someone who hasn't checked it
(01:00:09):
out yet. You have another show called Forward Thinking, which
is a bright, cheery show about the future, and it
really is. It's a very it's it's an optimistic view
of what our future can be based upon the very
long history of humankind coming up with ingenious solutions to
big problems. Well, yeah, and you have, um,
some great questions on there. The research is top notch.
(01:00:32):
This is a video and an audio series, so I
highly recommend it. Well, thank you so much, Ben, and
I really do hope that you and I get the
opportunity to do some more of those videos together. It's
been a long time since we've done a two person episode.
I wish you know what I wish I could tell
everyone is to go to Brooklyn and catch the two
of us competing in, what was it, the Punderdome. Yeah, there's
(01:00:55):
gonna be no puns barred when we go there. Alright. So,
guys, if you have any suggestions for future episodes
of Tech Stuff, let me know. Send me an email.
The address is tech stuff at how stuff works dot com,
or you can drop me a line on Facebook or Twitter.
The handle at both of those is tech stuff H
s W and I will talk to you guys again
(01:01:16):
really soon. For more on this and thousands of other topics,
visit how stuff works dot com