Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Intro (00:01):
This is a Technikon podcast.
Peter Balint (00:06):
It's not always obvious, but responsible technology activities often benefit
from ethical guidance and consideration. And this stands to reason:
after all, technology projects may bring to light new ethical issues
which didn't exist before the technology did. Hello and welcome.
I'm Peter Balint from Technikon, and this is our podcast
(00:28):
series called "Ethics and Technology: A Prerequisite for European Research".
We look at H2020 research projects in Europe and the
role ethics guidance plays in bringing them to a
successful conclusion. With this guidance, technology outcomes are less likely
to be plagued by unintended consequences or to overlook humanitarian sensitivities.
(00:51):
It's now becoming customary to install ethics experts in technology-related
projects. But to what end? And how can we
ensure that the value of ethics policies is realized by
the entire consortium? To help us get a better understanding
of how this all comes together, we welcome Rebecca Roache,
a British philosopher and senior lecturer at Royal Holloway, University
(01:14):
of London, and Jonathan Seglow, associate professor in the Department
of Politics, International Relations and Philosophy, also at Royal Holloway. Together,
they advise the EXFILES Project Consortium on the ethical considerations that
come to light in this highly technical endeavor to
create tools and methods for unlocking mobile telephones which have
been taken into evidence by law enforcement agencies. Thank you
(01:37):
both for joining us today.
Rebecca Roache (01:39):
Thanks for having us.
Jonathan Seglow (01:40):
Thank you.
Peter Balint (01:41):
Rebecca, we'll start with you. When it comes to ethics,
we often follow precedents: what has been suggested, ruled,
or practiced in the past. But we're talking about technology
projects now, and in many cases there is no precedent.
So how is this handled?
Rebecca Roache (02:01):
Yeah, this is tricky. By following a precedent, we have
an easy approach: we can just follow whatever ethical norms have
been established already, which may or may not be the
correct ones. But, you know, at least by following precedent,
(02:22):
it's going to be fairly uncontroversial. We're talking here
about police access to encrypted data on people's mobile phones,
and we don't have a clear precedent for that. I
think probably the closest we have is something like the
circumstances where the police gain the authority to enter somebody's
(02:43):
home and look around, whether or not the person who
lives there is happy with that. With encrypted mobile phone
data, it's a bit more difficult, because they can't do
that. Encrypted data is difficult to get into without the
encryption key, which mobile phone manufacturers claim not to be able
(03:07):
to provide, or maybe really can't provide, so investigators have
to hack their way into it. That's what happens at the
moment. Where we might be worried centres on the question of
to what extent encrypted data is comparable to the stuff
you have lying around in your home. A lot of us use our
(03:29):
phones almost like an outsourced memory. Ten years ago,
we might have remembered the phone numbers of our friends and family;
probably many of us don't anymore. We store them in our
phones instead. The phone contains our photos, our correspondence, lots
of stuff that we store
(03:51):
under the assumption that, because it's encrypted, nobody else is
ever going to see it. So there's a sense in which
the data you have on your phone is less like the
possessions you have in your home and more like the
thoughts you have in your head. And this makes it
tricky to decide what to do ethically, for one thing, because
(04:12):
there's an issue around the fact that this is new.
What do we say about it? How important is
an individual's privacy? Is it more important than the interest
of the police, and the public in general, in having
criminal investigations take place? I mean, you asked about precedent,
and I think there's a difficulty here with what precedent
(04:34):
to follow. But there's also a concern about what precedents
we might be setting with any decisions we make.
We don't know how technology is going to be used
in the future. And however it's used, when people come
to make decisions about how data should be protected, under
(04:55):
what circumstances the state should get access to individuals' data
and so on, they're going to be looking at what
decisions have already been made, so they're going to
be treating the decisions we make about access to encrypted
data on mobile phones as a precedent. And that might
be quite worrying. So imagine that in the future it
becomes possible to store data somewhere other than
(05:18):
your mobile phone, say if people start using implanted
devices, for example, to, I don't know, increase their memory.
If legislators in the future look to what we're doing now
and think, well, there's a precedent for law enforcement agencies
to access encrypted data, and if that were later used
(05:39):
to justify accessing data that's actually implanted in people,
then we might think that is more intrusive than we would like,
certainly more comparable to accessing the thoughts in your head.
So I think, as well as making decisions about the issue at hand,
(05:59):
you know, what are our rights to privacy when it comes to
mobile phones, we also have to be almost groping in the dark,
thinking about how the decisions we make now might be applied
to technology that arises in the future. And we don't know
what that's going to be.
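To make the point about encryption keys concrete, here is a minimal sketch in Python of why encrypted data stays opaque without the key. It is purely illustrative, not part of the EXFILES tooling, and it assumes the third-party cryptography package is installed (pip install cryptography):

```python
# Minimal illustration: symmetrically encrypted data is unreadable
# without the key. This is NOT EXFILES tooling, just a sketch.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()  # held only by the phone's owner
ciphertext = Fernet(key).encrypt(b"private notes, photos, messages")

# With the key, decryption is trivial.
assert Fernet(key).decrypt(ciphertext) == b"private notes, photos, messages"

# Without it, an investigator sees only opaque bytes; trying a
# wrong key fails outright.
wrong_key = Fernet.generate_key()
try:
    Fernet(wrong_key).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: the ciphertext stays unreadable")
```

Real phone encryption is far more involved than this, but the asymmetry is the same: with the key, decryption is trivial; without it, investigators must find a way around the encryption entirely.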
Peter Balint (06:18):
Okay. So what's happening right now could actually
have quite a bit of impact on how things are
handled in the future when it comes to ethics. Jonathan,
do you have anything to add to that?
Jonathan Seglow (06:28):
Well, I think there are real questions here which we haven't
encountered before the era of the very sophisticated
mobiles we have now. One might be: whose data actually
is it? Who owns it? On the one hand, we
think of it as private information, which we share with
a few selected others, or with no one but ourselves. But at the same time,
that's only enabled by certain technology, by these
(06:50):
mobile devices and the networks. And therefore, if the kind
of data we have is only possible because of new
developments in technology, you might think, well, the companies behind it
have some kind of claim over it, because they enable us to
hold certain kinds of data. So, you know, it makes
a big difference when you're using something like a mobile
as opposed to an old-fashioned notebook you'd write things in. There's also,
(07:14):
of course, the sheer range of data we're collecting: not
just contacts, but perhaps quite personal medical information,
or information about your employment history, all sorts of things
which we have a very strong interest in keeping private
and which we wouldn't have collected previously. I would agree with
Rebecca's point that we are setting precedents. We can look
(07:38):
forward to a world, perhaps a dystopian world, where potentially
huge amounts of data could be accessed by law enforcement authorities,
you know, for good reasons, but raising these privacy issues.
But I think Rebecca and I wouldn't want to see
ourselves as engaging in a kind of top-down model
where we, as ethicists, or the more technical participants in
(08:01):
the EXFILES project, simply end up instructing other citizens,
other parties in society, what these precedents should be.
We ought to see what we're doing as more of a
democratic conversation: the reports we write for the EXFILES project
are part of a conversation which citizens should have with each other
about how we're going to resolve these issues
(08:23):
in the future. Because in a democratic society, law enforcement agents should
be accountable to the people they serve. So it's ultimately
their interests which matter, and it shouldn't be for anyone to
proclaim their own view and instruct the rest.
So precedents should be worked out in a
reasonably public and democratic manner rather than
(08:43):
in an elite, expert manner.
Peter Balint (08:46):
Okay, and this makes complete sense. Now, you mentioned the
technical part of a project, and I'm wondering:
when I look at an H2020 project in Europe, for example,
let's say I'm outside looking in, I might observe that
the ethics contingent in a project is usually not made up of
technical individuals. And the same holds in reverse, too.
(09:08):
So what's the best way to handle this knowledge gap
between collaborators in a project?
Rebecca Roache (09:15):
Yeah, I think it's a double-edged sword.
You get this with interdisciplinary research in general.
On the one hand, the discoveries we
want to make about the world don't slot neatly into
the categories that correspond to university departments. So
it often happens that the interesting questions we want to address
(09:36):
will require an approach involving researchers from various disciplines, with
various educations and pools of knowledge, and that's
what we've got here. We've got the
computer experts and the legal people; Jonathan and I don't
fall into either of those categories, and we're doing
(09:57):
the ethics part. It gets easier as time passes, because
I think Jonathan and I understand more about
the technical aspects of smartphone encryption than we did at
the start of the project. But certainly you're
groping in the dark in a way, because
the ethical principles we might
(10:19):
suggest are premised on certain assumptions about what the technology
can do, and those assumptions are often quite
naive or incorrect for various reasons. The way
we have been approaching it is by having regular
discussions with the more technical researchers on the project:
regular meetings where Jonathan and
(10:41):
I explain what we've been doing and share
our drafts, and have a discussion where the more technical
colleagues have an opportunity to say, "actually,
in this paragraph you're making an assumption which
is incorrect", or to point us to past
(11:03):
historical cases that might be relevant for us
to look at: arguments about
how data is accessed and what the technology involved is doing.
An example we've discussed in the past is
the debate around key escrow, which is where
an individual has some private data, and there
might potentially be some sort of state body which
(11:23):
wants access to it in the future. One solution is to have the
data, or the key to access the
data, held by some independent party. There are
costs and benefits to doing that, but it's
a case that has some relevance
to the issue we're looking at on
(11:45):
the EXFILES project. I don't think Jonathan and I would
have known about it without being pointed in the right
direction by our more technical colleagues. So I think it
is about communication, and regular communication, because obviously,
if you make an incorrect assumption early on
(12:06):
and then just carry on working, you can
potentially waste a lot of time going down a blind
alley and writing a lot of stuff that's actually not
going to be very useful. So we've tried to
have this ongoing dialogue with our colleagues who
have the right sort of expertise to inform the ethics
(12:27):
we're doing.
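For readers unfamiliar with the idea, here is a toy sketch of the key escrow arrangement described above: the user's key is deposited with an independent agent, who releases it only on an authorised request. All the names here are hypothetical, nothing in this sketch comes from the EXFILES project, and it again assumes Python's third-party cryptography package:

```python
# A toy key escrow sketch. The EscrowAgent class and its names are
# hypothetical, purely to illustrate the concept discussed above.
from cryptography.fernet import Fernet

class EscrowAgent:
    """A hypothetical independent third party that holds keys in trust."""

    def __init__(self):
        self._vault = {}  # maps a user id to their escrowed key

    def deposit(self, user_id: str, key: bytes) -> None:
        self._vault[user_id] = key

    def release(self, user_id: str, warrant_is_valid: bool) -> bytes:
        # The agent hands a key over only on a legally authorised request.
        if not warrant_is_valid:
            raise PermissionError("no valid warrant: key stays in escrow")
        return self._vault[user_id]

# A user encrypts their data with their own key...
user_key = Fernet.generate_key()
ciphertext = Fernet(user_key).encrypt(b"messages stored on the phone")

# ...and a copy of that key is lodged with the escrow agent.
agent = EscrowAgent()
agent.deposit("alice", user_key)

# With authorisation, law enforcement can recover the plaintext without
# hacking the device; without it, the request is refused.
recovered_key = agent.release("alice", warrant_is_valid=True)
print(Fernet(recovered_key).decrypt(ciphertext))
```

The sketch also hints at the costs weighed in that debate: the escrow agent becomes a single point of trust and failure, since anyone who compromises its vault, or coerces the agent, gains every deposited key.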
Peter Balint (12:29):
Okay, so like any other interdisciplinary project, apparently communication is the
key here. Jonathan, do you have anything to add to that?
Jonathan Seglow (12:37):
Well, I would add that it's really been very
useful, in writing our report, to have had frequent discussions with
the computer scientists at our university. It certainly enabled us
to write something which is much more policy-focused, which
isn't too abstract or speculative, which is grounded in the
realities in which there are all sorts of criminal behaviour
(12:59):
and serious questions about how that's going to be addressed.
So it's certainly been productive for us. Perhaps it's been
productive for the computer science people as well, because it's
been a process of mutual learning. We have to see
our own discipline from the outside and to present what
are sometimes quite difficult concepts in a very clear and
simple way. And I think that's a good thing to
(13:21):
have to do, not so dissimilar from teaching. I suppose
we've been teaching each other, and I hope the end
result will be a kind of usable ethics: a report
which can actually have some impact on how
the more technological side of this project is conducted, and
which will therefore ultimately lead to a more ethical
(13:42):
technology, where you don't have a policy in which, merely because
something is practically possible technologically, it's therefore done, because there are all sorts
of constraints: clearly legal ones, under the different jurisdictions which EXFILES covers,
but also, related to that, ethical ones. So I think
it's been, as it were,
(14:02):
a productive interchange between those two sides. And it's
certainly been interesting for me, and I think for Rebecca, to be
part of that interchange.
Peter Balint (14:10):
Okay. And Jonathan, what do you think about this idea
of challenges in a large H2020 project regarding ethics?
And we can continue to use EXFILES as an example.
Jonathan Seglow (14:23):
Well, H2020 projects are intended to address pressing social
problems in European societies, and this one certainly does. The way
we thought of it is that there are different parties involved:
principally law enforcement agencies, and citizens, who are for the most
part users of these very advanced mobile phones. But I
(14:44):
think it's a mistake to pit those two sets of interests against
each other, or against the interests of the manufacturers of mobile
devices and the providers of networks. After all, as I
was mentioning earlier, there are issues of accountability. What's important
is that law enforcement agencies are ultimately working on behalf of citizens and
(15:05):
gain their legitimacy from them. So were they to overstep
the mark in invading the privacy of mobile users
in their crime-fighting efforts, to fail to observe basic
procedural safeguards, or to fail to be as transparent as they feasibly can,
then they really would have crossed a
(15:25):
boundary which exists for a reason in a democratic society. These
organisations exist on our behalf. It's also worthwhile thinking about this
from the point of view of mobile manufacturers and the providers
of networks. On the one hand, they do have
a clear interest in making sure their devices and networks preserve privacy;
(15:46):
that's why people will go out and buy them
and use them. But they also have other kinds of interests
which are in some ways in conflict with that: not
being associated with crime, for example, or their own business reputation.
So the ethical challenge really arises when you have more
than one kind of interest in conflict or
(16:07):
contradiction with the others, and you have to find some
kind of balance between them, assessing different
policy options, which will strike that balance in different ways:
allowing law enforcement agencies to access data with the consent
of mobile manufacturers; practices that can be done in secret,
which clearly impacts upon accountability; simply keeping
(16:31):
the status quo, in which case there's going to be
a lot of, perhaps illegal, hacking; or having a
blanket ban on accessing encrypted data,
which is going to make crime-fighting efforts much, much
more difficult. So it's really a question of
weighing these different interests against each other, interests which, as I've
indicated, are represented within these different bodies and not
(16:52):
just between them, and trying to arrive at some kind
of reasonable compromise which will do something to satisfy all
parties and which is also, I suppose, legally actionable.
We're not just composing some sort of abstract report; these
are different policy options which ultimately want to be
translated into laws and regulations which could be put to
(17:16):
work in the countries which EXFILES covers.
Peter Balint (17:20):
Mm hmm. Okay. And, Rebecca, what do you have to
say about challenges?
Rebecca Roache (17:25):
Yeah, I think Jonathan's right: it is this tricky balancing act,
with trade-offs involved, when
you make decisions about whose rights should give way
to whose, and how you can get the
best outcome from balancing all the
different interests. And I think it's made a
little bit more complicated because this, in a way,
(17:48):
is a dispassionate analysis, but when it comes to how
the public are likely to respond to decisions that are
made in this area, there's some emotion
involved as well. All the time, we
sign user agreements when we sign up
(18:11):
to social media websites and so on; we
tick the box saying we agree for them to
use our data in various ways. And yet there's still
controversy when those organisations do use our data in
ways that might be technically in line
with the terms we've signed, but which we still
(18:33):
don't like or haven't foreseen. So people
don't necessarily like the balances that are being struck,
even if they don't necessarily see that there's a better option.
And I think there's a wider
set of issues as well, especially as
we've seen in the last couple
(18:53):
of years with the investigations into the
way that Facebook uses data. There are
lots of people who are disturbed by the way
our data is being used by large organisations,
the intelligence with which it's been analysed, and the
(19:17):
surprising applications it's been put to; we're
disturbed by all of that, and yet we
continue to use these websites and so on. So
in one sense, you could address these
issues by saying, okay,
let's think about the rights involved and how
we can strike the best balance to keep as many
(19:38):
people happy as possible. But on the other hand, there
is this public relations side of things:
there is an emotional element involved, and there's
sometimes a bit of irrationality about it. And you can't
necessarily just respond to that by saying "this is irrational",
because governments want to
(20:02):
maintain public support, political parties want votes, and law enforcement agencies
care about the confidence of the public. And this isn't
always something that arises through rational argument. It also arises
through emotion and trust:
(20:24):
who we talk to, which groups we align ourselves with,
which Reddit discussions we've read, and so on.
So I think it's a really tricky balance.
Maybe this is a point about how much
this sort of ethical analysis can do: Jonathan and I can
do this overview of what the ethical
issues are and make some suggestions about what
(20:47):
the various options could be. But how the public might
receive the ultimate decision is, I think, a different matter.
Peter Balint (21:01):
Yeah, that's clear.
Jonathan Seglow (21:03):
Well, I agree with that, because it is the public
that ultimately has to make these decisions; we really aren't
the ones making them. And I agree with Rebecca: we have
to take people as they are, with irrationalities and strong
feelings about issues of privacy and such like,
not as they might ideally be.
Peter Balint (21:22):
Exactly. Well, I think if nothing else, we've learned today
that ethics in technology is not an easy topic, and
it's not something that's necessarily easy to grasp. But hopefully with
our discussion today, and with future discussions, we can
have maybe a little bit of a clearer picture.
(21:43):
So I thank you both for coming on today and
for sharing your knowledge about ethics and how it works
in a technology project like this.
Rebecca Roache (21:50):
Thank you, Peter.
Thank you very much for having us.
Outro (21:53):
The EXFILES Project has received funding from the European Union's
Horizon 2020 Research and Innovation Programme under Grant Agreement Number 883156.