Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Intro (00:01):
This is a TECHNIKON podcast.
Technikon (00:11):
Hello and welcome. Looking back at the various plans of
action when the COVID pandemic struck, one of the best
digital solutions involved contact tracing. This was a surefire way
of using the power of technology to prevent the spread
of COVID. But responsible innovation and engineering dictate that certain
measures must be taken to prevent things like mass surveillance,
(00:34):
data leaks and function creep. In this episode, we hit
these issues head on and uncover how to mitigate the
risks under such great time constraints. We will hear from
two guest hosts through a special arrangement with the Alpen
Adria University of Klagenfurt here in Austria to look at
contact tracing in an age of data responsibility. Your student
(00:56):
hosts are from the Master's Program in artificial intelligence and cyber security.
Claudia Maußner and Nikita Soldatov speak with Carmela Troncoso, who
is one of the key designers of the Exposure Notification
Protocol DP3T, which is widely used now in contact
tracing apps all over. Let's have a listen.
Nikita Soldatov (01:23):
Hello, my name is Nikita.
Claudia Maußner (01:25):
My name is Claudia. Nikita and I are students in
the master's program of artificial intelligence and cyber security at
the Alpen Adria University of Klagenfurt. Today, we want to discuss contact tracing,
which has been and still is a very hot topic
in the COVID pandemic. We are especially interested in the
(01:45):
principles of responsible engineering and data privacy. With us is
Professor Carmela Troncoso of EPFL in Switzerland, who heads the Security
and Privacy Engineering Lab. She was one of the key
designers of the Exposure Notification Protocol DP3T, which most
contact tracing apps are using. Welcome Carmela and thank you
(02:08):
very much for taking the time for the interview.
Carmela Troncoso (02:10):
Thank you. I'm very glad to be here.
Nikita Soldatov (02:13):
In order to get the pandemic under control, there was
and still is a strong emphasis on contact tracing. In principle,
manual contact tracing, which is carried out by the responsible
health authorities by questioning infected people, can be distinguished from
digital contact tracing via mobile apps, which is correctly referred
to as proximity tracing. Both variants are usually applied in combination.
(02:39):
As manual contact tracing alone quickly reaches its limits, support
from digital solutions has been promoted all over the world.
Claudia Maußner (02:46):
And there is considerable data privacy risk associated with this.
For example, the risk of mass surveillance, misuse of data,
function creep or data leakage. This is why the need
for responsible engineering processes that take exactly these risks into
account is becoming increasingly evident. However, there's yet no unique
(03:08):
definition of the term "responsible engineering" or "responsible research and innovation,"
which is often used synonymously. One framework which was developed
in 2013 defines it as "taking care of the future
through collective stewardship of science and innovation in the present."
It distinguishes the four dimensions of anticipation, reflexivity, inclusion, and responsiveness. Anticipation,
(03:37):
for example, aims at systematically considering all possible implications and
risks of a newly developed technology. This also involves considering
ethical and social aspects. The use of a proximity tracing
app must not lead to discrimination against infected persons or
people who decide not to use the app. Another very
(03:59):
important principle is the one of inclusion. The entire development
process should be open to the public and include as
many parties and specialists as possible with different backgrounds and interests.
More recent perspectives on responsible innovation still agree on these four principles,
but also add transparency as a fifth principle. The topic
(04:20):
also receives strong institutional support from the European Union and
national organizations, and the discussion on the framework is still ongoing.
Nikita Soldatov (04:29):
OK, so there is no clear definition of responsible engineering,
but Carmela what does it mean to you? Would you say
that the development process of DP3T protocol is a good
example of responsible engineering?
Carmela Troncoso (04:43):
So, I think that one of the things that we
actually had in mind was already the description that Claudia gave,
which is the future, thinking about the future and what
this technology will become and how it will be used.
And when we did that, we not only looked at
the data, like a lot of people who focused on the
application and the data, but we took a deep look
(05:04):
at the infrastructure: what kind of infrastructure we were putting
into the world and how it could be reused. And what we were
trying to do was to minimize the ways in which
this could be repurposed for other things, which meant that
it can only be used for the things that we
think it is responsible to use and cannot be used irresponsibly.
Nikita Soldatov (05:26):
However, let's go back to 2020, when several European countries
were launching their own contact tracing apps. Germany, for one,
has been the most successful in terms of number of
downloads among the EU states, while, for example, Italy's
start was not so spectacular. By the end of 2020,
more than 24 million Germans had installed the Corona-Warn-App,
(05:48):
and that number keeps growing to this day. The Italian government stated
that Immuni, which is the name of their app, would
reach its full potential if 60 percent of the population between
40 and 75 years old installed it. Considering that
only 16 percent or so, which is around 10 million,
have actually downloaded it, skeptics would probably say that this app
(06:09):
is a failure. But is the problem the app itself?
One of the main concerns that Italian citizens had was
that their sensitive information, such as health details or whereabouts
would be stolen. That might show a lack of interaction
between the developers and the stakeholders. Italian citizens as the
final consumers of the app were not part of the development process
(06:30):
and were not well informed about the measures and technologies
that were being used to protect their private information.
On the other hand, Corona-Warn-App developers seem to be a
bit more engaged in conversation with the public, although Corona-Warn-App
developers have also somewhat failed to deliver sufficient information regarding
the security measures used to the public. At the end
(06:51):
of the day, they managed to earn a certain trust from users. However,
some time ago, there was even a suggestion to introduce
incentives for installing the app in order to boost downloads. Nonetheless,
the Corona-Warn-App has encountered other problems as well, such
as poor communication channels with testing laboratories, which also emphasises
(07:12):
the importance of collaboration between all stakeholders involved in the development,
release and promotion of contact tracing apps.
Claudia Maußner (07:19):
OK, thank you very much, Nikita, for sharing this information
with us. It looks like the download rates might be
influenced not only by the choice of technology, but also
by the level of transparency and collaboration between stakeholders during
the design process. Carmela, to what extent do you think
it is important to involve public consumers, meaning the users
(07:40):
of the proximity tracing app in the development process of such
an application?
Carmela Troncoso (07:46):
So that's very important. And some of the things that
Nikita said are actually very real across the board
in Europe: users do not really understand
many parts of the protocol. I don't know how much of that is
because they were not involved, and how much of it is
because privacy technologies are not really intuitive and very hard to
grasp in general. The idea that we can give you a
(08:09):
notification that you were with someone, without knowing with whom
you were or who you are, is something that people
find hard to grasp. But I think there was also a
second part, and it is also very important; it was
something that we were not very good at in the beginning:
there are other users of the app, and those
are the doctors and the contact tracers. And I think
(08:31):
we failed those users by not communicating to them what
the added value of the technology is. And it caused a
lot of mismanaged expectations that also resulted in a
lot of doctors telling people, don't use it because it's useless.
And all of that didn't work very well. And I
think that the technologists, and I have to say also
(08:52):
many governments in their public speech, need to work much
more on how to explain technology, and not only think
of the citizens, but of everybody that is in the chain,
from doctors to contact tracers, who have to work
with it. Because if not all of these stages work well,
the app doesn't work as well as we would have expected.
Claudia Maußner (09:14):
Yes, you're absolutely right. It is rather difficult for normal
people to understand the technical details and that might be
why many are worried about data privacy. In this context
also the compliance with legal frameworks, namely the GDPR and
national data protection laws, as well as human rights have
to be considered. Specifically, Article 5 of the GDPR requires
(09:37):
that the processing of personal data be lawful, fair and transparent.
Of particular importance in connection with contact tracing apps are the
principles of data minimization, storage limitation, purpose limitation, integrity and confidentiality.
As the collected data concerns health, which is a
special category of personal data, Article 9 is relevant as well.
(10:00):
It requires the explicit consent of the data subject to
collect and process the data. Nikita, what would in general be
good techniques in digital contact tracing to ensure that the GDPR
is complied with?
Nikita Soldatov (10:13):
There are actually various approaches to providing a sufficient level
of security in those apps. For example, apps may differ
in the way they trace proximity. While some of them
use Bluetooth proximity tracing, others track a person's location via GPS.
In the first method, the phone transmits anonymous, time-shifting identifiers
to nearby devices. Those identifiers are then saved in
(10:37):
the contact history log. After that, in case a certain individual
gets infected, everyone who came into proximity with them will
get a notification suggesting that they undergo a COVID test. The
evident benefits here are that this method does not record
the location or identity of a user. However, this approach is
not applicable in cases where the user has become
(10:59):
infected by touching a surface that a patient has touched.
Another feature that may differ between apps is the place
where they store the contact history log mentioned before, namely
centralized versus decentralized approaches. In the centralized storage method, each
phone sends an anonymized ID plus the identifiers gathered from other
(11:19):
phones to a centralized database, which analyses the gathered information and
sends notifications to supposedly infected citizens. The anonymized ID, by the way,
is basically created by a random number generator on each
device and supposedly does not bear any data associated with
the user. By contrast, the decentralized storage method implies that
(11:39):
each phone keeps its own local contact history log and only
exchanges anonymized IDs with a central database. The phone
downloads the database of infected IDs and handles contact matching and
alert sending without the server. The final question here is, how does the
app receive information about the person's COVID status? Some apps,
(12:01):
for example, allow users to book a test through them,
so they would receive the result and ask for permission
to send an alert to their recent contacts. Otherwise, users
would have to upload a photo, scan a QR code
or link the test result to the app using a code.
In any case, permission from the user is mandatory to share their
status with their contacts, although it will not reveal their identity. Carmela,
(12:25):
what are the most important measures to protect data privacy
in the protocol from your point of view?
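[Editor's note: the decentralized flow Nikita describes can be sketched as a toy model. This is a hedged illustration, not the actual DP3T implementation: real deployments derive rotating ephemeral IDs from daily keys and account for time windows and Bluetooth signal attenuation, all of which are omitted here.]

```python
import secrets

class Phone:
    """Toy model of decentralized (DP3T-style) exposure notification."""

    def __init__(self):
        self.own_ids = []       # identifiers this phone has broadcast
        self.contact_log = []   # identifiers heard from nearby phones

    def broadcast_id(self):
        # Rotate to a fresh random identifier; no location, no identity.
        eph_id = secrets.token_hex(16)
        self.own_ids.append(eph_id)
        return eph_id

    def record_contact(self, eph_id):
        # What was heard over Bluetooth goes into the local log only.
        self.contact_log.append(eph_id)

    def report_positive(self, server):
        # On a positive test, and with the user's consent, upload own IDs.
        server.infected_ids.update(self.own_ids)

    def check_exposure(self, server):
        # Matching happens on the phone: download infected IDs, compare locally.
        return any(eph_id in server.infected_ids for eph_id in self.contact_log)

class Server:
    """The central database holds only identifiers of confirmed-positive users."""
    def __init__(self):
        self.infected_ids = set()

# Usage: Bob's phone hears Alice's identifier; Alice later tests positive.
server = Server()
alice, bob, carol = Phone(), Phone(), Phone()
bob.record_contact(alice.broadcast_id())
alice.report_positive(server)
print(bob.check_exposure(server))    # True: Bob was near Alice
print(carol.check_exposure(server))  # False: Carol never met Alice
```

Note that the server never learns who met whom: it stores only the random identifiers of positive users, and the matching against the contact log happens entirely on each phone.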
Carmela Troncoso (12:31):
So the main idea to protect privacy is to do
all of this computation locally. But let me go back
to this responsible engineering point. What we really wanted to do
was to make this impossible to reuse for other things.
And that is the reason why we have it on
the phone. Whenever we want privacy, privacy is not the goal. Privacy is the means to
(12:51):
protect the users from further damage. So that is one of
the things we did: move everything to the phone,
so the server doesn't learn anything. But apps have many more
protections to protect privacy. Most apps actually have dummy traffic. That
means that whenever you contact the server, the traffic
(13:12):
does not reveal whether you are positive for COVID and
uploading your keys or not. And there are many other
mechanisms like this that ensure that no communication from the
app will ever leak identities or health information about the users.
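[Editor's note: the dummy-traffic idea Carmela mentions can be sketched as follows. This is a hypothetical illustration assuming fixed-size uploads; the constants and payload framing are made up for the example, and real apps also schedule uploads so that their timing is indistinguishable.]

```python
import secrets

KEY_BYTES = 16        # size of one ephemeral key (illustrative)
KEYS_PER_UPLOAD = 14  # e.g. one key per day of the contagious window

def make_upload(real_keys=None):
    # A positive user uploads their real keys; everyone else periodically
    # uploads random bytes of exactly the same length. An observer of the
    # network sees identical-looking, identical-sized requests either way,
    # so an upload by itself reveals nothing about the user's health status.
    if real_keys is not None:
        payload = b"".join(real_keys)
    else:
        payload = secrets.token_bytes(KEY_BYTES * KEYS_PER_UPLOAD)
    assert len(payload) == KEY_BYTES * KEYS_PER_UPLOAD
    return payload

dummy = make_upload()
real = make_upload([secrets.token_bytes(KEY_BYTES) for _ in range(KEYS_PER_UPLOAD)])
print(len(dummy) == len(real))  # True: same size on the wire
```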
Nikita Soldatov (13:28):
Actually, it is notable that several major European countries, such
as Switzerland, Austria, Germany and Belgium have either developed their
contact tracing apps based on DP3T from the beginning or
switched to this protocol at some point after recognizing its
numerous advantages.
Claudia Maußner (13:45):
Well, to me, it looks like the DP3T protocol is
a great piece of security and privacy engineering. I'm sure
this was not an easy task. So what were the
greatest challenges in the process and what are the main
lessons learned from the development of the app?
Carmela Troncoso (14:00):
So I think there are two things:
the technical and the non-technical. On the non-technical side,
we have already spoken about this need to actually communicate
with the stakeholders, and not only the
final users, but anybody that is in the middle, including
the developers of the app, to
make sure that they actually implement all of
(14:24):
these extra functionalities and these extra mechanisms that I mentioned.
Some of them are also nonintuitive, and we many times
had to correct them and work with them to make
sure that the information was not revealing anything, that the traffic does
not reveal anything. But the biggest challenge was to actually
deal with the mobile platform. Indeed, it was great
(14:45):
that Google and Apple took our protocol and put it
in their devices. But it also meant that we had
to play by the rules, and that meant that we
had to play by whatever the phone does. And the
phone is a device that works to save battery and
save processing, and that means that a lot of privacy
preserving mechanisms, like this dummy traffic I was talking about before,
(15:05):
which has to run in the background and has
to take some battery from the phone, are not going
to work so well. Many of the privacy mechanisms
that we designed in theory will never work on a phone.
And we need to work on either getting rid of
the constraints imposed by Google and Apple or working
(15:27):
on better mechanisms. If you ask me, we maybe
need to start thinking about how to move power from
those entities to actually create the responsible engineering that we want,
and not the one that they think is good.
Nikita Soldatov (15:42):
All right, thank you very much, Carmela, for taking the
time to share these enriching thoughts with us. Unfortunately, we
don't have enough time to cover everything, since this topic
is so broad, but I'm sure that the ideas that
we discussed today will definitely enhance public knowledge regarding
contact tracing apps.
Claudia Maußner (15:59):
Thank you very much, Carmela, for the interview.
Carmela Troncoso (16:01):
Thank you. It was a pleasure.
Technikon (16:04):
Thank you, Nikita and Claudia. And special thanks to the
Alpen Adria University. See you next time.
Outro (16:14):
This podcast has been brought to you by Technikon.