Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:12):
Hi, and welcome to episode three of Privacy as a Foreign Language.
This is a special episode, brought to you by the Data Doctor, all about the very exciting Data Protection Intensive 2022 conference that I attended in London.
Now, I know for some, a conference like this may sound like hell, or something that would send you to sleep.
(00:41):
But it was amazing.
It was full of innovation, discussions, debates and lots of information to absorb.
It was my first data protection conference actually, but like all others, it was full of freebies and refreshments, and who doesn't like a free pen?
Now, before I get to the first session I attended:
(01:05):
the conference was held over two days, well, three days really; some of it was breakout sessions and some of it was keynote speakers.
So the first session I attended was the opening session of the conference, and it was opened by none other than John Edwards, in his very first public speech as the newly appointed Information Commissioner for the UK.
(01:35):
So he talked about the proposed UK GDPR reform and how it should not be seen as radical.
Basically, what he was trying to say is that the UK GDPR reform imposes no additional burden on businesses.
(02:06):
He actually sees it as a clear intention to reduce the regulatory burden, in order to create a more streamlined law that effectively protects people's rights.
(02:34):
One question that everyone had on their lips was: what about the UK adequacy decision?
Does any potential UK GDPR reform put the agreement with the EU at risk?
And the UK adequacy was something that was talked about throughout the whole conference,
(02:59):
which we'll talk about a bit more. The Department for Digital, Culture, Media and Sport, so DCMS,
who are responsible in the government for the public consultation on the series of data protection reforms,
(03:30):
have committed to high standards when consulting on the various reform topics.
John said that he struggles to see how any legal protection afforded in Cardiff could be less than that afforded in Copenhagen.
So that was his take on the UK adequacy decision.
(03:58):
He's saying, well, how can it be okay in Wales but not okay in Copenhagen, when actually the two GDPRs are very, very similar, pretty much the same?
His final thoughts, which were quite refreshing to me, were that he wants to see an ICO,
(04:20):
so the Information Commissioner's Office, that is curious and agile, and that as a regulator moves fast and fixes things.
So what does all of this mean for us in the day-to-day life that we live? Well, for me it means that change is coming for the UK,
(04:44):
which, when we look at the current regulations that we all strive to comply with, isn't a bad thing, because, you know, our UK GDPR was taken from the EU GDPR and some of it doesn't fit for us.
So the fact that they want to try and fix those things, not make them worse but fix them,
(05:09):
it's a good thing.
I believe that changes can be made, and it can be done well, because the consultation committees that they have on the various topics are made up of people who do the jobs day in, day out.
(05:29):
You know, there are some big privacy and data protection law firms involved with the government DCMS, and the Chancellor, for that matter, said that, you know, he was very happy to listen.
So that's a good thing. Watch this space.
(05:52):
So next up in my sessions, I listened to Daniel Susskind, who is a fellow in economics at Oxford University,
(06:20):
and the author of a brilliant book called A World Without Work. Sorry if I pronounced that name wrong.
He explores in his talk the impact of technology, especially artificial intelligence, on work and society, and for a lot of people,
(06:43):
artificial intelligence is a scary topic.
And when we think about how AI could replace some jobs that currently need humans, you know, if you just think about that, that some kind of artificial intelligence could replace the jobs that we do,
that is a scary thought.
(07:04):
However,
he also starts to talk about how AI relies on humans,
you know,
it learns and is taught by patterns of behavior that humans possess.
So yes,
artificial intelligence could replace some of the jobs that are out there,
(07:27):
white collar jobs and blue collar jobs,
however, it needs us to teach it how to do that.
So it kind of goes full circle, and I think it's not as scary as we think it is.
But yeah, that talk was really cool, and it was nice to hear someone's opinion on how artificial intelligence can make things better but can also make mistakes as well.
(08:05):
So the last keynote speakers that I listened to on day one, it was all about the Children's Code in the UK and its global influence.
Now this speech was given by the Deputy Commissioner of the ICO, Steve Wood, and Lorna Cropper, director at Fieldfisher.
(08:31):
So the Children's Code became applicable on 2 September 2021.
The code has raised the bar for the level of data protection given to children, which the providers of online goods and services to children need to adhere to.
It has basically had a dramatic influence across the globe as well, because, you know,
(09:01):
certain other states in the US have come up with their own children's online code, and other countries as well have then put provisions in place for children's data.
So we kind of set it off in the first place, which is great, to see us leading the way, especially in children's data.
What we learned in this session was
(09:24):
how some businesses have gone about implementing the code, for example, conducting risk assessments, and what we can also do to mitigate the risks when looking at children's data.
So some of the things that were brought up by Lorna and Steve were privacy by design and by default.
(09:49):
That comes into play massively in the Children's Code, because, as a minimum, you know, privacy by default should be there for everyone, especially when providing services for children.
We should make sure that privacy by default is the standard;
(10:12):
you know, the settings should be there from the off when engaging software providers or invoking new processes. It should be a foundation for anything when considering children's data.
Privacy by design was next on the list, and this is where we should design processes or software that include appropriate security measures when handling children's data.
(10:41):
So privacy by default is your standard settings on any kind of software or process or service that, you know, has the basic foundations there for data protection, so no tracking people, no nudge techniques; all those things should be off as a default.
Privacy by design means that you incorporate extra measures in your design to protect that data; and whether it's children's data or not, privacy by design is used across the board.
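To make that distinction concrete, here's a minimal sketch in Python; it's purely illustrative, and all the setting names are made up rather than taken from any real service:

```python
# Illustrative sketch only: hypothetical settings for a service taking a
# privacy-by-default approach. Every name here is invented for clarity.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Privacy by default: the most protective option is the starting point.
    tracking_enabled: bool = False     # no tracking unless the user opts in
    geolocation_enabled: bool = False  # location is off from the off
    profiling_enabled: bool = False    # no content profiling by default
    nudges_enabled: bool = False       # no nudging users into sharing more

def settings_for_new_user() -> PrivacySettings:
    """Every new account starts from the high-privacy baseline; anything
    less protective would need a compelling, documented reason."""
    return PrivacySettings()
```

Privacy by design would then be everything built around those toggles: the security measures, reviews and data-handling choices baked into the process itself.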
We should always be able to answer the following when we look at children's data.
(11:28):
So why do we collect or handle that data?
What do we collect? You know, what information, what personal information are we collecting from children?
Do we need to collect it?
And how and when do we collect the data?
(11:49):
Are we asking unnecessarily for certain data points? Who collects the data?
You know, is it us? Is it a direct way of collection or an indirect way of collecting that data?
And is all of the above necessary?
And you know,
(12:10):
are we able to justify our reasons for the above? So when we say, why do we collect children's data, and the answer is "because we have to", can you justify that? No, we can't.
So we have to be a little bit more specific: why do we collect a child's first name?
(12:31):
You know, it could be because, in order to fulfill the contract for the service that the child is signing up for, we need a name, and that is how you justify it.
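As a rough illustration, and assuming nothing about any particular system, you could imagine recording those answers for every data point you collect; all the field names below are hypothetical:

```python
# Illustrative sketch: a simple record of the why/what/how/who questions
# for each piece of children's data collected. Field names are made up.
from dataclasses import dataclass

@dataclass
class CollectionRecord:
    data_point: str     # what we collect, e.g. "first name"
    purpose: str        # why we collect or handle it
    method: str         # how and when: "direct" or "indirect"
    collector: str      # who collects it
    justification: str  # the specific reason we can stand behind

records = [
    CollectionRecord(
        data_point="first name",
        purpose="fulfill the contract for the service the child signed up for",
        method="direct",
        collector="us",
        justification="a name is needed to deliver the contracted service",
    ),
]

# Anything you can't justify specifically shouldn't be collected at all.
```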
Something else that the code stipulates is that it has 15 standards.
(12:53):
So not only do we have the GDPR, which is, you know, the articles; this now has its own 15 standards too.
Admittedly, a lot of these are taken from the GDPR standards, the principles.
So I want to just tell you a little bit about each one and what they mean,
(13:22):
because sometimes it's really difficult to understand what they mean.
So, number one: best interests of the child.
Now, the best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.
(13:46):
So you just put the best interests of the child first.
Number two is the data protection impact assessment, so DPIA.
We should undertake a DPIA to assess and mitigate the risks to the rights and freedoms of the children who are likely to access your service.
(14:08):
Now, you keep hearing the words "likely to be accessed". The way that this is risk assessed is: is your service likely to be accessed by a child? And you'll hear that throughout these standards.
So you have to take into account, when doing the
(14:32):
DPIA, the different ages, capacities and development needs of the child, and ensure that the DPIA builds in compliance with the Children's Code.
Number three is age appropriate application.
(14:53):
So this basically means that you take a risk-based approach to recognizing the age of individual users and ensure that you effectively apply the standards in this code to child users: so either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of the children that arise from your processing, or apply the standards in this code to all your users instead.
(15:27):
So what they're saying is, we can either verify the age of a user, so whether they are a child or not, and then put in place certain standards or certain, you know, approaches because the user is a child, or we could just apply the Children's Code standards to everyone regardless of age.
(15:59):
And that's the age appropriate application; either way is fine.
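Just as a hypothetical sketch of those two routes, with invented names and nothing taken from the code document itself:

```python
# Illustrative only: the two compliant routes under "age appropriate
# application". The standards list is abbreviated and all names are made up.

CHILDRENS_CODE_STANDARDS = [
    "best interests of the child", "transparency", "data minimization",
    "high-privacy default settings",  # ...and the rest of the 15
]

def standards_to_apply(age_verified: bool, is_child: bool) -> list[str]:
    if not age_verified:
        # Route two: age not established with appropriate certainty, so
        # apply the code's standards to all users regardless of age.
        return CHILDRENS_CODE_STANDARDS
    if is_child:
        # Route one: age established, so apply the standards to child users.
        return CHILDRENS_CODE_STANDARDS
    return []  # adult users still get the ordinary UK GDPR protections
```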
Number four is transparency, and yep, this is something that we spoke about before; it's one of the principles.
Transparency is about the privacy information that we provide to users.
(16:21):
It must be, you know, clear, prominent, concise, and it should be suited to the age of the child.
So what you'll find is a lot of people have a children's privacy notice as well as another privacy notice for all website users, for example, or all customers, and that basically is bite-sized;
(16:46):
it's a lot more digestible and user-friendly for the age of the child.
Detrimental use of data is number five, and this means: do not use children's personal data in ways that have been shown to be detrimental to their well-being, or that go against industry codes of practice.
(17:12):
So basically, don't use children's data in ways that you shouldn't, ways that could harm the child in the future.
You know, it could stop them getting into college, or it could stop them getting a loan later on.
Use the data wisely, and according to the law.
(17:33):
Number six: policies and community standards.
So this is about upholding your own published terms, policies and standards, including but not limited to privacy policies, age restrictions, behavior rules and content policies.
What it's saying is, have your policies and processes in place, but make them worth the paper that they are written on.
(18:01):
Don't let them just be a tick-box exercise.
So number seven is default settings, and this is linked to privacy by default, which we covered previously. The settings must be high privacy by default unless you can demonstrate compelling reasons for a different default setting,
(18:27):
taking into account the best interests of the child.
So there should always be high-privacy default settings.
Number eight is data minimization, and this is where we collect and retain only the minimum amount of personal data that we need to provide the elements of the service to the child.
(18:53):
So only collect what you need to collect and don't collect more.
So number nine is data sharing.
So: do not disclose children's data unless we can demonstrate a compelling reason, taking into account the best interests of the child.
So do not share any data with anybody else unless we can justify the reasons why. For example,
(19:21):
it might be in the vital interests of that child: say, on a gaming platform or something like that, you noticed something a bit untoward happening.
It might be that you need to share some personal data with the parent or guardian, or local authorities maybe.
Number 10 is geolocation.
(19:42):
So switch geolocation options off by default unless, again, we can demonstrate a compelling reason for geolocation to be switched on.
Now, I can't think of many compelling reasons, but, you know,
(20:04):
it could be that location, for example; that's the only one I can really think of, location data.
So it might be that your child uses Find My Friends or something like that, or, you know, Find My Phone.
That's the only one I can really think of. It's a difficult one.
(20:26):
Parental controls is number 11.
So we should provide parental controls and give the child age appropriate information about this.
So if the service allows a parent or carer to monitor the child's online activity or track their location,
(20:47):
then we need to make that obvious to the child as well, that they're being monitored.
Number 12: profiling.
So switch options which use profiling off by default, unless you can demonstrate a compelling reason for profiling to be on by default, taking into account the best interests of the child.
(21:17):
So only allow profiling if you have appropriate measures in place to protect the child from any harmful effects, in particular being fed content that is detrimental to their health and well-being.
So, you know, profiling should be off, because what profiling is, basically, is it can suggest online content and serve content to users based on where they are,
(22:02):
when they use it and how they use it. And it could be how frequently they get served this content as well.
So it could be quite detrimental to a child, as you don't want them to be served content that's not age appropriate.
Number 13 is nudge techniques.
(22:23):
So nudge techniques shouldn't be used to lead or encourage children to provide unnecessary personal data.
You know, they should be used in a positive manner, and not to try and collect more information than necessary.
Number 14 is connected toys and devices.
(22:44):
So this standard talks about, if you provide a connected toy or device as part of the service, ensuring that you include effective tools to enable conformance to the code.
Basically this is like, for example, a kids' fitness band or an interactive teddy;
(23:07):
if they are connected devices, they should have similar, you know, standards in place to follow the code.
And lastly is number 15, which is online tools, and this is: you need to provide prominent, accessible tools to help children exercise their data protection rights and report concerns.
(23:31):
They have rights, just like adults have rights, and even more so: there is another level of protection because of the Children's Code.
So this talk was really interesting, and something that I feel quite passionate about as well, and it's something that I would like to learn more about.
(23:52):
Lorna was also very passionate about it and a very good speaker, and it was good to see the debate between the two, which I really enjoyed.
(24:24):
So I thought I'd do just a little quick test with my nephew.
(30:08):
So the bit I just did with Noah, my nephew who is two, was just to show you the perception of children, and admittedly he is only two.
But he understands his name and where he lives and certain points that others could use,
(30:32):
you know,
especially when it comes down to pictures and things like that.
So I'm going to keep doing that as he gets older and we'll see how his perception changes.
But it was just a fun little element of this episode.
So last on today's agenda is what happened on the last day of the Data Protection Intensive conference.
(30:54):
So on the last day I attended the closing session, which was around, I think, about two hours long if I remember rightly.
And the closing session was a panel of speakers that included Ruth Boardman, who is a partner at Bird & Bird; James Snook, who is a director at DCMS
(31:21):
for the government; and Eduardo Ustaran, I think I pronounced that correctly, who is a partner at Hogan Lovells.
And it was a really interesting panel of people to debate the subject of international data transfers.
(31:44):
So the discussion was about the new approaches to promote secure cross-border transfers.
All of the panel members are part of the UK international data transfers expert council that has been put together by DCMS,
(32:05):
I believe.
And it's to tackle some of the topics that need addressing, that need making better, in our UK GDPR, and one of them is international data transfers.
There are growing challenges in this area, and they talked about the challenges, and the opportunities actually, around how we can make it better and simpler for small businesses.
(32:34):
So it's not so time consuming or costly for them, which often it can be, as they don't have the resources like big companies do.
So how does the UK currently deal with international data flows?
Well,
in order for us to transfer data,
(32:55):
we must ensure we have adequate mechanisms in place to do this.
So we can rely on either adequacy decisions,
appropriate safeguards or derogations.
(33:16):
One of those three we must have in place to transfer our data outside of the UK.
So most commonly used, I believe, is probably the adequacy decision, and it's relied upon to transfer data internationally.
(33:36):
It means the country we transfer the data to has been given an adequacy decision by the European Commission, and right now there are only 13 countries or territories that have been given this adequacy decision.
So there's not many on that list, and that poses a question: is the European Commission's way of allowing a country to apply for adequacy too complicated,
(34:10):
or out of date? Maybe. That might be one of the answers, because it is very complicated.
And, you know, you'd think there'd be more countries on this list.
So I think that's one thing that they're going to look at: how do we best address adequacy?
You know,
if we choose to send data to a country or territory that does not have an adequacy decision,
(34:37):
then we need to look at either appropriate safeguards or derogations.
So, the appropriate safeguards: these could be standard contractual clauses, approved codes of conduct or certification mechanisms, ad hoc contractual clauses,
(34:59):
international agreements or binding corporate rules.
Now, out of all the appropriate safeguards, the SCCs are probably the most commonly used and the easiest to apply.
And this is basically a contract.
So if you want to transfer data outside of the UK to a country that does not have an adequacy decision,
(35:24):
what we put in place is a data protection agreement.
So company A in the UK wants to use the services of company B in the US, let's say.
And because the US does not have an adequacy decision, what we need to do is assess company B's mechanisms that they put in place to keep data safe.
(35:53):
And if we're happy with those mechanisms, what we do is then enter into a data protection agreement which includes the standard contractual clauses.
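To picture that decision flow, here's a rough sketch of my understanding in Python; the function, the inputs and the example countries are all hypothetical, and the real adequacy list should always be checked against the official sources rather than hard-coded:

```python
# Illustrative sketch of choosing a route for a transfer outside the UK,
# in the order discussed: adequacy, then safeguards, then derogations.

def transfer_route(destination: str,
                   adequate_countries: set[str],
                   safeguards_possible: bool) -> str:
    if destination in adequate_countries:
        # A country with an adequacy decision: no extra mechanism needed.
        return "adequacy decision"
    if safeguards_possible:
        # Assess the importer, then enter a data protection agreement that
        # includes the SCCs (plus the UK addendum), or rely on BCRs, codes
        # of conduct or certification mechanisms.
        return "appropriate safeguards"
    # Last resort, only for the specific situations set out in the law.
    return "derogation"

# Hypothetical usage; don't hard-code the list in real life.
route = transfer_route("US",
                       adequate_countries={"Japan", "New Zealand"},
                       safeguards_possible=True)  # -> "appropriate safeguards"
```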
And this debate was really interesting, as they debated what is wrong with the process, because the ICO and the government have released the UK standard contractual clauses,
(36:19):
the UK SCCs, which we all had to update when they came out, and now we have gone and put a UK addendum on those SCCs to cover international data flows.
So, back to it, and we have to rewrite the templates we already had and add in the addendum.
(36:41):
Okay, we're given a grace period to do this and implement it.
But again, it's: how many more times can we change this, and add an addendum, and add another addendum, to make sure it's correct?
You know, maybe this says the original plan is not sufficient.
So that's why the debate was quite interesting because,
(37:03):
you know,
we have these things in place but they're always changing.
So maybe the mechanisms we have are not adequate in the first place.
So it's something that they want to look at as part of the expert council: we need a simpler way of agreeing on transferring data internationally, as every company probably does this and needs to do it at some point.
(37:30):
You know,
if you've got a small business and you need a server that's hosted outside the UK, the EU or one of the, you know, 13 adequate countries, because it's cheaper to do so, and they provide the right amount of organizational and technical measures to keep everything secure,
you still then have to go and make sure you've got SCCs in place, or an international agreement with them, or the company has to apply for binding corporate rules,
(38:03):
which, may I say, you've had to reapply for under the new UK GDPR. So, you know, it's not simple, and we need to address the fact that it's not very simple; we need to be able to do this easily,
while making sure that obviously everything is kept safe and secure.
The most important thing you can do to help in your business is to always check where the service or provider stores the data that you are transferring to them.
(38:36):
That's probably one of the things I look at first when someone says to me,
I want to use this new software or can you help me with this?
I always make sure that I look at where the company is established,
where they store their data.
and certain things like that, just to kind of help me establish which route I'm going to take. If the country is not on the adequacy list,
(39:05):
then that's a flag.
And that's when I'd be saying to staff, you know: this is when you should flag it to me as the DPO, and you should say, "I've checked this, this and this, and this is what I found."
That just helps me, and then I don't necessarily have to go and find that information myself, because I don't necessarily know what that software is being used for and why. So that's something you can do to help.
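If I were to turn that habit into a tiny script, it might look something like this; it's entirely hypothetical, just to show the shape of the check:

```python
# Illustrative sketch: flag a new vendor for DPO review when the country
# where it stores data isn't on the adequacy list. All names are made up.

def review_vendor(name: str, storage_country: str,
                  adequate_countries: set[str]) -> str:
    if storage_country in adequate_countries:
        return f"{name}: storage in {storage_country} is covered by adequacy"
    # Not adequate: flag it to the DPO along with what you've checked,
    # so they can decide on SCCs, an addendum or another safeguard.
    return f"{name}: FLAG for DPO review, stores data in {storage_country}"
```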
(39:34):
Quite simply, Ruth Boardman said to the director of DCMS: whatever you do to make it any better, don't make it any worse than it already is.
You know, whatever the outcome is, don't make it any worse, because surely it can't be any worse.
(39:54):
So yeah, that was quite simple and frank.
On a good note, the US has started talks to get a new Privacy Shield agreed, which would be a big weight off most shoulders, I think, because, you know, the US is probably the biggest country where we have lots of software providers, and right now they don't have a Privacy Shield.
(40:23):
So the SCCs kick in, or, you know, one of the others, like codes of conduct or the certification mechanisms.
So it would be really interesting to see what comes out about that.
So yeah, that debate, you know, kept me on the edge of my seat, and I know that's hard to believe, but it was very interesting.
(40:45):
Next up, I went and listened to Pragya Agarwal. Now, I don't think I've said that right,
but she is a brilliant person to listen to, and she's a behavioral and data scientist,
and a visiting professor of social inequities and injustice at Loughborough University.
(41:11):
Now, those topics in themselves are interesting, but the way that she talks about them is really good.
She has many talents, but most recently she wrote a book called Sway, and it's about unraveling unconscious bias.
And she also has another book, which is Wish We Knew What to Say.
(41:34):
And this is about talking with children about race, you know, and I think it's so important that we understand those two headlines, because this talk brought an air of comedy to the fascinating subject of unconscious bias.
(41:59):
Now, in her words, everyone has some sort of bias, whether we know it or not; everyone has some sort of bias.
And even when we look at technology: technology has an unconscious bias, which brings me full circle to what we said at the very beginning about artificial intelligence being taught by humans.
(42:26):
That's how we uncover the fact that technology also has an unconscious bias: because that technology is being taught by humans who have an unconscious bias.
So some of the funny things that she brought up in the presentation were things like an image of the racist soap dispenser; there was a video that went viral about how a Black gentleman put his hand under the soap dispenser and it would not dispense soap,
(43:05):
and he kept trying; on both hands it wouldn't work.
However, if he put some white tissue paper over his hand and then tried to get the soap dispenser to work, it would work.
So things like that, you know, were really interesting, because I've never really thought about it like that, how technology could have an unconscious bias because of the human reality behind teaching AI and the algorithms behind AI.
(43:34):
So it's really interesting. I'll go and research it, because the unconscious bias book, the one about unraveling unconscious bias, sounds really, really interesting.
So I'm definitely going to, I think, purchase that when it comes out.
Yeah, so that was the kind of busy day that started off the morning, and then it was time for coffee and networking before you knew it.
(44:14):
So the last session I attended was a OneTrust session, and they were doing a podcast live from the session, their privacy talk, which was really interesting.
Again, it had Eduardo from the panel earlier in the morning, and they were discussing yet again the topic of international transfers.
(44:46):
And that was really interesting.
I won't go into it too much, because you can go ahead and listen to their podcast, but overall the conference really did make me confident, or help my confidence.
It made me more comfortable with some of the topics, because I think when you do this job it can be quite lonely, maybe,
(45:13):
shall we say, because, you know, I'm working a lot of the time by myself, though alongside other people.
But it was really, really great to meet Ito from Ornette Law, who was my mentor throughout my exams over two years, and it was a real pleasure, and I felt honored, to spend a couple of days with her as well, because, you know,
(45:39):
I wouldn't be in this role if it wasn't for her guiding me through the really difficult exams that I went through.
So it was really lovely to see Ito and spend time with her, and to meet other delegates as well that have been through the exams,
you know,
(45:59):
a lot of the people I met came from a law background.
However,
that isn't my background,
you know,
I had no knowledge of data protection, really, when I started this role.
My background, before privacy, was in policies, compliance, governance.
(46:19):
So that's kind of where it came into helping me, you know.
I really have Ito to thank for the two exams that I managed to pass.
So yeah,
it was a really lovely experience being at the conference, and getting to work with some vendors as well and find out what tech solutions there are out there to help us as businesses, because, you know,
(46:47):
sometimes the really time-consuming jobs can be something that is not fixed but could be aided by a tech solution.
So it was really interesting to visit the stands that were out there and do a bit of networking with different people as well.
So thank you very much for listening.
(47:08):
That's the end of episode three of Privacy as a Foreign Language.
Make sure you hit the like button,
share it,
and leave me a comment.
Or send me an email, and I'll happily listen to your suggestions for any future episodes.
Thanks.