Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Yusra Ahmad (00:00):
Are we using biometrics? Is that the right thing for entry? I mentioned that one before. If you're an employee, what if you don't want to use your face... you don't want your employer to have such a detailed set of data around your facial makeup. Should they be forced to provide you a different mechanism to enter into their building?
(00:20):
Or does experience kind of trump that? These are the types of things that we do - keep on top of what's coming out, and then provide some guidance off the back of that on how that could be achieved in a more ethical way.
Debra J Farber (00:36):
Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy-by-design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the
(00:56):
bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome everyone to Shifting Privacy Left. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next two guests: Luke Beckley and Yusra Ahmad.
(01:17):
Luke is the Data Protection Officer and Privacy and Governance Manager at Correla, a company that processes gas consumption data across the UK consumer market on behalf of gas suppliers. For over 25 years, Luke has helped organizations understand their data to enable the data they process to work for them
(01:40):
and, more recently, to enable that data with great governance and compliance with the prevailing data protection laws.
He's also the Chief Compliance Officer for Hope4, a charity based in Moldova that helps victims of human trafficking, and he's worked as a Data Privacy Consultant to the East London Business Alliance, which helps social enterprises and
(02:04):
small charities with their data privacy obligations. Yusra Ahmad is the CEO of Acuity Data, which specializes in delivering sustainable change by leveraging the full potential of data analytics and decision making across an organization's entire operations.
With a proven track record of supporting a range of FTSE 500
(02:27):
clients to realize their strategic objectives through the use of data, Yusra believes that ethical data management is key to every organization's survival in this digital age. So, Luke and Yusra have been working with The Real Estate Data Foundation - which, if you condense that, is known as The RED Foundation - not to be confused with the AIDS foundation that
(02:50):
goes by the same name, the one that Bono is connected to. So, let's just separate those two ideas right here.
And then, The RED Foundation - they shift that conversation left and ensure that the data points generated by 9.84 million people on a daily basis are generated with privacy and transparency at
(03:11):
the center. The RED Foundation is a sector-wide alliance that enables the Real Estate folks to benefit from data. Yusra currently serves as the Chair of The RED Foundation's Data Ethics Steering Group, and Luke serves as the Chair of the Foundation's Engagement and Awareness Group.
(03:31):
So, you can see why we have both of them on the call today. Welcome, Luke and Yusra!
Luke Beckley (03:36):
Thank you, Debra. Thank you very much for inviting us on; and, as you've already mentioned, I am one of your super fans, so it's a privilege to be here. Thanks.
Debra J Farber (03:44):
Yes, yes, I have to admit, Luke is definitely one of my super fans. It's one of the ways I've gotten to know him - through conversations on LinkedIn - and it's not just because he's a super fan that I invited him on. He's got plenty to talk to us about in his field of expertise. That's how we connected - through a lot of engagement on LinkedIn. I guess we'll just dive in.
(04:05):
So, it appears that the real estate industry is seeking to leverage a lot of personal data at scale, and it feels like we're entering a perfect storm where the scope, the sophistication, and connectivity of data is increasing exponentially. While this may bring advantages for real estate, it obviously has
(04:27):
a lot of new challenges. So, can you share with us some of the current drivers for this change?
Yusra Ahmad (04:33):
So, Debra, thank you so much for inviting me on your podcast, as well. I'm really excited about having this conversation with you and Luke. So, I've worked within the real estate industry for coming up to about 20 years now.
[Debra (04:44):
Wow] Yeah - it's getting there. Right? I probably shouldn't mention those numbers anymore.
When it relates to the drivers for change... I mean, certainly I started off in this industry working on space and occupancy management; and so data, for me, has always been an integral part of the way that you do business in real
(05:06):
estate, and any organization. Right?
When we talk about the drivers for change, I think there has been a slight step change recently, and a lot of that has been driven through technology disruption, ultimately; and this increasing desire to digitize the way that everything functions, in particular in real estate.
(05:28):
But, there's a number of other things, I think, that have been going on in the background. One of them, from our perspective, is ESG. Now, Luke and I have been working on a number of different things outside of The RED Foundation as well, but what we're also seeing is that, as organizations become much more focused on driving their diversity and inclusion agenda,
(05:51):
their environmental impact agendas, et cetera, there's a heavier reliance on data, not only because regulation, especially here in Europe, is demanding it, but also just from a social responsibility perspective. You need to understand what your status quo is before you can make that change. What we're increasingly finding is that some of that data is
(06:15):
missing, and it's missing because it's not captured, or it's not captured in the right way or the right structure; and within that, there is an element of sensitive data. So, whether that's PII, or whether that is data that can somehow be linked back in order to identify a particular organization, or a particular organization deems it to be
(06:37):
sensitive. So this kind of issue around trust emerges as a result of ESG.
Going beyond that, additionally, there's this concept of 'workplace experience' that we talk about quite a lot. It's been something that I've certainly heard about over the past decade. At the very least, it's a move towards agile working.
(06:58):
It's a move towards more flexible working environments, but it's also impacted by the change in the workforce dynamics.
By that I mean the younger generation coming into the workforce having a completely different set of needs than we certainly have been trained to accept, I suppose, in a more
(07:21):
traditional or conventional way, but also things like COVID and the lockdowns. That's also changed the opinions or the attitudes of people who may have traditionally accepted the need to be in the office nine to five, five days a week, or sometimes seven days a week, because I've met those people as well.
(07:41):
So, all of that is in order to become more efficient and to create this workplace experience again. That is a very data-intensive type of activity and type of service, which also relies on potentially sensitive personal information, with the rise of sensors in order to track
(08:02):
occupancy and that type of stuff. The next one I would mention - and I'm going to leave it there for now, but there are many more drivers - is something that I'm incredibly interested in, which is 'smart city development.' Ultimately, that means the digitization of real estate as a whole. Whenever I say 'smart cities,' I get a bit scared that people think of it as some sort of Jetsons episode.
Debra J Farber (08:25):
Oh, be careful, this might be the name of the episode now.
Yusra Ahmad (08:33):
Well, there is a temptation to think of it in those terms, but the reality is that most capital cities, most metropolises across the world, are already smart cities. And, again, in order for you to provide that digital experience, for you to be able to navigate across a city in the way that we have become used to - through an app, through your phone,
(08:54):
whatever it is - that requires data, right, and that requires digitization of a particular asset. And so, as our built environment becomes increasingly more advanced and more digitized, again, this is the rise of data and the need for data, and I'm sure we'll talk
(09:16):
about this a little bit later on. But, this then kind of kicks in this concept of how do you ensure that you are managing this data in the most ethical way possible? A lot of that is really not just to do the right thing, but so that the people you're collecting data about, who are the people you're delivering services to, trust you with that information,
(09:38):
so that there is this unspoken tradeoff - you give me the service and I give you the data that you require. But, there's a gentle person's agreement that you're going to do the right thing by me and make sure I'm protected. So, a few thoughts for you.
Debra J Farber (09:55):
Yeah, that is really helpful for contextualizing the state of the industry and what's changing - why we're even having this conversation today. Right? Luke? Do you have anything to add to that?
Luke Beckley (10:07):
I don't think there's very much more to add to that. I think that's very comprehensive. I think there's almost an awakening that's happening in the real estate industry. They've been collecting data points - and we'll get on to some of those shortly, about what data has been collected - for many, many years; but not in any way using that data to drive the best use of their assets.
(10:29):
And, I think this - and you just touched on it - this is one of the things that she and I converse on very frequently: "How do we bring all of the obligations - the compliance elements, the privacy elements - together with the actual volume and different data points being captured in a way that actually allows those developers and those assets to be utilized in
(10:51):
the best way possible?" I think a lot of that, as you just mentioned, is the driver that kicked off through the unfortunate circumstances of Covid. People's return to work suddenly made them realize how they wanted their spaces to work for them. So, there's a whole heap of social proof and pressure building, as well as an awakening and awareness of all the
(11:13):
different data points that have been captured and how that can converge with that social proof. Now, there's a need for better spaces and better use of spaces in an environmental way and for the benefit of the individuals using those spaces. It's a really interesting and dynamic and great place to be - at what we feel is the start of that awakening process.
Yusra Ahmad (11:37):
I would also add to that. I think there's an element of play within the industry as well. I'm not sure if you would agree with me, Luke, but there's this sense of, as there's new technology coming out, real estate - not only operators but occupiers - want that for themselves. I certainly have been through a few rounds where, when AI came
(11:58):
out, when big data came out as concepts, straight away, everyone was like, "Well, let's use that." And it's almost as though the industry is trying to grapple with these technologies and then thinking about the use cases afterwards, rather than the use case ahead of time and then what works for us - which is a completely different challenge
(12:18):
that I end up having to deal with.
Luke Beckley (12:19):
Yeah, I 100% agree with that, though. You know I do, and that has always been the case.
Debra J Farber (12:24):
You know, that's fascinating. Oh God, I have so many questions, but I want to make sure we actually stay on track, too. What comes up are questions around ethics and motivations, and real estate, generally, has been such a market where it's worth whatever someone's willing to pay for it. So, you know, I wonder how much revenue - like any business in
(12:49):
corporate America - is going to be driving the use of these technologies before they're ready. Like you said, if they're asking you to think of the use cases later, but let's start playing with the technologies, then it almost feels like a lot of that would be experimenting on people in real time to see what works, which is not recommended for ethical tech innovation and deployments.
(13:12):
But, before we even get that far - my brain is going off in different directions based on what you just brought up - let's talk about what types of personal data these real estate companies are collecting and using, or want to collect and use in the future, and for what purposes.
Luke Beckley (13:39):
The answer is: a vast array of data is being collected, and possibly some things that individuals using the spaces, and the companies themselves, don't necessarily consider personal data or think about. So, it ranges from people's access into the buildings they go through. As Yusra's already mentioned, they capture data through the various sensors that try and help drive the smartness of the building. You've obviously got CCTV and surveillance cameras -
Debra J Farber (13:57):
- I was just going to say, "obviously" - talking about the UK...
Luke Beckley (14:00):
...in the world.
So, I know there are big real estate companies with private / public spaces that have got large CCTV operations in place, and then you've got the actual individual records of the people who are in the building. You've got medical records - special category data. For example, a lovely thing called PEEPS, which is 'Personal
(14:20):
Emergency Evacuation Plans.' They need to be held by landlords and all the occupiers because, obviously, if you have people with a specific set of needs in an emergency situation, you need the Facilities Team to understand what those needs are and react in a way that enables you to get that individual out of the building quickly.
But, that's a special category of data, and inherently, real
(14:43):
estate organizations have been quite bad, from a privacy perspective, about how that is captured, stored, and secured. So, the array of data is just enormous and ever-changing. Yusra's already touched on it - the sharing of that data then becomes an even more complex requirement. So, for example, when you're trying to do energy efficiency
(15:07):
and you want to look at occupiers and landlords, suddenly you're into access control and sharing of data. That brings this whole plethora of additional complexities that we'll touch on later. I mean, that's just a brief overview but, as you can imagine, those sets of data are quite voluminous in their own right.
Debra J Farber (15:27):
Yeah, yeah - in fact, also, what comes to mind is exact location data. I know many retail establishments, for instance, make use of location data for analytics purposes, like: how many people came through? I don't need to necessarily identify them, but, like, how many people - humans; let's identify that that's a human - and how many came into the store on which days, so that we can
(15:50):
better staff our store on those days, or for whatever inputs or whatever analytics purposes they have. How are retail establishments putting protections around that location data, if at all?
Luke Beckley (16:05):
I think the answer is... And, I think, back to the point about use case testing - that's possibly fairly valid.
So, you're right. What happens is: Wi-Fi installations in stores, for example, or in retail assets, potentially have the ability to track individuals around the store as they're pinging the individual access points. If you sign up to that Wi-Fi, you're
(16:29):
effectively allowing them access to your location data for them to track you around the store. And again, something Yusra and I touched on and talked about last week was walking past individual stores who have also had that data shared with them. So they're enabled, in a manner, to present you with an "Oh, I'm walking past X retail store and here it popped up and said I get
(16:51):
X points on my loyalty card," right? So I think there's a wide use of that, and a drive towards using that for the analytics purposes you mentioned.
But, I think there is complexity, both from a transparency perspective, about exactly how that data gets used that is not necessarily being grasped that well. So, I think there's a lot more work that can be done around
(17:14):
really explaining how geolocation and / or Wi-Fi location pings are being used and can be used. Yes, I get it. Ethically, you're trying to get the composite, providing the best experience for the consumer. But, that's not necessarily translating into transparency about that data being used, and whom it's being shared with to drive that experience, and whether the
(17:37):
consumer even wants it. There is a whole heap of work to do.
Debra J Farber (17:40):
But, you get points. You get 10% off your toothpaste for giving up your privacy.
Luke Beckley (17:46):
Debra, we can go into this offline. I can assure you, it is much more scary than you think.
Debra J Farber (17:53):
Oh no, I could assure you, I know how scary it is.
[Luke (17:55):
I'm sure you know.] So, the argument here - is it that the privacy notices are insufficient? Or, are you saying they're sufficient, but nobody looks at them and it's not part of the consumer experience to drive them to look at it? Or, is it in the awareness about how their data might be used and secured and maybe shared anonymously? Or, is it shared direct identifiers?
(18:17):
Is it the practice that we want to change? Is it that we just want to make it clearer to consumers? Or, are they really just not even embedding the requirement appropriately into, you know, whether it's the Wi-Fi's notice...
Luke Beckley (18:30):
I think it's a combination of all of those, but predominantly the first two you mentioned, which is: 1) the complexity of privacy notices - especially here in Europe and the UK, plain and simple, straightforward language is meant to be used, but what you also find, or tend to find, quite often, is very complex and very long privacy notices with
(18:54):
not necessarily very user-friendly language used to actually detail exactly what's happening to that data; 2) then, the second thing is - and I think we'll touch on this a bit further into the conversation - the education and awareness at a consumer level about being able to really understand and read those privacy notices, even if they're in plain language, to understand what the potential impacts of that data sharing
(19:18):
really are. So, there's an education piece that needs to be done, but I think there is also a clean-up of transparency piece definitely required.
Yusra Ahmad (19:27):
I think, if I can just jump in - I think we're, in a way, a little bit lucky though, within the UK and across Europe, because we do have the benefit of GDPR, whatever some people might think about it. At least it's a starting point in terms of providing some guardrails for how organizations behave. And I'll do a little plug for The RED Foundation here.
(19:50):
Right at the start of our RED Foundation venture on the Ethics Committee, we wrote a piece specifically around consent - which is at the heart of this conversation - and I guess the challenge is that, whereas part of GDPR states that "consent must be explicitly given by the data provider to the processor,"
(20:15):
etc., the reality is: how can you consent if you don't know what you read, or if you've just flicked through it in order to get to the bottom? This is no comment on anyone, because I've done that many a time myself - just give me the file; I'll sign anything. But, the reality is, would you even understand it if you read it, if you're reading hundreds and hundreds of pages?
(20:36):
However, what I would say is that this situation is getting better, because there is increasing onus on organizations to simplify the messaging and to articulate the key points of these long-winded, historically complex documents, so that we make things easier for individuals before they click
(20:59):
"sign." I mean, at the end of the day, though, if you're holding them hostage... and I'm not going to say where, but I've definitely been in one of these massive shopping malls where you lose signal on your telephone and the only way to be able to communicate with the world is to sign up to their Wi-Fi. Some of that - that's where ethics come into play.
(21:19):
Is that really appropriate? Is that really fair? I guess, that's the question.
Debra J Farber (21:25):
Yeah, that's a really great point. I mean, I don't think it's fair. Just a personal antidote here - or anecdote; I feel like I slurred that to say "antidote," but anecdote. I was actually a subcontractor on the Target breach - like, part of the set of consultants that came in to assess the current state,
(21:46):
what the future state should be, recommend changes, etc. So, for weeks I was flying to Minneapolis to help Target at headquarters with the Tiger Team of other folks, and one of the things that we did was just start cataloging all of the potential areas of data collection. One of those things was the brand new Target Wi-Fi
(22:09):
agreement, saying that you would be gladly tracked - well, not gladly. In exchange for Wi-Fi, you'd have this experience, and then they'll be tracking you around the store and all that.
Who would ever think - this was years ago, so close to a decade ago at this point, maybe 9 years - who would ever think
(22:31):
that in signing up for Wi-Fi in a shopping experience, that would mean that you've consented to have all of your movement tracked as you're going around the store? Which is exactly what you're saying has now been propagated across other buying experiences. I just don't think that your everyday person, if you ask them
(22:53):
on the street, has any idea that this is going on, even if they did consent to whatever the disclosure language was when being prompted for Wi-Fi. I think there is definitely a transparency issue, and it's kind of crazy that nine years ago, this was my experience as it was just coming to market. I feel like, at that time, Target was avant-garde in the
(23:15):
space of leveraging that kind of tracking technology compared to a lot of other retailers in the space.
I'm wondering, also, is it only constrained to retailers, or are there other real estate deployments of tracking technology? So, I'm thinking giant convention centers: when you have a 100,000-person conference, or a 20,000-person conference, are
(23:36):
there people being tracked? Are you agreeing to Wi-Fi and being tracked - your physical location, your movements - throughout that experience?
But then, I think about something like where you're told you're being tracked, and you're told, you know, what the purposes are and when they end - something like when I worked for Amazon and I went to their Amazon Go store, where
(23:57):
you literally walk in the store; there are no checkout lanes. They just recognize you're a person and what this person's doing the whole time. And then, when you check out, it knows what's in your cart, because there are sensors on the ceiling and all that, tracking where you are, what you bought, or what you're taking out of the store with you. It just charges your account, and it's a seamless experience;
(24:17):
and theoretically, the lines are - well, there aren't really lines. Right? So, I've seen something like that where you are aware going in; the whole concept is that you're being tracked through the store. But then, when you leave, the question of what information is still being retained, how that's being used, and whether it's being shared is still there. But, at least there's an awareness, because the entire
(24:38):
point is that you do everything by yourself. There's nobody checking you out, right?
So, anyway - what I'd like to talk about is The RED Foundation. I'm assuming most of my audience, like myself, doesn't have that much awareness of the real estate industry and how they're dealing with these issues overall; so, what data
(24:59):
protection, privacy, and ethical challenges does the Foundation seek to solve for today? And then, how are you approaching those issues?
Yusra Ahmad (25:09):
Well, if I can jump in, maybe I'll take a step back and talk a little bit about what The RED Foundation was set up to do. So, The RED Foundation - the initial premise was to create a platform within the real estate industry, focusing initially within the UK, because the founder of the Foundation was
(25:30):
based here, frankly. So, that's why it started here; but, it's to create a platform whereby we can come and discuss data-related issues. And so, the Foundation is broken down into 4 different committees: an overarching committee and then three subcommittees - one which is populated by a number of different universities within the UK.
(25:50):
Another, which is all focused on data standards. As we know, without data standards we really struggle to structure, manage, and produce quality data; and that's a significant challenge for us within real estate. I'll talk about that another time, but my biggest challenge was the fact that individual datasets aren't structured to be
(26:14):
integrated with each other, so it makes life very difficult when you're trying to report against the whole life cycle. And then, the final committee, which I've only just recently taken over as Chair, is the Ethics Committee.
Now, as part of that, we have got - I think it's about six
(26:34):
different ethics principles, which we are advising organizations to sign up to. So, these are (26:42): 1) accountable; 2) transparent; 3) proportionate; 4) confidential and private; 5) lawful; and 6) secure - principles around managing your data. So, it's all very simple. It's all aligned with GDPR guidance, as well as other guidance that's come out through legislation here in the UK and Europe.
What I would also say is that this is a pan-real estate
(27:07):
platform, so everyone that's joined is doing it gratis. So, we are volunteering to be a part of this; and we are representing - or we joined, at the very least, to represent - a wide subset of the industry. So, we have software providers, real estate consultancies, developers, occupiers, lawyers, academics; it's a significant
(27:33):
subsection of the industry that's being represented. So there's, I think, approximately 60 of us in total, and it's growing, and we are welcoming more organizations that want to be a part of the conversation.
So, within the ethics piece, what we've essentially been doing is trying to work out how we can ensure, or provide
(27:57):
guidance to the industry on, what ethical data management actually means. To your question, Debra: what are these blind spots that we have? Now, my background has been predominantly within the occupier space, and so, immediately, I think my inclination would be, "Well, actually, within real estate, we
(28:18):
capture very little personal information, and that's very specific." But the truth is that that's inaccurate. Luke already talked about that - like, wrote that off right at the outset. But I think what we're actually trying to say is that we don't believe that anyone in the real estate industry is doing anything unethical.
(28:38):
But what we're trying to propose is that we are mindful in terms of the way that we approach data management. That means, ultimately - distilling it down into one sentence - make sure that you're always capturing as much as you need and no more, and only for the time period that you need it.
(28:59):
Ultimately, that's the core of it.
I think what tends to happen is - and we touched on it a little bit at the start - this excitement about possibility without a use case; and so, the amount of times I've heard, "Well, why don't we just grab a bunch of really smart data scientists, get them in a room, throw some data at them and see what comes out?"
(29:19):
Oh, no, no, no, no - please don't. That's not the best or most efficient, or even actually the most ethical, if you like, way to go about it, ultimately, because what you should do is have a very specific reason for capturing some of this data and a very specific viewpoint on what you want to do with it, and then use it for
(29:40):
that particular purpose. And if you need it for something else, especially if you're capturing personal information, then you need to go back and reconfirm the consent for you to use that data in the way that you propose to use it.
So, beyond that, what I would also add is that it's really important to drive home this viewpoint: that as real estate
(30:03):
becomes more digital, as we start to ask more things of our technology, our needs are going to change. So one of the things that I've been hearing about - I'm not sure how many people have actually started incorporating this into their assets - is the ability to use biometric entry into a particular building.
(30:24):
So, just as you would open your iPhone with your Face ID, you'll be able to walk into your office with that same face, without having a badge as you historically would. And that means that you're giving away more of your data. And there are implications with that. Right? Ultimately, you are now going to be putting demands
(30:46):
on your employees in order to provide you with that information, or to allow you to use it in that way.
So, what we're really doing, within this steering committee, is we've actually established a playbook, and that playbook has got a number of case studies within it, talking about how data was captured, why it was
(31:09):
captured, and how to make sure that you're ethical in the way that you're managing that data. And the view is that, as these use cases or case studies come about, we can make that information available to a wider audience. So, as you're reading through, you can go, "Oh, I never even thought of that as a particular potential issue in the way we're
(31:31):
using data, now or in the future." So you can grab some guidance or advice. I'll stop with my monologue now.
Debra J Farber (31:39):
Oh, that was really great info. That was super helpful for contextualizing what The RED Foundation is working towards and why. I really appreciate that. What are some of the ethical questions, Yusra, that the real estate sector still needs to deal with?
Yusra Ahmad (31:54):
I think the first one is "What does ethics mean to us?", right? So the first question that we were trying to touch on is this one of consent. Now, when we talk about real estate, in my mind I straightaway go to the fact that there are so many different sectors. Today, we talked about Retail, but we haven't even touched on
(32:15):
Residential. One of the most interesting case studies that we looked at, specifically for me, was within Residential Care Homes. In particular, it's this complication between, I suppose you would say, providing people with freedom, but capturing or invading their privacy in order to provide them with that
(32:36):
freedom. So take, for example, an aging population - which we know we are - more people are gonna be going into residential care homes. If we were able to incorporate sensors or cameras into units that tracked and monitored them, they'd be able to identify when an individual needed help before they were able to ask for it.
(32:57):
Let's say, for example, theyfell and a sensor was able to
notice the fact that thisindividual's had an incident and
someone was able to visit, well, actually, you're giving them
some freedom.
But where does that line kindof stop?
And I think, as technology advances and as we're able to offer these types of services and experiences,
(33:20):
there's always a question of "Is that right? How do we make sure that we're protecting consent?" But also, how can people change their minds? I think that's slightly overlooked. In a lot of the use cases I'm seeing, you provide your consent in order for your data to be tracked. First, should it be captured to
(33:42):
begin with, because do you actually need it in order to provide that service? Second, how are you capturing that consent for that service? But also, in the same instance, how are you allowing people to opt out? Not an opt-out in terms of not providing the consent, but being able to retract that consent once it has been freely given. Keeping that in mind in everything that
(34:05):
we're doing, I think that's a challenge. Right? Because there are so many different ways that you could be breaching that without even knowing.
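Yusra's two-part test, whether the data should be captured at all (purpose limitation) and whether consent can be retracted after it was freely given, can be sketched as a minimal consent record. This is a hypothetical illustration only: the `ConsentRecord` class, the purpose strings, and the `may_process` helper are invented for the sketch, not drawn from any real system discussed in the episode.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One purpose-specific consent grant for one data subject."""
    subject_id: str
    purpose: str                      # e.g. "fall-detection sensors"
    granted_at: datetime
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        # Retraction must remain possible after consent was freely given.
        self.revoked_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.revoked_at is None

def may_process(record: ConsentRecord, purpose: str) -> bool:
    # Process only for the purpose consented to, and only while unrevoked.
    return record.active and record.purpose == purpose

record = ConsentRecord("resident-42", "fall-detection sensors",
                       granted_at=datetime.now(timezone.utc))
assert may_process(record, "fall-detection sensors")
assert not may_process(record, "marketing analytics")  # purpose limitation
record.revoke()
assert not may_process(record, "fall-detection sensors")  # consent withdrawn
```

The design choice the sketch tries to make visible is that revocation is a first-class operation checked on every processing decision, not a one-time opt-in flag.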
Debra J Farber (34:14):
Yeah, and there also seem to be so many different stakeholders, never mind the very specific use cases. You've already mentioned Owner versus Occupier, and so what are the different rights? But then you think about Employer versus Employee, and how much monitoring in that space the Employer can and should
(34:36):
do of their Employees. Then you've got the retail organization; I don't want to say 'versus,' but there are opposite perspectives here. Right? So you've got the Retailers of a space, but then you've got the Temporary Licensees, the Public coming in to look around the store and maybe buy some things, and so on and
(34:59):
so forth. You've got Services, where you might be going to the dentist or doctor, and what are they monitoring beyond what you know? I feel like there are all of these different stakeholders involved in just the tracking, sharing, and taking up or being in these spaces. Are you coming up with different use cases around ethics and then coming up with standards based on different
(35:22):
stakeholders? How are you approaching the fact that there are multiple stakeholders involved here?
Yusra Ahmad (35:26):
You're absolutely right. There are a number of different stakeholders, and the issue as well is that an individual can take on multiple different roles. Right? We're almost always playing musical chairs with these roles, because we could be everything at the same time. What we are trying to do is look at it from the perspective of different sectors, but also different stakeholders within
(35:49):
the sector, and trying to understand what they need and also how they can go about achieving that balance. Ultimately, I guess the challenge is keeping on top of not only the regulation, which is lacking in a way... I shouldn't say lacking; I think it's developing. But I
(36:11):
think the other aspect, which is more of a challenge, is keeping on top of the technology as it evolves.
I sort of, tongue-in-cheek, mentioned how we have this desire to look at new tech and innovate off the back of that. We should absolutely be doing that, first of all, just to make
(36:32):
that clear. I think we should be looking at what's coming up; and ChatGPT, I think, is the newest toy in the box. But what we have to be careful with is that, as we're leveraging these types of things, we start, to your point, to look at it from different perspectives. What are the different ways that people could be leveraging this type of tech? But also providing some steers or
(36:53):
guidance that leverages the regulation we do have, while also providing some practical guidance as to what that truly means, and I think that's a really delicate balance.
I'll give you another example, but it's really from a city perspective. Very similar to the reward example that you mentioned
(37:14):
earlier: when you're looking at a city that is densely populated, with lots of traffic on the road, people struggling to get from one side to the other, it's really rather an unpleasant experience, and I've experienced a few of those. I won't name them, but there is this ambition to capture
(37:37):
location data, people's movements, et cetera, and be able to redirect them. So, on the face of it, that seems such a wonderful idea, right? Just as Google Maps redirects you when you're driving, best route here or there, what if it were to go further? If you're looking for a restaurant for the night, it
(38:00):
actually tracks the busiest areas, redirects you somewhere else, and provides you with rewards for going somewhere that's a little bit quieter. These are the types of things or experiences that I think we have to start to foresee, capture, and assess for their
(38:20):
ethics, essentially.
This is the challenge with ethics, right, which is why I can't give you a direct answer: to the question of ethics there is really no one right answer. I mean, for certain things there is one right answer: right or wrong. But most of the time it's a question of, on the balance of things, which is the 'right' or 'best' thing to do, or which is
(38:42):
the 'most sensible' thing to do. Is it sensible to give someone a great experience or an easier experience? Yes, but should they be manipulated into going somewhere that they otherwise might not want to go? I don't know, right? It's all about the consensus of wider society in terms of what is right or wrong, what the best thing is, or what the common
(39:04):
consensus is of what we should be doing. This is why it's so wonderful to be on the Steering Committee, because we can debate some of these things out and think about them in a slightly different way to what we are potentially programmed to do or what our instincts would say. So yeah, it's fun; it's interesting. I don't think we've captured even a fraction of all of the
(39:27):
types of areas or the question marks that we should be looking at, but I think the challenge is always going to be that we're going to need to keep up with whatever is happening within the wider world or within industry.
Are we using biometrics? Is that the right thing for entry? I mentioned that one before. If you're an employee, what if you don't want your employer to
(39:51):
have such a detailed set of data around your facial makeup? Should they be forced to provide you a different mechanism to enter their building? Or does experience kind of trump that? These are the types of things that we do: keep on top of what's coming out and then provide some guidance off the
(40:11):
back of that on how that could be achieved in a more ethical way.
Debra J Farber (40:15):
That's really, really fascinating, and thank you for sharing those use cases with us. Maybe, Luke, you could talk a little bit about what it'll take to ensure that real estate companies start leveraging privacy-enhancing technologies for data capture and sharing
(40:37):
data in a privacy-preserving way, beyond security controls like biometrics; it makes sense that there might be people who are afraid of giving those up, and that's a great conversation you just had, Yusra, around that tension and how you have to think on balance and think about what your ethics are.
(40:59):
That's the point of those ethics: to create tension in the org so that you can actually say, "Oh, this is important; we have to figure out what our value system is." The ethics is then the approach to getting to that set of values, which might differ slightly depending on which stakeholder you are, at what time, and what you
(41:21):
think the future should be. But sorry, I started out saying, Luke, I would love to understand your opinion on when the real estate industry will start leveraging PETs.
Luke Beckley (41:32):
So, just to go back on a couple of things that Yusra mentioned, and to add one more group into The RED Foundation, which is the Engagement and Awareness group, of which I am the Chair. I have the wonderful job of getting people like Yusra to talk to you like this, which hopefully is worth receiving.
(41:52):
But, interestingly, those words, engagement and awareness, I think are crucial in the process of getting the real estate industry not to rush into (and I think Yusra's touched on it) the adoption of what is the newest and shiniest tech, just because it might make what is
(42:13):
perceived to be the experience of the different stakeholders easier, slicker, quicker. And then there is a whole heap of additional obligations around that, of course, like we've talked about: the case studies and the case law that keep testing the various different elements of the regulation.
I think one of our key
(42:34):
principles, and our driver for next year, is to try and take those ethical considerations that Yusra so wonderfully articulated and push those out to the real estate organizations who are looking at adopting, for example, any form of biometrics or facial recognition in buildings. This has been talked about; I've been part of conversations
(42:56):
over the last seven years or so where this has been brought up on an almost annual basis: "Well, can we just put facial rec in?" And then, of course, what happens is it goes beyond just the use for access control. Sooner or later it goes into digital placemaking: making that whole environment and experience better, not only for the employer but potentially for
(43:18):
visitors, for individual occupiers, for consumers in the public spaces. And so it's the overall principles of the GDPR, which is what we still currently operate under in the UK; and that's another totally different subject for another day.
I think it's really educating the real estate industry, and, like I said, the whole plethora of real estate players,
(43:42):
on how those ethical considerations need to be taken into account and then how privacy-enhancing tech can be applied to that. And, above all, the principles: adopt them properly. Storage limitation: if you are going to go down the facial rec
(44:04):
route, you don't need to store the actual recognition picture for longer than, what, a half-second? A nanosecond? It's got to do the recognition and then it should be deleted; but these considerations don't get taken into account. It's the speed of access; it's the ease of access. All of those elements are considered before the
(44:26):
actual ethical question of its use, or whether it is actually allowed under the regulation or the law in the first place. Because, again, as Yusra mentioned, there are definitely easier routes to allowing people access to a building; for example, putting a digital pass on their phone. Right? You still don't need the facial rec piece, so what is the
(44:48):
recognition piece for, other than to enhance the experience?
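Luke's storage-limitation point, match and then immediately delete, can be sketched in a few lines. This is a hypothetical illustration, not a real biometric matcher: `template_of` here is just a hash standing in for a face-embedding function, and the badge ID and sample data are invented. The point is purely the data flow, that the captured frame and derived probe are used for the one-to-one check and never persisted.

```python
import hashlib

def template_of(image_bytes: bytes) -> str:
    # Stand-in for a real face-template extractor; a production system
    # would compare embedding vectors under a threshold, not exact hashes.
    return hashlib.sha256(image_bytes).hexdigest()

# Enrollment: only the derived template is kept, never the source photo.
ENROLLED = {"door-badge-17": template_of(b"employee-photo")}

def grant_access(badge_id: str, camera_frame: bytes) -> bool:
    probe = template_of(camera_frame)
    try:
        # 1:1 check against the single template enrolled for this badge.
        return ENROLLED.get(badge_id) == probe
    finally:
        # Storage limitation: the captured frame and derived probe are
        # discarded the moment matching ends; nothing is written to disk.
        del probe, camera_frame

assert grant_access("door-badge-17", b"employee-photo")
assert not grant_access("door-badge-17", b"stranger-photo")
```

The `finally` block is the sketch's whole argument: retention of the raw capture is impossible by construction, rather than left to a cleanup job.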
I think there's a step before the application of the technology, which is the education and awareness piece for the real estate industry. I think we need to engage with them on the ethics questions and then start educating them on the various privacy-enhancing tech that is available, which, for me, is the foundation of
(45:12):
building trust with those stakeholders. Trust seems to be a word that doesn't get used very often, but it is, I think, fundamental to how people engage with those spaces and with the assets.
Yusra Ahmad (45:23):
A couple of years ago I had the privilege of attending an Evanta event; Evanta is a Gartner company that organizes events for CDOs. This was cross-industry, from television to charities to real estate, all sorts of organizations being represented. They were all struggling with the same ethics question.
(45:47):
It is by no means, I think, a challenge unique to real estate. Everyone is capturing more and more data. I think everyone is seeing some massive, huge fines, hundreds of millions of dollars' worth of fines, walking out the front door. They're saying, "Well, how do I, as a CDO, make sure I protect my
(46:12):
organization from that type of exposure?" Here's where we are.
Debra J Farber (46:17):
Yeah, that makes a lot of sense. It makes me think; I just want to bring something up, because we've been talking facial recognition, and then you described a use case where maybe we'll just use people's faces for them to walk in: a camera looks at them as they walk into their workplace and determines whether it's you or not. I just want to make the distinction that there is a
(46:38):
difference between "facial identification": is this person who claims to be Debra Farber actually Debra Farber? Right? That's a one-to-one match. Versus "facial recognition," where you're looking at a general population of people; people are walking in, and who is this person? Oh, we have a giant database of millions of people.
(46:59):
This is Debra Farber. That's facial recognition. That's different from identification, right, where it's just the authentication of the human, because you already have a facial print somewhere for that employee.
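Debra's one-to-one versus one-to-many distinction maps directly onto two different code paths. A minimal sketch, with toy string "templates" and made-up names standing in for real embedding vectors and a real gallery:

```python
from typing import Optional

# Illustrative gallery only; a real system stores embedding vectors and
# matches under a similarity threshold, not exact string equality.
GALLERY = {
    "debra": "template-A",
    "luke":  "template-B",
    "yusra": "template-C",
}

def verify(claimed_id: str, probe: str) -> bool:
    """Facial identification/verification: one-to-one. 'Is this person who
    they claim to be?' Only the claimant's own template is consulted."""
    return GALLERY.get(claimed_id) == probe

def identify(probe: str) -> Optional[str]:
    """Facial recognition: one-to-many. 'Who is this?' The probe is searched
    against the entire database, the privacy-heavier operation."""
    for person, template in GALLERY.items():
        if template == probe:
            return person
    return None

assert verify("debra", "template-A")        # 1:1 match succeeds
assert not verify("debra", "template-B")    # wrong face for the claim
assert identify("template-C") == "yusra"    # 1:N search over everyone
assert identify("unknown-template") is None
```

The asymmetry is the point: `verify` only ever touches one record, while `identify` requires holding and scanning a population-wide biometric database.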
In some ways, I think it could be less scary to just walk people through the technology in an easily consumable way that
(47:19):
they can understand and then opt in to.
I also want to ask: a lot of these things around employees in the UK and Europe are different from the United States. I'm going to ask real quick if you can maybe talk about the role of Works Councils in approving some tracking technology within organizations. I think Works Councils are like unions, but they operate a
(47:42):
little differently on your side of the pond. Are you able to enlighten us on what that process is like?
Yusra Ahmad (47:49):
To be honest, I think Works Councils are a lot more stringent on the continent, in the likes of Germany and Switzerland. In my interactions in the past, where my sphere has crossed paths with them in Europe, they've been a lot more challenging. I think in the UK, where you have unions, not every type of
(48:13):
job is unionized; they tend to be much more for public services.
Luke Beckley (48:20):
Public services definitely are very union-heavy, in the UK at least.
Yusra Ahmad (48:27):
Again, it depends on which part of the sector we're talking about. If you're talking about things like logistics, for example, or if you're moving more into the construction type of space, I think that's where you start to encounter some of those requirements: making sure that you engage with the Works Councils, or the unions as we call them here, describing your need and also
(48:51):
articulating the impact that whatever you're trying to achieve is going to have on employees or the individuals in question. As you move further, I think, into the occupier side of the fence, a lot of the focus tends to be on more of the white-collar, I suppose, type of
(49:12):
services. I think it becomes a much more equal balance of individuals. You probably don't encounter the unions as much as you would otherwise.
Debra J Farber (49:24):
That's helpful. Thank you. I really just didn't have as much insight into Europe generally when it came to unions and works councils. That actually helps shed some light: that you refer to them as unions even in the UK, like we do here, and that they might operate slightly differently from the rest of the continent. So, I learned something, thank you. I learned many things, but thank you for that.
(49:46):
So, Luke, we're talking here a lot about the industry generally and how approaches are required to gather the stakeholders, educate them and their communities on all of this data collection, and balance it with the right controls to preserve privacy, whether through PETs or security controls or architecture, just different
(50:07):
ways of thinking. Well, since this is a show aimed at privacy engineers, I'm curious if you have guidance for them: are there opportunities for privacy engineers to work in the real estate sector? Obviously, you can't have privacy-enhancing technologies deployed, and deployed appropriately, without experts in
(50:27):
that field who understand the space, the appropriate ways of deployment, what's worked for others in the past, and maybe take a look at the research and all of that. Is there appetite right now for privacy engineers to be hired into real estate companies, or even into occupiers of those spaces and their overall approach to privacy?
Luke Beckley (50:50):
I've definitely got appetite for it, if that helps. I think there is a huge need for privacy engineers in the real estate space. And again, back to the reason The RED Foundation was set up: it was founded on the perception, among the people who were part of that organization, that the real estate industry is behind other
(51:14):
sectors in understanding both how to leverage data correctly and how to process data in an ethical and secure way. Of course, the next layer on top is, "Well, how do we make sure that we protect the privacy and data elements for the various stakeholders that are involved?"
(51:35):
As we mentioned earlier, we capture quite a lot of personal data, but the rush to just adopt technology to enhance experience is, in my opinion, a huge opportunity for real estate companies to take on board privacy engineers, to listen to
(51:56):
their input in conjunction with those who are talking through the ethical questions about some of the processes that real estate companies want to take on board and are implementing, and to really drive the enhancement of the technology that is ultimately going to build trust.
(52:16):
The real estate companies, the housing associations, the residential developers, all of those different stakeholders are ultimately trying to build a product that is trusted by their potential purchasers and occupiers, but the tech at the moment isn't sufficiently supporting, in my opinion, the privacy-enhancing
(52:39):
elements. So I think there is a huge opportunity there to try and bring privacy-enhancing tech and engineering into this space. I, for one, am desperate to take on a privacy engineer in a slightly different field to real estate, because I've got the same challenges in a different field, but that's for another podcast. I think there is
(53:01):
huge opportunity for the privacy engineering space in this sector.
Debra J Farber (53:05):
That's great to hear. I don't see too many roles, at least stateside, for hiring in, well, maybe not just retail, but let's say the real estate industry in general. I really do hope to see more of that. I am curious, though: compared to other industries, I guess the assumption would be that there's not that much software in the real estate industry, but that's probably not true these days.
(53:28):
You know, I even look at something like WeWork. I'd visit somebody there, and for a short time I even had a small office in one of them. Even then, while you wouldn't think software was a big part of their value prop, there was this WeWork website and app that you could all connect to, almost a
(53:48):
little bit of a local social network for WeWorkers, plus guidance about their different properties, so there was some software involved. I remember, this was years ago, poking around and thinking they could improve here. But I'm wondering: is privacy-by-design and engineering generally, like baking it into the software development life cycle, something that's important to
(54:09):
the real estate industry? Or is it more about privacy-enhancing technologies and deploying them across the analytics stack? Just give us a little sense of that, if you could, for software developers.
Luke Beckley (54:20):
So, that particular type of application and engagement is definitely becoming more prominent. I'm conscious of a number of developers who are running their own apps to enhance the community of occupiers and themselves in the different spaces that they've
(54:40):
got, trying to bring consumers to the retail elements of the different spaces, and trying to bring that all under the hood of one particular app that's relevant to that particular space. I'm also trying desperately to avoid names. So, I think you've really got two tiers. The first one is definitely trying to drive the development
(55:01):
of the technology while taking into account the prevailing data protection laws. So we're looking at it from a high-level data protection perspective and at what will allow a risk-based level of compliance to be reached before deployment of said technology. The lower, more base layer for me, the privacy-by-design
(55:22):
element and the introduction of privacy-enhancing tech, I think is still very much in its infancy, because it feels like we're still in a retrospective, retroactive, reactive scenario when trying to bake that in. Yusra and I have had numerous conversations recently on the ESG thing we mentioned earlier, where what look like small changes
(55:46):
further down the line actually result in huge cost and huge time consumption, and then the ROI of that change doesn't get baked into the ROI of the actual end product. And so it becomes, "Let's not do the privacy-enhancing tech bit. Are we as compliant with the
(56:06):
law as we can be with the tech stack as it currently stands? Yes, we are. Then we're good to go." I think there is, as I've said, the opportunity for privacy-enhancing tech, in conjunction with a deeper understanding of, and need to embed, privacy-by-design from the start of the development of these kinds of applications, where you're
(56:29):
bringing community together, bringing occupiers together, bringing space utilization under one banner. I think it's in its infancy.
Debra J Farber (56:39):
Great, well, it sounds like a lot of opportunity. I would love, as we close today, if you could tell us how people can get involved with The RED Foundation. Are there any resources that you'd like to point them to? How can they collaborate?
Luke Beckley (56:51):
Yeah, so we are on LinkedIn, so we will drop the information. We've also got a website, so we can pop that across as well and it can go into the show notes. Obviously, Yusra and I are more than happy to be contacted by anybody via LinkedIn and to talk about getting involved in real estate and tech.
Debra J Farber (57:12):
Excellent. Well, I'll definitely put that information in the show notes. Luke, any last words of wisdom to share with the audience before we close today?
Luke Beckley (57:23):
That's definitely
up to you, Yusra.
Yusra Ahmad (57:26):
Words of wisdom. There are so many things, right? Following on from your conversation just now around opportunity, there is a huge amount of opportunity where it relates to data, where it relates to real estate tech, the proptech sector, the cretech sector, whichever way you want to call it. It's growing year on year and it's significantly sizable.
(57:51):
With that, there are implications; there are challenges, I think. If we bring it down to the most basic viewpoint, the more you start to digitize your property, the greater the risk you open yourself up to from a hacker somewhere
(58:12):
logging in or breaking into your space and creating some real-world damage, potentially. So, from a privacy perspective, I think there's definitely opportunity there; but what I also think is that we shouldn't allow that to deter us from innovating and progressing, because these are obstacles rather than blockers.
(58:36):
We just need to find a way to jump over them, and there are smart enough people to be able to do that and create some new, exciting experiences for us, I would say. So, come with plenty of ideas and an open mind; there's a lot that we can be doing here.
Luke Beckley (58:56):
Yeah, exactly, and, you're right, start seeing privacy, privacy-enhancing tech, and privacy-by-design not as a blocker. It's a great way to finish. Not as a blocker but as a builder: for actually doing the right thing, building the right tech, providing the right experience, and building customer trust.
Debra J Farber (59:14):
Yeah, to me it sounds like we really need to threat model for both privacy and security, to prevent the criminals, the "criminal hackers." I say this because my fiancé is a hacker, an ethical hacker; he does that for a living. So I always want to mention that we should say "criminal hackers," not just hackers, as if they all are criminals.
(59:36):
Have the right controls. If you're doing your threat modeling, you're figuring out those use cases, not just for security but threat modeling for privacy as well, where you can uncover the potential harms in your environment that the use cases would have to people, as well as to your assets. Continue doing that to uncover what the potential threats are,
(59:56):
because they're going to constantly be evolving and changing. So, it's not a 'one and done' compliance thing. It's an ongoing part of managing your tech stack today. If you build privacy in by design, you'll be more proactive, you'll be more agile when you go to market. You're building privacy into your MVP. You're not waiting till later to figure out, now that you have
(01:00:19):
all this technical debt, how you're going to have to redo things and re-architect things.
Someone I respect very much in the ethical AI space, Chowdhury, talks about how adding brakes to a car, yes, it's for safety, but it also gives the user trust in your overall system.
(01:00:41):
So, you end up driving faster: when you know you have brakes, and you have these guardrails and safety built in around the car, it actually doesn't slow you down; it makes you faster. I feel the same way about privacy-by-design, or even ethical AI. You build this stuff in from the beginning, and it's market-ready to be consumed by those who do care about safety and privacy and security and humans, not just
(01:01:05):
users and nameless analytics stats. Right?
The protections are real. And across the entire economy, the companies that are more visible, that might likely get fined and have different regulatory bodies concerned about what they're doing, are going to be very careful
(01:01:27):
about their downstream: the companies they work with, the technologies they work with. So, if you want to sell to larger organizations that know they have regulatory scrutiny on them, you'll get through the sales process faster if you've proactively been thinking about privacy, rather than coming to market with a product or service in the real estate space,
(01:01:50):
expecting that it's just going to be consumed, and then finding out you have to fix all these things. Right? So again, it's all about increasing ROI. Privacy is not a blocker if you actually think about it early and embed it into design, architecture, software development, and DevOps.
So thank you, Luke and Yusra. Thank you so much for joining us on The Shifting Privacy Left
(01:02:11):
Podcast. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest or guests. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show.
(01:02:32):
While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado:
(01:02:40):
the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.