Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Debra Farber (00:00):
Hello, I am Debra J. Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the
(00:21):
bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems. Today I'm delighted to welcome my next guest, Jeff Jockisch, Partner at Avantis Privacy, Data Privacy Researcher at Privacy Plan, and co-host of the weekly LinkedIn event - or, yeah, I guess we call it a podcast as well - called "Your Bytes, Your
(00:44):
Rights," which focuses on town hall style discussions around ownership, digital rights, and privacy. Welcome, Jeff.
Jeff Jockisch (00:54):
Thanks, Debra.
Great to be here.
Wonderful to chat.
Debra Farber (00:57):
Absolutely.
It's been too long since we've caught up, so why not do it publicly on this radio show - right, this podcast? I know you did not start out working in privacy. And, at Privacy Plan, where you've been for a while, you've been focusing on data sets - like, specifically, 'privacy data sets,' privacy consulting, and privacy training.
(01:18):
And, your website states that you "research and create data sets about data privacy to gain insight into the privacy landscape." Can you give us some context as to what you mean by 'privacy data sets,' maybe your approach to creating them, and what you mean by gaining insight into the privacy landscape?
Jeff Jockisch (01:40):
Yeah, absolutely. I guess that's actually a few questions.
Debra Farber (01:43):
It kind of is.
I'm sorry.
Jeff Jockisch (01:46):
Well, I guess it sort of gets to the core of who I am. You know, I sort of grew up in technology and marketing technology, sort of the other side of the equation from privacy. I actually worked in a search engine for a long time before I got into privacy, and that was my introduction to it, doing some work in CAN-SPAM and COPPA as part of an SMS-based search
(02:11):
engine.
I won't really go into that because it's a little bit deeper, but I really built a lot of data sets when I was doing that, essentially building out a knowledge graph on the back end of a search engine. That's really where I cut my teeth on data science and building data sets. I really loved doing that.
(02:32):
So, when I got into privacy, I really loved the privacy world, but I didn't really want to do what everybody else in privacy did. I didn't really want to work on compliance. I realized as I was studying for my CIPP/US that the way I was studying for it was building data sets of privacy laws to
(02:56):
study them and building data sets of privacy podcasts that I was listening to, to learn. It occurred to me that I like to build structured datasets and then I could do that in the privacy world. I just really fixated on that, and I started building massive
(03:16):
datasets around different things. I've got a huge dataset of privacy laws - like all the privacy laws in the United States, and not just a list of them, but all the different attributes and things like that. There are probably law firms and organizations that do that. But I built a lot of other things, like datasets of privacy podcasts, and datasets of all the different aspects of privacy
(03:42):
regulators across the world - I think I've got maybe one of the largest ones of those - and datasets of data brokers, and all aspects of data breach laws in the United States and all the different things that happen there. When we start analyzing those datasets and looking at all the different attributes, you really learn a lot, and that's where
(04:05):
you get those insights.
Debra Farber (04:06):
That makes a lot of sense, and it definitely explains it - you're a "data guy." That's how you got into doing privacy datasets, and it shows that people get into privacy from all different angles, whether it's being a developer coming in and starting to be a privacy engineer writing code, or privacy architects coming
(04:30):
over from other spaces to work on privacy, or compliance folks, GRC folks, lawyers. You are absolutely the first person that I have ever met who has worked on datasets, which really drew me to you a few years back when I saw the work that you were making public.
Now I know you recently started working for a company called
(04:52):
Avantis Privacy to work on location privacy and deletion; and I think we're going to center some of our conversation today mostly around 'location privacy.' I'd love to hear you elaborate on the work that you're doing there.
Jeff Jockisch (05:06):
Sure, that was actually a little bit of an outgrowth of what I was doing at Privacy Plan - I'm sort of still doing it at Privacy Plan. As part of my research at Privacy Plan, I built a dataset of data brokers. I found that nobody in the world, it appeared to me, was
I found that nobody in theworld, that appears to me, was
(05:26):
really tracking them.
There are actually a few organizations and, frankly, there are a couple of state laws that require data brokers to register - one in California, one in Vermont, and a couple other states are trying to pass some of those laws now; but only a little less than 1,000, probably close to around 800 data brokers, have actually registered. But we know, way back in 2014, the FTC estimated there were
(05:50):
about 4,000 data brokers - probably a lot more now. We've actually got a database of 3,000 - actually, 3,200 - data brokers right now.
Debra Farber (06:00):
Wow.
Jeff Jockisch (06:01):
Well, painstakingly, I built that dataset over three years now, from a variety of different sources - starting with that 800 and building it out in a variety of different ways. Part of that's proprietary methodology, but a lot of it is just scouring the internet for datasets where somebody says
(06:26):
this is a data broker, that's a data broker, and combining every dataset of data brokers I can find and growing it from there. Then a lot of people saw the lists and said, "Hey, I'm a data broker, and these are my data sources." If you just keep adding all of that up painstakingly over time, you end up with a lot. Sometimes I go into individual markets and say, "Okay, well,
(06:49):
who are the healthcare data brokers? Who are the political data brokers? Who are the location data brokers?" and have to individually, in different market segments, try to find who the brokers are. It's just a lot of work.
Debra Farber (07:01):
Absolutely - sounds like a lot of work. That's an astounding number. I'm not surprised that there are that many data brokers, but I am in awe of the fact that you've been able to identify and kind of tag that many. So, with Avantis Privacy, you've been working on helping to delete geolocation data. What is your approach there?
Jeff Jockisch (07:25):
Yeah, so what happened was, as I was building this data broker dataset, one of the guys from Avantis - one of the original two founders there - approached me and wanted to chat. I actually did some consulting for them. I ended up coming on board with that organization. Now I've sort of become a full partner there.
(07:46):
Avantis' approach originally was to actually be a data deletion service, sort of like Incogni, Optery, Privacy Bee, or DeleteMe. One of the things we realized early on is that nobody was deleting location data. Part of that is because the location data brokers, until the
(08:08):
FTC sued Kochava last year, really were not letting people delete their location data. It was not really an option. They all claimed that the data was anonymous and therefore not personal data and not necessarily deletable. They very quickly started putting changes into their privacy policies and throwing up delete pages where you could
(08:31):
enter your MAID, maybe an email address, and delete that information, once the FTC sued Kochava. Now it's possible to delete your location data, but people don't know who those brokers are. They don't know how to find their MAID, their mobile advertising ID. They didn't even know it's a thing.
Debra Farber (08:52):
Before this conversation - I've been in privacy; I talk about location data - I didn't even know... I'm not an expert in brokers; I didn't even know that location brokers were kind of a separate category from data brokers. They are. Break down the difference there, because that is definitely something I think people would be interested in learning more about.
Jeff Jockisch (09:13):
Yeah - we've got 150 location data brokers identified that are storing your mobile advertising ID, as well as series of lat-longs: places where you've been - your home, your office, your shopping locations, your church, your gym, every place you've been - and all the routes you used to get
(09:34):
there.
They've got all this information, and they're selling it, frankly, to just about anybody with a credit card. It's not necessarily quite that easy to buy it, but it's not hard either.
Debra Farber (09:44):
Yeah, let's explore that. What are some of the current risks to our location privacy? I know there's a lot to expound on here. I'll let you answer first and maybe we'll go deeper.
Jeff Jockisch (09:55):
Well, there are a lot of risks to your location and a lot of ways that your location leaks. We're actually trying to put together a little bit of a categorization system of what those different location risks are and where your location ends up. Frankly, your MAID is really only one of those vectors, but
(10:16):
it's a very large one, and one that people don't really understand. That's one of the reasons we're focusing on it: because you can delete your profile information from data brokers, and that'll get rid of things like how easily your address is exposed on people search websites. That's something that the other deletion services can do
(10:38):
reasonably well (maybe not perfectly), but decently well. But they're not getting at this other MAID information. Then there are other things where you're exposing your location information in a variety of other ways too, like social media posts. Your car, when you drive it around, exposes your information
(11:00):
, sometimes through your mobile ID information, also through automated license-plate readers, and through some other ways through your car. Plus, your phone can also leak some data through cell tower information in a variety of other ways, though they're a little bit harder for non-law enforcement organizations to get
(11:24):
at.
Debra Farber (11:25):
What are the restrictions there? Actually, before I get to the restrictions on why it's hard for non-law enforcement to get at some of this data, I did want to expound upon the risks to our location privacy. What are some of the privacy harms that could occur? Obviously, there's tracking.
Jeff Jockisch (11:42):
Sure - actually, I was talking about the types of data rather than the risks.
Debra Farber (11:46):
Yeah, that's fine; you can switch gears.
Jeff Jockisch (11:49):
Yeah. Well, the risks - it depends upon who you are. If people have your location, and you're an at-risk target, they can do a lot. First of all, it's a physical safety risk. If people know where you are, they can come and get you. If you have enemies, if you have somebody who's pissed off at you because you said something on social media, they
(12:11):
can find you. They can come after you. They can SWAT you - we know this is a growing problem. They can send people after you, which is bad. If you've got a stalker, they can come to your house. They can find out where you work and go there. They can intercept you on your way places. That's a really huge problem for people that are celebrities
(12:34):
or athletes or CEOs, or people that have stalkers, domestic abuse victims - certain types of people. Even if you're just a regular person, you really don't want your location to be out there, because identity thieves can then use that information to commit crimes in your name.
(12:57):
If they know where you live and they know lots of things about you, it's much easier to impersonate you. Those are a couple of the really biggest risks. I guess I'll sort of leave it there. There are other ones as well.
Debra Farber (13:13):
So this might be one you're thinking of and holding back, but I do want to bring up the risk in a post-Roe v. Wade world. In a state like Texas, with Roe being overturned, and with anybody being able to pretty much accuse someone of having an abortion or seeking an abortion - which is now against the law in Texas - governments can use law
(13:38):
enforcement, or governments can obtain this data, to find someone and prosecute them, or to check - maybe through a health app or a health data set that they get their hands on - whether you've been near, or to, an abortion clinic, or whether you went across state lines to an abortion clinic, or something along those lines. Right? Yep. So, law enforcement can even be a threat actor here to one's privacy, which is kind of scary.
Jeff Jockisch (14:09):
Yeah - law enforcement, or just your neighbor who doesn't like you, right? It's pretty scary. And, to be clear, it's not just about women that have abortions in those circumstances, right? If you get pregnant, there are just as many circumstances where you don't actually carry the baby to birth where it's not an
(14:30):
abortion - where it's potentially a miscarriage or some other circumstance - where it's not an actual abortion, but somebody else looking at it from the outside might think it was an abortion when it wasn't. Right? And they can accuse you of having an abortion, right? And put you through all kinds of circumstances and stress based
(14:50):
upon some of these new laws that are being put in place. Right? And that's pretty horrendous.
Debra Farber (14:57):
I agree. I think there are definitely some fundamental harms that can happen as a result of someone getting access to that geolocation data. So, to that point, I wanted to understand from you whether or not these location data brokers are always getting direct, precise information, or are they also inferring
(15:20):
someone's location based on behavior or some other elements?
Jeff Jockisch (15:26):
So, most of the data that we're looking at is actual location data that is based upon your GPS. Some of it can be Bluetooth, some of it can be Wi-Fi, some of it can be cellular, but generally it's translated and then connected to your mobile advertising ID number. They can do inferences and stuff like that as well, and a
(15:50):
lot of them do, but oftentimes those inferences, I think, are more like trying to impute, based upon your path and stops, whether you were at one particular place or another, right? I don't know if you've ever seen this in Google Maps, but sometimes you drive by and you park in a parking lot and it doesn't know
(16:13):
if you went to the grocery store or you went to the Dunkin' Donuts right next to it. Those kinds of inferences are things that these folks might make. Right - did you go there, or did you go to the mosque that was in the corner of the shopping mall? So if you go to a mall that has a mosque in the corner, they might all of a sudden think that you're going there instead of
(16:35):
going to the Dunkin' Donuts next to it.
Debra Farber (16:37):
Yeah, that makes a lot of sense. And I could see how that can be misused as well. You previously said that the geolocation brokers in the past didn't really consider themselves data brokers, and I think you hinted that that's because they had anonymized data.
Jeff Jockisch (16:54):
Yes.
Debra Farber (16:55):
And so can you
speak to that.
W hy is anonymization notenough?
Why does that not take them outof being covered by data
protection laws or privacy lawsor by the ruling, the Kachava
ruling?
Just speak to us aboutgeolocation and anonymization
and some of the challenges.
Jeff Jockisch (17:14):
Sure, sure. Anonymization has always been a way to essentially avoid privacy laws. And the problem is that for a while now we have known that anonymized data can often be deanonymized, and location data is particularly vulnerable
(17:36):
to this kind of re-identification. If you take three precise geolocations - especially if they have a time element attached, which all of this stuff does - take three of those data points, or even four, and you can identify Americans with 95% to 97% accuracy.
(17:59):
So, if I have four data points on you, just randomly from some set of location data, Debra, I can identify you - let's say 95% of the time - as exactly you.
Debra Farber (18:11):
Yeah, that's crazy.
Jeff Jockisch (18:13):
Because probably one of those data points is your house, one's your office, and one's you on the way to work on the highway. And I can tell just from looking at those three or four points of data that it's you, because it can't be anybody else.
Debra Farber (18:26):
Right, because it's almost like a behavioral pattern that's being recognized.
Jeff Jockisch (18:31):
Yeah, that's what it is. Right? And this is proven - statistically proven. There are a couple of articles; there's one from a couple of years ago in Nature that proves this, right? So it's impossible for these people to say that it can't be re-identified. Got it. But actually, we've known this for a long time - I think it was back in 2011, 2012.
(18:51):
Latanya Sweeney from - I forget where she's from; maybe MIT - has done this research, so we've known about it for a decade.
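[Editor's note: the uniqueness property described above can be sketched with a toy simulation. This is purely illustrative - synthetic random traces and made-up sizes, not the methodology of the studies mentioned - but it shows why a handful of (time, place) points tends to match exactly one person.]

```python
import random

random.seed(0)

N_PEOPLE, POINTS_PER_TRACE, CELLS = 1000, 20, 2500

# Each person's "trace" is a set of (hour-of-week, grid-cell) sightings,
# standing in for timestamped lat-long observations in a broker feed.
traces = {
    person: {(random.randrange(168), random.randrange(CELLS))
             for _ in range(POINTS_PER_TRACE)}
    for person in range(N_PEOPLE)
}

def matching_people(points, traces):
    """Return everyone whose trace contains all of the queried points."""
    return [p for p, trace in traces.items() if points <= trace]

# Take just four known sightings of person 42 (think: home, office,
# commute, gym) and see who in the entire dataset is consistent with them.
known = set(random.sample(sorted(traces[42]), 4))
print(matching_people(known, traces))  # [42] - a unique match
```

Even with 1,000 people and a coarse 2,500-cell grid, four points almost always pin down exactly one trace, which is the intuition behind the 95%-plus figures quoted above.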
Debra Farber (18:59):
Right. And yet, because there weren't any regulations saying otherwise, so many ad brokers' response was just to anonymize. And I know you wrote about privacy theater. On LinkedIn, I recently saw your post that reads, quote: "Location leakage coupled with 'consent theater' is an existential threat to our privacy.
(19:21):
For many, it's a threat to their physical safety and well-being." Can you elaborate on this existential threat and on what you mean by consent theater? I mean, I have my ideas, but I obviously want to hear from you.
Jeff Jockisch (19:34):
Yeah, well, the way this information gets collected is primarily from your smartphone, from apps. Right? You're going to go load up your smartphone and you're going to turn on a weather app, right? Because you want weather. You need to know if it's going to rain tomorrow or if it's going to rain in the next 15 minutes.
(19:54):
I've got a weather app that I use that tells me, when I look at it, if I need to bring out the umbrella in the next 15 minutes, and it's highly accurate. It's got great data behind it. Right? But it also sends my location to a data broker, and that location is connected to a mobile advertising ID, and it's going to get sold to Venntel and Kochava and Gravy Analytics and
(20:19):
a whole bunch of other folks like that. Right? And what people don't realize is that that's happening. Because they put this weather application on their phone and said, "Yeah, I want this application," they're essentially agreeing to give their location information to that application; and that application is going to monetize it by selling it to these location data brokers.
(20:41):
They don't even know that this ecosystem exists, or that the information is going to get sold, and then that information is going to go to anybody that wants to buy it - including law enforcement, Homeland Security, the FBI. Anybody can essentially get that data. Right? And when the government's getting it, they're essentially
(21:03):
circumventing the Fourth Amendment, because they can get it without a warrant. Identity thieves can find easy ways to essentially buy that information. If I've got a stalker, that stalker can buy - well, they can't necessarily buy just my data. They can't walk in to a data broker and say, "Give me Jeff Jockisch's location." But if they happen to know what state I'm in, for instance,
(21:26):
they could probably buy a swath of data in my state or my city, something like that, and then, based upon that blob of data they got, they could probably figure out who I am - if they know a little bit about me - and be able to figure out more about me. You know what I'm saying? So if they knew, for instance, where I worked and my name, they
(21:49):
could probably figure out who I am. That makes sense, right?
Debra Farber (21:52):
Yeah, no, that definitely does make sense. And then there's all the consent - the check-the-box that everyone gives to the privacy policy just to move forward with installing an app on their phone, or, on a website, gaining access to or creating an account. I'm assuming that that's the consent theater part, where we
(22:13):
can't possibly manage all of these relationships and understand and remember everyone we've consented to using our data, and for what purposes.
Jeff Jockisch (22:25):
Right. For me, the consent theater is this: I don't mind giving my location to the weather app, right; but what they're not telling me when I say "OK" to that is what happens to my location information downstream. It's completely non-obvious to me or to anybody else that that
(22:46):
location information is going to be sold and is going to end up with Homeland Security, with potentially a threat actor, with potentially an identity thief. How is that possible?
Debra Farber (22:58):
Yeah, I mean, it shouldn't be. I hate data brokers, personally. I know that's something that Heidi Saas, who's a member of the Avantis team as well, is always saying: "I hate data brokers." I'm not afraid to say it here. For the most part, they're trading on unsuspecting people who may have technically given consent but don't really know
(23:20):
what they're consenting to, and using their data - sometimes selling it - to the detriment of those people. How can people protect themselves from having their data collected and sold by data brokers and location brokers?
Jeff Jockisch (23:33):
Sure. Well, I mean, there are really two phases to this. Right? We can help delete the stuff that you've already leaked - Avantis can help you delete that historical information - but you've also got to stop leaking the information. You've got to stop giving consent to these apps, and that can be a harder thing. We're also working on something that may be able to help with
(23:55):
that, but that's for a later time on a later show.
Debra Farber (23:59):
Oh, excellent, because I was going to ask whether you think it's incumbent upon the phone manufacturers - who control, whether it's firmware or software, the pop-ups that are enabled to tell you what a particular app is going to use your data for. I find that those topics - 'use it
(24:20):
for marketing,' 'use it for customization' - I find that the reasons developers can select are so broad as not to really tell you the real purpose behind the data use downstream, but more of a category that can be selected just so that they can move forward and start the data
(24:41):
collection. I feel like it needs to be more granular, and/or to allow apps to notify individuals about a more granular use of personal data.
Jeff Jockisch (24:55):
Yeah, I definitely agree. I mean, it needs to be more granular. It would be great if there was something that said, "Hey, we're going to use your data, but we're never going to sell it," right? Like, "I'm going to use it for my marketing, but I'm not going to sell it to anybody who can sell it onward," right? That would be awesome - if I could give my location to my weather app without having to give it to the whole world with
(25:17):
a checkbox like that.
Debra Farber (25:19):
Yeah, wouldn't we all? It seems like that should be the way things are, right? Optimizing for the benefit of humans, and not necessarily for corporations to exploit them just to make money off data they didn't consent to share. So I'm with you on that.
Jeff Jockisch (25:39):
Yeah, I actually think, Debra, that some of this is changing. I think some of this consent theater, as I've referred to it, is going to change a bit. I think the consent boxes are probably going to change in the next couple of years, and there are companies like Apple that are taking somewhat of a leadership role there. In terms of MAIDs, because of the ATT framework that Apple
(26:03):
rolled out, there are a lot of people that have opted out of that. So there are fewer people on the Apple operating system - iOS - that have their mobile advertising IDs turned on now than on Android, for instance. That's good, right? However, a lot of those people still have a lot of historical location data out there.
(26:23):
So, even if you're running an iPhone and you're like, "Oh yeah, well, I turned off my mobile advertising ID" - well, that's awesome, right? Except, even if you turned it off, like, six months ago, all the data that you let those companies collect for years is still sitting out there. Now, some of it might be older and maybe less valuable; but if you still live in the same house, go to the same office, go to
(26:46):
the same routes, all that data is still sitting out there for somebody to use against you.
Debra Farber (26:50):
So, that's a really great point. All it does is stop the data collection; it doesn't necessarily purge it. So, how could individuals gain more control over their own data, especially when we don't have a unified - at least in the United States, we don't have an equivalent of the GDPR, and we just
(27:10):
have some states, like California with the CCPA, and a few other states that have kind of followed suit with a privacy law.
Jeff Jockisch (27:17):
Well, for that, Avantis can help you try to purge that data with our location purge, because with Apple, you can actually go back in, turn your MAID on, grab that MAID number, send it to us, and we can purge that information for you.
It's actually harder on Android. So, on Android devices, only about 3% of people have
(27:40):
turned off their mobile advertising ID, which is sad. However, there's a problem on Android, and it's this: if you have turned off your advertising ID, there's no way to know what it used to be. So, all of that historical information is now no longer reachable.
(28:00):
So, you can't turn your Android ID back on, figure out what it was, and then send that MAID to us so that we can delete that past history - unless you can figure out some other way to determine what your ID was. It's, like, unreachable historical data. You know what I'm saying?
Debra Farber (28:31):
Yeah - well, security-wise, that's potentially a data availability problem now. I mean, arguably, could that be a good thing - like, it's no longer identifiable, potentially - or is it still going to get bought and sold?
Jeff Jockisch (28:45):
Still going to get bought and sold; and anybody who grabs more than three or four points of data can re-identify it and figure out it's you.
Debra Farber (28:53):
Right. So if somebody resets their ID on Android, are you basically saying that Google will still track that ID, even though that person is no longer... It's not a matter of "Let's purge that data; it's not good anymore"?
Jeff Jockisch (29:08):
It's not that Google's tracking you, right? Because at that point, Google's no longer sending out new information based upon any mobile advertising ID. But they've already sent it out, right, and those brokers have collected it, and it's sitting in their data sets. Got it. That makes sense. So if you re-enable your mobile advertising ID, it generates a
(29:30):
new one, so then they can start tracking you again, but it's based upon a new mobile advertising ID.
Debra Farber (29:36):
I see, I see. So, I think we've been talking about how Avantis can help people protect themselves from having their data bought and sold, and how to request that purges be made. But what about organizations? What can they do better to protect people's privacy when collecting and using geolocation data?
Jeff Jockisch (29:54):
So, if you're talking about, like, a commercial interest, I think that's pretty interesting. We're actually talking with some organizations now about how they need to start thinking about this as a threat mitigation situation. And especially if you're talking about people that may be targets in your organization for attack - you know, whether it's
(30:16):
the C-suite or your IT personnel or other vulnerable individuals - attackers are going to come after them based upon information that they can find that's publicly available, right? So some organizations are actually, you know, going to these data deletion companies and trying to delete all the information on
(30:37):
their C-suite employees, their IT employees. Frankly, they should probably be doing it for all their employees, but at least those that are most vulnerable. But right now, they're ignoring all the location data, which is a big vulnerability; so they should be thinking about deleting that location information as well. It's a big hole.
Debra Farber (30:56):
Yeah, definitely. That sounds like a real challenge for corporate security. And I do appreciate the answer. But what I was trying to get at is: should companies, for instance, not collect geolocation data in the first place? Oh, I see what you're saying.
Or is there a way to do this in a more manageable way? If they do collect it, is there, like, a privacy enhancing
(31:19):
technology or architecture that can be used to better protect the privacy of people?
Jeff Jockisch (31:24):
You know, I'm not sure if I've got a great answer to that. I think maybe the best way to be careful with that data is to not collect the precise geolocation, but to collect vaguer geolocation information, because generally you don't actually need that data to be precise.
Debra Farber (31:44):
Is it more like people would want to know what state you're in, and that's the level? Or could it go deeper than that - could it be county and still be okay?
Jeff Jockisch (31:53):
Yeah, I mean, that's not bad, right? I mean, if you can pinpoint it down to a house level... I'm not even sure what the actual fine-grained level is that's problematic, right? But if you were to blur that data out so that it was within
(32:14):
miles instead of feet, you probably would not have an issue with it, right? If it was within, like, a mile radius - probably in both urban and rural areas - it's not going to really change my weather report, and it's probably not going to change a lot of other things. Maybe it changes it for the people that are trying to figure
(32:36):
out what store you're going to, which could be problematic for certain applications. But for a lot of use cases you don't need precise geolocation - yet they're storing it anyway.
Debra Farber (32:47):
Yeah, that makes sense. That's helpful. I do wonder - and for people listening, maybe there's an opportunity here to specifically focus on location data. I know Privado, the sponsor of this podcast, actually does quite a bit of this - you can scan your code to make sure that location data is behaving as intended and you're
(33:10):
not actually collecting sensitive data when you didn't want to be as an organization, or where it can put you at risk or harm privacy. But it seems to me that there is definitely an opportunity out there to educate companies on how to protect people and not harm them, especially when it comes to geolocation and precise data.
Perhaps it's, as you say, that you don't collect precise geolocation
(33:35):
data, but work with a statistician or somebody who can take a look at making sure that the data is de-identified and not re-identifiable, making sure that the granularity is not going to harm individuals, and kind of do that threat
(33:56):
identification and threat model for the product beforehand, before you ever ship. That will do wonders.
And, for people listening, you can refer to the previous episode on threat modeling with Kim Wuyts and learn a little more about threat modeling approaches and
(34:16):
when you should do that.
But, I do think that having the right experts on board to help determine whether or not this meets privacy bars - and not just compliance, but can actually protect the people behind the data - is pretty essential.
Jeff Jockisch (34:30):
Yeah, I
definitely agree.
I think if you change the granularity, you could vastly
decrease the re-identifiability of the location data.
Debra Farber (34:41):
Excellent.
Jeff Jockisch (34:42):
And it might not
be that hard to do, actually.
I mean, if you have lat-longs, you could just lop off the last
few decimal points, potentially, and you'd be there.
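For readers curious what that coarsening might look like in code, here is a minimal sketch of the idea Jeff describes. The function names and sample coordinates are illustrative assumptions, not from the episode; as a rough rule of thumb, two decimal places of latitude corresponds to a cell of roughly 1.1 km.

```python
def coarsen_coordinate(value: float, decimals: int = 2) -> float:
    """Truncate a latitude or longitude to `decimals` decimal places."""
    factor = 10 ** decimals
    # int() truncates toward zero, "lopping off" the trailing digits
    return int(value * factor) / factor


def coarsen_location(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Reduce a precise fix to a coarse grid cell before storing it."""
    return (coarsen_coordinate(lat, decimals),
            coarsen_coordinate(lon, decimals))


# A precise fix (hypothetical coordinates near Columbus, Ohio)
# becomes a roughly 1 km cell:
print(coarsen_location(39.983712, -82.985915))  # (39.98, -82.98)
```

Note that `int()` truncates toward zero, so negative longitudes shift slightly east rather than west; any consistent coarsening scheme (rounding, snapping to a grid, geohash prefixes) accomplishes the same goal of reducing stored precision.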
Debra Farber (34:52):
Fascinating.
Well, it's good to know you don't think it's too complex.
Hopefully this is some food for thought for the privacy
technologists in the audience.
Jeff, is there anything else, any other resources that you
would point people to if they're concerned about data brokers or
location privacy?
Or you could also plug your website for
(35:18):
learning more about privacy data sets.
Jeff Jockisch (35:21):
Yeah, well, I
think there's a lot of stuff on
Privacy Plan about data sets.
I'd also say follow me on LinkedIn; I do a lot of posts
there.
I also promote a lot of other people there.
And check out privacy podcasts.
I've been a little bit negligent, but I maintain a huge
(35:42):
data set of privacy podcasts. And, frankly, Shifting Left is
finally in the data set, so that'll be awesome.
And yeah, I'm going to be doing a Top 10 list here in the next
couple of weeks.
Debbie Reynolds has been on me to redo that because I haven't
(36:02):
released one for a while, and I think there's something like 200
privacy-focused podcasts in my data set.
Debra Farber (36:10):
That's just so
wonderful.
I really look forward to that when it comes out.
I've seen the list.
And I have looked at the database, and I'm really glad
that you finally added Shifting Privacy Left.
We're relevant!
Jeff Jockisch (36:24):
You're actually
not just relevant.
You're actually high up there now, too.
So, I haven't actually pushed the top lists, but you're in
there, so that's awesome.
Debra Farber (36:32):
Oh, that's
excellent.
It's really great to hear.
Like I said, I'm really looking forward to seeing that.
And what else?
Before we go, let me have you discuss a little bit of your
podcast, "Your Bytes, Your Rights."
Do you mind just giving us a little overview about the types
of things you discuss, when it takes place, and how people can
(36:56):
participate?
Jeff Jockisch (36:56):
Yeah, we've
actually been on hiatus for a
bit, but Christian Kameir and I are going to be ramping that
back up.
We actually had an episode a couple weeks ago, and hopefully
we're going to get back on a weekly rhythm.
But, Your Bytes, Your Rights is sort of a weekly
interdisciplinary discussion on data rights. And we focus a lot
(37:21):
on privacy and data ownership and sort of all the stuff that
goes around with that.
We try to bring in people from not only privacy, but just all
kinds of different related disciplines to talk about how we
can better own and demand our rights to our data.
Debra Farber (37:40):
Yeah, and I've
participated in several of those
discussions, and they've been really enlightening, engaging
with other experts in the field, and so I encourage others to
attend them once you get them going again.
No worries taking a hiatus; I took a few weeks off this summer
myself.
There's just so much going on in the field that we just have
(38:01):
to remember that we're human and we need to rest
at times; you can only get done what you get done. But I
look forward to participating again.
Jeff Jockisch (38:10):
Yeah, absolutely,
you were great when you were on.
We just finished up an episode with Tom Kemp about the DELETE
Act in California, which may finally give us a regulation with
some teeth in it on data brokers.
Debra Farber (38:31):
Yes, that's
wonderful.
He's going to be on the show in a week and a half, maybe two
weeks, actually, in recording time, yeah.
Jeff Jockisch (38:38):
Awesome.
Well, that'll be great.
Well, then you'll have all the details on the DELETE Act.
Debra Farber (38:42):
On the DELETE Act,
as well as his new book, "Containing Big Tech," which is
definitely an interesting book.
We'll talk a little bit more about that when he's on the show.
Then he's also a seed investor for a whole bunch of privacy
tech companies.
He's a very interesting individual, and I really look
forward to having that conversation.
(39:03):
Yeah, I love it that we're talking about similar things,
you and I, but taking different tacks.
You have this town hall style, engaging show where anyone can
join the LinkedIn Live event and then ask questions or join
in within that format.
I love it.
It's all complementary, and I really enjoy the value that you
(39:26):
bring to the field, especially your enormous focus on data sets.
Until you came on the scene, I hadn't seen anybody else doing
that hard work of pulling it all together and then making it
pretty accessible publicly.
Well, thanks for your work.
Jeff Jockisch (39:41):
I appreciate that.
Hopefully we'll put out some new ones and some big ones soon.
Debra Farber (39:46):
Great, Jeff.
Do you have any other words ofadvice or anything you want to
leave our listeners with beforewe close?
Jeff Jockisch (39:53):
I think just
some kudos to you for Shifting Left - loving the new
podcast.
Thanks for all you do.
Debra Farber (40:00):
I really
appreciate that very much.
It's a lot of work, but I get a lot of personal value
and pleasure out of having these conversations.
Thank you for being on the show.
I'm sure I'll have you back on in the future.
There's just so much going on in this space.
Good luck to you on Avantis and the new consumer location
(40:22):
privacy and deletion tool.
I'll be paying attention.
If people wanted to reach out to you to collaborate or ask
questions, is LinkedIn the best place, or is there somewhere else
that they can go?
Jeff Jockisch (40:35):
LinkedIn is
probably the easiest way to
reach me.
Debra Farber (40:37):
Excellent.
Well, I'll put a link to that in the show notes.
Have a great day.
All right, take care.
Thank you for joining us today on Shifting Privacy Left to
discuss privacy data sets, location privacy, and data
brokers.
Thanks for joining us this week on Shifting Privacy Left.
Make sure to visit our website, shiftingprivacyleft.com,
where you can subscribe to updates so you'll never miss a
(41:01):
show.
While you're at it, if you found this episode valuable, go
ahead and share it with a friend.
And if you're an engineer who cares passionately
about privacy, check out Privado: the developer-friendly
privacy platform and sponsor of this show.
To learn more, go to privado.ai.
Be sure to tune in next Tuesday for a new episode.
(41:21):
Bye for now.