
March 5, 2024 54 mins

In this week's episode, I sat down with Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy LLC. Throughout our conversation, we discuss Jake's holistic approach to privacy implementation, one that considers business, engineering, and personal objectives, as well as the role of anonymization, consent management, and DSAR processes in achieving greater privacy.

Jake believes privacy implementation must account for the interconnectedness of privacy technologies and human interactions. He highlights what a successful implementation looks like and the negative consequences of doing it poorly. We also dive into the challenges of implementing privacy in fast-paced, engineering-driven organizations. We talk about the complexities of anonymizing data (a very high bar), and he offers valuable suggestions and strategies for achieving anonymity while making the necessary resources more accessible. Plus, Jake shares his advice for organizational leaders to see themselves as servant-leaders, leaving a positive legacy in the field of privacy.

Topics Covered: 

  • What inspired Jake’s initial shift from security engineering to privacy engineering, with a focus on privacy implementation
  • How Jake's previous role at Axon helped him shift his mindset to privacy
  • Jake’s holistic approach to implementing privacy 
  • The qualities of a successful implementation and the consequences of an unsuccessful implementation
  • The challenges of implementing privacy in large organizations 
  • Common blockers to the deployment of anonymization
  • Jake’s perspective on using differential privacy techniques to achieve anonymity
  • Common blockers to implementing consent management capabilities
  • The importance of understanding data flow & lineage, and auditing data deletion 
  • Holistic approaches to implementing a streamlined and compliant DSAR process with minimal business disruption 
  • Why Jake believes it's important to maintain a servant-leader mindset in privacy

Guest Info: 




Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

TRU Staffing Partners
Top privacy talent - when you need it, where you need it.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jake Ottenwaelder (00:00):
In privacy, we need to understand that we're providing a service to the rest of the organization, and we are leading by example and serving others through our leadership and through our advocacy. Just understanding that maybe you're not going to see the rewards every single day. Take time for yourself.

(00:20):
Make sure that you understand that you are hopefully fighting the just fight, but really being a servant leader and uplifting others to enable them to do privacy on your behalf. That's how we leave the best legacy and continue to grow adoption and education of privacy.

Debra J Farber (00:39):
Hello, I am Debra J Farber.
Welcome to The Shifting Privacy Left podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the

(00:59):
bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems. Welcome everyone to The Shifting Privacy Left podcast. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest, Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy, a

(01:22):
consulting firm that he launched, which centers around a holistic approach to privacy implementation. Jake, having moved from a cybersecurity background into privacy, has been a one-man privacy engineering team across late-stage startup organizations, most recently GoFundMe. He also worked at Deloitte for quite some time and has a lot of

(01:44):
consulting experience. Jake lives outside of Seattle, Washington, where he's raising a puppy and creating model landscapes in his spare time; and today, if you can't tell, we're going to be chatting about implementation of privacy objectives into the business and engineering, and what that takes. Welcome, Jake.

Jake Ottenwaelder (02:04):
Hey, Debra, thanks for having me.

Debra J Farber (02:06):
Oh, it's my pleasure.
To start off, why don't you tell us a little bit about your journey from Security Engineer to Privacy Engineer, what motivated you to switch your focus, and why focus on privacy program implementation?

Jake Ottenwaelder (02:19):
In college, I focused on cybersecurity.
My university had an honors program dedicated to cybersecurity and I was really interested in going down that route. I got my first job doing cybersecurity consulting with Deloitte, and throughout that experience I started to learn more

(02:39):
and just understand more about data. And when I got the opportunity to leave that organization, I started doing more compliance automation, and part of the compliance that I was automating was privacy. So, I moved a couple more times and slowly got strictly into privacy, and I absolutely love the abstract nature of the industry:

(03:05):
how we have regulations, but there's a lot of connection between regulations and technology and ethics. I think it's a problem that nobody's going to be able to solve, but I love the opportunity to just be able to add my name into the hat of, hopefully, people who can effect change and improve privacy around the world.

Debra J Farber (03:27):
That is a very noble goal, so I'm glad that you were able to be delighted by privacy and the wicked problem that it is, and trying to streamline some things. So, you talked about it being pretty abstract and loving that. I want to talk a little bit more about that, because I think that's been a real pain point for a lot of folks, especially

(03:47):
engineers. Where it's so abstract, you just don't have the appropriate requirements to test to, to know that you've met a privacy goal or aim, and so that abstraction has been a double-edged sword in many ways. You've got this principle-based privacy right or privacy principle that you want to make sure is embedded into the business, but then, well, what does that really mean when

(04:10):
there's no set of requirements? How do you get from point A to point B? How do you take the abstraction and actually get to requirements?

Jake Ottenwaelder (04:17):
I'm going to tell a little embarrassing story. After college, I was sitting there looking for a job, and I was like, I don't understand how everybody has to work so much. How can problems be so hard that you can't just solve them in a couple hours? And I was sitting there as this naive college kid not believing

(04:40):
that problems could really be that hard, as I had just kind of coasted through college and I was fortunate enough to not have to worry a lot about finishing homework - like taking a long time to do homework and stuff. So, I was coming from this mindset of: how can problems be so difficult?

(05:00):
And yeah, abstraction in privacy is huge, but I love it because it's such a unique experience, or unique perspective, that everybody has on it. So, every individual person has what they believe their privacy should be respected for, or what they believe their data is

(05:20):
valued at. And, as a privacy engineer, as somebody who's looking at the regulations, a lot of times the regulations don't go far enough; or we look at the regulations and we say, it's telling us this thing, but what's the essence behind this regulation? What's the purpose of trying to get consent? Does that mean

(05:40):
that we just put up a banner that says accept all, or does it mean that we give people equal choices and stuff?
So, there's a lot of... That level of abstraction, and that, I want to say, advocating for everyone equally - that really became a big motivator for me. And so, I guess, to answer your question, how do you get that

(06:04):
into requirements? It's really hard. It takes a certain mindset. It takes a mindset of always being ready to constantly learn, being open to new experiences and talking to people; and it takes a very global mindset, where I try to consider individuals who are unique in their populations

(06:26):
and require the additional support that we should be giving everybody, no matter where they're coming from.

Debra J Farber (06:34):
That makes a lot of sense. In fact, that actually inspires another question for me. Security is very much system-focused, right? So, when we talk about compliance of a system, we're talking about: is it secured according to requirements? But it really takes a mental shift to then be like, no, privacy is about people. It's about protecting people from harms.

(06:54):
And then the compliance part is around obligations of protecting that data in a certain way when it's linkable to a person. And so how has that mind shift been for you to make as you moved from security into privacy? Because it's clear that you have made that mind shift, even just talking about giving people choices instead of just putting

(07:15):
a banner up there to say you have compliance; or really thinking about the people in the population and maybe some edge cases, and who could be harmed by this. I guess part of my asking this is also wanting to educate others on how you've made that leap, and maybe how they can then get security folks to start thinking in this way, even if they remain in the security function, but with a more expansive

(07:37):
understanding of what privacy means.

Jake Ottenwaelder (07:39):
Yeah, the first thing that comes to mind is: take a second to look around at everything going on and really understand it. I think there was an article, and I can't think of it off the top of my head, but it talked about different character or

(08:00):
customer profiles. I think there's a lot of studies around customer profiles and stuff; I think there's different data, and I think it's actually related to privacy specifically, maybe. But looking at just the broad range of people's experiences and just thinking about that; and then for me, having the opportunity to work on systems where I was working with

(08:22):
billions of connected devices - GoFundMe is a massive platform that I was super honored to work on, one that requires so much support - and just looking at different people's stories, at people who are really needing help, gives you that perspective that you're there to serve. The biggest thing for me is that it's not my data. As an

(08:46):
organization, they're collecting information, but it's not really their data. They're borrowing it from other people; and just as I would borrow my lawnmower from a neighbor, who would probably expect it back at some point, this is all data that organizations are borrowing. So, as stewards of the data, we need to be respectful of those

(09:10):
people's wishes. So, I think privacy, to me, is security plus ethics. The security area is, like you said, a lot more straightforward; it has a lot more compliance. As you add in ethics, then you get to privacy, and so people who are interested should look into more ethical considerations and just understand that it's really,

(09:34):
really complicated stuff. Like, what is free will? Do you have free will if you have a... Bringing up cookie banners again, they're one of the most challenging UI conversations and issues around consent notices as well. It's like, do you have free will if you don't have the expectation, or if the UI is pushing you in a different

(09:55):
direction? So, how can we really allow people to be more free with those decisions? That's the area of privacy that I love to play in, have conversations in, and try to push an organization to adopt what I consider to be the more ethical approach.

Debra J Farber (10:12):
So it sounds like one of the reasons you've been able to really make that switch is that you were actually working for a company that had a real focus on B2C, right? So, you weren't just a B2B business that's trying to figure out the privacy stuff, thinking about privacy rights as a compliance obligation, asking what's the best way to achieve compliance from a privacy perspective.

(10:33):
But you really had to actually... Among your personas within your organization, a major one was individuals that were gonna just fund some startups, and it was those individuals' data that you were collecting; and so you were kind of seeing part of that need to protect a person, as opposed to just B2B flows. Does that resonate with you?

(10:54):
Do you think that that's some of the reason?

Jake Ottenwaelder (10:56):
Yeah, definitely.
I think I'll even go back a little further, because I first started to really explore privacy when I was working at Axon. Axon's the company that developed the tasers and body cameras, and they have a very large market in that. When you look at privacy considerations for a company that develops policing technologies, all the privacy

(11:21):
concerns around that were huge. Looking at video camera data and audio recordings, looking at tracking of police activity or where people are going; you see body camera images released online or publicly released, and how all

(11:44):
of that affects privacy as well. So, definitely, that was the first place where you're seeing the people dealing with that. And then my other organizations definitely had a lot of individuals - like, yes, these are real people that are going to be interacting with these sites and these web apps...

(12:04):
we're having billions of devices and millions and millions of people a month touching these things, and we need to make sure not only that it's implemented consistently, but also that it's robust for all of the people that we're servicing.

Debra J Farber (12:21):
That makes a lot of sense.
So let's turn to the topic of the day - implementing privacy into an organization. You claim to take a holistic approach to privacy implementation through your organization, Integrative Privacy. Now, what does it mean, first of all, to 'implement privacy'? And then tell us about your holistic approach to

(12:41):
implementing privacy?

Jake Ottenwaelder (12:43):
Yeah, I think implementing privacy is something that a lot of organizations just try to do, and I think they start off by putting in a cookie banner with OneTrust - that's the big one, everybody uses that platform - or just creating a privacy notice and sticking that up on the page and calling it a day.
(13:03):
That's where I see a lot of organizations leaving it, and you'd be surprised that even larger organizations, as they continue to grow, still maintain those as kind of the foundations of their privacy program: just a page on their website and this little banner that'll pop up for certain people.

(13:23):
I put a lot of thought into trying to figure out what my next steps were after leaving GoFundMe, and I really felt like there's an opportunity for people to adopt and have a better relationship with privacy within organizations. So, for me, that kind of came in through this idea of

(13:44):
integrative privacy being similar to integrative wellness. I'm a believer in integrative wellness and understanding your body as a whole, and I take that similar approach. When looking at integrative privacy and implementing privacy or privacy technologies, it's not just a checkbox, it's not

(14:05):
just the compliance. It's the ethics; it's the people of the organization who need to be educated and brought into the fold, to be enabled to be better data stewards of people's information. So, that's where I take this integrative, more holistic approach to privacy implementation.

Debra J Farber (14:28):
Thank you, that's helpful.
Let's talk about metrics for success. How do you define a successful implementation, and what makes for, I guess, a bad implementation, or one that's not as successful?

Jake Ottenwaelder (14:41):
Yeah, I think implementations are always such a challenge. Privacy technology is this area where privacy engineers have so much to deal with - I give them so much respect - because they have to deal with implementations that are so

(15:01):
unique to the tech stack that they have to work with, but also have to be forward-looking and be ready to scale or change with regulations that are coming out, and also be able to look at each individual person that's coming to the page and give them options for their privacy.

(15:24):
So for me, a successful implementation really starts with adoption of the technology, looking at how it can be best integrated into the entire stack; and I think a lot of successful implementations are on the backs of a strong understanding of what data is in your company and where that

(15:46):
data is going. I think, if you have that understanding, you can implement privacy technology much more easily and have a much better foundation for it. Bad implementations - I think everybody has experience with this. It's when there's always something that's popping up. I think a lot of it comes down to planning and scoping, making

(16:09):
sure that you can take a second... I just want to get in and do stuff. I want to do cool stuff. I want to fix things. But, you've got to take a beat at the beginning and look at the landscape, look at the picture of what you're trying to do; and if you don't do that, then you're gonna have surprises that come up. You're gonna have issues with scoping or planning or resources

(16:32):
that just can prolong this implementation. And it's a lot about building that trust within the organization, because privacy is not really a money-making operation. If you're doing successful implementations and you're on time and on spec, you're gonna have a lot more trust with the organization when the next tool comes into play that you

(16:55):
need to work on.

Debra J Farber (16:56):
Yeah, you know, I also think of that as: if you're just trying to do the minimum necessary and just comply with a new regulation, instead of looking at the broader trend of what that regulation is trying to solve for - like, "Oh okay, we can still do this with all our data" - but then a new regulation comes in and says we need to be

(17:18):
careful about location data and the way it's used and whatever, and you're just doing pointed solutions around one data type, and you're not seeing the broader effort to get your arms around how data could be used or misused and the harms around it - then each time a regulation comes, you're constantly gonna have to re-implement something else,

(17:39):
rather than thinking about "what is good privacy?" Because if you just think, "What is good privacy, and what does it look like to respect that, and let's now try to automate for that," then if you do that, compliance will follow.

[Jake (17:52): Yeah, yeah, exactly.] It'll be easy to make similar simple tweaks based on new regulations.

Jake Ottenwaelder (17:57):
Yeah, exactly - that's why you run the podcast, Debra. You hit it right on the head. That's exactly it. If you're continuing to do these band-aid solutions, you're gonna always be behind the eight ball. If you look at "What's the right thing?" - and this is where the ethics come in: what's the right thing to do, what's the right thing for our customers? - sometimes you might get those things wrong, but as long

(18:18):
as you're shooting for that, you're gonna get a lot further than just playing whack-a-mole, so to speak.

Debra J Farber (18:26):
Awesome.
What are some consequences that you've seen for badly implementing privacy? Obviously, I kind of talked about you constantly playing catch-up, in a state of frenzy of trying to be compliant, when you're only addressing it in short bursts of compliance projects as opposed to privacy being comprehensively

(18:46):
integrated in. What are some other consequences that you've seen from attempts at implementing privacy that are done wrong, inefficiently, or ineffectively?

Jake Ottenwaelder (18:56):
Yeah, I feel like the biggest issue with badly implementing privacy, or one of the consequences, is you end up having a lot of stuff just lost, or a lot of skeletons in the closet. You gain a lot of tech debt, which is the term that I hear a lot of people using, where you're

(19:17):
implementing something and you have engineers working on stuff, but you're continuing to dig a hole for yourself, and that is just really expensive to get out of.

Debra J Farber (19:29):
It's expensive to get out of, and I will even throw in there, as someone who's been doing operational privacy for 18 years: the more tech debt there is - you know, I'm technically oriented, but I'm not a hands-on technologist - the less ability I feel that I have to make change in the organization, and the less desire I have to even take

(19:52):
a job in that company. Those are downstream problems, I think, that need to be thought about when you talk about this tech debt, because it's like, "You're just making it harder and harder for me to be able to solve the problem by not addressing it. So, I don't want the job. It's just not going to set me up for success, or the company."

Jake Ottenwaelder (20:08):
Tech debt is obviously a lot of the technical stuff, but we've also got to talk about the procedures around that. Everything that you do in privacy - I consider everything that I'm doing to be setting a precedent. So, whether you're implementing a tool and it's halfway done, that is setting a precedent of "it is that way

(20:30):
now," right, and so it takes a lot more effort.

Debra J Farber (20:34):
Like it didn't even meet the MVP of the deployment. Yeah, it's like it's not even meeting the minimum viable requirements of the need. Is that kind of what you're saying?

Jake Ottenwaelder (20:43):
So, even if you're implementing a technology that's out-of-date, there are the processes that you're putting around it; or, as you continue to grow as an organization, there's the technology that might be out-of-date or lagging behind or not up to compliance. Then, there's also the procedures associated with that,

(21:04):
which are all precedents that also require a lot of effort to start to change and work on as well. So, I think it's both of those that we deal with in privacy engineering: not only the technology, but also the processes that we need to update and modify.

Debra J Farber (21:21):
Got it.
So, building on that, how would you summarize what it takes to achieve a successful implementation of privacy?

Jake Ottenwaelder (21:30):
I think, starting off, as I mentioned, it's the strong foundation: understanding the data that you're working with and where the data is going, and there are tools that can do that. I think it requires a lot of education and driving awareness by the privacy team. Like I said, there are tools that will help you with data

(21:51):
discovery. It's: why do we need this, what is the benefit of it, and how else are we going to get this value if not by doing this? And so, it really starts with that strong foundation, and then it takes a lot of planning. For me, it's been a lot of project management work as well, and growing my skills around

(22:14):
project management, to be able to talk with an engineering team, understand the technology behind it, and help them implement and develop the software, but then also communicate high-level goals and high-level needs, and escalate those to be able to get the resources and the support that I need. So, I think that's the essence, and we're distilling down a lot

(22:36):
of information. So, what I would say makes a successful implementation is focusing on those key areas.

Debra J Farber (22:43):
Yeah, that makes sense.
I want to talk about some of the challenges that I've faced when I wanted to help implement privacy into an organization. I want to know everything that's going on in that organization around privacy. But sometimes, that's... I want to do this so that I can maybe piggyback on some other projects and just add privacy requirements.

(23:04):
That way you're not recreating the wheel; you can leverage efficiencies. You know, there are so many good reasons to do that. But sometimes, especially at engineering-heavy scaling tech companies - and I'm pulling from my own experience at Amazon, but it could be any of the large big tech or giant enterprises, or any engineering-focused org that is

(23:27):
moving fast and not necessarily waiting for you to put guardrails around it - you just can't know everything that's going on. I was working for Amazon; 1.5 million people work there now. There was no way - I was finding out about consent decrees and fines that I had no idea about because I wasn't part of Legal. When you're in a giant decentralized org, security and

(23:51):
privacy sit within those orgs, so you might be within one business unit and have no idea what other business units are doing; and it's not for a lack of partnerships or wanting to know. It's just too big to get your arms around all of the moving pieces. That was very frustrating for me because I wanted to know. And I've learned a lot about my ADHD in recent years, about the divergent mindset.

(24:13):
I need to have a high-level understanding of everything going on, and I also need to go deep into the silos of the components, to have an understanding of any one of those, in order for me to have that big picture. So, without that, it's a little disorienting, because you don't know what's going on. When working in an organization to implement privacy, how do

(24:36):
you stay on top of everything that's going on in an organization related to personal data?

Jake Ottenwaelder (24:42):
You mentioned working at Amazon. I mean, I feel like these challenges are very similar to the work that I've done as well, where I was a solo privacy engineer with 300 software developers in the organization, trying to understand what all the sprints are doing and what people are working on.

Debra J Farber (25:13):
Well, and for me, I find that I'm paying attention to too many things, right? It's not a lack of attention; it's that everything is coming in unfiltered, you know. So I'm not filtering it all out; I'm making sense of it all. Instead of just maybe three things, I'm seeing like 10, right, and so it's more overwhelming. But I still feel I need to get that info so that I can better

(25:36):
have a mental map about the state of it.

Jake Ottenwaelder (25:46):
Because for me, when I look at privacy, when you're doing PIAs, it's very much this interconnected system. I might be doing a PIA for an authentication system, but that authentication system is going to inherit risk from other services as well, or pass its risk along.

(26:06):
So, everything is connected, and I look at it at that granular level. First, I'll say mental health is super important, and I think, to your point, with a lot of stuff happening all the time around privacy, it does take me a lot of time to be able to sit with my thoughts and process it. Unfortunately, my mind kind of races at night and I just need

(26:29):
to lay there and process stuff or write down the things going through my brain, and it does take a toll. And so, I definitely think mental health has been a huge topic in the workforce, and I think it bears the same kind of weight here as well. For me, focusing on how to stay on top of things:

(26:49):
I think there's a lot of power in enabling and getting close to, or just enjoying and trying to befriend, the people that you work with, and just trying to get access to roadmaps and looking at stuff from a high level. I think the more that you can work with other people and have

(27:11):
other people really enjoy working with you - people who are in positions on the different engineering teams - the more you can get access to what their roadmap looks like, or sit in on a monthly meeting just to let your ears perk up if anything new happens. That's how I've been working on it, and

(27:35):
even for me, when I was one person with 300 engineers - not to mention what you had to deal with in an Amazon-sized organization - it comes down to enabling other people to help support you in that. It's not something you can do on your own, but it is, in my opinion, very valuable, like we said, to be able to recommend the right privacy steps.

Debra J Farber (27:58):
Yeah, that's true, and it has me even thinking that... you definitely want to have those relationships, and people to almost keep an eye on certain things for you. Like, "Hey, if you ever come across a case where they're starting to collect a lot more data elements linked to a person, give me a call, because I'd want to track down

(28:19):
what that is and the goals there." Or, "If you sniff out that there are new systems being deployed and it hasn't been well communicated, let me know," you know, so that I can do a DPIA if there's high risk, or whatever it is that the job role calls for. Obviously, not everyone's doing PIAs.

Jake Ottenwaelder (28:37):
Yeah, keeping alerts on Confluence pages is something I've done before. When a new page is added in Confluence, just have that sent to my email. I'll wake up, I'll have a hundred emails, but I'll just look through them pretty quickly and be able to say, "Okay, that makes sense." I mean, it feels bad - I don't want to say go spy on the

(28:58):
rest of the organization. But it helps to just see and understand what's going on. And I think that works as long as you do it and, again, create those relationships and build the confidence and trust with the other people that you talk to. I have great friends from outside of engineering or

(29:18):
within other engineering teams that I rely on today and get to talk to, and it's nice to also just have people at work that you can enjoy talking with. So, yeah, those are some of my tips as well.

Debra J Farber (29:32):
Well, thank you for that.
So, let's start with anonymization. What are some of the blockers that you've seen when deploying anonymization as a privacy control within orgs?

Jake Ottenwaelder (29:43):
Yeah, anonymization is a very sensitive topic for a lot of people, because I think there's no clear consensus, at least from a legal perspective or from a risk and compliance perspective, on how to really prove data is anonymous. I feel like the consensus around anonymizing data is very

(30:05):
hard to really define, and so the biggest blocker that I've had with deploying anonymization is, in the early stages, getting people to buy in on trying to do masking. And, I think, to reach anonymization you kind of have to start with basic masking of data and then you move into,

(30:27):
"Okay, we're gonna start totransform the data a little bit
into this pseudonymized kind ofstate and the hope would be to
eventually reach, or have somedata sets reach, this standard
of anonymization.
So, it's a very long kind ofprocess and so just getting the
buy in is generally a pretty bigblocker that I've had, because

(30:50):
organizations, again, are focused on the money; and so, if you can't show them the dollar signs of the risk associated with not having anonymized data - when not a lot of companies have been fined for classifying their data as anonymous when it's not - it's really hard to make the case and get that adopted.

Debra J Farber (31:13):
So, I think maybe it's the framing, because when I think of anonymizing data, I think of it as one of the big ways you can take data that you have for one purpose and then be able to use it for a secondary purpose freely, because it's no longer personal data once it's anonymized. So, a lot of the time, this is great for opening up the value

(31:36):
of data for analytics purposes and insights and all that. I kind of see this happening a lot in the data science space, where you wanna anonymize certain things. Then where it's a challenge is... it depends on whether it's input privacy or output privacy, right? Is it anonymizing before you do the analytics so that you can

(31:58):
freely do that, or is it doing the analytics and then anonymizing the output before sharing it with third parties? I guess there are different use cases, but I kind of see anonymization as being used more around "How do you open up and use more and get more value out of the dataset?" - that being tied to additional revenue, as opposed to compliance

(32:18):
to take you out of the potential risk of having personal data.

Jake Ottenwaelder (32:24):
I definitely agree, and that's the general push. I think a lot of people can achieve what is considered more of a pseudonymized state through that. The leap between pseudonymization and anonymization is pretty big, in my opinion. When we look at pseudonymized data, we can very much just take

(32:45):
the dataset as its own entity and ask: is this table able to connect you back to a specific individual on its own? Generally, that's a lot easier to achieve than anonymized data. And taking the definition from GDPR - and there are a lot of regulations and legal conversations from Germany as

(33:08):
well that I've studied - anonymization asks, "Does there exist a dataset that could or would re-identify this individual?" And from a computational analytics perspective, "does it exist?" is a much more difficult question when you're trying to prove that it doesn't. So, I think a lot of companies are accepting the risk of not

(33:34):
being able to fully prove anonymization. But, again, accepting that risk - there's not been a lot of regulation in the space, or not been a lot of fines in the space, and so that's where I see the position at. I know there are contingents on both sides of this privacy industry: people who believe anonymization is possible, and

(33:55):
others who don't believe it's possible. Generally, for your example of use cases around data analytics, there are ways that we can aggregate or manipulate data to what I would still consider maybe not fully anonymized, but it would be very well-masked and very well-pseudonymized, so

(34:16):
that you can still do additional analytics or secondary use on it. But it's very hard to fully reach that purist anonymized state.
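
To make the masking-to-pseudonymization progression concrete, here is a minimal sketch in Python - an illustration with hypothetical field names and functions, not code from the episode or from any tool Jake mentions. A keyed hash keeps the token stable for analytics while the re-linking key stays with the controller, which is exactly why the result is pseudonymous rather than anonymous:

```python
import hmac
import hashlib

# Hypothetical example: the secret key stays with the data controller.
# Anyone holding it can re-link records, so this is pseudonymous, not anonymous.
SECRET_KEY = b"rotate-me-and-store-in-a-kms"  # placeholder, not a real key

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a stable keyed token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def mask_email(email: str) -> str:
    """Basic masking: keep the domain for analytics, hide the local part."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

record = {"user_id": "u-12345", "email": "jane.doe@example.com"}
safe = {"user_id": pseudonymize(record["user_id"]),
        "email": mask_email(record["email"])}
print(safe)
```

The design choice mirrors Jake's point: the transformed table alone can't name anyone, but the organization still holds the key, so it would not meet his purist GDPR bar for anonymization.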

Debra J Farber (34:28):
Yeah.
There's also a difference, I think, when we talk about anonymization versus an anonymized dataset, versus a dataset that you want to make anonymous for release. When it's released, nobody outside of the organization itself can actually re-identify it. It's pseudonymous data. It exists pseudonymously, but for shared purposes,

(34:52):
it's anonymous for what has been... no one can link it back other than the company that actually has the identifiers or the capability to link it back. Technically, one would say the dataset is a pseudonymous dataset because it can be re-identified; but for the purposes for which it was released, nobody else can re-identify that data.

(35:13):
I feel like we need better vernacular around those distinctions, because I think those distinctions have further differences of importance that we don't really... We just lump it all under anonymization. It depends on whose perspective you're looking at it from. For a company that just sees anonymized statistics, I wouldn't be able to figure out who contributed what data to
(35:35):
that anonymous dataset.
But, those that do hold thekeys might be able to
re-identify.
For their purposes, it'spseudonymous, but for my
purposes it's anonymous.
I think that gets a littleconfusing.
I'm also curious what do youthink about differential privacy
using differential privacytechniques to achieve anonymity.

(35:55):
It's a very high bar under GDPRto have data be considered
anonymous; but, I do see a lotof people talking about
differential privacy as one wayto get there.
What are your thoughts on that?

Jake Ottenwaelder (36:06):
Yeah, I've done statistical analysis around differential privacy, actually implementing it, which I think I was very privileged to be able to do. It is a very interesting and hopeful space. When I looked at how to do anonymous data well, the

(36:27):
biggest thing that I think everybody agrees upon is that there has to be some uncertainty in the data. It's unfortunate, but the more uncertainty that we add to the data, technically, the less valuable it becomes. There's always going to be this trade-off: you could have a perfectly anonymous table of ones and zeros - those are the

(36:50):
only values that are there - but it won't be very valuable. Differential privacy helps add some of those elements; but I would caution that just because you implement differential privacy doesn't mean you're going to reach a state of anonymization, because differential privacy still has some limitations. If you're using an epsilon value that's too

(37:13):
high, you could inadvertently... that's not really privacy-preserving anymore. And I know some organizations have very high epsilon values, but they still say, "we're using differential privacy." Well, you're ruining the technology if you just have an epsilon of 100 when a normal value would be 1. The privacy loss gets exponentially worse.
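
A rough illustration of Jake's epsilon point - a sketch, not anything implemented on the show. In the standard Laplace mechanism, noise scale is sensitivity divided by epsilon, and the guarantee bounds what an observer can infer by a factor of e^epsilon, so epsilon = 100 means almost no noise and a vacuous guarantee:

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count via the Laplace mechanism: noise scale = sensitivity / epsilon."""
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Same query at three epsilon values: smaller epsilon = more noise = more privacy.
for eps in (0.1, 1.0, 100.0):
    print(eps, laplace_count(1000, eps))
# At epsilon = 100 the noise scale is 0.01, so the released value is
# effectively the exact count: formally "differentially private," practically not.
```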

Debra J Farber (37:35):
So, what would make that better?
Would having a formalized standard help - one where people can claim they've met the requirements of a differentially private standard, like epsilon must be under this value, with auditability of that? So that anyone who claims differential privacy techniques and compliance with it is claiming

(37:57):
that they're meeting certain epsilon values and privacy budgets and such?

Jake Ottenwaelder (38:02):
Yeah, that would be great, and I know there are conversations happening. I think differential privacy works really well when you're looking at a point in time and you're looking at a privacy budget for a point-in-time query - if you're trying to pull statistics once and you need that. It doesn't work as well if you're trying to, to your point,

(38:22):
release a dataset - or not that it doesn't work, but it becomes a little more risky when you're trying to release a dataset that's going to exist for a long period of time, because anonymization is not a one-and-done thing. It's an ongoing assessment that needs to constantly be reviewed, because new datasets could come out that could re-identify it. Tomorrow, Facebook could open up a new API that gives a bunch

(38:46):
more data that would allow us to re-identify some data that they had previously released. So, it's a constant review. And I do know of some organizations and startups - I'm not paid to mention their names, so I won't - but there are startups who are trying to develop this. They do an analysis of your data, they give you masking

(39:08):
techniques, and then they partner with organizations and legal firms who will sign off and say, as a legal firm, that this meets their standard of anonymization. That gives organizations a little bit more comfort.

Debra J Farber (39:23):
That also tracks. That's very similar to what was going on in the HIPAA world for a long time. Yeah. We all agree that the de-identification standard - even though it's still in law - is no longer good, and it's not anonymous. It's not necessarily meeting all the needs of the moment, but there always was this use case or requirement under HIPAA that

(39:43):
you would have a statistician that you would hire to certify that your dataset under HIPAA was impossible to re-identify, or that there's a ridiculously low likelihood of the ability to re-identify. So - I did not know this - but I'm not surprised that this is now becoming a little bit expansive beyond HIPAA, and now you can have tools and techniques from

(40:05):
data scientists that can work with the attorneys or whatnot to actually provide rigor around the process of ensuring that it can't be re-identified.

Jake Ottenwaelder (40:16):
Yeah, and it puts me, as a privacy engineer, in an interesting position, because - I'll self-proclaim that I take more of a purist view of anonymized data from GDPR. In a lot of cases, I don't think fully anonymized data can exist because, like I said, it's an ongoing, living process - it

(40:38):
might be anonymized right now, but it might not be anonymized a couple months from now. And so, as a privacy engineer, if somebody else is willing to accept that risk, sometimes I need to take a step back and say, "Okay, this organization is willing to say that this is anonymous." Even if I'm going to do my best to protect it as well as I can, and even if I don't think it

(41:00):
reaches anonymization standards, you have to sometimes allow other people to accept that risk for you. I do hope in the future that we get to a point where, like you said, we have a standard around what an epsilon value could be. Or look at PCI compliance in the security world - it was a very largely industry-driven standard that

(41:22):
was developed, and maybe... I think privacy is following a lot of the trends that security has been maturing through, and maybe that's something in the cards that we're hopefully going to see pretty soon.

Debra J Farber (41:35):
Thank you, that's really helpful.
Thanks for sharing your perspective on that. Let's now turn our attention to Consent Management. What are some of the current blockers to implementing Consent Management capabilities in organizations?

Jake Ottenwaelder (41:47):
I think a lot of privacy engineers, or people who are trying to implement privacy within the organization, would say the main blocker around consent management is just this need for everyone to collect and consume data. We want to understand what people are doing, understand

(42:08):
their experiences on our website, understand all the tracking and things going on; and, with consent management, we have to take a step back and look at what the purpose behind our platform is, what we should be doing as part of our functionality, and what we should make sure that customers might

(42:31):
not be expecting that we're doing, and ask them if that's all right. So, I think the major blocker, again, is the shift in mindset around being data stewards: understanding that data has a risk, being willing to let people make decisions on their own, and sometimes taking the cost from a business perspective to continue to build that trust.

Debra J Farber (42:53):
Got it.
That makes sense.
So what are some holistic approaches that you would suggest to implement consent management platforms and other features that enable compliance while supporting marketing and advertising goals?

Jake Ottenwaelder (43:05):
I think, from a holistic perspective, I look at, again, meeting the marketing and advertising teams where they are: understanding what their flows are, understanding how I can integrate my privacy checks, how I can enable them to be better data stewards, and understanding when I need to get

(43:28):
tapped on the shoulder to answer a question - just building that relationship. So, focusing on where those marketing teams are at to support the adoption of better advertising technologies; or understanding where we need redundancy, or if we need multiple platforms, and how that makes it challenging to look at

(43:51):
data flows. And I think, again, understanding that consent is very widespread - it should propagate through a lot of the different data flows and their connections that the organization works on. So, working with data teams to make sure that's a key functionality, a key centerpiece in how they look at data and

(44:15):
what information they're allowed to process.
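
As a sketch of what it can mean for consent to propagate through data flows - hypothetical names, assuming a purpose-based consent model that the episode doesn't spell out - each downstream pipeline checks the subject's consent record for its specific purpose before processing:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    # Purposes the user has affirmatively consented to, e.g. {"analytics", "ads"}
    granted_purposes: set[str] = field(default_factory=set)

def process_event(event: dict, consent: ConsentRecord, purpose: str) -> dict | None:
    """Drop or forward an event based on the subject's consent for this purpose.

    Every downstream flow (analytics, ads, email) calls this gate, so a consent
    change propagates everywhere the data travels instead of living only in
    the cookie banner.
    """
    if purpose not in consent.granted_purposes:
        return None  # no consent, no processing for this purpose
    return event

consent = ConsentRecord(granted_purposes={"analytics"})
print(process_event({"page": "/home"}, consent, "analytics"))  # forwarded
print(process_event({"page": "/home"}, consent, "ads"))        # None: blocked
```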

Debra J Farber (44:18):
Makes sense.
And then, last but not least for today, we'll talk about some of the current blockers to implementing rights management capabilities through DSAR processing in organizations. For instance, processing data access and deletion - not just the request, but then the delivery of them, right?

Jake Ottenwaelder (44:36):
Yeah, this is an area that's probably one of the biggest challenges that a lot of organizations have, and maybe they don't know it, because data is, again, a very living thing within your organization: you could delete it in one spot and then it might repopulate there, or might trickle down from another location.

(44:57):
So, understanding where your data is flowing, and the data lineage, has been a really interesting area that I think a lot of data analytics platforms are focused on now - data lineage and customer trajectory. We have to look at the same thing with privacy: where is the data flowing through if we delete the data in this one location?

(45:19):
Is this the starting point of the flow, or is this near the end? How can we continue to go through those flows? And the biggest area that I think is still a challenge for a lot of functions and vendors is being able to audit these data deletions as well.

(45:41):
How can we prove that we've done something that we've said we've done, and how much proof are we able to provide as well? I think those are all pretty big challenges. And then you get into legal concerns around deleting data, and having to have those conversations and back and forth around, "Do we really need all of this information? Are

(46:05):
there ways that we can do know-your-customer (KYC) with financial data, with less information? How can we separate out our data flows so that we have the legally required data in one cold storage, and all of the other data that we're maintaining in more of an active state that can be deleted

(46:26):
more easily?" A lot of this takes a lot of thought and forethought and conversations within an organization. Without that, you're going to implement a tool that is just going to, again, have a lack of adoption and not fully cover the organization. In my opinion, it's a lot worse to say we're doing this and

(46:49):
find out that you're not doing it, versus saying we're consistently building to improve our program.
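
Here is one minimal way to picture auditable deletion - a sketch with hypothetical store names, not a vendor feature Jake describes. Every deletion writes a hash-chained log entry, so you can later show what was deleted, where, and when, and the log itself is tamper-evident:

```python
import hashlib
import json
import time

audit_log: list[dict] = []  # in practice this would be append-only storage

def delete_with_audit(store: dict, store_name: str, subject_id: str) -> None:
    """Delete a subject's record from one store and log verifiable evidence."""
    removed = store.pop(subject_id, None)
    entry = {
        "store": store_name,
        "subject_id": subject_id,
        "found": removed is not None,
        "timestamp": time.time(),
        # Hash chain: each entry commits to the previous one, making the
        # log tamper-evident when we later have to prove the deletion.
        "prev_hash": audit_log[-1]["hash"] if audit_log else "genesis",
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    audit_log.append(entry)

crm = {"u-1": {"email": "jane@example.com"}}
delete_with_audit(crm, "crm", "u-1")
delete_with_audit(crm, "analytics", "u-1")  # logs found=False: a lineage gap?
print(json.dumps(audit_log, indent=2))
```

A `found=False` entry doubles as a signal that the data map and the real systems disagree, which is Jake's data-lineage point in miniature.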

Debra J Farber (46:56):
Yeah, and so are you seeing that, as companies are adopting tools - data discovery, mapping, and DSAR delivery tools and such - they're implementing the tools without, maybe, processes first defined as to how those tools will be used? Or is it more... what are some holistic approaches to implementing a streamlined, compliant DSAR process that has minimal

(47:19):
disruption to the business?

Jake Ottenwaelder (47:22):
Yeah, it starts with, again,
understanding the processes of the business and understanding where the data is going. A lot of organizations that I've seen are going to go to the bells-and-whistles companies in privacy technology, and they're going to go ahead and purchase the best tool, but then they're going to sit there and not know what to do with it, or they're

(47:44):
going to connect it to a couple of systems and think they're done.

Debra J Farber (47:47):
Right, right.

Jake Ottenwaelder (47:49):
It takes somebody who's done it before and has the mindset and understanding that you have to look at every system. Hopefully, IT has a list of systems. If not, you have to build a system inventory, and then you have to go through and understand: how are people logging into the system? Because if people are logging in through Okta or some other

(48:11):
service - an SSO service - then you have to make sure that you're deleting there. Or do you have to delete accounts on the application itself? It's about who's accessing the system and what they're doing. Is it a personal account or is it a business account? For instance, if you're a recruiter for a company and somebody says, delete my data, that recruiter should

(48:33):
technically go into their LinkedIn profile, because they were acting as an agent of the company, and delete their messages with that recruit, if that person no longer wants to have their data collected by the organization. That's a massive edge case that I believe should be part of a DSAR process.
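
To sketch the system-by-system walkthrough Jake describes - hypothetical inventory entries, and the LinkedIn case has no API, matching his point, so it falls out as a manual task:

```python
# Hypothetical system inventory: each entry records how identities are
# managed, so the DSAR runbook knows where deletion must actually happen.
SYSTEMS = [
    {"name": "webapp",   "auth": "okta",   "delete": lambda uid: print(f"SSO deprovision {uid}")},
    {"name": "billing",  "auth": "local",  "delete": lambda uid: print(f"delete local account {uid}")},
    {"name": "linkedin", "auth": "manual", "delete": None},  # no API: manual follow-up
]

def run_deletion(user_id: str) -> list[str]:
    """Walk every system in the inventory; collect manual follow-ups."""
    manual_tasks = []
    for system in SYSTEMS:
        handler = system["delete"]
        if handler is None:
            manual_tasks.append(f"{system['name']}: manual deletion for {user_id}")
        else:
            handler(user_id)
    return manual_tasks

print(run_deletion("u-12345"))
```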

Debra J Farber (48:52):
It makes sense.
I mean really, LinkedIn just becomes an extension of your CRM.

Jake Ottenwaelder (48:56):
Yeah, exactly. There's no API that's going to do that, and you can't force... there's no way to do that without it being a manual process. Really, I think DSARs are a stopgap, and this is going to be my hot take for the episode: I think DSARs are not the best solution when it comes to

(49:19):
deleting or trying to worry about data. I think we should take more of an approach connected to marketing emails or email addresses, where, if somebody doesn't respond or interact with your platform over a certain period of time, that data should just automatically be deleted.
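
A minimal sketch of that inactivity-based approach - hypothetical fields and a made-up retention window; a real policy would come from legal and the data map, not from anything stated in the episode. A scheduled job deletes records whose owners haven't interacted within the window, so stale data disappears by default rather than waiting on a DSAR:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=730)  # hypothetical policy: 2 years of inactivity

users = [
    {"id": "u-1", "last_active": datetime(2021, 1, 15)},
    {"id": "u-2", "last_active": datetime(2024, 2, 1)},
]

def sweep_inactive(users: list[dict], now: datetime) -> list[dict]:
    """Keep only users active within the retention window; delete the rest."""
    kept, deleted = [], []
    for user in users:
        (kept if now - user["last_active"] <= RETENTION else deleted).append(user)
    for user in deleted:
        print(f"deleting {user['id']} (inactive since {user['last_active']:%Y-%m-%d})")
    return kept

users = sweep_inactive(users, now=datetime(2024, 3, 5))
```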

Debra J Farber (49:36):
Yeah.
Data minimization, and just the principles of keeping data accurate and up to date, and all of those things.
Yeah, absolutely.

Jake Ottenwaelder (49:44):
Yeah, because I think a lot of DSAR processes are just being utilized by the people who know that they exist. I would tend to say that that would be people who are more highly educated about privacy in general. So, we have to look at: is it a discriminatory practice against people who might not be as well educated?

(50:05):
I, as somebody working in privacy, believe that privacy should be a human right for everybody. How can we make DSARs a process that everybody's aware of, that everybody can participate in, even if they don't have the technical background or understanding to be able to do that?

Debra J Farber (50:23):
I think that's really great advice.
Do you have any words of wisdom to leave the audience with today?

Jake Ottenwaelder (50:29):
Yeah, hopefully my voice wasn't too annoying to the rest of the audience.

Debra J Farber (50:34):
You have a great microphone you're using.
It's been wonderful.

Jake Ottenwaelder (50:37):
Thank you.
I really appreciate the time, Debra, and you having invited me onto the podcast. The last word that I would like to say, or just my final thought, would be around this concept of being a servant leader. I think in privacy, we need to understand that we're providing a service to the rest of the organization, and we are leading

(50:59):
by example, and we are serving others through our leadership and through our advocacy. Just understanding that maybe you're not going to see the rewards every single day. Take time for yourself. Make sure that you understand that you are hopefully fighting the just fight, but really being a servant leader and uplifting others to enable them to do privacy on your behalf.

(51:23):
That's how we leave the best legacy and continue to grow adoption and education of privacy.

Debra J Farber (51:31):
That is great advice.
Thank you, Jake, for your servant leadership here and for sharing your wisdom with the larger audience. Jake, thank you so much for joining us today on The Shifting Privacy Left podcast. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest or guests. Thanks for joining us this week on Shifting Privacy Left.

(51:53):
Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of this

(52:14):
show. To learn more, go to privado.ai.
Be sure to tune in next Tuesdayfor a new episode.
Bye for now.