All Episodes

January 30, 2024 51 mins

My guest this week is Jay Averitt, Senior Privacy Product Manager and Privacy Engineer at Microsoft, where he transitioned his career from Technology Attorney to Privacy Counsel, and most recently to Privacy Engineer.

In this episode, we hear from Jay about: his professional path from a degree in Management Information Systems to Privacy Engineer; how Twitter and Microsoft each set up their privacy programs, and how to determine privacy program maturity; several of his Privacy Engineering community projects; and tips on how to spread privacy awareness and stay active within the industry.


Topics Covered:

  • Jay’s unique professional journey from Attorney to Privacy Engineer
  • Jay’s big mindset shift from serving as Privacy Counsel to Privacy Engineer, from a day-to-day and internal perspective
  • Why constant learning is essential in the field of privacy engineering, requiring us to keep up with ever-changing laws, standards, and technologies
  • Jay’s comparison of what it's like to work for Twitter vs. Microsoft when it comes to how each company focuses on privacy and data protection 
  • Two ways to determine Privacy Program Maturity, according to Jay
  • How engineering-focused organizations can unify around a corporate privacy strategy and how privacy pros can connect to people beyond their siloed teams
  • Why building and maintaining relationships is the key for privacy engineers to be seen as enablers instead of blockers 
  • A detailed look at the 'Technical Privacy Review' process
  • A peek into Privacy Quest’s gamified privacy engineering platform and the month-long DPD'24 Festival Village puzzles and events that Jay & Debra are leading
  • Debra's and Jay's experiences at USENIX PEPR'23; why it provided so much value for them both; and why you should consider attending PEPR'24
  • Ways to utilize online Slack communities, LinkedIn, and other tools to stay active in the privacy engineering world




Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

TRU Staffing Partners
Top privacy talent - when you need it, where you need it.

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jay Averitt (00:00):
I think that's where the biggest challenge is
for companies - having privacy, having that voice, and having
that voice early.
Instead of, "Hey, we're about to release this, does this look
okay?"
If you're in that situation where someone just says, "Hey,
we're about to release this, does this look okay?"
I mean it's going to be nearly impossible for privacy to make

(00:21):
any kind of dent.
You might be able to be like, "Hey, there's this one thing
you should do."
But, you can't really hold up a release.
But, if you're embedded early in the design process, that's
where a company really shows maturity from a privacy

standpoint (00:37):
how early is privacy being consulted?
And then, on top of that, is privacy actually being listened
to as part of the process?

Debra J Farber (00:48):
Hello, I am Debra J Farber.
Welcome to The Shifting Privacy Left Podcast, where we talk
about embedding privacy by design and default into the
engineering function to prevent privacy harms to humans and to
prevent dystopia.
Each week, we'll bring you unique discussions with global
privacy technologists and innovators working at the

(01:09):
bleeding edge of privacy research and emerging
technologies, standards, business models and ecosystems.
Welcome everyone to Shifting Privacy Left.
I'm your host and resident privacy guru, Debra J Farber.
Today, I'm delighted to welcome my next guest, Jay Averitt,
Senior Privacy Product Manager and Privacy Engineer at

(01:32):
Microsoft.
Jay transitioned his career from a technology attorney to
Privacy Counsel and, most recently, to Privacy Engineer.
Jay is an active member of the privacy engineering community.
He serves as an evangelist in this burgeoning field.
I'm delighted to interview him today on this Data Privacy Day

(01:53):
episode.
This episode is actually going to come out about two days after
Data Privacy Day, which is January 28th every year, so
we're going to chat about his career path, his experiences in
privacy engineering roles at companies like Twitter and
Microsoft, some of the community projects that he's working on;
and, of course, ways to spread privacy awareness.

(02:15):
Welcome, Jay.

Jay Averitt (02:16):
Hi Debra! Thanks so much for having me.
I've been a big fan of yours and this podcast for a long time,
so it's really an honor to be on the podcast.
So, thanks again for having me.

Debra J Farber (02:27):
Oh, absolutely.
Well, thanks for sharing your insights.
I'm really excited for this discussion about privacy
engineering as we see it shaping up in the industry.
Why don't we kick off our discussion with you telling us a
little bit about your origin story?
I have some follow-up questions I'm really eager to ask you

(02:47):
about, but I feel like it's going to set the tone for the
rest of the conversation.

Jay Averitt (02:51):
Yeah, sure.
I grew up just being really infatuated with technology from
an early age.
It was funny.
Just the other day, my wife was looking at a video of all of us
talking about what we wanted for Christmas as children, and I
was in the ninth grade at the time, and she came back to me
and she said, "What exactly did you ask for for Christmas?"

(03:14):
And, I said that was a 14.4 baud modem.
And she just laughed and she's like, "God, you were such a dork
then," and you know that kind of sets the tone of how I was as a
child and just really infatuated.
I grew up thinking, you know, "Hey, I'm going to maybe do
something with computers."
But I also was really interested in the law by reading

(03:35):
books.
The John Grisham books really got me interested in law and my
dad had a... he wasn't a lawyer, but he was a risk manager, so he had a lot
of workings with lawyers and I got a lot of contact with
lawyers and so I was also interested in that.
When I got to college, I was trying to decide if I was going
to major in something technology-related or go the

(03:56):
traditional political science route and then go to law school.
Luckily, my dad really encouraged me to major in
something technology-related and so I majored in Management
Information Systems, graduated college with a Management
Information Systems degree.
It was during the boom in 2000 when lots of people were
looking for programmers.
So, I ended up getting a job at NCR as a C++ programmer and I

(04:22):
started seeing a lot of the programmer jobs being shifted
off to India and overseas.
During my first couple of years there, I got a little nervous.
I was like, "Whoa, maybe there won't be software engineers in
the future."
Well, you know, crystal ball.
Now I can see that that was completely wrong.
But, I was like, "Maybe this is the time to pivot and go to law
school."
So I went to law school and in law school looked for ways to

(04:49):
use my law degree with technology and tried to think
about how I could do that, and started out doing software
license agreements, which was great.
But, at the end of the day, when you're working on a
software license agreement, you could be working on a contract
to sell a house or sell a car or any of that.
So, it wasn't really playing up to my tech skill set.

(05:12):
But in 2010, 2012, I started seeing cloud really come into
play in what I was doing from a software licensing experience
and I got really curious about the privacy impact.
In the early 2000s, everything was being installed on computers
themselves or servers, but not being installed out in the cloud

(05:36):
somewhere else, and so I started thinking about privacy.
Back then, got my CIPP certification way back in 2012,
when it was all pencil and paper, and it just really planted a seed.
When I was primarily a software license attorney, I really
looked for ways to do privacy stuff.
Every time I had an opportunity to do privacy stuff I just

(05:59):
really loved it.
So, that kind of led me down the path of, "Hey, I'm not
really loving this practicing law and looking at contracts all
day.
What can I do that is privacy related, but not doing that?"
And until the GDPR hit, there really wasn't an avenue for

(06:20):
privacy engineers.
I mean, there were people obviously working in privacy
from the tech side, but HIPAA and GLBA, things like that, were
just not big enough to support a lot of folks working on the
technology side of privacy.
But when GDPR hit, that sort of changed, and I thought
consulting was kind of a way to toe my way into those waters and

(06:44):
so I did some consulting for a couple of places, and PwC was
the biggest place.
I spent some time doing consulting and I really enjoyed
that work, getting to see lots of different things.
But, I wanted to really focus on the tech.
That's just where my love was and working specifically on
privacy tech projects for tech companies, and so when I got an

(07:08):
opportunity to do that, I did and made the transition to
working as a privacy engineer.
I've got to tell you, I've loved that transition and my days are
much happier looking and working on privacy tech projects
than looking at contracts all day.

Debra J Farber (07:25):
Oh my God, do I hear you on that.
Like me, you shifted your career left, addressing privacy
earlier on, from a focus on law to engineering.
I went to law school, too.
It's not a common story, but I definitely feel a kinship with
you about that because I know the hard road of becoming a

(07:47):
lawyer and what it takes to reframe your legal and GRC
mindset towards privacy and data protection to an engineering
mindset - building an architectural design and
actually implementing privacy by design into the tech stacks and
systems and people, processes and technology.

(08:08):
What was your journey like from privacy lawyer to privacy
engineer in terms of mind shift?

Jay Averitt (08:14):
Yeah, I think it is as you correctly indicated.
It is a big shift in the way you are thinking, because so
much of being an attorney is really looking at the risk of...
and I'm not saying you're a traffic cop as an attorney, but
you end up saying 'no' quite a bit and figuring out...

(08:38):
obviously, good attorneys try to figure out ways around saying
'no,' but you're not focused in on the technology innovation
happening.
That's really not what you're doing.
You're really looking at, "Okay, what is the risk to the
organization, how can I best protect the organization?"
And obviously you want the business to be successful, but
you're not looking and focusing in on the technology around it.

(09:03):
So, I think that that was a difficult shift and I really
just wanted to... I just did the work and ended
up spending 750 hours in a software engineering boot camp
so I could get up to speed on Python and JavaScript and what
was happening there; and,

(09:23):
while I don't really code much, it was just great to see what
architectures were out there from a tech stack and just get
excited about creating stuff again.
I think that's the real mind shift.
Yes, in my role I spend a lot of time trying to help engineers.
I say, "I really think what you're doing is super cool.

(09:45):
What can we do from the privacy side to make it a value add?"
So yeah, I think the whole mindset is a bit different from
the legal side and was refreshing for me to make that
change.

Debra J Farber (09:59):
Yeah, yeah, I think you made that change at
the perfect time.
As you were saying, it was right after GDPR went into
effect and there was a real need to technically address a lot of
data protection requirements, and you just couldn't do that
without privacy-knowledgeable engineers.
For me, one of the challenges that I've had my whole career is

(10:21):
that I could see where privacy and data protection was going to
go.
I could see where it needed to go and it felt like the world
was slow-walking its way to shifting left.
It was very frustrating for me.
For me, it was so clear just because I've been in privacy for 18

(10:41):
years.
I went and I got my Certified Scrum Product Owner
certification because I truly believed that you needed to embed
privacy into the product development process; and I'm
like, "This is where it's going to be." And I did this back
almost a decade ago at this point, maybe longer, before they
had roles for privacy product

(11:02):
managers, and I just gave up
on trying to make that happen when there was no political will
in companies to hire for that.
So, I've had to kind of create different career path entry
points for what interests me.
Just to give you a little perspective of my challenge - I
didn't have the information management systems background.

(11:22):
I was an English major with a Business minor and then went to
law school.
Right?
So, I have had to self-learn a lot.
But, the process of learning is what is, I think, such a big
draw for me.
What do you think around that?
I've seen your posts on LinkedIn about constantly
learning and that that's a requirement really for privacy

(11:43):
engineering.
Do you want to expand upon that a little bit and add some of
your thoughts about it?

Jay Averitt (11:50):
Yeah, I mean, I think you're right.
I mean, I think a couple of things you just said actually
rang really true with me, particularly the slow-walking
of privacy to where it is.
And, we're still slow-walking at this point, I think, because
in my mind, privacy and security should be fairly close.
Your organization should look at privacy almost the same as it

(12:13):
does security, because they're equally as important.
But, at this point everyone knows security is super
important.
They know privacy is somewhat important, but I don't think
it's really gotten there...
it doesn't ring true to companies exactly.
It doesn't have a one-to-one correlation.
I think we still have a ways to go and I think we will get
(12:33):
there, but we're still kind of slow-walking.
Going back to the learning part of that, yeah.
I think, to be in this field, you have to want to learn
because, not only are privacy laws changing all the time, and
there may be different regulations coming out.
Quebec just had a new privacy law come out and I had

(12:56):
to look at things that I was doing from an advertising
standpoint that I hadn't looked at before, so you might have
something like that come out.
But also, the tech is just constantly changing.
I mean, AI has always been around, but these LLMs being
incorporated into absolutely everything didn't exist until

(13:18):
the last year or so.
Then, looking at what that means from a privacy standpoint is
something that is really new and novel and folks are scrambling
trying to figure it out.
So, you have to want to learn and you have to really enjoy
that learning aspect.
I mean for me, that's how I knew that I was making the
right shift out of law into more of a tech-focused role was

(13:43):
during that bootcamp I was part of.
I was working on a project and I was sitting there looking at
code and working on my code and, before I knew it, something like
six or seven hours had gone by, and I can tell you I never
looked at a contract and lost track of time.
So, I think you have to be fascinated with technology and

(14:04):
be able to just appreciate learning new things to really
excel in privacy.
And, I think that's one of the best things about privacy is
that it's sort of a haven for those that really like the
lifelong learning aspect.

Debra J Farber (14:20):
I 100% agree.
That, for me, is the thing.
I never get bored.
I have ADHD, I'm neurodiverse.
I found an area where I will never get bored and, as you
mentioned, there's brand new technology all the time.
Maybe it's XR.
Maybe it's cloud.
Maybe it's AI.
What's great is that now that we have privacy engineers that

(14:40):
exist, and a lot of them in the big tech firms that have the
ability to scale and bring technology to market rather well,
they're going to need to figure out, "How do you strategically
do this?"
I feel like privacy engineering is now more about the strategy
of architecting it and designing for it before you're ever even

(15:00):
going to the software development lifecycle.
There are so many other engineering aspects to think
about that you could actually strategize and be part of the
well-architected way of deploying these technologies for
privacy, data protection, security, trust, safety - all of
the tangential things that you care about in addition to
privacy and data protection.

(15:21):
So, for me, I just love that it's shifting from a compliance
and risk - well, there's still risk - but from a compliance
standpoint and legal standpoint too, to more thinking about how
we could strategically work with teams to bring this to life and
be part of the innovation development process.
Not the people who always say 'no' and make it harder, but the

(15:43):
people who enable you to now unlock the value of your data,
maybe in data science, and be able to share data or use data
and train on data where previously you couldn't, because
you're deploying privacy-enhancing technologies or you've
got a new architecture that has the privacy constraints built
into it so that misuse of data can't really happen, and things

(16:06):
along those lines.
So, I'm just loving the fact that we could really be part of
the value proposition.
I mean, again, 18 years in this space.
It's taken a long time for me to finally feel
like I've got the roles - and I work for myself in the
consulting and stuff - the roles that feed my happiness.

(16:26):
Yeah, so speaking about 'feeding your happiness,' can you
compare for us your experiences working at Twitter versus
Microsoft?
Obviously, I don't mean good or bad, like 'spill the tea'.
I mean that would be nice, but that's not what I'm asking for.
More of, just compare the ways that privacy is set up, and

(16:47):
maybe some of the goals of the company might be different.
I know Microsoft has a really distributed privacy
responsibility based on each team, each product line, along
those lines.

Jay Averitt (16:58):
Yeah, it's funny.
I think Twitter - working at Twitter - is probably about what
you think working at Twitter is from the outside world.

[Debra (17:07):
Pre-Musk or post-Musk?
] Yeah, I was going to say in the pre-Musk era.
I can't really speak to what it's like to work at X.
But, in the pre-Musk era, Twitter was this app that
happened by accident.
The founders of Twitter didn't know exactly what it was going
to be and it turned into this giant app that everyone was

(17:29):
using and became extremely popular.
So, with that, they didn't really know what they were doing
from a privacy standpoint or weren't even thinking about it
at the time.
So, they had a lot of stuff happen.
The FTC came in, made some consent orders; and so, they
really put, I think, a good team in place to make their privacy

(17:53):
program more mature.
I think they had Lea Kissner in there, who's just amazing in
privacy, probably the most prominent privacy engineer, or
one of the top two or three, if not the most prominent; and the
whole team there at Twitter was really great and just a bunch of
super smart folks.

(18:14):
But, the program itself was really sort of...
the framework was there to really get things done.
I mean, even building a framework is difficult.
I've seen companies, especially startups, that just don't
have a framework, and until you have that framework and program
in place and have convinced the org to actually shift

(18:36):
privacy left and have engineers and privacy engineers involved
in the design process, it's tough to put that program in.
Just even knowing where your data is and all that is tough.
So, Twitter had all of that foundation in place.
It was really kind of trying to get buy-in from engineers to
follow the framework that was in place.

(18:58):
None of the processes were formalized to the degree that
they are at Microsoft, for example, but they had the right
framework in place.
As far as the culture at Twitter, I mean, like I said, it's
about what you expect.
It felt like a very young organization even then.

(19:18):
Twitter had been around for a while, but it still kind of had
that startup feel to it.
And then, the transition to Microsoft.
I guess, until I got there, I really hadn't seen what a mature
privacy org looked like.

(19:39):
I think there's probably a couple of other companies out
there that have super mature privacy orgs.
I know Google has a pretty mature privacy org and some
others do as well, but I hadn't been on the inside to see
anything like Microsoft before.
There already was this buy-in from engineers, buy-in from
pretty much the org in general that, "Hey, look, we're really

(20:02):
going to shift privacy left."
We're going to consider privacy early on in the process, and
I've got a team of folks that really get privacy, and I'd say,
you know, privacy is extremely important.
I mean I can't say that every time that there's a privacy

(20:22):
problem that is a thousand percent resolved to my
satisfaction, but I will say that if I say something from a
privacy standpoint, I'll definitely get looked at and
it'll be considered and it's really valued, which I think
is where the biggest challenge is for companies:
having privacy, having that voice, and having that voice

(20:43):
early.
Instead of, "Hey, we're about to release this, does this look
OK?"
If you're in that situation where someone just says, "Hey,
we're about to release this, does this look OK?"
I mean it's going to be nearly impossible for privacy to make
any kind of dent.
You might be able to be like, "Hey, there's this one thing you
should do," but I mean you can't really hold up a release.

(21:05):
But if you're embedded early in the design process - I mean,
there's a thousand things; like, for example, Microsoft will
have a private preview or a public preview before it ever
gets to some kind of release, and they're meeting with me before
they even are testing this out on Microsoft employees.

(21:25):
So, I think that's where a company really shows maturity
from a privacy standpoint: how early is privacy being
consulted, and then, on top of that, is privacy actually being
listened to as part of the process?

Debra J Farber (21:42):
That's pretty awesome for a large company.
I've not had a positive experience in a large company
like that, where they have truly understood their mandate for
privacy and data protection and staffed the organization
accordingly.
You know, I am envious of your experience there.
I wish that upon everyone, though.
I think that that is how it really should be.

(22:05):
Right?
I mean, we need to get to thatpoint.
I do wonder, though, even where I have seen privacy
responsibilities, no matter whether it's in engineering or
risk or whatnot, in a large engineering-focused
organization - like, for instance, I'm pulling from my Amazon
experience - one of the challenges I had was I might

(22:27):
have an understanding of my product offering or unit that
I'm in - for me that was AWS or Prime
Video - but there was just too much going on across the
organization and it was just too distributed a responsibility
that I really had no idea what was going on outside of our
individual business unit.
And, I wonder to what extent do you have suggestions, even in

(22:51):
the spirit of Privacy Awareness Day, for how organizations,
especially engineering-focused ones, can better message,
"Here's how we can unify as an organization."
You know what I mean?
So, have more of a unified kind of perspective.

Jay Averitt (23:09):
I think you bring up a good point.
I mean, you know, there's 220,000 employees at Microsoft,
so me getting to know all of them is impossible.

Debra J Farber (23:18):
There's over a million at Amazon, over a
million.
I worked at IBM and I said I'll never work for a company this
large again, and somehow I ended up at a million-person company.

Jay Averitt (23:29):
Right.
So it's impossible.
You can't know everybody, and you can't even know...
I feel like I know a decent amount of folks in privacy, but
I don't know everybody.
But, I do think that there's a pretty good unity; for
example, I'm on the Office 365 side, so we're looking at things
like Exchange and OneDrive and SharePoint and stuff like that.

(23:55):
There are other teams that are working on things like Word and
Excel and all of that, and I know who those folks are and
there's a lot of overlap between that; for example, if I see
something that I'm working on that's impacting their space, I
certainly reach out to them and say, "Hey, I think this might

(24:17):
look okay from a privacy standpoint, but what do you
think?"
And then vice versa. And then there are things like, the other
day somebody, ironically enough, was asking me about...
there's apparently a OneDrive podcast, and I support OneDrive,
and they wanted to do a giveaway for people who left reviews and
stuff like that.
And they're like, "Hey, can you look at this from a privacy

(24:39):
standpoint?"
And I was like, well, this is just different, because usually
I'm looking at things, consulting with engineers about
their designs and stuff, and I was like, I can certainly give
you my thoughts on this from a privacy standpoint, but we
probably need to involve some people from our marketing
privacy team to look at this, because I may not know all the
ins and outs of giveaways, and there's probably some legal
ramifications of that, so maybe we should look at that.
I think it's impossible to know an entire...
if you're at a giant company like these big tech companies,
you can't know everybody.
Privacy is not going to be giant, even at these big tech

(25:23):
companies, so you can at least get to know a few folks across
each division, and then when stuff pops up, you can say, "Oh,
who's there?"
And I guess I've also got the good fortune of having a really
senior team, and so if I have something that pops up and I'm like,
"Hey, who's in marketing privacy?"

(25:44):
They know."
So that's lucky.
But, if you don't have that team, I think making those
introductions outside of your matrix into other teams is
important because, I think, yeah, unity in privacy is great and
building those relationships is great.
Even, I guess, just from my post on LinkedIn, I had somebody

(26:07):
from LinkedIn's Privacy team reach out to me over our Teams,
since Microsoft owns LinkedIn.
I introduced myself, and so I don't have a lot of interaction
with LinkedIn Privacy, but I was like, "Oh, that's cool that
I'm getting to talk with somebody from LinkedIn Privacy."
So I think the more interaction you can have, the better, and I'm

(26:31):
really big on just having a community of privacy because
ultimately, like we're talking about with the whole learning aspect
of this, nobody knows everything and there are plenty of blind
spots I've got, and I need a bunch of people to bounce
things off of, and I think we all do.
I think the more you can do to reach out outside of your

(26:51):
specific team to build those relationships, the better.

Debra J Farber (26:55):
Yeah, that's awesome.
Thank you for that.
We talked a little bit earlier about how privacy
engineers can leverage their experience and role to enable
privacy design, research, architecture development and
data science without being seen as a blocker to the business.
You know, we talked about that; it is a challenge, but how can
privacy engineers actually be an enabler rather than a blocker?

(27:16):
What are some tips that you've kind of picked up in the trade?

Jay Averitt (27:21):
Yeah, I mean I think it's all about
relationships.
I mean I think that when I'm talking with engineers,
especially new engineers that are going through a privacy
review, a lot of them are almost nervous because they don't know
what to expect.
They don't know what they're looking at, and I try to build a
relationship with engineers and say, "Hey, look, we're on the

(27:43):
same team.
I love innovation and, you know, let me see what this cool
thing is you're doing and let's just look at maybe how we can
make sure that privacy is being [inaudible]."
I mean, I may look at it and be like, "Hey, you know you did
it just exactly the way I would have done it and you know
there's no need for me to really inject any further privacy into

(28:05):
it.
But I mean, if there is a need, it's usually like, "Hey, did
you consider this?"
And a lot of times it wasn't intentional that they made it
not focused on privacy.
It's maybe they just didn't think about it from that
perspective.
So I think the first tip I would say is
really build those relationships with the engineers and, you

(28:28):
know, try to show them that we're all on the same side and
how privacy can be a value-add to the organization.
I think the second tip is nobody likes a traffic cop, so
just being a blocker and saying 'no' is not going to make you
win any friends or make you valuable to your organization.
How you can be valuable to your organization is to say, "Hey look,

(28:51):
this is a way we can make privacy more friendly, make our
users love our product more and, ultimately, if the user loves
our product more and trusts our product more because it has the
best privacy features, that makes it more innovative.
It makes it, I mean, it serves our overall goal of, you know,
increasing the bottom line."
So I think that's the thing I would suggest.

Debra J Farber (29:11):
Yeah.
That makes a lot of sense.
Appreciate it.
Let's turn now to some of the work you do.
What is a technical privacy review?
I know you're always such a good sharer of information to
educate others.
You're an evangelist of privacy engineering, like myself, so I
really appreciate that.
You're constantly posting your thoughts, and I did see that you

(29:32):
posted recently about people asking you what a
technical privacy review is, and then you answered it, so I figured
this is a good spot to ask that of you again.

[Jay (29:39):
Yeah, sure] You have an answer all ready.

Jay Averitt (29:43):
Yeah, I think a technical privacy review is
something that I didn't know a whole lot about until I actually
started doing it, so it's interesting.
I mean, a lot of it is being concerned about the data flow.
You know, specifically, the engineers will bring something
to me and the first questions I'm asking really are around,

(30:04):
"Hey, what new data are we collecting as part of this
feature?"
And then, once we find out what that new data is, figure out
what the classification of that data is.
So, basically, each organization has different
levels of sensitivity of data; but, based on how sensitive
that data is, and specifically if it falls into the categories

(30:28):
of where the GDPR would require you to honor a DSAR [a DSAR is
where a data subject or an individual makes a request to
have their information deleted], we want to make sure
that we're only retaining that data for a certain period of
time, because we want to be able to honor those requests and
things like that.

So, it's really looking at (30:47):
1) the data being collected - how
sensitive is the data being collected - and then 2) how does
the data flow?
So, looking at data flow diagrams to see how the data
flows from the front-end application to the back end, so
really looking and seeing, "Okay, how long is it staying
in that database?

(31:07):
For how long?"
And, you know, just looking and making sure that everything is
coinciding with our data retention schedules and all that.
So, that's a large part of it.
And then, after we perform that review, making recommendations
of, "Hey, maybe we shouldn't be...
You know, you are retaining it for this period of time.

(31:29):
Why do you need to retain it for that period of time?"
And then, after you ask that question, if they say,
"Okay, well, you do need to retain it for that long," then
maybe there is some way we can de-identify the data.
So, really looking at all of that; and then it's
not just me - we also have compliance folks and
attorneys on the phone that will really analyze things from a

(31:52):
GDPR standpoint.
And so, with that, I kind of see my role as more of an
interpreter for the attorneys, because looking at these
data flow diagrams can get really complicated, specifically
when you're looking at all these repository names and all
the different types of data.
So, really being able to distill it for the attorneys: "Hey, look,

(32:15):
we're, you know, collecting this level of data, the sensitive
type, and, you know, we're retaining it for this long.
Is this gonna make sense from a GDPR standpoint, or do you think
there are some safeguards we should put in place?"
Or, you know, specifically, if we're looking at GDPR, looking
at the data flow to Europe - how stuff is being processed in
Europe, or whether or not it's being processed in America - and

(32:36):
figuring out if we can andcompliance also would look at
that and to see what we have inour contracts with our customers
, to see, hey, can we processthis and this geographical area,
and things like that.
So it's really, I would say tosum it all up, is really looking

(32:53):
at 1) the data being collected - what type of data is being
collected - and then 2) the flow of that data.
That's really the crux of a technical privacy review.
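One piece of the review Jay describes - checking each data element's retention period against its sensitivity classification - could be sketched as a small automated check. This is an illustrative sketch only: the classification labels, retention limits, and function names below are hypothetical, not Microsoft's or any company's actual policy.

```python
from dataclasses import dataclass

# Hypothetical retention ceilings (in days) per classification level.
# Real organizations define their own levels and limits.
RETENTION_LIMITS_DAYS = {
    "public": 3650,
    "internal": 1095,
    "confidential": 365,
    "highly_sensitive": 90,  # e.g., data subject to GDPR DSAR deletion
}

@dataclass
class DataElement:
    """One piece of data a feature collects, as captured in a data flow review."""
    name: str
    classification: str
    retention_days: int

def review_retention(elements):
    """Return (name, retained_days, limit_days) for every element whose
    retention exceeds the ceiling for its classification level."""
    findings = []
    for el in elements:
        limit = RETENTION_LIMITS_DAYS.get(el.classification)
        if limit is not None and el.retention_days > limit:
            findings.append((el.name, el.retention_days, limit))
    return findings

# Example: an email address classified as highly sensitive but kept 180 days
# exceeds the hypothetical 90-day ceiling and is flagged for discussion.
findings = review_retention([
    DataElement("email_address", "highly_sensitive", 180),
    DataElement("page_view_count", "internal", 400),
])
# -> [("email_address", 180, 90)]
```

A check like this only surfaces questions - as Jay notes, the follow-up ("Why do you need to retain it that long? Can we de-identify it?") still happens in conversation with engineers, compliance, and counsel.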

Debra J Farber (33:02):
Awesome, thank you.
That's definitely some insight into technical privacy reviews,
and, you know, there's not any one way to do it.
There are going to be different workflows, processes,
and whatnot for each company; but I think that's a great
high-level overview of the typical process there.
That makes sense.
So, like I said, Data Privacy Day, when this is published,

(33:26):
will have just happened two days prior.
What are some of the Data Privacy Day activities that
you're participating in this year, especially around privacy
engineering - but whatever you're involved in?

Jay Averitt (33:36):
So, I'm actually speaking at the Privacy Everywhere
Conference at the University of Illinois, to talk with Saima
Fancy about LLMs in the privacy world - figuring out and
talking about AI governance around those, doing a chat
around that.
And then, I think the day before, on the 25th, I'm
actually giving a Fireside Chat for Privacy Quest, which I think

(33:59):
you're involved with as well.

Debra J Farber (34:02):
Yeah, why don't you tell us a little more?
I kind of threw this softball because I wanted you to talk it
up.

Jay Averitt (34:08):
Yeah, I think the work that's being done by Mert
Çan is really interesting.
You know, security...

Debra J Farber (34:14):
Just in case anyone didn't hear, he's talking
about Privacy Quest.
If you don't mind, kind of give us a sense of what that is
before the analysis.

Jay Averitt (34:23):
So, Mert Çan kind of gamified privacy, and has
different games and simulations to show privacy - instead of
just being, "Hey look, here's a quiz on privacy," or "Here, let me
tell you about privacy,"
it actually makes it more of a game.
And, you know, security does this...

Debra J Farber (34:45):
...a Capture the Flag kind of competition.

Jay Averitt (34:49):
Right, exactly.
But, privacy doesn't.
Maybe some places do, but historically it hasn't been done very
well.
So, the work I think he's doing is really interesting, and he's
organized a Festival around that, where he's got different AI
teams working - one on one side and one on the other - and then
he's got a number of Fireside Chats.

(35:10):
I think you're hosting a Game Night.

Debra J Farber (35:12):
I'm doing a Quiz Night.
I'm doing three different quiz nights; but yeah, he's got all
of these events to gamify the learning experience for privacy
engineering.
I mean, he's got a whole platform.
That's what Privacy Quest does; but this is, you know, using
Data Privacy Day to create its separate set of events over one

(35:35):
month, so it's not just Data Privacy Day.
When you're listening to this at the end of January, this goes
on into mid-February as well.
There are all sorts of events with the two different factions -
the AI doomers versus the AI optimists - and you have to
pick one and you'll be part of a team, because everybody who's
joining in, you know, will be completing tasks and such

(35:58):
towards experience points for the team.
So then you get to play around with a storyline, too, that
walks you through the setting; and he's using AI-generated
images to go along with the narratives and the games, and
he's pulling it all together with different puzzles and

(36:18):
learning modules around privacy engineering that are super fun.
So, to anyone who's listening, tune in for Jay's Fireside Chat,
and definitely, you know, just check out the platform, too,
because it's perfect for privacy awareness any day of the year.
But then, of course, this event is kind of fun, too.

Jay Averitt (36:36):
And, yeah, I'm super interested in your quiz night,
to see how that goes.
I think what he's doing is super interesting, and I think
it is needed, because really we've not done a great job in
privacy of making it fun.
I mean, that's actually one of my goals: making a privacy
training that's actually fun to take, because I think security

(36:56):
does that, but I don't think privacy does it very well.

Debra J Farber (36:59):
Yeah, yeah, this is also something you'll see if
you ever go to DEF CON.
For those who are listening, I go every year.
My other half is a hacker.
This is a thing, you know, we do.
I've been at least six
times - somewhere between six and eight; I'm not sure, but I
think around six.
Anyway, these are the types of - they call them CTFs - Capture
the Flag kind of games, and it gamifies things because you get

(37:21):
red team versus blue team.
You could take different personas, so you can actually
say, "Today I'm gonna be a blue teamer and see what it's like to
play this game, thinking in terms of defending my systems,"
versus, "I'm on the red team; I'm trying to find the
vulnerabilities in it," shifting the mindset into how you
would think in terms of attacking.

(37:43):
That's traditional in security.
And so, for privacy, it's not gonna be so clear-cut - defend and
attack; but you're starting to see that in AI attacks on
privacy.
You could start seeing how, if you threat model for privacy and
identify what the potential privacy harms are, you could simulate it
very similarly to how security has done it. And this is a little

(38:05):
premature secret announcement, but we may be at DEF CON next
year.
I think this is our plan.
I'm a formal Advisor for Privacy Quest -
sorry, I should probably have stated that upfront - but we're
trying to kind of move in there.
So many engineers go there.
There's an opportunity to capture the interest of the
security engineers who have an overlapping interest in privacy
and data protection within their organizations.

(38:28):
Then, last year, Jutta Williams started a really
successful AI Village event where she also did a CTF, but
for people to attempt to get LLMs - all the different LLMs
that exist, the base models - to output something they shouldn't.
Right?
And then, gamifying that experience, and then being able

(38:50):
to then have a comprehensive library of potential challenges
that can then be addressed for threat modeling specifically.
And that was a success.
And so, I'm starting to see more privacy- and AI-focused CTFs,
tangential to security, happening at DEF CON.
So, hopefully we can make that happen this year.
It's our goal.

Jay Averitt (39:10):
Yeah, it's super exciting.
I hope you guys do make it to DEF CON.
I'd love to see that.

Debra J Farber (39:15):
Yeah, I think it's just a matter of...
it's the logistics.
Everything at DEF CON is community-organized.
So, someone has an idea for what they call a Village - a car
hacking village, or a plane hacking village, or a...
these are real villages that happen at DEF CON.
They bring in a plane; they bring in a car; there's a Voting Machine
Hacking Village, right - and that's hardware and software.
But, you know, they have so many different villages, and so it's

(39:37):
just a matter of getting the space, getting on the agenda,
making sure you have the internet connectivity - like a
dedicated safe line, because you're certainly not going to
use the most attacked network in the world, which is the DEF CON
Wi-Fi network, during that week.
All right, I'm talking way too much about this, but I'm really
excited.
Okay, I was also delighted to see that we both are sitting on

(40:00):
the Programming Committee for the PEPR '24 conference - the
Privacy Engineering Practice and Respect USENIX Conference.
Are you excited for the conference this June, and how has
attending PEPR been of value for you?

Jay Averitt (40:14):
Oh my gosh.
I mean, yeah, I'm super excited.
PEPR '23 - I've never been at a conference that I actually got
so much value out of.
I mean, usually at a conference, 1) I
never find real value in the programs being presented -
I mean, there may be one or two out of 50 that you find

(40:35):
interesting - and then, like, the networking aspect of it is so
difficult, because there are so many different people doing so...
I mean, for example, not to throw IAPP under the bus, because
I'm speaking at the Global Forum, too.
But, it's just a different atmosphere.
When you've got so many lawyers and so many people in privacy
doing so many different things, it's hard to find people doing

(40:58):
exactly what you do.
At PEPR, while privacy engineering is a big umbrella,
you're surrounded by like 200 people that really get what you
do.
It's great for networking, and the programs - I mean, they were
so good.
Even some that were above my head, because I don't understand...
I understand differential privacy and what you're trying

(41:18):
to accomplish, but I can't do it.
It's fascinating.
So, yeah, it was, like I said, the best conference I ever
attended.
Super excited about going this year, and I mean, I plan on going
every year that it's available, and I hope everyone does,
honestly, because I think that sense of community we have in
privacy is great; and really, just getting to chat

(41:40):
with all these folks who I'd interacted with on LinkedIn or
in other places was just great, and it was easy to do, instead of
having to seek people out at a giant conference.

Debra J Farber (41:54):
Yeah, I agree with everything you just said.
I came back just feeling on a high.
It's a two-day conference, and you might remember me up in
front, sitting next to Jason Cronk, writing constantly - I'm
old-school; I still write notes.
Typing isn't the same thing for me.
My brain learns, kind of, as I'm writing, too.

(42:15):
I'm thinking, here are all these amazing topics and interesting
presentations, potential speakers for this podcast - just
a wealth of information.
I was just really astounded by, again, the networking
opportunity. But everyone was excited to be there.
No one was there because, "Oh, my company sent me; I have to be
here."
This was something people fought to get budget to go to - not

(42:39):
that it's an expensive conference.
It's actually pretty reasonable.
It's a nonprofit and everything, and it's not vendor-heavy or
anything.
It's pretty much sponsored by companies that want to hire
privacy engineers, so it's like they have their recruiters there,
if anything.
I met some of my heroes.
I think I even met you there for the first time in person.

Jay Averitt (42:58):
Yeah, we did.
Yeah, we did meet there for the first time.

Debra J Farber (43:00):
Yeah, it was absolutely wonderful, and I urge
people who are interested in privacy engineering - or, if this
is your main focus - this is a conference not to miss.
It was just such an exchange of ideas.
I mean, everyone in privacy engineering, too, is coming from
different perspectives, right?
Somebody who's deep into differential privacy is not

(43:21):
usually deep into another PET.
Right?
And so, the people who are deploying PETs - like, there are
different libraries, you know, open source libraries; they know
the different deployment mechanisms; they're deep in
maybe the data science - but they're not necessarily crossing
over into being deep into all of the privacy enhancing

(43:42):
technologies.
So, as those people were talking about those deployments,
and, you know, we heard from governments - like the government
of Singapore - on how they did some implementations of
interesting things.
I'm trying to remember what it was, but it was just a
cross-section of industry and academia and evangelists, and
just a happy place. And I know that this conference is going to

(44:02):
have thousands of people in the future.
I believe that.
I mean, I totally see that.
So, if you really want to get in on it, now is when it's small
enough to feel super manageable as a conference - you can easily
talk to anyone there; everyone's just excited to
meet others who are interested in this space and share ideas.

(44:23):
I'm still working on a project that I found by attending PEPR
and meeting a company there that's got a consulting firm - I'm a
subcontractor to them now, working on a California DMV
mobile driver's license and verified credentials project.
I can't tell you how many dividends attending PEPR has paid,

(44:45):
and so, thank you for sharing yours.
I feel like I'm going on and on about my experience; but, since
we're both Programming Committee members, I
just want to underscore that we jumped on being on this Program
Committee because of how wonderful an experience it is.
It is my delight to volunteer.

Jay Averitt (45:04):
Absolutely - 100% echo everything you just said.
I can't speak highly enough about the conference.

Debra J Farber (45:12):
Awesome.
So, we're getting towards closing.
I know we could go on all day just talking about our love of
the space; but I'd love to hear, as people try to keep up with
what's new in the discipline of privacy engineering, what
resources you refer people to and what communities you might
plug into.
How do you stay up-to-date?

Jay Averitt (45:31):
Yeah, for starters, listen to this podcast; but, no,
I think podcasts are actually a good source of information - not
just this podcast, but other privacy podcasts out there.
I learn something from listening to them, because
privacy engineering is such a big umbrella.

(45:53):
Everybody's doing something a little different.
So, I love hearing other people talk about what they're doing,
and I learn from that.
But, I think there's also...
I think LinkedIn - actually looking at your feed - I
think connecting with a bunch of folks that are in privacy and
looking at, you know, what's popping up is great.
I mean, I've learned a lot from just seeing stuff pop across my

(46:17):
feed, and I think that's a good source of information.
As far as building community, I think LinkedIn is
another way of doing that.
I think there are various Slacks out there that have tried to
create a privacy community, but there's not one that I can fully
recommend at the moment.
But, I think LinkedIn and listening to podcasts really are

(46:40):
the primary ways I do it.

Debra J Farber (46:43):
Interesting.
Those are the primary ways I do it, too.
I was curious if there were any Slack groups where you're like,
"Oh, this community is just totally hot right now," but I
haven't really found one myself.
I mean, I do know that there are communities like OpenMined.org,
right, where, if you are actually working with data
science tools and are super technical and want to understand

(47:05):
deploying PETs to unlock the value in the data science of your
data sets, you could go really deep.
There are 16,000+ community members in their Slack group
that are actively working on deploying these things, taking
their free courses, and then asking for help from mentors
there who volunteer to help.
But, it's not one that I participate in because, again,
(47:28):
it's too technical for my purposes.
Yeah, those are the ones I follow, too.
I used to follow a lot more on, you know, Twitter.
I'm just not on X anymore as much.
I mean, I occasionally go there to see what craziness is going on
on the platform, but I don't really use it.
I think there's an opportunity out there for communities to be

(47:49):
stood up.
I think there's a thirst among privacy engineers to be able to
ask questions of other people, and so maybe I'll try something
with the Shifting Privacy Left brand; but, I would need some
partners - so reach out
if anybody wants to work with me, and maybe Jay, and maybe some
other evangelists out there, too, to bring that to life.
Just before we close, what words of wisdom do you have for

(48:13):
getting into privacy engineering?

Jay Averitt (48:15):
Yeah, I mean, I think it's if you've got that
passion and that love of learning. And I think there are
some resources out there.
I think the privacy classes that are out there, like Privado's and
such, are good places to start; and I think there are some books
out there that are good, like 'Data Privacy: A Runbook for

(48:35):
Engineers.'
Things like that are good to give you an overview.
But, you know, I also think, if you're really wanting
to build up your tech skills, coding is one way to do it; but,
I think really more important than that is understanding data
flows - being able to understand, articulate, and
distinguish between different data types, and then being able

(48:57):
to understand how the data is flowing from the front end to
the back end.
I think those are critical skills.
So, yeah, I think that's great; and I think that also looking at
LinkedIn, and seeing what people who are
active in the field are posting, is a good
way of seeing different ways you can kind of break in.

Debra J Farber (49:15):
Awesome.
Well, Jay, thank you so much for sharing your experiences and
insights; and thanks to everyone else for joining us
today.
Until next Tuesday, when we'll be back with engaging content
and another great guest or guests.
Thanks for joining us this week on Shifting Privacy Left.
Make sure to visit our website, shiftingprivacyleft.

(49:37):
com, where you can subscribe to updates so you'll never miss a
show.
While you're at it, if you found this episode valuable, go
ahead and share it with a friend.
And, if you're an engineer who cares passionately about privacy,

(49:51):
check out Privado - the developer-friendly privacy
platform and sponsor of this show.
To learn more, go to privado.ai.
Be sure to tune in next Tuesday for a new episode.
Bye for now.