
February 13, 2024 44 mins

This week, I chat with Jake Ward, the Co-Founder and CEO of Data Protocol, to discuss how the Data Protocol platform supports developers' accountability for privacy by giving them the relevant information in the way that they want it. Throughout the episode, we cover the Privacy Engineering course offerings and certification program; how to improve communication with developers; and trends that Jake sees across his customers after 2 years of offering these courses to engineers.

In our conversation, we dive into the topics covered in the Privacy Engineering Certification Program course offering, led by instructor Nishant Bhajaria, and the impact that engineers can make in their organization after completing it. Jake shares why he's so passionate about empowering developers, enabling them to build safer products. We talk about the effects of privacy engineering on large tech companies and how to bridge the gap between developers and the support they need with collaboration and accountability. Plus, Jake reflects on his own career path as the Press Secretary for a U.S. Senator and the experiences that shaped his perspectives and brought him to where he is now.

Topics Covered

  • Jake’s career journey and why he landed on supporting software developers 
  • How Jake built Data Protocol and its community 
  • What 'shifting privacy left' means to Jake
  • Data Protocol's Privacy Engineering Courses, Labs, & Certification Program and what developers will take away
  • The difference between Data Protocol's free Privacy Courses and paid Certification
  • Feedback from customers & trends observed
  • Whether tech companies have seen improvement in engineers' ability to embed privacy into the development of products & services after completing the Privacy Engineering courses and labs 
  • Other privacy-related courses available on Data Protocol, and privacy courses on the roadmap
  • Ways to leverage communications to surmount current challenges
  • How organizations can make their developers accountable for privacy, and the importance of aligning responsibility, accountability & business processes
  • How Debra would operationalize this accountability into an organization
  • How you can use the PrivacyCode.ai privacy tech platform to enable the operationalization of privacy accountability for developers




Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

TRU Staffing Partners
Top privacy talent - when you need it, where you need it.

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jake Ward (00:00):
The idea of 'getting ahead of the curve' always really appealed to me. I think it, honestly, appeals to developers as well, because nothing's worse than getting to the end of a process and having somebody say, "I see, you did this; you can't do that; go fix it." They'd much rather understand the constraints ahead of time and work around them. Solving problems is what developers are fundamentally

(00:23):
there to do. They have to predict the future with every line of code. It's much easier to know where you can't go when you start that process.

Debra J Farber (00:32):
Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans... and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the

(00:53):
bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome, everyone, to The Shifting Privacy Left Podcast. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest, Jake Ward, Co-Founder and CEO at Data Protocol, a developer support

(01:16):
platform. Data Protocol enables platforms to get more out of their developer partners and their technical workforce, while giving developers the information that they need the way that they want it.

(01:28):
Today, we're going to discuss: Data Protocol's privacy engineering courses and certification; how to deliver the shift left message for privacy so that it resonates with developers; and then, industry trends that he's seeing in the space. Jake, welcome.

Jake Ward (01:46):
Debra, it's a pleasure.
Thanks for having me.

Debra J Farber (01:47):
Absolutely. So, first, I really want to discuss your background, because I find it fascinating. You started out in communications, even working as a Press Secretary for a U.S. Senator, before stepping into the role of President and CEO at The Application Developers Alliance, and now focused on developer education and developer support at Data

(02:11):
Protocol. Tell us a little bit about your career journey and how you ended up focused on enabling software developers.

Jake Ward (02:18):
Yeah, my career has been a long and winding road, Debra. I had the good fortune of my childhood dreams and ambitions coming true - wanting to be on the cast of The West Wing and work in politics and with policymakers. I worked in the House of Representatives and the U.S. Senate for a while. I also had the incredibly good fortune of working in the Senate

(02:40):
for Olympia Snowe from Maine during the initial Net Neutrality fight in 2006, 2007, when all of the issues around the future of the Internet came to pass sort of in an esoteric way that only people in DC would care about. But, the long and the short of it is that it was the first time

(03:01):
that the Valley cared and understood that Washington mattered; and it was the first time that Washington understood the technology companies and the Valley mattered. I had the opportunity to have a front-row seat for that. So, when I left The Hill in 2007, I worked with a lot of the now very large technology companies, as well as some of

(03:24):
the smaller ones, to deal with public affairs issues around privacy and patents and other issues of the day. That ultimately culminated in founding The Application Developers Alliance, which was the first trade association for software developers, in 2012. This was an incredibly educational experience and

(03:44):
brought me into the orbit of some real luminaries in the space:

(03:49):
Joel Spolsky and Brad Feld and Don Dodge and folks like that. I was dealing with Linda Smith from Twilio, back when Twilio was really up and coming.
That experience shaped my worldview of, and particularly my relationship to, the developer workforce and the idea that I've been working towards since then: that the men and

(04:11):
women that build software, that design software, are largely the manufacturing class of the digital age, and that it is incumbent upon the rest of us to understand and support that workforce if we're going to live in the world they're going to build for us.

Debra J Farber (04:27):
Wow, that is a fascinating career! I'm sure you have so many stories. I'm not sure if they'd be, like, West Wing-style stories.

Jake Ward (04:35):
They're more like Veep. People would ask all the time, "Is it more like House of Cards or West Wing?" And I would say, "It's more like Veep." [Debra: Oh, that's funny, so almost like a comedy of errors.] So, what happens when you give 24-year-olds the keys to the country?

Debra J Farber (04:47):
It's true; you've done a lot on the Hill, so there's many interns and many - I lived in DC for five years, so I know many interns and young people getting into politics.

Jake Ward (04:58):
Ill-fitting suits and low salaries.

Debra J Farber (05:01):
Yeah, very much so. Okay, so, tell us a little bit about the Data Protocol platform. What have you built?

Jake Ward (05:08):
In 2017, when I left The Application Developers Alliance, sort of in the wake of Cambridge Analytica, I spent a couple of years traveling and meeting with software developers - working publishers of mobile apps, as well as front-end developers and systems engineers - to ask them one overarching

(05:29):
question: "What do you need that you're not getting? Or, how would you prefer to get the information you need to build cool stuff faster but with less risk?" At the end of this sojourn of a couple of years, what I had was, I believed, a roadmap to giving information to this growing workforce of 30 million or so. Information, education, support, in a better way.

(05:53):
They said, "I want video first." They said, "I want to get it faster." They said, "I want to be treated like I'm Bank of America or Air France, like I matter to a large platform." I built Data Protocol with an incredible team and with empathy at the core of the interface, which any developer who logs

(06:17):
onto our platform will recognize immediately that it's built for them. There's a CLI component to it; there's a hands-on-keyboard vibe to every course, every short code, every resource; and we pack as much information and support material into a very short period of time, with the guidance of an instructional design team here in-house, so that we can get people back to

(06:40):
doing what they came to do, which is build really cool products.
The end result, I believe, is a platform that can give developers answers, while giving the companies that depend on them, that invest in them, that care about them, a bridge to communicate, support, and largely enable them to go faster

(07:03):
with less risk.

Debra J Farber (07:04):
That's really cool. Tell us a little bit about how the platform works, how your client base is formed. I believe you're a nonprofit; is that correct?

Jake Ward (07:15):
No, no, we are very much for profit.

Debra J Farber (07:16):
You're a for-profit; how does your business model work?

Jake Ward (07:18):
Anybody who would like to help with that profitability should call me immediately. We work with partners of all sizes, from Meta to Slack and Intel to the smallest. We have content coming up next week with Circle, the financial services company. We work with them to identify a problem for their developers. You want to help them navigate a privacy assessment, or you want

(07:42):
to talk to them about programmable wallets?
We build content, so we will be responsible for writing all of the scripts, shooting all of the video, animating the content, hosting it on our platform, putting in prefilled notes, writing assessments, questions, knowledge checks - all of those pieces - so that a partner can simply put a link to their

(08:04):
channel on our platform in all of their dev docs, emails, website, et cetera. Send them over; single point of login. They get all of that information: the resources, the video, the data.
As a partner, you get access to attributable metrics as well as aggregate: "How is my community behaving?

(08:26):
Do they like this information? Was this useful? What should I change about my product or my process to improve value to that developer community?" The end result is tens of thousands of developers on dataprotocol.com using the courses and resources and guides and documents every day.
With these partnerships, companies are able to extend

(08:51):
their existing developer programs. There's nobody that publishes an API or has an SDK or generally relies on developers that hasn't invested in developer documents, email communication, events, programming, etc. We fit right inside that to extend and deepen the developer experience on behalf of those partners and give developers

(09:13):
what they need faster, in a format that is more useful to them.

Debra J Farber (09:18):
That is really compelling. I know that you provide these courses. Most of these courses are free. Are all of them free?

Jake Ward (09:26):
All of them are free.
All of the courses are free.

Debra J Farber (09:28):
What's exciting to me is you're finding out the needs of what one company in particular might need. They're just like, "We need this support for our developers on these issues." After they pay you to develop the course, the course is free for anybody. I love that it's building the community outside of the organization, too. There's something to plug into. Well, I guess it benefits that company because, as they're

(09:50):
hiring, they could say, "Hey, have you taken this course?" It's available to them. You don't have to first be an employee or anything like that. The knowledge is shared. I just love that about community building.

Jake Ward (10:03):
That's right. We do. There's a lot of content on the platform that is even product agnostic. We did 'Five Tips to Compliance with GDPR.' We didn't build that with a partner. We built it with subject matter experts in order to support the community at large. We do a lot of internal training for some very large companies, particularly around privacy engineering, which I know we'll talk about in a second.

(10:24):
Our real bread and butter is with companies who are either managing very large third-party developer programs - so, people that build on their tools - or have a developer user base: customers, people that are paying for a service, use of an API, use of a metric dashboard, et cetera.

(10:44):
We help be the on-demand, scalable partner management arm of that company.

Debra J Farber (10:53):
Got it. That makes sense. What does the concept of 'shifting privacy left' mean to you?

Jake Ward (11:00):
I think that my passion point around shifting privacy left is a little different than is conventionally agreed upon among privacy professionals, because I come to it from a developer bent; but I really like the idea and the metaphor of pushing accountability and responsibility further to the left, into the arms of the

(11:22):
people building the products - that if you can align the responsibility, accountability, and decision-making with the people whose hands are on the keys designing the product and building it, and they take responsibility for it, you're going to have a safer product.
I want to occasionally tell the story of the Roman bridge makers who, after they were commissioned to deliver a bridge

(11:44):
would be forced to sleep under it with their family while the first legion of the Roman army marched over their heads on that bridge. Wow, the idea of sleeping under your product. Did I do this well? Will this be secure? Putting your name on it wasn't enough. If we could ask all developers to sleep under their code, to be

(12:06):
responsible and put their name on it - not just their company's name, their name - how much better would products get? How much time would be dedicated to the front-end design? How much of privacy by design's principles would be incorporated earlier on with privacy engineering? Marketers are always in the room. Why can't we have a couple more lawyers? Why can't we have some privacy experts?

(12:27):
The idea of getting ahead of the curve always really appealed to me. I think it honestly appeals to developers as well, because nothing's worse than getting to the end of a process and having somebody say, "I see, you did this; you can't do that; go fix it." They'd much rather understand the constraints ahead of time and work around them.

(12:47):
Solving problems is what developers are fundamentally there to do. They have to predict the future with every line of code. It's much easier to know where you can't go when you start that process.

Debra J Farber (13:00):
Absolutely. I think your definition actually really aligns, at least with my definition. Now, when I explain what shifting left means, I usually talk about how you want to address privacy problems and prevent them earlier on - shifting from a mindset of just when data is collected, and the data lifecycle through its destruction, into the software and product development

(13:23):
lifecycle, when you're building the products and systems that personal data will be housed in; and how, if you do that, you can prevent a lot of the downstream compliance problems. You build it right the first time. It makes the most sense, but I absolutely agree. In fact, I'm going to probably name this episode something like 'Shifting Accountability Left for Developers' or something like

(13:43):
that, because you're absolutely right. It's just another emphasis on an outcome for shifting left: that if you make the developers accountable - not only responsible, but also accountable for their own code and it being safe, and that can mean other things beyond privacy - then you will have a safer product.

(14:04):
The output is going to be safer. The process, if you have a good framework, is going to make sure of that. It behooves anybody to do that in any organization. I want to make 'sleep under your own bridge that you create' the new 'eating your own dog food.'

(14:25):
Let's make that a thing. I absolutely love this story and I'm going to start retelling it as I continue my advocacy in this space. Thanks for that.

[Jake (14:34): You're welcome.]

Let's talk about Data Protocol's Privacy Engineering courses and the certification that you can opt into, which is led by renowned privacy engineering instructor Nishant Bhajaria. For the audience, I have included a mention of this course in my episode two episodes ago, my 50th episode,

(14:56):
"My Top 20 Privacy Engineering Resources for 2024." It definitely belongs in that top 20. I am just excited to dig deeper here and have you pull out, Jake, the reasons why this course is so helpful for supporting developers. First, tell us a little bit about the course, the lab

(15:17):
component, and the certification, and then what will developers come away learning?

Jake Ward (15:23):
We started this... We were incredibly fortunate to work with Nishant in the earliest days of Data Protocol. This was our first big tent moment. I joke frequently that the platform is so good we could even teach privacy. When we set out to do privacy engineering, it was perfect timing, because Nishant's book was just coming out and the

(15:45):
content was very fresh. He's incredible on camera, obviously, and a compelling figure. We built an eight-course, six-lab curriculum around privacy engineering, and there's also a fairly substantial final assessment, at the end of which you are a Certified Data Protocol Privacy Engineer.

(16:07):
We had one goal when we began this process, which was to make privacy engineering publicly accessible - to democratize it. Certainly, you could go to Carnegie Mellon, you can be in the world, you can do the work-life experience much like Nishant has done, but that takes years or costs lots of money. We wanted to build a bridge between engineers and lawyers

(16:29):
and privacy professionals, to speak that language as best we could, and deliver it in a format that was digestible and operational. I frequently say that there are two types of smart people: those who can make simple things sound complex, and those who can make complex things sound simple. Nishant is certainly the latter, and he operationalizes all the

(16:49):
privacy engineering principles throughout these courses, throughout his book, and we were delighted to bring that to life.
The key to the platform, as far as I'm concerned, is how compelling it can make any content. It's a lot easier when the content is also pretty compelling; but with hands-on keys, particularly in a lab

(17:12):
setting, we can keep people's attention. The retention goes way up. Proficiency goes way up. Our passage numbers are significantly higher than industry standard, and it's not because it's easy. The certification is very hard. The key to the experience, though, is that it is

(17:32):
step-by-step; it builds on top of each other; and it is operational, so that you'll remember how to take it out into the world, as my kids talk about at school. It's project-based, and so you're not learning it so that it's rote. You're learning it so you know how to use it.

Debra J Farber (17:48):
That's awesome. And as someone who's neurodiverse - specifically, I have ADHD - sometimes it's like, "What is my next step?" There's all this stuff and this chaos. What's the first step to take? So, when I can come across some materials that can basically be like, "Here's the first step, then do this; this is the

(18:10):
order that makes the most sense," suddenly everything becomes much clearer. It's not that I can't figure these things out; it just might be overwhelming at first. So, I really appreciate that that's inherent in how you designed the coursework, because that's how I learn very well, so it's great. How long have the course and the certification been available

(18:31):
now? Oh, and also, what will developers come away learning? I'm not sure if you mentioned that.

Jake Ward (18:37):
Developers, lawyers, privacy professionals generally will come away understanding sort of the keys to categorization, to anonymization, and to the way that you can integrate privacy engineering principles into a system - so, the architecture components of it. The idea was to build a curriculum that was

(18:58):
comprehensive enough that you could start on Nishant's team after taking it, right - that you could be ready to at least understand the vernacular and the concepts and be part of that universe. I like to think that we achieved that goal; it's certainly, as far as I'm concerned, the best privacy engineering program out there.
But, there are also many other resources that are going to

(19:21):
stand the test of time. I am thrilled that there are more and more of them. The idea that we're competing is pretty silly; I just want more and more people to understand that these principles are out there, that you should understand them, and that they should be adopted and implemented and widely available in the wild for as little money as humanly possible. The program, I think, launched in June two years ago, so we're

(19:44):
going to come up on two years in June that these have been around. Numbers-wise is really how we take a look at it. We're, I think, well over a thousand certifications now - a thousand badges now - as well as 4,300 hours of content consumed

(20:05):
since we launched, which is pretty good. It's an interesting way of keeping score on the engagement level of things.

Debra J Farber (20:12):
Yeah. So, you mentioned certifications, and I don't think we made a good distinction between what the process of taking the course is, and then there's a separate process of opting into certification and what that means. Do you mind disambiguating that?

Jake Ward (20:23):
No, no, not at all. As I said, there are eight courses and six labs. They run in sequential order. You can't skip ahead. You can't get out of order. That path is locked. At the end of each section there is a badge - a relatively short assessment to achieve that badge - and then you move on. The courses are not particularly long. I think the longest one is 30 minutes; the shortest one is

(20:43):
probably three or four - it's just an intro component at the front end. And then, there is a final assessment. Now, all of the content, all the courses, all the guides are free. We do charge for the certification assessment because, frankly, we have to pay for it. So we ask people to pay for it as well. You can take that assessment a couple of times, but it's going

(21:05):
to take a while. It's probably an hour-long, 50-minute-long assessment. I think it's 100 questions, and the idea is to make you really earn it - that if you're going to get this certification (and it's issued by Credly; it lives on your LinkedIn; you can put it on your resume), if you're going to do that, we want to make it count.

Debra J Farber (21:24):
So, you basically reflect back the learnings in a... It's not just that you took a course, but you can demonstrate that you understand it.

Jake Ward (21:32):
Yeah, I mean, it's open book. The resources and notes are there, and that's the way the platform is set up: you have access to all of that stuff; but you need to demonstrate that you have the proficiency to implement it in a very operational way. It's scenario-based, which I think is incredibly important and a much better way to evaluate somebody's comprehension and retention of information, particularly given the

(21:55):
complexity around this. Now, you asked me earlier, "Can a lawyer take this? Can a privacy expert take this?" Absolutely. There are hands-on labs. They are more technical, but you're not going to need to go learn JavaScript in order to participate and get certified.

Debra J Farber (22:12):
Excellent, that's great to know. And then, what trends have you been seeing since it was offered? What feedback have you gotten on... not so much the course - I'm not asking, "Do people like it?" - but just overall. What are thoughts and feedback around what can be improved? What gaps of knowledge do people still have? What trends are you seeing overall?
Jake Ward (22:32):
It's actually really interesting, because we ask people, "Do you like it?" - partly because we want to continue to iterate and improve the platform and the features and functions that we offer each user, but also because it's a matter of deciding which content to invest in moving forward. People universally really enjoy this content. Anybody who's made it through at least two of the courses has

(22:55):
near-universal approval of the format, the function, the operationalization, and why they're there in the first place. We even ask questions like, "Is this what you thought it would be?" Some people say no, but then they're delighted with the result. The trends are actually sort of fascinating. The vast majority of content consumed is on weekends, which

(23:17):
tells us that these are professionals that are setting aside time out of their lives to go take these courses so that they are better prepared to do their day jobs. I think that is - I'm reading a little bit between the lines, but I think it's a really good sign for the future of privacy and for the industry writ large that people are giving of their

(23:38):
own time in this way. This is not corporate training. This is something else. This is the ability to improve your knowledge base as well as your abilities at work. I think that's really interesting. Since we met - what, two years ago - we have seen a huge increase in privacy engineering as a topic, as available curriculum, as

(24:03):
resources and guides. I mean, as your Top 20 list indicates, this is coming. That's good news for everybody.

Debra J Farber (24:09):
Excellent. That's really great to hear. Just for the audience: I had met Jake maybe about two years ago, shortly after this course was made available, at one of The Rise of Privacy Tech's... at their very first in-person event in Silicon Valley. It's just amazing to catch up with you now and see how impactful the work has been.

(24:29):
It's a lot of large tech companies that you work with, which send their engineers to take these courses at Data Protocol. To what effect have they seen improvement in their ability to embed privacy into the development of products and services within their orgs?

Jake Ward (24:48):
For the companies who are sending their employees - their technical teams - through the privacy engineering courses and certifications, we have gotten, and continue to get, really good feedback that everybody's on the same page, everybody can speak the same language now, and that decisions are made more quickly and in the

(25:09):
right direction more frequently. I don't want to name names, but some very large companies that you wouldn't necessarily have thought of as A) digital, or B) particularly concerned with privacy, now feel like they have their arms fully around the systems that they need to have in place to protect both their existing data and the future

(25:29):
data that they'll be collecting. We also recently launched a DSARs course and badge - courses and badge, curriculum and badge - that are often gobbled up by people that are also taking the privacy engineering curriculum. So if you're a partner and you send 43 of your engineers through the privacy engineering curriculum, there's a pretty good chance

(25:52):
that at least half of them are going to opt into the DSARs courses, which are not... I mean, they're just there; they're just available. But these folks are looking to continue that education. And again, that's a great sign for that workforce, that company, and for the industry at large.

Debra J Farber (26:09):
Absolutely. You mentioned you have the DSARs courses. Do you have other privacy-related courses, or any in the pipeline?

Jake Ward (26:16):
We do. We have a number of privacy-related courses that we put out sort of in tandem with the privacy engineering curriculum, the idea being that when people are here, they're going to want to touch other privacy-related courses. So our initial, our very first course that we launched with was a 'Privacy by Design Principles 101' course:

(26:37):
What does it mean? How do we use it? We have, as I mentioned, the GDPR course. We have some other components around building apps for kids and the privacy restrictions related to COPPA. We have several pieces of content that are directly related to designing for privacy, engineering for security, thinking about data storage, encrypting end-to-end.

(26:59):
Courses like that, that are sort of soft-skilly, right - thinking about it from an architectural design standpoint, rather than "use this cloud and plug it in this way and this is the tool you want to use." It's more like, choose the tool that fits your need; here's how it works. And that content is all product agnostic, platform agnostic, and delivered by subject matter experts.

(27:21):
It's really good stuff.

Debra J Farber (27:23):
That's great. I have to check some of those out.

Jake Ward (27:25):
I'd love to get your feedback.

Debra J Farber (27:26):
Yeah, yeah. I think I'm going to do a little bingeing this weekend, Jake. While I have you here: as a comms expert working with developers, I'd love to hear some thoughts on, like, what are some better ways for us, generally, in organizations, to support developers in being able to message up challenges around privacy that they come across?

(27:47):
Are they not heard? Do they not know where to... Like, where are some of the gaps that you see in industry right now that developers have challenges surmounting, where communications or other tools or something could be helpful?

Jake Ward (28:03):
It makes a ton of sense, and I also think it's the right question. First and foremost, the most persistent communication-related challenge I've seen over the last 12 years is that there is a fundamental misunderstanding or misconception, I guess, that developers don't care about privacy - that they don't care about the idea, the concept of privacy - and that's

(28:25):
just not true. The people that build software - whether it be large software packages or small mobile app software packages or software in general - care as much about privacy as anybody else, and often more. Right? They are users just like everybody else. The idea that anybody would want to build something that has a

(28:46):
propensity to break, or that would eventually run into a brick wall of legal restrictions or regulatory compromise, is ill-informed. Developers care; they care a lot. What they frequently don't have is the right support while being pressured to go faster. Product teams, engineering teams, even the startup ecosystem,

(29:12):
right? Everything's about speed to market. Can you get it done? Will this work? And the assumption is that the integrity of the product, from a failure standpoint, is largely about the engineering, not the use of data. The reason this company is called Data Protocol is because it's about the rules that govern data, and how everything that we do in this digital ecosystem has to come back to that.

(29:35):
Whether it's about building products that work, or building products that are compliant, or building products that are simply improving the way that systems talk to each other, there has to be an alignment around the protocol related to data. We're helping to bridge that divide every day, every chance we get, so that developers who care about privacy, quality, and

(29:59):
speed have the information they need. But, I would start by assuming that developers want to do the right thing for the right reason, and go from there.

Debra J Farber (30:07):
I think that's really important to reiterate, because even I'm guilty of thinking in those terms sometimes. But it's really, like, yeah, it's coming from pressure of executives at startups that have maybe borrowed a crap ton of money from investors and now feel like they have to hit the market hard. Or, if it's not startups, it's large companies that have

(30:28):
investors and need to meet certain metrics and such. That's where the "move fast" and being pressured to go faster comes in, while you don't necessarily have all of the requirements, even for an MVP, that would make the product safe, like privacy and security and ethics generally. That makes a lot of sense. It's not the developers themselves that don't want to

(30:49):
address privacy. A lot of it's just been... I've heard a lot of developers in the past - and I've been doing this a long time, so I'm not calling it from recent experience - but a lot in the past just say that it's not important enough to interrupt their flow. That was really because they didn't understand what it really meant and how large the field of privacy is and what it really

(31:10):
means. I think, again, education was super important. The field's gotten smarter about it, and it sounds like the support you're providing can really enable them to maybe still move fast, but with the right support. What I'm curious about, though - because we've talked about the importance of now making them accountable - what are the ways

(31:31):
that companies can make their developers accountable?

Jake Ward (31:33):
I think, by making the assumption that they, too, care about privacy, and having them be part of the process from beginning to end, much like privacy professionals have asked to be part of the design process from beginning to end. The culmination of that is that teams, departments, full companies are more bought into not just the practicality of

(31:57):
checking a box, but the potential of achieving success. Yes, it is compliant, but it also has a market differentiator of being privacy-first. Yes, it is a great product, but it also won't break. Yes, we got to market first or fastest, but we also can maintain our position as a market leader because we can

(32:19):
market as privacy-first, secure, and here for our users. We've seen that in recent years. Apple's move into being a privacy company was a masterclass in marketing. They didn't do it differently; they just attacked the other players in the space and differentiated themselves from

(32:40):
everybody else from a marketing perspective. That's not going away. People care about that when they're thinking about the products they're going to buy. So, then, should the people who are building those products.

Debra J Farber (32:52):
Absolutely. You're right, it is a masterclass in marketing, because there are some things they were doing, collecting data just like all the other big tech companies; but they did, I think, differentiate in how they architected their hardware, in the way that they've locked down the hardware of their phones and their computers. Some ways have been

(33:14):
privacy-first to begin with, but in other ways they've collected data just the same as the big tech companies. When there was a snafu in recent years, because they had built up so much trust with their amazing marketing campaign that differentiated them by saying how much they care about privacy, it didn't impact them much. The goodwill was just there. Their stock wasn't impacted by a major snafu.

(33:37):
So, it's a great comms marketing approach. But you're right, this approach with the developers will then support this whole company messaging capability, to really put the company in the best privacy-focused light. Now, from what you discussed with me, it sounded like you want to make them partially responsible; but accountability is like, you own this. Who do we blame if this is wrong?

(33:58):
And still, when I see something privacy-related go wrong, people will blame a privacy person. Or, if there's a breach, maybe they'll blame not having the right security. There's not necessarily a "Well, maybe you should have threat modeled better." How do we get the developers to actually be accountable for the privacy

(34:18):
harms they cause? Is there a way to psychologically tie them to metrics, where they'll want to make sure that they don't do certain things? They could be moving fast and want to do all the right things, but still have overlooked a few things, or maybe didn't use the right design pattern, or maybe accidentally made a mistake, and, you know, data is now exposed.

Jake Ward (34:44):
Again, the idea for me of shifting left is to align accountability with responsibility. Then, if you're pushing that left so that the developers are part of that process, what you've now done is create an internal accountability: the incentive to not let it break, to not let it be a mistake, to not let it do harm is already there. But what

(35:05):
happens too often is, once developers are no longer part of that process, it goes to the lawyers, it goes to the privacy professionals. They say, "Look, it's not my fault. I did everything that I was supposed to do, and then I handed it to you and you didn't fix it. You didn't create the structure around it, the rules around it; the pieces weren't in place to guide me through that process."

(35:29):
But if everybody is on the same page, that process is built in a collaborative way. Everybody's accountable, and that's the goal. It's not to put everybody in different silos, but to put them in swim lanes moving in the same direction. Developers build to achieve an end.

(35:49):
One of those ends is a privacy protection. Great, tell me what that means. Tell me what protection you need me to put in place. Tell me what the design pattern needs to look like, so that we can be assured that we can still go fast. Mistakes are made out of ignorance and desperation, not out of any sort of ill intent. If we can eliminate both of those things, the time pressure

(36:14):
and the lack of support, you are aligning responsibility, and in doing so, that internal accountability takes care of itself.

Debra J Farber (36:24):
In listening to you talk, the way I would operationalize this in an organization would be to obviously discuss with everybody and get to: who owns what? What is the process? What should that look like? But then document that in some sort of governance and accountability policy, then an SOP. What I'm hearing is, I would make sure that there are privacy

(36:44):
requirements that must be listed in every single product development set of requirements.

[Jake (36:49):
Absolutely.]
I would make sure that the developers had testing criteria for those privacy requirements, and that maybe different parts of the business - like the operational CPO parts of the business, as well as, if they wanted, Privacy Counsel, who doesn't have to have their eyes on this as constantly as operations - would be able to

(37:10):
see what's coming up in the next sprints and what is on the product roadmap. This is something that is not optional. They're getting these updates; they have to know and be accountable for reading them and understanding how these products... how the product development and software development are moving towards goals. And then, have documented what those end

(37:30):
goals are that we're all striving for, so that we are in those swim lanes moving in the same direction, because you want to align all of those requirements across the business, across legal, across software development and security, and you'll want to align them across other business areas as well. It shouldn't be, "Are there privacy requirements for this?"

(37:52):
It should literally be: any time personal data is being used, product managers know to ask, "What are the privacy requirements for this?" And you're working across the business to have that, so that it really is built into the process; and tech stacks and coding and design requirements - all of that should be aligned to whatever those end goals are.

Jake Ward (38:16):
I think that's a hundred percent right, Debra. I would also add that those policies and processes should not be handed down from the policy team, from the privacy team, to the developers. They should be built in collaboration with the developers, so that it is more readily operational and you have buy-in from the outset.

(38:38):
The idea that developers build and lawyers lawyer is a problem. If you put those two people together and say, "Here's the standard we have to meet. Do you have best practices from a development standpoint that you would put in place to achieve that?" Great, standardize it. Do you have best-in-class standards that you would put in place if you had time, if you had more resources?

(39:00):
Is there a better way to do this? Great, that's your gold star. Then you're creating growth for your organization, but you're also putting privacy experts, legal experts, and engineering experts in the same room to pull out of them a best path forward for all of those organizational successes,

(39:21):
which is, not coincidentally, what a privacy engineer does: pull all those things together and put a stamp on it.

Debra J Farber (39:29):
Absolutely. That's why I think operational people are so essential in privacy. It's not just about the expensive lawyers and engineers that do their thing. It's also about those that implement into the business, looking across business processes - which, hello, is how information flows through business processes and systems. This is all coming from just a post I've seen recently that I want

(39:51):
to put out there. I don't think it's as successful in a business to have lawyers talk to engineers by themselves and then think they should just go implement stuff. There's a lot involved in the implementation of policies and procedures, in the business requirements separate from engineering. In a mid-sized, large, or

(40:13):
enterprise organization, you really need somebody - you need people - to make this happen and own privacy in the organization. For me, that is not sitting in legal or sitting in engineering; but you don't have to have an opinion on that unless you want to.

Jake Ward (40:29):
I think that's right.

Debra J Farber (40:31):
Yeah, I'm just going to put in a plug for PrivacyCode, which Michelle Dennedy and Kristy Edwards co-founded. A lot of the work that they are building out in their platform - it's the tooling they wish they had when they were working in privacy. Michelle Dennedy was the CPO at Cisco, among other amazing

(40:51):
accolades. Kristy Edwards is her technical co-founder, who's been working at Splunk and as an executive-level developer for many years. These are the tools that they wish they had when they were in those operational roles. So, I want to put in a plug for them, because a lot of what you just talked about - how do you effectuate that alignment, not just the education? You've got a lot of that support for

(41:13):
developers and how they can learn and how they could implement, but the actual alignment in the business - PrivacyCode is working on libraries for privacy stuff and for different patterns, and how do you align maybe the values of the organization and the privacy requirements, and what does that mean for different areas of the business

(41:35):
and such.

Jake Ward (41:40):
Michelle was an original Advisor to Data Protocol. We're huge fans, and she and Kristy are doing an incredible job with PrivacyCode.

Debra J Farber (41:48):
Amazing. Do you have any words of wisdom to leave the audience with before we close?

Jake Ward (41:51):
I'm hopeful that many of your listeners will log on and start taking some of the privacy engineering courses. Again, they're all free. I'd love feedback from anybody about how they are being received, how useful they are, and things that we can do to continuously update the content and improve the experience.

Debra J Farber (42:10):
Excellent. Well, Jake, thank you so much for joining us today on The Shifting Privacy Left Podcast. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a

(42:31):
show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy,

(42:40):
check out Privado: the developer-friendly privacy platform and sponsor of the show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.