Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Nabanita De (00:00):
I think knowing the space is very helpful in talking to customers. So, like you mentioned, if somebody has a personal story there, that they felt the need of a problem and then they're solving that, I think it becomes so much easier when you're interacting with, maybe, other people in the space, because you intuitively understand what they're saying, what kind of problems they have.
(00:20):
You can ask better questions in your product discovery calls. So, I would say that's an edge that engineers in this space who have already done some privacy AI work would have when they think about starting their startups because, first of all, they have the network to tap into and talk to. Second of all, they intuitively understand how these different systems work - what kind of issues they have seen.
(00:42):
Overall, I think tying that to your personal mission, tying that to what you're trying to solve, and then building that out - I would say that's also an excellent point that you bring up. That would be a great start as well.
Debra J Farber (00:54):
Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the
(01:14):
bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome everyone to Shifting Privacy Left. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest: Nabanita De, software engineer, serial entrepreneur, and now Founder
(01:37):
and CEO at Privacy License. This woman is just so impressive! In addition to a successful career trajectory, having worked previously in engineering at Microsoft, Uber, and Remitly, Nabanita is a powerhouse leveraging technology to solve some major world problems. She founded the NabanitaDeFoundation.
(02:00):
org, which assists people in making a return to work after a career break, reaching 102 countries with $0 spent in marketing dollars, for which she was awarded a Fast Company World Changing Idea Honoree Award. She also founded CovidHelpForIndia.com, which streamlined COVID-19 resources into one location with
(02:22):
a 35-country reach, and she also previously founded FiB in 2016, an open-source fake-news-detecting Chrome extension for Facebook, which earned her the Google Moonshot Prize at a Princeton University hackathon. She's also won so many hackathons that I couldn't even
(02:43):
fathom listing them all here, and today we're going to discuss Nabanita's new startup, Privacy License, whose mission is to transform the AI landscape by seamlessly integrating privacy-by-design and AI governance into data systems. Its first product to market is Privacy GPT, which we'll talk about in detail, and the need to educate and grow more Privacy
(03:05):
Champions is something that we'll be talking about as well.
Nabanita De (03:09):
Hi Debra! Thank you
so much for this wonderful
introduction.
Debra J Farber (03:13):
Absolutely.
I have so many questions for you. You've done so many things in your career and you're still - I mean, in my opinion, compared to me, you know - a young professional with so much ahead of you. So, to kick things off, can you give us a little overview of your origin story and how your life experiences led you to
(03:35):
today, where you founded an AI privacy startup?
Nabanita De (03:38):
Yeah, absolutely. I think the good way to describe it would be that privacy found me instead of the other way around. So, I actually started at Microsoft, where I was a Software Engineer and also an AI Researcher. I did AI research at Microsoft Research and I actually built a project which led to inferring location from text without any
(04:03):
location indicators in it, and I thought that was my first step towards something in privacy. And then, I joined as a Software Engineer at the Redmond campus at Microsoft, where GDPR had just come out around that time in 2018. I was doing end-to-end data lakes and machine learning, moving terabytes of data from on-prem to cloud, and I took
(04:25):
part in some data anonymization, tagging, and finding data of these sorts in big data lakes within Microsoft at that time. I then moved to Uber where, as a Security Engineer, I sat on the Cloud Security team, but I collaborated directly with the Privacy Engineering team and led some company-wide data classification and DLP efforts to ensure that Uber's cloud
(04:48):
infrastructure is compliant with privacy laws like GDPR and CCPA. I then moved to Remitly as an Engineering Manager for the Privacy Program and Privacy Engineering and led all of that. And then I feel like, working across all of these different companies of different scales and in different roles, I realized that there are common challenges that companies of different
(05:11):
sizes and in different roles are trying to solve; and yet, there's no one particular solution to address that. And that's where I was like, "Oh, I do have a masters in AI; I have all of this privacy experience. I should absolutely be the right person to be building a privacy and AI startup."
Debra J Farber (05:28):
I love it.
I love it.
Okay, so tell us about Privacy License. What's its mission? What are your future plans for the organization?
Nabanita De (05:36):
Yeah, I think my overall goal is to ensure that, with all of these different laws and regulations that are coming out today - I think more than 130 countries have their own data protection laws and there are over 160 different privacy laws - which means that if companies want to enter global markets, they would have to have some sort of privacy
(05:58):
license to operate there, which means complying with those laws and building out appropriate privacy systems internally to facilitate that. That's where the mission is: to empower these companies to not only enter these global markets, but also ensure that privacy is protected as a human right for their consumers and employees.
Debra J Farber (06:19):
So, it's both aimed at consumers... that you'd be selling to consumers as well as to enterprises?
Nabanita De (06:26):
The initial goal is to sell just to enterprises but, in the process of doing so, I think consumers play a huge part in interacting in the space of privacy by, for example, requesting their data in the form of Data Subject Requests or understanding how they can incorporate privacy. So, it would have an educational aspect to it as well, but our primary focus is to sell to enterprises.
Debra J Farber (06:49):
Yeah, that makes sense, and I do want to point out to the audience, from what I know from talking with you, that with Privacy License, you plan to have multiple products under the brand, or the organization, of Privacy License, with the first product that came out being Privacy GPT, which we'll get to in a little bit.
(07:10):
That's my understanding.
Correct?
You plan to add different apps and tools?
Nabanita De (07:15):
Yes, absolutely.
I'm planning to build an ecosystem of privacy apps that would be part of that platform, which companies can leverage for the different things that they would be working on. For example, to serve these apps appropriately, we'll be building a tool to do that. Then, we'll be looking into consent management - and all of that is part of the roadmap.
(07:35):
But, our first entry point into the market was building out this Privacy GPT - like you mentioned, a privacy firewall for ChatGPT. So, once you download this Chrome extension, you go on ChatGPT and you type something, and it will redact the sensitive information (i.e., PCI, PHI, PII) from 62 countries without the data leaving your browser, ever.
(07:56):
So, yeah, you can use ChatGPT without worrying about multiple privacy concerns.
Debra J Farber (08:01):
Yeah, I love it, and we're definitely going to deep dive into that. I did want to ask you first, though, so that we don't get too deep into talking about Privacy GPT without me asking: what was it like switching from an Engineering Manager role focused on Privacy Program Management as well as Privacy Engineering - your Engineering Manager role at Remitly - to now building
(08:22):
a startup? That's got to be different: the external pressures, the demands from investors, the excitement of building something new. Tell me a little bit about what it's been like for you.
Nabanita De (08:40):
Yeah, actually, you know, surprisingly it was very similar, I would say, because I think when I joined Remitly I was sort of doing the zero-to-one stage for technical privacy. And then, having those experiences of... so, my role was unique in the way that I sat between our Legal / GRC and our Technical Teams across the organization, sort
(09:02):
of translating between Legal / Regulatory and tech roadmaps, and then back from the tech people to the Legal and GRC folks. I think it was sort of having those experiences and going into startup-land. One thing I did at Remitly is sort of understand the jobs to be done across the organization - you know, what machine
(09:25):
learning teams, finance teams, IT - what different teams wanted out of privacy when I had joined Remitly initially. It was a very similar framework when I went into the startup. I think the first step is finding Product-Market Fit. You sort of go into the industry. You talk to a lot of different people. You try to understand: what are these jobs to be done that
(09:46):
ideally will solve the big challenges in privacy? What are the bigger drivers? And sort of having those conversations. I feel like it was very similar, so I did not feel like I was doing something different. As part of being at Antler, it was an excellent experience. It's very similar to traditional accelerators like YC or Techstars, but Antler is pre-idea, pre-seed,
(10:10):
pre-money, so they are like zero-day funding agents [Debra: Let's say "day one" instead of "zero day," just for all the security implications]. Yes, day one - a day one accelerator. So, I guess it was great, in a way, that I knew the privacy side of stuff, but I still learned a lot about the
(10:31):
business side of things. Like, how do you think about a business plan? How do you write a deck? How do you think about privacy, maybe from a first-principles approach? How do you think about go-to-market? How do you think about building pipelines and consumers, and how do you basically do this process? From a business standpoint, I think that was something I had
(10:53):
done. I never had this formal coaching before. In the previous startups that I've been successful at, I sort of tried different things and saw what worked and what didn't. It wasn't like I knew exactly what needed to happen, versus here. I think, in hindsight, I could look back and be like, "Oh, I did this one thing, I remember, in my previous startup, but essentially this is how successful entrepreneurs do XYZ
(11:16):
things." So I think, in that way, it was very helpful to be in that program and have excellent mentors guide you constantly.
Debra J Farber (11:24):
That's awesome.
I hope to see in the future some of these prestigious accelerator programs maybe focused on privacy, or privacy in AI, as opposed to just startups generally, so you could just kind of pump out more ethical tech. I don't have that money to invest, but I'm hoping somebody out there does in the near future because there's
(11:45):
definitely a need for it. I'm so glad that you've got that coaching. Now, let's talk about your first product to market. Let's talk about Privacy GPT - a super sexy name because, you know, obviously ChatGPT is still all the rage in the public consciousness around generative AI. You described it earlier as a "privacy firewall for ChatGPT
(12:08):
prompts."
Nabanita De (12:15):
Yeah, I think ChatGPT, in the last year, has grown to over 100 million users. Every day there are exabytes of data just in the prompts that are being sent into ChatGPT that, you know, people are adding to it. But, when you think about it from a privacy standpoint, there are so many different concerns. Right? For example, maybe you're a doctor and you're typing...
(12:37):
...like, in the sense, maybe you're using ChatGPT to get something done quickly, but in the process there might be some patient information you're typing into it. You can imagine, from a HIPAA standpoint, that's not something you're supposed to do.
You could be somebody sitting in a consultancy and you're trying to summarize a big doc that you're trying to write, and
(12:58):
in that process you might have given away some confidential employee data into ChatGPT which, again - based on privacy laws - you're not supposed to use anything which you don't have consent for. There are so many privacy-related concerns with the way people are using ChatGPT, it being an excellent productivity tool.
(13:20):
So, by default, so many companies have, you know, banned ChatGPT within their organizations because they haven't figured out the right way to make this utility tool work. So my thought process there was that I do want people to use tools like ChatGPT or other LLMs that are out there; but can I empower them so that, while they're using this tool, they do
(13:41):
not compromise the data privacy standards that are required to use this tool properly? So, that's where I sort of built out this Privacy GPT which, like I mentioned, redacts PCI, PII, and PHI - for free at this time - from 62 different countries. And also secrets in code, so that way people can try it
(14:04):
out and use ChatGPT with a little bit more privacy protection than they had before. Then, we are also releasing a paid pilot, which will have much more advanced technology in it to redact sensitive information while maintaining context, and then have companies across the world maybe use this tool to empower their employees to use
productivity tools without compromising privacy.
Debra J Farber (14:29):
It's a noble effort for sure. I love it and, since this is a technical audience of privacy engineers here, can you describe a little bit about how Privacy GPT works in practice? How did you architect this?
Nabanita De (14:44):
Yeah.
So, the way it works right now is that it sits entirely on the browser side for the user. It's a Chrome plugin. Once you download Privacy GPT from the Chrome store, you add it to Chrome, you go to chat.openai.com, and you type something. When you click send, it doesn't send to ChatGPT; it first sends to the browser
(15:05):
plugin that you have downloaded - Privacy GPT - which is sitting on your browser. We have built out a state-of-the-art algorithm to detect this sensitive information and then redact it on the browser side, and we also give people the option to redact individual things or, you know, un-redact, because there might
(15:28):
be some use cases where there are false positives. We have also added that option to do so. So, you can imagine if some of these are a name, a location, whatever - what it does is it finds these things. Then, it replaces a name with the word "name" - it maintains the context. That way, when you're using tools like ChatGPT, your context is still maintained, but your sensitive data has been
(15:50):
redacted at the same time. And likewise, we do the same thing for multiple categories - as I mentioned
(15:55):
secrets in code, then name, email ID, address, phone number, SSN, licenses of different countries. All of these things are included as part of that - that people can use and leverage right now - and the way it works in the back end is basically that we have leveraged some machine learning and natural language processing
(16:17):
techniques to do this redaction, and that's how it's working right now.
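The redaction flow Nabanita describes - detect a sensitive span, then swap it for its category label so the prompt keeps its shape and ChatGPT still has usable context - can be sketched roughly like this. This is a minimal, hypothetical Python illustration using only regular expressions; the real extension runs in the browser and covers far more categories and countries:

```python
import re

# Hypothetical sketch of placeholder-based redaction: each detected
# sensitive span is replaced with its category label, so the sentence
# keeps its structure while the data itself is removed.
PATTERNS = {
    "email":        re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn":          re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone number": re.compile(r"\b\(?\d{3}\)?[-. ]\d{3}[-. ]\d{4}\b"),
}

def redact(prompt: str) -> str:
    """Return the prompt with sensitive spans replaced by category labels."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(label, prompt)
    return prompt

print(redact("Contact Jane at jane.doe@example.com or 555-123-4567."))
# → Contact Jane at email or phone number.
```

Because the substitution happens before anything leaves the page, the original values never reach the model, which is the "data never leaves your browser" property described above.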
Debra J Farber (16:20):
That's awesome. So... and forgive me, since I'm definitely more familiar with data discovery methodologies, but that's typically when you're looking across multiple data stores and you're trying to figure out what is personal data, what is sensitive data. Right? So, for something like this, where it's more of a... almost like a DLP or, as you say here, a "firewall," where you
(16:42):
want to only allow in the non-personal data, non-sensitive data that you're sending to the training set - I guess I want to better understand how you're figuring out whether something is personal or not. Is it like regular expressions? Are you using something beyond that type of matching? And, I don't know as much about NLP, so I'm just going to be up
(17:03):
front here with you - just say, "No, that's a secret."
Nabanita De (17:08):
I think there is a mix of multiple things - there are regular expressions; there are also very commonly established, state-of-the-art natural language processing techniques, like part-of-speech tagging, where you can know what the parts of speech are - what is a noun, what is what. And then there are also named entity recognizers. There are multiple things within NLP that we leveraged to
(17:31):
sort of achieve this at this time.
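As a toy illustration of the named-entity side of this: production named-entity recognizers are trained models (the kind shipped in NLP libraries like spaCy or NLTK), not simple rules, so the heuristic below is purely a hypothetical sketch of the idea of flagging person-like spans so they can be redacted:

```python
import re

# Hypothetical stand-in for a named-entity recognizer. Two consecutive
# capitalized words that do not start a sentence look name-like; names
# at the start of a sentence are deliberately skipped here to avoid
# false positives - one reason real systems use trained models instead.
def find_person_like(text: str) -> list[str]:
    return re.findall(r"(?<!^)(?<![.!?] )\b[A-Z][a-z]+ [A-Z][a-z]+\b", text)

print(find_person_like("Yesterday Jane Doe visited the clinic."))
# → ['Jane Doe']
```

In a pipeline like the one described, spans found this way would then be fed into the same replace-with-category-label step used for the regex matches.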
Debra J Farber (17:36):
That's awesome. So, I hadn't told you this yet... this shouldn't be too much of a surprise, but my other half is an offensive security guy. Right? He's a hacker; he works for Yahoo on their Paranoids team. When I mentioned... I'm sitting there like, "Oh, this is cool. Nabanita De just came out with Privacy GPT," he was like, "I want to take it apart and see what's there. You know, I want to go in and see what she built and see what
(17:59):
I can find." Right.
And so, not only did he not find any security issues with it - so I'm not surprising you on this public show, you know, with anything terrifying - what I wanted to call out was that he found it difficult to even find how it was structured, but he ended up saying that he thought it was impressive how many
(18:21):
countries you cover - basically, that you've got very large coverage with the algorithm that you've written, based on what he was able to detect. I thought that was actually really exciting. It sounds to me like you're not just coming to market in the United States; you were thoughtful about how this could be used across the world. Is that how you would categorize it?
Nabanita De (18:42):
Yeah, absolutely. Thank you. First of all, thank you so much for sharing this. I think the thought process was that, because I'm a privacy professional - even if I'm on the startup side, on the industry side - at the end of the day, I want to prioritize user privacy. My thought process there is: there is so much talk around privacy and how you can safeguard privacy, but can we
(19:05):
empower general people first and sort of give them a window into what they could do and empower them with these kinds of tools?
So, right now, another thing that I haven't mentioned is that Privacy GPT also works if you don't have access to the Internet because, again, the data never leaves your system; it sits entirely on the browser side, so you can use it when you
(19:28):
may be in transit and you want to quickly put something in and redact it. It also works in that respect. The second aspect, from the privacy standpoint, is also... that I know that when data is truly anonymized in that sense, then different privacy laws like GDPR do not really apply. That's another angle from that standpoint as well.
Debra J Farber (19:49):
I think that's really thoughtful, and I think it's also indicative of, well, your past experience, where you've had to work across so many different countries to get your efforts and your foundations to have the maximum benefit for people. So, do you plan to build a similar product around other
(20:10):
LLMs too, or is it going to be around ChatGPT specifically, and that's it for LLMs?
Nabanita De (20:17):
No, I think we actually got a lot of great feedback when we launched on Product Hunt, and we were Top 4. A lot of people wanted us to build around multiple LLMs, multiple browsers, add more features to it. So, we are looking into all of those things, and that would actually form part of the paid pilot process that we are
(20:38):
launching. So, if anybody's interested, please feel free to reach out to me.
And then, as part of building more products - the space of detecting sensitive information, I think, plays into so many different areas within privacy. So, now that I have this algorithm built out, where I can accurately detect sensitive information, I plan to leverage
(21:00):
that in many other spaces, like data inventorying, and eventually leading to doing DSARs appropriately, and things like that. So, there are multiple other use cases that this algorithm will sort of federate into.
Debra J Farber (21:12):
Okay, awesome. Let's talk a little bit about what it's like to come to market. You mentioned Product Hunt. I see that Privacy GPT ranked 4th globally on Product Hunt. Congrats, first of all, on having so many eyeballs on the product and then, of course, votes for it; I do have a few questions about that.
(21:33):
So first, can you tell us a little bit about Product Hunt and how startup founders leverage it to prove market interest? And then, I think most of the products there are consumer products, but not all of them necessarily. So, B2B privacy and AI folks, like myself, could benefit from learning a little more about how Product Hunt is used, if you
(21:54):
don't mind.
Nabanita De (21:56):
Yeah. So, Product Hunt is essentially a platform where people launch their products to an audience of people who generally use multiple sorts of products. Ideally, it's like a voting-and-ranking place where, once you launch the product, there's a community of people who are looking through those products of the day and then, if they resonate with what you have built, they'll upvote your
(22:17):
product, leave comments, and reach out to you.
So, it's a great place to sort of put what you have built out there to see if there is a general market need for it; and you can also specify what you're building. So, I have seen some B2B products also go onto Product Hunt because, essentially, a lot of folks from different Fortune 500
(22:38):
companies also go on Product Hunt to look at cool products of the day and then maybe even reach out if they saw something that resonated with them and they're interested in trying the pilot out. I would say that this is a great space if you have a cool product that's been built out. You can sort of prove the niche market and release it.
(22:59):
It doesn't have to be consumer-focused, and you can add a demo or something around what your product does and what it's trying to do. Then, that could be a great place to get the word out in terms of what you're building, what you're doing; and if people are interested, they could potentially reach out and talk to you.
Debra J Farber (23:16):
Awesome.
So, do you find that it's typically, you know, Heads of Innovation or internal business folks that are reaching out, or do potential investors reach out as well?
Nabanita De (23:28):
So, I've actually gotten both from our Product Hunt launch. I've had investors follow me and reach out. I've also had general company folks from different Fortune 500 companies reach out and show interest in our paid pilot - they have actually signed up for the paid pilot through the product. I linked our paid pilot on the Product Hunt page.
(23:51):
I would say that a good amount of... that's a place usually where a lot of... at least at Antler, I had seen a lot of people do Product Hunt launches. So, I would say that's something that startups frequently use to launch their products and sort of get some quick feedback from the market, I guess.
Debra J Farber (24:10):
Okay, awesome. Thanks for that. I'm learning more and more. This has been a crazy market. You know, I've been focusing on privacy tech for the last three years, and it's not hard to notice that raising money in this current economy has been a challenge. Do you think it's easier to raise money in the current economy if your product is related to buzz-worthy concepts
(24:34):
like AI, or is it about the same? Is it really difficult? Tell us a little bit about what the raise process has been like for you.
Nabanita De (24:42):
Yeah, I think we have been focused on finding product-market fit and talking to many people and, you know, iterating on our products and building. So, essentially, we haven't really gone into the raise part of the phase as aggressively as we would like yet; but, I would say that, just based off of talking to different investors and
(25:06):
people and hearing from them, especially in the Antler cohort and in different other... like New York Tech Week - I also went to that when I was in New York for the Antler cohort. I think just talking to them made me realize that, if your startup is actually providing some sort of value and solving a clear niche area where you have a waitlist of people who are
(25:30):
interested in buying what you're building - are they interested in becoming a build partner or something like that? I think showing that level of traction and having a clear defensibility as to why you stand out, having a clear moat in terms of what your unique selling point in this space is and what your unique insight here is - having
(25:51):
those things is very crucial when you're raising; and having that, I think, makes it easier to raise. Then, I think for early startup founders, that is something they sort of have to go through by talking to many people, building, seeing the traction they have, seeing who is going to use it. So, it's a process to get there.
I personally do not feel like just building in privacy or
(26:14):
building in AI automatically qualifies somebody to raise. I would imagine that each founder will have to do some sort of due diligence themselves to really put that together in the form of a pitch deck or whatever method they're using to reach out to investors, angel investors, or VCs. And once they have that and they have that conviction, and
(26:34):
VCs on their end are able to do the due diligence and validate that this is a scalable, venture-backable business, I think then it becomes easier to raise.
Debra J Farber (26:42):
Yeah, that makes a lot of sense. You know, it's been so hard to raise for privacy tech. I do wonder, though, if pairing that with AI - like, "Oh, we have a privacy tech solution that helps in the AI space" - is kind of a sexier topic, or if investors are just willing to spend bigger buckets on the space of AI right now. Well, you know, it's definitely something that we're kind of
(27:05):
observing as it's happening, so we can assess that over the next year. What words of advice would you give for other software engineers that might be seeking a transition to founding a privacy or AI startup?
Nabanita De (27:20):
Yeah, I would say, especially in the privacy space, I think there's still a lot of education that needs to happen. Something I've observed is that I have to really gauge my audience in terms of: am I talking to somebody who is in the privacy space and knows about data inventory and DSARs and cookies and all of that stuff? For somebody who is not in the privacy space, if we use the
(27:42):
same technical jargon, they are like, "What?" So I guess you have to really think about who the audience is and really figure out a sweet spot so you can sort of simplify what you're really saying and they can understand it better.
For software engineers, essentially, I think we bring in a lot of the technical hat and expertise. Something I have
(28:04):
personally learned over time is to take off my software engineering and builder hat and start really thinking from a product, from a business, standpoint because, at the end of the day, as software engineers, we just want to build and ship and find those cool products that resonate in the market. But, when you move over to the startup side, you don't have
(28:26):
hundreds and billions of dollars to go spend and experiment. You are running pretty lean. You have limited funding, limited budget, so really prioritize what you are building, what you are doing. In the privacy space, I would say there are a lot of different problems that need solving, and anybody who is interested in this space - first of all, come talk to me; I'd love to talk to
(28:46):
you. And the second thing would be to really think deeper in terms of: what is the first problem you would solve in this space? Because it's a huge space. How would you prioritize it? Then, how do you bring that value to the customers that you're building for? How does it resonate with them? And then, sort of quickly iterate and build through - really actively, consciously take off your builder hat and
(29:11):
really get into the first principles of the 5 Whys, I guess.
Debra J Farber (29:15):
That's really good advice - kind of going back to basics with first principles. I think that makes a lot of sense. I've also seen a lot of engineers, in the privacy tech space at least, really deeply see one particular problem at the company they're at and go, "Gosh, I'm tired of dealing with this problem at my company. If I can make this, maybe I could turn this into a product
(29:37):
so that I solve the problem not only for this company, but I could then sell it to others." Right? And that has also been a great launching-off point. But then, the challenge is, "Well, I'm not necessarily sure I understand the privacy tech market and how the product gets purchased and how..." There are just growth opportunities, I think, no matter how you're jumping into becoming a founder - especially,
(30:00):
what I have seen is software engineers that get knowledgeable on privacy in one particular area, like maybe advertising and privacy and the nuances of the ad tech space. Right? Like, "Oh, I know exactly that; I know this tech stack; I know how to solve these problems, and this is how we can fix it," but then again, you don't necessarily know all of the
(30:21):
problems that different privacy personas might have, and how do you get them motivated to help get buy-in for the product?
Nabanita De (30:29):
I think knowing the space is very helpful in talking to customers. So, like you mentioned, if somebody has a personal story there, that they felt the need of a problem and then they're solving that, I think it becomes so much easier when you're interacting with other people in the space because you intuitively understand what they're saying, what kind of
(30:49):
problems they have. So, you can ask better questions in your product discovery calls. I would say that's an edge that engineers in this space who have already done some privacy AI work would have when they think about starting their startups because, first of all, they have the network to tap into and talk to. Second of all, they intuitively understand how these different systems work, what kind of issues they have seen, and,
(31:13):
overall, I think, tying that to your personal mission - tying that to what you're trying to solve and then building that out - I would say that's also an excellent point that you bring up. That would be a great start as well.
Debra J Farber (31:23):
Great. Okay, so we're getting closer to Data Privacy Day 2024 - you know, at the end of January. So, I definitely wanted to ask you about privacy awareness, especially because I see on your website that Privacy License is building out a Privacy Champions Program. I'm curious: how does creating a Privacy Champions Program fit
(31:47):
into your mission for Privacy License? And then, what are your goals for this program, and how do people join?
Nabanita De (31:55):
I think the reason why I created this Privacy Champions Program
is I see there are so many people who can benefit from being in
the privacy space.
Essentially, privacy becomes a thing that is only relegated to,
you know, privacy managers or privacy lawyers, whereas
essentially the entire company should be contributing to some
(32:15):
sort of privacy task, because they do deal with sensitive
information.
So, my goal behind that program is: can I empower entire
companies and different stakeholders in privacy to understand
how they contribute to privacy, what tasks they could do, and
sort of be the champion for privacy in their individual teams?
(32:36):
That way, when your privacy team comes to you and says, "hey,
I need you to build this for GDPR," you know exactly why that
needs to be built, and you can be the champion for privacy in
your organization and incorporate privacy by design.
To answer the question on how can somebody join...
so, there's a link if you go on my website right now -
(32:57):
PrivacyOS.ai - to sign up for the Privacy Champions
Program, and you can sign up for that.
Once you sign up for that, we will be reaching out to you very
soon in terms of joining the program and your expectations.
My goal is to set you up with a community of people who are
also in similar spaces as you, and
(33:17):
you are, at the end of the day, the Privacy Champion for your
team, for your organization.
Debra J Farber (33:23):
Yeah, that's great.
So, tell me a little bit more about what this Privacy Champions
Program looks like.
I obviously get the idea that we want to get champions from
across different companies - people will join this program.
Just tell us a little bit about once they join.
What can they expect to learn or to bring back to their
(33:43):
organization?
You mentioned, for instance, the GDPR and doing DSARs.
What does that mean exactly?
Is it a matter of you taking requirements from various
legislation, like GDPR or CCPA, and contextualizing what
they mean, and kind of having a library - I don't mean a
(34:04):
software library, I mean just a library of privacy knowledge
areas that someone could become more knowledgeable about, and
that kind of standardizes across companies?
Or, are you thinking of something else?
Nabanita De (34:16):
Yeah, I think that's something that I am doing in a different
space within Privacy License, but my goal for the Privacy
Champions Program is for somebody who is not in privacy but
ideally wants to join.
I try to understand their motivation in terms of what they are
doing right now, and then basically provide a list of
recommendations - or a
(34:38):
weekly dose of recommendations - of things that they could do in
their organizations to prioritize privacy more.
For example, think about data teams, maybe in smaller
organizations, where they might be doing their own version of
privacy or something of that sort.
But being able to say, "okay, if you want to get
(34:58):
compliant with GDPR, for example, it necessitates that each
team has their own version of an inventory, and that can bubble
up to a company-wide inventory; then you can have an accurate,
up-to-date RoPA (Record of Processing Activities), so your
Legal teams are not running after you."
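The idea of team-level inventories bubbling up into a company-wide RoPA can be sketched in code. This is a minimal illustration only - the field names, team names, and data below are hypothetical examples, not Privacy License's actual product or schema (the fields loosely follow what GDPR Article 30 asks a RoPA to record):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProcessingActivity:
    """One entry in a team's data inventory (hypothetical schema)."""
    name: str
    purpose: str
    data_categories: List[str]
    retention: str
    team: str = ""  # filled in when the entry is rolled up company-wide

def build_company_ropa(
    team_inventories: Dict[str, List[ProcessingActivity]]
) -> List[ProcessingActivity]:
    """Bubble each team's inventory up into one company-wide RoPA."""
    ropa: List[ProcessingActivity] = []
    for team, activities in team_inventories.items():
        for activity in activities:
            activity.team = team  # record which team owns the processing
            ropa.append(activity)
    return ropa

# Hypothetical example: two teams each maintain their own inventory.
inventories = {
    "Data": [ProcessingActivity("analytics", "product metrics",
                                ["usage events"], "13 months")],
    "HR": [ProcessingActivity("payroll", "employee compensation",
                              ["bank details", "salary"], "7 years")],
}
company_ropa = build_company_ropa(inventories)
```

The point of the sketch is the structure, not the schema: each team owns its own entries, and the roll-up step is mechanical, which is what keeps the company-wide record accurate and up to date for Legal.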
So, for me, it's like somebody maybe who joins from a Data team.
I can be like, "okay, you can do these XYZ things to be prepared
(35:21):
a little bit ahead of time," so that way, when you do come
across all of these additional things, you already know why
that's happening.
So, helping them understand a little bit more about the
different customer profiles - ideally, how do you contribute to
privacy?
How do you think about privacy by design?
And then also, across the industry, what are other people who
(35:45):
are in the same roles as you doing in privacy?
So, having that sort of community as well.
So it's like doing both at the same time.
Debra J Farber (35:53):
That's pretty awesome.
So what I'm gathering, then, is that a Privacy Champion doesn't
necessarily need to come from, like, a GRC team.
It literally could come from any technical team or business
team, or just wherever somebody wants to be the champion for
privacy and bring back knowledge to their organization, and then
(36:15):
this is kind of a place for them to start.
Nabanita De (36:17):
Yes, so it could be literally somebody in HR, or somebody
non-related and non-technical; but at the end of the day,
anybody who touches sensitive data should be thinking about
privacy.
So, my goal is: how can I empower those individuals?
Debra J Farber (36:33):
I love it.
I love that you're helping to shape a better world, so thank
you for your service.
Before we close today, do you have any calls to action for the
audience?
Nabanita De (36:47):
Yeah, like I mentioned, we have our paid pilot up.
So, we'll be launching very soon.
If you're interested in the paid pilot for Privacy GPT, go
to the website, PrivacyOS.ai, fill out the paid pilot sign-up,
and I will reach out to you individually.
I would also love to talk to people.
(37:08):
If you're listening to this and you're interested in talking to
me and joining the Privacy Champions Program, and you want to
learn more about how you or your organization can benefit from
it, please feel free to reach out to me on LinkedIn.
Also, if you just want to talk about privacy and you feel like
there are some burning privacy needs in your organization that
(37:29):
you've looked through multiple solutions for but nobody is
solving, I would love to know what those things are.
Come talk to me.
Feel free to reach out to me and I would love to jump on a
call.
Another call to action would be: I also have a newsletter on
privacy.
It's on LinkedIn and it's free.
It's called 'Nabanita's Moonshots.
' Feel free to subscribe to it.
I try to share my tidbits around privacy, around consumer
(37:53):
rights, around how you can safeguard sensitive information,
principles for deletion laws, and a bunch of legal, technical,
and GRC - multiple sides of privacy.
So, feel free to give it a follow.
Debra J Farber (38:07):
That's pretty awesome, and so I'm going to put all of those
links in the Show Notes so that everyone can access them, and I
wish you good luck on the rest of your journey here.
I'll definitely be following it.
I plan to join the Privacy Champions Program, so, you know, I
look forward to being part of the journey with you.
Nabanita De (38:26):
Yeah, Debra, you're already a Privacy Champion.
I feel like you're doing such cool work, like this podcast
itself.
You bring on such great people, and I always learn so much every
time I hear a new episode of your podcast.
So, I am truly honored to be on this podcast and share my
journey with you, and I'm looking forward to all of the wonderful
(38:47):
stuff that you will continue to do in this space.
Debra J Farber (38:50):
I really
appreciate that.
Thank you, and you know I definitely want to have you back
on in the future to check in and see all the great progress
that you've made.
Nabanita De (38:59):
Thank you so much
and thank you everybody for
listening to this.
Debra J Farber (39:02):
All right, well,
Nabanita, thank you so much for
joining us today on The Shifting Privacy Left Podcast.
Until next Tuesday, everyone, when we'll be back with engaging
content and another great guest.
Thanks for joining us this week on Shifting Privacy Left.
Make sure to visit our website, shiftingprivacyleft.com, where
you can subscribe to updates so you'll never miss a
(39:23):
show.
While you're at it, if you found this episode valuable, go
ahead and share it with a friend.
And, if you're an engineer who cares passionately about privacy,
check out Privado:
(39:32):
the developer-friendly privacy platform and sponsor of the show.
To learn more, go to privado.ai.
Be sure to tune in next Tuesday for a new episode.
Bye for now.