April 2, 2024 43 mins

Today, I’m joined by Amaka Ibeji, Privacy Engineer at Cruise, where she designs and implements robust privacy programs and controls. In this episode, we discuss Amaka's passion for creating a culture of privacy and compliance within organizations and engineering teams. Amaka also hosts the PALS Parlor Podcast, where she speaks with business leaders and peers about privacy engineering, AI governance, leadership, and security, explaining technical concepts in a digestible way. The podcast aims to enable business leaders to do more with their data and gives the community a way to share knowledge with one another.

In our conversation, we touch on her career trajectory from security engineer to privacy engineer and the intersection of cybersecurity, privacy engineering, and AI governance. We highlight the importance of early engagement with various technical teams to enable innovation while still achieving privacy compliance. Amaka also shares the privacy-enhancing technologies (PETs) that she is most excited about, and she recommends resources for those who want to learn more about strategic privacy engineering. Amaka emphasizes that privacy is a systemic, 'wicked problem' and offers her tips for understanding and approaching it.

Topics Covered:

  • How Amaka's compliance-focused experience at Microsoft helped prepare her for her Privacy Engineering role at Cruise
  • Where privacy overlaps with the development of AI 
  • Advice for shifting privacy left to make privacy stretch beyond a compliance exercise
  • What works well and what doesn't when building a 'Culture of Privacy'
  • Privacy by Design approaches that make privacy & innovation a win-win rather than zero-sum game
  • Privacy Engineering trends that Amaka sees; and, the PETs about which she's most excited
  • Amaka's Privacy Engineering resource recommendations, including: 
    • Hoepman's "Privacy Design Strategies" book;
    • The LINDDUN Privacy Threat Modeling Framework; and
    • The PLOT4AI Framework
  • "The PALS Parlor Podcast," focused on Privacy Engineering, AI Governance, Leadership, & Security
    • Why Amaka launched the podcast;
    • Her intended audience; and
    • Topics that she plans to cover this year
  • The importance of collaboration: building a community of passionate privacy engineers and addressing the systemic issue of privacy


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Amaka Ibeji (00:00):
Early engagement is about trust, and trust has to be earned, and it could be a process. It all starts by saying: we don't lead the conversation with technology. The conversation really starts with the problem we're trying to solve. For each conversation, we want to be sure that we are embedded, because, as I say, privacy engineers wear multiple hats;

(00:24):
we're always context switching. You talk to the HR team; the next moment you're talking to a data engineer. You're constantly context switching. So, one of the things we need to understand is, once we understand the problem we're trying to solve, the next thing is we begin to ask ourselves: How can we solve for this? What techniques should we be using?

(00:46):
And then we get to technology. I say, for you to be invited over and over into the room early on in the conversation, you have to be a pleasure to work with. You need to know when to ask the right questions; you need to know when to listen, to absorb; because, at the end of the day, shifting privacy left is all about the win-win-win

(01:08):
situation.

Debra J Farber (01:09):
Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the

(01:29):
bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome everyone to Shifting Privacy Left. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest, Amaka Ibeji, Privacy Engineer at Cruise.

(01:50):
In this role, she architects and engineers robust privacy programs and controls, but it goes beyond implementation of rules and regulations. She strives to create a culture of privacy at Cruise. Amaka's interests span more than privacy. She's passionate about privacy engineering, AI governance,

(02:12):
leadership, and security; and recently started her own podcast called the PALS Parlor Podcast. PALS is an acronym for Privacy Engineering, AI Governance, Leadership, and Security. Today I'm really excited to chat with Amaka about her career; how organizations can achieve privacy compliance

(02:34):
without sacrificing innovation; her new podcast; and then some trends that she's seeing in the world of privacy engineering.
So a big welcome to you, Amaka.

Amaka Ibeji (02:44):
Thank you so much, Debra, for having me.
I'm glad to be here.

Debra J Farber (02:48):
Excellent, excellent.
Well, I think it makes sense to just start off with your privacy origin story. You started out in security and moved into privacy engineering, so tell us a little bit about what motivated that transition and how you went about making that change.

Amaka Ibeji (03:02):
That's a great question.
I actually started out my career as a software engineer. I was really fascinated by leveraging technology for business process improvement. That fascination did not last long, as I became aware of cyber criminals who can exploit vulnerabilities in applications,

(03:24):
either to gain unauthorized access, corrupt the data, or steal information. I decided to explore the world of cybersecurity simply to understand the tools, techniques, and mindsets or motivations of cyber criminals or, to put it mildly, threat actors.

(03:44):
My goal at the time was to learn enough to enable me to build robust applications, and I jokingly say it's been over a decade and I haven't made my way back. I know; rather, I have evolved my career into privacy engineering.
So for me, beyond protecting IT infrastructure, going to

(04:09):
humanizing the data we collect and process, it's beyond, you know, external threat actors to also looking into internal threat actors, both within the organization and those we choose to partner with. For me, you know, at the end of the day, when you look at my career, underpinning all of this is really about accelerating

(04:32):
business objectives while advocating for users through respect for them and user experience. So, that has been my journey, and I think every move has layered upon previous experiences. I'm excited and, of course, in the wake of AI, getting

(04:53):
into AI governance is just such a delight, because privacy plays a critical role in the development of AI.

Debra J Farber (05:02):
Absolutely.
Do you mind talking about some of those overlaps with AI?

Amaka Ibeji (05:06):
Yes, when you look at privacy engineering, one of the things you think about is that we're actually looking from the user's perspective. What are the experiences of the users? You could do that with traditional applications. AI comes with an amplification of this risk.

(05:27):
Of course, it also comes with the benefit; so, we begin to look at issues around bias. How might that impact the user? You think about the data that is collected to train AI models. You begin to ask yourself: what's in the data? How can we anonymize this data before it gets into a training

(05:49):
data set? Those are some of the core questions we look at when we have AI-related applications to review. Those are the overlaps with privacy engineering, especially in the world of AI governance.
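
To make that anonymization point concrete, here is a minimal, purely illustrative sketch of a pre-training redaction pass. The regex patterns, placeholder tokens, and sample record are invented for this example; regex scrubbing alone is pseudonymization at best, not true anonymization, so real pipelines layer vetted de-identification tooling and human review on top.

```python
import re

# Toy sketch: scrub obvious direct identifiers from free text before it
# enters a training data set. Patterns and placeholder tokens are
# illustrative only; regexes miss many identifiers in practice.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched identifiers with typed placeholder tokens."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

record = "Contact Bob at bob@example.com or +1 (206) 555-0199."
print(redact(record))
# Contact Bob at [EMAIL] or [PHONE].
```
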

Debra J Farber (06:03):
That makes a lot of sense.
I really like that customer obsession aspect of privacy as well; and you make really excellent points about the overlap. So, you're no stranger at all to innovation. I mean, I love your career, because you started out doing a lot more around looking at technology and processes and people, and kind of moving into...

(06:24):
...security really doesn't look at the people's perspective as much as from a system perspective.
And so, when you moved to Microsoft, you were in its Research and Incubations organization and you focused on embedding compliance by design - so a much broader mandate than just privacy - in strategy, partnerships, and research initiatives. Honestly, that sounds like a dream job to me, just for the

(06:47):
perspective you must have had. I mean, not a dream job in the compliance sense, but for the mandate of what your job was. What did you learn from this compliance-focused experience that has helped you in your current privacy engineering role, now that you've moved on to Cruise?

Amaka Ibeji (07:03):
I loved the job at Microsoft, and one of the reasons is, while I was at Microsoft, I reported to an amazing leader who had deep legal expertise and, although we were focused on privacy compliance, we found she was pulled into partnership

(07:25):
conversations early.
One of the things that afforded me the opportunity was to introduce early engagements - you know, drive the notion of early engagement even in partnership conversations - because for us to have a collaboration, Microsoft has the technology, but we'd really have to partner with organizations that have the data.

(07:47):
Sometimes, these can be drawn-out negotiations, because from a compliance point of view, these partner organizations want to do right by their customers and the data that has been collected.
However, when you enter into the world of privacy enhancing technologies, you can creatively start having conversations,

(08:11):
constructing how to drive innovation while protecting these individuals.
So, that in itself gave me the opportunity to see how partnership conversations can be accelerated leveraging privacy enhancing technology. So, that was really a light bulb moment for me, really

(08:32):
exciting, and I love every piece of it. That was where I truly cut my teeth on the conversation of shifting privacy left, taking it really early on in the conversation.

Debra J Farber (08:45):
I think that's awesome.
That's great.
You know, in a lot of organizations that discussion is still being had. Right? We're trying to make the change to shift left, and it's great to hear... Microsoft's been pretty good at having shifted left into privacy for many years. I think it kind of had some challenges at one point, maybe 15 years ago - I'm ballparking it here - and then they

(09:07):
responded to the market by hiring a lot of privacy folks, especially in technology. So, I think they've led the way a lot of the time on good governance when it comes to staffing federated teams of privacy experts, like across trust and safety, as they're building products and services. It's really great to hear that you've been able to apply what you've learned there in your next job.

(09:28):
I think, especially now, it's perfect timing, because this is when the privacy enhancing technologies are really scaling - not just new ideas coming out of research; they're really at the point of being able to deploy them in companies a lot more easily. That's great. Compliance is important, but it doesn't always make for strong

(09:48):
privacy or even effective security. Right? Sometimes, it could be looked at as just checking boxes to get through: yeah, I did a requirement; yeah, we did a check; you know, we did a policy. So, what advice do you have for engineers who want to shift privacy left into design, architecture, engineering, and data science, to make privacy actually impactful and not

(10:09):
just a compliance exercise?

Amaka Ibeji (10:11):
That's a great question, and I like to start by setting the stage with the notion of 'best fit' rather than 'best practice,' because a lot of what we talk about is best practice. However, best practice informs best fit. When we talk about best fit, it starts with the notion of: let's look internally.

(10:32):
What are we doing? And get intimate with the problem we're trying to solve. I always say the conversation should start from there: what is the problem we're trying to solve? What goal are we trying to achieve? And so, once we get there, especially for privacy engineers, we have to get to the point where we're being invited into

(10:54):
the room early, and I say it very often that early engagement is about trust, and trust has to be earned, and it could be a process. It all starts by saying: we don't lead the conversation with technology. The conversation really starts with the problem we're trying to solve. For each conversation, we want to be sure that we are embedded,

(11:18):
because, as I say, privacy engineers wear multiple hats. We're always context switching. You talk to the HR team; the next moment you're talking to a data engineer. You're constantly context switching, and so one of the things we need to understand is, once we understand the problem we're trying to solve, the next thing is we begin to ask

(11:39):
ourselves how can we solve for this, what techniques should we be using? And then we get to technology. And I say, for you to be invited over and over into the room early on in the conversation, you have to be a pleasure to work with. You need to know when to ask the right questions. You need to know when to listen, to absorb, because at the end

(12:00):
of the day, shifting privacy left is all about the win-win-win situation. The team has an accelerated path because they get insight into what to do early. You also buy time as a privacy engineer to do deep research, depending on the problem you're trying to solve. Ultimately, the business and end users enjoy the benefits.

(12:25):
I always say, "A well-designedsolution should be very seamless
, easy and technology to takethe backstage - it shouldn't
even be seen.
So, it's an experience that theusers would long for and
appreciate.
That's going to be my advice.

Debra J Farber (12:46):
I think that's great advice.
It's a great jumping-off point to now turn to the next conversation, which is about culture. So, you're passionate about creating a culture of privacy within an organization and within engineering teams. One of the hardest things to do in a company is to change its culture. I know because I've read lots of articles on it. I know because I've tried to change a culture before, even in

(13:10):
a much smaller company than Microsoft. So, I'd definitely like to hear about your successes and failures - what has worked well and what has not worked well when trying to cultivate a culture of privacy.

Amaka Ibeji (13:23):
That takes me back to my days in consulting. You know, I had some experience at Deloitte as a consultant. I would say that, first of all, as a privacy engineer, you get to wear many hats. One of them is as a coach, and the reason I bring this up in relation to culture is, while we have policies and procedures,

(13:48):
my experience from working in consulting and observing first-hand the culture of organizations is that people like simplicity. Anything that enables them to achieve their goals, they will go for it. So, that being said, our policy documents by themselves do not drive culture change. People understanding what can go wrong begins the shift.

(14:14):
And when I say, as a privacy engineer, one of the hats you should constantly wear is that of a coach - for every interaction that you have, it is a coaching moment. What you do is ask: what are you trying to achieve? You get embedded, but you're coaching, leading people to

(14:35):
understand what can go wrong, and that starts the conversation. However, you see that the conversation of what can go wrong also has its limitations.

(14:50):
I think a better narrative is: "How can we unlock more with a modified approach?" That is where the conversation begins to get exciting. While we have this policy, I always say policy documentation should be reference material. In everyday practice, in our approach, I always say that our

(15:15):
practices must match our policies. In practice, we need to understand what can go wrong and what we can do better to unlock more, leveraging an improved, enhanced approach. That's why I talked about best fit. So, it's not about "we need to do X, Y, Z in terms of compliance and this is best practice."

(15:36):
By the way, I always see compliance as the minimum step. We always need to go above and beyond and say, "To achieve more, to achieve this, what do we need to do?" And one of the ways I like to do this is when I get into a room and the business team tells me, "This is what we are set to
(15:57):
achieve.
I ask the question "What else?
What else are you looking toachieve?
Because that begins the realconversation.
Everybody has something prettyto say about their goals, but
sometimes there might beembedded notions that you're not
saying out loud.
You need to be able to read theroom and ask "What are we truly

(16:17):
trying to achieve?" Once you get to the heart of the matter, that is where change starts from; and, once they can trust you enough to let you in on what they are trying to achieve, you can begin the conversation of doing your deep research to ensure that we can achieve this within the guardrails that you provide for

(16:40):
them.
At the end of the day, I think you have that through partnership... and that's when they begin to think about you even early on, from the conceptualization phase, because you bring value to the conversation.

Debra J Farber (16:54):
I think that's really great advice.
I really do. So, following up on that: as you're bringing innovative products and services to market that bring value to customers - and those could even be internal customers - what are some approaches that you've seen to make privacy and innovation a win-win rather than a zero-sum game?
Right?

(17:14):
Obviously, this is privacy-by-design language here. What are some of the ways that you would suggest we could do that?

Amaka Ibeji (17:22):
It's clear.
I always say that humans are very innovative and creative. I don't always go into a room with the notion of going there to be the solution provider. I lead the conversation. The reason I say that is, the same creative ability people use

(17:44):
in bypassing controls, they can still channel that creative approach to complying with these controls if they see the bigger picture, if they see how much more we can achieve together. And so, when I get into the conversation - especially from looking at... of course, I have my privacy by design principles in hand -

(18:07):
I'm not just going to read that out; it's going to be embedded in my conversation. It's going to be embedded in how I lead the team. When I say you should go in wearing the hat of the coach, when you ask those deep questions, it's not just asking the question - I'm leading the team on a journey. When we talk about things like privacy by default, we want to

(18:31):
understand, "Have we accountedfor the users that will be using
this application or thisproduct?
That's the first question.
Do we have outliers that we maynot be accounting for?
Because we can be in a roomwhere we're accounting for 90%
of the users who will be usingthis application.
"What if there are outliers?

(18:51):
How do we account for what might be of interest to them? When we have this conversation, we come back to the drawing board and say, "This is what we know as of today. This is how we're going to design the product. However, we will make this product customizable so that, if there is a population we're not accounting for, we're empowering

(19:15):
them to make adjustments as they see fit." So it's all about coming in with an open mindset to design, but also with some notion of understanding who the typical user will be, and also thinking about who the outliers might be

(19:36):
for this application and what is important to them.
One more thing - we might know people's interests as of today. Context changes over time, even for the typical user. We should also ask ourselves: when their contexts change, how are we empowering them to ensure that this product still serves

(20:00):
their needs at all times? That is the heart of the conversation, and that is the heart of designing the product that goes to market. That's where I come in, and that's the kind of conversation I love to lead.

Debra J Farber (20:13):
I think that is what we need more of, for sure, and I think we need more people. I mean, it's part of culture change to understand that privacy professionals, especially privacy engineers, are there to empower teams to make choices that are going to meet the customer obsession. We're not here to detract from a product or say no.

(20:36):
We're here to help companies achieve their innovation while still meeting the needs of a broad swath of customers - not just a main persona, but the edge cases - and thinking about the future, too. I like what you said about making it customizable in case there's something you hadn't considered in the future.

(21:00):
Awesome.
So, are you noticing right now - and this is a broad question to you - any trends in the privacy engineering space? This could be broad, or it could be applicable to a particular product you're thinking of, but what stands out to you in the world of privacy engineering?

Amaka Ibeji (21:22):
I mean, I think we've talked about this even on this call, but I would say it again: yes, the growth of privacy enhancing technologies - that is one area that excites me - and the number of players in the space is one to watch, because we're beginning to see productization of privacy enhancing technologies. So, I think this is the right time for privacy engineers not

(21:42):
just to look out for the products coming, but to understand, at the core, the concepts; because, at the end of the day, to design a solution, it's not just one-size-fits-all. You may have to do some combination of those. So, understanding the core concepts of privacy enhancing technologies, and looking out for the vendors in the space, the

(22:04):
players in the space, is amazing. In fact, last night I was reading an article on the impact of PETs on business, individuals, and society; and one thing that caught my attention was a quote from the article that says, "The PETs market is expected to reach a value of $25.8 billion by 2033."

(22:26):
" There is a lot of movement inthat space and, to me, I'm truly
excited and really watchingthat space very closely.

Debra J Farber (22:36):
I'm just as excited as well.
I think there's a lot there.
I've covered a lot on this program around privacy enhancing technologies used in the data science space. That's clearly unlocking the ability to use data that previously has been considered like plutonium - you've got to lock it down and can't use this data because it would violate

(22:56):
privacy. Right? But these privacy enhancing technologies would enable the use of that data for analytics while preserving privacy. But there are also other use cases, right? There are privacy enhancing technologies used to prevent confidentiality leaks. There's, you know, maybe in testing - product testing - and stuff along those lines, for masking. What are some of the privacy enhancing technologies you're

(23:18):
most excited about?

Amaka Ibeji (23:21):
A ton of them. In fact, I just posted something on LinkedIn saying that I'll be talking about privacy enhancing technologies in the days to come. Differential privacy is exciting: releasing analytics in a way that cannot potentially identify an individual in the data set. That's exciting from a data

(23:42):
release point of view, especially when we're looking at queries.
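
To ground that idea, here is a minimal sketch of a count query released with the Laplace mechanism, the simplest form of differential privacy. The epsilon value, survey records, and field names are invented for illustration; production systems use vetted DP libraries rather than hand-rolled noise.

```python
import numpy as np

def dp_count(records, predicate, epsilon=0.5, rng=None):
    """Differentially private count via the Laplace mechanism.

    A count query has sensitivity 1 (adding or removing one person
    changes the result by at most 1), so Laplace noise with scale
    1/epsilon gives epsilon-differential privacy for this one query.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for r in records if predicate(r))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative survey data; names and fields are made up.
survey = [{"team": "eng", "burned_out": True},
          {"team": "eng", "burned_out": False},
          {"team": "hr", "burned_out": True}]

# Released statistic: a noisy count that hides any one respondent.
print(dp_count(survey, lambda r: r["burned_out"], epsilon=0.5))
```
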
Another exciting one for me is homomorphic encryption and the way that it's designed, where we could drive an encrypted query into a secure data set without having to

(24:04):
reveal this data set to external actors, but enabling them to get the insight that they want. You know, we've seen a lot of these across organizations - different organizations coming together. You talk about law enforcement maybe wanting to query a financial database owned by a bank, and so they send, probably,

(24:27):
a query into there... to be able to pick up the activity of one user. So, rather than letting the financial institution in on the insight that I'm looking to find out information about Bob, for example - because that is also a privacy leak - they could encrypt the query for Bob and search into the

(24:50):
financial institution's database and return the query back to the search party in an encrypted manner, until they're able to get the insights that they want.
Now, this is also something that we should consider internally. For example, if there's a litigation and employment legal

(25:12):
wants to query a data set, they shouldn't have to pass the name to an analytics team or a product owner to be able to get insight about that employee. They should probably be able to perform an encrypted query, get the insight back to employment legal, and then that piece of

(25:34):
information stays with them, instead of saying, "Hey, can I get the information of Bob?", because that's already a leak that something is going on with Bob. So, this is really exciting - not just across organizations, but also to think about internally within organizations. You think about employee surveys.

(25:55):
Currently, we rely a lot on confidentiality around employee surveys. How about we take away the trust component and ensure that we can get deep analytics without revealing who does what? So there are a lot of exciting notions that we could take not just across organizations, but even within organizations,

(26:16):
especially for organizations that are large and are scaling.
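
To ground the encrypted-query idea, here is a minimal sketch of one classic construction: the querier sends an encrypted one-hot selection vector, and the data holder computes on ciphertexts without ever learning which record was selected. It uses the open-source python-paillier (phe) library, an additively homomorphic scheme; the toy database and names are invented, and a real deployment would add padding, batching, and key management.

```python
# pip install phe  (python-paillier, additively homomorphic encryption)
from phe import paillier

# The querying party (e.g., employment legal) holds the keys.
public_key, private_key = paillier.generate_paillier_keypair()

# Toy database held by another party; names and values are made up.
names = ["alice", "bob", "carol"]
values = [1200, 3400, 5600]

# Querier encrypts a one-hot selection vector for "bob". The data
# holder sees only ciphertexts, so it cannot tell who was selected.
target = "bob"
enc_selector = [public_key.encrypt(1 if n == target else 0) for n in names]

# Data holder computes sum_i Enc(s_i) * v_i using the homomorphism:
# ciphertext + ciphertext, and ciphertext * plaintext scalar.
enc_result = enc_selector[0] * values[0]
for enc_s, v in zip(enc_selector[1:], values[1:]):
    enc_result = enc_result + enc_s * v

# Only the querier can decrypt; the data holder never sees the query
# target or the returned value in the clear.
print(private_key.decrypt(enc_result))  # 3400
```
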

Debra J Farber (26:19):
I think that's great, and those are some really key examples. I appreciate that. Besides our podcast shows - and we're going to talk about yours in the next question - what are other resources, so books, newsletters, Slack communities, you know, anything like that, that you'd recommend to those interested in privacy engineering?

Amaka Ibeji (26:40):
Yeah.
You know, I'll start with a quote. I don't even know where I heard this, but it says, "A professional never gets bored with the principles." So I'd say, start with the privacy by design principles. Have them close to heart and let them mean something for you, even in your design approach. One book that I really like - and I think it's free - is

(27:02):
"Privacy Design Strategies byJapp-Henk Hoepman.
I hope I'm pronouncing thatcorrectly.
That's an amazing resource, andalso this is the right time to
begin exploring privacyenhancing technologies.
There are a ton of materialsout there, and for me, one of
the commitments I'm also makingis also to really post on

(27:23):
LinkedIn fairly daily to talk about these concepts as I encounter them, to drive quality conversation. Let's understand the challenges and also empower those who are coming behind us to take this and build robust applications.
At the end of the day, we do not have any organization in a

(27:44):
silo.
We are part of an ecosystem, and enabling others across different organizations to develop robust applications is as important as the applications or the products that we build within our own organization. So, amplifying this work is something that I'm taking up as a challenge, and it's been very inspiring coming on this journey

(28:07):
so far.

Debra J Farber (28:09):
That's awesome, and I'll put a link to Hoepman's Privacy Design Strategies book. I also want to do a call-out for Jason Cronk's book, "Strategic Privacy by Design," which really takes Hoepman's Privacy Design Strategies - which gives you all of these approaches and tactics based on your use cases: are you going to minimize, are you going to separate?

(28:29):
How do you address the privacy challenge from a strategic perspective as you're designing your products and systems? Jason does an amazing job in his book "Strategic Privacy by Design, Second Edition" of really educating, taking that work and the work of other academics, to make it practical

(28:50):
and show how you can actually approach this in your organization. So, I'll add a link to both of those in the show notes.

Amaka Ibeji (28:56):
Maybe I can add two resources here that would be very helpful. While I was speaking, I talked about what can go wrong, and addressing what can go wrong is really about understanding the privacy threat landscape. One very amazing resource is the LINDDUN framework. The other one, as we begin to step into the world of AI, is to also look at PLOT4AI.

(29:17):
You know, they give questions that we could use in driving these conversations and understanding what kinds of answers we are getting and what we can do with those answers. So I think those two resources will be amazing, and I can send the links after this.
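
For readers following along, LINDDUN takes its name from its seven threat categories. Below is a minimal sketch of how a review team might encode them as a checklist; the elicitation questions are paraphrases for illustration, not official framework text (see linddun.org for the real threat trees).

```python
# The seven LINDDUN threat categories, with sample elicitation
# questions paraphrased for illustration only.
LINDDUN_CATEGORIES = {
    "Linking": "Can two or more data items be tied to the same person?",
    "Identifying": "Can a data subject be singled out from the data?",
    "Non-repudiation": "Can a person be prevented from denying an action?",
    "Detecting": "Can an observer infer that someone uses the system?",
    "Data disclosure": "Is personal data exposed beyond what is needed?",
    "Unawareness": "Are users unaware of what is collected about them?",
    "Non-compliance": "Does processing violate policy or regulation?",
}

# A trivial review loop: walk each category for a system element.
for category, question in LINDDUN_CATEGORIES.items():
    print(f"[{category}] {question}")
```
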

Debra J Farber (29:31):
That'd be great.
That'd be great, and those are two topics we've covered in detail on the show. We've had Isabel Barberá talk about PLOT4AI, which was kind of an extension of the LINDDUN framework but applied to AI systems, so it's a little broader than privacy. And then, the LINDDUN framework - we've had Kim Wuyts on, who

(29:52):
was a real driver of that before she moved on to, I think, PwC now. She's a real driver of that academic work, and LINDDUN now is the premier threat modeling framework for privacy. Absolutely, I think everybody should go check that out. Those are really two great resources, so thank you for highlighting them. Let's talk about you. Let's talk about your new show, the PALS Parlor Podcast.

(30:13):
What motivated you to launch this, and who is it for?

Amaka Ibeji (30:16):
It's all about amplifying the conversation we're having. When I look at PALS, PALS is an acronym that pulls together my experience and interests over time: Privacy engineering, AI governance, Leadership, and Security. And again, it was exciting that I could come up with this

(30:42):
acronym, because when I speak to folks, I always want to speak to people as pals, as friends, and amplify this conversation. When I launched the PALS podcast - the PALS Parlor Podcast - one of the audiences that I had in mind was really business leaders: taking technical concepts and breaking them down for business leaders to really engage and see

(31:04):
what they can unlock within their organization, enabling business leaders to do more with the data that they have within, you know, reasonable guardrails. That was my intent. The second set of folks is peers - you know, pioneers in this space. How can we come together to ensure that we're challenging

(31:27):
our thoughts, improving our thinking, and coming together to bring about more creative techniques that we, in isolation, haven't thought about? So, that was really my motivation for getting started with the podcast.

Debra J Farber (31:43):
Well, that's awesome; and, as one of your peers, I'm delighted to collaborate with you in the future on things. I think there are a few privacy engineering-focused podcasts, and my goal, too, with this show, with Shifting Privacy Left, is to help build community and get followership out of the desire to serve the community, not for any personal desires

(32:04):
larger than that. I want to serve the community of privacy engineers, help it grow, and be an evangelist. I just think that that's a perfect role for me; it just amplifies, you know, who Debra is: Debra's an evangelist for privacy engineering. Right? I think that's great. I think it's a perfect place to come from, with this desire to

(32:25):
help: not only sharing knowledge, but offering a platform and a way for people to come together and combine efforts to really make an impact in our own organizations, but then also the industry at large. So that's awesome. What has your experience been so far, how many episodes have you recorded and published, and what's the future?

(32:46):
Where do you see this going for the rest of the year?

Amaka Ibeji (32:49):
So far, we have four episodes out - a combination of solo episodes and episodes with guests on the show. When I talk about the PALS Parlor: growing up, we always called the living room the parlor, where we host guests; but it's also a place where family members and friends

(33:11):
meet and have conversations. So, there will be a combination of me talking solo about concepts, and also bringing in peers and pioneers in the space to have the conversation. We will be continuing that momentum for the rest of the year, hoping that we can answer the deep questions that people have, because it's also a reaction to the questions that

(33:33):
I get from my daily LinkedIn posts. One of the top ones is people within security wanting to do more in the privacy space and asking what resources they need to make that leap. So those are the sorts of questions we will be answering, while also addressing some concepts within the privacy

(33:55):
space as well.

Debra J Farber (33:56):
I think that's great, especially since we were just talking about the LINDDUN threat modeling framework, and most threat modelers out there are in security. I mean, that's really where it kind of grew up. So, from threat modeling you get red teaming and pen testing and all the different tests that you can do, and so it'd be great to really educate the security community at large,

(34:17):
especially those that have this threat actor... basically, hacker mindset: how can you exploit, so that you can prevent that?
Right?
My fiancé is a hacker himself.

[Amaka (34:26):
Oh, wow!] Yeah, yeah! So, he works on bug bounty programs and such, and so I totally see the value of getting a lot of security folks to really understand: "Hey, there are not only threats now coming from AI, but there's this whole field of privacy." I think our challenge in getting the hackers to start thinking about threat modeling

(34:48):
for privacy is that there really hasn't been a cohesive set of "Top 10" vulnerabilities for privacy.
Right?
They've been lofty goals, like "you don't want a breach"; they're not things you can really test at a technical level. So, I do know that there are some folks that are thinking about maybe working with OWASP, or thinking about how we could

(35:10):
get a set of vulnerabilities that are truly privacy vulnerabilities, that we can then have as a community to vet for and threat model for. I think LINDDUN's a great place to start. It really is kind of a vulnerability model that allows for threat modeling, where you can see what the result of that threat being exploited would be.

(35:31):
Oh okay, this is the problem.
Right?
It really is a vulnerability standard as well, and so it'd be great to kind of work backwards: what are those specific things that we could test for? Do you have any thoughts around that? Have you seen anything there, or are you thinking about that?

Amaka Ibeji (35:47):
I like that you explained it that way, because one of my goals is really to build a community of boundary spanners, because we always say that you cannot have privacy without security. We want a situation where we have boundary spanners - people who understand the security domain but also have a rich

(36:07):
understanding of the privacy domain, especially of the threat actors. We don't want people who are silo thinkers. Of course, silo thinking enables expertise, but we need boundary spanners who can understand the big picture; and that's why we're driving this conversation to reach people that don't necessarily call themselves privacy engineers but

(36:29):
also understand the deep concepts and appreciate all of that. I think, with the work that you had just mentioned around the OWASP Top 10, we could have something similar in privacy, or a combination. I always lean towards an integrated approach; that's always very important. So, we save time; we make it seamless; we want to do a

(36:51):
review; we have the security folks and privacy folks in the room having this conversation, and no one is lost in the conversation because there's an appreciation of the threat actors. We're context switching from external threat actors to internal threat actors, and there's a respect for that across the board.

Debra J Farber (37:11):
That is true.
That is awesome.
I love that.
And, again, any way that we could collaborate, or get others listening to this to collaborate on that effort, the better the industry would be, I think. So, speaking of collaboration, what's the best way for folks to reach out to you?

Amaka Ibeji (37:28):
To reach out to me, it's on my LinkedIn page. I say I respond faster to LinkedIn DMs than I do to my email.

Debra J Farber (37:39):
So do I! Oh, that's awesome; and I do want to point out, you've got this goal of posting, on a daily basis, at least one educational privacy engineering post. Right? Do you want to talk a little bit about that?

Amaka Ibeji (37:50):
Yes, I thematically think through what I post. For example, we talked around the LINDDUN framework for about a week before we moved into PLOT4AI, like you rightly said, because PLOT4AI was building on the LINDDUN framework.

(38:11):
So, I post thematically, and I try to do it daily so that it's bite-sized. For this incoming week, I'm looking at privacy enhancing technology. Then there will be a sequel to that. For example, we could talk about how privacy enhancing technology can impact data breaches. With the SEC disclosure notification - that's

(38:33):
something that's a hot topic I'm also interested in - and also partnering with board cybersecurity to look at the disclosures, the 8-Ks and the 10-Ks. How can we begin to look at them, and what do they even mean for Chief Privacy Officers and Privacy Engineers?
So, those are areas we're going to be unpacking in the future

(38:54):
and also looking at risk management. What does it mean, having found this risk? What is risk response supposed to look like, so that privacy engineers are not overwhelmed by finding risk and not knowing what to do next afterwards? So, I like to organize my posts thematically, in a way that

(39:15):
makes sense, and people can always go back and follow through the sequels. So, yes, I'm really excited about those.

Debra J Farber (39:23):
That's really cool.
Anything else that you'd like to plug or say to this audience before we close? And besides that: everybody, go check out the PALS Parlor Podcast.

Amaka Ibeji (39:34):
Yes, I think I'd like to close with something I learned from Brad Lee (from Privatus Consulting): "Privacy is a wicked problem." It's a systemic problem, so we need systems thinking to address privacy problems; and so, when we think about it, we need to ensure that we're thinking from a systems mindset and building

(39:59):
robust applications. One way to address a systemic problem is by collaborating deeply, bringing relevant stakeholders to the table, and also ensuring that we get to a point where the solutions that we come up with satisfy all of the resources and the constraints that we have at that time, to get to a good-enough

(40:22):
solution.
I think that will be awesome.

Debra J Farber (40:25):
That's excellent. Amaka, thank you so much for joining us today on The Shifting Privacy Left Podcast. I think your passion for the space and your enthusiasm really shine through, and I'm excited to listen to the PALS Parlor Podcast, hear more of your wisdom, and to actually meet you in person someday soon. I mean, we're both in the same state, in Washington State; and,

(40:47):
you know, I hope our paths cross in person, because I think your passion matches mine and I'd love to collaborate somehow.

Amaka Ibeji (40:54):
Thanks, Debra, for having me. This was so much fun and a pleasure to be on your show.

Debra J Farber (40:59):
Oh, thank you. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest. Thanks for joining us this week on Shifting Privacy Left.

(41:09):
Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show.
While you're at it, if you found this episode valuable, go ahead and share it with a friend.
And, if you're an engineer who cares passionately about privacy,

(41:25):
check out Privado: the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai.
Be sure to tune in next Tuesday for a new episode.
Bye for now.