Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
All righty, then.
Ladies and gentlemen, welcome back to another episode of Privacy Please.
I'm Cameron Ivey, over here with Gabe Gumbs as always, and today we have a special guest with us, Sonia Siddiqui.
She is a privacy and cybersecurity leader and lawyer in emerging technologies.
She's an ex-Coinbase employee.
(00:20):
But, Sonia, I'll let you kind of chime in and just tell the listeners a little bit about yourself.
We'll go from there.
Speaker 2 (00:30):
Yeah, sure.
So first of all, thank you both for having me on this show.
I'm a big fan and very excited to be, you know, a guest now.
My name is Sonia Siddiqui.
I am present-day fractional privacy counsel and founder of my own practice, Tamarack Solutions, where I advise tech companies and crypto companies on building scalable, compliant
(00:50):
privacy programs.
I'm also a former chief privacy officer at Kohler and, as you mentioned, Cameron, I was also head of privacy and security legal at Coinbase for a bit, and an early joiner there, and I bring a business-aligned lens and practical application as it relates to privacy and tackling all the challenges that come in this space.
Speaker 3 (01:12):
Love it.
Welcome to the show.
It's a pleasure to have you.
Speaker 2 (01:15):
Thank you.
Speaker 1 (01:17):
Before we get into the meaty-greedy, or... that's not even a saying. What was the dream before becoming a lawyer?
Like, where did that all come from?
Speaker 2 (01:29):
Man, where did it come from?
So, the dream before becoming a lawyer was I wanted to be an architect.
But you know, it turns out I'm really bad at math.
I do love design.
I'm terrible at math.
And so that dream got deferred.
And then there is a little bit of a holy trinity still in South
(01:49):
Asian culture about what you can be, and so it's doctor, lawyer, engineer, and so which one doesn't use math, right?
And so I joke, and it sounds sad, but it was fine.
I really do think I landed in what is truly my calling, which is to be a lawyer.
It may not have been the most idealistic path, but I really love what I do.
You know?
Speaker 1 (02:10):
Love that.
So we were talking offline, and I'd love to get kind of into it.
Gabe, you had a... you kind of had a question.
We can start off from there.
Speaker 3 (02:19):
First, kind of a, you know, a follow-up introductory question.
Also, you know, you built your career on that intersection of law, technology and privacy, as we were talking about offline, and I was curious what drew you to it.
I mean, you mentioned kind of the natural progression of, you know, maybe by elimination, what else was left for you to really explore and exploit as an intellectual human.
(02:39):
But what drew you to the field?
Because at any given point you could have done like some of the black sheep in the family and said, you know what? I'm not going to be a doctor, lawyer or an engineer; I sure should have tried.
Speaker 2 (02:53):
Yeah, no, and that's a great question.
I think, in particular about ending up in sort of the privacy and cybersecurity field, it happened as I was going through law school.
During my time in law school there was a case that came up in New York around surveillance of minority communities, particularly Muslim communities, and it was quite controversial at the time, and there was a lot of
(03:14):
litigation around it and sort of questions around civil rights.
And I've always been kind of interested in the civil rights side of the world and had been exploring internships and clerkships with civil rights firms in law school and taking on some of those issues.
And this one was interesting because it really came at the intersection of the inherent right to privacy that human
(03:35):
beings have and civil rights, right, like tying those two things together, and I found it so fascinating and so interesting, and it really spoke to me just as an individual and as a person, as a part of that community.
As I graduated law school and began to explore my professional path, I started trying to figure out what I wanted to do, and I think I mentioned this before.
Part of it was truly something I'm passionate about, and some of it was luck and timing, right.
(03:56):
And so, as I was graduating law school, it was on the heels of, sort of, well on the heels, I don't know, but GDPR was about to come into effect.
Essentially, we had seen the ePrivacy Directive come down, and we were kind of in this limbo space, and so I met some privacy professionals and started kind of realizing that this is
(04:17):
something that actually could truly be a meaningful career for me, to really dig in on the privacy side of things.
And so I started my career over at Grant Thornton, which is a consulting firm, and helped start up that actual privacy service offering there.
And this came, again, in 2017 to 2018, as GDPR was going into effect and then, of course, CCPA and so on.
(04:38):
I just loved that.
I mean, candidly, we all have to be able to survive and make a living, and so I love that I was able to marry that fundamental need that I had with something that I really love, which is really contemplating the human need for privacy and how do we preserve that as we continue to evolve as a civilization, really.
Speaker 3 (04:59):
Follow-up, another one of those interesting intersection and interplay questions.
So you've expressed an extremely strong interest in the interplay between emerging technologies and privacy. Blockchain, for example.
You worked at Coinbase early on.
I know you've expressed some pretty thought-leading ideas
(05:20):
around things such as ML and AI as it pertains to training data and privacy also.
So what do you see as the most promising or concerning applications in those emerging tech spaces?
Speaker 2 (05:34):
Help me understand a little bit more.
Promising, concerning applications of...?
Speaker 3 (05:37):
Yeah, so in the technology world, there's some things that are very promising as it pertains to privacy and privacy protections, and then there are others that are a bit more detracting from that, if you would, namely adding a net positive effect to the world.
And so are there any emerging technologies, whether it be blockchain or AI in particular, that you see having the most,
(05:59):
either concerning applications or the most promising applications?
Speaker 2 (06:04):
Yeah, I honestly don't know.
I don't know if there's anything that really, like, sticks out to me.
I think there are some that just kind of challenge, and we talked about this, things that kind of challenge the norms and our expectations of what it means to be privacy protective.
And I think one of the things that we continue to see, particularly with privacy enhancing technologies, particularly on the blockchain,
(06:29):
is the tension between anonymization and legal obligations, right? And, like, how do you contend with that?
And, like, is the world really truly built for people to be truly anonymous?
And I don't know.
Those are just, like, philosophical questions, I guess.
So nothing really sticks out to me.
But those are kind of just my general, macro thoughts in this space.
Speaker 3 (06:47):
Sure.
So, more specifically, if I were to pick explicitly on one or two things in particular: let's take privacy regulation, maybe GDPR, and then let's take biometric data.
Do you see that as having more promising applications in the real world, or more concerning applications?
Speaker 2 (07:05):
Yeah, and that's a great... that's a great question, and I think it's something I keep coming back to over the course of my career, examples like that. And I'll preface this with, you know, views are my own, and so anything I say is really just my own thoughts and views and nobody else's.
And so I really do think that technology is evolving faster than regulation can keep up, and I think we hear that a lot.
(07:26):
A lot of people say that.
But what does that really mean?
And I think a really good example of this, touching on biometrics, is Tools for Humanity and Worldcoin.
Right, users look into this orb.
It scans their iris, it generates a unique digital code, and from there they're issued Worldcoin tokens.
The company behind it argues that the system's anonymous, that there's no raw biometrics that are stored, and it can't be
(07:47):
reverse-engineered to figure out who you are.
But under GDPR, the very act of scanning and creating a biometric template is still considered processing personal data.
So there's a real, clear clash here, where you have this crypto project that's engineered to minimize exposure of personal information, but a regulatory framework that still treats it
(08:08):
as traditional data processing.
And so it's an unanswered question, really, and there's some push there around how do we solve for that.
I think that same dynamic exists when we think about AI, when we think about server-side ad tech, you know, other parts of tech. I've worked in crypto for a long time now, with transactions being pseudonymous and permanent, and that immutability directly
(08:30):
colliding with GDPR's right to erasure, right.
These are not new things, but they remain challenges, unanswered by regulators.
Speaker 3 (08:37):
Right, right, right, right.
So let's keep pulling on that thread a little bit more.
I've been in InfoSec the better part of my entire adult life, and my less-than-adult life wasn't terribly far from when Bill Clinton signed the larger Computer Security Act, et cetera.
I'm dating myself a little bit there, but here's the thing that
(08:59):
I'm always torn with.
I agree that technology moves way faster than the law, but I'm torn between whether or not that's a good thing, or if it's just something we'll have to live with.
I tend to lean towards... I think it's a good thing, because one of the things about technology moving so fast is we also leave a lot of technology behind us. For all the things that we know of today
(09:20):
that stay with us, that we think of as being ubiquitous, there's tons of protocols and other technologies that, you know, if we had spent the time to have legislated and regulated them, it's equally hard to say whether or not we would have evolved to the technology that we have today.
And so how do you balance that?
How do you balance the need for technology to move as fast as it does with what I think, anyway,
(09:41):
is a requirement for the laws to be interpretable broadly under some reasonable human's expectation of what it reads to be?
Speaker 2 (09:52):
Yeah, and that's such a great point.
I think the issue doesn't necessarily lie in the fact that regulations often come prior to the innovation that creates a new need, right?
I think that, sure, there's this challenge that these laws were written for a world of file cabinets and databases, and now
(10:13):
we're in this new world of zero-knowledge proofs and AI and blockchain.
What I think this really comes down to is, as we think about this from a legal perspective, it's not necessarily rewriting the law, in turn, so much as it's allowing for new interpretations and understanding the spirit of the law, right?
So GDPR is the most comprehensive privacy regulation
(10:33):
that we continue to have globally, but how do we allow for those standards, those principles, to continue to be upheld but also applicable in somewhat of a new world order?
Speaker 3 (10:48):
Yeah, that's a great point.
I appreciate the allowing for the interpretation to be updated for the world that we're in, versus simply trying to rewrite things, because for as much as the world has changed, a lot of it genuinely is the same under the surface.
Speaker 2 (11:05):
Now, for that to happen, though, Gabe, regulators and those that are interpreting the law need to be very savvy, and this is something I've said before, and knowledgeable on the technology and how it operates.
I think, even with the EU Data Act, right, we see somewhat reductive requirements for very complex organizations, right,
(11:30):
and so I think it does require a level of study, and I think the most successful practitioners in these spaces will be the ones that really commit to having a deeper technical understanding that underpins their legal knowledge as well.
Speaker 3 (11:49):
That makes sense.
We were talking offline, and I think there is definitely a little bit of friction between the law and the understanding of how the technology works.
Right, if I were to make a blanket statement that a biometric model constitutes a privacy risk, I'm of the opinion that absolutely it does.
I know that there are different models that have different risk levels of exposure, and there are a few that have zero.
(12:12):
But how you get to the model still has this middle ground of: you have to train the model, and so while in the process of training the model, you will have something that is exposed to risk.
But I do see sometimes where the legislation feels a little too broad.
Because if I were, in this conversation, to try to say, you know, like a linear model or regression model, you know, a naive Bayes model... I apologize, I know math is...
Speaker 2 (12:35):
I'm bad at math.
Speaker 3 (12:38):
It's OK, for what it's worth.
I joke all the time: there's lots of things I'll do in public. Math is not one of them.
You want me to get absolutely naked and run through a field? Sure. You want me to do math in public? You got the wrong guy.
But there's some models that have no risk.
They contain literally no biometric data.
But think about the models that people use in their homes every day, to bring it home a little bit more. Voice recognition models are a great example.
(13:00):
They're trained on voice recordings and they create unique fingerprints, such that those embeddings can be reverse-engineered for the purpose of re-identification, because they contain unique voiceprints.
I don't have an expectation that the law should get that detailed, but again, like, how do we balance that?
(13:20):
Maybe you're right, maybe the answer is more education on the legislation side, but new technologies are really starting to blur the lines of where you can apply that thought logic.
Right, like it's not just the file.
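A minimal sketch of the re-identification risk being described, assuming toy three-dimensional embeddings (real speaker models use hundreds of dimensions):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical enrolled voiceprints and an 'anonymous' clip's embedding.
enrolled = {"alice": [0.12, 0.98, -0.33], "bob": [-0.77, 0.10, 0.55]}
unknown_clip = [0.10, 0.95, -0.30]

# Re-identification: the nearest enrolled embedding names the speaker,
# which is why such embeddings can function as biometric identifiers.
best_match = max(enrolled, key=lambda name: cosine(enrolled[name], unknown_clip))
print(best_match)  # -> alice
```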
Speaker 2 (13:34):
Yeah, it's a new calculus, right? So, and...
Speaker 3 (13:38):
I don't.
Speaker 2 (13:38):
I guess I don't have an answer for this either, but there has... well, not right now on this show, but there has to be a more meaningful debate, discussion and analysis of these novel technologies and how they process data. And, I mean, having been a practitioner for a dozen years here, voice
(14:01):
recognition models are a cakewalk.
We can run that through GDPR, we absolutely can, and we'll get an answer, and we can mitigate the risks, and we can silo the data on the back end, and we can make sure it's only processed for certain reasons.
Yes, we can do that. But voice is identifiable, and it's really kind of hard to de-identify that and still have a voice recognition model, right?
(14:22):
So that's an example where that processing is so clear, right.
And we're entering territory where the processing isn't as clear, right? Like, even when we think about server-side ad tech: we're no longer storing user-level data.
It's instantly anonymized.
It's never identifiable to begin with.
However, if we take this sort of stricter
(14:44):
understanding of how this is addressed by regulation, if the data passes through your systems, even fleetingly, it's still processing.
But is it, right?
How do we deal with that?
How do these companies think about this stuff?
And I think these are things companies are contending with every day, and they're finding answers.
But, you know, some leadership in the space from the regulators,
(15:04):
some indication, would also be something that would be super, super helpful, and it'll come from somewhere.
I think sometimes these things are industry-led, sometimes they're regulator-led.
I think here we may see some industry leadership that'll emerge, and it's in their interest, right, to define that, and my hope is that it comes together.
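A minimal sketch of the "fleeting processing" question, assuming a hypothetical ingest pipeline: the user identifier exists only inside one function call, yet under a strict reading even that transient handling is processing.

```python
from collections import Counter

aggregate: Counter = Counter()  # only coarse, non-identifying counts persist

def ingest(event: dict) -> None:
    """User-level data exists only for the lifetime of this call."""
    # Drop the identifier immediately; only a bucketed dimension survives.
    bucket = (event["country"], event["ad_id"])
    aggregate[bucket] += 1
    # event, and its user_id, goes out of scope here; a strict GDPR
    # reading treats even this fleeting handling as 'processing'.

ingest({"user_id": "u-123", "country": "DE", "ad_id": "a-9"})
ingest({"user_id": "u-456", "country": "DE", "ad_id": "a-9"})
print(aggregate)  # Counter({('DE', 'a-9'): 2}) -- no user IDs stored
```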
Speaker 1 (15:21):
I want to keep pulling on this real quick, Gabe.
So obviously we're pretty well familiar with the concept of privacy by design, and since we're talking about, like, novel technologies, AI, how it's always moving extremely fast: from your point of view, you know, you've been in-house counsel,
(15:41):
you've done professional services for firms, how are we able to apply privacy by design with these technologies?
They're always evolving; that must be an extremely challenging thing to kind of navigate.
Speaker 2 (15:54):
Yeah, I think so.
I think it's: how do we apply the principles of privacy by design to an evolving tech?
I think the principles themselves... well, privacy by design itself is like GDPR principles.
Speaker 3 (16:06):
Mm-hmm.
Speaker 2 (16:08):
And I want to talk about this very practically, because I've been in-house. I think a lot of companies, and I've been at varying types of companies, some companies will treat it really as a compliance checkbox, and those companies, they may be in a different growth stage, or they may not be as enmeshed in technology development or product development, and so maybe that's, candidly, that can
(16:28):
be okay for some companies.
I'm not going to say that everyone needs to have these holistic, beautiful, well-engineered programs. Privacy by design isn't meant to inhibit innovation.
It really is truly the sense of embedding those reviews in the
(17:00):
development life cycle, sort of continuously.
And so when we think about AI models, this could mean assessing the risk at each stage: training, deployment, fine-tuning. Because what's low risk right now and today may become high risk tomorrow, even given evolving and new attack vectors and techniques, right.
(17:21):
And then for blockchain, again, it means designing by default with things like, and this is work I've done, zero-knowledge proofs and pseudonymization and selective disclosure from the outset.
It has to really, really be a living process.
With AI moving as fast as it is, you can't just sign off on a single privacy review and call it done.
You really do need ongoing guardrails built into the
(17:43):
development lifecycle, so as the tech shifts, privacy safeguards kind of shift with it.
I really do see this... it's like a wave, right? Like there's always some ebb and flow, but there needs to be this kind of vibiness.
Can we call it that?
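A minimal sketch of those stage-gated, re-runnable privacy reviews in a development lifecycle, with hypothetical stage and risk names:

```python
from dataclasses import dataclass, field

STAGES = ("training", "deployment", "fine-tuning")

@dataclass
class PrivacyGate:
    """One review per lifecycle stage, re-run whenever the model,
    the data, or the threat landscape changes."""
    stage: str
    open_risks: list = field(default_factory=list)
    approved: bool = False

    def review(self, identified_risks: set, mitigated: set) -> bool:
        self.open_risks = sorted(identified_risks - mitigated)
        self.approved = not self.open_risks  # ship only with no open risks
        return self.approved

gates = {stage: PrivacyGate(stage) for stage in STAGES}
gates["training"].review(
    identified_risks={"pii_in_corpus", "membership_inference"},
    mitigated={"pii_in_corpus"},
)
print(gates["training"].approved, gates["training"].open_risks)
# False ['membership_inference'] -> the gate blocks until mitigated
```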
Speaker 3 (17:55):
Especially if we're talking about AI.
Speaker 2 (17:58):
We can definitely talk about vibiness. Yeah, right, that's my legal advice: it vibes. Yeah, okay, we're just gonna vibe our way right through all of it.
Speaker 1 (18:08):
And Sonia, to that same point, that's probably why it's not best to write laws, these state laws in particular, based on technology, right? Like, we don't want to write laws based on technology.
Speaker 2 (18:19):
Yeah.
Speaker 1 (18:20):
And I don't know if any states have done that before.
I haven't really done my research there, but I think that just makes a good point, too.
That's probably something that you would never want to do, just because of how quickly those things change.
Speaker 2 (18:32):
Exactly. It would be stale the day it comes out.
Speaker 1 (18:36):
Yeah, basically.
Speaker 2 (18:42):
I mean, thinking about that a little bit more, and always coming back to sort of this North Star of using a principles-based approach when addressing any of this stuff.
I mean, we think about this in cyber as well.
What are the principles that need to be regulated, right?
And it's risk, it's accountability, it's human impact, right.
So, you know, right now we have lawmakers that always chase, right, lawmakers often chase the shiny object.
(19:04):
Four years ago it was crypto, right now it's regulating AI, and the second you drive it off the lot, what does it even mean, right?
And so I love my metaphors, and they're always so inaccurate.
But a principles-based approach allows these legislations, these requirements, to really focus on, you know, regardless of the tool, regardless of the tech, what
(19:27):
rights individuals need to retain, what duties companies should bear when they process data: minimization, transparency, all that stuff, and then let regulators apply those principles flexibly to new technologies.
I think that's what we're calling for, right?
GDPR and CCPA are great regulations.
They really touch at the core of those principles.
Now the flexibility in the application needs to come, right?
(19:48):
It needs to be demonstrated, to kind of allow for indication to industries that innovation is okay, innovation is allowed, it won't be stifled; it has to be done responsibly.
Speaker 1 (19:59):
Yeah, agreed.
Have you guys heard of that horses-to-cars analogy with privacy law?
Speaker 2 (20:04):
Vaguely, but I don't remember it.
Speaker 1 (20:06):
I can't remember who actually came up with it, but I recently heard of it because of Steve Elkins, who wrote the Minnesota law.
Great conversation, actually; he had it with Ron.
Ron and Steve had a great conversation, and they were talking about, like, outdated frameworks, and how, you know, the analogy kind of highlights several key points about the current
(20:27):
state of technology and privacy law, and how you can't write old laws to new technologies, basically, is what I'm kind of talking about.
So it's pretty cool, because it's like horses to cars.
Back in the day, I think the old story was, and I've probably botched this, but it's that the laws of the horse were rendered almost useless.
Cars moved at speeds horses couldn't, didn't need to be fed
(20:49):
and rested, that kind of thing.
It's a pretty cool analogy for today, compared to, like, back then, when horses and cars were in that changeover, I guess.
Speaker 2 (20:57):
No, I agree, I definitely agree.
Speaker 1 (20:59):
It's pretty cool.
Speaker 3 (21:00):
So, Sonia, congratulations.
You've been recognized as a Fellow of Information Privacy, and so, what does leadership in the field of privacy mean to you?
Speaker 2 (21:13):
What does leadership in the field of privacy mean to me?
I am always humbled when people think that I am a leader in the privacy space.
What does it mean to me?
It means pushing the envelope a little bit on how everyone thinks about this stuff, and so, to me, it's pushing privacy professionals to think about these things in different ways.
(21:33):
As lawyers, I think, in-house lawyers especially, there's this notion of, like, get to yes, right, and I think that's particularly interesting for privacy professionals: like, how do you get to a yes, right?
And seasoned in-house privacy professionals have, like, tons of scars from battles on this.
But I think it's such a healthy exercise, and it really gets
(21:55):
back to the things that we're talking about, right, where you can kind of distill where you can no longer apply your expertise because of the limitations in front of you, right?
But it does really mean pushing the envelope, challenging... being a partner to the businesses that you support or the clients that you support, but also challenging them to think about things differently.
It means being willing to have hard conversations about what
(22:19):
the regulatory activity needs to look like.
It means, I don't know... I think those are the things it means to me.
I just have a lot of fun.
I think it's important to have fun when you're doing this work, and that's a big part of who I am and what I do.
I was telling my husband about this.
He's like, when would you stop working?
I was like, I don't think I would.
I just really like it.
Speaker 1 (22:37):
It's just fun.
So, yeah, right, especially because it's always ever-changing. I mean, that's the cool thing about this space. And, you know, other Muslim girls that might want to do what you do someday, getting to see someone like you be successful in the
(22:58):
position that you're in, it must be something that's just very, I don't know, rewarding, in the sense that you might have others that look up to you, that see, oh my gosh, I could actually do something like this.
That's really cool.
Speaker 2 (23:11):
Yeah, it's always weird when, like, I do have, like, younger high school, college-age females that come up and say, like, I want to be like you, and I'm like, what, me?
I think you're doing way better than me, and I hope they do surpass what I've done and where I've come to.
But yeah, it's always nice to hear. It's important... no, go ahead.
Speaker 1 (23:29):
I was just going to say it's important in this industry because, you know, the tech industry in general is full of more males than females, so that's why it's also very important, too.
Speaker 2 (23:41):
Yeah, and I really, really dug myself a niche here between law and cyber and tech.
I really was like, let's just find the hardest places to just find a seat.
Yeah, representation absolutely is important.
Diversity is important.
I think we're certainly seeing much more of it as we continue to... you know, as time marches on, but it was hard, I think.
(24:04):
The first, you know, early years of my career, I was often one of the few, you know, females of color, minority, and that's slowly changing, for sure.
And yeah, you deal with challenges all the time, in terms of unconscious biases to outright biases, and you kind of have to just navigate those, and you develop a thick skin.
(24:27):
But then you also realize that everything you do, everything I do, hopefully just makes a difference.
As long as I continue to practice in this space, I do hope to create space for more people like me and others.
Speaker 3 (24:38):
While we have you here, then, for those folks who do look up to you and may not necessarily have an opportunity to interact with you one-on-one with any regularity, is there one book or resource or podcast, not named Privacy Please...
Speaker 2 (24:53):
Privacy Please, Privacy Please.
Speaker 3 (24:55):
...that you'd recommend to someone who wants to better understand privacy and technology?
Speaker 2 (25:00):
Yeah, I am looking at my bookcase right now. There's two that stick out to me, and I'm looking at them.
One is The Unwanted Gaze by Jeffrey Rosen.
The subtitle is "the destruction of privacy in America."
I like that one, I do.
Okay, I have three.
(25:21):
Okay, that's one.
I think one that every privacy professional should have on their desk is Determann's Field Guide to Data Privacy Law.
Who doesn't love a field guide?
It is so useful.
It is the easiest reference tool that you can have on your desk. Like, if you don't have it, guys, buy it and just put it on your desk.
Speaker 1 (25:37):
Are these graphic novels as well?
Are there pictures, for people that like pictures?
Speaker 2 (25:42):
Unfortunately, no. You're talking to a lawyer.
Speaker 1 (25:44):
This one's a pop-up. Oh, it's a pop-up? Okay, I like pop-ups.
Speaker 2 (25:46):
And then the last one is Weapons of Math Destruction.
Gabe, you may have heard of that one.
Speaker 1 (25:52):
Yeah, math? I thought we didn't like math.
Speaker 3 (25:55):
We don't, but we like math destruction, though, so that's allowed. Yes.
Speaker 2 (25:58):
I don't know who that's by, but I can probably tell you. I love that one.
Speaker 1 (26:02):
While you're looking that up: Cathy O'Neil. Okay, what do you think is the biggest misconception from tech companies about privacy law?
Speaker 2 (26:14):
I think the biggest one is that it's just a compliance checkbox.
And so I think that, like, early-stage tech companies or, like, you know, founder-builders, I think the important thing is to think about privacy at the outset.
And I can say this: I really do
(26:35):
give some credit to Coinbase here on really contemplating privacy from the beginning, at least as far as my tenure there; beyond that, I haven't been there, so I can't really speak to it.
Privacy is such a core piece of the crypto industry itself, but Coinbase really held onto that as part of their design and their product development, and it kind
(26:58):
of really sits very neatly.
But the reason it does is because it was designed kind of at the outset, right? And so as companies continue to build, I mean, keeping that in mind, it's so much easier to consider privacy and have it be an enabler for the business when it's thought of as such, right?
When you think about it as a compliance factor, it almost always comes in as an afterthought, and then almost
(27:21):
always is very difficult to meaningfully implement, right?
I mean, Cameron, I know you're at Transcend. Think about all the companies that are looking for these privacy automation solutions, and the biggest pain point is that they haven't had anything up to this point.
And we've hit an inflection point where they need something, right? But at that point it takes so much more engineering
(27:43):
work to pull something together, versus having built it from the start.
But that comes from a mindset, right, and the mindset has to be, like, this is a core part of our fabric if we're processing personal information, if we want to be viewed as trustworthy.
Right, forget about privacy, think about trust, right? Just think about what it means.
If we want to be a trusted fiduciary of customers, if we
(28:06):
want to be a trusted partner for customers, if we want people to trust our product with their children, what does that look like?
Right, you don't have to use the P-word, right?
What does that look like?
And that should force founders, force developers, to be embedding those things that we won't mention, by design, right?
Speaker 1 (28:27):
Nail on the head, because, and I'm talking to you, cybersecurity leaders, I'm talking to CISOs and CIOs, people that are making decisions for a lot of larger companies that have privacy underneath them: privacy is an enabler, it is a way to help grow the business, just like you were
(28:47):
kind of stating. That's the way you have to see it, especially in today's world.
Speaker 2 (28:49):
Yeah, and I think talking to leadership about that often, again, comes back down to the word trust.
Right, it's user trust, it's customer expectation, and, I mean, we see all these surveys all the time, but when we ask customers what their expectations are when they're using a product, one of the top ones almost always is security and safeguarding of the data that they're sharing with these products.
Right, and what does that tie back to?
(29:11):
Right, it ties back to strong data protection controls and privacy by design.
Speaker 1 (29:15):
What do you think about that, Gabe? You're on that other side.
You're on that other side.
Speaker 3 (29:20):
I think it's true; it's why we started this podcast many moons ago.
The intersection of privacy and security cannot be overstated.
You cannot have security without privacy, and you cannot have privacy without security.
You can have some semblance of security, you can have some minor controls, right? So, you know, you can do things like apply identity controls, but that isn't privacy by any
(29:42):
stretch of the imagination.
But you also cannot have privacy at all without the security controls.
If you do not apply confidentiality controls and integrity controls and availability controls, you will not succeed in having privacy.
I think the challenge that I see, as someone who interacts with security leaders frequently, is they still see them as
(30:03):
separate problems.
They still see them as distinct and unique and apart from each other, even though they will acknowledge that there's an overlap.
I think they've strayed from their roots: that confidentiality, integrity and availability, as agreed upon as the core tenets of security, cannot be stripped out of the DNA of privacy.
Speaker 2 (30:21):
Yeah, I agree.
I mean, they're absolutely interdependent.
Speaker 3 (30:25):
Yeah.
Speaker 2 (30:26):
And it's the key, and I think a lot of companies get this right, right? You have to acknowledge that interdependency and be willing to work across lines.
I mean, especially when, sometimes, privacy programs are born out of legal, which is not abnormal still, but there has to be that cross-collaboration with the security partners, what I call the arms and legs of privacy.
Right, they're the ones that are going to be able to operationalize a lot of this stuff.
(30:47):
I'm just writing privacy policies, right? No, I'm just kidding.
Speaker 3 (30:52):
But you know, speaking of writing, you do have quite the reputation as being a writer and a thought leader.
How do you stay ahead of trends in this space?
Speaker 2 (30:59):
I listen to Privacy Please.
No, I do.
Speaker 3 (31:03):
You heard it here first, gentlemen. You heard it here first.
Speaker 2 (31:06):
I do, but I listen to a lot of podcasts.
I mean, I don't know, this is like the same question as if you ask somebody, how do you get in shape, and they tell you diet and exercise. The answer is always going to be the same, and nobody wants to hear it.
But I'm subscribed to a ton of newsletters that are privacy related.
I'm part of the IAPP.
I'm very lucky to have a very broad network of fellow CPOs and
(31:30):
privacy professionals.
I mean, Cameron saw me at the last conference.
I know a lot of people, and so I have a really good pulse, I will say, on what's going on and what is afoot and what could be, both here in the US and abroad.
And then, of course, I mean, my whole family, my kids, my toddlers, listen to podcasts too, so we all have our things we listen to.
So on my drives I'll usually have something on, helping me
(31:53):
stay up to date, not just with privacy but with tech, right, just pure, like, what's going on, particularly with the clients that I'm supporting, trying to stay up to speed with that as well.
So, I mean, there's no easy answer.
I guess we should say that ChatGPT plays a role here, but, I mean, I haven't really used it for keeping up
(32:13):
with that. Is that a bad thing to say?
Speaker 1 (32:14):
But I mean, it's just the way I can do it right now. It's just another... I mean, just like using Gemini, it's just an easier way to get research, and obviously you want to question everything, you don't want to... yeah.
Speaker 2 (32:27):
And there is value to the long form, right? So I think GPT, or these, um, LLMs, have value, they absolutely do, and it's not going away.
But I think when it comes to meaningful research and understanding, there is some legwork that should be done.
The traditional, good old-fashioned way, where you read that fourth paragraph down on that article, right? You go all
(32:48):
the way through.
Speaker 1 (32:50):
You know what? Okay, first of all, two things.
I think, like you were mentioning before, you know a lot of people. You're easy to like.
We met, we instantly just had a good vibe, because you have a good sense of humor, which I think also helped you in law school early on, and, like, dealing with law, because you said you had to have thick skin, but you also had a good sense
(33:11):
of humor, and I think having that also helps in a lot of different areas and jobs, but especially in the one you're in, being in law, because I know that that's probably, like, at least from movies and stuff that I've seen, it's like a... you know, it's a doghouse.
I saw Suits.
I don't know how realistic that was.
It's based on my life, yes.
But I do this all the time: I always forget the second thing that I was going to say,
(33:33):
which was the bigger point.
What were we just talking about before that?
The last thing that you ended on... reading, traditionally reading, writing, math... I forget.
I had a good point.
It'll come back to me, but I'm having a brain fart. See, I do this to myself all the time: I have two thoughts,
(33:54):
I say the first one, and I forget the second one.
Speaker 2 (33:57):
I do the same thing. Don't ask me to repeat something, I just cannot. Um, let me track back.
Speaker 1 (34:02):
Let me see, see if I can remember.
Speaker 3 (34:05):
Why don't I do the following: why don't I give you the opportunity that we rarely give guests? We like to turn the tables; ask us a question.
We've interviewed, at this point, countless numbers of privacy and security professionals.
Speaker 2 (34:18):
Ask away. I talk a lot about what I think is important to see in privacy professionals.
I've been in-house, and I've worn a ton of different hats in my years of practicing.
Curious, from your guys' perspective, what stands out to you?
I mean, I'm very honored to be here.
What makes you see someone and say, like, this is a person I want to know what they think about? This is someone that we see as our definition of success in the
(34:41):
industry?
What does that mean to you?
What does that look like for you?
Speaker 3 (34:43):
Raw, unfiltered passion.
Within the first answer to the first question you give them, it literally just comes spewing out of their very being.
They tend to lean in, they start really... you can see it in the body language.
The eyes light up, the shoulders come up, and they get a little close to the camera.
If you're in person, the same thing, too.
(35:05):
I have a strong appreciation for it, no matter what someone does.
But as it pertains to privacy and security in particular, I don't think it's a place to be blasé.
It's definitely not a place to be equivocating.
You either lean into this thing and you're here for it, or make room for others that are, and the folks we tend to interview, we usually choose them based on that
(35:26):
vibe. I will not call any names, and we've very rarely had to do this, but I can only think of maybe two guests where it's like, wow, that person has less passion than my house cats.
Speaker 1 (35:41):
Hey, don't let the house cat demeanor fool you, just because they seem like they're uninterested.
Speaker 3 (35:47):
That's a good point.
Yeah, yeah, given the right motivation, they get real interested, don't they?
Speaker 1 (35:52):
Let's be honest, they play with strings.
Speaker 3 (35:54):
Is that open?
Speaker 1 (35:56):
Okay, so I agree with Gabe, definitely passion, because it helps. Like, someone that has passion and curiosity... that's why I fell in love with this industry, and, you know, I came into, like, cybersecurity, and then we kind of molded into privacy, and then I fell into privacy in it.
I just love the complexity of everything on both sides, when
(36:18):
it comes to the law, when it comes to technology.
It's just fascinating to me.
There's so much: there's so much darkness, there's so much positivity, there's just so much going on back and forth.
So anyone like yourself that's passionate about it too, it's so fun to have these conversations and learn where you came from and why you, you know, have that passion as well,
(36:38):
and everybody has a different reason for it, too.
Everybody has a different backstory.
It's just very unique, and, to be honest, like, we're honored to have you here. First of all, we're just two goofballs talking about privacy and security, and, I mean, you guys, our guests...
(36:58):
when we have guests on, that's the real treat for us and, hopefully, for our listeners.
Sure, for sure.
Okay, I thought about what I forgot about.
What do you think about... all right: we have companies that still use these old-school, 10,000-page terms and conditions.
Is it hurting their company, or is it helping their company?
Well, maybe that's the wrong question to ask.
I think it's hurting it, but why do companies still... is it the
(37:22):
same mindset of, like, it's just a compliance checkbox?
Like, are those companies that are still doing it outliers?
And why do we still have that?
Why is that still a thing?
Because nobody reads that stuff.
Speaker 2 (37:35):
Yes, and nobody reads privacy policies, really. I mean, I do, all day, but... well, naturally.
Interesting question.
I have so many thoughts on this general space, from a terms-of-service, user-agreement perspective.
There's so many different laws that companies have to contend
(37:57):
with, and I think there is an element, a level of paperwork, that you need to bind customers to when things go wrong.
Right, these are promises from a company to a user and a user to a company.
Right, that's the contract.
That is the contract that needs to come into place.
And so I don't think there'll ever be the death of the 10,000-page user agreement, because, particularly in the US, which is
(38:18):
much more litigious than anywhere else in the world, without that there could be causes of action that would just lead to way too much liability for customers and, sorry, for companies, and so it is, to an extent, a protective measure.
And then, to the other extent, if we think about the customer, it's kind of a know-your-rights mechanism, to let them know what their rights are and what their rights aren't, right, and what
(38:40):
they're signing up for, and then it's the last step of love it or leave it, and most people don't read it, so they love it and they move on, right? And then you're bound to arbitration clauses, and that's really fun.
I hope all the lawyers listening to this just laugh at that.
But that's the user agreement side.
On the privacy notice and policy side, I think things are interesting.
I think after GDPR we kind of saw this pendulum swing, initially, to having, like, a lot of global, comprehensive privacy policies, like one policy, one notice to rule them all, and I think we're starting to see more of a jurisdictional approach.
And I think the driving force there from regulators is transparency and clarity to users.
And so what I do think we're seeing on the privacy side is
(39:21):
regulators being like, this doesn't make sense; like, I need one for, pick a country, I don't know, Australia, right? We need one that is specific to Australia.
We want to know where the DPO is in Australia, we want to know what their rights are, particularly, and so we're starting to see more, again, this breakout of jurisdiction-specific notices, or companies that host them sort of against the IP address, right?
So yours will populate, and it'll be the specific Australia
(39:43):
one or the Japan one or whatever. I think that trend is coming out now.
Maybe people have seen otherwise, but that's sort of been my observation doing this work, and so it's interesting, and I do think that is meant to help the end user.
I will say there are customers that absolutely do read the privacy notice, particularly when you're dealing in the financial space, right? I mean, I absolutely read
(40:07):
it for financial products.
So I do think you're seeing more and more of this in the other direction.
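A minimal sketch of that jurisdiction-specific serving pattern, with hypothetical paths, and with the IP-to-country lookup assumed to happen upstream:

```python
# Hypothetical notice registry keyed by jurisdiction; a real deployment
# would resolve the country code from the request IP via a geo database.
NOTICES = {
    "AU": "/legal/privacy/au",  # names the Australian DPO, local rights
    "JP": "/legal/privacy/jp",
    "EU": "/legal/privacy/eu",  # GDPR rights, EU representative
}
DEFAULT_NOTICE = "/legal/privacy/global"

def notice_for(country_code: str) -> str:
    """Serve the jurisdiction-specific notice, falling back to a global one."""
    return NOTICES.get(country_code, DEFAULT_NOTICE)

print(notice_for("AU"))  # -> /legal/privacy/au
print(notice_for("BR"))  # -> /legal/privacy/global (no local version yet)
```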
Speaker 3 (40:13):
And those are shorter.
Speaker 2 (40:15):
Those are shorter, and they are a bit more clear.
I think there's much more flexibility with privacy notices versus user agreements with, like, making them UI-friendly, yeah, UX-friendly, and making them short form.
I think Google does a good job with this, for what it's worth.
I think other companies do too.
I think Tesla does a pretty decent job too.
Speaker 1 (40:36):
That was a great take. Appreciate that.
Gabe, you got anything else, tech- or privacy-wise, before we get into some of our fun questions? Which I'm excited about, because we haven't done them in a while.
Speaker 3 (40:51):
Obligatory boo, down with Meta... but otherwise, no. They have a great... no, I'm just kidding. Yeah, look, I gotta roll it out at least once a show, otherwise my sponsors don't... they don't cash. Yeah, oh, you gotta get a couple boos in.
Speaker 1 (41:03):
Um, that's always fun.
Well, actually, before we get into the fun questions and wrap it up, is there anything, Sonia, that you want to talk about that we didn't touch on? Before we get into the funnies?
Speaker 2 (41:14):
No, we had a really fun conversation about emerging technologies and regulation.
That was fun for me; hopefully fun for you guys?
Speaker 1 (41:24):
No, I don't think so.
Anything you want to promote? I know you said you're going to Boston next week.
Speaker 2 (41:29):
I'll be in Boston next week at the AI conference, so people should find me.
And then I launched my website, which is very beautiful.
I designed it myself.
Speaker 3 (41:38):
Yeah, that's nice.
Speaker 2 (41:40):
My architect dreams have fallen short to the web design.
Tamaracksolutions.com is my... oh wait, no, I'm sorry. Wow, I messed it up.
Tamarack dot solutions. It's even cooler.
Tamarack.solutions is my site, so that's my little solo shop.
Speaker 1 (41:55):
How do you spell it?
Speaker 2 (41:56):
I just like that font. T-A-M-A-R-A-C-K dot solutions.
Speaker 3 (42:06):
Go check them out, listeners. We'll make sure we include that in the show notes, as well as our social outreach as well, too.
Sonia, it was an absolute pleasure having you on the show.
Good luck in Boston.
Wait, wait, wait, wait.
Oh no, no, I'm not signing off.
I'm turning it over to you, my friend, but before you go, we do have some privacy probing questions.
Some probing privacy questions, okay, that we
(42:30):
pose on guests. Cameron, if you would, sir.
Speaker 1 (42:34):
I'm going to admit, this was created by Gemini, knowing your background and stuff.
So if that's weird, sorry, but I found these kind of interesting, and we're definitely going to use one that we've used with most of our guests, so we'll go ahead and get that one out of the way first.
For your toilet paper situation in the house, for you and your husband: is the flap on the top or the bottom?
The only answer.
Speaker 3 (42:57):
The only, the only answer.
And the rest of those people don't deserve privacy, by the way, if they keep it... True, I should have said "it depends," so I could... it could, because, what if?
Speaker 1 (43:11):
What if you accidentally did it, and you didn't realize it because you were just in a rush?
Speaker 2 (43:16):
Then you should fix it, because there should never be that much of a rush.
Speaker 1 (43:21):
Yeah, that's true, good point.
If you had to create a new privacy policy using only emojis, what would it be?
Speaker 2 (43:26):
Let me look at my keyboard.
Speaker 1 (43:28):
Check it out.
Speaker 2 (43:28):
I know what emojis I have.
Speaker 1 (43:30):
That sounds way too futuristic, if we're using emojis for... just...
Speaker 3 (43:36):
I mean.
Speaker 2 (43:36):
I would have.
Speaker 3 (43:37):
You're describing Gen Z lawyers, essentially.
Speaker 2 (43:39):
I would have the smiley face with the eyeglasses, and then I would... no, I would have the smiley face with the disguise, the sunglasses and the mustache, then I would have the melty smiley face, and then I would have the praying hands.
Speaker 1 (44:00):
I like it. All right, okay.
Speaker 2 (44:03):
I'm not so jaded, but that's okay.
Speaker 3 (44:08):
Sonia, give it time.
Speaker 1 (44:10):
You can only use one app for the rest of your life.
What app would that be, currently?
Speaker 3 (44:16):
Does the off button count as an app? Or is that... I mean, is it?
Speaker 2 (44:19):
Like, based on high utility? Because I really should just say...
Speaker 3 (44:24):
No, no, it's based on: you have the restriction of one app. That's the only thing.
Yeah, your phone will only allow you to use one app for the rest of eternity.
Choose wisely.
Speaker 2 (44:34):
I would just use my ChatGPT app and develop a parasocial relationship with my LLM, and have a lot of other problems.
But it's useful, because then I can pull everything.
If you buy the service, you can then pull everything from the web.
So, as it really went out... think about it.
Speaker 3 (44:54):
I thought you'd go calculator app, but all right.
Speaker 1 (44:59):
Calculator app? AI can do the math for you.
Speaker 3 (45:01):
That's why Sonia's answer was the better answer.
That was the right answer.
Speaker 1 (45:05):
That makes me... okay.
So be careful with this, though, because I've seen people... I don't know if you guys have seen this on social media, and I can't remember where I saw it, but someone was using ChatGPT or Gemini to... what was it, like...
The husband would start using it to get more direct answers, and then the wife started to talk like the Gemini, so it
(45:29):
would, like, basically get one over on the AI.
Does that make sense?
So the wife was jealous that the husband was using AI to have these normal conversations, because she's too complicated.
And then she started to talk like... I don't know, it's very...
Speaker 2 (45:44):
There's all these weird niche things, yeah, like romantic relationships with their... yeah, their models and stuff.
I think it's, I don't know, a little strange.
I also find it very... like, it will always tell you that you're right.
"Great job."
"Yeah, that's a great point."
And so, for me as a cynical lawyer, it's way too positive to be right.
(46:05):
I'm like, no, I agree, there's no way I'm right all the time, right?
Speaker 1 (46:09):
But that's when you know you should always question it, because, literally, it could be right, and you can tell it, no, that's not right, this is what I meant.
And they'll be like, oh, I'm so sorry, I missed that.
It's too agreeable.
Speaker 2 (46:20):
Yeah, yeah, too agreeable.
And this is also the debate; I know a lot of people ask, is this going to replace lawyers?
I think maybe some of the more rote functions or things that are done by lawyers, maybe. But, again, legal analysis and the ability to... especially in the privacy and cyber space, there's so much
(46:42):
gray, right?
There is no one right answer, there's no one way to get to an answer, and I think that's where it's so hard to replace that human element. I mean, we talk about this all the time.
Privacy has such a deeply human element to the work that's done, and I think it's written into the way it's even regulated.
When you think about GDPR and, like, asking about the rights and impacts to humans, human freedoms, right, like, that is
(47:05):
such a deeply human thing, and you really have to think about that as a person, and so I don't think we're really coming away from that.
You need people behind that, I will say, and hope.
Speaker 1 (47:14):
Well, thank you, it was such a pleasure to have you on.
I know we've kind of gone over time, so thank you so much for joining us and just doing what you do.
It's inspiring, and I'm just honored to know you and get to collab with you, so thank you so much for being on.
Speaker 2 (47:30):
Thank you, Cameron. Thank you, Gabe.
Speaker 3 (47:31):
Sonia, it was an absolute pleasure.
Enjoy. Bye.