
November 25, 2024 • 50 mins


Join us for an enlightening journey into privacy and technology with Avi, founder of TrustIZ. Discover proactive strategies to tackle privacy challenges and gain insights into Israel's evolving privacy laws post-GDPR alignment. Explore the intersection of privacy and cybersecurity and the transformative role of AI in privacy processes. Avi's dynamic perspective, inspired by the energy of Hamilton, emphasizes the importance of being "in the room where it happens." This episode is a must-listen for those eager to understand the future of privacy in our tech-driven world.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
All righty, then. Ladies and gentlemen, welcome back to another episode of Privacy Please. I'm your host, Cameron Ivy, alongside my other host, Gabe Gums. Gabe, how you doing today, man? I am well. How are you, sir? Doing well, right? Great, we got a guest with us today. Avi, he's been on before, actually. We were looking back.

(00:31):
Um, what was it? Was it almost three years ago? While, yeah, a lot of different things going on, so we wanted to catch up with you and see what was going on, what has changed, what you're getting into right now, what you're dealing with, and just talk privacy and all the things. So, Avi, welcome back on the show, man. Welcome back.

Speaker 2 (00:51):
Thank you very much.
First, you're going to have to tell us how it is. You look five years younger, even though we saw you three years ago. You should go the other way.

Speaker 3 (01:00):
Yeah, that's true, I can tell you, it's not for a lack of stress and stuff going on in my life. So it's some... Yeah, it's that magic elixir, I seem to be... But yeah, thank you, you guys don't look too shabby yourselves. I appreciate it, especially for being a Jets fan and having

(01:24):
kids too and a company. Like, that's a lot. It is. I'm not sure which order I would put those in, in terms of levels of stress. But, uh, Jets fan, father, and company owner. But yeah, for sure, that'll do it.

Speaker 1 (01:37):
So what's going... What's going on right now? Okay, let's catch everybody up, for those that never listened to your episode.

Speaker 3 (01:43):
Yeah, so if anyone wants to go back and listen to that, that was a fun episode we did in February 2021.

Speaker 2 (01:52):
So it was almost three years ago.

Speaker 3 (01:53):
Yeah, awesome, awesome. So back then, I was heading the privacy department at a UK law firm that was based out of here, out of Israel. And pretty quickly after that, after we had that episode, I kind of made a little bit of a change, where I moved from a law

(02:15):
firm providing legal advice, you know, always in privacy, to my clients, to a company that provided data protection officer consultancy services, and that was an interesting transition. I sort of felt, a little bit, I had reached a point in my career where, just as a privacy lawyer, I felt like

(02:36):
companies were coming to me, or clients were coming to me, when something was broken, when something was wrong. We were talking about this before, Gabe, about things being broken and having to be fixed. And that's where I felt like I was always coming in after the fact. And I really wanted to kind of grab the bull by the horns and just be there right at the beginning, and right before

(03:00):
things were broken, and try to prevent them from being broken. And so I joined a consultancy that helped companies build privacy programs, offered data protection officer services, and worked with some really awesome tech and awesome companies right on the cutting edge of the tech that was coming out of here, out

(03:21):
of Israel, obviously, the startup nation. I did that for a couple of years, and then a year ago, a year and a month ago actually, I decided to go out on my own and jump in the deep end, and I started my own consultancy. It's been a year now, and it's been a whirlwind, but, you know, things have been really,

(03:45):
really going great. And so, what I do now: the company that I founded is called Trust Is, and it's short for... it's sort of a play on the phrase "trust is everything," right? Because everything that we do in our space, in tech, in compliance, whether it's privacy or AI governance or things like that, it's all about building trust.

(04:07):
That's the end game. That's what we're always striving for: building trust with consumers, building trust with the people that are using the different tech. And that's what I help companies do. At the end of the day, I help companies build that trust through privacy, compliance, data protection officer services

(04:27):
and AI governance. And so that's been sort of my journey over the past three years. It's been an amazing one, and I'm, like, super thrilled with the stuff that I've achieved, but I'm also looking forward, and I think there's a lot still to do, and I'm excited about it.

Speaker 2 (04:45):
Awesome, it's awesome. Well, A, congratulations on the new venture. That's just huge. That's a lot of work, and hats off to you. Thank you. By the way, for those that were interested, it was season two. Episode 52 is where Avi first... Gosh.

Speaker 1 (04:59):
That's crazy. Episode 52.

Speaker 2 (05:00):
Yeah, season two, episode 52. A lot's changed in the world since season two, episode 52. We're on season five now. And so, from a catch-up perspective, what's the number one change you've seen over those last three years that's really affected the way you are thinking about the problems today?

Speaker 3 (05:19):
So obviously, from a technological perspective, you know, it's been... How many minutes are we into the episode now, and we still haven't said the term AI?

Speaker 1 (05:28):
Actually, I mentioned it before, but yeah, it's AI, right? No question, like, hands down.

Speaker 3 (05:34):
That's what everyone's talking about, and it's coming up in every single conversation. It's funny, because even the more traditional companies that before weren't as tech-savvy... You know, like, traditionally you sort of had the tech-heavy companies that were building this really cool cutting-edge technology, and then the more

(05:54):
traditional ones. And now even the traditional ones, like, you know, just to throw an example out there, like a local construction company, is thinking about, oh, how can I leverage AI in order to make my people more efficient and reduce costs and things like that. So it's really sort of permeated everything and gotten into everything, and AI is just all over the place.

(06:16):
So that, in terms of the technology, that is certainly something that I'm seeing much more of today than we did back then. The other thing, like, if I'm thinking about the clients themselves, what's interesting is I'm actually seeing a lot more maturity when it comes to talking about these types of

(06:36):
issues of compliance and regulation and things like that. There seems to have been a very good maturing in terms of where people are coming to. Like, you're seeing companies coming much earlier on in the journey, which obviously makes my job, A, much more interesting, B, much more, you know, efficient and effective,

(06:59):
if I can get in, like, on the ground level, when they're just laying the foundation and the scaffolding of the product, and I can, you know, steer them in the correct direction. So I'm seeing those conversations happen earlier in the cycle, and that's very exciting and, I think, something that's really been a good byproduct of some of the

(07:23):
regulations that we've seen come out over the last couple of years.

Speaker 2 (07:28):
Look, we couldn't spell your name without AI, could we? Awesome.

Speaker 3 (07:32):
I like that. You can't spell my name without AI.

Speaker 1 (07:36):
I don't know if I'm jumping the... I don't want to say "gun" here, that's the wrong term, but especially because I'm going to be bringing up the war. Like, how has the war affected what's going on in privacy and data security and everything? What have you seen there?

Speaker 2 (07:50):
And how's that... You might want to be more specific. Which of the wars? The war, yeah. So that's fair.

Speaker 3 (07:57):
Yes, that's fair. So, for those who don't know my background, I'm originally from the US, but I grew up here in Israel. I've lived here since age seven, so I really grew up here, and I live with my family about 20 minutes north of Tel Aviv, which people are probably

(08:19):
familiar with, that area. If you haven't been here, it's very beautiful. I would recommend coming and visiting, although I would wait until things calm down a little bit, which we're all hoping will happen soon. So, domestically, for us it's been really interesting. Obviously, personally, I started my business on

(08:43):
October 1st, 2023, so six days before, you know, October 7th, which was the big, terrible, terrible day that happened last year. And so that was like a shock to the system, and I was totally, like, questioning myself for months after that, like, what am I doing?

(09:04):
How can I possibly build a business, and how can I possibly make this... Is this viable? And what have I done to, like, the future of my family? And I'm really proud to say that... you know, there was a rough patch there at

(09:35):
the beginning, but I really pushed through and persevered, and I'm very thankful that I had the support of my family. And it doesn't necessarily have to be starting a business, but if you're hearing this and you're thinking, like, oh, I want to make this change, or I want to, you know, go into this type of business, or I want to start my own business, or anything like that, I think what I took away from the experience

(09:57):
personally is that we can really overcome a lot of challenges. And so I would say, you know, go for it, you can. I believe in every single person in our amazing industry. So I would encourage everyone to take those risks, and

(10:17):
good things will come. So that's sort of on the kind of personal side, I guess, or national-slash-personal side with me. Interestingly, in parallel, in the past year actually, Israel has managed to pass a pretty serious reform of our

(10:40):
privacy law. There was an amendment that was passed a few months ago. It's going to go into effect in 2025. And so, because of that change in the law, we're actually seeing a lot more interest domestically, from companies that didn't necessarily take privacy all

(11:02):
that seriously. Because of sort of the history of privacy legislation in Israel, it's historically been something that hasn't really been at the top of people's minds. And now, because the law has changed and gotten more serious, and the regulator has gotten way more ability to impose sanctions and much more authority, people are all of a

(11:24):
sudden perking up and starting to take note of it. And so, from that perspective, it's also been really great, because we've gotten a lot of people, you know, calling in and saying, hey, we heard something's going on, can you help us navigate this new thing? And we're able to be there and help them navigate it. So that's been also a really cool thing that's

(11:47):
happened.

Speaker 1 (11:48):
Yeah, that must have helped your business, I would imagine.

Speaker 3 (11:50):
Yeah, definitely, it is helping, for sure. There's actually even a requirement, similar to the GDPR requirement, for certain kinds of companies and for certain kinds of entities, like public entities and companies processing a high volume of sensitive personal data, to

(12:10):
actually appoint a data protection officer. And so that's a job title that just hasn't existed here beforehand, and all of a sudden, not only does it exist, but it's now a legal requirement in certain circumstances. And so we're getting questions and queries from companies about, like, what is this new thing that we need to

(12:31):
deal with. So that's been really cool and really interesting, and I've been enjoying a lot of the conversations that we've been having around that.

Speaker 2 (12:40):
What was the driver of such legislation? I mean, I'm always curious, because, you know, like in the US, obviously CCPA is not at the federal level and has a very different driver than, you know...

Speaker 3 (12:51):
Every piece of legislation has, like, the history behind it, you know, how it came to be, and this law is no

(13:12):
different. So, for people who aren't familiar, Israel's privacy law is actually, you know, one of the first ones; it's from 1981. So it's pretty old. It certainly predates me. So there you go, disclosing a little bit of my

(13:36):
age right there. But you can imagine what technology was like back then, and it hadn't gone through many major... I think there were two major amendments that were made over the years, but nothing recent. In 2011, Israel received adequacy status from the

(13:58):
European Union. Under the old regime, the EU Data Protection Directive, this is pre-GDPR, Israel received adequacy status. And for people who may be less familiar with that, what that basically means is that data could flow... There are only a handful of countries around the world who have this status, and Israel is one of them, and basically what

(14:20):
it means is that data can flow freely from the European Union to Israel without the need for additional measures like standard contractual clauses or things like that. So that was achieved in 2011, with the promise that Israel's privacy law was going to be modernized and sort of amended. That didn't really happen. And then the EU actually came and

(14:45):
said, we're doing a review of all of the adequacy decisions that were given, and for those who haven't brought their law up to speed, we may be thinking about, you know, potentially taking that away or putting limitations on it. And so it was actually that that was, I think, the main

(15:10):
driver. Obviously, there were internal things also; the privacy regulator really wanted to push this forward, and it was almost like a personal driver for him, to be able to raise the bar when it comes to privacy compliance in Israel. And they were able to do it.

(15:35):
They had a really intense period of sessions in Parliament. Obviously, this type of legislation has to go through hearings and committees, and the process is pretty lengthy. And they had a lot of sessions on it, but they

(15:55):
were able to finally pass it through, and it was passed in August, and it's going to go into effect next August, in 2025.

Speaker 1 (16:03):
That was my next question.

Speaker 3 (16:04):
Yeah, yeah, so it's going into effect in August 2025. And it's interesting because, you know, we have this phrase called the Brussels effect, right, which is regulation out of the EU that impacts regulation around the world, right?
So the formation of CCPA, based on... obviously different from the

(16:26):
GDPR, but almost like based on some of the principles of how GDPR was built, is sort of one effect that it had, you know, EU to US. And now we're seeing, like, the Israeli law is sort of the Brussels effect that we received from the GDPR; it was like the push from the adequacy standpoint to

(16:47):
say, guys, you need to get your ducks in a row if you're going to keep this adequacy status.

Speaker 1 (16:52):
I don't know if we went over it.
What's the name of this one?
Does it have a name?

Speaker 3 (16:55):
Of Israel's privacy law?

Speaker 1 (16:57):
Yeah.

Speaker 3 (16:58):
Yeah, it's called the Protection of Privacy Act. It's still from 1981, but it's...

Speaker 1 (17:03):
I'm sure it's got an acronym, right? It's probably... Israel...

Speaker 3 (17:07):
P-O-P-A. I think "Papa," maybe.

Speaker 1 (17:10):
Oh yeah, Papa.

Speaker 3 (17:11):
Papa, I've heard of that, not to be confused with
Tom Papa.

Speaker 1 (17:17):
And there's also one called PEPA somewhere else.
Not Peppa Pig. Right, there's POPIA.

Speaker 3 (17:22):
There's PIPEDA, there's all the...
When you get privacy, it'salways with a P.
So there you go.

Speaker 1 (17:31):
It's true. Do you know any of the main differences? You said there's similarities with the GDPR. Any key differences that stand out to you, that come off the top of your head?

Speaker 3 (17:41):
Yeah, so it's always interesting when you talk about, like, local legislation and comparing the approach to the way that it's approached in other countries. Obviously, you know, in the US, privacy is very much seen as a consumer right, whereas in the EU...

(18:01):
it's seen as a basic human right. So in Israel as well, privacy is considered a human right. You know, Israel's constitution does talk about people's right to privacy in their bodies and in their homes, and so that is based on Israel's constitution,

(18:27):
and so, from that perspective, the approach is a bit similar. Traditionally, Israeli privacy law has actually been very security-focused. Because, you know, we're seen as, like, the cyber nation, right? There's a lot of cyber companies, there's a lot of focus on cybersecurity.

(18:48):
So there's always been this conflation in Israel between privacy and cybersecurity, and people who say, oh, you know, I'm protecting the data, therefore I'm privacy compliant, and having to explain that, no, actually those are, you know, obviously two sides of the same coin, but they're two separate issues. And so now, what we have in the privacy law is, we do

(19:16):
have a principle-based law. One of the main differences between GDPR and Israeli privacy law is that, while GDPR has six different lawful bases for processing, so you have consent, contractual necessity, legitimate interest, legal obligation, etc.,

(19:42):
in Israel it's all based on consent. So you have to obtain consent in order to process personal data, and if someone withdraws consent, then you can't process their data anymore. For those familiar with the Canadian privacy regime, it's very similar to that, in the sense that Canada also has consent as the sole lawful basis for processing personal data. So that is one very major difference.

(20:05):
Also, conceptually... unfortunately, one of the things that we still have in Israel is a definition of databases. So it's not about processing of personal data; it's about processing of data in databases, which is a little bit of an antiquated way of thinking about things, like

(20:26):
protecting a database as opposed to just looking at specific processing activities.

Speaker 2 (20:31):
I'm not super offended by that, I think, unless one defines "database" too narrowly. Because what I like about it is, some things that others kind of try to skirt, like data brokers certainly do stateside, is that they're not, like, keeping a database of a bunch of PII, right? Like,

(20:52):
because of how they keep it, they're like, well, that's not a database. Obviously, they're just getting cheeky with some of the wording there. But I mean, a database does not necessarily have to mean an ACID-compliant, you know, RDBMS system. By any stretch of the imagination, it could just be this notebook.

Speaker 3 (21:09):
Yeah, no, it's true, it's true. But I think what bothers experts most about it is, it's assuming... you know, the way that modern technology has evolved, and the

(21:31):
way that we live our lives, and the way that we have our personal data and our PII and our personal information out there. I guess "database" implies some sort of structured base of information, whereas we have a lot of information that's out

(21:52):
there in unstructured formats, exactly.

Speaker 2 (21:54):
Right.

Speaker 3 (21:55):
Unstructured formats, all kinds of connections between different databases and, you know, data lakes and places where our personal information resides, that you can't necessarily classify into a neat filing cabinet of database A, database B, database C.

Speaker 1 (22:17):
So is it, like, a general term?

Speaker 3 (22:21):
Yeah, it's a general term.
There isn't really a specific definition of what a database is. It's just a collection of personal data that then needs to be protected. So, the way we think about it is more, as opposed to a specific system, we think about it more as the type of information.

(22:44):
So let's say you're looking at a company A that has human resources data. They may have seven different systems in which that data is stored, but you look at it holistically, and all those

(23:05):
systems together comprise one database, which is HR information. So it's sort of more of a conceptual legal term than it is a technological term. But yeah, I mean, it's just sort of a different way of thinking about it and conceptualizing it.

Speaker 1 (23:20):
That's interesting, because, you know... I don't know, do you work with companies in the US as well?

Speaker 3 (23:26):
Yeah, sure.
I work with companies all over the world.

Speaker 1 (23:28):
Okay, that's good to know for the listeners too. But yeah, so, compared to, like, what I'm hearing, compared to Israel's and the European, you know, GDPR and their privacy acts, compared to the state privacy acts that are in the United States, like CCPA, there's a lot of them coming out in January of 2025. I think there's like six or seven, and a lot of what I'm

(23:52):
seeing in those is there's a lot of thresholds, there's opt-out mechanisms, there's a lot of things that are in there. There's similarities in there. They're very different because of the way you said: it's more about the human right compared to the consumer, and that's why there's so many little things that the state laws have. Am I on the right path?

Speaker 3 (24:13):
Yeah, you're definitely on the right path, and I completely understand. So the answer is, from that perspective, in terms of scope, it's much more similar to GDPR. So there's no thresholds. Your database, right... So, for example, the security measures that you have to impose on the data that you have will depend on the

(24:49):
sensitivity of the data, the number of data subjects that you have, the number of people that have access to the database. That's an interesting one that doesn't exist elsewhere. So access rights determine the security measures that you need to impose. So that's an interesting one. But in terms of the, I guess,

(25:13):
it's very much different from the US state laws that are coming out, because, number one, it's all data subjects, regardless of the consumer context. So they can be employees, they can be, you know, individuals, even if they aren't direct consumers of the business. So it's not just necessarily in the business context. And also,
(25:35):
there isn't that threshold, it's.
It applies to all, allprocessing.
So so nobody, nobody's reallyexempt, even if they're
government or health care orwhatever yeah, no, obviously
there's the, the standardhousehold household exemption,
like there is in the gdpr.
You know, if you're keepinglike your the, the, the modern

(25:56):
day equivalent of a Rolodex.

Speaker 1 (25:59):
Oh man.

Speaker 3 (26:00):
Yeah, there you go. See, now you're showing your age. Now I'm showing my age.

Speaker 1 (26:05):
Although I did say that the 1981 law predates me, so you can guess. Even though I'm in the 80s too, so if that makes you feel any better... Yeah, there you go.

Speaker 3 (26:21):
So, yeah, the household exemption exists. But certainly, in terms of the applicability of the law, it would apply to anyone processing personal data.
And then the other interesting thing to note, for anyone who is doing business in Israel or has any interest in Israeli privacy law,

(26:45):
is that we have a regulator that is certainly... I shouldn't say "using," that's not the right word, but is very much leaning into the new regime, and talking very much, you know, speaking at every privacy
(27:05):
event in Israel, talking about how enforcement is going to ramp up, and they're really going to take it to the next level, and putting people on notice. And the regulator has received a lot more leeway and a lot more authority under the new law.

(27:27):
And they do very much seem, at least from what they're saying publicly, like they're going to start, you know, cracking down on enforcement come August 2025. So it's going to be interesting, certainly.

Speaker 1 (27:40):
What's the most common challenge that you run into with your customers right now, today? Is there, like, one that stands out, that's coming up very frequently?

Speaker 3 (27:50):
Yeah, I think it's still... I mean, this was the case back when we spoke last, and I think it still is very much the case. The clients, the companies that I tend to work with, are operating globally, and the lack of a unified standard globally is just a really, really difficult thing for companies to

(28:12):
be able to comply with and to stand by. Obviously, companies that are operating in the EU, and other countries, will take the GDPR as the global benchmark. Yeah, but once you go beyond... if you're in Israel, the EU and the U

(28:33):
S, for example, and then you start going into other jurisdictions, then it starts becoming really tricky and really complicated, and in some cases you have conflicts between certain laws, right? So I work, for example, I do some work with fintech companies, where there's always, you know, financial regulation

(28:54):
versus privacy.
Are you retaining the

Speaker 2 (28:54):
data? Are you deleting the data?

Speaker 3 (28:55):
Everyone has different interests, and they're all like, you know... We need to delete it under EU law, but we have to retain it under, you know, South African banking regulations. So how do we reconcile those? So that's just an example. But, you know, building a global privacy program sounds really

(29:16):
great in theory, but is very difficult to implement in practice. And that's, I would say, probably the most difficult thing that I find, you know.

Speaker 2 (29:27):
Yeah, I imagine. I certainly want to be mindful of the time. I'm not certain how much more we've got, Dylan. I've got a little time to hang myself there. But we're nearing the end of November, which means we're pulling up on 2025 territory, which means it's time for predictions.

(29:47):
What do you see? What do you see really either changing the landscape in the year coming, and/or becoming more ever-present that is there now? What are your

Speaker 3 (30:01):
general predictions for '25? Wow, that's an excellent question. So I think the state privacy laws... it's snowballing, and the snowball is just going to continue growing.

(30:24):
We have, I think, 19, if I'm not mistaken, 19 laws in the US, and we're going to keep going. I was a little bit hopeful when the APRA was introduced, the draft federal privacy bill, earlier this year, but that was a very short-lived experiment, as were its predecessors, so I don't see that coming anytime soon. But I do certainly see

(30:45):
many more states adopting consumer privacy laws. I think that we'll see some really interesting regulatory questions coming out around AI and privacy. I think that there's, you know, still a lot of

(31:07):
unanswered issues around that, things like data retention, deletion, you know, questions that we haven't had to grapple with yet, but we really will. And I think that's gonna be super interesting. Like, as a person that is very

(31:27):
interested in and follows very closely the developments in AI governance, as well as privacy, just to see where that is taking us, right? So, for example, some of the copyright lawsuits that are pending in the courts now against some of the big AI providers, like OpenAI and Google and others, around the

(31:49):
use of copyrighted works in AI. I think, yeah, and just a continuation of the trends that we've seen, with regulators sort of cracking down on big tech. I don't see anyone slowing down on that front, and I think 2025 will be no different. I do think, if I can say

(32:16):
something that's going to... it's probably going to survive. This is my prediction. I do think that the third iteration of US adequacy, in the form of the Data Privacy Framework, seems, at least for now, to be safe. I haven't seen any major challenges to it.

(32:38):
As opposed to its predecessors, Privacy Shield and Safe Harbor, which were both struck down by the EU courts, it doesn't look like the Data Privacy Framework is going down that route, which is, in my opinion, a very good thing, that we can finally put all those shenanigans behind us and just

(33:00):
focus on actually doing the work.

Speaker 1 (33:02):
So from that perspective, I'm happy.

Speaker 3 (33:05):
Wouldn't that be nice, right?

Speaker 1 (33:07):
Well, so you have your predictions. Those are good. Now take what you would hope to happen. What do you think is one thing that could actually make things less complicated, given how complicated things are, especially for these global companies that are dealing with data all over the place?

Speaker 3 (33:24):
Yeah, I think that anything that brings us closer to some sort of global standard, some sort of, like, an international body releasing a... Again, I'm realistic, so I don't think this is actually going to happen. But if there was one thing that I would be hopeful would

(33:46):
happen, it's that one of the international bodies, whether it's the UN or the OECD, would sort of stand up and say, okay, enough with this nonsense of jurisdiction by jurisdiction. Here's the global privacy standard, and we're sort of going to make things...

(34:07):
It obviously needs to be principle-based. It can't be, you know, specific rules, because each country will interpret and implement it differently, in the same way that different countries in the EU interpret the GDPR differently, and somehow it still seems to work. So we can have harmonization on a principle-based, harmonized privacy framework.

(34:29):
You can figure out the acronym for that: PBPF, the principle-based privacy framework. And sorry, you've got to put the G, the global, in there. Yeah, there you go. Doesn't it work the other way around?

(34:50):
First you figure out a really good acronym, and then you figure out what it stands for, right? Like CAN-SPAM. That would be a backronym. A backronym, oh, there you go.

(35:10):
Very good, I like it, I like it. So, yeah, just something like that that would allow companies to... Because, at the end of the day, the frustrating thing, I think, from my perspective, is you have companies that really want to do the right thing. It's not like someone's coming and saying, oh, I want to do shady stuff, so let me figure out where I need to incorporate and what I need to do in order to avoid X, Y, Z

(35:32):
rules.
And that's not the case.
The case is we want to do theright thing, but it just works
out that because we're operatingin 10 different jurisdictions
and we have 10 different legalregimes that apply to us, it
just makes things that much morecomplicated, and so that's what
I would hope to see is somesort of harmonization.

Speaker 1 (35:52):
I can't imagine.

Speaker 3 (35:53):
No.

Speaker 2 (35:53):
Harmonization, that'd be awesome.

Speaker 1 (35:55):
When you're working with these privacy teams, or
maybe they don't even have a privacy team, who knows?
I mean, if they're a pretty large company, I would imagine
they would.
But what do you see on the tech side when it comes to, like,
that collaboration with these companies or customers that
you're working with?
Is there still a disconnect there from your perspective?
Is it still lacking on the privacy side, or what do you see

(36:17):
from that end?

Speaker 3 (36:18):
So I work with various different sizes of
companies. As you say, the smaller ones don't
necessarily have in-house privacy, and I'm sort of the
sole privacy consultant, as an external consultant.
That's always a challenge,
getting to the relevant people

(36:42):
in the company, trying to kind of put together the regulatory and
the tech, and trying to figure out how to work
collaboratively and how to convince them that working with
me is actually good for them.
And I'm not trying to block anyone
or stop anyone.

(37:03):
You know, I'm just trying to help us do a better job and build
trust, right, because trust is everything.
And then internally, I think that companies that have

(37:24):
an internal resource tend to be better, because they tend to
be already on top of things and sort of collaborate more with
the technical people.
But I will also say that one of the things that I am seeing,
which I think is great, is a lot of tech tools in our space that

(37:46):
are designed to sort of enable and give the internal privacy
teams more collaboration, more control, more alerts
about what's going on in the code and within the actual
technical teams, and that's really cool to see, that people
are actually investing in that type of software.

Speaker 1 (38:09):
That was leading me to my next question, which was,
are you still seeing a lot of companies using legacy
tools for privacy, now that there are so many better tools out
there?
Not to do a shameless plug with something like
Transcend, but, you know, thinking about that next-gen
privacy, where we're trying to automate things to make it
easier for those tech teams and those privacy teams, to make

(38:29):
things a lot easier for their consumers, to build that trust.
That's the biggest thing, like you were talking about,
building trust. How do you build that trust?
Well, I think using some kind of privacy
solution, a platform that is very efficient and
automated, where you can understand your data, know where
it is, and have things automated for your customers and

(38:52):
consumers, makes them feel like they have that trust in that
technology as well.

Speaker 3 (38:57):
Yeah, absolutely, and I think that's one of the
things that AI, specifically in our space, is going
to really make much, much easier for the experts,
sort of giving you that bird's-eye view of what's going on.
Things like RoPAs, records of processing activities, and you

(39:19):
know, other standard documentation that used to take,
you know, ages to complete, can now be automated and kept up
to date in real time.
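A minimal sketch of what that RoPA automation might look like: turning a machine-readable data inventory into record-of-processing-activities entries that can be regenerated whenever the inventory changes. Everything here is illustrative, not any particular tool's API; the inventory format, field names, and `build_ropa` helper are assumptions, and real RoPA requirements (GDPR Article 30) include more fields than this simplified version.

```python
from dataclasses import dataclass, asdict
import json

# Simplified RoPA entry; GDPR Article 30 lists the actual required
# fields (purposes, data categories, recipients, retention,
# safeguards) in more detail. This is an illustrative subset.
@dataclass
class RopaEntry:
    activity: str
    purpose: str
    data_categories: list
    recipients: list
    retention: str

def build_ropa(inventory):
    """Turn a machine-readable data inventory into RoPA entries."""
    return [
        RopaEntry(
            activity=item["name"],
            purpose=item["purpose"],
            data_categories=sorted(item["fields"]),
            recipients=item.get("shared_with", []),
            retention=item.get("retention", "unspecified"),
        )
        for item in inventory
    ]

# Hypothetical inventory, e.g. exported from a data-mapping scan.
inventory = [
    {"name": "newsletter", "purpose": "marketing",
     "fields": ["email", "name"], "retention": "until unsubscribe"},
    {"name": "analytics", "purpose": "product improvement",
     "fields": ["device_id"], "shared_with": ["analytics vendor"]},
]
print(json.dumps([asdict(e) for e in build_ropa(inventory)], indent=2))
```

The point of the sketch is the workflow, not the schema: because the RoPA is derived from the inventory rather than hand-written, re-running the build after every scan keeps the documentation current instead of two months stale.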
I've seen, and I work with, several tools, like
Transcend and some of the other players in this space that
I know are out there and are being built literally as we

(39:41):
speak, and it's really exciting to see, like, tools that are
scanning code and giving privacy teams sort of real-time access
to, okay, not what it says in the policy, because the policy is
already two months outdated, because the engineers have
already moved five steps ahead of what it says in the policy.

(40:04):
It just hasn't been updated yet, because the update date hasn't
arrived yet. But actually scanning in real time, giving
alerts in real time, giving information in real time, and, you
know, alleviating the necessity to do all of these manual tasks
all the time, I think that's

(40:26):
going to be really, really great.
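The kind of code scan being described can be sketched, very roughly, as a pattern match over source lines that flags fields likely to hold personal data. The field names, regex patterns, and `scan_source` helper below are all illustrative assumptions; production tools use far richer classifiers plus CI and repository integration, not a handful of regexes.

```python
import re

# Field names that commonly hold personal data. This is a
# hypothetical, illustrative list -- real scanners ship much
# richer classifiers than a few regexes.
PII_PATTERNS = {
    "email": re.compile(r"\bemail(_address)?\b", re.IGNORECASE),
    "phone": re.compile(r"\bphone(_number)?\b", re.IGNORECASE),
    "location": re.compile(r"\b(lat|lng|latitude|longitude|geo)\b",
                           re.IGNORECASE),
}

def scan_source(source: str):
    """Return (line_number, category, line) alerts for lines that
    appear to touch personal data."""
    alerts = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for category, pattern in PII_PATTERNS.items():
            if pattern.search(line):
                alerts.append((lineno, category, line.strip()))
    return alerts

# Hypothetical snippet a scanner might see in a pull request.
snippet = """
user = {"name": name}
user["email_address"] = form["email"]
track(latitude, longitude)
"""
for lineno, category, line in scan_source(snippet):
    print(f"line {lineno}: possible {category} data -> {line}")
```

Run on every commit, even a toy scan like this shortens the lag between a feature starting to collect data and the privacy team finding out, which is the real-time visibility being described.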

Speaker 1 (40:28):
Yeah, agreed, because I know a lot of the older tools
take a lot more resources, so a privacy team isn't going to
know what an engineer would know, for sure.
They're going to need to use those resources.
Yeah, definitely.

Speaker 3 (40:40):
One of the biggest things that I hear from clients
is, like, oh, we feel like we're working for the tool more than
the tool is working for us, right?
That's, like, the number one piece of feedback that I hear
about some of the more legacy tools: we need to keep
feeding it information, and we haven't yet

(41:03):
seen the benefits come back to us, whereas, you know, some of
these AI tools are almost, like, plug-and-play, to use, you
know, one of the older, antiquated tech phrases.

Speaker 1 (41:21):
Yeah, I mean, it's interesting. It's going to be
really neat to see how that develops, and if AI and
privacy can find that balance of innovation without
taking a hit on the privacy side. For sure, for sure.

Speaker 3 (41:35):
No, I definitely... I think it's definitely going to
happen.
It's not a question of if, it's a question of when.
And it's going to be great, because what it's going to do is
it's going to almost give privacy professionals
superpowers. Because the number one thing
that a privacy professional always yearns for and desires is

(42:01):
to be in the room.
Do you remember that?
Have you seen Hamilton?
So I love that song.
You know, the song, The Room Where It Happens.
Yeah, so that's all a privacy pro ever
wants to be, is in the room where it happens, in the room where
that decision gets made, in the room where the architecture of

(42:23):
the new feature is getting built, and what these
tools are going to allow them to do is be in the room where it
happens, because they're literally going to be seeing it
in real time, as the code is being written, as opposed to,
you know, three months down the line when they discover that
they have a feature that's now been collecting data, sharing

(42:43):
data.
You know, these tools have been onboarded, and they don't even
have any clue about what's going on.
Now, all of a sudden, we're real time.

Speaker 1 (42:52):
We're seeing it in real time.
We're shortening...

Speaker 3 (42:54):
Exactly, we're shortening the lag time for them,
and that's, you know... it's going to give...

Speaker 1 (43:00):
Yeah, it's going to be huge.
That goes back to your point in the beginning, Avi, where you
said, I don't want to come into these after the fact.
I want to be able to see it when it happens and be a part of
it, in the room where it happens.
Interesting. Okay, and this episode is brought to you by
Hamilton.

Speaker 3 (43:14):
You can go see it on.

Speaker 1 (43:15):
Disney Plus.

Speaker 3 (43:16):
Lin-Manuel Miranda.
There we go, yeah.

Speaker 1 (43:18):
Awesome.
Is there anything that you want to bring up that we didn't get
to talk about?
I know we're coming up on time for the hour, but anything that
you want to talk about, or if you want to let the listeners
know how to find you, if they're interested in working with you,
learning more from you, or connecting with you, whatever
that is?

Speaker 3 (43:39):
Yeah, with pleasure.
So, thank you very much.
Yeah, so the company is TrustIZ, the website is
trustiz.ai. It's with a Z, so T-R-U-S-T-I-Z dot A-I, or zed

(44:00):
if you're from the UK, and you can find me on LinkedIn.
I am very active on LinkedIn.
I love to connect with people there.
I love to have conversations there.
Please feel free to reach out, DM, engage with any of the posts.
I always love to hear people's feedback and people's thoughts
on the things that I post and the things that I speak about,
and if anyone needs any help with anything relating to

(44:23):
privacy, consulting globally, DPO as a service, or AI
governance, please also feel free to reach out.
That would be awesome, and I'm really looking forward to having
great conversations, because if there's one thing that our
community is good at, it's having really, really
interesting conversations with really interesting people like

(44:44):
yourselves.

Speaker 1 (44:45):
I agree.
That's why you're on, because you're interesting and you care.
You are you.
I think the one thing that stands out to me from my
perspective is that you're just being you, and you're digging in
and just going all in.
I love that, and thank you.
I appreciate that, and thank you for taking the time to be with
us again.

Speaker 3 (45:04):
Thanks.
Well, the feeling is definitely mutual,
Cam and Gabe. It's always a pleasure to speak to
you guys. And same.

Speaker 1 (45:12):
One last thing before you go, though. Yeah, for sure.
Since we're coming up on Thanksgiving,
what's one thing that you're thankful for?

Speaker 3 (45:21):
Oh, wow, that's... or multiple things?

Speaker 1 (45:24):
No.

Speaker 3 (45:24):
I was going to say, I'm thankful for so many things that, you
know, it's hard to pick just one, but I would have to say I'm
really thankful for my family.
I think this past year has been, you know, both in terms of...
and our dog, who gives me emotional support on a daily

(46:01):
basis.
You know, she actually, my dog Brownie, by now is a
pretty big privacy expert, because she's just had to have
so many conversations with me about some of the stuff when I
didn't have anyone else around.
My wife was out at work and my kids were at school.
So now she knows all about, you know, building privacy programs,

(46:25):
drafting policies, reviewing DPAs, so she actually has a lot
of knowledge in the privacy space.

Speaker 1 (46:32):
So she's pretty certified, it sounds like.
Oh, she is definitely by far the most certified canine
privacy professional out there.
I think you should start, like, I don't know, like an
Instagram page with you and her, like a privacy sidekick or
something. Yeah, for sure, for

(46:53):
sure.

Speaker 3 (46:53):
Yeah, I mean, she doesn't... it's
very hard to... she's pretty cagey, no pun
intended, so it's hard to get, you know, her view on things.
It's usually me giving her my view, but she's a good listener.

Speaker 1 (47:11):
I love that man.
Well, Avi, thank you again foryour time, and it's always great
catching up man.
I'll definitely talk to you onLinkedIn and so on, but good
luck going into 2025.

Speaker 3 (47:23):
Thank you so much for having me on again, and wishing
you and your families, and everyone listening, a very happy
Thanksgiving.

Speaker 1 (47:31):
Appreciate that. Same to you and yours.