All Episodes

July 11, 2023 55 mins

This week’s guest is Tom Kemp: author; entrepreneur; former Co-Founder & CEO of Centrify (now called Delinea), a leading cybersecurity cloud provider; and a Silicon Valley-based seed investor and policy advisor. Tom led campaign marketing efforts in 2020 to pass California Proposition 24, the California Privacy Rights Act (CPRA), and is currently co-authoring the California Delete Act bill.

In this conversation, we discuss chapters within Tom’s new book, Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy; how Big Tech is using AI to feed into the attention economy; what should go into a U.S. federal privacy law and how it should be enforced; and a comprehensive look at some of Tom’s privacy tech investments.

Topics Covered:

  • Tom's new book - Containing Big Tech: How to Protect Our Civil Rights, Economy and Democracy
  • How and why Tom’s book is centered around data collection, artificial intelligence, and competition. 
  • U.S. state privacy legislation that Tom helped get passed & what he's working on now, including: CPRA, the California Delete Act, & Texas Data Broker Registry
  • Whether there will ever be a U.S. federal, omnibus privacy law; what should be included in it; and how it should be enforced
  • Tom's work as a privacy tech and security tech Seed Investor with Kemp Au Ventures and what inspires him to invest in a startup or not
  • What inspired Tom to invest in PrivacyCode, Secuvy & Privaini 
  • Why having a team and market size is something Tom looks for when investing. 
  • The importance of designing for privacy from a 'user-interface perspective' so that it’s consumer friendly
  • How consumers looking to trust companies are driving a shift left movement
  • Tom's advice for how companies can better shift left in their orgs & within their business networks


Resources Mentioned:

Guest Info:




Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Tom Kemp (00:01):
The AI is using all that data being collected, and it's not only automating decisions, but we've now seen the use of generative AI to create text, images, etc. But one of the ways that Big Tech is using it is to keep people on their products, to make their products more addictive, to get more attention. So we're obviously in the

(00:24):
'attention economy,' where we're competing for eyeballs.

Debra J Farber (00:28):
Welcome everyone to Shifting Privacy Left.
I'm your host and resident "privacy guru," Debra J Farber.
Today, I'm delighted to welcome my next guest, Tom Kemp.
Tom is a Silicon Valley-based entrepreneur, investor, policy advisor, and now author.
Tom was the Founder and CEO of Centrify (which is now called Delinea), a leading cybersecurity cloud provider

(00:51):
that amassed over 2,000 enterprise customers, including over 60% of the Fortune 50. For his leadership, Tom was named by EY as a finalist for 'Entrepreneur of the Year' in Northern California. He is also an active Silicon Valley angel investor with seed investments in over a dozen tech startups. In addition, Tom has served as Technology Policy Advisor for

(01:14):
political campaigns and advisory groups, including leading the campaign marketing efforts in 2020 to pass California Proposition 24, otherwise known as the California Privacy Rights Act (CPRA), which amended the CCPA. He's co-authoring bills such as the California Delete Act of

(01:34):
2023.
Welcome, Tom.

Tom Kemp (01:41):
Hey, great to be here.

Debra J Farber (01:42):
Excellent, I'm so glad you're here. You're doing so many interesting things out in the privacy tech market, working, like myself, at the broader, overarching levels rather than just one initiative. But I'd love to first kick things off by diving right into your new book. It's about to be released. It's called "Containing Big Tech: How to Protect Our Civil

(02:03):
Rights, Economy and Democracy."
What motivated you to write the book and who did you write it for?

Tom Kemp (02:09):
Well, thank you.
First of all, I wanted to create a simple and comprehensive look at issues concerning Big Tech that your Uncle Larry or your average politician could read and say, "A-ha, I get it," without having to hand them 500 articles and say, "go at it." It actually works to Big Tech's advantage that there's confusion on these issues.

(02:31):
I wanted to just give a simple, but comprehensive, look at the issues that have arisen, not only as it relates to privacy and digital surveillance, but also persuasive technology and, of course, the big kahuna, artificial intelligence, and really try to dive deeply into how the largest tech

(02:52):
players are using AI, both in positive ways but also potentially negative ways as well. Second, I also wanted to say there are actually some simple solutions out there to many of these problems, which I provide: solutions both for consumers, your Uncle Larry, but also for policymakers as well.
Then, finally, I wanted to explore some of the latest and

(03:15):
greatest things that are happening. One of the big things is the Dobbs decision; we're now in a post-abortion rights America, and that has some significant impact on privacy. But also, we've had the rise of TikTok, with the whole addictive technologies, persuasive technologies, to keep

(03:35):
people hooked on it. Then, finally, AI. Obviously people are looking to draft new laws to take into account AI, but I really wanted to give the readers a deep dive on what AI is, how AI can be used for good, how AI can be biased or exploitative, and how it relates to the collection of our information as well.

(03:56):
Those were some of the motivations behind the book.

Debra J Farber (03:58):
I love it.
I have an advance copy here, a galley copy, and I really like how accessible you've made everything. So, literally, my Uncle Larry could pick this up. I don't have an Uncle Larry, but I have an equivalent who could pick this up and easily understand the challenges that you identify, without too much technical jargon getting in the way.

(04:19):
It's one of the things I really love about this book; it makes it really easy and clear to understand the potential problems.

Tom Kemp (04:26):
Well, thank you.
I think the interesting thing is, it's incredibly timely; every day there's a new headline, either about AI or about privacy, or there's new laws, etc. I think this is also very timely, and people need to keep up on it, not only Uncle Larry, but privacy experts such as yourself.

Debra J Farber (04:45):
Oh yeah, definitely.
We're all working at such quick speeds these days as technology advances, with constantly new iterations of AI or new tech, that it's really hard for even privacy experts and security experts to know everything that's going on. This is really helpful. I like how each chapter focuses on a particular area of concern.

(05:08):
You've got 'digital surveillance,' 'data brokers,' 'artificial intelligence,' 'persuasive technology,' 'kids' online safety,' 'extremism and disinformation,' and 'competition.' If you don't mind, I'd like to go through each of those topics with you and just have you give a short synopsis of each of the privacy, surveillance, or safety harms.

(05:29):
Does that work?

Tom Kemp (05:31):
Yeah, absolutely.
I mean, to summarize the themes, just to put those chapters into context: I fundamentally believe that we have five tech companies that are monopolies, that are some of the most powerful corporations the world has ever seen, with amazing reach. We're talking about companies like Google. They have 4 billion users. That's half the Earth's population.

(05:53):
So, we're talking about companies with incredible reach. I mean, GM took 100 years to sell hundreds of millions of cars, and within 20 years, Google has gotten to 4 billion users. And they're very much unregulated, and they're causing serious threats to our civil rights, our society, our

(06:14):
democracy. So, if you take a look at the first three chapters (digital surveillance, data brokers, data breaches), they really focus on the data collection that's happening today. I fundamentally believe that our digital exhaust, our personal information, is being way over-collected, and it's now

(06:36):
being weaponized in a post-abortion America, and that weaponization, I think, is going to extend to trans and LGBTQ America, etc. So, one thing we need to take into context is that there have been monopolies in the past, like Standard Oil. Standard Oil was incredibly powerful in the 1900s or 1910s,

(06:58):
etc. But they didn't know everything about us, and that's the big difference. We'll talk more about data brokers, but there's been identity theft associated with data brokers, fraud, and of course the ability to use that data to discriminate against people, etc.
The second grouping of chapters really focuses on artificial

(07:18):
intelligence. How is Big Tech using artificial intelligence? Take into account that the AI is using all that data being collected, and it's not only automating decisions, but we have now seen the use of generative AI to create text, images, etc. But one of the ways that Big Tech is using it is to keep

(07:41):
people on their products, to make their products more addictive, to get more attention. So, we're obviously in the attention economy, where we're competing for eyeballs. I talk about how the use of AI has exacerbated problems in terms of extremism and disinformation, but also really take a look at how AI is being trained on children and the

(08:02):
risks associated with that.
Then, finally, the last chapter is "competition." What I really talk about there is that these companies are monopolies. They really own some large key digital markets. The issue there is that their monopoly positions actually exacerbate the problems with digital surveillance and data

(08:24):
over-collection and AI exploitation and bias. And so you can't look at these issues in isolation and not factor in the fact that these are monopolies; you've got to look at the antitrust aspect, because you're not going to get changes unless there's some sort of pressure for them to change their business practices.

(08:46):
My goal with my book and these topics is not simply to say, here are the problems. I actually want to provide solutions, both for consumers but also for policymakers as well. So I want to say that things can be done to actually make these problems better.

Debra J Farber (09:03):
That's actually a really good point.
I'm asking you to frame the problems so that people go and buy your book, but there's plenty in the book about how to address these problems, your approach and what you would do. That's a lot of the benefit of the book itself, so I definitely think people should go out and get it.

(09:25):
When is it available to purchase?

Tom Kemp (09:27):
It's actually available now for pre-order, and the ship date is mid-August. It will be available not only in hardcover but as an ebook, so you've got Kindle, etc. But it's also going to be an audiobook, so you can get it on one of your favorite audiobook providers as well. So it's going to be on all types of media, and it's available in mid-August.

Debra J Farber (09:45):
Excellent.
Well, I'll definitely put a link to the pre-order in the show notes so that people can easily find it. Now, Tom, I know you've been working with others to rein in Big Tech with proposed legislation. There's quite a bit that you've been working on, so tell us about what legislation you've helped get passed so far and what you're working on now.

Tom Kemp (10:04):
Great, thank you.
Yeah, so look, it's very difficult to change things in Washington, DC. And if you really look at where the activity has been happening, it's been happening in Europe. The "Brussels effect." Obviously, the EU was the first with the GDPR, and now we see

(10:26):
coming down the pipeline the Digital Services Act and the Digital Markets Act, and it looks like the Artificial Intelligence Act is going to pass. So, clearly Europe is setting the standards. And then, in the U.S., it's historically been California. It's called the "California effect," and for good reason. I mean, California has led the nation on consumer protection.

(10:48):
You can go back to autoemissions etc.
Specific to privacy, which Iknow what's really of interest
to your audience is thatCalifornia actually put into the
Constitution in the early 70sthat privacy is a fundamental
right.
The word privacy is not in theU.
S.
Constitution, but it is in theCalifornia Constitution And

(11:08):
California has led.
For example, California was thefirst state to have a data
breach notification law.
And, what happened right in theGDPR timeframe
that the California ConsumerPrivacy Act, or CCPA, was passed
by the legislature And frankly,they were kind of forced into

(11:29):
it because a gentleman by the name of Alastair Mactaggart was going to put it on the ballot, and he was the person that wrote it. So, it passed, unanimously. It passed in 2018 and went into effect in 2020. But what Alastair immediately saw was that the industry was trying to water it down, so he decided he would upgrade the CCPA with

(11:49):
the CPRA (the California Privacy Rights Act), and he knew that he could not get it through the legislature. So, he did a ballot initiative, which became Prop 24 in the 2020 campaign, and that's where I hooked up with Alastair and worked on the campaign full-time as a volunteer for six months.

(12:10):
And I was basically the Chief Marketing Officer, and I'm very proud of the fact that it passed. It got over 9 million votes. And so I think that tells the audience and people that there's a significant appetite with consumers for more privacy. And I think one of the most significant things that was

(12:30):
added was that California now has a dedicated privacy agency, the California Privacy Protection Agency, which, when it gets fully staffed, will have more people in it than the FTC, the Federal Trade Commission, has for privacy. And then, lately, I've been working on the California Delete

(12:50):
Act.
I actually proposed this bill to my local State Senator, Josh Becker, and I co-drafted it and have been actively working on it, and this involves data brokers. So, just stepping back: companies we directly interface with that collect our information, you can call that 'first party data.' So, you can contact Walgreens or Walmart, say, "Hey, I'm a

(13:14):
customer," and you can exercise your CPRA or CCPA data deletion rights, etc. But the problem is that there are entities called data brokers with whom we don't have a direct relationship and we don't know who they are. And so I'm going to call that 'third party data,' and they collect a lot of sensitive information like our precise geolocation or medical conditions, etc.

(13:36):
Now, California and Vermont have a registry, but the onus is on the consumer to go to each and every one and say, "please delete me," and that could take hundreds of hours. So I thought it would be cool to have a single website where you can say, "Hey, go and delete my data." It's kind of the equivalent of the FTC Do Not Call Registry that last year Senator Ossoff

(13:57):
(who's a Democrat) and Senator Cassidy (who's a Republican) proposed at the federal level as well. So we kind of took that as the basis, and in the back of our heads we thought: the FTC Do Not Call Registry has over 240 million Americans using it. It's incredibly popular. And there's some federal stuff.
And there's some federal stuff.
So, the current status with that is that it passed: the

(14:18):
California Delete Act, Senate Bill 362, passed the California Senate, and it's now in the Assembly and needs to work its way through, etc.
So, it's going to be a long and tough road, etc. And then, the final thing is, I actually branched out beyond California. I worked on a Texas data broker registry bill that was signed like a week ago or so, and that's 2105, and that actually

(14:42):
makes Texas the third state to have a data broker registry law. Obviously, with the California Delete Act, we're trying to go well beyond just having data brokers register and provide the portal to allow consumers to do a global delete of their information as well. So, yeah, very active, and really where things are happening is at the state level. And I know your listeners know

(15:04):
that last year we had five states. We're now up to 10, with Texas being the 10th. Oregon is probably going to sign. I think Delaware is out there. So, you know, it's the full employment act for privacy people, with potentially 12 different state laws in play by the end of the year.

Debra J Farber (15:21):
Yeah, that is quite remarkable.
It kind of reminds me of the slow trickle of the data breach response acts in each of the states. We couldn't get one federal data breach law on how you respond to a data breach. Right? Like, if we can't get something as simple as a federal law on data breaches and how to respond to them,

(15:44):
do you think we'll ever see a comprehensive U.S. privacy law? I'm really pessimistic that we'll ever see one, mostly because the reason we don't have one now has nothing to do with privacy.

It has to do with (15:57):
Who's covered by the law? Are you exempting the federal government from it? Do you have a right to sue? These are some of the issues that Republicans and Democrats can't seem to agree on in order to pass a federal law that would apply to all the states and render the state laws no longer viable.

(16:17):
So, then the other thing is that state laws usually have the right to add more protections than a federal law and go beyond it, but it is possible for a federal law to prevent that as well. And so these are things that are being argued to death, and I just don't see it ever happening. What are your thoughts?

Tom Kemp (16:39):
Yeah, well, you did a brilliant job of talking about the path of the data breach notification laws. California passed theirs, and it took like 17 years before the 50th state (I think it was Alabama or something) added theirs. And then, you're right. I mean, if you look at the 50 different data breach notification laws, they have different requirements, et

(17:00):
cetera, and it's a complete patchwork. During the 2020 campaign, I was part of a policy group with one of the political campaigns, and in this working group I said, "Hey, let's add to the platform that we would have a national data breach notification law," and people

(17:21):
were like, "I'm not so sure about that." And so that kind of didn't make it, et cetera. And so it's difficult, right? I mean, the last major piece of privacy legislation was what? HIPAA or Gramm-Leach-Bliley? And that was more sectoral, and that happened in the '90s.

Debra J Farber (17:39):
Yeah, that was before I even started in privacy. That was like forever ago.

Tom Kemp (17:46):
It was before Meta / Facebook formed.
It was before the iPhone and the whole mobile revolution. We've had the complete growth of Big Tech occur in a completely unregulated way and manner. At the same time, over the last 30 / 40 years,

(18:06):
we've had a loosening of antitrust enforcement and laws. These tech companies have been able to basically make 600-plus acquisitions without them being questioned or stopped, et cetera. And so now we've woken up to a situation where people, in an

(18:29):
unfettered manner, can collect our information. And I think the best example of that is health care information. Yes, we have HIPAA, but that only applies to covered entities. But what if I decide to start a startup that's a health care app? Right? I can collect the information; as long as I put in my privacy notice that I will turn around and sell it, I can do anything

(18:50):
with it. And that's a fundamental problem. But I am hoping that eventually we will get to the point where we hit 15 / 20 states that have privacy laws, and people will eventually have to say, "Hey, we finally need to get this across the goal line," et cetera.
But there are a couple of key sticking points that you brought

(19:11):
up, the first of which is the 'private right of action.' Right? That's always been a sticking point. Republicans don't want people to be able to sue, while the Democrats are interested in that. I think you can actually find compromise. You can maybe limit that to identity theft or something. I think there could be a compromise. But the other issue, of course, is 'preemption,' and the

(19:33):
Republicans want a federal law to basically represent the ceiling, while Democrats typically want it to be the floor and allow states to be able to innovate. And it will be more difficult to find a compromise there. Although I have a few ideas about that; but nonetheless,

(19:55):
that's kind of where we're at right now.

Debra J Farber (19:56):
Yeah.
So, OK. So if there were a comprehensive U.S. privacy law: you talk in the book about some of the ingredients of what should go into it. Do you mind?

Tom Kemp (20:08):
Yeah, sure.
I mean, well, first and foremost, consumers should have privacy rights. Obviously, they're not spelled out in the U.S. Constitution, so we need to actually document them in legislation. I think the GDPR has always represented the 'gold standard,' and they do a very nice job.

(20:28):
And the CCPA, I think, had about 60% matching to the GDPR consumer privacy rights. And then CPRA added a few. For example, the CPRA added the 'right to correct.' And the fundamental thing, at least in California, is that consumers should have the 'right to know' (i.

(20:49):
e., what information is being collected) and they should have the 'right to say no to the sale of their information.' And California even introduced the concept of 'sensitive personal information' as well.
But I really think where this needs to go is, if you look at AI, the Biden administration proposed an AI Bill of Rights.

(21:09):
And if you look at both the GDPR and even the CPRA, the focus really has to do with: "Hey, you should have the right to reject automated decision making." That you basically say, "I don't like that decision because it was all done by algorithms, et cetera, and you created a profile of me and you inferred information and

(21:30):
you made the decision." Now, that's actually in the GDPR; with the CPRA, that needs to be designed as part of the regulation process, and the California Privacy Protection Agency has not yet written the actual regulations in that specific area. But the question is: is the GDPR, and even the CPRA with its

(21:52):
focus on automated decision making, already out of date, because we have now seen AI be used in a generative sense to create images and text? And so what rights do you have if people use your image, your text, your content, et cetera? So, I talked about some of the rights that people should have

(22:16):
as it relates to AI, beyond the traditional privacy rights. So, they're kind of merging together. And I think we need to, using a Wayne Gretzky term, you know, "pass the puck to where the player is going to be" versus where the player is right now. I think we need to factor in AI and maybe broaden the definition of the 'right to reject automated decision-making' to

(22:39):
include more of the generative uses of AI that could be used in a discriminatory manner as well. So those are the things on the consumer privacy rights side that should definitely be in a U.S. privacy law.

Debra J Farber (22:54):
Yeah, thanks for that.
That's a pretty comprehensive list that really resembles the GDPR, to the extent that you can at the U.S. level. But what is always interesting is that we talk about things in terms of 'what is a right.' We have new rights, but what that does for organizations is

(23:14):
now create new obligations for the business. So, can you address what companies would need to focus on to comply with a comprehensive U.S. privacy law?

Tom Kemp (23:24):
I think kind of the nature of your question is
that's great that people haveconsumer privacy rights, but
there needs to be a set ofcorresponding business
obligations and you knowbusiness obligations such as:
data minimization, prohibitdiscriminatory uses of data.
Obviously, businesses need torespond to rights requests and

(23:45):
they should be able to implementappropriate security measures.
Obviously, you're very familiarwith, in Europe they have to
have Data Protection Officers,right?
I think that actually in the U.
S.
should be a good thing.
maybe make it for biggerorganizations or organizations
that have more monthly users;and then, potentially for highly

(24:06):
sensitive uses of personalinformation, there probably
should be data protection impactanalyses.
But, I think there's two inparticular that I want to focus
on that I think it's becomingmore pressing and it's hitting
the headlines .The wholetargeted advertising, the
behavioral advertising, whichdigital surveillance pumps data

(24:26):
into the whole advertisingecosystem here.
You know, if you look at Europe, kind of the next wave of
regulations they're doing is theDigital Services Act and the
Digital Markets Act And theyexplicitly ban the use of
sensitive data for targetedadvertising and they explicitly
ban the use of targetedadvertising for kids.

(24:49):
And, I think we really need tolook at that either as part of a
child safety law or a revamp ofa law like HIPAA that your
sensitive data cannot be used.
Because, if .
you , today being June 30th whenwe're recording this, this that
an article came up from themarkup that pharmacies are

(25:10):
actually sending -you you knowover-the-counter prescription
and medical purchases andsearches - to Meta via the
Metapixel.
And, so now there's a link ofIP addresses that people have
said you know HIV tests thatthey've searched for or they're
purchasing, etc.
or plan B - you know things ofthat nature that's available

(25:34):
over the counter but supersensitive.
Right?
And so, the fundamental questionis is that, should not HIPAA
protections be extended beyond'covered entities,' and also the
focus on kids?
Should there be a version twoof COPPA or COPPA?
I don't know how peoplepronounce it.

Debra J Farber (26:01):
Yeah, whatever you want it to be.

Tom Kemp (26:01):
But the version two that Senator Markey is proposing has gotten traction; it actually does, I believe, ban targeted advertising to kids as well. So, that's one area of particular interest. The second area, which I would definitely like a federal privacy law to mandate, is support for Global Privacy

(26:26):
Control (GPC), because the fundamental problem that we have, and this is a problem in Europe as well, is that it's death by cookies. Right? Every time you go to a website: accept cookies, yes. But if you say "No," then you get this page that pops up: do you want analytics, do you want marketing, blah, blah, blah, and it basically becomes a big old dark pattern

(26:46):
where you're like, "I just want to see who won the football game today! I don't want to spend five minutes on each website accepting cookies and doing all these things," etc. And so there really needs to be an opt-out signal, and the fundamental issue that we have is that privacy is too hard for consumers; they have to constantly tell

(27:09):
organizations, "Don't collect information," and do the accept-the-cookies thing and blah, blah, blah. They should be able, through their browser or on their mobile phone, to just say, "No, I don't want my data sold or shared," etc. And that should just be a setting that you have in your browser, and businesses should respect that.

(27:31):
And the other way to makeprivacy incredibly simple is for
the third party entities, thedata brokers, to have what I
described in the CaliforniaDelete Act, which is the ability
to go to a single page and say"Delete me and no longer track
me.
As well.
Just imagine if everyone had tosupport Global Privacy Control.
Then, you just set something inyour browser on your phone and

(27:53):
you take 30 seconds to do that,and you cover all the first
party data.
And then, if you go to awebsite and put your email
address and some otheridentifiable information about
you as it relates to databrokers and hit "submit, that
covers you for all the thirdparty data: 30 seconds on your
phone and your browser, 30seconds going to a website.
That would fundamentallyaddress the pain and suffering

(28:15):
that consumers have to gothrough with this whole cookie
whack-a-mole thing or trying toworry about well who's really
collecting and selling my data.
Because the reality is databrokers are entities that we
don't even know about because wedon't directly interact with
them as well.
So, I think fundamentally we need to look at it like tech companies design their products: from a user interface

(28:37):
perspective. They start with, "What's the best user interface?" We tend to design privacy backwards, from the back end, and it's painful for the consumer. We should redesign privacy products around how we can make this as friendly as possible for consumers. And frankly, a lot of the Big Tech companies don't want to do that, because their business model is collecting as much data

(28:59):
as they can. At some point we've got to put our foot down, because even though we may technically have the rights, like in California, people are not taking advantage of them, and we need to allow people to take advantage of them.

Debra J Farber (29:12):
Yeah, I mean, you've definitely said a lot of interesting things there. The first one that comes to mind is with GPC, Global Privacy Control. One of the downsides I see to it is that, I mean, it's called Global Privacy Control, but if you're opting out of something via the web, it doesn't opt you out from mobile or from other ways that you're interacting with that organization. So a person might come away with the

(29:35):
false sense that they opted out of a particular company completely, but it's really just through one channel. So I would love to see some improvement around the GPC methodology; maybe, on the back end, there's a way they can link your accounts so that you could delete them.

Tom Kemp (29:55):
I 100% agree with you.
I mean, GPC right now is kind of a browser plugin, and obviously in a mobile world you use mobile apps. But at the end of the day, you really need the opt-out signals to be hard-coded in the law, and then you can set the standard and say, "Okay, you need to start supporting that in 2028."

(30:17):
And then the technology will catch up from there. But the key thing is, there's not consistency across the current set of privacy laws at the state level on whether or not you actually have to honor the Global Privacy Control. I would like to see that as a standard at the federal level, as a business obligation, and then that will drive the technical challenges of mobile versus browser, etc.
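[For readers curious about the mechanics Tom describes: under the Global Privacy Control proposal, a browser with the signal enabled sends the HTTP request header `Sec-GPC: 1` (and exposes `navigator.globalPrivacyControl` to page scripts). A minimal server-side sketch of honoring that signal might look like the following; `handle_request` and `user_id` are hypothetical names for illustration, not part of any specific framework or law.]

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out.

    Per the GPC proposal, participating browsers send the request header
    `Sec-GPC: 1`; any other value, or its absence, means no opt-out
    preference was expressed.
    """
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


def handle_request(headers: dict, user_id: str) -> str:
    # Hypothetical server-side hook: when the opt-out signal is present,
    # treat it like a CCPA/CPRA "do not sell or share" request.
    if gpc_opt_out(headers):
        return f"opt-out recorded for {user_id}; data sale/sharing disabled"
    return f"no GPC signal for {user_id}; default preferences apply"
```

[The appeal of the signal, as Tom notes, is exactly this simplicity: one browser setting, one header, and the business-side check is a few lines.]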

Debra J Farber (30:39):
Absolutely.
I think it's an excellent first step, a major step in fact, because that's what's going to get the attention from Big Tech: resources, people starting to think about tech solutions, much like with the death of third party cookies and Google coming up with new ways to look across market segments, like FLoC and others.

(31:01):
Even if there have been some missteps there, they're at least putting a lot of minds together to try to tackle the problem due to what's in the law, like the GDPR. But still, I definitely think it's going to move the needle. So how would we enforce this law, in your dream world, if we had all of these components, these rights set up and these business obligations, for a federal, comprehensive U.

(31:21):
S. privacy law?
How should it be enforced?

Tom Kemp (31:24):
I think, look, we need a dedicated supervisory authority. It's like the judge shopping we now see in the U.S., where people all run to one judge and file their lawsuits in Texas because that judge is the most friendly. And a lot of the complaints we've seen in Europe are that,
And so what we've seen a lot ofcomplaints in Europe is that,

(31:48):
"Oh well, these companies are based in Ireland," and there's a lot of political pressure because Ireland's a small country and Meta and some of these other businesses are very large employers, et cetera. And so now you actually start seeing, even with the DSA and DMA, more centralization, whereas GDPR made it more disparate, with

(32:08):
the local countries, et cetera.
So, I do think we need a strong central authority. And that central authority, that agency - could it be part of the FTC? Sure. Could it be separate? I think there's a lot of synergy in having it right there with the FTC. And it should hold the authority itself, as opposed to having

(32:30):
the authority distributed across the Attorneys General of 50 states, et cetera. So, I think that should be the case - just like there's one Attorney General of the United States. And, of course, you could have 50 State Attorneys General, and you could have statewide supervisory authorities, but there should be a strong central one. Obviously, there should be penalties and the private right of

(32:50):
action.
I think you could probably split the difference and say, "Okay, let's just have the private right of action like we have in California." That's more narrowly focused: it applies if the data that was stolen is the kind that enables someone to actually be hacked and breached from an identity theft perspective. And so maybe

(33:11):
you could have the private right of action apply to medical information or hacking or whatever, as opposed to being broad.

Debra J Farber (33:18):
And just to interrupt for anyone who doesn't know: a "private right of action" means that an individual has a right to sue the company for malfeasance, as opposed to their only recourse being to go to a regulator, report it, and hope that the regulator prosecutes.

Tom Kemp (33:35):
Exactly. For example, in California, if I go to Meta and say, "Please delete my information," and they blow me off, I can't sue Meta for not allowing me to exercise my right of deletion. But the Attorney General or - starting actually tomorrow, July 1st - the California Privacy Protection Agency: if I raise a

(33:58):
stink with either entity and say, "Hey, Meta's not letting me delete my information," they could go after Meta. So that's the bugaboo with not having a private right of action: you're waiting, as you said, for a regulator. But the bigger issue with a federal law, in the area of enforcement, has to do with preemption.

(34:19):
And I'm actually of the mind - look, Washington moves at a glacial speed, and it's been decades since we've had anything significant privacy-related, with HIPAA and Gramm-Leach-Bliley. Even if we passed a federal privacy law, it would be so hard

(34:39):
to amend it, et cetera. So, I actually prefer that the states be the "laboratories of democracy" - quoting Justice Brandeis, who said that 100 years ago - and that states be allowed to use a federal law as a floor, especially because technology moves so fast.

(34:59):
But, I understand that's a bone of contention. Maybe there's a way you can do it, which is: pass the federal privacy law and say that for the first few years there's no preemption, but after three or four years it could preempt the states. That maybe could be a good enough compromise, although then the federal government basically says, "This is it, take it and

(35:22):
that's it, and you can't add to it as well." So, maybe there is a way - I haven't really thought this through - but maybe there is a way to compromise as it relates to preemption, or maybe give California a carve-out, because California historically has been the most aggressive and has set the standards, et cetera.

(35:42):
But, I do fundamentally believe that, especially when it comes to consumer protection - I am of the mind that, when push comes to shove, the federal government should not preempt states when it comes to privacy, given the fast pace of technology and the slow speed at which the

(36:04):
federal government acts. So I think we always need to have those laboratories of democracy, as Brandeis said. So, the private right of action and preemption have always been the bones of contention, and maybe in the end we're going to need one side or the other to get a decisive victory in Washington to finally just push a federal privacy law through.

Debra J Farber (36:24):
Yeah. Well, I'll keep my fingers crossed, but I won't dedicate too much brain space to following that, just because I think it's going to take forever. But, I think the approach you're taking is worthwhile, even if it doesn't get passed, because it raises the issues by

(36:45):
comparing it to state legislation. I really do hope that we get a federal privacy law. I think everybody says that. The challenge is what goes into that federal privacy law to get it passed. But we'll see. These are interesting times. Okay.
So, let's switch gears from the book and talk a little bit about

(37:06):
your work as a Seed Investor with Kemp Au Ventures. What do you look for in a privacy tech or security company before investing?

Tom Kemp (37:14):
So, I've been very fortunate to have had good success with some of the companies that I started. And, being here in Silicon Valley, it's very common to have friends who are starting new ventures, or people in your network, and you want to give them some Angel money. It's just like if one of your listeners wanted to start an

(37:36):
ice cream store and got some money from their Uncle Larry, who's a dentist and puts some money into it. I'm not a venture capitalist. I'm just a guy who works with a business partner, Adam Au, and together we invest some money in startups. So far, we've done 15 investments that are active

(37:58):
right now. And obviously, given my background, it's very much security and privacy tech related. So, what do I look for in a company? First of all, starting a company is a team sport. You need great co-founders. Oftentimes someone comes to me and says, "I've got a great idea," and I'm like, "Well, who else is working on it?"

(38:19):
And it's like, "Well, I'm the CEO, and if I get a bunch of money, I'll hire a bunch of people." But no, you really need to have a team. You need to have some Co-Founders working with you, because if that one person, God forbid, gets hit by a bus, then everything goes away. But, if you have a team to bounce ideas off of, so the business can be sustained if one of the founders steps away, then

(38:42):
it makes it more sustainable. The second thing is you really need to pick a product that solves a real pain point, and you need to make sure that, whatever company is forming, their solution is a painkiller, not an aspirin. Oftentimes there are just too many nice-to-have products,

(39:04):
and you really need something where customers are in a great deal of pain, and they're willing to bet their time and their money on a small startup and go out on a limb and buy that product, because it will really, fundamentally help their business. So, no "nice-to-haves." You've got to have a "must-have." And so, I spend a lot of time trying to figure out: is this a

(39:25):
real pain that this product solves? Obviously, you want a large market - not only does it help you raise money, but it gives you room to make mistakes and still succeed. Oftentimes, though, people end up with a little niche product and say, "Well, it's part of this huge privacy market," but you need to be realistic. Are you really just a tiny, tiny niche, or is there

(39:49):
really a sizable market - where, if you multiply the number of people that would buy your product times the average revenue, you actually could build, say, a $100 million business in five, six, seven years? And then the other thing - probably the last thing - is that it can't be too crowded a market. I have people come to me and say, "Oh, I'm going to build a

(40:11):
better product than these other four or five startups," but those four or five startups have raised $50 million while this founder has $0 in the bank and just a slightly faster widget, et cetera, and it's like - boy, that's going to be really difficult to succeed. So you want to find a market that's going to be big enough,

(40:31):
but at the right time - one that's not as competitive as well. So, those are some of the things that I look for, which

is (40:39):
Is there a team? Is it a must-have? Is it a large market? And is it not too crowded a market, for the segment they've initially picked?

Debra J Farber (40:49):
Yeah, that's great. It's really insightful. And I know the check size you normally give out - this isn't just $10,000 or $25,000 Angel-level checks; you're doing seed investment, looking at how much a company can scale. So, of course you're going to care, because you're putting some significant money down.

Tom Kemp (41:07):
Yeah, I mean, for some companies that need some more money, I will bring some other business colleagues and friends to the party. And so, typically, I'd call it a consortium.

(41:29):
But, if they need more money, then they should probably get an early-stage VC, which has institutional money to put in; there are a bunch of seed VCs out there with $30-$40 million funds, and those seed VCs may put in, say, $1 million to $1.5 million, and then we would put in a couple hundred thousand.

(41:50):
And so, we would be kind of like the second leg or the third leg of the stool. So, it really depends on how early they are - the stage. But we've written some bigger checks collectively, and sometimes I've written $25,000 checks just to be part of a consortium, et cetera. So, it really depends on what the company's needs are and how

(42:13):
excited I am, et cetera. But it's fun, because here I am in Silicon Valley - this is where people make things happen - and it's great to be on the ground floor, working with entrepreneurs to get their dreams and visions funded.

Debra J Farber (42:27):
I think that's great. We're cut from the same cloth; I just don't have the money you do to invest, so I invest with sweat equity. So, I want to talk about some of the privacy tech companies that you've invested in - like Secuvy, Privaini, and Privacy Code - and what inspired you to invest in them. But, I first want to disclose to the audience that I'm on the Advisory Boards for Secuvy and Privaini, and I am an Angel

(42:50):
Investor in Privacy Code, so weboth have vested interests here
in these companies.
But, I am curious about, youknow, two of those companies -
Privaini and Privacy Code - arekind of creating their own
category in privacy tech.
Right?
You were talking about before,you don't want to invest in
companies where there's acrowded market, but it does seem

(43:11):
like you invest where there's a brand-new category, and that has its own set of challenges - being heard in the marketplace, and figuring out where exactly you fit into customers' needs. A lot of what I help them with as an Advisor is go-to-market fit: how do you stand out and get heard among the other privacy tech and security companies out there?

(43:31):
You know, I guess this is to say: I understand the draw of working with eager startups with really great ideas, but I'm still curious what inspired you to invest in them, and in companies like them.

Tom Kemp (43:45):
Oftentimes you look for analogous markets or segments that pre-exist and are popular there, and then ask whether, as you go to a new regime or new industry, this would be applicable. So, Privacy Code really focuses on Privacy Engineering - baking privacy into the product development process, etc.

(44:09):
In the security world, there was the whole DevSecOps movement and Security Engineering teams; people 15-20 years ago realized, "Wait a minute, when we build software code, we need to build security into it." For me, when the founders of Privacy Code presented, it was

(44:34):
like my light bulb went off. What motivated and drove the early security market was PCI-DSS, etc. And then there were a bunch of issues with code not being secure, and engineers had to become more security-conscious and build it in. I felt that, now that we have Gramm-Leach-Bliley - I mean,

(44:55):
sorry, we have GDPR. We have CPRA. We're going to have the DSA and DMA. We're going to have Kids Design Codes like the California AADC - the Age-Appropriate Design Code. But software engineers have no clue what they need to do and build, and how they should handle information, and so Privacy Code uniquely provides kind of a

(45:16):
library that helps engineers build. And so that's really your main concept - the "shift left": we need to shift privacy into the building of products. Similarly, in security there have been security risk analysis scorecards, etc.

(45:37):
Is this vendor secure? Is this product secure? What's the code? What are the rankings, etc.? And Privaini, with their CEO, Sanjay, does the same thing for privacy, in terms of looking at public privacy policies. They look at historical breaches that have occurred, and it allows, for example, people in the purchasing

(46:00):
department, or the Chief Privacy Officer, when they deal with third parties, to do an assessment - and I thought that was just brilliant. I saw the success that companies have had in security, and this needs to apply to privacy as well, because how do you actually quantify how good a company is as it relates to privacy? You eventually need to apply a score, and they do that.

(46:23):
And then the other one - great minds think alike, with you working with these companies - Secuvy: I really like the fact that they focus a lot on the use of artificial intelligence, and that they focus on unstructured data. I really felt that the smart, intelligent use of AI could add a lot to the mix if you

(46:47):
build it in from scratch, as opposed to trying to bolt AI on after the fact. Build it in from Day 1 - the intelligence, the machine learning, etc. - and then apply it to finding PII in unstructured information; the majority of data is actually unstructured. I thought that was just a cool idea as well. So, a lot of the reasons why I invest are like

(47:11):
"Hey, this has been a great idea in, say, security or some other space; now, in this new world of privacy, driven by all these laws and regulations and consumer awareness, are those things applicable in the privacy world? If yes, then there's a great opportunity there."
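As a toy illustration of the problem space Tom describes for Secuvy - surfacing PII inside unstructured text - here is a naive, rule-based sketch. It is our own illustration, not Secuvy's approach: the patterns and sample strings are made up, and the brittleness of fixed regexes like these (missed context, false positives) is precisely why ML-driven discovery adds value.

```python
import re

# Toy sketch: naive pattern-based PII discovery in unstructured text.
# The patterns are illustrative only; real discovery tools use ML/NLP
# to handle context that simple regexes miss.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def find_pii(text: str) -> dict:
    """Return {pii_type: [matches]} for each pattern that fires."""
    hits = {}
    for pii_type, pattern in PII_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            hits[pii_type] = matches
    return hits

# A hypothetical note buried in a document store:
note = "Reach Jane at jane.doe@example.com or 555-867-5309 (SSN 123-45-6789)."
print(find_pii(note))
```

Running this on the sample note flags the email, phone number, and SSN; going beyond such fixed patterns to context-aware detection at scale is where the AI-first design Tom describes comes in.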

Debra J Farber (47:29):
Yeah, that makes a lot of sense. I'm equally inspired - obviously, this is called "The Shifting Privacy Left Podcast." I believe that we need to shift left into engineering and DevOps, and these tools really help with that, and with managing information earlier on in the organization. So that makes a lot of sense.

(47:52):
Thanks for sharing your insights there. I guess the last question I have for you is: do you have any advice for how companies can better shift left in their organizations and within their business networks?

Tom Kemp (48:03):
Look, the privacy market is incredibly dynamic, and there's so much activity. If you took a six-month leave of absence and then came back to work, you'd be like, "Oh my gosh, there are like five more states," right? And so I really highly recommend that, if you're going to be a privacy professional, you can't rest on your laurels;

(48:25):
and people need to listen to people like yourself, with your podcast, to stay up to date. They need to read the daily IAPP newsletter, participate in the forums on LinkedIn, etc., and carve out time every day or every week to make

(48:48):
sure that you're keeping abreast of where things are going, etc. Because, unfortunately, privacy is very fragmented, right? There is no national privacy law. There are 10 state laws. There are 50 data breach notification laws. There's Europe with GDPR, but the DSA (Digital Services Act) and Digital

(49:09):
Markets Act (DMA) are going to apply to some companies as well. So, from my perspective, I believe organizations should have dedicated people focused on privacy, and that privacy - to your point - should not be an after-the-fact thing; they need to start baking it

(49:32):
into the building of their products and into how they interact with consumers, etc., because what's happening now is that consumer expectations are sky-high. Consumers have been burned too many times - with data breaches, with Cambridge Analytica, etc. - and so they have high expectations.

(49:53):
And again, it goes back to what I said before: think about it from the consumer's perspective. Too often we build things from our own perspective, based on what our needs are, and you need to flip that around. So, my suggestions are: have dedicated people, and carve out time to keep up to date and abreast of what's happening, because it's incredibly dynamic.

(50:13):
If you don't like that, then this is probably not the field for you. Sorry. You've got to keep up with what's going on at the state level, because you may have a couple thousand customers in Texas, and HB 4 just passed - the Governor just signed it, and it becomes effective next year. So, guess what? If you want to avoid a lawsuit from some of your consumers in Texas, you've got to follow Texas privacy law.

Debra J Farber (50:38):
I mean, just to hang a lantern on that and underscore it - pick your metaphor - I think that applies to the profession generally, beyond even just new laws. There are constantly new standards, new frameworks, new technologies arising and causing new problems, creating new points in time where you need to rethink your

(51:00):
approaches. Should you scrap what you've already built because it doesn't comport at all - you can't make it legal, or even beyond legal, customers won't find it trustworthy anymore? Should you buy something else that just makes your privacy posture better? Or do you need to deploy a new architecture or governance? I mean, it's constant learning. I can't imagine someone successfully being in privacy,

(51:24):
whether it's an engineer, a lawyer, an operations person, an architect, what have you, without constantly learning and constantly reassessing what their approach should be. For me, it's part of why I'm drawn to this space: I'm never bored. There's so much; but, I do have to say, it can be

(51:45):
overwhelming. If you're looking broadly, and not within a smaller domain within privacy like I do, you can get overwhelmed by the deluge of information, because there's so much going on in the space. And, as you said, customers - people - are moving the markets these days. It's not just, "Oh, another fine for a data breach, another fine for
a data breach.

(52:06):
" We humans were just getting used to that. It was sad; it was not causing any change. But, as you said, with the realization that we're stuck in a surveillance capitalism, Big Tech spiral - how do we get out?
It is, I think, consumers who are looking to feel trust - who want

(52:26):
to trust companies, or who won't give data to companies they don't trust, or won't buy something from companies they don't trust. And that, I feel, is driving a "shift left" movement, at least in creating products that consumers will trust more. Now, hopefully, this will continue to be a trend and there'll be more transparency, but it's going to take laws to

(52:47):
also keep them in line.

Tom Kemp (52:49):
It's a dynamic market and it can be challenging, but the good news is, if you can keep up with it, then you become even more valuable and strategic to your company and in your career.

Debra J Farber (53:00):
Absolutely.
100%.
Thank you for adding that.
Well, Tom, any last words toour audience before we close?

Tom Kemp (53:05):
No, I think we hit on a lot of stuff, and I really appreciate you interviewing me. This has been amazing - a great podcast episode. If people want more information about the book, they can go to tomkemp.ai; or, if you just want to go ahead and pre-order, go to containingbigtech.com.

Debra J Farber (53:25):
Excellent. Well, Tom, thank you for joining us today on Shifting Privacy Left to discuss your new book, what a comprehensive U.S. federal privacy law should look like, and some of your privacy tech investments. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest.