
March 26, 2024 75 mins

In this week's episode, I am joined by Heidi Saas, a privacy lawyer with a reputation for advocating for products and services built with privacy by design and against the abuse of personal data. In our conversation, she dives into recent FTC enforcement actions, analyzing five FTC actions and some enforcement sweeps by Colorado & Connecticut.

Heidi shares her insights on the effect of the FTC enforcement actions and what privacy engineers need to know, emphasizing the need for data management practices to be transparent, accountable, and based on affirmative consent. We cover the role of privacy engineers in ensuring compliance with data privacy laws; why 'browsing data' is 'sensitive data;' the challenges companies face regarding data deletion; and the need for clear consent mechanisms, especially with the collection and use of location data. We also discuss the need to audit the privacy posture of products and services - which includes a requirement to document who made certain decisions - and how to prioritize risk analysis to proactively address risks to privacy.

Topics Covered

  • Heidi’s journey into privacy law and advocacy for privacy by design and default
  • How the FTC brings enforcement actions, the effect of their settlements, and why privacy engineers should pay closer attention
  • Case 1: FTC v. InMarket Media - Heidi explains the implication of the decision: where data that are linked to a mobile advertising identifier (MAID) or an individual's home are not considered de-identified
  • Case 2: FTC v. X-Mode Social / OutLogic - Heidi explains the implication of the decision, focused on: affirmative express consent for location data collection; definition of a 'data product assessment' and audit programs; and data retention & deletion requirements
  • Case 3: FTC v. Avast - Heidi explains the implication of the decision: 'browsing data' is considered 'sensitive data'
  • Case 4: The People (CA) v. DoorDash - Heidi explains the implications of the decision, based on CalOPPA: where companies that share personal data with one another as part of a 'marketing cooperative' are, in fact, selling data
  • Heidi discusses recent State Enforcement Sweeps for privacy, specifically in Colorado and Connecticut, and clarity around breach reporting timelines
  • The need to prioritize independent third-party audits for privacy
  • Case 5: FTC v. Kroger - Heidi explains why the FTC's blocking of Kroger's merger with Albertsons was based on antitrust and privacy harms, given the sheer amount of personal data that they process
  • Tools and resources for keeping up with FTC cases and connecting with your privacy community 




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Heidi Saas (00:00):
They want to know what data segmentation you have in control to make sure that the information collected is only used for that purpose. So, they've got the purpose limitations built in here, as well as a data retention limit set. They did not say how long. Previously, in the location data case, they said five years was too long. You can read between the lines on that.

(00:21):
But in data retention, you've got to have at least one, and then tag your data with that, because you're required to have the technical means for achieving deletion and you've got to be able to demonstrate that. That is where the privacy engineers have longevity for work right there - all day. We need people to come in and know how to do this.

Debra J Farber (00:40):
Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the

(01:00):
bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome everyone to The Shifting Privacy Left Podcast. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest: Heidi Saas, Founder of

(01:21):
H.T. Saas, LLC, where she serves as an attorney focusing on privacy and technology. Heidi is not afraid to publicly point out when companies are behaving badly and makes public calls for them to do better. She states on her LinkedIn profile, "I rip tech tools apart, sometimes to make them better and sometimes to light asses on

(01:44):
fire. I work with tool makers, startups, VCs, small business owners, research groups and some nice people I don't talk about. I understand people and systems. I am not cheap and I will not do box ticking duties nor marketing or sales for your business." I love it. She's like a hero of mine.

(02:04):
Her advocacy has been so helpful to the community and she's pretty much always right about everything she says. She comes with receipts, and today we're going to talk about recent Federal Trade Commission (or FTC) enforcement actions, as well as some recent updates and information about enforcement sweeps from a couple of

(02:24):
U.S. states, specifically Colorado and Connecticut. Heidi recently did a LinkedIn Live event with privacy and security communications expert, Melanie Ensign - also a good friend - and I know her to be just a tremendous powerhouse in that space of communications. Melanie's the CEO at Discernible, and their LinkedIn

(02:45):
Live event goes over the same enforcement actions we're going to talk about today, with the added benefit of Melanie's wisdom around communications. After finding that presentation so informative, I just knew I had to bring Heidi on to The Shifting Privacy Left Podcast to talk to our community of privacy engineers, especially since these actions will likely affect your companies and how

(03:06):
they architect for privacy going forward. So, Heidi, welcome. I'm so excited to chat with you today.

Heidi Saas (03:13):
Thank you so much for having me. I'm very excited to address your community of people. I am not an engineer, just to start with that. I'm just a lawyer, just another suburban mom on social media.

Debra J Farber (03:25):
Oh, I think you're selling yourself short. But yes, you come from the legal background, as do I, but your focus has absolutely shifted to privacy. In fact, you didn't start out in privacy, so why don't you kick off the conversation by telling us a little bit about your journey into privacy law, and then why advocating for privacy by design and default has been so important to you?

(03:45):
And, you might also throw in why you hate data brokers so much.

Heidi Saas (03:52):
Well, thank you for the introduction and for the kind words. I really appreciate that. I also have a lot of respect for what you're doing in the community and bringing people together and educating us. We learn so much when we listen to each other. I'm honored to be on your show. My journey started with - I'm first in the family to go to college. I grew up in a blue collar community and I saw there are

(04:12):
several different components to power. There's money in the law, primarily, so I wanted to find out: how does that work? I went to undergrad in DC and I worked at a lobbying law firm and quickly figured out how that works. Then I went to law school and started working consumer rights for consumers. At that time, it was the financial crisis, and you could

(04:32):
just buy spreadsheets of accounts, and people were collecting on it and using the court to garnish wages, and it was a disaster. We ended up lobbying for Dodd-Frank and the CARD Act and reform in the state legislatures on wage garnishment and bank levies. It was a mess. At that time, the ability-to-repay algorithms being used in finance were not really based on science, because they didn't

(04:54):
really have any regulations that said here's what ground truth looks like, or construct validity, when you're dealing with algorithmic bias and discrimination. So that had to be handled. Dodd-Frank came in; the CFPB was established. They started to standardize things and get some rules in place, and things started to get better. But, after working in consumer rights and seeing the things

(05:16):
that I saw and, oh, the many shady areas that I had to traverse, I decided that I've got to start working more on data. And I found privacy because I said that's the only place where I see we can have agency over the information being used against us in all these life-critical areas. So I did, oddly enough, a Google Analytics course.

(05:36):
They used to have them posted for free - and the funneling and lookalikes and how you set up targeted advertising - and I was appalled. I was shocked: "How can this be? Do other people know this is happening?" I was so shocked I couldn't believe this was happening. So I said, "I've got to learn more." Not because it was just Google that inspired me; it was just

(05:57):
the level of harm that was about to become apparent because it was so hidden behind complexity. People didn't even know why things were not happening for them. There was no explainability. There are certain things that are required under the law, like 'adverse action notices,' when information is used against you in a hiring decision, and it tells you what data was used

(06:18):
against you so that you can have a right to dispute it and correct it on your credit report. But nobody's doing that, because they're using data they bought from data brokers and feeding it into snake oil prediction tools and calling everything LLM-powered - now, whether it's actually attached to the LLM in the background or not, right. So, yeah, we are data. All we are is data.

(06:39):
I learned as much as I could about privacy. I took the data privacy exam in 2019 and then started working at the beginning of Covid doing consulting, which was kind of odd, but for me it worked in the odd kind of way where everyone in the world was forced into my parameters. I was working from home with no childcare, and then suddenly

(06:59):
everybody else was too. So, I said, "You know what? I want to meet as many people as I can find. I want to find the researchers that are writing these papers that fascinate me. I want to find the people that are doing ML engineering and I want to learn from them." All of these things, like reinforcement learning - that was integral. I thought that was so important to figure out right after I had just experienced parenting.

(07:21):
So these are just some of the things that got me onto the path that I'm on now with privacy and ethical AI. It was all based on consumer rights, because that's the only thing we have now to enforce the rights we have in the information being used against us.

Debra J Farber (07:36):
That's right. We don't have fundamental rights of privacy, except for Fourth Amendment stuff. It's more based on consumers' rights, and only where there's a harm.

Heidi Saas (07:44):
Yeah, we don't really have those Fourth Amendment rights. For right now, you don't have an expectation of privacy in information that you share with a third party, so they're pretty free to sell it. They're trying to address that issue, but Congress is falling short because it's obviously an election year. But, every year, they need the same targeted advertising

(08:05):
systems to generate campaign dollars, and so, yes, they cannot effectively legislate on this issue without impacting their own standing, their own seat in Congress.

Debra J Farber (08:14):
That definitely makes it tough. So, today we're going to unpack six recent cases - enforcement actions, mostly from the FTC (the Federal Trade Commission) - and I'll list them as an agenda for the audience right now. The first is the FTC v. InMarket, where we'll talk about sensitive data.

(08:35):
The next is the FTC v. X-Mode, or OutLogic, where we'll talk about location data. The third is the FTC v. Avast, which is relatively new, and that's around browsing data. The fourth is The People v. DoorDash - so that's really California, the State of California, suing DoorDash - where there's a mismatch between

(08:56):
promises in the privacy notice and the technical capabilities they had. The fifth is the Connecticut and Colorado state reports and some of the enforcement sweeps that they've done so far. And, last but not least, we'll talk a little bit about the FTC v. Kroger, which has to do with stopping the merger between Kroger and Albertsons.

(09:18):
And so, before we dive deep into those specific cases, can you give us a high-level overview of FTC enforcement for the audience? How does the FTC bring enforcement actions against companies, what is the effect of their settlements, and why do privacy engineers need to be paying closer attention to these cases?

Heidi Saas (09:37):
So, I want to start with: I am a data privacy and technology attorney, but these are my personal opinions, not legal advice. If you have legal questions, I would ask that you take notes and take them to your counsel, based on the information that I am sharing with you today for educational purposes. Now, how the FTC will start: sometimes they'll get a complaint that will ask for an investigation from an outside third party, like EPIC or another

(10:01):
consumer group, to say, hey, something's going on over here. This is what we think is happening. We're asking you to investigate. So that's one way they'll bring it. Another is consumers will bring information and make complaints through their portal, where they have an overall view and they share that. It's a Sentinel Network. They share that with other agencies so that they can get an

(10:22):
above-ground look at what's happening between businesses and consumers out there. And so sometimes they get a mass of complaints about a certain thing - funeral homes and the problems that they had recently with funeral homes. They got a lot of complaints in that area, and they say this is worth investing our assets in doing some investigations. At that point, once they've identified something they need

(10:44):
to look into, they send a 'civil investigative demand,' a CID letter. The CID letter shows up and says, "Hey, we want to talk to you about these particular issues; we want to know these types of things," and you need to set up a meeting with them pretty soon so that you meet and confer. You need to do that within two weeks so that you can talk to

(11:06):
them and set out a timeline for how you're going to hand over information to answer their questions. Now, at this point, they may be delighted with what you're doing, and everything seems fine, and we're going to use this as the industry standard when we go after other people, because what you're doing covers all the bases and that's great. More likely than not, that's not what they're going to find.

(11:30):
They're going to find some issues. They're going to look at your agreements. What are you telling the public? They're going to look at your technology stack, because they have technologists that work for the government now, and have for a couple of years. Because enforcement does take a while - rulemaking takes a while as well - you're starting to see the fruit from these efforts. So they can look at what you're telling

(11:51):
in the front versus what the backside of your system is actually doing, and if there's a difference between the two, they're of the opinion that is unfair and deceptive. Just flat out. So there's really no need to go into why it looks this way. It is this way. You have an obligation to know it is this way, and if you don't, that's no excuse. That's why I think it's important for people that are

(12:14):
working with data to understand that accountability is the second thing after transparency, and the accountability is the reason why we want transparency, because we want to know who made this decision and who we need to come for when we see something we don't like.

Debra J Farber (12:33):
and then the way that I've seen this play out
and I think it's important tosay up front is that the effect
of these enforcement actionsalmost is that you've got
another set of laws on the books, right, like some people didn't
know to a specificity, that nowwe're going to be hearing like
holdings that are pretty muchfeel like case law, even though

(12:55):
it's not technically a caseadjudicated by a judge.
In the legal system it'senforcers of the FTC.
So it's interesting in its ownright that it almost has a
rulemaking capability with theholdings.
And that's been something thata lot of people have fought
against, like why should the FTCwith their unfair and deceptive
trade practices mandate thatpeople don't necessarily know?

(13:19):
The claim is they don't knowahead of time what might be fair
, unfair or deceptive.
I think that's bullshit.
I mean, I think it's kind ofclear what's unfair and
deceptive if you're lying inyour policies to people about
what is going on right.
And then the other thing isthat we're not just talking
about great, a case was settledand some fines were paid right.
We are talking that any FTCaction involves the company,

(13:46):
once you settle, submitting to20 years of audits.

Heidi Saas (13:50):
Yeah, that's exactly right. They're also creating a deterrent effect, and, to answer your earlier question, they do this on purpose to drive home clarifications that they've made in the business guidance. So if you follow the guidance, that's a prescriptive measure.

Debra J Farber (14:06):
If you follow that, you should be in good shape. And I'm sure we could talk for hours and hours and hours, too. So let's, I guess, stay focused on these cases and start with the first one. The first one to discuss is the FTC v. InMarket, and the holding states that data that is linked to a mobile advertising identifier (MAID) or an individual's home is not

(14:29):
considered de-identified. Tell us more about what organizations should take away from this holding, and why are we talking about de-identification instead of, like, anonymization?

Heidi Saas (14:39):
Yeah well, they said in the agreements our data is
anonymized and aggregated,everything's cool, cool, cool.
And then when they looked at itthey said you know it's not.
It's totally not.
And in ABAS in particular, theywent a little further to
explain, like all the differentagreements and what was allowed
through linkages.
It's basically it's like it'snot me that's re-identifying it

(15:00):
later and violating people'sprivacy.
It's my business partner, liketwo steps down the way, but
you're still using the same IDso that you can link it together
later for the purposes oftargeted advertising.
So this is kind of tell metargeted advertising is not okay
in these circumstances withouttelling me it's not okay in
these circumstances.

(15:21):
The FTC Act is very, very oldand at some point we needed
things like that to tell us thatusing cameras for peeping Tom
purposes was not okay just forsalacious publishing in the
newspapers.
These kinds of cases had tocome forward.
So at this point we're sayingthat in this particular instance
on in-market, the sensitiveinformation is important because

(15:41):
the inferences that you havefrom the information of them
being at a certain place, that'snot okay for you to go and sell
to all your friends fortargeted advertising, because
the risk of harm to a human isgreater than any potential
business case that you have fortrying to keep this under wraps
and do this.
They have obligations for whatde-identified means, and this is

(16:03):
not the first time we've seenthis particular definition from
the FTC, but this is the firsttime that we've seen that extra
nugget at the top here aboutdata linked to a mobile
advertising.
A maid or an individual's homeis not de-identified.
Ctv is the next holy grail ofadvertising.
They're going to try to watchyou watch TV and try to figure

(16:24):
out what advertising means doingthis, and they're doing all of
that using the home IP address.
So I think that kind of a shotacross the bow from the FTC,
just letting ad tech know I seeyou Like identifiers, join keys,
those sorts of different kindsof workaround systems.
It's as if consumers are sayingplease don't do this, and then

(16:46):
the industry says but how aboutif we do it this way?
And it's the same thing.
You're not hearing us right now, and so it has to have an
economic cost to ignoring whatneeds to happen, and so that's
what these cases are about.

Debra J Farber (17:00):
And so the de-identification approach.
Is that something under the IAB, the advertising kind of lobby?
The IAB's consent frameworkJust got blown up.
Yeah, let's talk about some ofthe ramifications of it getting
blown up, so to speak, that theIAB consent framework isn't
considered legit here at FTC butalso in Europe.

(17:22):
Right, we don't have to go intocomplexities of the European
stuff there.
But for instance, we have, likethe alternative identifiers.
We can't use them, if you don'tmind discussing that a little
bit.
And then cookie deprecation andhow that's affected and some of
the technical downstreamimpacts that this holding might
have.

Heidi Saas (17:40):
Sure, if we look at the definition that they gave us
in here in this particulardecision in order, the
definition section is where it'sby lawyers, for lawyers.
That's where it's at.
I love it, but it has aparticular meaning in here.
If we look at this, it'llexplain a little bit more about
why other regulators see thesame problem.
De-identified information meansthat it cannot be linked
directly or indirectly to aparticular consumer or their

(18:03):
device, and so you have to havetechnical safeguards to prevent
re-identification.
Now if you think about ad techand you put all that first-party
data together and send it offfor lookalikes and then send it
over to LiveRamp foridentification and then this and
that that's not preventingre-identification On its face,

(18:24):
that is not what you're doinghere.
It also is no longerfirst-party data when you give
it to somebody else.
That alters that data.
The second part is you have tohave business processes that
specifically prohibitre-identification.
So you have to have it in yourcontracts that say to your third
parties you cannot re-identifythis data.
And then you have to have theprocesses to make sure that they

(18:45):
don't have inadvertent releaseof the information.
You've got to audit every nowand then on these third parties
to make sure that they'refollowing your contractual
clauses here, because if they'renot, the liability runs up
chain.
So deleting the data is hard,de-identifying the data not as
hard.
But the exact technologicalmethod that you use to do

(19:08):
de-identification is going to beunder review by the FTC.
They want to know how did youdo the de-identification?
Who did the de-identification?
Did you test it to make sure itcouldn't be re-identified?
Did you test your third partiesfor the same reason to make
sure they couldn't re-identifythe data?
Not until you've jumped throughall those hoops, can you say

(19:30):
that that data has beende-identified?
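
As a rough illustration of the hoops Heidi describes, here is a minimal sketch - all field names are hypothetical, and real de-identification needs far more rigor than this - of stripping direct identifiers like MAIDs, coarsening home-level location, and failing loudly if anything linkable survives:

    # Hypothetical sketch: not the FTC's test, just an illustration.
    DIRECT_IDENTIFIERS = {"maid", "idfa", "gaid", "email", "home_address", "ip"}

    def deidentify(record: dict, precision: int = 2) -> dict:
        """Drop identifier fields and coarsen lat/lon below house-level precision."""
        cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
        for key in ("lat", "lon"):
            if key in cleaned:
                cleaned[key] = round(cleaned[key], precision)  # ~1 km cell, not a rooftop
        return cleaned

    def assert_deidentified(record: dict) -> None:
        """Guardrail to run before any share or sale: fail if identifiers slipped through."""
        leaked = DIRECT_IDENTIFIERS & set(record)
        if leaked:
            raise ValueError(f"re-identifiable fields present: {leaked}")

Code like this only covers the "technical safeguards" leg; the contractual prohibitions and third-party audits she mentions still apply.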

Debra J Farber (19:33):
Yeah, that's definitely a change in the expectation of companies. I would think that almost all the companies out there are, for the most part, certainly not complying, or don't have processes that start to list all the answers to those questions. And so it definitely seems that privacy posture management is something that companies should be focusing on this year, so that - similar to the EU's accountability principle -

(19:57):
you're able to demonstrate compliance: if you say you're de-identifying stuff, that you actually are. And so these assurances, these attestations almost, or what you're promising in your privacy notice, have to be accurate to what you're doing. So let's move on to the next case, which is the FTC v. X-Mode, now known as Outlogic, where the holding focuses on

(20:19):
affirmative express consent, the definition of a data product assessment, the audit programs that are now required, and the requirement of data retention and deletion. There's a lot to unpack there. How would you explain the importance of the holding?
Heidi Saas (20:36):
I'll start with giving credit where credit's due. Joseph Cox, who was at Vice when he first started looking into X-Mode and is now at 404 Media, has been on this issue with this particular data broker for a while. Byron Tau is also looking into data brokers and was looking into X-Mode. He has a new book out; I think it's called Means of Control. X-Mode was one of those friendly little apps that was called

(20:59):
Drunk Mode or something. It was trying to help people get home safely at night or something, and they ended up not giving notice of what they were doing - even though, if you did give notice to people in one of those long privacy notices for an app like that, people aren't really reading that, I'm pretty sure. So when this case came up, this is something where I believe the FTC wanted to make it known that you just can't track people

(21:24):
down just because you have their location information. Like, they've got to have some sort of privacy in their personal being and where they are and where they happen to be with their devices. So they said, if you are going to collect this type of data, then you have to have affirmative express consent,

(21:44):
right? Affirmative express consent. It requires clear and conspicuous disclosure that cannot be inside your privacy notice, terms of service, or other similar document. It has to be separate. This is a separate form of consent, so people know: I'm tracking your location and telling everybody else about it.
Debra J Farber (22:04):
And that the concept of affirmative express
consent has been around for along time.
A lot of European laws havethat.
That's pretty much theirdefault.
There's other US laws thatrequire it.
So that affirmative expressconsent has always been that it
cannot be embedded in with otherterms.
It has to be for each thingyou're consenting to, a separate
line item that you'reconsenting to and that you have

(22:25):
to actually take an action to.
Like you have to go check a box.
You can't you know it can't be.
Like you know, check this boxif you want to not choose your
data.
Like it has to be the easiestway for someone to actually
reflect that they've taken anaction to indicate that they're
giving.

Heidi Saas (22:45):
I'm glad that you mentioned that part right there.
Oh yeah, absolutely so what theaffirmative express consent is
not is using an interface thathas the effect of subverting or
impairing user autonomy,decision-making or choice.
So those are dark patterns.
Without calling them darkpatterns, you know what that is?
What they did find when theywere looking into these issues
is that they found consentbuttons but they didn't link to

(23:05):
anything.
Or if they didn't find them,they were so heavily buried
inside and they were like barelylit and it was driving traffic
to click okay, and those sortsof things.
So this is the first time wherethe FTC said no, clicking I
accept is not I accept toeverything.
You've got to be clear andconspicuous about this type of

(23:27):
consent.
That's new for ad tech, becausethe location data that's where
you get push notifications.
I see you just pulled into theY parking lot.
Would you like to also go get acoffee next door after those
kinds of things?
Yeah, and it just creeped andcreeped and creeped into our
life so much this is theregulator saying back back with
that, because that was beyondthe expectations that consumers

(23:50):
had when they downloaded yourapp and the definition of the
data product I wanted tohighlight, for people and
engineers will especiallyappreciate this is that it's any
model, algorithm or deriveddata, so that includes the
inferences, manual or automatedpredictions, audience segments

(24:10):
that is in here.
I ripped this language rightout of their decision.
This is new for people and, yes,the trade organizations and ad
tech are freaking out about this, but this is one of those
things where you've seen itcoming everywhere else in the
world.
Like you said, all the otherregulators are coming at them
for the same thing.
This is just the last stand,like you have to accept.

(24:31):
This is where we are now.
This is the world you live inand there are downstream notices
required going three years back.
So people are going to startgetting they've already gotten
these letters, like people gotthese letters from Xmode and
Outlogic that say, hey, ftcpopped us and we had to get rid
of a bunch of data.
So if you bought some audiencesin the last three years, you
should probably get rid of thator whatever.
So that whole otherconversation you might want to

(24:54):
have with council if you've gotthat kind of letter, because,
honestly, how are you even goingto find that data?
That, knowing what I know aboutwhat warehouses look like,
especially in this industry,like how are you even going to
find that data?
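
That findability problem is exactly what lineage metadata is for. A minimal, hypothetical sketch - cataloging derived "data products" with their source broker at ingestion time, so a downstream deletion notice ("we bought audiences from X-Mode") is answerable:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class DataProductRecord:
        product_id: str
        kind: str            # "audience_segment", "inference", "model_output"
        source_broker: str   # who supplied the underlying data
        acquired_on: date

    def affected_by_notice(catalog: list[DataProductRecord],
                           broker: str, since: date) -> list[DataProductRecord]:
        """Everything built on a named broker's data inside the lookback window."""
        return [r for r in catalog
                if r.source_broker == broker and r.acquired_on >= since]

Without records like these kept from the start, the three-year lookback Heidi describes becomes an archaeology project.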

Debra J Farber (25:07):
to get rid of it , that data Like what is the
requirement here?
That if you can't remove itfrom your training set, that you
scrap the model and start over.
I mean, like I don't seecompanies actually reading it
that way, or they're willing toread it that way, but to comply
really would mean that you haveto find a way to be able to

(25:28):
delete it from your systems,from your models.
So I'm sure we're going to seea lot more.

Heidi Saas (25:32):
Machine unlearning is, like, so hard.

Debra J Farber (25:36):
But it is a question as to like to what
extent does deletion need tohappen and where?
But you know, how do we getthat flow of data when you're
sharing with third parties?
How do you control that flow?
Obviously, there's ways to doit and there's completely new
architectures, and it would takeyears and years for, like,
self-sovereign identity or otherdecentralized identity
architectures that can enablethis.
But that's not where we reallyare today, right?

(25:57):
So, like, what is theexpectation of companies that do
hold this data on our behalfand then feeding it into models?

Heidi Saas (26:03):
This is going to be backup policies and checklists. So, a supplier assessments program - those are PIAs - a privacy program, and a sensitive data program were mandated, including assessments, which are audits. That's the cost of forgiveness for this company. Now businesses are looking at it and going, well, we don't have to do all that, because that's their cost of forgiveness for this one company. This company has to do this for 20 years. But also, they didn't

(26:25):
even have a data retention schedule. And what is your deletion mechanism? Well, we don't know. And then that's a problem, because if you told people that you are getting rid of their data after you no longer need it, then you need to be able to demonstrate that you can do that. There needs to be validity and validation. So the buttons for people to give their consent need to be as easy to find as the

(26:47):
ones to withdraw their consent. That's something where they're borrowing from our friends in the EU as well. A lot of these systems work in a multinational environment, and so you've held out for a long time just having the American Wild West version. It's getting closer to the time where you need to afford most people, technologically, more

Debra J Farber (27:07):
GDPR rights. Absolutely, I totally agree. This case - we're definitely going to need privacy engineers to implement technical measures, right? Things that attorneys and analysts and privacy consultants can't do with paper.

Heidi Saas (27:27):
Yeah, they're required to bring in people.
You have to go hire experts, iswhat they said.
You don't know how to figurethis out.
You got to bring in expertsfrom the outside and you have to
be willing to work with themand not claim privilege or
confidentiality and keep themaway from the parts you don't
want them to see.

Debra J Farber (27:39):
Oh, my God, that just in my experience that has
been such hell where you've beenobstructed by the own business
so that you can't do that.
Complete your mandate, becauselegal makes everything
privileged and confidentialAnytime the word privacy comes
up.
now we have lawyers andtechnologists working together
on behalf of the people Now wehave lawyers and technologists
working together on behalf ofthe people Agreed, and so I

(28:02):
think we're going to see,especially as the economy
improves and tech jobs start tobe on the rise again, I think
we're going to start to see moreprivacy engineers getting hired
because of this very risk thatcompanies have right.
Those attorneys that arereading FTC consent decrees and
a lot of them do do that it'smore engineers that aren't as
following this as closely right.
They're going to want me tobring in those experts, and you

(28:23):
could get them different ways.
You could hire them full time,but you could also hire
part-time contractors, becausethat's just another method to be
able to scale your team.
So just consider thepossibilities, but, like, you
definitely are going to need tobring on privacy engineers to
demonstrate this compliance andmy dog makes an appearance.

Heidi Saas (28:40):
Yes, right.
And data scientists MLengineers, data scientists all
of these awesome people theyexist and I hope to be driving
work for a lot of people.
I know these cases are startingto make people nervous.
I hope that doesn't mean morebusiness for enterprise council,
who hasn't really done anythingto get people in a better
position, because I've beenwatching the same things that we
have been watching all thistime.

(29:01):
Yet they haven't really doneanything to change the basic way
that their companies aredealing with consumers.
They're just saying that werespect your privacy, but they
haven't changed the way they doanything to show that they are
respecting people's privacy youmean, like funding the privacy
office and its mandate.

Debra J Farber (29:19):
Yeah right, Consumers, they're starting to
call bull on that, which I'mreally glad to see.
I'm really glad to see it.
We need our privacy counsel.
Look, you and I are bothattorneys.
But I will say it again, I'vesaid it often on this show I
think that the fact that privacycounsel has owned privacy in
organizations since thebeginning has held back the

(29:40):
shifting of left of privacy intoearlier and earlier, into
before products and services aredeveloped, because they didn't
know enough to make the casethat this is your problem too,
or we could address this earlierand instead it's we'll just
hire outside counsel, we'll justbring in an audit team.
It's just a little lessunderstanding of the technical

(30:02):
needs, right, and where even theoperational aspects that need
to get done.
But the good news is that thatis, we're building up the
expertise for privacyengineering right now, and I
think holdings like this aregoing to continue to make
councils, especially enterprisecouncils, start engaging with
actual engineers, designers,data scientists and strategists

(30:24):
like you and me.
Yeah, absolutely oh.
Now this one's a reallyinteresting one.
I think let's turn to the FTCversus Avast, where we're
focusing on independentthird-party audits for every
product or service thatprocesses sensitive data.
This really focuses on browsingdata is sensitive data.
What should we know about theactions holding?
Well, they punked them prettygood.

Heidi Saas (30:46):
Hey, we're here to protect your privacy and
security and we're selling yourinformation to everybody is
actually what they were doing,and the FTC was like we're about
to punk you hard because theydeserved it.
But also this case.
I believe they chose the facefor this case because it shows a
flex on extraterritorialjurisdiction.
This is a UK company, the FTCcan reach you, and so that is

(31:09):
exactly what they did, and Ibelieve that's why they did this
to make this particular pointbut also because it was a slam
dunk.
Basically, that's what theywere doing is that they're
selling something and then doingthe exact opposite for profit.
That's called fraud.
But they didn't get shut down.
They said you can't do thisanymore and we're going to have
to audit you, and this isdefinitely a good example for

(31:33):
other people to know that whatyou say in your representations
to the public need to actuallyfactually be true in your tech
stack as well.

Debra J Farber (31:42):
Otherwise it's a deceptive trade practice which
the FTC, under Section 5,article 5, section 5, is able to
have authority to go after anycompany.

Heidi Saas (31:52):
They also wanted to make a point here about the
browsing information.
I also wanted to make a pointhere about the browsing
information.
That's sensitive information,and they made a very big point
of putting that statement out.
Collecting this and using thisis presumptively prohibited
without consent.
The presumptively prohibitedpractice is the words.

(32:12):
Those are the words that LenaKahn used when she was
discussing the recent cases asshe opened up the PrivacyCon
conference of papers last week.
That was an excellentpresentation of papers, by the
way, but those are the wordsthat she used.
Presumptively prohibitedpractice.
Yeah, you've been doing thisthis way, but you should not

(32:34):
have been, and here's why youshould not have been.
The potential harms and here iswhat you have to do, moving
forward, if you want to do this.
I wanted to pull this out tohighlight what audits mean, what
are the assessments, becauseI've been working on drafting
audit criteria for algorithmicbias and drift and other issues

(32:54):
pursuant to different laws thatwe have here and in the EU for a
couple of years now.
When people talk to me aboutaudits, they go oh, that sounds
dull as dirt, but it's not, andI wanted to bring this out so
that you can have a better ideaof.
What does it look like for youto have to do one of these
audits?
Well, the third party is goingto need to do it for you because
it's independent third partyassessment.
However, you need to know whatthat report says when it comes

(33:17):
back.
Okay, because then you're goingto need to know how to
remediate your system to get inline with compliance, or if you
need to scrap certain things orwhatever your issues are.
But why pay for a report youdon't know how to read?
The audits here requiredocumentation for each product
and service when you decide tocollect, use, share, disclose or
maintain that browsinginformation.

(33:38):
And the documentation they want.
They want names.
They want the name of theperson who decided they want to
collect the browsing information.
They want that person's name.
They want the names of theother people in the group that
made the decision.
If it was a group, they want toknow what your purpose was for
when you decided that.
They want to know what datasegmentation you have in control

(33:59):
to make sure that theinformation collected is only
used for that purpose.
So they've got the purposelimitations built in here, as
well as a data retention limitset.
They did not say how longpreviously in the location data
case, they said five years wastoo long you can read between
the lines on that but a dataretention you got to have at

(34:20):
least one and then tag your datawith that, because you're
required to have the technicalmeans for achieving deletion and
you've got to be able todemonstrate that.
That is where the privacyengineers have longevity for
work right there all day.
We need people to come in andknow how to do this.
That is not coming from yourlegal counsel's office, okay.
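
A minimal, hypothetical sketch of what "tag your data with a retention limit and be able to demonstrate deletion" can look like in practice - the retention periods and category names here are invented for illustration, not taken from any order:

    from datetime import datetime, timedelta, timezone

    # Illustrative periods only; your retention schedule sets the real numbers.
    RETENTION = {"location": timedelta(days=180), "browsing": timedelta(days=30)}

    def tag(record: dict, category: str) -> dict:
        """Stamp each record at write time with its category and delete-by date."""
        record["category"] = category
        record["delete_after"] = datetime.now(timezone.utc) + RETENTION[category]
        return record

    def sweep(store: list[dict]) -> list[dict]:
        """Hard-delete everything past its limit; log the purge count for the audit trail."""
        now = datetime.now(timezone.utc)
        kept = [r for r in store if r["delete_after"] > now]
        print(f"purged {len(store) - len(kept)} expired records at {now.isoformat()}")
        return kept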

Debra J Farber (34:40):
Absolutely.
That's really fascinating.
I think Supercharge expand theneed for privacy engineers and,
as it is, we're not pumping outenough of them quickly enough,
although jobs listed havereceded in the recent year,
given just the tech cycles oflaying people off, but the need
is still there.
So, whether it's a consultant,a contractor or someone you

(35:01):
bring on full time, you're goingto need teams of people working
on this.
I wonder, heidi, this alsosounds like maybe a job for not
just an outside auditing company, but maybe a job for a DPO's
office, the DPO for a.
You know the EU requirement tohave a DPO.
It says that it doesn't have tobe one person.
It could be a group of peoplethat have various backgrounds

(35:23):
that help support auditing thedecisions that have been made
around privacy in anorganization.
I look here and I'm like if theperson who is making the
decisions on the means andprocessing of personal data has
to list their name now and hasto be public, and it's, like you
know, going to make them gowait, do I really want my name
attached to this product orservice that's doing X, y or Z?
I think that that's going to bea good bellwether as to whether

(35:45):
or not you have an icky like.
Is this just an icky productthat makes you feel like you're,
you know?

Heidi Saas (35:50):
Oh, it's going to give rise to pause.

Debra J Farber (35:52):
for sure it gives you pause right Like do
you want your name actually onthis, if you don't?

Heidi Saas (35:57):
If you don't want your name on it, then what are you doing? Because we used to ask people: if you don't want this happening with your data, then don't do it. But in your professional capacity, it's different, because of everybody else's demands on what needs to happen for the project. But if your name has to go on it, then the regulators need to know. Because, remember, these are the assessments that are

(36:17):
done and handed in to the FTC to say: we did our homework, we're totally checking behind the mess we made, we're cleaning it up, and these are the people working on it. So if you're not willing to put your name on what you're doing and tell the government, I'm the one that's over here cleaning this up - then, ouch, maybe you really need to get into another line of work, or the business that you're working with needs to hire other people who are willing to put their name on it,

(36:39):
for whatever reason they have. But that's the accountability thing: they can come back at any moment and say, we looked at your audit reports and we think we smell fish, and so we're going to come in and check some things out, and then we're going to come for your people that said they were in charge of this.

Debra J Farber (36:54):
You think there'll be individual liability
eventually.

Heidi Saas (36:57):
Everybody can have as many legal problems as Elon
Musk with the FTC.
If you keep screwing up over 20years Look at Zuck's problems
you can have this many problemsand then another 20 years and
another 20 years Like, yeah, howmany generations of Zucks are
we going to have under FTCregulations here?
Eventually, one of these thingsare going to stick and we're
going to make some progress onthis, but when it comes to what
the privacy people need to do,is that the arguments that they

(37:20):
have been making to the C-suiteare fear and risk mitigation and
those sorts of arguments.
This needs to be put in termsof economic opportunity.
This is our chance to get aheadof the curve.
Please, let us invest now inmaking the changes to the
infrastructure that we need toso that we operate in a more
human centric way.

(37:42):
Once we have accomplished thatand can demonstrate it to the
public, to the regulators, toeverybody, you're going to be in
a better position thaneverybody else who is just
waiting to see what happens.
For those early adopters, forthose who can see around the
curve, for the ones who have thestomach or the appetite for
making transformative change,now is the time to get in on

(38:04):
doing that, and so, yes, bringin people who can help you with
strategy on a global sense, sothat you don't make a perfect
regulatory system for onecountry but it's inoperable with
everything else and all of yourother lines of business.
Right, there's some nuance tothat, but this, I think, is the
best time for opportunity forpeople to use this guidance that
they're getting from thesecases to say now we have to look

(38:25):
at going about things in adifferent way.
There's no longer anopportunity to put your head in
the sand and say we don't reallyhave anything to guide us on
the litigation or regulatorylandscape, because, yes, you do.

Debra J Farber (38:39):
Yeah, you can't use that complexity as cover
anymore.
And if you could also talkabout, like, the operation of
third-party software here andhow that is going to be affected
.

Heidi Saas (38:48):
Well, the business is now going to be responsible
for everything in your techstack, including the SDKs,
whether you know about them ornot.
You've got to know when yougive the documentation for each
product and service in the audit.
Remember I said that you've gotto give that documentation
naming names.
That includes any third-partysoftware within your product or
service, so it's every productand service plus every SDK that

(39:10):
lies beneath the wrapper in yourservice.
You've got to do some extensiveauditing all the way through
your code to find out who elseis doing what in there and,
honestly, businesses use toomany tools.
This is another opportunity, Ithink, for privacy engineers to
come forward in a trustedposition and say why don't we
reduce risk by reducing theamount of tools that we're using

(39:33):
in the infrastructure and thenjust try to streamline things
this way?
Because we have these toolshere and they have similar
capabilities as these othertools.
But HR likes using this one andmarketing likes using this one.
But if you retrain your peopleand get them all lined up on
using the same suite of tools,you'll have fewer other random
startup tools that were shinyone day and somebody decided to

(39:56):
go ahead and integrate it, andthose are the ones with the SDKs
in them.
Get rid of that stuff.
If you really don't need itaround, you can start by cutting
off access to those sorts ofthings.
If you were looking over thisto review this, just go and
quietly cut off access and ifpeople don't complain, they
never needed it.
That's true.
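
One starting point for that SDK inventory is simply enumerating what's declared. A minimal sketch, assuming a Python project with a requirements.txt - real audits also need lockfiles, mobile SDK manifests, and network egress review:

    from pathlib import Path

    def declared_dependencies(requirements: str = "requirements.txt") -> list[str]:
        """List declared third-party packages so every SDK is at least known."""
        deps = set()
        for line in Path(requirements).read_text().splitlines():
            line = line.strip()
            if line and not line.startswith("#"):
                # Keep only the package name, dropping version pins and extras.
                name = line.split("==")[0].split(">=")[0].split("[")[0].strip()
                deps.add(name)
        return sorted(deps)

    if __name__ == "__main__":
        for dep in declared_dependencies():
            print(dep)  # feed this list into your vendor/SDK review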

Debra J Farber (40:14):
That can be harsh in some environments, but
yeah, that is the age old thingyou know.
Turn it off and see who screamsabout it.

Heidi Saas (40:20):
Yeah, Well, I mean, they have a way to come and tell
you I really need it, and thenthey can argue with you why.
But you and I both know, indealing with the tools that are
in the pipeline, there are justtoo many tools.
There really are too many tools, and now people are starting to
see the problem of having allthe different tools in it,
Because if you're on the hookfor each and every one of those
tools and you don't even knowthey're in there, you should

(40:40):
probably start taking a look.
It's time to do the checkup.
It's time to do the.
You just turned 45, check up,and you've got all kinds of
extra doctors.
You've got to go see now too,because you got to that age or
something.
Right, I am actually 45.

Debra J Farber (40:53):
So it's hilarious that you just picked
that number.
And you're right.

Heidi Saas (40:57):
That's the number where, all of a sudden, all of
these screenings and everythingis Right, yeah, so whatever they
got tools for it, yeah, butthis is the time where
businesses need to go throughthis kind of a process in
dealing with what they havegoing on in their tech stack If
they want to have confidencemoving forward, because-jerk
reactions every time there'sanother state privacy law that

(41:19):
requires their own little flavorof this or that it's really not
going to benefit you in thelong run.
So you need to build for whereyou're going, not where you are,
and I think that you've got tohave privacy engineers that are
able to work with the otherpeople on your cross-functional
team so that they can understandwho do I come to when I have a
question that I think involvesan ethical issue or something
like that, on how I code this.

(41:40):
Do you want me to use thislibrary?
Do you think it will cause aproblem?
Those kinds of questions.
They need to have somebody to goto to ask those questions, but
they also need somewhere wherethey can put their opinions to
say you know, I know I'm justdoing the background work on
this sprint, but I see if we dothis this way, we may increase
the efficiency or this or that,because they see things too, and
sometimes you can go to workand just do your job, or

(42:02):
sometimes you can go to work andamaze people at your job.
I think if you tell them thisis a safe space for you to come
forward.
I know you're new privacyengineers are new.
I know you're new, but youropinions matter and your advice
is valuable.
So speak up and be a part ofmaking this better for everyone,
because you've got your skills,but you also have your life
training and so that giveseverybody a different point of

(42:23):
view.
So I think this really is sucha great opportunity to do so
much with technology now insteadof hide under the bed.

Debra J Farber (42:30):
I agree, you're right, and this gives ammunition
for privacy engineers who wannaaccomplish more and get more
budgets and to get the buy-in oflegal get the buy-in of just
the C-suite.
You're right, it is a hugeopportunity and we could get
away from the technical debt wehave if we tackle some of this
upfront change of strategy to bemore human-centered.

Heidi Saas (42:51):
I did a talk at a banking-heavy industry event
recently and, yeah, tech debt issomething we really had to talk
about.
It's something they'd neverheard of before.
I was like but you've had allthese preemptions for so long.
That's why you have tech debt,and so you know.
I was like you know what?
I'll tell you this When's thelast time you went to the bank
and looked at the screen thatthe teller is looking at?

(43:11):
Have you seen what they'relooking at?
It looks like DOS.
They're hunting and pushing thefunction keys.
That's where they are withtechnology.
They're still using four-digitpin numbers for the ATM.
Come on, these are signs oftech debt in our face everywhere
.
But yeah, there are betterthings that we can do.
We've already known for 20years better things that we can
do, but they haven't had anysort of economic incentive to do

(43:34):
it, because it's cheaper to donothing unless you have to.

Debra J Farber (43:37):
Thank you for that.
I appreciate it.
Let's turn to the next case.
It's the people.
So the people of Californiabring in a case against DoorDash
or a complaint against DoorDash, and this case deals with
CalOPPA, which is a Californiastate law that was passed in
2004.
I was in law school at the time.
The very law that requirescompanies that do business in

(43:59):
California to have a privacynotice, which they call in the
law, privacy policy, just toconfuse everyone, even though
it's a notice on their website.
That link, that bottom we'vehad way before GDPR the bottom
of every website that saysprivacy policy or California
privacy policy 20 years ago.
What we're talking about, yeah,20 years ago.
So what happened in this case?

(44:21):
Why are we talking about CalAPAtoday and what happened with
DoorDash?

Heidi Saas (44:25):
Well, this was another one of those cases where
they wanted to make a point.
The issue that they have withDoorDash, with their marketing
cooperative, is they needed toclear up what's a sale, because
everybody's like, well, if Ihave all these extra friends and
we're just sharing stuff, likenone of it's a sale.
And so that was a giantloophole and the regulators said
, we see you.
And so here you have thisdecision that says that the

(44:50):
involvement within the marketingco-op was done in exchange for
the benefit of advertising topotential new customers.
Not even that they got to dothe advertising, but they had an
opportunity to benefit frompotential new customers and that
participation in the marketingco-op alone was therefore a sale
under the CCPA.

(45:11):
There's no going about theloophole anymore with friend of
friend kind of thing.
Your involvement in this co-opthat's conferred a benefit to
you, whether you use the benefitor not, is that's conferred a
benefit to you.
Whether you use the benefit ornot is irrelevant.
It conferred a benefit.
And so that is what they call asale, so that it means it
follows the CCPA, even if itdidn't Like when you go to sue

(45:33):
somebody, you sue them for everycause of action that you can,
because maybe not all of themmake it.
So they included CalOPA.
That one made it all the way tothe end.
It may have even surprised them, but if they had gotten to a
situation where maybe one of thearguments weren't going to
succeed, they had multiple linesof attack to go with here, and
the CalAPA part was importantbecause 20 years ago they
decided you have to have aprivacy notice with certain

(45:55):
pieces of information in it andin 20 years you still haven't
gotten that satisfied.
That was also making.
The point was just you have noexcuse for this.
This isn't about a new law,this is an old law that you're
not following.
The document said one thing, thetechnology did another and they
didn't do any auditing of theirthird parties.
They said to their thirdparties now don't you use this

(46:15):
data for this and that?
And they said okay, and theydid whatever they want.
But the inferences and theother data had already been
moved on, so far removed.
There was no possibility thatDoorDash could go back to afford
what's called cure under thelaw, meaning you put them back
in a position that they wouldhave been but for the harm that
you've caused them and theycould not cure this because

(46:37):
there's no way they could getdownstream where all the data
had gone like ink into the water.
There's no longer a cure periodin California, so that's no
longer an issue for othercompanies.
But it's important to note thatwhen they first started this
investigation, doordashimmediately said we'll stop.
We'll stop doing everythingthat we're doing over here to
get in compliance with the law.
We hear you, we'll stop doingthat.
But they got them under thiscomplaint here because they

(46:59):
couldn't cure it.
So, try as they might, theycould not locate all the data
once they had released it intothe ether.

Debra J Farber (47:06):
So what I'm hearing is that this is a data
governance problem, that becausegovernance of how the data
flowed through the organizationwas not deployed or wasn't
possible, that you couldn't thengo find where that data was to
delete it.
Is that a good summary?

Heidi Saas (47:22):
So it wasn't just personal data.
It's inferencing on personaldata as well.
So derivative data which?
Becomes personal data but yes,that's right Derivative data
Fascinating, all right.

Debra J Farber (47:33):
So it's really important to underscore here
that you want to make sure that,whatever public notices you
have, you have technology thatbacks up or not necessarily
technology you have a technicalimplementation that you can
point to that backs up that,that statement you're making to
the public.
Otherwise, you could putyourself at risk for enforcement

(47:53):
actions or, in some cases,class action lawsuits, which I
think we'll be talking about inthe next section where we talk
about state reports andenforcement sweeps of Colorado
and Connecticut.
First let's tell us about someclarity around breach
notification timelines inConnecticut.

Heidi Saas (48:12):
Yeah, so in general the notifications need to go out
when legal counsel says this isa breach and you and I both
know a lot of homework andevaluation and determination
goes into saying this is abreach because you don't want it
to be a breach as soon as it'sa breach, then you've got a lot
of obligations.
If it's an incident, you don'treally have the obligations, you

(48:34):
just need to fix the issuein-house.

Debra J Farber (48:37):
And I just want to point out there because I
think this will be helpful topeople who haven't looked at
breach stuff for as much as wehave that there's a difference
between a breach of a system,which may not have a privacy
problem, and what we call aprivacy breach, which is the
trigger for all of thisreporting and set of obligations
that you have.
But it's just purely a securitymatter if there's an incident

(48:59):
that doesn't involve personaldata, that's a different type of
a breach of a system versus abreach of the data is kind of
the differences there.

Heidi Saas (49:08):
Yes. What they're worried about is unauthorized access, whether it be by a hacker or by someone in a different department of the same company who shouldn't be looking through the HR files or whatever. It's unauthorized access where personal information is available for them to exfiltrate, to view, to use in any way

(49:29):
that can cause harm. That's what they're trying to protect with these rules about what are breaches, what are incidents, and when you have notice provisions. Now, in this case, when you do have a breach, and personal information is leaked, and there is potential for harm to the consumers beyond the breaching of their data all over the place, then you need to notify Connecticut as soon as

(49:52):
you've discovered the issue, not once you've taken your time, done the homework, talked to counsel, and 15 emails later we've all finally decided it's a breach and now we need to tell people. The reason they've changed the trigger from determination to discovery is the amount of time that was wasted in between, maybe even weeks at a time, and all of that time

(50:16):
the data is leaked out on the dark web, being bought and sold, and bad things are happening, and people have no way to know about it. And why? Because counsel wanted some extra time to decide whether it really was a breach or not. So, we're sorry, you don't get that extra time. We do know you need to do more investigation here, but you've got to tell us right away, and then you can go and make your

(50:36):
further determinations. But this notice requirement is: as soon as you know there's a problem, you need to tell us. It falls in line with other notice requirements. DFS in New York is 72 hours, and a lot of those businesses, cryptocurrencies and other businesses working in finance, report much, much sooner, as soon as they see something suspicious, because there is a good information-sharing network

(50:59):
there. So as soon as they see something fishy, everybody else sees something fishy, and they shut it down immediately. They don't wait 71 hours and then make the notice to the regulator; that's not what's happening. That would be against the common benefit of having these systems.
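
A minimal sketch of the operational difference between "discovery" and "determination" as the notification trigger. The 72-hour window below is borrowed from the NYDFS example just mentioned; Connecticut's "as soon as discovered" standard has no fixed hour count, so treat the constant as a configurable assumption, not a rule.

```python
# Hypothetical sketch: the notification clock starts at discovery, not at
# the later legal determination. The 72-hour window mirrors the NYDFS
# example above; substitute whatever your regulator actually requires.
from datetime import datetime, timedelta, timezone

NOTIFY_WINDOW = timedelta(hours=72)  # assumption; varies by jurisdiction

def notification_deadline(discovered_at: datetime) -> datetime:
    """The deadline runs from discovery of the incident."""
    return discovered_at + NOTIFY_WINDOW

discovered_at = datetime(2024, 3, 1, 9, 0, tzinfo=timezone.utc)   # ticket opened
determined_at = datetime(2024, 3, 18, 17, 0, tzinfo=timezone.utc) # counsel sign-off

deadline = notification_deadline(discovered_at)
print("Notify regulator by:", deadline.isoformat())
if determined_at > deadline:
    # Waiting for the determination (17 days here) would blow the deadline.
    print("Determination came", determined_at - deadline, "after the deadline.")
```
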

Debra J Farber (51:12):
Right. And what you're describing, it's because they've got these, like FS-ISAC and all these different ISACs, where the industry groups come together to share threat information, threat intel, so it's really set up for security. And then there are even, sometimes, government-private cooperatives coming together to share this information, right, so that we can tackle the problems, or the security

(51:33):
incidents that turn into privacy incidents, which is what we care about today on this call. But I think it's a result of the security mechanism of looking for those threats; you end up being able to layer the privacy piece onto these already robust fusion centers and communication of threats across an industry like financial services, banking specifically. We don't really see that anywhere else.

(51:55):
I think there might be some for government services, but I've not really seen anything as robust as the financial services area. So, it'll be interesting to see over time how lessons are learned from that approach. And then, having all of your breach-related communication documents and everything set up before the breach has ever happened, so that you're prepared when something does, not if, but when something does,

(52:19):
then you can certainly hit the ground running much faster in terms of this obligation of reporting to the state when an incident, one that may or may not be a breach but certainly affects personal data, has been discovered, and not when you've made a determination.

Heidi Saas (52:34):
Give consumers a chance here to do something, like call the credit bureaus and put a freeze on. Give them a chance to do something, instead of you trying to paper and CYA as fast as you can. Right, change passwords even. Yeah, exactly. And you've got insurance for this. It's not news when somebody gets breached, because everybody's getting breached in the news all the time. Yeah, you might have a short impact on share price, but

(52:56):
you'll move on and you'll rebound, depending on how you deal with consumers and your messaging. That's what Melanie was saying about communications: if you're sketchy about it and you don't give a lot of information, or if you give information and it's wrong in the course of trying to do crisis and incident response, that's going to increase your liability.

Debra J Farber (53:15):
Absolutely. Speaking of that, what about companies that believe they're exempt from certain privacy laws altogether?

Heidi Saas (53:22):
Well, best of luck to you, right? It kind of depends on where you are, what you're doing, what kind of data you're collecting, what you're doing with it. I mean, there are a lot of exemptions in there, and businesses do fit into them, but there is also a lot of gymnastics being done to make sure they can fit into them. And all I can say about that is: not every

(53:45):
profitable business model is guaranteed a right to exist.

Debra J Farber (53:51):
Oh, amen to that. I know that in one of the slides from your past conversation with Melanie, you wrote that the CTDPA, the Connecticut Data Privacy Act, received 30 complaints in the first six months, I guess, of its existence, and one third of those complaints involved entities that were exempt.

Heidi Saas (54:07):
I would be willing to bet that the majority of that third didn't fall into the enforcement category here because they were exempt. I'd be willing to bet most of those were exempted by Gramm-Leach.

Debra J Farber (54:19):
Gramm-Leach-Bliley. Oh, okay, so you think financial services?

Heidi Saas (54:22):
I most certainly do. The largest area of consumer complaints is financial services. And so now they're like: oh, now I can complain about my bank, or the shady insurance company, or this banking company or whatever, and I can totally complain about them because of this new law in Connecticut. And the state had to go: oh, you know what, we're sorry. Yeah, that exemption was bought and paid for a long, long time ago.

Debra J Farber (54:48):
Okay, so just a different regulator.

Heidi Saas (54:48):
They're basically saying they're exempt because it's the wrong regulation. Exemptions are not there because the people drafting legislation are nice and they're trying to hook up their friends. No, they're negotiated; they're bought and paid for by the people whose interests will be served there. They have to argue why they need the exemption. And for Gramm-Leach, they say: we don't need the administrative burden, because it will raise the price of access to loans for

(55:12):
people. It will raise the fees we have to charge them. They will have reduced access to credit. All of these horrible, horrible things are going to happen if we have to honor people's rights. You've got to let us keep doing exactly what we're doing, exactly this way, so you have to give us an exemption. We can't have any further administrative burden, and the justification for that is: we've already got all of these

(55:32):
security measures in place. We've got it covered, so we don't need to do this; it would be like putting a hat on a hat. So you're going to cost us money to do something we've already got covered. Don't do it; just give us an exemption. Meanwhile, every other day in the news, you see the same institutions with sloppy InfoSec all over the place. All the time.

(55:53):
I call bull on the Gramm-Leach exemptions, just because that's how I feel about it.

Debra J Farber (55:59):
Well, one of the things I love about you, Heidi, is we always know; nobody's ever left wondering, "I wonder how Heidi feels about something." I think many would say the same thing about me, and I really appreciate that in you as well. We say what we mean, and we mean what we say.

Heidi Saas (56:14):
I've been dealing with these industries for so long. I see these problems, and they have been these problems for so long. I saw when the data broker industry started; I was working on Capitol Hill, and then I went off to law school, and then FACTA was passed, which amended the Fair Credit Reporting Act in 2003. And it put in place the reseller exception that created the data broker industry.

(56:34):
That enabled all of the crazy things that are happening today. And we're just now getting to the point where other people are starting to pay attention. For the last five years or whatever, I've felt like the lady in the tinfoil hat, crying that the storm is coming, but now people understand. It's almost like everything I've done in my life has gotten

(56:54):
to the point where I can now use all of my skills to the fullest ability. I've got an undergraduate degree from a lobbying school in international relations and government affairs. I've got a law degree. I did the privacy certifications. I've done all the work in AI. Now it's at the point where I'm starting to use all of those different skills, privacy, law, technology, ethics, to say: here's the path forward that we need to take.

(57:16):
And it's the first time, in the last five years or whatever, where I feel like people are starting to listen, not because, gosh, that's an interesting idea and it sounds right, because it's always sounded right to them. They're listening because they don't really have an option not to. So I don't know how I feel about that.

Debra J Farber (57:33):
I don't mind some forced Heidi sass on me.

Heidi Saas (57:35):
I kind of enjoy it a lot. So that's where I'm at. I've had it with trying to be nice with people and trying to get them to trust me. I'm over that. You're either going to heed my word or regret that you didn't. Same, same.

Debra J Farber (57:47):
I mean, this is why I don't work in a company anymore and I work outside. I feel like I effect more change by educating the audience, educating the industry, on what is... Yeah, you definitely do. ...than I can in a company that really just wants to get in my way of getting things done, because they just want me in

(58:08):
the position, but they don't really want me to make any change. Right? I'm done with that. I don't care if it comes with a high salary; I can't just sit there and accept that. I want to actually make the change, and then they stand in the way of that. So it's been frustrating, but it's also, even for us, opportunity, opportunity. The winds are changing, and we're right here at the beginning of it, not the precipice, but the beginning of

(58:29):
that change, where you can't just do status quo anymore as a company. You really need to get your personal data processing in order.

Heidi Saas (58:37):
Yeah, like Colorado and their enforcement: they told you what they're looking for, kind of like the French a couple of years ago. Yeah, talk about the sweeps, the enforcement sweeps. Yeah, the sweeps. When Colorado started talking about the sweeps, it reminded me of when the French did the sweeps a couple of years ago, the cookie banners. Sometimes a credible threat of enforcement is enough to get businesses to do the right thing. And what they are looking

(58:58):
for here are opt-out mechanisms, and do they work? You've got to have clear disclosures, especially regarding sensitive data and children's data. And what are you doing in targeted marketing? Because data brokers, analytics, and the identity verification services, those are under the microscope right now, and for obvious reasons. But that's what the regulators are looking at right now in the different states.

(59:19):
Colorado just took the extra step to put it out front and say to you: this is what we're looking at. Yeah, and that brings us up to the blocked merger, which was all about data science. This was not about groceries; this is all about data science.
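
Since "do the opt-out mechanisms actually work?" is exactly what these sweeps test, here is a minimal sketch of honoring a universal opt-out signal on the server side. The `Sec-GPC: 1` header is the real Global Privacy Control signal, which Colorado has recognized as a universal opt-out mechanism; the handler and flag names around it are invented for illustration.

```python
# Hypothetical sketch: honor the Global Privacy Control (GPC) signal.
# Browsers with GPC enabled send the header "Sec-GPC: 1"; function and
# profile-field names here are invented for illustration.

def handle_request(headers: dict, user_profile: dict) -> dict:
    gpc = headers.get("Sec-GPC", "").strip() == "1"
    if gpc:
        # Persist the opt-out; don't just skip ads for this one request.
        user_profile["opted_out_of_sale"] = True
        user_profile["opted_out_of_targeted_ads"] = True
    return user_profile

def may_share_for_targeted_ads(user_profile: dict) -> bool:
    return not user_profile.get("opted_out_of_targeted_ads", False)

profile = {"user_id": "u-123"}
profile = handle_request({"Sec-GPC": "1"}, profile)
assert not may_share_for_targeted_ads(profile)  # the signal is actually honored
print(profile)
```

The assertion at the end is the point of a sweep: not whether a banner exists, but whether the downstream sharing logic actually changes when the signal arrives.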

Debra J Farber (59:33):
Yeah, so this is about the Kroger merger, the attempted merger, with Albertsons. So why are we even discussing a case about M&A and competition law? Help us understand the connection between the use of personal data and antitrust.

Heidi Saas (59:47):
So, think about your browsing history and how personal that information is, what sites you go to and what's in your shopping cart and those sorts of things. Now look at your bank book and figure out where you spend most of your money. Most of it goes to the grocery store, because we have to eat. Either you eat a lot at restaurants, and bless you if you

(01:00:08):
can afford that, but a lot of people spend the majority of their food budgets at the grocery store, and what you buy tells a huge story. They're also monitoring you in the store. You're in there once a week, maybe every other week, something like that, but you're in there pretty often. You have a pretty solid relationship with your

(01:00:28):
supermarket that they don't have somewhere else. This is the last bastion of big box that you have to go to, because you can order your groceries online, but you probably don't want to have to do that every single week. If you're driving past the grocery store and need to grab a few things, you're going to swing in there. And if you have ever tried to buy anything at the grocery store without your little discount card, it costs a lot more.

(01:00:50):
So they decided a while ago that your information was worth something. You get the little discount card, give them your email address or your phone number, and then you get the discounts in the store, but they get to track all the information across all of their grocery stores. So this one may say Kroger, but it's also this, that, and the other grocery store line that's all owned by Kroger.

(01:01:13):
It was the number one company and the number two company in food sales that wanted to merge. Now, in that instance, you've got to ask the FTC for permission, because that's too much market control right there, and the executives themselves admitted that, yes, this would be a huge monopoly if we do this. The reason they're stopping it is because it's

(01:01:35):
anti-competitive. You need more grocery stores to compete with each other, so that you have more options to buy the food, and so the farmers have more places to sell the food. In addition, workers would be harmed by having fewer places to work, and consumers would have fewer choices. So you're looking at harm to the economy as a whole, harm to workers, and harm to consumers, and those are the reasons why

(01:01:58):
this merger was deemed anti-competitive and bad for our overall food economy, and they put a stop to it.

Debra J Farber (01:02:05):
Makes sense. And so why is this important to privacy?

Heidi Saas (01:02:10):
They have all of our data. They know everything about us. If you want them off your trail, and you don't want them to know where you are as a woman, you'd need to buy tampons, diapers, and Depends every week; otherwise, they're going to know. They're also going to know if you've got thinning hair. They're going to know if you buy arthritis cream. They're going to know if you buy dog food or cat food.

(01:02:30):
All of these different things say something about who you are. They're going to know if you're always eating processed food, or if you spend your time in the outer ring buying just raw ingredients, and they're going to put you into marketing segments based on your behavior inside the store, in addition to the food that you buy. So if you look at everything in your pantry and ask, what does

(01:02:51):
this can of beans say about me? It says something to the grocery store. It's just the digital exhaust, so to speak, that you're not thinking so much about, the way you would on the web. That is what they're doing. It's data science, and they've had a very strong data science game for a long time. The Markup did a report on this, 18 months ago I think, and it was mind-blowing how much money they have invested in this

(01:03:15):
business. I believe Kroger's data science company is making an obscene amount of money, maybe even rivaling what they're selling in food. So, yes, they don't just collect this data; they're using this data to sell to everybody else that wants to know who has arthritis and likes to eat beans and has a cat, because they would be perfect for my advertising list for this

(01:03:37):
new cat condo I'm selling, or something, right? That's why they're collecting all this data about you, and this says so much about what we're doing. Your transactions say everything about you as a consumer.
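
To make "what does this can of beans say about me" concrete, here is a deliberately naive sketch of the kind of inference loyalty-card data enables. The rules, segment names, and threshold are all invented for illustration; they are not from The Markup's report or any retailer, and real retail data science is far more elaborate, which is the point.

```python
# Hypothetical sketch: naive marketing segmentation from loyalty-card
# purchase history. Rules and segment names are invented for illustration.
from collections import Counter

SEGMENT_RULES = {
    "arthritis cream": "likely-arthritis",
    "cat food": "cat-owner",
    "diapers": "new-parent",
    "frozen dinner": "processed-food-heavy",
    "fresh produce": "perimeter-shopper",   # the "outer ring" of the store
}

def segments_for(purchase_history: list[str]) -> set[str]:
    """Map repeated purchases to inferred marketing segments."""
    counts = Counter(purchase_history)
    # Require a repeat purchase so a one-off gift doesn't label someone.
    return {seg for item, seg in SEGMENT_RULES.items() if counts[item] >= 2}

history = ["cat food", "arthritis cream", "canned beans",
           "cat food", "arthritis cream", "fresh produce"]
print(segments_for(history))
# {'cat-owner', 'likely-arthritis'} -- derivative data about a person,
# inferred from transactions and sellable to advertisers.
```

Note how this connects back to the earlier point about derivative data: the segments are inferences, yet they are every bit as personal, and as sellable, as the transactions they came from.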

Debra J Farber (01:03:50):
Right. Well, that really sums up, I think, the six cases we were going over today.

Heidi Saas (01:03:56):
That was exhausting.

Debra J Farber (01:03:58):
Absolutely. And I expect a lot more coming, too, right? I mean, the FTC is just warming up.

Heidi Saas (01:04:07):
Well, they have two new commissioners. It's been three Democrats, and there's been some scuttlebutt about that, but that is how the makeup is. There have been two open seats, and they finally got confirmed. So there are two Republican commissioners joining now, and they are getting set up in the office and getting to know everybody. And I'm excited, because privacy is a bipartisan issue, it

(01:04:29):
is, to see, once they get settled in, what projects they find that are already in progress that they want to start working with, or promote, or do something with. It'll be interesting to see once all five commissioners are there at the FTC. It's not that big a spot; it's like the size of a law firm in a big city, like 1,000, 1,200, maybe 1,500 people. It's not that big. They've got personalities and goals and agendas and those sorts of things.

(01:04:49):
I do anticipate seeing more stuff coming from them, but I also know this first big sprint was to make a good statement, and hopefully they'll keep the ball rolling. But right now, I think it's beneficial for the community at large to take a moment, calm down, and then decide: what does this mean for what I'm doing? And then, like we've been saying, take whatever

(01:05:11):
opportunity you see to make improvements going forward, because this is not going to stop. But we need a few more minutes to marinate in what this means now, before we start making any more decisions, right?

Debra J Farber (01:05:23):
Absolutely, absolutely. In fact, are there any resources you recommend to our listeners that will help them keep up to date with future FTC enforcement actions?

Heidi Saas (01:05:33):
You know what? LinkedIn is where I have my collection of smart people, and you know I write. I've gone after people who do research and create content and those sorts of things, so I have trusted sources of information. You guys create my newsfeed. So, start working in the privacy community and sharing with other people who create content, because as these things come up, they'll

(01:05:54):
share their opinions on them and post other things to help give you guidance, to find out what everybody is thinking about these issues. And yeah, there are some law firms and privacy companies that put out newsletters and things like that to let you know: here's what this means. But I think this is one of those things where, if you have a mentor, you can discuss these things with your mentor. But also, this is an opportunity where you have to network with

(01:06:17):
other people and seek mentors in new areas where you maybe have not thought about finding a mentor, because your mentor doesn't have to do exactly what you do to teach you the craft like a Jedi. You can have a mentor that works in an ancillary field; they work in cyber and you work in privacy, and they can be a mentor as well. You can take a new decision from a regulator and discuss what it means to you from your engineering

(01:06:39):
and privacy perspective, and then you can talk to somebody in cybersecurity and say: what does this mean to you? Because we're all working on the same system, right? So I think this is another great opportunity to invest in each other and learn, as we all work through this together. Is there a book to read? No, because by the time you get a book published, it's already out of date and has bad information.

Debra J Farber (01:06:58):
Right. I mean, I've been thinking about writing one myself, and then I'm like, but it's never-ending; it's going to be constantly... This is exactly why I don't want to write a book. I don't want to have to keep rewriting it. I write enough.

Heidi Saas (01:07:07):
I think on LinkedIn I share enough of my information, but I also have trusted sources of information that I like to banter with. When these decisions came down, my Signal was blowing up. I was on fire for days and days, because we were having these hash-it-out conversations with people: well, what does this mean? Did you see that, and what does that mean? And those sorts of behind-the-scenes conversations, or whatever,

(01:07:29):
that's how we figure out: this is what this means to us. And then we can go out together and advise the businesses and say: here's what this needs to mean to you, so that you can do something differently. Or we can document that I was here and told you what to do, and you didn't do it, and that's called prior notice. Right, right, absolutely. That's a problem if people aren't going to bring in a

(01:07:51):
consultant to say: you need to do this, you need to address your issues. Otherwise, you've got a report sitting around waiting to get you in trouble. I've heard enterprise counsel discourage businesses from bringing them in to do audits for algorithmic bias, or pipeline reviews for their data, for this very reason: they don't want to create a prior notice document. I'm thinking that could not be more behind the curve than

(01:08:13):
anything else I've ever heard.
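
For engineers, "prior notice" cuts both ways: a written record of who was advised of a risk and what they decided is exactly what a good audit should produce. A minimal, hypothetical sketch of such a decision log follows; the fields and file format are invented for illustration, not a legal standard.

```python
# Hypothetical sketch: a decision log recording who was advised of a
# privacy risk, what they decided, and when -- the audit trail that
# documents accountability either way. Fields are invented.
import json
from datetime import datetime, timezone

def log_decision(log_path: str, *, risk: str, advised_by: str,
                 decided_by: str, decision: str, rationale: str) -> dict:
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "risk": risk,
        "advised_by": advised_by,
        "decided_by": decided_by,
        "decision": decision,       # e.g., "remediate", "accepted", "deferred"
        "rationale": rationale,
    }
    with open(log_path, "a") as f:   # append-only by convention
        f.write(json.dumps(entry) + "\n")
    return entry

log_decision(
    "privacy_decisions.jsonl",
    risk="Third-party SDK receives precise location without a consent flow",
    advised_by="outside privacy counsel",
    decided_by="VP Engineering",
    decision="deferred",
    rationale="Revisit after Q3; interim risk accepted.",
)
```
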

Debra J Farber (01:08:14):
That's ludicrous, given the problem that's going to face them in the future, right? When their business has been ruled illegal, or maybe disgorgement is going to make them get rid of the model that they trained on, and they'll have to do it.

Heidi Saas (01:08:28):
Precisely. Disgorgement is the new favorite remedy. I love it too. But businesses need to think: if you're getting this kind of advice and you're not really feeling confident about it, ask yourself, how many lawyers do you see going to jail for giving bad business advice? Not the ones that work for Trump, but other than that, how many lawyers do you see going to jail for giving bad business advice?

(01:08:48):
I don't really see many, do you?

Debra J Farber (01:08:49):
I mean, I can't even think of one, and you know they're out there doing it all day. Yeah, so businesses need to think about that.

Heidi Saas (01:08:55):
If that's the advice you're getting and you don't feel like you're being heard, you absolutely should get advice from someone else, and see if you feel comfortable making a decision after you've gotten more than just one opinion on the matter, because you can't know everything in this field. We all have to work together on this. That advice should be suspect to businesses, I think: why should I not go through and find out what all is in my tech

(01:09:18):
stack? Because not knowing is going to be a better position? It's not.

Debra J Farber (01:09:23):
No. There's no other area I can think of that does risk analysis and addresses risk that way.

Heidi Saas (01:09:30):
Just buy more insurance, they said. Yeah, you know what? We've mitigated the risk by buying some more insurance, and we've squeezed this language into the terms of use. It's cool.

Debra J Farber (01:09:39):
Yeah, but insurance companies want you to demonstrate to them, before they give you insurance these days, that you have controls in place for privacy.

Heidi Saas (01:09:45):
I'm psyched to see insurance companies hiring technologists. They've had data scientists for a while, but they're bringing in technologists so that they can look more into what is going on on the other side. They've been working on technology in-house for obvious reasons, but they're starting to look at technology outside and be like: you know what? I looked at what you have in your tech stack, and we're not going to insure you, for these reasons,

(01:10:08):
whether you know about them or not. That shouldn't have to be the insurance company's job, but I'm excited to see some of those smart people.

Debra J Farber (01:10:15):
They're mitigating their risk.

Heidi Saas (01:10:15):
Yeah, I'm starting to see some of those smart people go work for the insurance companies, and I'm not really all that mad at them for it. I get it. Somebody has got to be able to do that. It's encouraging to me to see the new fields that are open to people who have engineering skills. There is privacy; you don't have to just go and build video games, which can be fun, but that doesn't have to be the only future you see for yourself.

(01:10:36):
If you know how to code, there are so many other things you can do, like working in privacy engineering in particular. I feel like you have a role in working in civil rights, because all of these systems were built to process human data without any respect for human dignity. And so, where the culpability of the law meets the code that's causing the harm, we've got those places to make changes now, and we need the

(01:10:58):
privacy engineers to be there to help us make the right changes, so that we don't make the system worse.

Debra J Farber (01:11:04):
Absolutely. What's the best way for folks to reach out to you?

Heidi Saas (01:11:07):
I only have one social media thing, because I just don't have time to do anything else. But yeah, I'm on LinkedIn, and I love meeting with people, and I think that's great. But this is not legal advice. And if you have legal questions, you can still ask me, and if I'm not licensed in your jurisdiction, I may be able to refer you to another attorney who is in your jurisdiction and can help you with your

(01:11:27):
particular issue. I am licensed in New York, Connecticut, Maryland, and any of the states with reciprocity for those three.

Debra J Farber (01:11:35):
Awesome.
So DC is one of the two.

Heidi Saas (01:11:37):
Yeah. Well, I had to take three exams, because none of those were by reciprocity. So I took three bar exams to do this, but it opened up, I think, like 36 different states that I can get into through reciprocity if I need to. But I'm not a litigator, so it's not that important that I be able to run into every courtroom, because sometimes it's legal advice and sometimes it's just consulting advice on privacy and strategy and things like that.

(01:11:59):
So I'm on LinkedIn. That's my corner; that's where I'm going to be at.

Debra J Farber (01:12:03):
Excellent. And so, before we close, do you have any additional insights you'd like to leave the audience with? No, that was a lot. Yeah, it was just a whole...

Heidi Saas (01:12:10):
You know what I mean. We talked a lot about the FTC today. We can talk another time about the CFPB.

Debra J Farber (01:12:15):
That would be great.
No-transcript.

(01:12:46):
There are many regulators for privacy in the United States, especially at the federal level. Yeah, let's revisit that, and maybe unpack some of those in the next episode. Heidi, thank you so much for joining us today on the Shifting Privacy Left podcast.

Heidi Saas (01:12:59):
Thanks for having me. I love what you're doing with this podcast, really. Yeah, I mean, it's fun.

Debra J Farber (01:13:06):
I'm having great conversations like this. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest, or guests. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it

(01:13:29):
with a friend. And if you're an engineer who cares passionately about privacy, check out Privado, the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.