
March 29, 2024 45 mins

Privacy has been a major casualty of the Internet. Twenty-five years ago, Silicon Valley leading light Scott McNealy, Sun's CEO, said we “have zero privacy. Get over it.” That has been the truth ever since, and the basis of profits for AdTech behemoths Google, Meta, and more.

The good news is that, in the quarter century since advertising became the Internet’s business model, few have gotten over it, including powerful national and regional regulators.

In this wide-ranging discussion with Steve and George, Michelle Finneran Dennedy, CEO of PrivacyCode, Inc. and Partner at strategy consulting firm Privatus.Online, speaks to this shift toward restoring online privacy and what her company is doing to streamline the implementation of privacy-enhancing practices.

In this Making Data Better episode, Michelle addresses:

  • The immorality of Data Subject Access Requests
  • Ethics vs. zero trust
  • The impossibility of privacy regulation compliance
  • AdTech’s shifting model
  • The liability tornado about to strike data hoarding enterprises

This is an important and exciting conversation. Take a listen. 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:23):
Welcome to Making Data Better, a podcast about data quality and the impact that it has on how we protect, manage and use the digital data critical to our lives. I'm George Peabody, partner at Lockstep Consulting, and thanks for joining us. With me is Lockstep founder, Steve Wilson. Hi, Steve. G'day, George, good to be with you. Glad you're here. So, Steve, let's get into it.

(00:44):
Today we're addressing what I consider to be the biggest negative effect of the internet. My list has more than one entry on it, but my biggest one is the loss of privacy. We've got big tech companies who have every incentive to know everything about me, because their business model is based entirely on advertising to me, totally targeted advertising. A

(01:08):
corollary of that, or along with it: we've got an entire ecosystem of data brokers who aggregate and resell every bit of my digital dust that they can hoover up. And then there's this enterprise belief that the best thing you can do is store every bit of data that you've ever encountered, to help with your marketing and maybe risk

(01:29):
management. And then, of course, there's the analytics you're going to apply to all that someday, to make all that data useful to you. So we've got this pervasive habit and business model that is all about learning everything you can about people, without regard for the immediate need for the data and without any concrete reason,

(01:50):
often. In other words, as other studies have shown, even with all that data, bad decisions and injustices are made. Okay, Steve, I give up. I realize I'm complaining about the state of the world here. You've worked in digital privacy for years. Where do we start?

Speaker 2 (02:07):
Well, that's interesting. I got into privacy almost by accident, from many years working in identity. I often put that in air quotes. Security and privacy are different. Cyber security starts, I think, with important data and thinking about what makes data important to you, and then you have controls to protect the value of data.

(02:29):
There's a well-known set of dimensions that you look at data through from the security perspective. I think that privacy is sort of the opposite of that. I think that privacy is about restraint. I think that privacy is about what you don't do with data rather than what you do do, and therefore the technology is not as important as policy and governance, and I put policy in

(02:51):
quotes as well. It's not just those big, thick 20-page policies, but policy like: what is your organization's culture? What do you want to do with data? How is it important to you? That old promise by businesses that we respect your privacy: if they really mean that, here's the paradox of privacy. They need to choose not to know things about you.

(03:13):
They need to say, we're going to mind our own business, we don't want to know everything about you. That's respecting privacy. So that's how I come at this.

Speaker 1 (03:23):
Wow. So, yeah, I kind of went in the other direction, didn't I? I was all about collecting every bit of data and whining about that. You're pointing out that it's about minimizing the data that's being captured in the first place. Well, to help us further unravel this privacy knot, it's

(03:43):
our great pleasure to welcome Michelle Finneran Dennedy, who's the CEO of PrivacyCode, Inc. and partner at strategy consulting firm Privatus.Online. Did I get that right, Michelle? Privatus, Privatus. All right, I was going to ask. I was thinking earlier: is this a world where we use the word privacy or privacy?

Speaker 3 (04:03):
This is going even back further. So there is publicus, which is your public persona, or your cultural, ethical rules that you follow, and then there is privatus. So privatus is your private self, or your individualized self.

Speaker 2 (04:18):
Gorgeous.

Speaker 1 (04:19):
Well, you were an attorney before becoming a CEO and a consultant. I want to ask, to begin with, what brought you to privacy? How did you get into this domain?

Speaker 3 (04:30):
Yeah, I think, like so many people, Steve included, we kind of backed into it. So I started out life as an intellectual property lawyer. I was a patent litigator and then entered the world of high tech, which is interesting to me. I was doing a lot of medical devices and pharmaceutical work in patents and then ended up in high tech. And I think, are we

(04:52):
supposed to call it ph scale really, the internet was dawning and you could buy and trade and do business online, and so

(05:31):
somehow, magically, that absolves you of doing what you needed to do. And as a patent litigator coming into this environment, and a businesswoman looking over the Sun Microsystems portfolio, I said, this is really... The paradox for me was that the value is in sharing information.

(05:51):
The value was: we had people like Whit Diffie who was talking about encryption, we had early versions of what we now call Kubernetes, and we were dealing with what we called the grid or the utility back then, which we all now call the cloud: distributed secure compute. And so when I looked at the portfolio and the business needs of this company, as well as the

(06:13):
165 countries in which we were doing business and the various cultures that those imply, everything rang a bell in my head saying: if you want to do business well, you better get privacy right, because you have customers and you have employees, and these are pretty critical characters in your business

(06:37):
drama. And if you're a government entity, you have citizens and voters, and those are pretty important constituents for you. So wherever you find a person, you find a story that needs to be told, a connection that needs to be made, and so you have privacy and identity.

Speaker 2 (06:58):
Love that story to be told. The theme of our podcast is: what is the story behind the data?

Speaker 3 (07:12):
And the story, you know, it's amazing. I mean, data is quite benign until humans get their eyeballs on it, and then it can be very good and it can be very evil, and so I think that's the thing to remember: this is an ingredient. So data really telling the story, I think, is the right thing. And, exactly as you said, George, it's like the old-fashioned way, for the old people who don't like making money: they stockpile as much data as they can get their hands on, they

(07:34):
scrape the internet and they steal all of our data and they put it into generative AI models. That seems really attractive in this day and time, doesn't it? It seems like you'd give somebody a trillion dollars to do more of that. The reality is it's already setting a time bomb off. You can dig in a pile of pony dung as much as you want.

(07:55):
You ain't going to find a pony in there. But to find something that's fresh, something that's new, something that's targeted, something that's individualized, that has value, that's current. So the old-fashioned way of 'just because I can, I do, because mainframes are cool', that's changing, and it's changing very,

(08:18):
very rapidly. The new value model is coming into focus.

Speaker 2 (08:22):
Let me react to that really quickly. And we didn't prep for this, so I apologize for this story out of the blue. But you'll remember, 15 years ago, the Google Street View catastrophe, where the Street View cameras had a good job to do. They're photographing the streets of the world. But some bright engineer realized that the equipment on the back of the Google car had a Wi-Fi transponder and they

(08:43):
could also pick up Wi-Fi transmissions that were, quote, in the public domain. And this chap innovated by collecting that data and just started mining it to see what was there. It was a process of discovery. It was a process of literally data mining. You know, there's a vein of data, let's mine it, let's see what's there. That was the Google culture. They reward innovation.

(09:04):
They make a stupendous amount of money out of using data in innovative, surprising ways. And when they all got sprung and they said, you were collecting that data without intent, you don't need it, you didn't tell people you were doing it, it doesn't matter that it was in the public Wi-Fi mesh, it was personal data, you had no right to just grab it, they were

(09:27):
shocked, and they sort of gave the guy an attitude transplant and said we won't do that again. But I thought it spoke to the culture of an innovative company, a purely digital company that feels as though it's got the rights. Like frontierspeople, you know, it's like the old oil rush: we know this valuable treasure is under the ground, or under the digital ground, let's go and find out what's there and dig it up and use it.

(09:48):
And I thought that is anathema to privacy. And yet it's so counterintuitive that you should not do that.

Speaker 3 (09:57):
Yeah, I think that's right. And I think that's where it's getting really interesting, because we now have more subtle controls than we had back in those kind of early oil days of 'get whatever data you can', because we didn't have fat pipes to every home, we didn't have the stability in the Wi-Fi connections that we have today,

(10:19):
we certainly didn't have the fidelity of the identity tools to layer whose persona is whose, where. And so there was sort of this feeling of 'just because I can, I will'. And the reality is, I think the more innovative view, ironically, is quite old-fashioned. That's a weird sentence to say: that innovation always starts,

(10:42):
for me, with ethics. What do people want? The things that people become really, really addicted to and they really, really like are things that bring out our human passions: hunger, thirst, lust, greed. We love being on social media as much as we hate being on social media. Why? Because we crave connection with people.

(11:04):
It's a very basic human thing. So, really looking at: what do we really want, what are we serving, how much does it cost, what are people wanting to do with that information, and what if that information is false? Starting to have these questions again, because we have the fidelity today that we did not have 20 years ago, I

(11:26):
think, opens up a whole new can of really valuable innovation. I've stunned George into silence.

Speaker 1 (11:35):
Yeah, you have. I'm trying to take what you said and trying to search for the evidence that there are tech companies who actually get what you're saying, and agree with it, and are casting themselves as stewards. I'm still looking at the paradigm, the business paradigm of the net, which is basically advertising.

Speaker 3 (11:57):
Still ad tech? Yeah, yeah, and I think that's right. I think, what exists today, you still see remnants of that old-fashioned sort of ad tech world. Right, it's like remnants.

Speaker 1 (12:08):
That's Google's. That's virtually 90% of Google's revenues, if not more.

Speaker 3 (12:14):
Yeah, but it's changing. I mean, look at just, the Europeans are saying, oh, we don't like your Google cookies and things, and so that changed radically and overnight. And how much money are they spending on trying to shift their models? How much money are they now investing in trying to find various ways of obfuscation and various privacy-enhancing

(12:36):
technologies, to either do anonymization or figure out how to do radical personalization? I think even those big, big companies are investing in what's next.

Speaker 1 (12:46):
You put your finger on what I suspect is the major mover of the market these days, which is regulation. Is that right?

Speaker 3 (12:55):
It always has been. So this nonsense, this nonsense that, oh, regulation is going to stop innovation: how was Silicon Valley formed? Silicon Valley was formed when we broke up the telephone monopolies, and there was all this room for building. There was all this sort of regulation. After deregulation, they re-regulate other things, and

(13:17):
people find innovative models to go around things, sometimes to find a loophole, but also sometimes to truly innovate. Regulation hasn't stopped the pace of innovation in human history yet. So, yes, the lobbyists will continue to lobby, and they're adorable and we love them. However, the facts are the facts.

(13:40):
The cash is the cash. I don't see a slowdown, even though we have actually known rules. I would prefer to have an actual federal law in the United States rather than state-by-state patchworks. I think it protects more people. I think it gives us a way to show that we're driving on the right side of the digital highway, and we can signal to

(14:03):
other drivers that we're on the move. This patchwork of little power fiefdoms, I think, is destructive.

Speaker 1 (14:12):
Absolutely. Your point makes me think how glad I am, despite its occasional misses, that the FAA exists and does such a good job of making air travel as safe as it is. It's the safest way to move a person in the world.

Speaker 3 (14:26):
Yeah, well, you put your finger on it. I've long said it. This was one of my first conversations with Steve. They had these great drawing people at this conference we were at, and we were talking about: what did we do when piracy was the biggest threat to commerce, when we were trading nutmeg and tea? Well, we had admiralty law. And so when a ship was in port, it

(14:48):
was one thing; when a ship was in the shipping lanes, it was another thing. We followed each other, we had rules, we understood what it meant to help a distressed fellow sailor, and a lot of this was to prevent pirates, to preserve goods and services, to cling together with the technology that we had, to do commerce and literally find our way in an emerging planet,

(15:12):
despite which country of origin that vessel emanated from and where they were heading. I think that's where we really... Despite a world that's starting to be more and more kind of nationalized again, for this type of commerce, for privacy to thrive, for identity to thrive, we need to think more and more about standards than ever before.

Speaker 2 (15:33):
I love that. I love talking to a real lawyer. There are so many of what we call in Australia bush lawyers. I love talking to a real lawyer, and your background, Michelle, I think is deep. It seems to me, as a non-lawyer, that we're probably on the cusp of new jurisprudence, just as shipping created that, and just as the oil rush in the

(15:54):
1850s, 1870s created new property law, regulated a whole lot of intangibles. I talk to technologists and they say you can't regulate data, it's ones and zeros, it's intangible. Well, you know what we do in a civilized world? We regulate telecommunication bandwidth. That's not even a thing, it's not even physics.

(16:16):
Bandwidth is so intangible, and we regulate the hell out of that. So I think that we're on a cusp. I see data protection, which is sort of synonymous with privacy (GDPR stands for data protection), I see a bigger form of data protection, where we look at what makes data valuable from every dimension: the human dimension,

(16:36):
the social dimension, the business dimension; titrate out the properties of data that make it valuable and attend to maximizing that value. And knowing that it's a multi-discipline sort of thing, I can see... I'm so optimistic that we're actually going to see some good things in the next 10 or 20 years.

Speaker 3 (16:52):
I think we will too, and some of it, again, goes back to those human needs. Greed is a big motivator, and we saw that when Apple decided to flex its muscles on what it's calling privacy. You saw that it was able to basically punch Facebook in the nose, hard enough that they did this weird head fake with Web3.

(17:13):
Like, oh, we're doing Linden Lab again. I was like, you guys are so cute. That clearly was a head fake in my mind. It was like, right after the market fell over on its side, suddenly we're doing this weird thing that didn't work, doesn't have a business case. That's cute. Why? Because suddenly we've got this immense power with this, as

(17:34):
Steve says, intangible right. And if you go back and you read it, it's actually a beautiful piece of prose, the Warren and Brandeis piece from 1892, I always get that year wrong, from the Harvard Law Review, called The Right to Privacy. What was it about? It was actually about the dawn of new technologies. It was about Kodak-ing, or taking portable photography.

(17:59):
Although the portability was a giant heavy box, you might see a woman's ankle. Oh my, think about it. It's like morality and ethics. And what they discuss in that article is really the evolution of, you know, when Fred got out of the cave and hit Ted over the head with a hammer, that they

(18:20):
understood was something that they should have rules about. And then, when you just threatened him with the club, it was a very long time before that was actually recognized as a criminal and a civil harm. The fear of that bat hitting you was an ephemeral right that we agreed on as a society, our ethics.

(18:42):
Before the law happens, we have ethics. We decided that should happen. So privacy, and this is why we fight so voraciously, I think, against this nonsense and nihilism of 'privacy's dead because we have toys'. No, privacy's alive because we have people, and people who have dignity and stories, and we have ethics.

(19:04):
Cultures decide how they want to interact as groups with one another. So, as long as we have people and culture, we will have privacy.

Speaker 1 (19:14):
Let's take it right down to you: what's your work? Tell us about what you're CEO of, yeah, what the company is doing, because I'm hearing this and I go: this is hard.

Speaker 3 (19:26):
Oh no, this is fun. Yes, it's hard because it's valuable, so you should pay us lots of money, but it's fun. So, basically, two lines of business. So PrivacyCode is an AI-driven platform. I would have said NLP-driven a year and a half ago, but now we'll say AI. We use natural language processing and we actually take

(19:48):
written documents that are usually written in legalese, and they'll have either your RFPs or your notices or your policies or your privacy impact assessments, and we'll scan them in and we read them against a requirements library that has three subject matters: responsible AI or new technology (it will be the same set of criteria for quantum); privacy, of

(20:12):
course, or data protection; and data quality and data governance. So, three different sorts of libraries, and they're captured in two types of tasks: processes and technologies. So if you make a promise and you say, we promise to respect your privacy, we list our vendors,

(20:32):
well, you're going to find a whole category of requirements from places like NIST and GDPR and the Australian laws that say what to do with vendors. And so we've already broken them down, task by task: what could be a policy, what could be a technology control, and then we put them into projects. So that's one line of business: helping to do that

(20:53):
translation between and amongst technology and word people, so that you can get your work done and know what you need to get done. And really, no one can be expected to be as geeky as Steve and I are. If you're a UX person, I'm going to tell you exactly what I need in your UX so that we can help our customers engage.

(21:16):
Whether we're going to call it opting in or opting out, I don't care. It's going to be some sort of informed consent mechanism, and I'm going to tell you what to present. Now, you are not going to be the right person, probably, to store the proof of that consent. That may be someone who's doing my logging. That may be someone who's doing my security criteria or my

(21:37):
consent management. That person is going to get a task that says: every time someone pushes this beautiful button designed by my gorgeous UX, the result is going to be this kind of time and date stamp, et cetera. So how do we do these times and dates? So that experts are doing expert things, and then the person who's doing the data governance on top looks across

(21:58):
the environment and says: how are we doing? How are we doing on... You know, when we do things that are truly informed consent, we're actually hitting 19, 20, maybe even 60 different countries' requirements at once. That's wonderful. Look at that. You can actually celebrate that stuff and you can actually start to assign weights and values. So, Privatus, what do we do?

(22:19):
We do strategic consulting. So we look at your business problem and we think about: how do you connect that to your data issues? How do you connect that to dollars? How do you connect that to actually getting this implemented, so that your culture is sustaining what you say that you're doing? It's so much more than writing the right policy.

(22:41):
That's like writing the right pitch deck to get cash. There's no magic pitch deck. It's a combination of a lot of different things.
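
To make the document-scanning step Michelle describes a little more concrete, here is a minimal, hypothetical sketch in Python. The matching below is naive keyword overlap standing in for the natural language processing the platform actually uses; the library entries, category names, and sample notice are invented for illustration and are not PrivacyCode's real requirements library.

# Hypothetical sketch: reading a policy document against a requirements
# library. Keyword overlap is a toy stand-in for NLP; categories and
# keywords are invented for illustration.
import re

REQUIREMENTS_LIBRARY = {
    "vendor management": {"vendor", "vendors", "processor", "third party"},
    "informed consent":  {"consent", "opt-in", "opt-out"},
    "data retention":    {"retain", "retention", "delete", "erasure"},
}

def scan(document: str) -> dict:
    """Map each sentence of a policy to the requirement categories it touches."""
    findings = {}
    for sentence in re.split(r"(?<=[.!?])\s+", document):
        words = set(w.lower().strip(".,") for w in sentence.split())
        hits = [cat for cat, keywords in REQUIREMENTS_LIBRARY.items()
                if words & keywords]
        if hits:
            findings[sentence] = hits
    return findings

if __name__ == "__main__":
    notice = ("We respect your privacy. We list our vendors annually. "
              "You may withdraw consent at any time.")
    for sentence, categories in scan(notice).items():
        print(f"{sentence!r} -> {categories}")

The shape is the point: each plain-language promise is traced to the requirement categories it implicates, which is what lets the promises be broken down into tasks in the next step.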

Speaker 1 (22:48):
Do you create a dashboard of progress around each one of those three areas?

Speaker 3 (22:53):
It's a progress board. So it looks like a task board. It's tied in with our wonderful Atlassian, our favorite Australian... well, Canva is getting to be a pretty big deal too, but one of our favorite Australian phenoms. So it's already tied into Jira. So if your technical teams are already using Jira, like the rest of the world, all they do is receive a ticket in

(23:15):
their normal platform, and everyone else can either go onto the code and look at the Kanban or the project progression, and you can discuss right there, and everything is tracked, so that you can show your auditors at the end what I call a P-BOM, or a privacy bill of materials, and you can say: here's what it

(23:35):
looks like from end to end. And if they need to look at it further, they can say, aha, well... and everything's tied back to the standards too. So everything's organized in the same kind of chapter and verse, and there are cited paragraphs, if you want to geek out on it, of, like, here's where it is in GDPR, here's where it is in the Secure Controls Framework, et cetera.

(23:58):
And then, if you really have to figure out, like, does George still work here? He was in charge of the UX last year. You can click on that and say, oh yeah, he assigned this now to Steve, and Steve has now taken over. So you can audit your stack, you can mature your stack, you can just know where you're at with data rather than just words.

(24:18):
We're actually assigning deeds, matching them to standards, and now we're speaking a common language, so that we're not just sort of slipping down this well of: oh my gosh, it's overwhelming, privacy is too hard for me.
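
For readers who want to picture the requirements-to-tasks translation and the P-BOM roll-up Michelle describes, here is a minimal, hypothetical sketch. The class names, clause citations, roles, and report layout are invented for illustration; they are not PrivacyCode's actual data model, API, or P-BOM format.

# Hypothetical sketch of the requirements-to-tasks mapping described above.
# Fields and the P-BOM layout are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class Requirement:
    source: str          # e.g. "GDPR Art. 28" (assumed example citation)
    promise: str         # the plain-language commitment it supports

@dataclass
class Task:
    requirement: Requirement
    kind: str            # "process" or "technology"
    owner_role: str      # e.g. "UX", "Logging", "Consent management"
    description: str
    status: str = "open" # "open" | "in progress" | "done"

def pbom(tasks: List[Task]) -> str:
    """Render a toy 'privacy bill of materials': every task, traced back to
    the clause it satisfies, with its owner role and current status."""
    lines = ["P-BOM"]
    for t in tasks:
        lines.append(f"- [{t.status:>11}] {t.kind:<10} {t.owner_role:<20} "
                     f"{t.description} (per {t.requirement.source})")
    return "\n".join(lines)

if __name__ == "__main__":
    vendor_req = Requirement("GDPR Art. 28", "We list and manage our vendors")
    consent_req = Requirement("GDPR Art. 7", "We obtain informed consent")
    tasks = [
        Task(vendor_req, "process", "Procurement", "Maintain the vendor register"),
        Task(consent_req, "technology", "UX", "Present the consent prompt"),
        Task(consent_req, "technology", "Logging",
             "Time/date-stamp each consent event", status="in progress"),
    ]
    print(pbom(tasks))

The traceability is the design point: each task carries its owner and the clause it satisfies, so the roll-up reads as auditable evidence rather than as a policy document.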

Speaker 2 (24:32):
And you're solving that wicked problem of privacy policy being like a dual-use artifact. Yep, but it's dominated by legalese, and it's become a defensive mechanism, or even it's become a bit of trickery, where businesses have got a covert mission and they get their lawyers to write something that supports the covert mission without saying what it is.

(24:53):
It occurs to me that it's dual use. Like, a privacy policy, if it's really a statement of what the company is going to do with data: why do I want your data? What am I going to do with it? What do I owe you? I like to see that as an embryonic document that lives with the project. You start by articulating what, how, when, why and where, and

(25:18):
then it turns into guidance for designers, and it also turns into meeting your regulatory needs. Now, I don't underestimate the complexity of writing a legally compliant privacy document, and nobody should. You should get legal advice, as Michelle would do. But your process, using NLP, winds up being able to extract the dimensions of that important document that matter to different stakeholders, which I think is very important to you, everything

(25:40):
that you should like.

Speaker 3 (25:41):
Years ago, I did a graphic novel privacy policy when I was at McAfee Intel, and I realized we were doing a B2C product. We're doing a lot of family protection, we're doing a lot of mobile phone stuff, and security is hard and it has its own special language, and everyone has their standards, and, you know, we're using the

(26:02):
DES this and the that. We do that to sound smart to ourselves, and that's adorable. But when we're trying to communicate and be fiduciaries, it's a lot better when you can talk to them, and nothing translates faster than pictures. So we did these pictures of these nerdy

(26:23):
little ninjas, and they weren't stylized; they had one ab, like the nice dome, dad bod, and they demonstrated: this is what a cookie is, and we had a thing that said what cookies are, and this is what we do when we say we're keeping some of your data for research, this is what this means. And so we explained in pictures as much as in words, and it went over really

(26:47):
well, and I was really proud of that work. It's not easy to do, because you have to be very thoughtful that you're not over-promising or under-explaining things, and we did append the actual words to the policy. But what I learned from that exercise is: if you're going to demonstrate visually what you're doing, you have to know what it

(27:10):
is that you're doing. And I think that's the trick of this: once you have invested and you've done a really thorough job of understanding what your data assets are, A, you deserve to brag about them, because they are a commercial differentiator. And B, you should make it easy, because you should be proud of that, and you should not expect... No company is compliant.

(27:33):
So let's just start right there, right now, forevermore. There is no such place as compliance-ville. That's like thin, young, perfect, brilliant, smart, rich. She's lying, whoever that is.

(27:57):
She's lying to you. So compliance is something that you're going to try to go for in this complex area, but some of these things actually conflict with each other. So understanding someone's point of view, on what they think is readiness or compliance from their point of view, is really important, so that you can interact with them with the right due care.

Speaker 2 (28:18):
I love how you don't sugarcoat privacy, Michelle. There's a couple of things that happen. We've got, like, the orthodox privacy by design manifesto that talks about a positive-sum game, as if there's just one dimension and everything's going to be good. And it's not. I mean, privacy involves tensions, and I think the act of restraint, the act of promising I'm not going to know things

(28:39):
about you when I do business with you, that comes with a cost, and there's a tension there, and I think that that's good. The other thing, of course, is that there's privacy perfectionism, because we think that there's compliance. Great, and we give it to the lawyers, and once they issue some sort of decree that you are compliant, you think: job done, privacy tick. But it's not ever perfect. The funny thing is that we know that security is not perfect.

(29:05):
We know that it would come with infinite cost to have perfect security, so you don't even think about it anymore. And I wish that we thought about privacy similarly. And I don't mean look for compromises, by any means, but I do mean set an expectation that privacy is a factor that can be optimized but never reach perfect.

Speaker 3 (29:23):
Yeah, I mean, I think that's going to be really important now that we're going to be sharing more and more training sets. So when I go to the supermarket in person, people can see me. I'm not invisible. So the fact that I'm there, and the fact that I've gone down this aisle or that aisle, if you want to call that compromise, that's fine. If you want to just call it being interoperable with society,

(29:46):
you can call it that too. But it's a different situation than if, say, I had a doctor's appointment and they made house calls. I don't know if anywhere in the world they still do that, but imagine if they did, and imagine the intimacy of that interaction, very different than the supermarket example. So being able to communicate, am I a supermarket or am I a

(30:12):
psychiatrist? Do my data sets reflect that? It's going to be really critical, because the next frontier is this generation of generative AI. It is not going away, nor should it. I think there are leaps and bounds of things that we'll be able to do, and we'll get through sort of some grunt work that we

(30:34):
didn't want to do in the first place. But it's going to be more critical than ever that we focus on quality, and we're going to have to be critical about the kinds of answers we're getting back, because it's hard enough now. If you have anything, or if you're a parent, and God help you, you Google some malady: you're either totally fine or you have

(30:56):
cancer. Those are the two answers that are always true. It's like sex, drugs, rock and roll: you're always dying on Google. And yet, will it be true? But if you doubt everything, then you actually won't get the benefits of some of this stuff. So if I'm well curating that grocery data, it may be that for a

(31:20):
community you can really have the kind of levels of freshness and wholesomeness, and maybe even healthcare woven in, that are very helpful and delightful things, but you'll also be able to understand when things are starting to go wrong. Data has a shelf life that's very short.

Speaker 2 (31:39):
Mulling over, mulling over.

Speaker 3 (31:41):
We're not going by your script at all, are we, George?

Speaker 1 (31:44):
We're not, and it's perfect that we aren't. And sometimes people call me Grumpy George. And your comments about generative AI: I've spoken with AI professionals themselves who are actually concerned with regulation and being able to prove the results and point to why the AI made a particular

(32:05):
decision, and most of them throw up their hands and say they can't do it. Isn't that cool? That's the magic of it. Well, it's a black box, and then we've trained this black box. I'm sorry, but so much of it, as you sort of alluded to earlier, Michelle: we train these systems so much on the bullshit that's endemic on the

(32:25):
net. I'm grumpy because I think human beings are often just lazy and don't have the ability to be critical about the results that come back. So it's really the responsibility of regulators and the producers and providers of data to do a really good job on this, because citizens don't have the tools or the time.

Speaker 3 (32:45):
Yep, I think that's true, and I think the critical skills that we're going to have to... Hopefully we can use some of this to do a better job educating our people as a whole. Definitely, here in the US we're having all sorts of issues with education and deciding what should be educated. I think we have an opportunity here.

(33:06):
We have an opportunity with this brand new kind of whiz-bang thing to help communication by writing better, and we have beautiful examples of that. Grammarly is a beautiful example of, you know, a writing-assist tool and an engine. But we also will have the opportunity to force citations and to look for, you know, should this issue have more varied

(33:31):
citations, or are you going to the same source for the same thing? So there is an opportunity here. I think it's a little bit Pollyanna-ish to assume it's going to be baked in. I think you should assume it's not baked in unless it's worth a lot of money, but I also think that it shows the value of the data sets themselves.

(33:52):
So right now we're trained on, what is it, Reddit and cat videos for most of the results on these things. I think one of the biggest issues is bias. So there's a couple of issues in bias. A lot of this stuff is trained on very biased data, and even

(34:14):
people who have access to this podcast or anything online, that's not the world. Only about 60% of the human beings on the planet right now even have access to clean water and the internet. So we're not representing the globe. And the other thing, you know, so the sample is biased, and the other thing is the answer, in funny quotes, might be the right answer, but the right answer

(34:35):
might be that you should pay women less than you pay men. So the right answer that exists in society today might suck. So it might be that even though you're getting a correct answer back, you might be getting a morally corrupt answer back.

(34:56):
I've been asked with noimagination.

Speaker 2 (35:00):
Child labor was a thing when the coal mines were small and you needed small workers to fit through. Exactly. And if it's just about money, and that's how we're rewarding our society, just cash, then it is, you know, back to the beginning.

Speaker 3 (35:17):
It's the big oil rush times and you can bonk anybody you want over the head. I think that's not a very sustainable society, and so that's where regulation and, more importantly in this era, because it goes so much faster, ethics really comes in. People think that they can't know ethics because it's sort of like an opinion, but it turns out this is a 3,000-year-old

(35:40):
discipline of really interrogating: what are the basic structures of ethical interrogation, and what's our point of view? And then again, being transparent: am I looking at a collective ethic, where really it is... The ancestral societies, the oldest, you know, continuous societies on the planet go for collective ethics. Whether we

(36:02):
like it or not, as individualistic Westerners, it's true. Or are we looking at, you know, like zero trust? Security depends on trusting no one. That's a very individual, single-person narrative. So if I'm building on a network like that, I need to

(36:22):
understand it's going to go to the mean, and I'm probably not going to find my outliers. My Picassos don't live there, but the guys who can fold jeans at the mall probably are. So, depending on what you want the outcome of your tool to be, it's really important to understand what their ethics are,

(36:45):
as well as what their regulatory kind of compliance posture is, if you will.

Speaker 1 (36:54):
And Michelle, is your PrivacyCode platform expressing ethics through its regulatory understandings?

Speaker 3 (36:58):
To the extent that... So we have the model cards that Timnit Gebru and some others have come up with to do ethical innovation. So if you're using that process, it basically will suggest that to you: that you should have a process that says how to do critical design, how to recognize bias and what to do.

(37:20):
And this is from our Privatus framework. I have to look at it because it's a long acronym, but it's called BEARS TEACH. So this is the acronym: B is bias and fairness. Ethical compliance and values alignment. Accountability and responsibility. Robustness and reliability. Security and

(37:42):
privacy, we did put them together for you, George. Transparency and explainability. Environmental impact and sustainability, which is a big deal, these are heavy models. Autonomy and control. We cheated here: it's scalability, so we'll use the C for TEACH, scalability and performance. And then, finally, human impact and safety.

(38:02):
So BEARS TEACH is the acronym, and this is the ethical framework, as well as the functional framework, for our risk assessment profile.

Speaker 2 (38:11):
Let's pick that up and pop it in our show notes too, okay, as we edit this production. That'd be great. That'd be helpful.

Speaker 3 (38:17):
Yeah. So we do a scorecard and kind of a risk heat map of where you are on any of these things. Because you might have something so critical, because you've got cellular research and so you are using protein fold management, maybe you are using quantum compute: heavy, heavy load on the environmental factor, maybe high load on the societal

(38:38):
impact factor. So this might be something where you're saying, okay, let's go for that, because the benefits are here and we know about the tradeoffs here, and it fits this sort of model. If you're using quantum computing for advertising models to sell middle-aged people face cream, whatever.
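
As a rough illustration of the kind of scorecard and heat map Michelle describes, the sketch below scores a project against the ten BEARS TEACH dimensions named above. The dimensions come from the conversation; the one-to-five scale, the banding, and the example scores are invented for illustration and are not the Privatus framework's actual method.

# Hypothetical sketch of a BEARS TEACH style risk heat map.
# Dimensions are from the conversation; scoring scale, bands, and the
# example project are invented for illustration.

BEARS_TEACH = [
    "Bias and fairness",
    "Ethical compliance and values alignment",
    "Accountability and responsibility",
    "Robustness and reliability",
    "Security and privacy",
    "Transparency and explainability",
    "Environmental impact and sustainability",
    "Autonomy and control",
    "Scalability and performance",
    "Human impact and safety",
]

def heat_map(scores: dict) -> None:
    """Print a toy scorecard: risk scored 1 (low) to 5 (high) per dimension."""
    for dim in BEARS_TEACH:
        score = scores.get(dim, 0)
        band = "HIGH" if score >= 4 else "medium" if score >= 3 else "low"
        print(f"{dim:<42} {'#' * score:<5} {band}")

if __name__ == "__main__":
    # e.g. a quantum-compute protein-folding project: heavy environmental load,
    # high potential human impact, modest bias exposure.
    heat_map({
        "Environmental impact and sustainability": 5,
        "Human impact and safety": 4,
        "Bias and fairness": 2,
        "Security and privacy": 3,
    })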

Speaker 2 (39:00):
Yeah, I think some of this stuff talks about the dynamism of privacy. Privacy is normally framed around: what do I know about you? Personal information, is it identifiable, what do I have? The fascinating thing to me about AI, and even good old big data, is that there's stuff that I might have and I don't know yet, because the algorithm hasn't run, or I don't have the

(39:21):
algorithm yet that does the protein folding to predict the probability of diabetes. But the funny conversation that we need to have around this dashboard that you're describing, Michelle, is: give people some settings where they can enter into the bargain with the data holder. The data holder says: this is what I might know about you. Are you cool with that?

(39:42):
This is what I might be able to figure out. I might be able to synthesize the risk of health conditions. Do you want me to do that? And have a meaningful dialogue around what the outcome of some of these enormously important algorithms could be.

Speaker 3 (39:55):
Mostly, if we do it properly. I think so too, and I think once we do that, we'll stay away from, and I've been saying this in public more and more, it's a spicy take, but I'll give you guys some spice: I think DSARs have become immoral, and I'll tell you why. The data subject access request. We've always had the right to

(40:23):
correct and the right to accuracy, and we should continue to have that. Who is asking for the data subject access requests? Do you think it's hourly wage earners, or people who speak the native language as a second language? Do we think that it's people who don't have the same kinds of education, for that crust tool on the edge? And our lawyers are running around like chickens because

(40:57):
there's, like, 72 hours and you have to get back this data subject access request, and so we're breathlessly doing this and breathlessly maybe, maybe honoring those requests. So if they say, please forget me in your database: if they have that capability, if they have done the architectural work. Readers, they have.

Speaker 2 (41:17):
That is a hot take, and it stimulates me. It reminds me that, you know, it's canonical in electronic health records that people have the right to know who has been accessing their records. Now, you know, in a hospital situation there are dozens of professionals who have every right, and they can't do their job unless they look at my records. Now, do I want to get pinged every three minutes that somebody's looked up my record?

(41:38):
I mean, it's sort of a luxury of the worried well. And the people that care about that stuff are not... Actually, that's the hot take of yours, isn't it? The people that are making those requests aren't actually, by some measure, important.

Speaker 3 (41:52):
They're not. I mean, I won't say they're not important, everyone's important. But I will say, A, they don't have the tools to do anything about it other than to say 'naughty you' or 'I'm going to take you to the commissioner'. And if we think that's the result, we should just give that staff directly to the commissions and have them do

(42:14):
what they do. I think the more important part is it has sucked resources away from the core, from actually deleting stuff at the end of the process. Exactly as you say, we've thought in our past that storage was cheap and we were going to need this data forever, and so people who have long since left our organizations stored and stored

(42:34):
and stored data. They don't own this stuff anymore. It's probably for campaigns that haven't been run for 10, 15, sometimes 30 years, as we've seen in some hotel cases, for example. It may have been accurate at the time, but we've got all of this sort of liability tornado sitting in our bellies. And

(42:56):
shouldn't we be cleaning that up first, before we start doing all of this sort of dance with fans and feathers? It's a grumpy old lady take on things, but I'd rather go to the core first than put lipstick on it.

Speaker 1 (43:13):
I'm wishing every tech organization was embedding PrivacyCode into its process right away, right? So I have to say I loved your P-BOM, your privacy bill of materials, because that gives instructions: this is what you have to do to build a complete product, UX, whatever it is, and that's how we can talk to each other,

(43:36):
rather than a DSAR, which will become valuable if you do these steps.

Speaker 3 (43:40):
I mean, that's the ironic thing about my crowing about it: if we can talk to each other as vendors. If we could talk to each other and say, like, this is who I am, I'm somebody who does it. Right now it's all or nothing: do you sell data or not? And everyone's like, oh yeah, you have to say yeah, like,

(44:01):
that's how we stay in business. There's no middle answer. There's no: well, you know, we parse it, and most of it is, we use this. And you can't tell all your secrets, because then you make things very insecure. So there's no real great answer, but there's a good answer here.

Speaker 1 (44:19):
So, Michelle, what's the good word to leave us with? Because we need to bring this in for a landing.

Speaker 3 (44:27):
If you like cash, of any denomination or currency, you should fall in love with privacy, because we tell the stories about customers, about voters, about people, about students. So if you like having customers and employees, you should love

(44:47):
privacy. We're about the money.

Speaker 1 (44:50):
Well, thank you very much.
It's been an absolute pleasure to have you on Making Data Better.
Really appreciate your time.
Thank you very much.

Speaker 3 (44:58):
It's been a wild ride.

Speaker 2 (45:01):
Thanks, Michelle.
Great insights, and thanks for the candor and the optimism.

Speaker 3 (45:07):
Absolutely.
You can't do what we do without being optimistic.

Speaker 2 (45:10):
Indeed.
Thanks, Michelle.
We'll see you on the other side.

Speaker 3 (45:13):
Yes, sir, thank you.