
April 30, 2024 54 mins

In this episode, Marie Potel-Saville joins me to shed light on the widespread issue of dark patterns in design. With her background in law, Marie founded the 'FairPatterns' project with her award-winning privacy and innovation studio, Amurabi, to detect and fix large-scale dark patterns. Throughout our conversation, we discuss the different types of dark patterns, why it is crucial for businesses to prevent them from being coded into their websites and apps, and how designers can ensure that they are designing fair patterns in their projects.


Dark patterns are interfaces that deceive or manipulate users into unintended actions by exploiting cognitive biases inherent in decision-making processes. Marie explains how dark patterns are harmful to our economic and democratic models, their negative impact on individual agency, and the ways that FairPatterns provides countermeasures and safeguards against the exploitation of people's cognitive biases. She also shares tips for designers and developers for designing and architecting fair patterns.

Topics Covered

  • Why Marie shifted her career path from practicing law to deploying and lecturing on Legal UX design & combatting Dark Patterns at Amurabi
  • The definition of ‘Dark Patterns’ and the difference between them and ‘deceptive patterns’
  • What motivated Marie to found FairPatterns.com and her science-based methodology to combat dark patterns
  • The importance of decision-making governance
  • Why execs should care about preventing dark patterns from being coded into their websites, apps, & interfaces
  • How dark patterns exploit our cognitive biases to our detriment
  • What global laws say about dark patterns
  • How dark patterns create structural risks for our economies & democratic models
  • How "Fair Patterns" serve as countermeasures to Dark Patterns
  • The 7 categories of Dark Patterns in UX design & associated countermeasures 
  • Advice for designers & developers to ensure that they design & architect Fair Patterns when building products & features
  • How companies can boost sales & gain trust with Fair Patterns 
  • Resources to learn more about Dark Patterns & countermeasures

Guest Info

Resources Mentioned:




Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

TRU Staffing Partners
Top privacy talent - when you need it, where you need it.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Marie Potel-Saville (00:00):
The problem lies in the fact that, because everybody - all humans - are going to react in the same way, well then, it makes us predictable, and then it makes us manipulable. So, that's how dark patterns were created. In a way, you know, it's a manipulation of our cognitive biases.

(00:20):
It's playing on these human weaknesses to make us do things without realizing, or even against our interests.

Debra J Farber (00:30):
Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global

(00:51):
privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome, everyone, to The Shifting Privacy Left Podcast. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest, Marie Potel-Saville, an esteemed lawyer and impact entrepreneur.

(01:13):
After a decade of working in top-rated law firms, she moved in-house to become General Counsel of the EMEA region for several global companies, like Chanel. Marie is the Founder of FairPatterns, a solution for detecting and remedying large-scale dark patterns. She's also Founder and CEO of Amurabi, an award-winning

(01:38):
innovation studio specializing in ethical, privacy-friendly, and age-appropriate design; and she's Host of The Fighting Dark Patterns Podcast. Marie lectures internationally on human-centered law and legal innovation through design, and is a member of the European Data
(01:58):
Protection Board's Support Pool of Experts on dark patterns. So, as you may surmise, today we're going to talk about dark patterns: how to make sure that you're only creating fair patterns, not dark patterns, when you are building a new product or feature.

Marie Potel-Saville (02:17):
Thank you so much, Debra.
It's a pleasure to be here withyou.

Debra J Farber (02:21):
Oh, I'm so excited.
I think this is a really great topic that people might know about at a really high level, but haven't had the time to really uncover all of the interesting literature and research around dark patterns and fair patterns. So, I'm really excited to have this conversation with you today. Maybe just tell us a little bit about how you ended up

(02:42):
transitioning from practicing law to focusing your career on legal design and fair patterns.

Marie Potel-Saville (02:48):
Yeah, it's been an amazing journey - I have to say - and I enjoyed every single moment. I guess I'm at the time of my career where, you know, everything I've done previously fully makes sense now. It's like all of the various bits and pieces now fully click together, and I'm able to bring my whole self to work,

(03:09):
namely because I sort of invented my own job, which helps. But basically, to answer your question, I guess the beginning of my career was extremely traditional. You know, I was in big law - the usual suspects, like Freshfields, Allen & Overy - in London, Paris, Brussels, etc. Then, I moved in-house, as you mentioned, and I guess this was

(03:33):
the beginning of the journey, to the extent that it made me realize the huge gap between the legal advice that you provide as external counsel and what it becomes in the field, in real life. The answer is not very reassuring.

(03:53):
Basically, the law, within the company, is absolutely not understood. It's even rejected as a pure business constraint - something that's going to slow everybody down. I really wanted to change that. It's really because I'm in love with the law that I wanted to give the law back the place it deserves in society, but

(04:18):
in companies as well. I really wanted to change those business constraints into empowering tools, to provide efficient solutions, really. And so, in practice, what I did was, in my own legal division - I was VP Legal EMEA at Estée Lauder

(04:38):
at the time - I was covering 70 countries and 30 different brands with a team of four lawyers, which was an interesting concept; and so, I started experimenting, leveraging Stanford's Legal Design Lab. At first, I think it was very clumsy on my side.

(04:59):
I was just trying and testing and learning. But what was really interesting is that, as clumsy as it was, it produced amazing results, so that my fellow VPs - the brand VPs and the marketing VPs, et cetera - would knock on my door saying, "Oh look, I didn't know that law

(05:21):
could be so engaging," and they were referring, for example, to competition law training, which is not naturally fascinating to marketing people, to say the least. And they were like, "Oh, this is really interesting - can you train my team?" And so that encouraged me to actually properly train in

(05:41):
innovation by design. I did a master's degree for 18 months, and then I set up the company, because I really wanted to share this new methodology as widely as possible: basically, applying design to the legal arena to solve users' issues and bridge the gap between the law and its users.

Debra J Farber (06:02):
So, tell us about your legal design firm, Amurabi, which focuses on innovation by design.
What do you really mean by that?

Marie Potel-Saville (06:12):
Well, I mean that the law is not doomed to be impossible to understand, totally inaccessible. You know, all the walls of jargon that you hit online each time you click on terms of use, or terms and conditions, or privacy notices, etc.? It's not doomed to be this way. There is actually a rock-solid methodology that enables us to

(06:38):
transform all of these walls of jargon into content and information that everybody loves to read, that everybody truly understands, and that empowers users to really better understand their rights, but also make their own free and

(06:59):
informed decisions. Amurabi is an agency. We're not advisors. We don't produce fancy slides to tell you what to do; what we do produce is deliverables, tangible projects - for example, privacy notices that everybody would love to read, compliance programs, litigation design...

(07:19):
That's basically what we've been doing for the past six years, and that's actually what led us to create our R&D Lab several years ago to specifically tackle dark patterns.

Debra J Farber (07:32):
That's really exciting.
What exactly are dark patterns? I think it makes sense to define that before we go any further. I'm sure there are several different definitions, but how would you sum it up for the audience today?

Marie Potel-Saville (07:51):
Of course.
So basically, a dark pattern is an interface that deceives you or manipulates you into doing something you didn't mean to do - and that could even be against your interests. Does that make sense?

Debra J Farber (08:03):
It does.
So, how are dark patterns different from deceptive patterns? I mean, we hear both terms when it comes to negative design practices around privacy and data protection and other things.

(08:16):
So, let's distinguish that: dark patterns versus deceptive patterns.

Marie Potel-Saville (08:21):
So, basically, Harry Brignull coined the term 'dark patterns' nearly 14 years ago to describe what I just explained:

(08:28):
tricks that make you do things you didn't intend to do. That was 14 years ago, and, along the way, Harry has really led the charge against deception and manipulation online in an absolutely brilliant way. He's now authored a book, which I can't recommend enough, called Deceptive Design Patterns.

(08:49):
To answer your question, it's actually the same - dark patterns and deceptive patterns. It's the same concept. Simply, Harry - with whom we have the pleasure to collaborate, by the way - was conscious of the fact that the term 'dark pattern' could potentially be misinterpreted as associating

(09:10):
'dark,' you know, with something negative. So, to avoid any possible misunderstanding, he had the term evolve into 'deceptive patterns.' That's merely a precaution, and I fully subscribe to that precaution because, obviously, the intention is not to offend anyone. On the contrary, it's really a human-centric approach.

(09:30):
It's all about protecting humans, that's for sure. So, he's had the term evolve; but, quite frankly, it really doesn't matter what it's called. What matters is to be aware of the reality of manipulation and deception online, and to change the situation.

Debra J Farber (09:50):
Great.
I mean, that's really helpful. I think now would be a great time to talk about your project, Fair Patterns (which anyone listening can access by going to fairpatterns.com). You not only created Fair Patterns with your team, but validated your work by deploying these patterns on a project

(10:10):
with King Games. So, that's the maker of Candy Crush Saga - I think everybody pretty much knows that game - and that won you the IAPP award for Most Innovative Project, which is so exciting. Congratulations!

Marie Potel-Saville (10:25):
Thank you so much.
Yeah, it was a real surprise for us, to be honest. That was back in 2022. To be named Most Innovative Privacy Project for basically the rest of the world, apart from the U.S., was a huge surprise for us.

Debra J Farber (10:41):
But so exciting.
Tell us: what is Fair Patterns, and what motivated you to found that project along with King Games?

Marie Potel-Saville (10:47):
Sure.
Fair Patterns is a solution that fights dark patterns by automatically detecting them and transforming them into their countermeasures - fair patterns - which are basically interfaces that

(11:08):
empower users to make their own free and informed choices. So, that's the concept that we created after several years of R&D. What motivated us to create this project? It's basically all of the dark patterns that we see on a daily basis in our projects at Amurabi.

(11:28):
Each time we would transform a privacy notice or terms and conditions or any online journey, really, we would see them. We would spot them, and we would get really irritated, even angry at times. It's completely crazy, but you get some dark patterns that could, for example, do things like this: there are two buttons,

(11:51):
one to accept the offer and the other to reject it, and the button to reject the offer sometimes says things like, "Oh no, I would rather bleed to death," to make you feel really, really bad and ashamed of clicking on it. That's how crazy it gets. And so, because we got so, so shocked by the scale of dark

(12:12):
patterns and by the depth of this phenomenon... it's all around the world. It's quite deep: not just in the interfaces, it's also in the code. It's also, sadly, in algorithms; and that's not improving. So, back in 2021, we decided to create our R&D Lab specifically

(12:35):
to focus on dark patterns. Two years after that, we managed to create the concept of the fair pattern as a countermeasure. Maybe it's worth explaining this concept a little bit more. There has been a great amount of work done by academia over the past 14 years. So, you've got an amazing amount and quality of research

(12:58):
on what the name of dark patterns should be. What's the right name for it? What's the right taxonomy? You've got like 16 different taxonomies. What are the harms caused by this problem? Et cetera. So, basically, you've got over 10 years of problem-focused research, which is great, which is amazing; but then, very, very

(13:18):
little research and proposals on the solution. So, there were some proposals around 'bright patterns' or 'light patterns.' Basically, these initiatives proposed nudging users towards privacy-friendly or consumer-friendly interfaces.

(13:40):
To be honest, we're not sure that nudging is the solution here. The reason is that people don't learn anything when they are being nudged. Obviously, they just continue to blindly click. They're just directed to a supposedly ethical solution

(14:04):
instead of the bad one; but that doesn't really solve the core issue, which is that people have given up properly reading and thinking online to make their own free choices. So, that's really what we wanted to change, and the whole concept of Fair Patterns is to empower users to maybe even take

(14:27):
the two or three seconds that they need to think about it, and to really make the choice that's meaningful and beneficial for them. Does that make sense at all?

Debra J Farber (14:38):
It definitely makes sense.
It just highlights something I've been feeling for pretty much my whole career:

(14:42):
why am I so excited about privacy and data protection? Why is that my jam, you know? Why am I so obsessed with this space? Right? A lot of it, for me - and I think generally - comes down to this: it's a subset of agency. You know, how much more freedom-focused could this issue be? Right?

(15:01):
Agency: your ability to make choices about your life, what you're doing, and what's collected about you. So, I think sometimes we forget that that's what privacy is - a subset of agency. And, if we're taking away people's ability to make meaningful choices, we're really taking away their agency, and that should be seen as a real negative, harmful thing at scale.

(15:23):
Right?

Marie Potel-Saville (15:28):
This is so important, and thanks for putting this context back. This is so central. I know that you were at the IAPP Global Privacy Summit, because we saw each other.

Debra J Farber (15:39):
Yeah, that's where we met - at the last one.

Marie Potel-Saville (15:41):
I'm sure you remember one of the speakers at the closing session explaining that the new frontier is not so much data governance or AI governance - it's decision-making governance. How do we make sure that humans are still able to make their own decisions in a meaningful and ethical way?

(16:03):
I think that's really the new frontier.

Debra J Farber (16:06):
Yeah, and I think that if that's not done in a good way, it's pretty easy to see that it will put constraints on society's decision-making as a whole. Of course, still individual decision-making, but then we're steered by big tech or other forces rather than our own individual choices. So, yes, I think that was kind of a great way to end the

(16:27):
conference - on those notes, and taking that home to stew on, to marinate on. Exactly. You know, we're talking some meta concepts here, but why should business executives generally care about preventing dark patterns from being coded into their websites, apps, and interfaces?

Marie Potel-Saville (16:47):
Yeah, you're right to bring the question back to reality, back to the field, back to what really matters. There are many, many reasons for executives to be concerned. I will focus just on the business reasons, to be honest, because obviously there are plenty of ethical ones. But, just on the business side, what's really interesting is

(17:11):
that we've seen that deception and manipulation online - all these dark patterns - are starting to stop being profitable. I think that they used to be profitable a couple of years ago; but as customers become more savvy, and also more demanding,

(17:32):
they're really willing to have that super-profound trust relationship with the brands they buy from. We're seeing that dark patterns simply do not work anymore, or not as well as before, purely in terms of profitability. And that's really interesting because, of course, with a dark

(17:53):
pattern - let's say, for a subscription renewal - you will likely get a very short-term boost in your turnover; but as soon as users realize that they have been tricked, they will be furious, and rightly so. Then, they won't want to hear about you ever again, or your

(18:14):
brand.
I guess that's the key point for businesses to care about. The second business reason is simply the bad user experience. It's terrible. We've all experienced it. Right? I mean, you get annoyed. You see them - the "45 people are looking at the same room as you."

(18:34):
We've all been through these interfaces where we are tricked into paying for a seat on the plane when we all know it should be free. I mean, it really gets on our nerves. The European Commission actually produced a very interesting study back in 2022, where they showed scientific

(18:59):
evidence that dark patterns actually increase your heart rate and increase anxiety. I mean, as if this world didn't generate enough anxiety by itself.

[Debra (19:12): Wow!] That second business reason is really this bad user experience. People are fed up with it. And then, of course, we could talk for hours about all the other reasons - you know, the individual harms and the structural harms caused by dark patterns.

Debra J Farber (19:34):
Yeah, and I think we might even get to some of those questions. For now, I want to understand - I hear from you and from your work on your website that dark patterns harm individuals and exploit our cognitive biases. You've already gone through some of the harms; but can you describe how they exploit our cognitive biases? What does that mean? Give us some examples.

Marie Potel-Saville (19:54):
Sure, absolutely. Perhaps it's useful for your audience, Debra, to briefly explain what cognitive biases are. This all stems from Daniel Kahneman's research. He sadly passed away recently, but he was really an amazing thought leader in the neurosciences. He was also a Nobel Prize winner in economics.

(20:17):
Basically, he's the author of Thinking, Fast and Slow, which is still the authoritative book to date. He very clearly explained - identified first, and then explained - that our brain works through two main systems. System 1 is very fast and very efficient, but it

(20:41):
relies on a number of cognitive biases, precisely to help us make those decisions or make those choices very, very quickly. And then there's System 2, which is slower and also more energy-consuming - so the brain doesn't like it by default; it likes to save energy. But System 2 is basically what enables us to solve complex

(21:05):
problems, like a math problem. So, back to cognitive biases. In System 1, the reason why we can act very quickly and be that efficient is that, in order to make a choice, we fall back on - we resort to - these cognitive biases, and that's completely unconscious, obviously. We don't realize it.

(21:26):
It means that all humans are going to react in the same way when they're faced with a given situation, a given type of information. Overall, there are 180 cognitive biases. For example, when we are faced with information overload, then

(21:49):
the typical response is: we don't read. When we are faced with a risk of loss, there's the loss aversion bias, which makes us try absolutely to avoid that loss. When we are faced with information that is framed in a specific way, well, the cognitive bias associated with

(22:11):
that means we stick to that frame and are less likely to challenge the first information we got. I could go on like that for a long time, but that's basically what cognitive biases are. And the problem with them? Well, first of all, you know, we can't avoid having these cognitive biases. That's part of being human, really.

(22:32):
The problem lies in the fact that, because everybody - all humans - are going to react in the same way, well, then it makes us predictable, and then it makes us manipulable. That's how dark patterns were created, in a way. It's a manipulation of our cognitive biases.

(22:53):
It's playing on these human weaknesses to make us do things without realizing, or even against our interests. Does that make sense?

Debra J Farber (23:04):
It does and it makes me wonder.
We use law to address harms topeople, especially in the United
States.
We have a very much - ourprivacy laws have been around
where has there been a actualharm as opposed to rights- based
.
Right?
Like it is in the EU.
What does the law generally sayabout dark patterns?

(23:27):
Do we have good, established laws around the world that address dark patterns? What is the state of the law today?

Marie Potel-Saville (23:34):
Great question.
The first thing to say is that it's always been prohibited to manipulate or deceive people. It was never licit in the first place. So, even if the term 'dark pattern' was not specifically mentioned in a number of acts and regulations, for sure,

(23:55):
manipulating someone, deceiving someone, is totally prohibited. So, that's the first thing, and it's really, really important. For example, in the U.S., you've got Section 5 of the FTC Act on deceptive practices, which is totally and fully applicable to dark patterns, and that's basically the legal basis that the FTC is using to go

(24:18):
after Epic Games for all the dark patterns and tricks in Fortnite, leading to a huge settlement of half a billion dollars, etc. So, that's really important. You don't need to have the term 'dark pattern' in any piece of legislation for a dark pattern to be caught by that legislation

(24:41):
which means that you've got general prohibitions of manipulation and deception in consumer law, and obviously in privacy law as well. Let's just remind everyone here that, in the GDPR, there's Article 5, with the fairness principle (which is at the core of the GDPR).

(25:02):
Fairness means that, obviously, you can't trick anyone to obtain their personal data. That's completely illicit. Right?

Debra J Farber (25:13):
Especially as it really speaks to consent - fairness around consent.

Marie Potel-Saville (25:18):
Exactly. And what's interesting is that, for the past two or three years, we've been seeing new legislation that specifically targets dark patterns. So, it comes in addition to all of the existing legal framework I just described. For example, in the U.S., the California Privacy Rights Act defines dark patterns

(25:41):
specifically. The FTC staff report published in 2022 distinguishes four types of dark patterns. In the EU, we've got the Digital Markets Act, the Digital Services Act, and the brand-new AI Act, which also provide specific definitions of dark patterns and specific

(26:03):
prohibitions. Basically, what this means is that the legal net is getting tighter and tighter. So, there are now multiple legal bases around the world that make dark patterns totally and utterly illicit.

Debra J Farber (26:19):
That's fascinating.
It'll be interesting to see how quickly companies adapt to this growing net of laws around dark patterns as enforcement begins, and what that will look like. I know we could probably talk about that for half a day as well. So, first I want to take the conversation really broad, to talk about structural risks to our economies and such; and then

(26:42):
I want the audience to know that we are going to get very specific about what we mean by dark patterns and how to create countermeasures - what are the countermeasures that create fair patterns? But first, I want to bring it to the societal impact. How do dark patterns create structural risks for our economies and then, ultimately, our democratic models?

(27:05):
How do they impact competition, trust in brands, and the overall market?

Marie Potel-Saville (27:11):
That's such an important question. This has been studied by many regulators. We've got studies by the OECD, for example, and by many other regulators, like the Competition and Markets Authority in the UK and the European Commission. Basically, they all say the same thing: ultimately, dark patterns

(27:31):
do affect competition - mostly because, well, if consumers are prevented from changing suppliers, because they are caught in subscriptions that they can never cancel, or because they are prevented from objectively comparing prices (which is another type of dark pattern); if consumers do not

(27:54):
have transparent, objective information online (yet another dark pattern), then they're not able to make the best decisions for themselves; so it definitely affects competition. Also, dark patterns can be - and are - used to collect ever more personal data, which can give

(28:17):
large groups a decisive competitive advantage, or could even strengthen a dominant position - I'm sure you can see which type of players I'm referring to - and that creates a true structural risk for the economy. Let's just remind everyone that the reason why, today, we still

(28:43):
consider the market economy the best model to date is that it's supposed to bring the best benefits to consumers - and by 'best benefits,' we mean lower prices, better quality of services and products, and innovation. But what the OECD very clearly explained in its 2022 report

(29:07):
is that if companies end up competing on the quality, so to speak, of their dark patterns, instead of focusing on innovation, low prices, etc., then it's a losing game for consumers. They are being tricked. They don't get the lower prices. They don't get the better products. It's just a losing game.

(29:28):
And then, of course, there's the trust issue. If consumers lose trust in brands because they've been tricked, because they've been manipulated, ultimately they also lose trust in the economy at large. So, that could also endanger the whole system. And, back to your point about the democratic model: ultimately,

(29:51):
it could also affect their trust in the overall democratic system - and it goes even beyond that. Our main concern at Fair Patterns, I guess, is that once we're all trained to accept, once we're all used to clicking "I

(30:14):
agree" even though we haven't read one single line, then what's the next thing that we accept blindly, without having read it? From a democratic standpoint, it's an interesting question, particularly in 2024, which is a huge electoral year around the world. I think it was the Financial Times that published an article

(30:38):
on the fact that no fewer than two billion citizens are going to vote this year. And of course, you've got a big election coming up in the U.S., needless to say.

Debra J Farber (30:52):
Definitely.
I think that all rings true to probably all of our experiences. So, thank you for framing the societal importance - or impact, I should say - of these dark patterns. Let's get to solutions. What are fair patterns, and what are the benefits of using them?

Marie Potel-Saville (31:09):
Yeah.
Fair patterns are interfaces that empower users to make their own free and informed choices. It means that, instead of being tricked, instead of being left in the dark with a wall of jargon, et cetera, you are given the right information at the right time in your journey to

(31:31):
understand the consequences of your choices, and you are able to make the choice that's beneficial for you and that matches your preferences. That's basically what a fair pattern is. Maybe it would be interesting to get into some more concrete examples, if that's helpful.

(31:52):
So, for example, you can have a number of default settings which are harmful - the pre-ticked boxes and all of that. Well, the countermeasure - the fair pattern for that - is simply a neutral default, where you don't have the pre-ticked box. You're not framed.

(32:14):
Back to the question of cognitive biases: it could be empty boxes. It could be individualized settings that allow the user to choose in each single case. Perhaps, as a practical example, we could describe a default setting where you have one single box for several purposes

(32:39):
of data processing. Well, first, that's illicit; but second, obviously, you can't decide what you're agreeing to, simply because there's only one box. So the fair pattern is simply to have several boxes that are not pre-ticked, so that you're not framed and you can give granular consent and make distinct decisions.

(33:02):
So, for example, you could agree to receive the newsletter, but not the promotional offers from the company's partners. Does that make sense at all?

Debra J Farber (33:12):
Yeah, absolutely. You're not combining all the consent into one binary thing. You're giving choices to the individual as to what they want to opt in to or out of.

Marie Potel-Saville (33:24):
Exactly.
To stick with privacy dark and fair patterns: very often, with regard to privacy settings, you've got what we call a maze - something that is super difficult to navigate. You have to click, like, at least five times to make the first choice, and then you have to go back to a different page

(33:47):
to continue adjusting your privacy settings. None of that is by chance, obviously. So, the fair pattern that solves this type of situation is simply a seamless path. It could take many different forms, but it could be a privacy dashboard where you have all your rights and all your options

(34:10):
at a glance, on one single screen, with buttons; and you can decide: okay, I'm happy to share my name and my email address for that purpose with this company, but not with the third-party partners. I'm not happy to share my phone number, except for deliveries.

(34:31):
perhaps I want to decide on how information about my use of the services is handled.
Well, I'm okay for your company, the one I'm buying from, to use it, but not third-party partners, etc.
Basically, it's really good UX.
That's what I want to say.

(34:51):
If you think about it, UX wasn't meant to trick anyone.
It was meant to help users do what they intended to do in a quick and easy way.

Debra J Farber (35:12):
That makes a lot of sense, especially when you frame it that way.
One of the reasons we're talking about this topic on The Shifting Privacy Left Podcast is that we want to shift addressing privacy way earlier on, before you ever collect data: in the design phase, the development phase,

(35:33):
the requirements-gathering phase for the product.
Right?
Well before you even have to think about the life cycle of data, shift into earlier decision-making and make sure you've got good privacy.
So, privacy design and privacy UX is a huge component of that.
It's the perfect conversation for today, for this show, and

(35:56):
for this audience.
So, here's what I'd like to do now: your Fair Patterns project has identified seven categories of dark patterns with, I believe, 16 different dark patterns that fit in those seven categories; and you can go to fairpatterns.com and see that for each category there is a definition,

(36:17):
what the main cognitive biases are that get manipulated through that dark pattern, and then the main risks to individuals.
But today, in our conversation, I'd rather go through each of those seven categories and talk about the fair pattern countermeasures as I walk through them.
Does that make sense to you?

[Marie (36:37):
Sure, sure, sure].
So, I think that you already talked about the first one, this sense of a maze, where you're tricking the user and they have to go through what seems like an endless number of clicks until they're able to actually effectuate what it is they want to do - maybe opt out, cancel something, whatever.
You mentioned that the seamless path is the countermeasure, the

(37:00):
fair pattern, to use instead.
So let's go to the second one, which is called the harmful default.
What is that and what are the countermeasures there?

Marie Potel-Saville (37:16):
Yeah.
So the harmful default, you know, is this pre-ticked box that could also combine several purposes, so you couldn't decide whether you agree to one purpose but not to the other.
And so the countermeasure is this neutral default with granularity of choice, basically.
So, you can decide, for each different purpose, what you agree with and what you disagree with.

(37:36):
The third category is misleading or obstructing language.
By the way, what we call dark patterns really doesn't matter, at least not to us.
What really matters is the solution.
So, the solution to misleading or obstructing language is simply plain and empowering language.
And you know what, Debra?

(37:57):
Spoiler alert to your audience: it is totally possible to explain privacy with plain language.
It doesn't have to be obscure, full of legalese - absolutely not.
We do that for a living.
And plain and empowering language means that anyone

(38:17):
without a legal background, without being a specialist in privacy, very easily finds the information they need; that they understand it upon first reading - that's very important; and that they also understand the consequences of their choices.
So, that could be very short and clear sentences, obviously.

(38:40):
A neutral tone of voice, also, to avoid any emotional manipulation.
But that's also sometimes about adding some information, precisely to empower users to understand the consequences of their choices.
Then, we've got the category that we call distorted UX, and

(39:02):
the fair pattern to that is fair UX.
So, this one is really common sense, I would like to say.
A distorted UX would basically be trapping users through the visual interface.
So, for example, you would have an interface with

(39:23):
your birth date, and then a big fat button: "share it with everyone." Obviously you are prompted, if not manipulated, to share this personal information, whereas a fair UX would be a visual interface that really respects your intention.

(39:45):
So, you would have three equivalent buttons, for example, to decide with which companies or people you want to share your birth date.
It could be the company's services, it could be your friends, it could be everyone; but you really have a choice, and you've got three equivalent buttons with the same salience, the same

(40:08):
color, and there's no visual interference that directs your choices.
Does that make sense at all?

[Debra (40:16):
It does, thank you].
I think what's important to mention is that we've developed not just the concept but also a library of patterns that we're continuously improving and expanding.
And that's what's great about a pattern: it's a system that will

(40:37):
consistently solve the problem.
And just so you know, we've been testing the fair patterns with users, and also with a range of independent experts, to continuously improve them; and so we hope to develop our library of fair patterns even further in the future.

Debra J Farber (40:56):
Yeah, in fact, it's not just legal experts, right?
You have really interesting people; you have neuroscientists.
Tell us about the types of experts you're working with on this.

Marie Potel-Saville (41:05):
That's a great question.
So obviously, you know, this is a multidisciplinary collaboration where we've got UX strategists and UX designers, but also neuroscientists.
This is absolutely critical, given the cognitive biases at stake.
We've got plain language experts working with us within

(41:28):
the team, obviously legal experts, privacy experts; and last year we decided to have our concept and the library of fair patterns audited, or examined, by 10 independent experts in the various fields that I mentioned, including psychology, so that

(41:49):
they would give their honest, objective opinion about what works and what should be improved.
That was an amazing experience, and it pushed us even further.

Debra J Farber (42:00):
That's awesome.
And just to close the loop, the last three dark patterns and countermeasures are: "more than intended" is the dark pattern, and the countermeasure is free action; you can learn about all of these on the website.
The sixth one is push and pressure, and the countermeasure would be non-intrusive information.
And the last dark pattern would be missing

(42:23):
information, and the countermeasure would be adequate information.
Again, you can find more information on fairpatterns.com.
I'm also going to ask in a little bit where else people can learn about this; but for now, given that we've got privacy engineers as the audience base for this show, what advice do you have specifically for designers and developers to

(42:45):
ensure that they are designing fair patterns as they're building products and features?

Marie Potel-Saville (42:51):
Sure, this is a very, very important point, because everything, you know, comes down to practice and practical solutions.
So, our advice for designers is basically to go through a series of short and simple questions.
This is a framework that we've developed with Harry Brignull.
He joined us in January, which is amazing.

(43:14):
Together, Harry and I have developed this fairly simple

(43:18):
framework that we called "Does My Design Contain Harmful Choice Architecture?" And this was based on the Seven Stages of Action model by Don Norman who, obviously, needs no introduction to designers.
So, we took those seven stages and applied them to harmful

(43:39):
choice architecture; and, basically, when you start the product design phase, you have perception, then comprehension, then a number of further steps.
What we advise designers to do is ask themselves five simple questions at each of these stages of the

(44:01):
user experience.
The first question is Autonomy: does my design allow users to make decisions based on their preferences?
The second question is Agency.
Back to your point, Debra: does my design allow users to take the actions they want, easily and without coercion?

(44:23):
The third question is Transparency: does my design provide sufficient objective, accessible information in plain language for users to make informed decisions?
The fourth question is Honesty: does my design contain misleading information or

(44:45):
omissions that could induce false beliefs?
And the fifth question, which is probably my favorite, is Fairness: is my design likely to cause an outcome that's favorable to the business but detrimental to the users?
And so, with these five fairly simple questions, we actually

(45:08):
capture most of the legislative and regulatory framework.
It seems quite simple when I say it like that, but this is actually the result of tons of legal research to make sure that we capture the GDPR, all the privacy laws currently tackling dark patterns, et cetera.
So, we hope that's helpful todesigners.
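The five questions above lend themselves to a simple review checklist in code. The sketch below is our own illustration, not an official FairPatterns or Amurabi tool; the Honesty and Fairness questions are rephrased positively (our assumption) so that a True answer always means the design passes.

```python
# Hypothetical encoding of the five-question framework described above.
# Honesty and Fairness are rephrased so that True always means "passes".
CHECKLIST = {
    "autonomy": "Does the design let users decide based on their preferences?",
    "agency": "Can users take the actions they want easily, without coercion?",
    "transparency": "Is there sufficient, objective information in plain language?",
    "honesty": "Is the design free of misleading information and omissions?",
    "fairness": "Does the design avoid outcomes good for the business but bad for users?",
}

def review(answers: dict) -> list:
    """Return the principles that fail review (answered False or missing)."""
    return [principle for principle in CHECKLIST if not answers.get(principle, False)]

# Example: a design that passes everything except Honesty.
flags = review({"autonomy": True, "agency": True, "transparency": True,
                "honesty": False, "fairness": True})
print(flags)  # ['honesty']
```

Treating an unanswered question as a failure is a deliberate choice here: a design review shouldn't pass by omission.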

Debra J Farber (45:28):
That is really helpful, and I think you framed really well why designers and developers should care.
Those personas, the designers and the developers, a lot of the time are going to be driven by executives' and marketing's desire to boost sales through these interfaces.
Right?
But we want to do that without dark patterns.
So how can companies boost sales with fair patterns?

Marie Potel-Saville (45:52):
Well, the great news is that we've been working hard with economists and we've been studying plenty of econometric studies, and the good news is that fair patterns produce better results than dark patterns after six months.
It's true that, for a very short amount of time, dark patterns might be more profitable; but that's really only two or three

(46:15):
months.
The curve really changes as soon as six months in, when fair patterns are actually as efficient economically as dark patterns.
Then, after six months, they become more profitable.
We've been seeing that in the media sector, for example, where

(46:36):
subscription-based models clearly show that a fair pattern is going to be more efficient, simply because people actually don't want to subscribe, or subscribe less, when they suspect a dark pattern.
So, they will just refrain from subscribing.

(46:56):
When there's a fair pattern, they are more inclined to subscribe because they know they won't be tricked.
That builds trust, and the customer lifetime value that derives from trust makes fair patterns way more profitable - not just in the medium term, but obviously in the longer term,

(47:17):
because it's also boosting the value of your brand instead of damaging it.

Debra J Farber (47:23):
That's really compelling data.
I would love for you, after this call, to send me the [inaudible].

Marie Potel-Saville (47:30):
Yeah, of course.

Debra J Farber (47:32):
Let's get this in the hands of those who are designing the UX and UI and developing new products and features, so that they can push back against marketing teams and executives who are trying to push them into creating dark patterns.
Then, if you have these compelling metrics - "Hey, we could boost sales better with fair patterns" - I think that could really do a lot of good for, well,

(47:54):
society as a whole.
But first, to push back on management with what is ethical, right, and going to make them more money - it's pretty compelling.
Then, also make sure to share with me the document you were saying that you helped create.

Marie Potel-Saville (48:09):
The framework.

Debra J Farber (48:14):
Yes, I would love to put that in our Show Notes.
So, where can our listeners learn more about dark patterns?
Besides your wonderful podcast, Fighting Dark Patterns - people should check that out - and, of course, the Fair Patterns website: in terms of categorizing them, showing how they work and spread, which risks they create, and which laws they breach, where would you direct people to learn more?

Marie Potel-Saville (48:34):
So, without hesitation, the go-to source is Harry Brignull's website, Deceptive Design Patterns.
It's a goldmine of information.
Again, he was the one who invented the term.
He's been really leading the research since then, and his website is a goldmine of information, as well as his book.

(48:54):
Let's remember that he authored a book last summer.
It's super easy to read.
It's really not just for designers, it's for everyone, and I cannot recommend it enough.
In terms of online information, there's also the dark patterns tip line, if you want to report a dark pattern.
So, there's a hall of shame on Harry's website.

(49:16):
There's also a tip line, specifically in the US, where you can report dark patterns.
And then, of course, you can and you should just tag the companies.
If you see an interface that very much looks like a dark pattern, just tag them on social networks with the hashtag #darkpatterns.

(49:36):
Ask the company; make them accountable for what they produce.
This really has an effect, by the way.
We know for a fact that regulators are really tracking those comments on social media with the hashtag #darkpatterns.
It could also prompt regulators to engage, to launch

(49:56):
a legal action, or to ask questions and launch an investigation.
So, it's really worth spending those few minutes to post on socials if you see one.

Debra J Farber (50:07):
Oh, that's great.
It feels empowering in a world where you feel like you can't control everything.
You can actually take some action that might effect some change in that way, so that's awesome.
What is the best way for people to reach out to you to learn more about Amurabi's legal design services or fairpatterns.com?

Marie Potel-Saville (50:24):
There's our Amurabi.eu website.
There's fairpatterns.com.
And I'm always happy to share thoughts on LinkedIn.
I'm very easy to find on LinkedIn.
Just tag me or reach out to me, and I'm always happy to have a chat.

Debra J Farber (50:40):
Excellent.
And lastly - I usually ask this of most of my guests - do you have any last words of wisdom to leave our audience of privacy engineers with today?
You've already given us plenty of words of wisdom.

Marie Potel-Saville (50:54):
I guess what I really want your audience to take away is this idea of decision-making governance.
You know, we've been talking a lot about dark patterns and interfaces, but there are also dark patterns in AI - thanks to AI, but also within AI itself, within the algorithms themselves.

(51:16):
This is so critical for privacy.
If we're no longer able to make our own decisions, to make meaningful choices for ourselves, it's simply the end of privacy.
So, decision-making governance: that's really what every privacy

(51:36):
engineer should be working on.

Debra J Farber (51:38):
Thank you so much.
It's been a really fascinating conversation.
Thank you for joining us on TheShifting Privacy Left Podcast.
Until next Tuesday, everyone, when we'll be back with engaging content and another great guest.
Thanks for joining us this week on Shifting Privacy Left.
Make sure to visit our website, shiftingprivacyleft.com,

(51:58):
where you can subscribe to updates so you'll never miss a show.
While you're at it, if you found this episode valuable, go ahead and share it with a friend.
And, if you're an engineer who cares passionately about privacy,

(52:12):
check out Privado, the developer-friendly privacy platform and sponsor of this show.
To learn more, go to privado.ai.
Be sure to tune in next Tuesday for a new episode.
Bye for now.