Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Mathew Mytka (00:01):
You know, our mission is really about helping to address that - dedicated to helping people reimagine and create technology that enables collective human flourishing. You know, and we're passionate. We've fallen in love with this problem - this 'ethical intent to action gap' that pervades the tech industry, and we're dedicated to figuring out how to solve it.
Debra J Farber (00:25):
Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the
(00:45):
bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Today, I'm delighted to welcome my next two guests, Mathew Mytka and Alja Isaković, Co-Founders at Tethix, a company that builds products that enable you to embed ethics into
(01:06):
the fabric of your organization and that help you embody your E-T-H-O-S, which we'll talk about later today. Mat lives in Sydney, Australia, and describes himself as a 'Moral Imagineer,' System Thinker, and Doer. Drawing from various disciplines and a diverse career,
(01:26):
he helps people reimagine and design organizations, products, services, and business models that support human well-being and planetary flourishing. With Tethix, he says, "you can plant seeds of good intentions and grow products you can be proud of." Alja lives in Ljubljana, Slovenia, and her work at Tethix
(01:48):
aligns with her passion for imagining a more inclusive future in which tech innovation supports learning and community building, helps us take better care of each other and our planet, and brings joy and playfulness into our lives. She's also a founding contributor to the ResponsibleTech.Work framework (which is also a URL address),
(02:12):
which proposes practical tools for more responsible product development in the tech industry, based on core guiding principles. Today, we're going to be talking about a really interesting concept called 'Responsible Firekeeping.' Welcome, Mat and Alja. I'm really excited to talk to you about this today.
Alja Isaković (02:31):
Thanks for having us. We're excited to be here.
Mathew Mytka (02:34):
Yeah, thanks, Debra. It's an absolute pleasure.
Debra J Farber (02:37):
There are a lot of big words and utopian goals (and I don't say that in a negative way) - that you're aiming to bring ethical tech to the world - that I hear in your marketing there. What prompted you to found Tethix, and what's your core mission?
Mathew Mytka (02:53):
Yeah, it's a great question. We do have a leaning towards being optimistic, if not purely for our own mental health, I think. What prompted us, really, is seeing many of the problems out there in tech and really just building on the decade or more
(03:13):
of really seeing the implications of a lack of care and respect for what technology can do, the negative impact of technology and technology development on people and the planet. Our mission is really about helping to address that, dedicated to helping people reimagine and create technology
(03:37):
that enables collective human flourishing. And we're passionate; we've fallen in love with this problem, this ethical intent to action gap that pervades the tech industry; and we're dedicated to figuring out how to solve it.
Debra J Farber (03:51):
So, you mentioned this gap. Can you unpack that for us?
Mathew Mytka (03:58):
This is a tricky one, but we'll try and narrow in on it; and I guess I'll talk a little bit about examples. I think the intent to action gap, in general, is essentially that what we do in our behavior does not necessarily always match our intentions, and there's this gap between what we might say or stand for and what we actually do in our work.
(04:21):
I'll make reference to a study that was done through MIT and a company called CultureX back in 2020. It was research looking at corporate value statements of Fortune 500 companies. They looked at all these statements and then surveyed the employees; and essentially the questions
(04:44):
for the employees were around, "Does the behavior of your company, does the culture of your company, reflect these value statements that get made?" They found no statistical correlation between the behaviors and the value statements. And this is probably no surprise for many people. You can understand that people within cross-functional product teams
(05:06):
that might be building technology, as an example, are always under immense pressure. There's lots of constraints; there's pressure to move fast and break things. An example of this in privacy is cookie notices and privacy policies, and "we take your privacy seriously." Pat Walsh - the lovely Pat Walsh, who many of the listeners are probably familiar with - put that conger in there.
(05:28):
"We take your privacy seriously!" [Debra: With an exclamation
point at the end, I'm sure.
] I think there's the ethicswashing, there's the trust
washing aspects of this wherethere's no genuine intent, but a
lot of organizations, a lot ofpeople want to actually reflect
the values, the things that theysay they care about, that they
(05:49):
want to stand for, that they'll fight to defend; but there are a lot of systemic challenges as well. You know, we've got extractive, destructive, and linear socioeconomic systems, and shareholder primacy, and that disconnect from our nature and our natural world. So, our approach at Tethix is looking to solve for that - for companies, individuals, organizations, practitioners
(06:12):
that have that intention, that really want to live and breathe what they want to stand for, particularly in this ethical sense. We're working to support them and also trying to account for some of these systemic challenges as well.
Debra J Farber (06:28):
Thank you for that. And so, to address this gap, I know that Tethix has created an ethical framework that you call Elemental Ethics, which I think the audience is going to find fascinating. Tethix's stated goal with Elemental Ethics is "to move companies from a state of agile firefighting to responsible firekeeping."
(06:50):
Tell us a little bit about that. What is Elemental Ethics, and how does Elemental Ethics help product development teams embark on what you call the 'deliberate for the good' path?
Alja Isaković (07:04):
That's a great question, Debra. Because of what Mat was just saying about the intent to action gap being a symptom of so many systemic problems, we really think a different approach is needed. So, we don't really see Elemental Ethics as a typical framework, but more a new way of learning, doing, thinking,
(07:25):
relating, and working. We really think that we need new language and new narratives to deliberately choose better future paths because, as we all know, many of the products that we're building are not in fact making the world a better place, as tech startups often like to promise. So, with Elemental Ethics, we emphasize 4 core skills (or
(07:50):
elements, as we call them) that we need to balance to make better product decisions in everyday practice. Now, most tech companies are focused on technology and practice, which is the fire element. With technology, we're building and starting all these new fires; but in order to maintain a healthy fire, you
(08:12):
actually need to get to know and work with the other elements. This means improving the airflow - and what we mean by that is communication and collaboration within an organization; grounding yourself in earth by doing research and exploration of diverse perspectives outside the organization; and taking the time to pause and
(08:32):
reflect on what you're doing, which represents the water that you can use to extinguish the fires that accidentally start through your technology and practice. You might have noticed that a lot of the language we use draws inspiration from nature, because we really want to also emphasize the ecological challenges we are facing today;
(08:55):
and we think that ethics is not just about being nice to each other but elevating the most important conversations we really need to have as a species to ensure our future survival on this planet. Given the impact that tech has in our lives and, like Mat was saying, all the material resources it's extracting and
(09:18):
relying on, we really have a responsibility as technologists to think of non-human stakeholders and the broad systemic impacts of the tech we're building. We really hope that Elemental Ethics can contribute to this narrative shift we need in product development.
Debra J Farber (09:35):
It's fascinating, and obviously that sounds like something... like, who wouldn't want to sign up for that, right? So, let's get into a little more depth with the concept of 'Responsible Firekeeping,' now that you've articulated the underpinnings of Elemental Ethics. If we already have Agile development processes, why do companies need to consider an additional approach for ethics?
(09:58):
How is Agile insufficient?
Mathew Mytka (10:00):
Yeah, that's a good question. We don't want to introduce too much of the new, and this is part of the challenge, which we might dive into as we explore this. As Alja was just saying, Agile is this fire-focused practice. It's concerned with delivering business value. So, the way that we practice Agile traditionally - and
(10:22):
there's obviously lots of variation in organizations and different organizational contexts - we're treating software development as this factory line, leaving no room for pause, or approaches for these deeper ethical discussions, moral deliberation, and exercising of moral imagination, because we get obsessed (and incentivized, as well) because
(10:47):
of KPIs and the way that we design metrics within organizations. We're obsessed with sprinting and shipping and velocity, and that is different from what I believe, and what we believe, we actually need. I think many probably agree with this. If you vehemently disagree, that's excellent as well, because we enjoy diversity of perspectives on this.
(11:08):
So, on the data protection side and privacy side, we might see this as mindless data collection - collecting because you can, not because you should... no purpose specification and why you're actually collecting it in the first place, if you even actually need to. We see that, particularly in digital tech products; it's just so invasive. By always sprinting, we often end up starting these fires and
(11:32):
then get exhausted by this constant bug-fixing, data breaches and leaks, and PR problems that come from these downstream issues that we've created as a result of starting the fires. We call that firefighting. We use these metaphors as a way to make sense of things, and for the most part, the language is very metaphorical; but we're
(11:54):
not saying throw out all of your existing processes. This is about interfacing and connecting in and embedding with the way that people already work. What we're saying is that practice, which is generally driven by fire, needs to be balanced with these other elements that Alja was getting at: the collaboration, the
(12:15):
research, exploration, pause, and reflection. This is what we do with Elemental Ethics and what it actually brings attention to.
Debra J Farber (12:24):
So, define 'Responsible Firekeeping' for us.
Mathew Mytka (12:28):
At its essence, Responsible Firekeeping is about, like I said, balancing the fire of technology practice with these other elements. So, instead of wasting time in meetings due to miscommunications and people maybe misinterpreting the acceptance criteria in a user story, giving people time and
(12:50):
space to tell stories and learn from each other and explore where the values that they want to stand for actually come to life in the products and services that they build and the features that they actually are working on. So, instead of this ethics washing that tends to take place, and a lot of the cognitive and moral dissonance that people might
(13:10):
face at work, you learn how to embody these principles and values that you say you stand for. Many organizations have them, and there are issues with that that I'll just briefly touch on as well. You're building more trust by demonstrating that what you are doing is more aligned to what you say you care about and what you
(13:30):
value as an organization. This gap exists for lots of different reasons. There's just lots of systemic challenges, but it's generally top-down and exclusive. These things are kind of imposed, disconnected from workflows. Business metrics generally win in the end, as I was mentioning before. Yeah, so firefighting versus Responsible Firekeeping.
Debra J Farber (13:56):
How does Responsible Firekeeping shift organizations? Basically, what benefits can be realized from this approach? And you mentioned trust; I just wanted to give the opportunity to see if there are additional benefits as well.
Mathew Mytka (14:10):
Yeah, I mean, additional benefits for this: the cost that comes downstream from this incessant firefighting - there's cost reduction, no doubt, that comes with this. When you're being proactive, you think of privacy by design and privacy engineering. When organizations are actually being proactive about this in the way that they're actually designing, developing, and engineering products and services, it might have some
(14:31):
engineering products andservices, it might have some
upfront cost.
There's a shift in approaches,particularly when this stuff is
being introduced.
But downstream, you're saving awhole bunch of money.
You're reducing risk for largepublicly listed companies.
You're reducing the risk of atrust breach.
There's lots of evidence to showthat these breaches of trust
(14:54):
are reflective of how trustworthy you are as an organization, which is a function of you demonstrating that you have integrity and being able to follow through on the value that you promise and the values that you stand for. That impacts bottom-line business metrics that you're now seeing your stakeholders, as an example, care about. You know, EBITDA and these bottom-line business values.
(15:17):
There's an economic argument for this, let alone something like the productivity impacts that likely come, which we're really interested in doing some research on into next year, with some pilot programs on the dissonance that practitioners - people within organizations - face when they know that what they say they're meant to stand for is actually not reflected in
(15:39):
what they do.
Debra J Farber (15:40):
It just makes me think of the song by Fun - "What do you stand for?" It just keeps coming into my head this entire conversation so far, so I just needed to get that out.
Mathew Mytka (15:50):
I love that.
I love that.
Debra J Farber (15:53):
What I hear you saying, too, is that this Ethical Firekeeping can help transition an Agile team from being reactive to being more proactive, which is also one of the principles of privacy by design. So, it definitely seems like something that organizations should at least find out more about - Responsible Firekeeping
(16:15):
in your ethical framework. Let me ask, why should organizations consider choosing Elemental Ethics over conventional ethics frameworks?
Alja Isaković (16:25):
That's a great question, and I think Mat already touched on a lot of the benefits related to trust, risk management, and all of these things. An important difference that I think is also worth emphasizing is that conventional ethics frameworks tend to be a top-down imposition of values and principles that are usually
(16:46):
chosen by the founders, sometimes even defined by external consultants. So, to use a bit of the language from The Agile Manifesto, which has its place in time and was a shift in narrative, the way we compare Elemental Ethics to conventional ways of doing ethics is that Elemental Ethics is essentially
(17:09):
choosing to tell living stories that grow with the organization over having posters with values that gather dust on walls. It's about awakening your moral imagination and playfulness over using language that's too abstract to be relatable and applicable. Elemental Ethics is about being useful in practice at all levels
(17:30):
of the organization over impressing stakeholders on a website and being forgotten by everyone else. Elemental Ethics is choosing something shaped by practitioners, mindful of tensions, and oriented towards action over a framework or process shaped by consultants without firsthand experience of everyday ethical tensions in
(17:51):
the field. So, Elemental Ethics is about embodying and embedding ethics at all levels of the organization as a living process. It's something that can really grow with the organization to meet different challenges, as opposed to conventional frameworks that essentially see ethics as a checklist, as a
(18:14):
static exercise that you do once. Right? Even if you've shifted left, you do it once and then it's forgotten. It becomes a static poster on a wall with a value that nobody really knows how to interpret.
Mathew Mytka (18:26):
In a Confluence page rather than on a poster on a wall - just in a Confluence space, like an internal intranet knowledge space that no one actually visits.
Alja Isaković (18:38):
Or part of HR training that you do once or maybe get refreshed once every quarter, but then you have deadlines and budget constraints that make you forget all about the wonderful good intentions that you have written on the website.
Debra J Farber (18:53):
Right, and HR - while maybe they come up with it (and it's well-intentioned) - is not expert at embedding values into engineering, so I could understand where there is definitely a big gap. So tell us, you've got this other concept called ETHOS, which is an acronym that stands for 'Ethical Tension and Health Operating System.'
(19:14):
Tell us about ETHOS.
Alja Isaković (19:16):
Yes. So, like you said, we really want to help organizations embody and embed ethics into everyday practice so it's not just forgotten as a set of values or like a training you do once in a while, and that's why we're building the ETHOS suite of apps that integrate with your existing tools. So, for instance, we'll have an ETHOS app for Slack because
(19:37):
that's where most of our conversations now happen, an ETHOS app for Jira to support software development, an ETHOS app for Figma for designers, ETHOS for Zoom, and so on. We really want to meet everyone where their daily practice happens, at different levels in the organization, and all of these ETHOS apps connect together to help you keep track
(20:01):
of all the ethical tensions, and the overall health in your organization, and to guide you towards nurturing your values and principles with action. So there's nothing wrong with having explicit values and principles. The problem is that they're usually forgotten, and nobody really understands how to nurture them.
(20:21):
This is where the ETHOS apps help you. We also really want to reduce the friction of shifting left - not just privacy, but all these different areas of responsibility - by integrating with existing tools and helping people learn and develop their Elemental Ethics skills by doing.
(20:43):
All the ETHOS apps are designed to grow with teams within your organization to meet their maturity level because, like we said, ethics is not a static checkbox exercise. It's a living process and, at the same time, we also want to make ethics fun, accessible, and something that makes you smile, which are not usually words that you associate with ethics.
(21:06):
So, to give you a practical example of what this means: in the ETHOS apps, instead of developing a traditional dashboard with charts and metrics, we are developing the ETHOS garden as a living space that your team can... anyone on the team or in the organization can visit and see how their practice is nurturing their good intentions.
(21:28):
So, a space that really reflects your embodiment of the ethics that you do.
Debra J Farber (21:37):
That's fascinating. Besides privacy, what are some of the other ethical tensions that are addressed in ETHOS and in your overall Elemental Ethics framework?
Alja Isaković (21:45):
This is entirely up to each organization, because each organization is unique; and the problem that a lot of ethics or ethics-related frameworks have is that they try to find one solution to fit them all. We intentionally design all our frameworks and apps to be really modifiable so that they
(22:08):
can meet teams where they're at, so that they can integrate with different ways of working, different processes. This is really important, I think, for any ethics tool to succeed; you really have to adapt to the people who are using it.
Debra J Farber (22:23):
Thank you for that. That's really helpful. How can a person or an organization become a Responsible Firekeeper? Can you just walk us through, I guess, a best practices approach?
Mathew Mytka (22:35):
Yeah, best practice is probably one of the things that is somewhat of a challenge, but we'll weave this perspective in here. Change happens at different levels of the organization. It's organizational and it's personal, so it's got to be bottom-up, top-down, middle-in and -out, and that's really a
(22:58):
crucial element. What we were referring to before was that top-down, exclusive aspect. That's part of the problem. If you're imposing these values, ensure that your ethical purpose, your values, your principles that you might have formally defined actually reflect how
(23:21):
people feel across an organization. In a bigger organization, there's probably more work that's actually got to be done. Just building on what Alja was saying, Elemental Ethics implicitly reflects our values and our principles that we have as an organization, particularly an ecological orientation.
(23:41):
That's reflected in the nature metaphors, as an example. There's a lot about accounting for a pluralism of these values. So, we do tend to have a fat bell curve of distribution of values. Humans tend to have very common ways of expressing what we actually care about in these moral senses - you know, like
(24:04):
social justice, and human agency, and accountability, and openness, and transparency - things like that tend to commonly come up. So, when those are getting defined, ensure that the discussions about what it actually means are reflected across the organization and people actually understand what that means. You would see the same in privacy.
(24:27):
What does that mean in context when you're building? What does that mean to the stakeholders, the communities which you might serve or impact as an organization? There's a lot that can be done on a personal level by just
going beyond this fire:
(24:40):
aiming more for collaboration and diversity within the organization; looking at research and learnings from different, diverse fields; really trying to be multi-disciplinary and trans-disciplinary; and taking time to pause and reflect - you know, daring to ask those questions, those tough questions; becoming an advocate. So, those types of practices.
(25:00):
You'll see those... there's the privacy advocates in organizations, and you need to... not everyone's going to be those, but that needs to be reflected as part of the culture and encouraged. This idea of creating the psychological safety for hunch cultures: when people feel that something isn't right, making sure that
(25:21):
that place and space is there to actually be able to... people speak up and say, "Oh hey, I think this new feature, or the way that we've defined it, is designed in a way that is taking advantage of people. We say that we value our customers' agency and their autonomy, but we're actually taking it away here."
(25:41):
" We're working against them insome onboarding process or
something like that.
That's at a CX design level,but it might also be just in the
way that things are engineered,as well, or architected.
So, your listeners might havethat experience of being an
advocate for shifting privacyleft in their organization.
(26:02):
Thinking about it like that, those best practices do start with those conversations; they do start sometimes with people being those advocates and spreading out. We also offer lots of workshops, as well; we like to support people in this process. We're building product that... we're also doing a lot of training that builds on many
(26:23):
years of experience and instructional design, particularly between myself and Alja. I helped start Australia's first entrepreneurial college and really love helping people learn. Alja has spent a large part of the past decade and more working to design courses and curricula and learning materials.
(26:43):
So, we offer workshops for teams and organizations that want to develop these elemental skills and learn how to look beyond this allure of the fire and just increase that possibility space - so that the things that you want to make the most possible, the things that you care about, become the most possible. Next year, we'll be releasing the first in the suite of ETHOS
(27:06):
apps, which will support these teams and organizations in this transformation process, in this metamorphosis from firefighting to Responsible Firekeeping, and help them to translate this high-level intent that they might have into actionable pledges and commitments, and have those nurtured in their ETHOS
(27:26):
garden. To get a better idea of this, as well, for the listeners that might work in an organization interested in this, you can read "A Day in the Life of the Responsible Firekeeper," a wonderful blog that Alja wrote that goes through 'What does it actually look like?' in a reasonable amount of detail, in a relatable story.
Debra J Farber (27:46):
I will put a link to that in our Show Notes so that it'll be easy for people to find the blog post. What it sounds like to me is that with ETHOS - the Ethical Tension and Health Operating System, the suite of apps that helps with implementing Elemental Ethics in your organization - any company defines what their values are
(28:07):
and this suite of tools will help in the proactive engagement of communication and feedback loops and the ability to speak up, and it facilitates an ongoing dialogue with these values and these ethical tensions presented to the developers within their
(28:29):
workflow so that they can surface and address ethical issues. Is that generally what I'm hearing?
Alja Isaković (28:35):
Yeah, that's a fairly accurate description. Obviously, it's way easier to understand once we have the apps out and ready to be used. But yeah, essentially, it's up to every team to decide what they want to nurture in their ETHOS garden and what they want to pay attention to; and this might change over time.
Debra J Farber (28:56):
Great. I wanted to just stress that it's not like you adopt your ethical view on things and then it's... Elemental Ethics is Tethix's view on how any organization should implement their values; but instead, it's a process: you define your own values as an org, and your approach is how
(29:16):
you deal with the tensions and facilitate that within the engineering workflow?
Alja Isaković (29:22):
Yeah, essentially, Elemental Ethics is the underlying philosophy, and we'll be using a lot of that - just the language and the metaphors - in the user interface, for example. We might have suggested starting seed sets of values or principles to make this process easier for you.
(29:42):
It's always best... you know, like the IKEA effect, when teams choose their own good intentions that they want to work towards; and it will also depend on just where they are in the organization. For example, if it's a team that's responsible for, let's say, bringing generative AI into their product, they might take
(30:03):
something like the European Commission's principles for Trustworthy AI and make those the seeds that they nurture as they figure out how to integrate AI in their product. Or, if it's a team that's tasked with improving the privacy of a product or something like that, they might have privacy as the core value that they nurture in their ETHOS
(30:26):
garden. So, yeah, it's absolutely modifiable and adjustable, as it should be, because we are not there to impose our value system; but we do provide the underlying philosophy, and the metaphors, and the language that's used in the apps.
Debra J Farber (30:45):
Yeah, it seems like it really helps with the 'by design' approaches: security by design, privacy by design,
(30:48):
ethics by design. It'll help you embed it into your Agile processes. So, I think that's really unique and I really like the approach.
I am curious, how hard do you think this would be to... what's the level of effort required to embed this into
(31:08):
engineering practices for a small but growing company versus an enterprise? I know you probably haven't done this yet, so it's just more around what you are envisioning would be some of the tensions, or how hard it would be, or how you would approach it if you were an enterprise versus a small but fast-growing company?
Mathew Mytka (31:26):
Yeah, that's a really, really good question, and it varies. We're trying to design something that is able to be adaptable and extensible depending on your organizational context. So, like Alja referenced, if you don't have formally defined ethical values in relation to the technology you're building,
(31:48):
or you're going down to principles and the guidelines for action that those principles are meant to represent, we're offering this easy way to get started by having these seeds that reflect lots of the different, already-defined and common language that's used in technology, and that might be
(32:09):
adaptations of Salesforce or privacy by design - taking those principles, the seven principles of privacy by design, and having those in what we call 'The Seeds Catalog.' When you take those, you pick and choose, and you plant them. You plant them in your ETHOS Garden and, if you're using a Slack app or it's integrated within Jira, you can
(32:31):
get started pretty quickly. That's the whole idea. For larger organizations, they might already have these values defined. So, it's about having those populated, being able to have those reflected in terms of a starting process; and, on the technical side, our kind of technology - our interim CTO, he calls it
(32:52):
'Tethix Runtime' - like how you integrate it with your existing enterprise IT fabric - that whole idea of being able to get up and get started. In a large organization, you know, 5,000, 10,000+ employees, that's obviously going to take a little bit more time because there's a lot more people within the organization that need to
(33:16):
adapt and work with this and have them included in the process. So, you might pilot it with a specific product team, for instance. It's got a... in a larger organization, it's actually got to have the authorizing environment to do this, to some degree, for it to actually translate into that larger-scale cultural change.
(33:38):
But it can also accommodate this ground-up approach, where there's people within a larger organization that are like, "We really love this!" They've got the autonomy within their business units or within their part of the business to adopt a tool like this and integrate it within their task management or project management type software, like Jira; or
(33:59):
something like Slack, as an example.
Alja Isaković (34:02):
I just wanted to add that we essentially have different experiences in working with different-sized organizations; and as we're developing these ETHOS apps, we're also thinking about how they might be used, from the smaller scrappy startups that are just getting started to the large organization. We're trying to really design something that will grow with
(34:23):
the organization, no matter what the starting size is, and essentially support ethics as a living process that changes as the organization changes, matures, and encounters different challenges.
Debra J Farber (34:38):
It seems like ethics should have its own lifecycle, the same way we have the 'software development lifecycle,' the 'data lifecycle,' and the 'DevOps lifecycle.' It seems that there should also be an ability to constantly monitor, get feedback for, and test your ethical framework as
(34:59):
well, so that makes a lot of sense. Alja, I'd love to turn our attention to talk about the ResponsibleTech.Work framework that you've been contributing to. Tell us about why we need a Responsible Product Development Framework, and what is the goal of the framework?
Alja Isaković (35:15):
Yeah, thanks for this question. I basically started ResponsibleTech.Work because I think we need practical tools that can help tech workers take responsibility for their work, regardless of job title. The work I do with Tethix is more focused on supporting organizations and teams. ResponsibleTech.Work
(35:35):
is about inspiring and empowering tech workers who might not be working in an organization that's fully on board with the whole ethical, responsible thing. So, it is a pretty small open source project that essentially anyone can use to learn more about different areas of responsible product development.
(35:57):
Using familiar language and talking about responsible product development or responsible AI - it's such a big word that encompasses so many things; and it's really hard for any one person to become an expert in privacy, ethics, sustainability, accessibility, and all these other areas that
(36:18):
essentially fall under responsible product development. I think it's still important to have, for anyone that works in tech, a broad awareness of these different topics and questions that you should be asking. So, with ResponsibleTech.Work, we want to introduce all these aspects in a simple way without using complicated language. That's often
(36:40):
the case in academic research or a lot of the tools that are out there. Even responsible tech reports use a lot of really complicated language.
The focus with the framework, or this open source project, is essentially to offer practical tools that can help you in your day-to-day work, not just learning about an abstract ideal.
Debra J Farber (37:03):
Awesome. What are some of the core elements of the framework, and how does it help guide responsible product development?
Alja Isaković (37:12):
If you go to our website - and the name is the URL, to make it easier to find - you'll find pledges that can inspire tech workers to act and start conversations in their workplace about all these different areas of responsibility. We have resources on methodologies that can help you evolve your existing practice, and tools that can help you act
(37:35):
more responsibly every day. The main tool we've developed so far is called PledgeWorks, which introduces pledge writing into existing processes as a way to seek better outcomes within a given context. Like all my other work, PledgeWorks is also not prescriptive, and pledges are intended to be adapted and
(37:58):
modified to fit different contexts and processes. As Mat was saying before, we are also using pledges in a lot of the work we do with Tethix. For instance, in the ETHOS app, we'll be supporting pledge writing because we really think that writing actionable commitments within a context and a limited time frame is a great
(38:19):
tool to help you think about how you'll translate your good intentions into concrete action. So, here we're coming full circle to where we started this conversation: to the intent to action gap. As you can see, Mat, myself, and our diverse collaborators from around the world are really passionate about approaching
(38:40):
this problem from different angles to support both organizations and individuals that really want to do good in the world. It is our core belief that people want to do good. They just need the right environment, the right support, the right language, the right stories. So, the intent to action gap is the problem worth solving for
(39:02):
us, and it's how we really want to use our diverse expertise and knowledge to make a difference in the tech industry.
Debra J Farber (39:09):
I love it. I think that's awesome. I love the intent behind it, and I definitely am going to be paying attention to Tethix's development over the next several years. I'm really excited for what you're bringing to market and how that's going to help many teams and, ultimately, the customers. The outcome of these products is going to be better products
(39:30):
and services for individuals, so that makes me excited. If listeners want to delve deeper into this concept of Responsible Firekeeping and learn more, where should they go? What should they read? What do you recommend?
Alja Isaković (39:44):
We talked about a lot of things today, so if any of this resonated, we invite listeners to visit our website at Tethix.co, where they can browse our workshop catalog to find workshops on Responsible Firekeeping and other topics, and browse our other offerings. If they want to keep track of what we're developing and
(40:08):
thinking about, they can subscribe to the Pathfinders Newmoonsletter and our Substack publication. We also hope they'll send us an email if they want to visit our virtual tea garden and have a chat about how we can help you transform the way you think, talk, feel, and do Tech Ethics.
Debra J Farber (40:27):
I just want to be part of a virtual tea garden. That sounds like fun.
Alja Isaković (40:30):
Yeah, you should visit. You should visit. We have a blog post and video with a tour of that. It's how we do our meetings. We don't usually use Zoom, because we also experiment with virtual spaces to help with learning and memory encoding and bring a bit of playfulness into the workspace.
(40:51):
So, all our meetings happen in our virtual tea garden.
Debra J Farber (40:55):
Awesome. I love it. Do you have any words of wisdom that you'd like to share with the audience before we close today?
Mathew Mytka (41:00):
I guess, as Alja had mentioned, tech ethics - it's intimidating. So, the important part is just to start. We launched a series of challenges on LinkedIn just recently, as an example, called Tethix Pathfinding Seeds, that challenge you to practice your elemental skills in small but accessible ways. Those tiny steps have compounding effects.
(41:23):
So, just getting started with these things is really important, rather than waiting for permission, or waiting to have a clear view, or to have your purpose, your values, your principles formally defined in detail. Just getting started. Making time to slow down, sit down around the campfire, and bring your own tea garden, whether physical or virtual.
(41:45):
Tell stories. Learn from each other. Most people care about this stuff. These are probably some of the most important questions and discussions for us to be having at this point in time, considering the impact that our technology has on the world around us and our role as a species.
(42:07):
So, just start; have these conversations; learn; grow; and that'll compound over time.
Debra J Farber (42:16):
Well, thank you so much, Mat and Alja. It's been a pleasure. Thank you for joining us to talk about Ethical Firekeeping and Responsible Product Development. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest, or guests. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com,
(42:37):
where you can subscribe to updates, so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado:
(42:51):
the developer-friendly privacy platform and sponsor of the show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.