Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hi, I'm Josh, and welcome to Ethical Tech Innovation, a value based engineering podcast.
(00:14):
In this episode, I'm joined by Sarah and Mario to discuss the basics and background of value
based engineering.
I'm your host Joshua Roe.
And in this episode, I'm joined by two co-founders of the VBE Academy: Professor Sarah Spiekermann-Hoff,
chair of the Information Systems and Society Institute at the Vienna University of Economics
(00:35):
and Business, and Mario Tokarz, founder of RightMinded AI.
So to kick things off, I'd simply like to ask you: what is value based engineering?
Well, value based engineering is a method which tells innovation units and organizations
(00:59):
how to design their systems in a value based manner and hence in an ethical manner and
hence in a good and right way so that customers and users of the systems are happy in the
long term and that the system is sustainable and also legally compliant.
That's value based engineering.
(01:21):
Yeah, and maybe if I can add on this, I think from let's say corporate or business perspective,
it's always very important to have methods which are rather clear cut and which kind
of give you this step by step approach so that you know that if you want to do something,
you know exactly how to achieve it.
(01:43):
And a lot of people talk about making products better or more ethical, but then as a business,
the question is, how could you make that claim?
You work on something for, I don't know, three, five years and afterwards you say it's more
ethical than something else.
And there needs to be something behind it that you actually do, which is tangible and
very clear cut that you can base that claim on where you can explain to your customers,
(02:06):
this is exactly what we did and why we think we did a good job in that.
And this is something that value based engineering delivers as a method.
Could you say a bit more about how this project came about and how this method was developed?
Yeah, it was actually sometime around 2015, when everybody knew already that AI was standing in
(02:28):
front of our doors, that the IEEE organization launched a huge initiative.
It was called the Ethically Aligned Design initiative.
And then they said, let's try to build standards, international global standards on how to go
about ethical engineering.
And they launched the 7000 series, and value based engineering is basically the IEEE 7000
(02:52):
standard, the baseline standard.
And it was all driven by IEEE and the volunteers from around the world who helped build that
standard.
Yeah.
Thank you.
So then the other question is, so we understand now a bit more about what value based engineering
is.
I've also introduced the value based engineering academy, or the VBE Academy.
(03:17):
Could you say more about what that is and what its role is in relation to value based
engineering?
Well, sure.
Yeah, I can say a few words on that.
So I mean, as I outlined, there's different challenges that you might face in a company.
So Sarah talked about standardization of this method a bit, and that was a challenge in
(03:40):
itself, maybe not for today, but there's a lot of good stories she has on how standards
come to life.
But then as a company, you may have a method, but it's not only to know the method, it's
also to know on how to apply it.
So you need to basically train people, get experience, get started.
(04:02):
Nothing that you do in life will work out just perfectly on first try.
So you need to have that knowledge and that method kind of accessible, so to speak.
And the VBE Academy, as the name says, is an academy.
It's very much about helping companies to access value based engineering, which might
be through training or it might be through the case studies that we do and we publish
(04:25):
and that we collaborate on so that people can actually learn how value based engineering
can be applied.
Then every method, of course, needs to be fit into the way that you work and every company
is specific.
And that's why we felt that it was very important to create an institution, a corporation, if
(04:46):
you will, in that sense that can work with other corporations and help them to access
that knowledge and to make it usable for their own businesses.
And that's the main goal of our VBE Academy: to help people to access VBE in terms of
knowledge and to apply it to their own domain and use cases and products.
I'm always saying that there is a standard, let's say a standard like how to build a
(05:11):
mobile network or how to build a 5G network.
And you can have a standard on paper, but if you're not trained on how to use it, you
won't be able to actually build that network.
You have to learn how to apply the standard.
(05:33):
And this is actually the same for the 7000 standard.
It's not that you just build an ethical and compliant system out of nothing.
So you need to be an expert in how to do that.
And we are educating those experts.
Yeah, I mean, it's a bit like we are here close to the university.
You know, it's not that somebody puts a book on your desk and says, read this and in five
(05:56):
years you can have your master's degree.
But it's about having a professor that explains the subject matter to you.
It's about you doing your own studies, of course.
And it's about maybe having a tutorship or also exercising.
And this is no different for value based engineering, only maybe with the fact that we understand
that companies need an efficient way of consuming that method.
(06:19):
You cannot send all your employees back to university to learn a new method.
And that's why with VBE Academy, it's very much about creating programs and offerings
which actually are easily consumable by companies and by their employees in order to learn about
value based engineering.
Of course, you still need to learn it.
I mean, that's something that's a given.
(06:41):
But I think that's also pretty straightforward as a thought that it's also a bit of an investment
of your time.
But we try to make it as efficient and as well tailored to the broader, let's say, business
community as we can.
So thank you.
Yeah.
So I wonder, could you tell me a bit more about the method or maybe to start that, maybe
(07:02):
if I could kick us off with the question of what does it mean to be value based?
What is a value?
And I think that's a very basic question.
What are values and what does that mean in this context?
I mean, a lot of our values are enshrined in the law, for example, privacy or freedom or
equality.
These are all principles that are in the international conventions and the human rights declaration.
(07:30):
So this is what we aim for, that value based systems are catering to the EU Charter of
Fundamental Rights, for instance.
And that's where the values come from.
But also, sometimes there are issues, for example, human addiction to social networks,
and you won't find this kind of value infringement in the law.
(07:55):
So what we do is basically we go a little bit beyond the law.
So with value based engineering, when you know how to do that, you comply with the law, but
you also have on your radar that potentially your system implies risks that the regulator
might not have seen yet.
(08:19):
And this is what we do.
And values, from the way we look at it philosophically, are basically what humans
perceive as being good for them, but also what needs to be protected
(08:40):
if it is infringed upon by technology.
Maybe if I can add, so as you said, initially, I've worked for car companies a long time
in a big corporation like BMW and then also in smaller startups.
So I have more of a technical background and I maybe didn't look so much into philosophy
(09:00):
before I was in touch with VBE.
And what I learned is that in very simple terms, I would say values are things that
matter to humans.
And I noticed doing the work with value based engineering that it is very difficult for
us to even communicate on what is important.
(09:24):
So we know some topics such as privacy, for example, and this has a very high, let's say,
that resonates with a lot of people in our societies that privacy is something that is
worth striving for.
But I think very often we see also that people are afraid of new technology.
And this is maybe because they fear that if they use this, it might be addictive, it might
(09:45):
be dangerous for their children.
And all of this we can also put in a value language so we can say, well, child safety
is important, privacy is important, freedom is important, right?
Addiction takes your freedom from you.
You basically are kind of hooked to something and you cannot really live your life as you
want.
And so what I learned is that values make this a bit more concrete.
(10:08):
It's a tool for us as humans to align on what we think is important.
And I always give this example about the company Nokia, which doesn't exist in the shape anymore
that it maybe did 20 years ago, because I worked with people who had just basically
got out of Nokia when they shut down most of the company, if you remember, in the early
(10:29):
2000s or around 2010, I think it was around that time.
And they always thought that a cell phone needs to be robust.
So they would put it on a string, they told me, and they let it drop on the floor all
the time, right?
To see whether it's robust, because they thought people value robustness.
And then comes the iPhone and you know, you drop it and it breaks and still everyone kind
(10:52):
of buys it.
Everyone wants to have it.
They wanted beauty.
Yes, they wanted beauty.
Exactly.
And I think there was a real misconception there, where Nokia failed to understand how people
weigh different values when they see them in a product.
And I think in short, and sorry, that was not the shortest version of it, but I hope
people get this idea that values are a way for us to talk about what matters in life
(11:19):
and what matters also in a product.
And sometimes if the values don't line up, right, maybe you cannot have all of at once,
then it's important to know how you decide on what matters more and maybe also on things
that you don't want to see in a product, right?
That could also be the consequence that you say something is so important.
If I can't have it in a product, I would rather not take the product at all.
(11:41):
And I think this is so important to value-based engineering to understand that it's more than
just a process.
It's also a way of thinking and it's also a language that you need to use in order to
even align on what is important for your product and for your customers and maybe also for
our societies.
Perhaps something to add on AI at this point.
(12:01):
We recently did a huge study with our national employment service, which has a GPT-based recommendation
system for anybody in Austria who's interested in a new job or an educational program.
And what we did is we applied VBE's value language to the dialogues of this GPT-based
artificial intelligence.
(12:23):
And what we found is that we can actually very well analyze those dialogues for corporates
and for the national employment service.
For instance, it's really important to look at coherence as a value.
Is the AI's response coherent?
(12:44):
Or is it polite?
Is it accurate?
Is it consistent with what the user asked?
So we developed a value monitor with a lot of categories and fine-grained value qualities,
as we actually call them in value based engineering.
(13:06):
And we created for AI providers a way to understand how good their system is from a user perspective.
And so this is hugely important for organizations.
So it's not only to comply with the law or to build a beautiful product or to build a
(13:30):
product that consumers like, but also for organizations to have control over the value that they provide
through their technology to the customer.
This is also part of it.
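The value monitor described above, which scores an AI's dialogue turns against fine-grained value qualities, might be sketched roughly like this. The quality names, the 0-to-1 scale, and the simple averaging are all illustrative assumptions; the actual categories and metrics of the study are not detailed in this conversation.

```python
from dataclasses import dataclass

# Hypothetical value qualities a monitor might score per AI response;
# the real monitor's category list is an assumption here.
VALUE_QUALITIES = ["coherence", "politeness", "accuracy", "consistency"]


@dataclass
class TurnRating:
    """Scores (0.0 to 1.0) for each value quality of one AI dialogue turn."""
    scores: dict


def aggregate(ratings):
    """Average each value quality across all rated turns of a dialogue."""
    totals = {q: 0.0 for q in VALUE_QUALITIES}
    for rating in ratings:
        for q in VALUE_QUALITIES:
            totals[q] += rating.scores.get(q, 0.0)
    n = max(len(ratings), 1)  # avoid division by zero for empty dialogues
    return {q: totals[q] / n for q in VALUE_QUALITIES}


# Two rated turns of one dialogue (numbers are made up for illustration).
ratings = [
    TurnRating({"coherence": 0.9, "politeness": 1.0, "accuracy": 0.8, "consistency": 0.7}),
    TurnRating({"coherence": 0.7, "politeness": 0.8, "accuracy": 0.6, "consistency": 0.9}),
]
report = aggregate(ratings)
```

A provider could then read `report` as a per-value summary of how the system performed from the user's perspective across the dialogue.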
Thank you.
There is one thing that you don't seem to have mentioned, and that's economic value.
So I guess the other question would be is value based?
(13:54):
Does that mean economic value based?
That's a very good question.
You've worked longer in corporates than I did.
But I'm looking currently into a new definition, economic definition of the so-called value
proposition.
(14:14):
In economics, we actually talk about aggregate value in German Wertschöpfung.
So I'm currently developing a new aggregate value theory.
But that value theory goes beyond our original economic arguments, where value was almost
always equated with money, with the willingness to pay of the customers, and where market equilibrium
(14:43):
was a matter of where demand and supply match in terms of willingness to pay.
And based on this value theory behind value based engineering, I'm actually developing
a new model.
And my argument is that what we need to do to measure the economic success of our innovations
(15:10):
and of our products and services, we have to look at the aggregate human and social value
costs and value benefits.
And just like in this dialogue system where we look at the dialogue qualities, value qualities,
we add them up, what's positive, what's negative, and we come up with a new kind of equilibrium
(15:36):
that integrates basically human and social perception.
And the challenge, of course, that I have with this new theory of aggregate value is
that the question is, of course, how can we measure it from an economic perspective?
And here certainly more work needs to be done.
(15:58):
But the long-term economic view that I defend here is that we need to go away from the original
measurement of value in economic theory.
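The aggregate-value idea described here, adding up perceived human and social value benefits and costs to arrive at a kind of net figure, could be toy-modeled as follows. The categories and the signed numbers are purely illustrative assumptions, not part of the theory itself.

```python
# Toy sketch: each entry is a stakeholder-perceived value effect of a system,
# positive for a value benefit, negative for a value cost. The names and
# magnitudes below are invented for illustration only.
value_effects = {
    "privacy": -0.4,        # perceived value cost
    "convenience": +0.7,    # perceived value benefit
    "transparency": +0.2,   # perceived value benefit
    "addiction risk": -0.3, # perceived value cost
}

# A crude "equilibrium" reading: do perceived benefits outweigh perceived costs?
aggregate_value = sum(value_effects.values())
```

As the discussion notes, how to measure such effects rigorously from an economic perspective is exactly the open question; this sketch only shows the summation idea, not a measurement method.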
Yeah.
As you said, I'm not the big professor.
So I think one thing that I usually say, and I don't think we should talk politics, but
(16:23):
I think the question of where we are heading with our societies and with the technology,
it is certainly one of the biggest questions of our time.
So engineers like myself, when I came in, to be quite honest, I was a bit like, "Nah, I
don't want to do politics.
I just want to know how to solve it."
But then you have to be fair and also acknowledge that there's very big questions of how our
world ought to be.
(16:44):
And we see that also in how the global political landscape is shifting.
So when Sarah talks about how value could be also measured in the future, I think we
should all also connect to ourselves and think about what's the world we would like
to live in.
If you work in a company today, you might not have the luxury of thinking about the big
(17:06):
question.
You might be more tempted to also think like, "Yeah, how can I maybe make a good product
and how can I also, let's say, really mean it in the sense of I wanted to have something
that is socially and ethically responsibly built, and yet I'm also constrained by how
(17:27):
the world is and maybe not part of the discussion at the moment of how the world ought to be."
So in that sense, I say you shouldn't necessarily try to say that today's economic order in
the sense of something needs to create revenue is totally incompatible with the idea that
values matter and you would like to design a good product.
(17:49):
I made this example with the iPhone where I said, "Well, it was beautiful and people
were willing to, let's say, sacrifice values like robustness in the sense of when it falls
down it breaks to have a beautiful product."
So obviously, Apple did a great job, right?
So they found a new value proposition nobody had and they built a product which is economically
(18:11):
extremely successful to the day.
So in that sense, you can actually do both.
You can look at values and you can make an analysis and you can build products which
are economically successful already today.
And so when I talk to customers, I always say, "Well, there's two types of motivations
today.
You can say it's only compliance, I need to follow the law, and the law is also, I think,
(18:36):
in a lot of areas right to look more closely at what companies are doing, for example,
with AI.
But you can also say, "Well, I need to be compliant.
I need to look at the law."
Yes, but maybe there's even more I can do and maybe looking at values can even help
me to understand better how I can design a great product.
And in a lot of scenarios, you actually need to have other people liking your product.
(18:59):
And it's not only the end customer product we talk about, it's for example, when you
change the way your company works and you want to roll out new processes and there's,
I don't know, 50 people who work in a certain plant or site and they do something for your
company and now you bring in a new system, just imagine everybody hates it.
It's like, "No, this is bad.
It will destroy my workplace.
(19:21):
I'm worried about this."
This will make your project fail 100%.
You cannot bring change to your own company as one example without having the buy-in.
And so having a dialogue and understanding what matters to your employees, that's something
where you don't need to talk about compliance.
It's in your own best interest as a company to look at these value trade-offs and how people
(19:44):
feel about your product and about your processes.
And I think we should really say it's both.
It's us thinking about the future, like what world do we want to live in?
But also today, there's a lot of things where I would say, "Hey, have you thought about
how to design that product in a way that your customers, your employees, your people will
(20:05):
actually love it and how do you do that?"
And I think value-based engineering can also help us with that.
I mean, you know that when we've been conducting projects, we very often get a lot of positive
feedback from union representatives, for instance.
And now we also work on ten projects.
You've actually run one of these projects for the city of Vienna.
(20:27):
The City of Vienna has been considered the most beautiful city in the world for a very long
time, and also the best one in terms of quality of services.
And they want to bring this also to the digital service provision.
So they work with value-based engineering now.
It's called the Vienna method, basically, because the output of the digitalization process
is that citizens love it.
(20:50):
So yeah, so we really are about that.
Thank you.
So I guess the other part of it, so I've asked about the value-based, what does that mean?
I guess the other question is, what comes under engineering in this sense?
We've talked a lot about AI.
That seems to be very popular at the moment.
(21:13):
Is it just about developing AI systems?
Is there more than that?
Well, normally I say no, it's more than that.
I think the best example I can give, usually I think everybody can relate to this, is the
like button in social media where people just see you post something and you see how many
people like that.
(21:35):
And you know that there's a lot of people who get super anxious when they post something
and nobody seems to react to it.
And I always say the like button, it's just a counter.
It's not in the first instance AI, it's just you seeing how many people interacted with
your content, sometimes also with your friends.
So you know that they have seen it.
(21:55):
So also on social media everyone sees it; on WhatsApp, for example, everyone sees your status.
And if people don't like it, people might feel very insecure or it might actually cause them
real emotional pain to see that.
And so I think it's not about only AI, but it's all technology which can reach out to
so many people in such a short time.
Usually I say if it is AI, it almost certainly has an ethical implication, but it doesn't
(22:20):
necessarily need to be AI in the technical sense of the term.
I would rather say it's about scalable technology, which reaches a large number of people in a short
time and then you will almost always have this ethical implication or at least it's worth
thinking about how it could affect people.
I mean, there are also two things to your question.
(22:42):
One is, and I agree with Mario, that it's not only AI, because very often AI, say a large
language model, is just a small component within a much bigger system.
So as an engineer you might just look at that little component, which is important, but you have
(23:04):
to look further: as an information systems scholar, I know that systems are socio-technical systems.
And in information systems work, we also look at processes and perhaps in one workflow component,
you will have an AI system.
But then what about the other parts of the process?
So value-based engineering is always starting its analysis and its work with the system
(23:31):
of systems.
It looks at the ecosystems, it looks at various suppliers who all together make a service work
for the customer.
So that's one thing.
And it's correct that you also ask us, "All right, why is it engineering?
Why didn't we talk about value-sensitive design?"
(23:51):
That was an option that we had when we gave the name to the child, to this IEEE 7000 project.
And it's because what we do here is we do not only think about human values and we do
that morally and philosophically and so on, but we translate the value expectations of
users and direct and indirect stakeholders into system requirements.
(24:16):
So we go all the way down from a vision and need and risk and so on to the actual settings
in the system.
So we are helping engineers to build better systems because we tell them, "You know, you
have to think about your features and also how these features are designed."
(24:41):
You look at the small things.
Engineering is about the small things.
If the small things go wrong, then the whole system can collapse.
And this is why we call it engineering also, because we are able to go from a wonderful
vision to the small things.
Yeah.
(25:02):
So I'm a software engineer by training, right?
So I can tell you my personal take or why I also came across value-based engineering
in the first place.
So at some point when you do something like developing software, for example, or in any
type of engineering, at one point there is an actual person who types in things into
(25:26):
a computer and that is what basically then is your product, right?
So if you, for example, have a software engineer, they type code and this code gets executed
on your phone.
So at one point where this person writes the line of code that makes, I don't know, your
screen red or green or blue, there is an actual decision to be taken.
So that person really needs to know, as a simple example, what color my app should have.
(25:50):
And in the same sense, if you talk about all the analysis that you did and the consideration
you made from a moral perspective or based on the values that you discovered with your
stakeholders, at some point you really need to say, "Is it now left or is it right?"
Because there's a person who needs to actually put that into the system.
And this is why when I was exploring which solutions could be there for building...
(26:17):
RightMinded AI.
RightMinded AI as a company, of course.
I was very keen on that because if you come down to requirements in the end, requirements
is something that companies can work with.
And if you have a requirement that just says how you should do things, it does not matter
at that point why that requirement is the way it is.
(26:37):
As a technical person, you can say, "Okay, there is the requirement.
I know people have thought about it and if I wanted to, I could go all the way back and
understand why it is the way it is."
For example, go to YouTube.
Just recently they changed how they show you the counts of likes and dislikes.
They used to have this thing where they showed you how many people liked it and how many
(26:59):
people disliked it.
And they changed that.
So now you can only see how many people liked your videos.
You can't see how many people dislike it.
And at this point, there is a requirement where some technical person did it just the way
it looks like now when you go onto their website.
And this is the great thing about value-based engineering.
When you implement your product, it's basically not so relevant why the requirement is in
(27:22):
place, but you can work with that requirement and you can build your software and you know
that by doing that, you will also have followed the ethical consideration because they are
baked in there.
You could also say it the other way around: the ethical considerations
you made are baked into the requirements.
And that makes it super applicable and executable and you can just use your normal machinery
(27:42):
as a company and implement requirements.
And that's what engineers have been doing for, I don't know, decades, centuries maybe
even.
So that's a very well-known process how that works.
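The traceability idea described above, where the implementer works from a clear-cut requirement while the ethical rationale stays recoverable behind it, can be sketched as a tiny data structure. The field names and the YouTube-style example requirement are illustrative assumptions, not IEEE 7000 vocabulary.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ValueRequirement:
    """One system requirement that traces back to a stakeholder value."""
    req_id: str        # what the engineer picks up and implements
    statement: str     # the clear-cut, executable instruction
    derived_from: str  # the stakeholder value it traces back to
    rationale: str     # why it is the way it is; not needed to implement


# Hypothetical example inspired by the YouTube dislike-counter discussion.
requirements = [
    ValueRequirement(
        req_id="REQ-042",
        statement="Show only the count of likes; do not display a dislike count.",
        derived_from="emotional well-being of content creators",
        rationale="A public dislike count was judged to enable pile-on harm.",
    ),
]


def trace(req_id, reqs):
    """Walk back from a requirement ID to the value that motivated it."""
    for req in reqs:
        if req.req_id == req_id:
            return req.derived_from
    return None  # unknown requirement ID
```

The engineer only needs `statement` to build the feature, but `trace("REQ-042", requirements)` lets anyone go all the way back to the value behind it, which is exactly the "baked in" property described above.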
Thank you very much.
Yeah, unfortunately we're out of time right now.
I guess if you could just say one more thing.
So if you're listening to this and you're watching this and you're interested, what
(28:06):
further steps could you do to get more engaged with value-based engineering?
Well, I mean, I'm actually spending a lot of time right now to record all the wonderful
lectures and case studies that I did over the past years.
And yeah, we ramped up a program where we have a baseline value-based engineering course
(28:29):
and then if people really like it and they want to be professional, they can also take
extra education in terms of digital ethics.
That's a little bit of it.
I can tell more in another podcast.
Yeah, so our website is vbe.academy.
So it's very simple to remember, hopefully.
And I think that's a good place to get started.
(28:51):
And also we are on LinkedIn on social media in that sense.
So if you wanted to learn more, there is information available.
If you want to get in touch and see how things might be useful for you, we're also
happy to hear from anyone who wants to reach out to us.
So yeah, ample opportunity hopefully to get connected and we're looking forward to it.
(29:13):
[Music]