
August 19, 2023 · 24 mins

In the race to dominate AI, we have seen our data privacy, democracy, and even our human rights impacted.

To understand what brands and consumers need to do to fight back, I spoke with Anton Christodoulou, Group Chief Technology Officer at leading experiential design company Imagination, and co-founder of the new Trust 3.0 initiative.

The Trust 3.0 initiative is a data privacy advocacy group convening the brightest minds in privacy, AI, and technology to champion responsible innovation for a safer society.

I’m proud to say that I’m also a part of this timely initiative.

Anton is responsible for overseeing Imagination's global technology strategy and its project and service delivery execution, delivering immersive, engaging and measurable experiences for clients including Mastercard, Ford, Major League Baseball, Jaguar Land Rover and Shell.

I started our discussion by asking Anton more about Trust 3.0 and why it has been set up.

This episode looks at AI's implications for the data privacy of consumers and businesses, and offers a deep dive into the necessity of transparency, security, and a fair exchange of value in handling customer data.

We also discuss how businesses can use responsible data practices to promote trust and security.

More on Anton
Anton on LinkedIn
Imagination website
Trust 3.0 website

Thanks for listening to Digitally Curious. Pre-order the book that showcases these episodes at digitallycurious.ai/pre-order

Your Host is Actionable Futurist® Andrew Grill

For more on Andrew - what he speaks about and recent talks, please visit ActionableFuturist.com

Andrew's Social Channels
Andrew on LinkedIn
@AndrewGrill on Twitter
@Andrew.Grill on Instagram
Keynote speeches here
Pre-order Andrew's upcoming book - Digitally Curious


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to the Actionable Futurist podcast, a show all about the near-term future, with practical and actionable advice from a range of global experts to help you stay ahead of the curve. Every episode answers the question "What's the future of...?" with voices and opinions that need to be heard.

(00:23):
Your host is international keynote speaker and Actionable Futurist, Andrew Grill.

Speaker 2 (00:30):
So in the race to dominate AI, we've seen our data privacy, democracy and even our human rights impacted. Maybe you could tell us a bit more about what Trust 3.0 is and why you've set it up?

Speaker 3 (00:39):
Trust 3.0 really came out of a gap in the market in terms of how data privacy is being addressed in the current landscape. There are a number of privacy advocacy groups helping consumers protect their data. What we really believe is, in order for this to be implemented in the way that it needs to be, we need to make this accessible

(01:01):
and practical for brands particularly, but companies generally, that are dealing with, I would say, huge amounts, but arguably any amount, of customer data, and ensuring that they are doing it in a way that is transparent, fair and essentially enables a customer to feel comfortable interacting with the service. And I think the key here is, although it is absolutely for

(01:25):
the end customer, there are real benefits to brands in ensuring that the data is stored, managed and made as transparent as possible, from a value-exchange and a marketing perspective as well, and even from a security perspective, because if the data is handled and managed in the right way, it actually makes it much harder for bad actors to get access to that data as well.

Speaker 2 (01:48):
You've set this up as a charity organisation, and you've got some really interesting people involved in that. There's a bunch of regulation around this already, so what's the need for this? You see a gap in the market. Are regulators not doing enough? Are brands not doing enough? How can you possibly, as a third party, influence some of these big brands and government decisions?

Speaker 3 (02:04):
You've got two challenges. You've got the consumers on one side, who, predominantly, are potentially scared. They want access to the services, but don't really understand the implications. What we've seen in the last decade or so is that consumers have essentially just willfully given their data over without really understanding what the impact is, but are becoming much

(02:24):
more aware and much more concerned. On the business side, businesses are being hacked, and so there are organisations and certifications such as ISO 27001, which enables you to deal with the security side. You've got B Corp, which enables you to address sustainability,

(02:44):
which, again, even 10 years ago, there were only a few very large organisations that were really taking sustainability seriously. However, there isn't an organisation that's specifically looking at working with companies or brands to ensure that the data privacy side of things is properly handled.
At the moment, it's being very much left to individual brands

(03:05):
and companies to decide how they deal with that. If you look at someone like Google, who have become much more transparent but essentially live on customer data, versus someone like Apple, who have taken a very stringent data privacy approach but also use that as a marketing tool.

Speaker 2 (03:21):
So where will you interact, where will you interface with the brand side of the world and the consumer side of the world?

Speaker 3 (03:27):
The plan really is to primarily work with brands, which is what we're doing now, and to help them understand what frameworks and approaches they need to implement. So, although this is ultimately solved through some form of technology infrastructure, the focus really is on identifying what the frameworks are.

(03:48):
What is it that you need to implement in order to ensure that your customer data is stored in the right way, it's controlled in the right way, and that there is transparency around the way that that data is being used? And then, once you've built those frameworks, depending on what the particular service they're providing is, looking at how you implement a technology framework

(04:10):
that enables that to happen, not at a kind of regulatory level, but basically within the layers of the services that you're using. So obviously, there's still a layer of ensuring and making sure that those services are correctly implemented, but once they are implemented, brands can feel a lot more confident, a lot more comfortable that they are

(04:31):
managing and using customer data in the right way and, more importantly, they can then start talking about that to their customers to say, you know, if you are interacting with us, if we're providing a service to you, we are doing it in a way where we are protecting you, we're protecting your data. How far they take that is entirely up to them. They could go for a completely decentralized and anonymized

(04:54):
approach where you're able to consume the service without the company that's providing the service having any knowledge of you or what you do. The most extreme example would be something like Signal, which obfuscates literally every step of the process, down to where a call is made, when it's made and for how long, as well as, obviously, all the data that's required and the

(05:15):
conversations themselves, versus someone like WhatsApp, which actually stores all of that data except the conversation, versus just being more open about how you use that data and whether you share it with third parties. And we're kind of at that point at the moment: most companies now have some sort of privacy policy. They're encouraged to make that privacy policy as clear as

(05:36):
possible, although, you know, most people still don't read it or even understand it, however clear it is. And even if they do, typically, as best as you can tell, the main question is: are they sharing it with a third party, or are they just using it for their own purposes? This takes it a lot further and makes sure that they're only essentially using the data they need, potentially only using the data at the point that they need it and, as I say, it may

(05:59):
even be that the data they're using is completely anonymized, so they're using it purely at that point, to provide the service, and then, essentially, they'll just continue down that path. What they provide as a value-add is the service itself, not the mining and exploiting of that data in order to use it for any other purposes.
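
Anton's point about only using the data you need is, in practice, a data-minimisation filter sitting between collection and use. Here is a minimal sketch in Python of what that can look like; the field names and purposes are hypothetical, for illustration only, and this is not Trust 3.0's framework.

```python
# Purpose-based data minimisation: each purpose sees only the fields it
# is allowed to, so extra data never reaches a service that doesn't need it.
# All names here are hypothetical.

ALLOWED_FIELDS = {
    "order_fulfilment": {"order_id", "items", "collection_time"},
    "payment": {"order_id", "payment_token"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Return only the fields the stated purpose is allowed to see."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

customer_record = {
    "order_id": "A123",
    "items": ["flat white"],
    "collection_time": "09:40",
    "payment_token": "tok_abc",
    "email": "andrew@example.com",      # never leaves the minimiser
    "location_history": ["Sheffield"],  # implicit data, not needed here
}

print(minimise(customer_record, "order_fulfilment"))
# {'order_id': 'A123', 'items': ['flat white'], 'collection_time': '09:40'}
```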

Speaker 2 (06:16):
How would you describe Imagination? Are you a marketing agency?

Speaker 3 (06:20):
We're an experiential design company. We have been designing and building experiences for brands for over 50 years. We started predominantly in the automotive space, to some degree. We launched the original events at the Millennium Dome, the O2. We did the New Year's Eve celebrations for Sydney for nearly a decade. We launched products and services for a variety of

(06:40):
different companies. We run large physical events as well as physical, digital and virtually integrated events. So, for example, recently we launched the new European-made EV for Ford, which was a purely virtual event using pixel streaming to enable people to drive the new car

(07:01):
before it's actually available to the public, and be able to order it. So, whether it's a physical event through to a completely virtual event, we are handling our clients' customers' data, and we're often using that data to create personalized services.
Our focus is always, from an experiential design perspective,

(07:22):
how can you create the best possible experience for the end customer of that service?

Speaker 2 (07:27):
Through your close work with brands, you've realized that your own clients need to be assured on privacy and trust. Is that why you've gone ahead and created the Trust 3.0 initiative?

Speaker 3 (07:36):
We have always taken how we handle customer data very seriously. We have applied, and do apply, the various regulations around the world, like GDPR and CCPA. That's, for us, a baseline that we would do anyway. What we're looking at now, being aware of where the market's going, is ensuring that when we're working with clients, in the same way as in the early days

(07:59):
of the sustainability conversation, as soon as they come to us and say, actually, we want to build this with a slightly different framework. Maybe they want to launch a product or a service that actually focuses on data privacy, or they want to make it a key tenet of their marketing strategy, as Apple has. We have the ability to, one, obviously provide the expertise,

(08:22):
the training and, ultimately, if required, the certification. We may certify ourselves, but we may also work with brands that want to be able to do that themselves, as well as our own platform, which handles all of the customer data for all of our clients. We want to be prepared for the future, adapting in advance of what's coming, rather

(08:43):
than being reactive.

Speaker 2 (08:45):
You mentioned decentralized systems before. I'm really interested in decentralized identity, also called sovereign identity. So where will the decentralized identity platform, where consumers control their own data, fit? And will platforms like this give consumers back the power of handling and managing their own data when it comes to trust and privacy?

Speaker 3 (09:02):
One of the reasons that I got involved with, and became one of the co-founders of, Trust 3.0 is that this is, for me, a decades-long passion around decentralization. Decentralization, and the way in which it empowers users, has been something that's been talked about for probably at least 20 years, but certainly the last 10 to 15.

(09:23):
However, we weren't in a position to do it in a very broad way; the internet itself was still dealing with much simpler challenges, like how do you stream a video to a browser, which obviously we all do and enjoy now, but that took a surprisingly long time to fix. Consumers generally don't necessarily want to own the data themselves. The analogy we used recently is: you might want to rent a car.

(09:45):
You don't necessarily want to own the car, but you do want to know that the car is insured and that you can drive it safely. I think a lot of consumers don't necessarily want the hassle of total ownership, although they should definitely be given that option. What they do want is to know that the data is being handled in a secure and transparent way.
And going back to your question on decentralized identity,

(10:08):
there are real benefits to having a completely decentralized identity, and I would go further and say an anonymized, decentralized identity, so that you can essentially prove who you are and that you have the ability, for example, to pay for a service or to consume the service safely, without necessarily the service knowing exactly who you are.

(10:28):
If a bank has a trillion dollars in it, then that's a much more attractive target for somebody to come and steal all the money from that bank. If there are actually a trillion banks, each with one dollar in them, which essentially is the principle of decentralization, of decentralized data ownership, the attack is much, much harder. It's much less attractive, and therefore brands are less likely

(10:52):
to see the kind of data hacks that we still regularly see. I mean, I think there was another recent big data hack. Individual users can be more confident that they've got control of their data, while brands and bigger companies can have more confidence that they're less likely to have their data compromised in the first place.
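
One simple way to picture the "trillion banks with one dollar" idea in code is pairwise pseudonymous identifiers: the user holds one secret, and each service sees a different identifier derived from it. A minimal sketch, assuming an HMAC-based derivation; this illustrates the general principle, not Trust 3.0's actual design.

```python
# Pairwise pseudonymous IDs: one secret on the user's device, a different
# derived identifier per service. Each service can recognise a returning
# user, but two services cannot link their records by comparing IDs, and
# no single database holds everyone's global identity.
import hashlib
import hmac
import secrets

master_secret = secrets.token_bytes(32)  # never leaves the user's device

def pseudonym(service_name: str) -> str:
    """Derive a stable, service-specific identifier from the master secret."""
    return hmac.new(master_secret, service_name.encode(), hashlib.sha256).hexdigest()

print(pseudonym("coffee-shop"))  # same on every visit, so loyalty still works
print(pseudonym("car-rental"))   # unlinkable to the coffee-shop identity
```

Breaching any one service yields only its own pseudonyms, which is the smaller, less attractive "bank" in Anton's analogy.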

Speaker 2 (11:12):
In a few weeks I'm going to speak with Marie Wallace from Accenture, and this is her focus; she runs decentralized identity for Accenture. How far away are we from actually having this, where consumers can very easily do this and it's frictionless? Because I think at the moment it's a bit like when you first get involved in cryptocurrency, having to set up wallets and authorize things; there's just so much friction there that consumers think it's all too

(11:33):
hard. Is the technology out there? Are there processes that are going to make this really easy? And, to my first question, how far away are we from this?

Speaker 3 (11:41):
It's a really good question. I think the biggest barrier to this, despite the fact that it's been possible for many years, has been exactly that issue. Going back to the question to a consumer: do you want to own and control all your data? Yes, I do. OK, here's the key. Keep the key in your pocket and make sure you don't lose it, but if you do lose it, all of the data that you own is gone. The people that cut the key and created the safe for you will

(12:04):
not be able to get into that safe.
Then you go back and ask the same question: so, you want to own all your data? It's like, well, yeah, but could you just hang on to the key for me, or a copy of the key, just in case I lose it? And I'm like, well, we can do that, but bear in mind, then you might own the box and the key, but if we wanted to, we still have the key.
And so Apple have done quite a nice job of addressing this, and

(12:26):
I only refer back to them because they're one of the few big, big players that have actually implemented this. They give you both options. They do store your data within, essentially, a data store that's not necessarily decentralized, or you could argue it is decentralized. They lock it and they create a key, and then they offer to keep that key in your iCloud account, which you still have to unlock

(12:48):
with a username, password and various other things.
But in theory there is still an ability for the service provider to unlock that service in some way. Or they give you the key and you have to store it, and then they basically get you to agree that if you lose it, then you've lost everything. And I actually went for the latter.

(13:08):
But even the version where they keep the key in iCloud is actually a very good balance, and so the technology is absolutely there. It's harder to implement, obviously. It's easier to go back to the original analogy of a trillion records in a large database; I mean, it's easier from the point of view of the initial implementation, and it's much easier to have one key for the whole thing rather than having a trillion keys.

(13:30):
You know, the whole management of those different data stores and the way you access them is more complicated. However, the technology does exist. It's fairly well established. There are a number of different approaches to doing that. There are also a number of fairly well tried and tested ways of implementing that, through two-factor authentication and device authentication and so on, and so

(13:52):
there is a way to implement this that gives considerably more control back to the consumer or to the individual, while also enabling them to continue to consume those services in much the same way as they do now.
I mean, no one, I think, would argue, unless they've lost access to their Apple account, that on a day-to-day basis they

(14:16):
see any difference in terms of the way all of that data is stored. But nonetheless, it is actually quite well encrypted, and that's why, you know, companies like Accenture, of course, are well positioned to implement this.
I think one of the challenges with this is making sure we do not replace one poor system with another poor system, and there

(14:37):
is a risk that large organizations, at that final stage where they've implemented everything just right, have somebody go in and say, well, let's just leave one master key under the mat, just in case. As soon as you do that, you destroy the entire thing; it's basically pulling the rug from under the entire system.

(15:00):
So if you implement it, you have to commit to it. You also have to make sure, and this is where things like open source come in, that the system you're implementing is one that you also relinquish some control of. So you may use it, you may support it, you may develop it, you may charge for providing services and everything on top of it, but you need to make sure that the system itself is part

(15:23):
of the Internet. It is an open framework that everyone can use, contribute to and build on. That truly gives us that control back, without any back doors.
And my biggest concern at that last piece is the worry of, wait a minute, I'm actually giving up control. As humans, when we have control, we naturally quite like to

(15:46):
keep it.
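
The two custody options Anton describes with Apple map onto a simple encryption pattern: who holds the key to the ciphertext. A minimal sketch using the third-party Python `cryptography` package (Fernet symmetric encryption); Apple's real mechanism is considerably more involved, so treat this purely as an illustration of the trade-off.

```python
# Who holds the key? Fernet gives a simple symmetric "safe" to show the
# two options: a user-held key (unrecoverable if lost) versus an escrowed
# copy (recoverable, but the custodian could in principle use it).
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()                        # the key to the safe
ciphertext = Fernet(key).encrypt(b"my private data")

# Option 1: only the user keeps `key`. Lose it and the data is gone;
# nobody, including the provider, can decrypt the ciphertext without it.
user_held_key = key

# Option 2: the provider escrows a copy (e.g. behind your account login).
# Recovery is now possible, but so, in principle, is provider access -
# the "master key under the mat" risk if it's done carelessly.
escrowed_key = key

print(Fernet(user_held_key).decrypt(ciphertext))   # b'my private data'
```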

Speaker 2 (15:47):
Trust 3.0 have just released a report. Maybe we could talk about some of that. The first thing was around implicit data. So, first of all, what is implicit data and why is it important? What should we be doing with it?

Speaker 3 (15:56):
So implicit data is all of the data that's basically been collected before, during and after you consume a service, that you won't necessarily be aware of. Even before the current advances in AI, they can pretty much work out what you want. They could work out that you're going to order a coffee in 20 minutes from the Starbucks on the high street in Sheffield,

(16:20):
even though you may not have done that before, because they have so much data on you that they can work that out.
Now, obviously, some of the advantages of that are that you get there, your coffee's already ready, it's nice and piping hot, there's a table outside because they know that you like to sit outside, and that's a really lovely user experience. So the use of implicit data isn't inherently a bad thing.

(16:41):
It's the way in which it's used, the transparency with which it's used, and the length of time for which it's retained. So if all of that data was used, and they knew which coffee you liked, and you got the coffee and you got your favourite seat, but they actually didn't know who you were, so they didn't know it was Andrew. They just knew, from all of those implicit data points, what you wanted, and you were able to consume that service.

(17:03):
That suddenly changes things again. But that's implicit data: people don't realise just how much data is being collected and used on an ongoing basis.
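
One hedge against implicit data becoming a long-term profile is to keep the behavioural data points but break the link to a named, persistent identity. A minimal sketch, assuming a salted hash whose salt rotates daily; the coffee-shop fields are hypothetical, and this is only one possible approach.

```python
# Keep the useful implicit data points, drop the durable identity.
# A salt that rotates daily means events can be grouped within a day
# (enough to have the coffee ready) but cannot be chained into a
# long-term profile of a named customer.
import datetime
import hashlib
import secrets

def daily_pseudonym(customer_id: str, salt_store: dict) -> str:
    """Hash the customer ID with a salt that changes every day."""
    today = datetime.date.today().isoformat()
    salt = salt_store.setdefault(today, secrets.token_hex(16))
    return hashlib.sha256((salt + customer_id).encode()).hexdigest()[:16]

salts = {}  # yesterday's salts would be discarded, severing the link
event = {
    "visitor": daily_pseudonym("andrew@example.com", salts),
    "order": "flat white",
    "seat_preference": "outside",
}
print(event)  # useful for the experience, useless for tracking Andrew over time
```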

Speaker 2 (17:13):
That brings me to another point, which was raised in the report: this value trade-off when it comes to consumer privacy. You gave some great examples there of data that could be captured and used. So two things: where is that balance? And secondly, if we move to sovereign or decentralised identity, does that remove that level of information, so brands say, well, we can't give you a personalised experience, we don't know who you are? So where does the pendulum swing between helpful and creepy?

Speaker 3 (17:35):
This really goes back to the question you had earlier on Imagination. We are passionate about creating really personalised services for customers, and actually very immersive, joyful experiences that can be made more joyful and cooler as a result of us using certain data points, whether they are based on you as an individual or the way you interact in a space.

(17:58):
My belief is that you can create a service that is anonymised to quite a high degree, while also actually being able to use some quite sophisticated data points in the moment. It's the way in which that's implemented. Part of that could be that you give permission for the data to be used at a certain point, and then that data

(18:21):
is erased, so you can provide the service at that point without the company providing it having long-term control over the data.
The flip of that is that actually they have all of the data, but they just don't know who you are, and so it's completely anonymised data, which is still very useful for brands to be able to improve the experience and even understand how consumers interact with the brand and what products

(18:43):
they buy. But they don't know it's you particularly. You, as Andrew, might go to a show in Berlin and then another one in LA and then another one in London, and interact with those and buy certain services and products.
But if they actually asked, who was that person, they don't know who it is. And so there's lots you can do, and this goes back to what we

(19:04):
were saying earlier with Trust 3.0. We want to start not with the technology, but with the approach and the framework, to understand what it is that you need to be able to provide a really rich service.
What kind of relationship are you looking to build with your customers? And then build a framework that enables you to deliver those services in a transparent way, while also protecting your customers from the risk of overly intrusive or

(19:28):
creepy technology taking over.
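
The "use the data at the point of need, then erase it" option Anton sketches can be expressed as a scoped grant: the service sees a data point for one interaction, and it is deleted afterwards. A minimal Python sketch with hypothetical names; real systems would also need audited deletion and consent records.

```python
# A data point is handed to the service for a single interaction and
# erased when the moment has passed, so no long-term copy accumulates.
from contextlib import contextmanager

@contextmanager
def momentary(data_store: dict, key: str):
    """Yield one value from the store, then delete it unconditionally."""
    try:
        yield data_store[key]   # available only inside the with-block
    finally:
        del data_store[key]     # erased even if the service code fails

store = {"current_location": "Hall 3, Berlin"}
with momentary(store, "current_location") as location:
    print(f"Routing visitor to the nearest demo from {location}")
print(store)  # {} - nothing retained after the interaction
```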

Speaker 2 (19:30):
So when we consider the ethical ramifications of any technology, that should be baked into strategic planning for launching any new initiative, you know, from vendor profiling to test and learn. Technology on its own seems neutral; it's the use case that supplies the risk. Where do ethics and innovation meet when it comes to privacy and trust?

Speaker 3 (19:46):
When you're building those approaches and frameworks, you need to build your own ethics framework. So, you know, obviously ethics means different things to different people. I'm not saying that you can necessarily come up with your own version of an ethical framework and then say, well, I'm being ethical, because clearly some things aren't. But I think if you come up with what your approach is, and you're open about what that approach is, and then you're

(20:07):
implementing and using those services and honoring that, then I think that's where it meets. If you're opaque in your approach, or you're downright dishonest, that's the problem. The ethical framework needs to be open, and then you use the technology to implement it the way you've said you have.

Speaker 2 (20:26):
So I'm the Actionable Futurist and I like to look far enough ahead to be useful. Where do you think the trust debate will be in 12 months, and what will Trust 3.0 have been doing to further this debate?

Speaker 3 (20:36):
We'll be working through continuing to build out those frameworks, working with some brands to help essentially identify the best approaches and frameworks to do that, having some sort of events and symposiums to be able to have these debates in an open forum, and working towards providing training and support for how you start to implement some of

(20:57):
these things, especially around decentralization and some of the identity areas that you've been discussing. And within 12 months, we're really looking to get to a position where we can start to provide a certification framework.
So we would do the initial kind of survey or light audit on this, and then we would create a framework that enables other

(21:18):
companies to be able to do the actual proper auditing and certification, with a Trust 3.0 certification stamp on it. And so, yeah, that's where we're looking to get to, and then, beyond that, just continue to be able to help brands and companies do this better, so that we get to the point where we have a very open, fair and equitable internet.

(21:39):
And actually, when we talk about the internet now, we're talking about our day-to-day lives. We're all using the internet or relying on it, you know, even when we're in a very analogue situation. So it's not just protecting the internet itself; it's protecting our society and how we operate going forward.

Speaker 2 (21:56):
Almost out of time, so it's my favorite part of the show, where we ask our guests a quick-fire round. iPhone or Android? Oh, iPhone. Window or aisle? Window. In the room or in the metaverse? In the room. I wish that AI could do all of my... The first thing I thought of is homework. What's the app you use most on your phone?

Speaker 3 (22:10):
Probably calendar.

Speaker 2 (22:11):
Best piece of advice you've ever received? Purpose beyond self. What are you reading at the moment?

Speaker 3 (22:15):
I'm listening to a lot of Lex Fridman podcasts on AI.

Speaker 2 (22:20):
Who should I invite next onto the podcast?

Speaker 3 (22:21):
Lex Fridman.

Speaker 2 (22:22):
So, as this is the Actionable Futurist podcast, what three actionable things should our audience do today when it comes to better understanding the opportunities and threats around trust?

Speaker 3 (22:31):
Treat data privacy as being as important as security and sustainability. Don't wrap it into those things; it's its own thing. Seek support and advice, not necessarily from Trust 3.0, but, you know, from whatever sources, to ensure that you understand what the challenges and opportunities are. And in the same way that many companies and individuals maybe

(22:54):
are looking at things like AI, look at data privacy now with the same lens. AI will probably help in that respect. Focus on it now, because it will benefit you in the future, both personally and as an organisation.

Speaker 2 (23:06):
Anton, a great chat. How can people find out more about you and your work, and the work of Trust 3.0?

Speaker 3 (23:11):
For Trust 3.0, it's trust30.org. For Imagination, it's imagination.com. For me, probably LinkedIn.

Speaker 2 (23:18):
Thanks so much for your time today. Great to chat about this very important part of technology going forward.

Speaker 3:
Thanks very much, Andrew.

Speaker 1 (23:23):
Thank you for listening to the Actionable Futurist podcast. You can find all of our previous shows at actionablefuturist.com, and if you like what you've heard on the show, please consider subscribing via your favourite podcast app so you never miss an episode. You can find out more about Andrew and how he helps

(23:44):
corporates navigate a disruptive digital world with keynote speeches and C-suite workshops, delivered in person or virtually, at actionablefuturist.com. Until next time, this has been the Actionable Futurist podcast.