Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Mark Smith (00:01):
Welcome to the
Ecosystem Show.
We're thrilled to have you withus here.
We challenge traditionalmindsets and explore innovative
approaches to maximizing thevalue of your software estate.
We don't expect you to agreewith everything.
Challenge us, share yourthoughts and let's grow together
.
Now let's dive in it's showtime.
Chris Huntingford (00:21):
What's up
folks?
So I'm speaking to you from adeeply freezing, cold Hamburg.
I went outside it's so coldthat my forehead nearly got
frostbite, and there's a lot offorehead right, so it was a
little scary.
I was wandering down what Ithink is the red light district
of Hamburg, which is equallyterrifying, and decided to come
back to a very noisy hotel bar.
(00:42):
So that's what we are doing nowand today.
Considering we're in Europe, Iwanted to talk a little bit
about some of the laws that arecoming out across places like
the European Union and theirfocus on AI, as well as what's
happening in the UK and some ofthe interesting things we've
seen come out in the press,especially from folks like the
(01:02):
Secretary General from theUnited Nations.
So I reckon we should get intoit I like it, I like it.
Mark Smith (01:10):
And uh, yeah, sorry
, we've had a bit of a hiatus
for a couple of weeks, but we'reback.
We're back at it.
Um, our lives it seems 2025 hasmade all our lives extremely
full on and, uh, we're workingon a project that we want to
release to market shortly, so weare burning the midnight oil
and, uh, trying to do theseshows in between where we can.
(01:31):
Anna, where are you at themoment?
Ana Welch (01:34):
I'm uh, I'm at home
in london where we've done like
a little of a super minimal pitstop in London from the US and
we're going to Spain next week.
That being said, I am in Londonalone and was in Geneva again
burning the midline oil and justwe hustle.
(01:57):
It's just that time when wehave to give it our all.
I think, yeah, important thingshappening and we want to be
there, we want to help people,we want to understand what's
going on, we want to enable folkand we want to do our best.
Like Andrew was saying, we'rereally trying to drink our own
(02:18):
champagne.
Mark Smith (02:19):
Yes, I love that.
I love that and, by the way, Ilove champagne.
I wouldn't say no to a DomPerignon, and if it's just an
evening, let's go with a VerveClico.
It'll do me, and there's manyothers in between that I also
enjoy.
That aside, a lot's going on inAI in the moment right.
(02:43):
There's a heck of a lottranspiring.
I read Yuka's post this morningaround what he did post.
What was the event you guyswere at Chris?
Chris Huntingford (02:56):
Technology
Town Hall Talent.
It's the one that Viv andYannick host.
Mark Smith (02:59):
Yeah.
Chris Huntingford (03:00):
It was a
really good event actually.
Mark Smith (03:03):
You talked a lot
about our need to have a voice
right and in the AI discourseand I liked Anuka's post where
he referred to.
Are we and I've been sayingthis for a while how deceived we
are sometimes in the Westernworld to believe that
(03:24):
everybody's acting out ourinterest.
They're not right, as aparticularly big tech big tech
I'm talking about here, right asin you know, somebody had done
a tweet that he has in his postaround.
Well, the chinese government,you know, released um deep seek
and therefore can we trust them?
And he goes well, you couldjust have removed the word
(03:44):
chinese and said us it's a usayeah and why does it make any
difference?
because, and, and you know, someyears ago I was walking the
great wall of china.
I went a hundred kilometers out, had a personal guide to take
us over the rough wall not thetourist part of the wall and so
there's nobody around, right?
We're in this massivemountainous wilderness zone and
(04:07):
I'm always one to challenge,right?
I don't care what country I'min.
Chris Huntingford (04:14):
Are you one
to challenge?
Wow?
Ana Welch (04:16):
I cannot believe
that.
I know wow.
Mark Smith (04:19):
And so I said to
the guy so what really happened
at denham and square?
You know, like, what reallyhappened at Denim and Square?
You know, like, what didn'tcome out?
And that was the start of theconversation.
And Meg, you know with me shegulps when I like ask the real.
And he talked about some veryinteresting things, like they
had a blackout period where onlycertain houses were allowed to
(04:42):
have lights on at night andstuff.
This little kid went and turnedthe light on, not knowing
getting up in the night.
They just come in.
That kid never saw her again,type thing.
Like so it's, we got into thewhole area of man.
There's cameras everywhere inBeijing.
Right, you're watchedconstantly.
I think at any point there's atleast three to five cameras
looking at you and he goes.
(05:02):
In the West you have a falsesense of privacy.
He goes.
We know we have no privacy sowe just live with it.
We never grow up with aconstruct that anything is
private.
He goes.
But you in the west think youhave privacy.
And I was like, wow, what apoignant point.
Right, he goes.
You think you have privacy, yetall the time you're being
(05:23):
tracked constantly.
But it's under the shroud ofthat.
You have some form of controlover it, and I was like, wow, he
hit that good.
Chris Huntingford (05:32):
Yep, it's a
little scary.
So when I read Yuka's post,right, because there's a fine
line between being like aconspiracy theorist and a
realist, right.
Mark Smith (05:46):
Yeah.
Chris Huntingford (05:46):
And I think
you could translate that line
very well and so do you.
Where there's no, he evidencesthings that he finds and says
right, which I like, but he'sbeen challenging the Microsoft
AI ecosystem and actually the AIecosystem in general pretty
hard for a year now and I'vebeen reading a lot of what he
puts up.
I've got to do this closingkeynotes year now and I've been
reading a lot of what he putsout.
I got to do this closingkeynotes in Talents and one of
(06:08):
my final points in the keynotewas that don't accept the
nonsense you read, justchallenge it, right, because a
lot of it is bullshit, right,even some of the partners we
work for, the customers we speakto.
There's a lot of misinformationout there actually and it's
unfortunate, right, and I feelsorry for people that don't
challenge.
But actually people are wakingup and finally realize that
(06:30):
actually they can think forthemselves all right, and I want
to give you an example here.
I read this funny meme theother day and it was this lady
had posted on Twitter I startedprompting the GPT with a really
accurate prompt and thenrealized I could solve my
problem just by writing out theprompt and somebody had copied
(06:52):
it and said my, my AI peoplehave discovered thinking.
I thought it was funny, right,because this is what's happening
, right?
What's happening is that thereare a lot of people in the tech
sphere that have discoveredthinking and now we're
challenging, right, and it'sgood, and I think it's our
responsibility as AI citizens.
Ana Welch (07:08):
It is good, but I
think that comment is a bit
snarky.
It's super snarky.
Chris Huntingford (07:11):
Like how
many?
Ana Welch (07:12):
times do you have a
problem and then you get a peer
to help you, especially inprogramming, right, it just
doesn't work out.
And then you get a peer to helpyou and you tell them the
problem and you discover theanswer for yourself.
Or they ask you a few questionsand you've got the question for
yourself.
So like that comment is a bitnot cool, Not cool, yeah.
Chris Huntingford (07:35):
I love it.
I love it.
I think it's great, right, it'slike rubber ducking.
So when you put your littlerubber duck on the desk, when
you're coding and you have aconversation and you figure the
problem out.
But that is what we've beendoing for years.
But all these noobs to AI arelike, oh shit, thinking wow.
And we're not talking about thepeople that are AI-ing as like
(07:59):
professionals.
We're talking about people thatare being lazy, right?
I think that's what thecomments are saying.
Ana Welch (08:02):
Do you think that
thinking became harder in the
last year and a half two years?
Chris Huntingford (08:08):
No, I don't
.
I think it got a lot easier,and that's the thing.
Mark Smith (08:13):
But in schooling,
right in our upbringing, in
schooling, is the schoolingmodel designed to teach you to
think or is it designed to teachyou to learn facts and check
the right box and show up Like?
I don't think in school we wereever really taught to become
thinkers, like, do you ever justtake time out to ruminate on an
(08:37):
idea or topic and just let yourmind take you to wherever it
wants to take you and reallydeep think on anything?
And I think it's actually goingto be a real important skill as
we go deeper into AI.
And I think the cousin to thisis critical thinking getting
really good at calling bullshit.
Ana Welch (08:58):
So I think that
you're asking the wrong audience
if you're asking Chris and I,because of course we weren't
taught how to think in school,but Brits and Americans, I feel
they are Like there's a lot ofdebate Interesting, there's a
lot, and I've seen I volunteeredin a few schools in the UK, in
(09:22):
a few schools in the UK, and Iwas surprised that children
seven to like 10 are able tothink about a subject, come up
with a story and argument theirpoint in 60 seconds or less and
like they even have like a goodstoryline.
They work in a team.
That's definitely somethingthat they are being taught in
(09:46):
school.
And I was also talking to afriend of mine Um, she's a
lawyer, a really importantlawyer in fact, and she was
telling me that a lot of herleadership comes from Hong Kong.
And this leadership they have,you know, children in Hong Kong
(10:08):
schools and you know sometimesthey visit each other and stuff
like that.
So people from Hong Kong comeand like bring her, bring her
gifts in, like books and thingslike that.
So you have special mathematicsfor toddlers, you know.
So like a book on math for 12months old.
(10:29):
That's insane.
And I also like have a neighbor, my neighbor just down the hall
.
They have their daughter in areally, really good school in
London, one of the top schools,private schools in London.
She's fun.
In parallel she's doingtutoring in literature, math and
(10:52):
geography, I think because shebelieves that if they were ever
to move back to Hong Kong shewould not excel academically Wow
.
However, my colleague alsotells me that when it comes to
university age, a lot of thosepeople who teach their children
(11:15):
in a very factual,academics-based system, they try
to get into a school in the UKbecause they don't feel like
their child learned how to think.
They memorize really well, theycan solve problems, but they
cannot think.
So that's really interesting.
I really do think that thesetwo nations, just from maybe
(11:38):
there are others as well, butjust from my experience, britons
and Americans learn criticalthinking.
Chris Huntingford (11:46):
I actually.
I'll give you an example ofsomething like that.
It's quite interesting.
So my mom is a teacher, okay,and when the AI came the AI
right my mom was like, oh, thisis terrifying, right.
And I'm like, no, don't youknow?
I said I'm like you got nothingto worry about, right, because
she works in a syllabus calledthe International Examination
(12:07):
Board in South Africa.
It's called IEB, and IEB, asopposed to the GDE Gauteng
Department of Education, this isJohannesburg.
Teach two different ways ofthinking.
The first one, the KaatseingDepartment of Education, will
say what is the little dot atthe end of the sentence.
So you say that's a full stop.
The IEB syllabus.
Say what is the little dot andwhy is it there?
(12:28):
All right, give us examples ofwhen to use it.
And that is critical thinkingright Now.
It's a very simple example, anextremely simple example, but I
think it's real interesting,right, and I was trained in both
.
So I was taught both thingsright, and I didn't excel very
well at critical thinking inhigh school because I was a lazy
(12:51):
asshole.
But in my work life I love it.
I think it's one of the mostfun things to do.
And I'll give you an example.
I said to Andrew yesterday whenwe met up I love it.
I think it's one of the mostfun things to do.
And I'll give you an example.
I said to Andrew yesterday whenwe met up I'm like, hey, andrew,
I'm on a Zoomie today.
And he's like, what's a Zoomie?
And I'm like, well, I've justspent five days thinking about
something, so now I'm able toimplement it really quickly.
And I am implementing it reallyquickly.
It's pretty look, but now thatI know I can just hustle and get
(13:23):
going.
And that's actually aninteresting thing I'm seeing in
AI and the way that people workis that AI, albeit giving you
extremely quick responses, youdo have to think about how to
structure the thing you're goingto ask it.
And then I think that's whereyou were going with the.
This is cheeky, right, becauseyou do.
You do have to think about howto ask the question, not what
the output is, and actuallythat's hard right.
(13:49):
So I was being belligerent onpurpose, and the reason is
because I do think that actuallyit does take a lot of work to
figure out how to ask thequestion.
Mark Smith (13:53):
I think one of the
superpower skills that we can
develop is good questioning,definitely Right Like really
good questioning, deepquestioning, and once again, I
think it's a muscle, it's askill that can develop.
You can get better at it and Isaid to Meg a couple of days ago
I think that one of the keyskills that everybody's going to
(14:15):
need to learn, whether you're ahairdresser, whether you're a
proctologist, whether you're ayou know someone in programming,
you've got to really go.
Am I asking the right questionand is it?
You know, I always feel he whoasks the questions controls the
conversation right.
Chris Huntingford (14:35):
It's always
going to be like that, and this
also goes back to my sensemakerdiscussion on LinkedIn, which
is why I put it up there.
Ana Welch (14:42):
It's like all the
people that are my sensemakers.
Chris Huntingford (14:45):
Thanks, man
.
All the people like you twothat are my sensemakers ask the
hard questions.
Mark Smith (14:51):
Yeah.
Chris Huntingford (14:51):
Do you know
?
Okay, let me give you anexample, right?
So, Jason, I want to use Jasonas an example.
Jason asks me more questions ina day than anyone in my life.
Okay, so, jason, I want to useJason as an example.
Jason asks me more questions ina day than anyone in my life.
Okay, so, jason, this morning Ican count them on WhatsApp.
He quite literally asked meseven questions before I'd even
gotten to send him one message.
Wow, okay, anna, let me giveyou an example.
(15:15):
So when we were working atMicrosoft, when you were
teaching me about integration,you never once gave me a flat
answer.
You asked me many, manyquestions about what I was doing
, and then you were able to belike okay, this is how I think
this should work.
Mark, same thing with you.
You never just give an answer,you always ask a ton of
questions.
And that's the commonalitybetween all of the sensemakers
(15:36):
in that list, and, I think, thepower of your ideas at some of
the questions.
And that's the commonalitybetween all of the sensemakers
in that list, right?
Mark Smith (15:42):
And I think, the
power of your ideas in the
question, and that is the personthat will make sense of the
scenario, and AI is all aboutthat.
I want to switch gears and talkabout the three large documents
that the UK government releasedrecently.
Can you tell us your views,chris, on those and what's your
interpretation of them?
Chris Huntingford (15:59):
So there's
been a couple.
So obviously the three big onesnow are the playbooks.
So if you go into the UK govsite you can download the
playbooks, which is reallyinteresting.
I haven't read them all indetail, but I kind of want to
hit on a couple of points there.
But there were also, previousto that, the breakdown, the
(16:21):
50-point strategy about whatthey were doing from an AI
perspective.
So I read that in detail and Ibuilt a Power BI dashboard to
help me break down each pointand where it fits in and how.
And the reason I did that wasbecause I read a post from
somebody in the place that Iwork and it was a five-line
response to the UK AI law and Ithought that was pathetic.
So I did.
I thought it was pathetic.
It shows a lack of perspective.
(16:43):
I'm sorry, but it's the truth.
So I took a jab at reviewingwhat had been written in the AI
law and actually there were someinteresting things that come
about and the UK have actuallyofficially identified themselves
as laggards when they talk inthe press release about where
they are and they're very openabout it, but not in a bad way.
They're wanting to innovate andthey're wanting to build, which
(17:05):
shocked me.
I was like this is incredible,right.
What they also did, though, wasthey say that they are looking
for talent.
They are looking for talent.
There's a line in the law, inthe things in the 50-point plan,
that says we will attract andbring in talent where we need to
.
Okay, so that's real important.
So, switching gears, I'm goingto look at the playbooks.
(17:28):
Right, I'm going to read you anextract from the playbook.
This is my favorite extract,all right, and this, I think, is
fundamentally going to beimportant to everyone, and this,
I think, is fundamentally goingto be important to everyone.
In European AI law, it statesthat if you implement an AI
system, you have to train peoplein the way that AI works.
Mark, you posted that.
This is from the UK AI Playbook.
(17:48):
Are you ready?
Principle number nine you havethe skills and expertise needed
to implement and use AI systems.
You should understand thetechnical and ethical
requirements for using AI toolsand have them in place within
your team.
You and your team should gainthe skills needed to use, design
, build, maintain AI solutions,keeping in mind that deploying
(18:09):
bespoke AI solutions andtraining your own models require
different specialist skills.
And there's more to it.
Now, what's funny is that wehad not planned this
conversation at all and I justhappened to have that open
because I sent that to a bunchof people.
But here's the thing there is avery interesting merger between
what Europe are doing from anAI perspective, including their
(18:30):
200 billion euro investments inAI for skills, and they're
looking at innovation as wellaround that and looking at what
America is doing as well.
It's wild dude.
So those documents, thatplaybook if I was a UK partner
and I was doing anything with AI, I would invest in spending
time not creating a five-lineresponse to a law, but actually
(18:53):
reading them and understandingwhat it is they're wanting you
to do.
Mark Smith (18:57):
Yeah, because in it
it is opportunity everywhere,
right, exactly, there's justopportunity to get this right,
to innovate I think it's acommon word that I keep coming
back to.
With AI, it's going to allowfor innovation and to do things
that could never have been donein an organization before.
(19:21):
Even you know it's interesting.
I had a conversation thismorning with somebody and they
were talking about buildinghomes, right, and they've
recently had a beautiful homebuilt and you know, normally the
relationship with the builderfinishes with the keys being
handed over.
And he was like, imagine if yourun AI over that project and
(19:44):
you brought out a 15-yearengagement plan with that
homeowner where you might usesubcontractors et cetera, but
using AI, you'd know whendifferent parts needed to be
maintained it might need abuilding wash, might need a
ceiling fan.
It's the end of life.
And you can have thisrelationship that even like
(20:06):
almost a ongoing, like you know,with your car, if you get it
serviced regularly and you go tosell it.
You go here's all the serverlike I've maintained this car in
pristine condition.
You go here's all the serverLike I've maintained this car in
pristine condition, but he goes.
Ai is going to enable this kindof new revenue streams that
never existed before.
And what you said there, chris,I just see for Microsoft
(20:27):
partners particularly so manynew ways if they get on board.
But what I do also see withMicrosoft partners at the moment
is a lot of them struggling tomake sense of AI themselves, and
you know they hear whatMicrosoft is saying.
They see the kind of three mainproducts Azure, foundry,
(20:50):
copilot, studio and then more ofwhat you can do in Copilot, and
you know if they've come fromthe biz app world they're like
well, copilot is m365, play andwhat I'm observing there's no
kind of silos anymore.
Maybe microsoft siloedthemselves in the past, but
that's not the future like yougotta be across it.
(21:13):
And and and I I just cannotemphasize enough the importance
of data and understanding dataand structuring data and why I
think Microsoft's biggest ballermove was Fabric and what you
can do with Fabric and thenPurview under it.
Ana Welch (21:31):
And it's not just
the fact that, oh yeah, purview
over everything, right?
Not even just Fabric, justabsolutely everything, right?
Not even just fact, justabsolutely everything.
Yeah, you go plug in.
The other thing that I amseeing is not just the fact that
you have to be acrosseverything, as I don't know,
(21:55):
microsoft seller, anorganizational partner, with
your focus as an organization,but also as an individual.
I do feel like Microsoft triedto combine skill sets before
they tried to embed fusion teams.
They tried to use DevOpsmethodologies.
(22:16):
They tried to use, you know,devops methodologies.
They tried to make, I don'tknow.
There was the whole pro code,no code unite Granted.
That was not from Microsoft, itwas from the community.
But that was the gist of it all.
Yes, but nobody really tookthat seriously.
So everybody just like kept onwith their skill.
(22:37):
But I think I'm seeing that,even I'm seeing that now, very
code heavy technical people whothis is what they do all day,
right and architect complexsystems like that, all of a
sudden they do start askingthemselves questions.
You know about, like you said,purview, fabric, azure, foundry,
(23:00):
the law, because there's thisthing that Chris says all the
time AI is non-deterministic.
Programmers find it really,really hard to test something
that cannot be tested in a unittest.
If I put two integers in, Ishould get an integer out.
You know, it's just the way itis we test for these things.
(23:23):
Or in an integration test.
Everything is like very normal.
All of a sudden, their bosscomes and says I think you'd
better just build a chatbot togive answers to our customers
rather than you do, and if youcannot do it, I'll just ask AI
to do it and I'll get rid of you, or stuff like that.
(23:45):
I've actually seen that on aReddit thread and these
architects are starting toquestion it, you know, and are
starting to ask their boss.
Fine, and how am I going totest this?
Like I'm really uncomfortable.
I don't know if this is goingto be right or wrong.
So I feel like the UK AI Actand the EU AI Act is almost
(24:09):
enabling these people, you know,telling them you were right to
question these things.
Mark Smith (24:14):
Yes.
Ana Welch (24:15):
And it's really
great.
Chris Huntingford (24:17):
I could not
agree with you more.
So I've got a project for bothof you.
All right, and actually,listeners on the call and
watchers, go onto the internetand find me a public-facing
chatbot.
Go and do it.
Do you know how many?
You will find Very few.
Do you know why?
Because they've all been pulled.
Ana Welch (24:39):
Most of them have
been a lot of little public
facing chatbots actually, butvery small from like small
companies if you can find,please send me the links.
Chris Huntingford (24:49):
I'll tell
you why.
So even if you look at placeslike Kia, air Canada, heathrow
Airport, manchester AirportGroup, I can list off retailers
going off of my head right now,ikea are the only one that have
a working bot which I've redteamed personally, like here are
the only one that have aworking bot which I've red
teamed personally to see if Icould break it, and I couldn't.
Okay.
But here's the thing A lot ofthem have been removed because
of that non-deterministicbehavior, because people like us
(25:12):
are getting smarter and weunderstand how to query the AI
and prompt, inject and lead theAI Right, I know how to do that.
So what I do is I will feedthat back to the owner of the
bots on the website and say thisis what I found, but they don't
(25:32):
want to get themselves into alegal bind anymore because of
why euai acts, ukai acts.
You can get sued now for this.
Mark Smith (25:35):
Let's just talk
about legal for a moment,
because if you're a microsoftpartner, you've never really in
my my 20-year career in the bizapp space, I never, ever
considered anything.
What we built had to have anykind of legal oversight right.
You'd set up your MSA with thecustomer, your non-disclosure
(26:00):
agreements, you'd have astatement of work and that was
about the main legal documentsthat you would have in place.
Now I think there's a wholelevel that Microsoft partners
particularly, are going to haveto look at around their legal
cover for the stuff that they'redoing on behalf of their
customers and therefore, whenyou're dealing with generally
(26:24):
technologists in a lot ofMicrosoft practices, this is
whole new territory for them.
Chris Huntingford (26:30):
Yeah, so
can I ask you a question, and
I'm going to ask it to both ofyou.
I am a man from Poland, right,I am riding my bicycle in the
middle of a council in Englandand I smash my bicycle into a
pothole, crack my head on thepavement and end up in hospital.
Okay, who is liable?
Ana Welch (26:53):
The UK Okay.
Chris Huntingford (26:55):
Now let me
throw another thing into the mix
.
I go through the same process.
It turns out the council that Iwas in uses AI to do pothole
detection using Azure ComputerVision.
They incorrectly determinedthat a pothole did not need to
be filled.
The partner implementing thesolution, by simply switching on
(27:16):
Azure and enabling computervision, is partner X.
Who is liable?
Ana Welch (27:23):
I think the
customer Is it the partner?
Chris Huntingford (27:27):
So you get
the deployer, you get the vendor
and you get the user.
When you look at EU AI acts,Now here's something crazy.
The European Union AI act isextraterritorial, which means
that it applies no matter wherethe citizen is.
Wow, All of a sudden we have aproblem because now, if that
happens in Australia, the EU AIAct applies.
It's like GDPR.
(27:48):
It's the same sort of process.
So when you go through thisprocess of deploying an AI
solution, using the worddeploying, I am the partner
deploying a solution.
Regardless of whether you areturning the key or building the
thing, you are liable.
And what happens in legal casesis they shift liability between
as many parties as they can.
Mark Smith (28:09):
Yes.
Chris Huntingford (28:10):
Okay.
So this is why I am extremelyagainst just randomly deploying
anything within a businesswithout going through
responsible AI treatments, goingthrough red teaming testing,
correctly going through theprocesses of digital safety.
Ana Welch (28:26):
So you're not even
saying public-facing bots,
you're saying internal tools.
Chris Huntingford (28:32):
I'm saying
anything that has an impact on
human life.
So the NHS shared businessservices recently released a
tender that talks about using AIvision to categorize and
classify certain things in ahuman body.
I would run as far away fromthat tender as I possibly could
(28:53):
because it impacts health and itis critical as far as European
AI law goes, given the number ofimmigrants in the UK.
Yeah, you see what I mean.
Yeah, Now, what they don't takeinto account is that the fact
that when you build the solution, you're going to spend I'm just
going to use days randomlyyou're going to spend one day
building the solution.
(29:13):
You can then times that by 150,getting it red teamed and
deployed correctly andproductionized correctly, and
that's where we have the problemis that people are still
putting out these ideas of likeoh, we can make it in a day.
Therefore, it's okay, it's notokay.
Mark Smith (29:29):
This is why you
need legal what was that phrase
you used then around somethingextra, extra territorial.
Man, that's interesting andalso impactful for anybody, as
you say, no matter where thatcitizen is, and of course that's
on top of anybody that'sexporting into the EU or those
(29:53):
markets right as a whole.
Chris Huntingford (29:56):
So you want
to see something crazy.
Do you mind if I screen share?
Yeah?
Mark Smith (29:59):
go for it.
Chris Huntingford (29:59):
Okay, so
this is documented on the actual
website.
So, if you give me a sec, man,this is why, when people put
these piffy little responses outto things that are released
into the public forum withoutreading them, I find it
hilarious.
So let me just get thescreenshot.
Ana Welch (30:19):
You're responsible.
Chris Huntingford (30:21):
Ah, come on
.
Ana Welch (30:23):
Sorry, guys.
So, as you do that, last nightI spoke about trustworthy AI at
a user group.
I was super nervous because itwas like a highly technical
Azure user group and the speakerafter me spoke about Fabric and
(30:54):
they had a big demo and thewhole thing.
I'm talking about legislationand trustworthy AI and things
that we need to do and how tomove forward with the technology
and so on and so forth, and itjust was not true, because they
(31:14):
were really actually veryinterested in knowing how to
talk to their leadership, theirCIO, about these workloads and
about the tools that they wereusing.
But somebody came to me andshowed me you know their tool, a
little customer support toolthat's supposed to be deployed
in many like small businessesand it's like an AI chatbot.
(31:37):
And I was like okay, and howdid you test this?
How did you like this appliesapplies to you.
You need to really look at thelaw and follow it?
And he was like well, I don'thave tons of money to give it to
teams to test.
I build it myself.
I'm a small startup.
What am I?
What am I to do?
(31:58):
Regardless that you are a smallstartup, you can get into big
trouble unless you follow theserules.
These are not like a littlejoke.
Mark Smith (32:10):
Yeah, so true Chris
what were you saying?
Chris Huntingford (32:14):
Okay, so
the scope of the appropriation
of the Act is outlined asArticle 2, encompassing actors
both within and outside of theEU.
The AI Act applies to providersplacing on the market or
putting into service AI systemsor GP AI models in the EU,
irrespective of whether thoseproviders are established or
located within the EU or a thirdcountry.
Mark Smith (32:34):
Wow.
Chris Huntingford (32:35):
Yeah, so
just be careful, right?
I don't think that people takeinto account.
Actually, it's the same as GDPR, right?
Like, even if you were dealingwith a European citizen outside
of Europe, the law still appliesto you and I can call a subject
access request at any pointthat I want, right, and I have,
(32:57):
by the way, and I will do thesame with this.
Mark Smith (33:00):
Yeah, yeah, same
with this, yeah, yeah.
Now interesting times and I Ithink the whole focused on
trustworthy ai is going tobecome increasingly important
that you know everything fromyour ecosystem to your, your
strategic leadership inside anorganization to the actual, you
(33:20):
know AI ops that you might berunning.
You're going to need to buildin the trustworthy AI kind of
mindset across it, red teamingappropriately the amount of
rigor you're going to need toprotect yourself, I think is
going to be extremely importantgoing forward.
Chris Huntingford (33:39):
And not
necessarily even just protect
yourself.
It's also respect yourcustomers or internal data.
Can I tell you something?
So I always say to people ifdata is a digital representation
of you, your customers and yourbusiness, why do you hate it so
much?
And it's going to be the samewith AI.
If that's going to be a digitalrepresentation or front end for
your customers, you and yourbusiness, why are we messing
(33:59):
around so much?
So I definitely think it'simportant more front end for
your customers, you and yourbusiness.
Why are we messing around somuch?
Ana Welch (34:04):
Yeah.
Mark Smith (34:05):
So like I
definitely think it's important,
it applies to that quote, Chris, that you've said in the past
If your customers knew how youtreat their data, would they
still be your customer?
Chris Huntingford (34:14):
Yeah, and
it's going to be the same with
AI, so good luck folks.
Mark Smith (34:22):
Alrighty, well, I
think we're at time.
Thanks everyone, everyone forjoining us.
Feel free.
There's a new actual feature onif you're on the blog on your
uh, if you're, you're grabbingthis by your app and listening
to this you can leave us avoicemail and what that means is
that you can record somethingthat you want us to play on air,
maybe a question that you wantus to answer, but you can
(34:43):
feature on the show via thatvoicemail function.
So feel free to use that.
Get in touch, but thank you forjoining us today.
Ana Welch (34:51):
Thank you everyone,
thanks Oli.
Mark Smith (34:53):
Thanks for tuning
into the Ecosystem Show.
We hope you found today'sdiscussion insightful and
thought-provoking, and maybe youhad a laugh or two.
Remember your feedback andchallenges help us all grow, so
don't hesitate to share yourperspective.
Stay connected with us for moreinnovative ideas and strategies
to enhance your software estate.
(35:13):
Until next time, keep pushingthe boundaries and creating
value.
See you on the next episode.