Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Stephen King (00:00):
We're back and here we are for yet another episode of the Incongruent, and joining me once again is the OG G-O-A-T, A-R-J-U-N-R-J.
Arjun Radeesh (00:14):
Hello, hello,
hello.
How's it going?
Stephen King (00:16):
Everything is going phenomenally well. Right, so today we're talking education and AI ed tech, which is a very controversial and complex topic. Who did we speak to today?
Arjun Radeesh (00:27):
So we spoke today to Yara Alatrach. She works with Microsoft. She's also one of the co-founders at EDNAS, which is our education-based platform committed to equipping KS1 to KS3 students with the skills needed to
(00:48):
thrive in an AI-powered world.
Stephen King (00:51):
It was a really, really interesting conversation, and EDNAS is here in the UAE with you, correct?
Arjun Radeesh (00:59):
Yes, it is
an Emirati product.
Stephen King (01:04):
EDNAS is inspired by the great policies and the direction that's been taken there in terms of AI across the entire country, and in education in particular. So here is a wonderful, wonderful conversation. I hope you are ready for it. So here we go.
Arjun Radeesh (01:20):
Hello and welcome
to a brand new episode of
Incongruent.
So today we are joined by Yara Alatrach, who is an AI and digital transformation professional with over 10 years of professional experience across technology, consulting, and energy.
She began her career as an engineer at ADNOC before moving
(01:43):
into a business management and AI strategy role at Boston Consulting Group.
And currently she leads the Europe, Middle East, and Asia AI strategy for energy and resources as an industry director at Microsoft, where she helps governments and private sector organizations design national AI transformation,
(02:04):
adoption, and upskilling strategies. Along with her role at Microsoft, she also has her own venture, EDNAS, where Yara oversees platform operations, curriculum localization, and delivery, ensuring that every module is both globally benchmarked and regionally relevant.
With a Master of Science in Data Science and over three
(02:26):
decades living in the UAE, she bridges product, education, and policy to make AI literacy accessible and scalable across Key Stage 1 to 3 and K-12 schools. So, Yara, welcome to the show. How has it been going?
Yara Alatrach (02:46):
Thank you, Arjun. I do apologize, that sounds like a very exhausting intro to have made across everything, I know quite well. Very excited to be participating in this podcast. And honestly, I love talking about AI in education, so I'm very excited to get started.
Arjun Radeesh (03:02):
You've moved from engineering at ADNOC to AI strategy at Microsoft to education with EDNAS. How has this cross-sector journey shaped how you see AI's role in society?
Yara Alatrach (03:18):
Yeah, it's surprisingly actually quite cross-curricular, by the way, and cross-industry, because of my role and what I've done. When I started at ADNOC and worked as a reservoir engineer, I used to focus a lot on the operations perspective. And at one point I started using AI, of course, in my role. So I was an end user.
(03:39):
And when I moved into consulting, it was more focused on strategies. And when I moved to Microsoft, I started elevating it a bit more with strategies for large enterprises like ADNOC and other oil and gas and energy companies. And what I noticed the most, what was very interesting, is because I had that experience from a different aspect and a
(04:00):
different stakeholder every single time, I was able to notice how people engage and speak about AI completely differently depending on where they're standing. And what I noticed the most is everyone speaks about it in a different language, which causes a lot to get lost in translation, basically.
Because you have the end user who's speaking from their own
(04:22):
perspective, and you have the people who come from a tech company who speak in very deep tech conversations. And a lot of times, and this is where my main role is, I try to bridge that gap. And over the last two years AI has moved from being a hidden infrastructure to becoming more front-facing, and everyone is able to engage with it in natural language.
(04:43):
And with the mandate, of course, that's happening in the UAE to teach AI in education, we noticed that we need to shift how we teach AI in education. It's no longer about just passive introductions to common models, for example. It actually has to become a bit more advanced. We need to be looking at it more from a critical thinking
(05:04):
perspective. So that thought is what led to EDNAS. We want to focus on fluency, not on the hype of AI. We want to focus on giving students and teachers the capability, when they're engaging with AI, as it's being introduced more into schools as well.
What is it?
What is it not?
How do I use it?
When do I use it?
(05:25):
And if I'm using it, how can I be aware of the bias that may exist within these technologies? So the focus of EDNAS, basically, is that it exists to turn that curiosity into confidence for both the learners and the teachers. So AI becomes more understandable, more usable, and more responsible in their hands.
(05:46):
And if you're an end user as well, because they're going to participate in the next generation of the workforce, they need to understand: if I'm an end user, how do I need to question the tools that are being provided to me or being sold to me? How can I go beyond saying, like, I want AI in my own operation or in my own workflows?
(06:06):
And from a leadership perspective, those who are going to be leading their own companies, their own organizations, their own processes, they'll be able to have a better understanding of how does it fit and where should it fit as well? So, to go beyond the idea of just programming and coding, because AI is not just programming, that's the reality of it.
Arjun Radeesh (06:28):
Okay, that's quite an interesting story there. Well, having lived in the UAE for 30 years, what are the unique opportunities or challenges that you see in this region's approach to AI adoption?
Yara Alatrach (06:47):
It's quite insane, but I think it's also, how do I say this? It's very UAE, because the growth that the UAE has achieved over the last 30
(07:08):
years is insane across all the different industries. And when they move towards AI, it's quite interesting how they were one of the first countries that started that mandate. And I love how they see this: they don't see it as a challenge, they see it as an opportunity. And they're treating technology as nation building. It's the hub now for most of the startups as well, competing with the likes of Silicon Valley, if not even
(07:30):
more. And they're going beyond just tool adoption. They want to create the runway for responsible AI education. That's the opportunity that they're leading with. And that's the message that we are aligning ourselves with as well.
Like with EDNAS, we're trying to align with the national priorities that they have on AI and digital literacy. We're trying to support schools with the concept of critical
(07:54):
thinkers and how they can grow to be the future economy as well. And how we're looking at it, honestly, is from a very simple lens: country needs first and classroom practicality always. How can we shape the concepts in schools in order for them to understand how AI works, how to design it, how it
(08:16):
matters, not just the literacy of how to use ChatGPT to prompt and get the research that I need for X, Y, and Z or this thesis or this topic?
Stephen King (08:26):
How are you helping the teachers become more welcoming and more championing of this kind of literacy, this kind of pedagogy?
Yara Alatrach (08:38):
Yeah, so this is why at EDNAS we're looking at it from the angle that we want it to be teachable by any teacher. You don't have to be from a computer science background. So our curriculum is designed, one, to be cross-curricular, so it builds on what's happening at that year level. So let's say in year one the focus is on X, Y, and Z.
(09:00):
The curriculum kind of builds on those topics. So it starts building on how AI is being used in terms of writing books, how AI is being used in specific industries, and so on.
And we're not forcing AI tools in our curriculum as well. So, what we're trying to do is to work with the existing
(09:21):
ecosystem of these schools, because a lot of schools, even before the mandate from the UAE, were already embedding AI tools in a way, and kids use them on a daily basis as well. So, what we're trying to do is to make sure that it weaves into the classroom in a much more subtle manner, and we
(09:41):
have also designed it to take one hour in the week to teach a simple concept and build on it across the year. It's understandable, of course, like you mentioned, teachers are busy, and of course, this topic can be daunting to a lot of people who don't interact with it on a daily basis.
(10:02):
And one of the things that we want to do is the training portion as well, where when we train teachers on AI, we don't want to train them from the perspective of "this is what a neural network is, this is what an auto-encoder is." We're trying to go past that.
Stephen King (10:21):
Yes, it can be very, very confusing. But I'm seeing this now as replacing home economics, becoming computer economics, if you remember that particular course, because each year you're gonna have to learn something different. And you mentioned the use of your phone, the use of your smart TV, the use of your Alexa, all of
(10:45):
these different things. As the young person grows up, they're gonna be interacting with different technologies. So I can see that. Arjun, what do we have next?
Arjun Radeesh (10:53):
So again, we're talking all about AI, and with AI there is also that thought about the ethics, the risks, the social impact, which kind of comes into play. So, how do you balance teaching technical AI concepts with critical thinking about its ethics, risks, and social
(11:14):
impact?
Yara Alatrach (11:15):
Yeah, so I think this goes back to what I was saying a bit earlier as well. We're not trying to focus on just the use, we're trying to focus on looking at these technologies from every single perspective. So we believe that the technical and the ethical should go hand in hand. They must grow together. And separating them will create blind spots, right?
(11:35):
It's gonna create passive adopters and passive users, and that's what we're trying to steer away from.
So, from our perspective, we have our own AI competency framework. I'm not gonna do a deep dive on it on a podcast; I think people will get bored. But the general consensus is that it focuses on systems. What are the systems that we deal with? What is the data that we have to consider when we're designing
(11:57):
something? How do we design it? How do we think about how it's accessible by people? And who might we be leaving out when we're designing this specific model as well? And of course, communication as well. So students basically learn to build, test, and ask who benefits, who's being excluded, and what potentially
(12:19):
could still need a human call as well.
This is a very controversial topic, of course, and I think no one has reached any consensus on whether we go fully autonomous in certain things or whether we should keep a human in the loop. But the idea is to build that habit: it's not a single lesson's thought, it's embedded in every single lesson, in which they always analyze and they ask
(12:43):
what data is driving this, who benefits, who's excluded, and how our judgment should still matter.
Arjun Radeesh (12:50):
Yes, there's one big question I gotta ask now. What do you think are the skills and literacy required today for 10-year-olds, for when they enter the workforce 10 years down the line?
Yara Alatrach (13:07):
I think the true skills that we're seeing more of now, with this complete change and the acceleration of these technologies, is that the people who are surviving are the people who are adaptable, the people who have the critical thinking.
(13:27):
Technology is evolving way too fast, and who knows what it's going to be in 10 years. The tools are going to change, but the mentality and the skill for you to understand and be able to adapt to and consume new technologies is what's going to endure and what's going to be more essential.
I think there was even a study that was
(13:50):
done by Microsoft on this, in which they said that the focus is actually more about making sure that people are empowered with the mentality of how they can become more fluent when they speak about technologies, how they can go beyond simple functionalities and be able to frame problems better, knowing how to ask better questions.
(14:12):
And this is one of the things that I've experienced when I work with people. When we sit and we talk about a specific solution, I tell them, like, don't think about the technology. I know I'm from Microsoft and I'm speaking to you, but I don't want you to speak to me from a technology perspective. Let's talk about how you would think about this problem, the questions that you would like to have, the outcome
(14:35):
that you're trying to achieve, and we'll weave the AI into it. That's how you become even more native in thinking about a specific solution.
Stephen King (14:46):
Yeah, it's good that you brought that in there, because our very, very last question is all about ed tech's influence, and there is some negativity about the power that ed tech is exerting over teachers at the moment. You mentioned Microsoft, and I'm on the Microsoft AI fluency
(15:08):
program, which is brilliant. Like, I will happily say it is brilliant, but it does funnel me down, it does guide me into the Azure Foundry, right? And so in order for me to become literate in machine learning generally, I will then have to go and do the Google
(15:30):
one, I'll have to do the Lambda one, I'll have to do so many, so many others. So what is the role of the private sector and big technology in supporting AI literacy without it becoming training rather than education? Because in education, we have to be balanced, we have to be critical. Too much technology, and all of the private companies, you
(15:51):
know, they have sales targets to hit. Where's the line?
Yara Alatrach (15:56):
I am gonna put a very hard line on this. I do believe AI in education, any focus when it comes to that, should be agnostic; it should not be driven by a single vendor. But let's talk about realities. Public-private partnerships are needed. Technology companies are leading the AI innovation, they
(16:17):
are the ones who know what's happening, what's the latest, what's the roadmap. And when we're talking public, I mean here in education: systems, schools, governments, ministries. They're the ones who, of course, are trying to make sure that whatever generation we're bringing up is able to capture the knowledge that's happening in the
(16:39):
world. And it needs to be dynamic, it needs to be able to adapt to everything that's changing in the world. So we need to be able to leverage that to have discussions, open discussion forums, if possible, with tech companies. EDNAS, by design, is tech agnostic, for example.
(17:02):
So when we're teaching about AI, we're not referencing a specific brand, we're not favoring a specific tech company. What we're looking at is the core AI topic rather than anything else. And we're also not enforcing any AI tools, because we want to work with whatever the school has: some schools are using ChatGPT, some of them are using Gemini, and so on. And it doesn't matter.
(17:23):
The reality is, it doesn't matter what tool you're using, because what we're trying to teach is the concept itself: how to interact with it, how to assess it, and how to evaluate it.
So we always welcome partners who would love to come and collaborate with a real-world context, provide
(17:43):
their tools, provide the resources, but like you said, in my opinion, it needs to be agnostic. It cannot be driven by being forced to only work with X, Y, and Z. Because the reality is, you're gonna graduate, you're gonna go into the workforce, and you need to be able to jump between different companies. And I do have to say one last
(18:06):
thing, I do apologize, one last thing. I see this as well even from a company perspective, because when I work with companies, even their approach now is: I don't want to be stuck with one vendor, I want to be agnostic, I want to have something that's interoperable. And that's the message that we need, even in AI education.
Stephen King (18:22):
Right, I think that's been a wonderful final note. Actually, would you like to close us down for today? Thank you very much, Yara. It's been wonderful.
Yara Alatrach (18:29):
Thank you,
Stephen.
Arjun Radeesh (18:33):
But like, yeah, thank you so much, Yara, for coming on. Thank you so much for tuning in to another episode of Incongruent. Please do like, share, and subscribe. And yeah, follow us on your favorite podcast platform, and send us some feedback.