
July 7, 2024 · 26 mins

Get ready for a captivating episode with Eleonore Fournier-Tombs, the brilliant mind behind 'Gender Reboot: Reprogramming Gender Rights in the Age of AI'.

Eleonore shares her path from a shy teenager in Model UN to a trailblazing data scientist at the UN. 

She dives deep into the motherhood penalty and the urgent need to challenge stereotypical gender roles in caregiving.

Listen to learn:
- Why women’s participation in AI development is essential
- How AI tools often reinforce traditional gender roles
- Ways that AI can be used for the greater good
- How we can all fight gender biases and promote equality

 

Find out more about We Are PoWEr here. 💫


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:08):
Hello, welcome to the We Are Power podcast.

(00:28):
This is the podcast all about highlighting role models. Every single week, I get the opportunity to speak to somebody from a corner of the world doing something wonderful, with the hope that they will pass on their top tips, guidance, hacks, whatever that may be, to help support you, whether it's your career, your life, whatever path you're on.

(00:49):
We hope that we can help pass on some of that insight and help you navigate your way. And this week I am really excited, because we have gone super global in many different ways, and I have got Eleonore Fournier-Tombs, who is the author of Gender Reboot, which is Reprogramming Gender Rights

(01:10):
in the Age of AI. But not only that. She's super accomplished, super award-winning. I don't know where we're going to start today. Eleonore, welcome to the podcast.

Speaker 2 (01:18):
Thank you so much.
It's such a pleasure to be here, to be on such a wonderful
podcast.
I'm really excited to speak to you today, and I'm looking forward to getting started.

Speaker 1 (01:31):
And you are so accomplished. You can find out more: we'll put details of Eleonore's book and her connections right in the chat as well, so please do connect and find out about the work that she's been doing. So how did you find your way into this world of where you are

(01:51):
now? You're in New York, right? You're working with the UN, you're an accomplished author, and many other things. But how did you get there, to having this whole toolkit of accomplishment?

Speaker 2 (02:06):
Well, that's such a big question.
So I've always really been interested in global affairs, actually. As a teenager I was doing Model UN in high school, and I just thought it was fascinating. I actually had a little bit of background because, interestingly enough, my grandfather worked for the

(02:27):
League of Nations back in the day, a long time ago, and so there was a little bit of family interest in these international organizations. And when I had the opportunity to be exposed to that in high school, I was the kind of student who did things like debate team and Model UN and all those kinds of things.

(02:48):
I was extremely shy, which I still am a little bit, so that was always challenging, because I was always trying to do debates and public speaking and all those kinds of things. But I was very interested, and so in my studies I studied political science at first, and over time, a little bit later, maybe in my mid-20s, I started becoming very interested

(03:12):
in this whole world of data and data science and data analytics. My really big UN experience was at UNDP, in a research office called the Human Development Report Office, where every year they publish a report on the state of development of

(03:33):
different countries and the quality of life of people in different countries. I was doing a website and data for them, and I just really wanted to be a researcher; I wanted to have the kind of jobs that I was seeing many women in that office doing. It was so interesting, and so that's when I started pursuing a

(03:54):
PhD, and I became a data scientist. I started doing more academic work, and now I lead a research team inside the UN, focusing in large part on AI policy and AI governance, which is just the hot topic right now.

Speaker 1 (04:11):
Absolutely, for sure, and I really hope you're able to bring some of those insights into this. And also, it's the demystification, isn't it, around AI? Because there are so many different forms of it, and there's so, so much. There's an excitement for it, but there's also a real fear of it as well. What inspired you to write your book, which is Gender Reboot:

(04:32):
Reprogramming Gender Rights in the Age of AI?

Speaker 2 (04:35):
Well, during COVID, you know, everybody was just so stressed and anxious during that time, and I was in a kind of unique place where I was working full time as a consultant, so I didn't have that much job stability, but I had small children, and so I was

(04:57):
terrified. My youngest was nine months old, and then I had a four-year-old, and so, with my husband, we were doing what many parents of young children were doing, which is taking care of the kids during the day and then working at night. And it happened that in my consulting work I ended up

(05:19):
working a lot on predictions related to COVID-19, so particularly predicting the spread and the severity of the disease in countries experiencing humanitarian crises, and looking at compound risks, which I thought was so interesting: COVID-19 plus a natural disaster, plus already having poverty or food insecurity, just trying to understand that for

(05:41):
international organizations. And I started becoming angry, because I felt: how can I contribute to this really important global issue if I don't have time? And it made me really think about gender roles and the pressure that we have on women to perform care work, although, you know, as a mom, I love my kids and I love spending time

(06:03):
with them. I do feel that there's this expectation that women are going to do both, that they're going to succeed professionally and that they're going to be really, really engaged and take on the majority of the care activities and duties. And so I started unpacking that and thinking about it a little bit more during the pandemic, and I came across the concept of AI

(06:30):
and AI bias and discrimination against women, and I realized that AI is being adopted around the world in so many different sectors, and it's accelerating and accelerating, but that there are certain dimensions of AI that could actually be harmful to women, could be discriminatory or stereotyping, and could actually regress some of the advances in gender equality

(06:54):
that we've had in the last few decades. And I was also thinking about how important it is for women to be engaged in these activities. As a data scientist, as someone who was developing AI models, I also faced barriers to self-actualizing in that space, and I was thinking about how important it is to have women fully participating in

(07:17):
this really new and powerful technology.
So that's why I wrote the book.

Speaker 1 (07:22):
Wow, and it very much explores the impact on women's rights, and the bias against women, as you just talked about. The stereotyping, I know, is something that's so massive. Because your book explores the history of gender dynamics in the workplace, what do you think is the most radical thing that has changed

(07:44):
at work in the last generation?

Speaker 2 (07:46):
I do think that there have been so many more opportunities for women, in the sense that this awareness of the importance of diversity and of having women in leadership has really changed in the last generation. My mom was telling me a story about how she had applied for a senior position in her domain and how she had been told, 'We

(08:12):
don't want a woman in that role.' And that's a huge barrier. Actually, I put in the book that my grandmother was working as an analyst in the Secret Service in the US during the Second World War, and when she got married there was a marriage ban, which they had in the US and also in the UK. So when you got married you would have to leave the civil

(08:33):
service and you couldn't work anymore for the government. That was abolished a long time ago, and telling women that they can't access positions of leadership has also changed a lot, particularly in the West. It depends on the country, though. There are still many countries around the world where there are

(08:54):
enormous barriers to women's leadership and women's participation in the workforce. Afghanistan, for example, is a place which has regressed a lot. But in the last generation, I think, where we're sitting right now, you can't really tell a woman, 'Because you're a woman, you can't apply for this position,' or 'We're not going to even consider you.'

(09:15):
So that's really, really been very encouraging, and I think it's allowed us to have really different kinds of careers than our mothers and grandmothers might have had.

Speaker 1 (09:24):
Wow, and it's fascinating that you've brought your personal experiences into the book. But what can we do about it? How can organizations out there, or individuals, people listening to this podcast, work better and smarter towards eliminating or mitigating these gender biases in AI technologies and

(09:46):
creating that much sought-after equality in the workplace?

Speaker 2 (09:50):
There are so many different dimensions to the question. If we focus on AI in particular, in the book I talk about three main categories of risks for women and gender equality in AI, and one of them is discrimination. So, if an AI system has different outputs for women than

(10:11):
for men based only on gender. For example, AI is being used more and more in human resources algorithms. So, for example, you apply for a job, which many of us do: you upload your CV onto a web platform. Now many, many companies use an AI to analyze the text in your CV and to put a hireability score on that.

(10:34):
We've seen, and news started coming out on this in 2018, so it's been quite a while, that in some cases there is discrimination based on group. The big example was Amazon, which in 2018 was criticized because it was eliminating the CVs of women who had applied. So if you have in your CV 'I was captain of the female rugby

(10:59):
team', you have the word 'female' and they identify you as a woman, and then they were dropping that CV. And also in the US they have women's colleges: there's one called Barnard, which is affiliated with Columbia University, and there's another one called Smith, and so they

(11:20):
were also eliminating CVs where the applicants had gone there. It was unintentional bias, but it had an enormous impact on women, if you think about how often these tools are used, how rarely they're audited, and how often they might discriminate against women and block us from having opportunities. So that's a

(11:42):
really immense risk for us, and we're also seeing this in loan attribution, unfortunately. So not only do we have a risk related to applying for jobs, but also to getting less money from the bank than men. In many cases, research has shown that the algorithm will attribute a higher loan capacity and a

(12:03):
higher capacity to repay to men than to women, with all other variables being equal. And that's because the tools are trained on data which shows historically that men made more money or that men were more present in the workforce, and so they tend to bias towards men.

(12:24):
So the most important thing that needs to be done is to raise awareness and to make sure that all of these tools are audited and tested specifically for this before they're deployed, because otherwise we're not going to have access to jobs and we're not going to have access to capital, which has an immense lifelong effect. You know, all you need in your life is to get one job or one

(12:47):
mortgage, you know, and then it completely transforms your life. So if you have all these barriers over the course of your life, the trajectory of your life as a woman could be so different from your trajectory as a man.

Speaker 1 (12:59):
Wow. And I've heard you talk about the motherhood penalty as well, and how the impact of that relates to the lack of equality in the distribution of roles. And there's a whole debate, I think, about whether caregiving could be given by robots and things like that. Can we unpack the motherhood penalty a little bit?

Speaker 2 (13:21):
Of course.
So I think one of the things I talk about a lot in the book is that it's not just about women; it's about gender roles and this rigidity that we still have in our society about gender roles. So it is improving. But it's not just women accessing what I talk about as the

(13:41):
public sphere, so women accessing employment and leadership and so on; it's also about men accessing care and the home and domestic labor. Often we have this idea that somehow women are better caretakers, so women should just do all the care work, and I fundamentally, really disagree, because I think that as long as

(14:04):
women have to do all the care work, they're going to do it, because we all love our children and we can do everything. But we're not going to be able to fully self-actualize if we have so much work at home. And many men that I spoke to for this book really felt that they

(14:24):
were suffering a lot of discrimination in relation to care work. So there were men I spoke to who have paid care employment. For example, a midwife who's a man: there's only one male midwife in the province of Quebec, which is where I'm from, where Montreal is, considering that there are many male OBGYNs, for example. So

(14:45):
it's not like they're not in the field, but for a very care-related employment there's only one. And he struggled a lot, particularly with women, to be allowed to accompany women in that process, and people were suspicious: why would you be a man and want to

(15:06):
be caring and want to have this kind of profession?
And men in nursing, and men even in social work, face suspicion from their friends or from society in terms of: why would you want to do this? This is not what men do. And so we have these really, really

(15:31):
big biases.
For example, in elementary school and early childhood education, it's 95, 97% women. It's almost only women. And I really think that if we're going to have true gender equality, and women are really going to be able to be leaders, be technology developers and so on, we have to open the door for men to be a babysitter, to be an early

(15:52):
childhood educator, and let them also express their nurturing side, which is definitely there and which is still quite blocked up in our society and not acknowledged.

Speaker 1 (16:03):
Absolutely, and we talk so much about those stereotypes being formed at that early age around our young children. Around your youngest, if you like, at six or seven, those stereotypes are formed, and if they're not seeing, you know, either men or women in all those roles, then our young ones are growing up with that in mind. One thing I heard you talk about as well was

(16:27):
the apps. When you're building apps and technology, build them intentionally; don't try and retrofit them. You were talking about ride-sharing apps, I think, and you managed to do good with that in the end, out of something that was so dreadful against women.

Speaker 2 (16:48):
Yes, so this was an example from my research in Indonesia, where they have a ride-sharing app which does a few other things. It's called Gojek. It's very similar to Uber or Lyft, and the issues that they face with this app can also be faced with all kinds of

(17:08):
other apps. Basically, what was happening is that women were using this application and getting rides and then suffering sexual violence, being kidnapped, those kinds of things, so real security risks from this AI tool. And so they were heavily criticized, of course, and they were requested to go back and rethink the design of the app to

(17:31):
make it safe for women to use. And so they started adding features, for example a 'share my ride' feature, which, we all know what it could be: it allows you to share where you are with your friends and family, for example, when you're taking a ride. There was also better biometric identification of drivers, so it couldn't just be a random person picking

(17:53):
you up. And they added a few other features like that to make it safer, and they really thought about women's safety and security as a core part of the company, and it really changed things a lot. And so in the book I talk about a lot of different examples: when you put women's rights and women's security and gender

(18:16):
equality at the core of the application, what a big difference it can make. And also when you give the tools to women, because women can also develop tools with a different perspective in mind. So they can develop, you know, chatbots that are specifically for helping women who have experienced domestic violence,

(18:37):
or helping women with women's health, for example, or even developing tools and applications that would help in care work. So one of the things that we've alluded to a little bit in our conversation is this stereotyping, and I do really want to address it. So when I was writing the book, there was starting to be this idea that some AI tools actually stereotype women or have gender

(19:02):
stereotypes.
In translation algorithms, for example, we were seeing that if you write a certain text and you translate it into another language, the translation application would default to stereotypical genders for your story, basically, or your text. So it will give you male gendered pronouns for

(19:24):
phrases related to leadership and technology and philosophy, and being a captain, and it would use a female gendered pronoun for things like care work and, you know, dancing, which I think is cultural as well, or cleaning the house and so on. And since then this has really, really exploded, because

(19:48):
generative AI, which we have been using in the last two years, is very, very, very stereotypical. So when you have a generative AI tool that creates text or images or videos, it goes along these historical, rigid gender roles to such a great extent: the texts are

(20:09):
all very gendered, and the images that they propose and create are very stereotyped. And my worry is that it's going to start eroding the way that people think about gender roles and the progress that we've made so far, and kind of put us back, because when we're exposed to stereotypical content, we start to internalize it.

(20:32):
If I never see examples of women that are leaders and women that are innovators and explorers, how would I think, as a woman, that I can do that too? Right? And I think it's the same thing for men: if you don't see examples of men that are stay-at-home dads or caretakers or nurses or midwives, as a man you would feel so strange to have the inclination

(20:55):
to do that.
So we have to be really careful of the content that we put up there, and we really need to make sure that the companies developing these tools are aware and are always testing for that and preventing it, and not allowing these kinds of tools to be deployed if they're going to harm gender equality.

Speaker 1 (21:13):
For sure, and I think people out there can be quite fearful about AI. But what can we be excited about? What are the opportunities? I know you have this phenomenal role as research lead for the UN's new AI advisory body. Give us some hope. What are the opportunities out there for AI as that real tech

(21:34):
for good in the field of gender equality?

Speaker 2 (21:38):
So I don't think that we should be afraid of AI. I think there is a very binary kind of discourse, that it's either just risks or just opportunities. I always try to think about it more as responsible or safe AI, in the sense that we have all these different technologies in the world, like aviation, and it's very powerful and it needs

(22:00):
to be safe, otherwise you can't use it. If we had planes and they were just unsafe and falling out of the sky, we would never be able to harness the opportunity. We would never use them. So I think it's kind of the same with AI: it needs to be safe, and then, once we've addressed these different risks, we can use it. So in my work, I've used AI a lot in the field of predictive

(22:23):
analytics. So it's predictions, and we can do a lot at the UN for predicting humanitarian crises and natural disasters, and for understanding with analytical tools, for example, what the impacts of climate change would be, who is going to be most

(22:43):
affected, and how we can help them. So AI is very powerful in analyzing big data sets and predicting what might happen, not always with extreme accuracy, but enough to give us ideas in terms of how to move forward. And I actually think that in the domain of the climate crisis

(23:03):
we really need to harness AI to help us prevent climate change and address carbon emissions and energy efficiency, and so on. I also think we should use AI in terms of adapting to the effects of climate change: figuring out, using analysis, where to move,

(23:26):
whether we have to move communities, how we use air conditioning, how we support vulnerable peoples. And so I really think that for these kinds of big global issues, we could use AI to really help us get out of them, because of the immense capacity to analyze data and to predict.

(23:48):
So to me, those are the most powerful uses of AI. And if we have women front and center in the development of these tools, I think we'll have a very different perspective on how to develop them. What are the kinds of features that we could have? Women really have, in a lot of cases, different life
(24:09):
experiences, and we want toaddress these tools so that they
can really affect societypositively.
So we need to have a lot ofdiversity in who develops it.
Otherwise, it ends up beinginappropriate or kind of
incomplete for the people thatare being affected and, finally,
I think, for women themselves,because it's so powerful.

(24:29):
There's a lot of money involved in AI. There's a lot of economic opportunity and social opportunity, and so it's really great for us to be involved, to really have our careers take off and be engaged in this transformational technology, because those in

(24:49):
society that control technology really control society. So it's a really big opportunity for us to have a say in how our society is developed.

Speaker 1 (25:05):
Totally fascinating, and it totally reinforces everything that we're always talking about. Let's make sure we've got more seats at those tables. Let's have more people enabled in those conversations, putting women front and center in this evolution of technology, in these decisions, when we're building this technology. Make sure that we have women at the heart of it. Eleonore, thank you so, so much for enlightening us. Thank you for sharing your story with such passion, and your

(25:26):
research. What you're doing is literally changing the world. Please check out Eleonore's book, which is called Gender Reboot: Reprogramming Gender Rights in the Age of AI, and which is absolutely necessary. Thank you so much for joining us. I really appreciate it.

Speaker 2 (25:42):
Thank you so much. It was such a pleasure to be here with you. I really appreciate the work that you're doing on this podcast, and it was a real honor to speak to you.

Speaker 1 (25:51):
Oh, thank you, and please do keep this conversation going. Thank you for listening. Wherever you're listening, please pass this on. Share this with someone who can make that change or can be at that table. That's what this is all about. How can we keep role modeling the really great stuff that is happening out there, across the world? Please stay connected on all of our socials: Facebook and LinkedIn,

(26:11):
We Are Power; TikTok, Insta and Twitter, We Are Power underscore net. Thank you so much. This is the We Are Power podcast, a What Goes On Media production.