
September 1, 2023 35 mins

In this episode, Laura engages in a riveting conversation with Anne Ikiara, the Executive Director of the nonprofit Digital Action. She has a remarkable background directing social enterprises in Global Majority countries and is known as an author, poet, speaker, gender consultant, and social advocate.

We talk about:

  • Lived Experience of Disinformation and Violence: Anne shares her personal connection to the impacts of disinformation, misinformation, and hate speech during the 2007 elections in Kenya, where violence ensued. This experience inspired her to join Digital Action and make a global impact on protecting democracy from digital threats.

  • Understanding Disinformation and Violence in Elections: We delve into the factors that lead to disinformation and violence in elections, particularly in Kenya, where ethnic divides play a significant role. Anne sheds light on how misinformation and hate speech are propagated online and offline, contributing to social conflicts.

  • Digital Action's Mission and Initiatives: Anne outlines the role of Digital Action, a nonprofit organization focused on holding tech giants accountable for safeguarding democracy from digital threats. She explains the disparities between investment in Global Majority and Global Minority countries and how Digital Action seeks to bridge that gap.

  • Challenges and Strategies in Tech Justice: Anne discusses the challenges presented by the ever-evolving social media landscape and the fragmentation of platforms. She elaborates on how Digital Action's coalition is working to ensure that tech companies invest in safeguarding democracy across the globe, not just in certain regions.

  • Global Campaign for Tech Justice and Protecting Elections: Discover Digital Action's campaign for 2024 to make it the Year of Democracy and Safe Elections. Anne emphasizes the importance of partnering with various organizations to raise awareness about digital harm during elections and compel tech companies to address these issues.

  • The Power of Context-Specific Safeguards: Anne stresses the significance of context-specific content moderation and safeguards in addressing digital harm. She discusses how tech companies should collaborate with local organizations, governments, and civil society to ensure effective protection.

  • Anne as an Author and Poet: Learn about Anne's creative side as an author and poet. She shares her passion for writing about justice, equity, human rights, and women's rights. Her forthcoming book sheds light on the nonprofit sector's inequalities for people of color.

  • Personal Responsibility in Combating Disinformation: Anne provides practical advice for individuals to combat disinformation and hate speech. She emphasizes the importance of verifying information before sharing it and encourages spreading positive messages that promote democracy and human rights.

Don't miss this informative and thought-provoking episode with Anne Ikiara as we explore the complexities of digital threats, democracy, and the power of collective action.

 

Connect with Anne and Digital Action:


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Laura May (00:10):
Hello and welcome to the Conflict Tipping podcast from Mediate.com, the podcast that explores social conflict and what we can do about it. I'm your host, Laura May, and today I have with me Anne Ikiara. She's the executive director of the nonprofit Digital Action and has a wealth of experience directing and working with social enterprises

(00:32):
in Global Majority countries.
She speaks six languages and has the entrancing LinkedIn tagline of "author, poet, speaker, gender consultant, and social advocate". So I'm excited to dig into all of those identities.
Welcome, Anne.

Anne Ikiara (00:48):
Thank you, thank you. Hi Laura, thank you so much for having me.

Laura May (00:52):
No, I'm so excited to have you here, because, you know, the work you've been doing with Digital Action these last few months since you've started has already been so interesting and fascinating to me, as someone who stalks you on social media. So I'm really glad to have you here with me today to talk about it and learn a bit about you and learn a bit about the organization as well. So, I understand that Digital Action protects democracy and

(01:16):
human rights from digital threats.
But before we dig into that, I actually wanna know about you.
So what led you there?
What sort of piqued your interest in this kind of work?

Anne Ikiara (01:29):
Thank you.
Thank you, Laura.
I have lived experience of the effects of disinformation, misinformation, and hate speech that is propagated online. In 2007 we had elections in Kenya and, uh, owing to disinformation, misinformation and hate speech, we had post-election violence.

(01:52):
At that time I was running a small national organization called Nairobits, and as you may anticipate, the epitome of the violence was in the non-formal settlements. So I came face to face with young people whose livelihoods had been destroyed, their houses had been burnt, they had lost relatives.

(02:14):
And even some of them had physical injuries. And I remember one day a young person coming to me and telling me that their home had been burnt. And as a result of that, more than a thousand people lost their lives and more than two hundred others were displaced. So I understand from a lived experience perspective what this

(02:38):
could mean at the personal level.
That is why when I saw the role at Digital Action, I got very interested, because I wanted to have this kind of impact globally and contribute to elections and protect democracy from threats.

Laura May (02:58):
Absolutely.
And so, I mean, I know very little about Kenyan politics, and I have a hunch that maybe quite a few of the listeners don't know much about Kenyan politics either. So can you give me just a little bit more information about what this disinformation and misinformation was? Like what actually led to these outbreaks of violence and displacement

(03:19):
in Kenya around the elections?

Anne Ikiara (03:21):
Well, actually it is something that
perpetually happens in Kenya.
It happened in 2007, in 2012 again. And even last year, in 2022, it did happen.
What we have in Kenya is, um, ethnicity. There are different tribes in Kenya, and much of our politics follows

(03:43):
ethnic lines, so it's very easy to have disinformation and hate speech, especially now with the digital media, along those lines. So that's exactly what happened in 2007, where we actually used phone SMSs to send hate messages and disinformation against other communities, which

(04:07):
is how the violence happened. And in the same way now it's even become worse, because Kenyans have embraced digital media and social media more than most other African countries. So it has, uh, escalated. Disinformation and misinformation has always been there in the context of

(04:27):
elections, it has always been there. But now it is very easy to spread because of the tools that we have in social media. So that's what happened in 2007, 2008. And that is how the violence happened, because the information pitted communities against each other. And then it went to offline violence where we physically fought each other.

Laura May (04:53):
Awful.
Yeah, and thank you so much for shedding light on that. And I mean, it does sound really difficult, 'cause it's along those ethnic lines as well, that I guess make such a visible cleavage for people to use and to exploit for their own political ends. So it sounds really, really difficult.
So tell me then about Digital Action.

(05:14):
What does the organization actually do?

Anne Ikiara (05:16):
So Digital Action is a small but mighty organization.
It started in 2019 to protect democracy from digital threats. We are a fiscally sponsored organization that is funded by the SCO Foundation, Luminate, the MacArthur Foundation, Open Society, and the Ford Foundation.

(05:37):
And our work really is to take tech companies to account, to protect democracy from the threats that are propagated on their platforms. So big tech companies such as Meta, Twitter and YouTube have underinvested in the Global Majority countries. So much of their investment in protecting citizens is spent in the Global Minority.

(06:03):
But then you and I know that much of the harm happens in the Global Majority countries. So Digital Action is trying to take tech companies to account, to invest as much in protecting the Global Majority as they do in protecting the Global Minority.

Laura May (06:26):
Absolutely.
And for those listeners who haven't encountered these phrases before, Global Minority and Global Majority: it aligns more or less with this idea of West and non-West, or Global North and Global South, but stresses that in fact what had previously been described as the Global South is the majority of population, the majority of countries, the majority of land area, and yet

(06:48):
not getting the majority of resources. And so, yeah, for those who are listening, that's what we're talking about. I understand that in the EU, for instance, there's a lot of talk about the Digital Services Act and things like that, which will help, as I understand it, to regulate some of these social media platforms. Are there similar initiatives and legislation underway in Africa?

(07:12):
I mean, I guess I wanna know: this divide in resources, is it related to local legislation, or is it related to the biases of the tech companies, or is it
related to something else, do you think?

Anne Ikiara (07:25):
It is related to the biases of tech companies. You understand that tech is very versatile. It usually will move faster than regulation in specific countries. And it's a very complex legal situation, because most of the servers, of course, are not based in the Global Majority. They're in the Global Minority. So it's very easy for a big tech company to sidestep local regulation.

(07:51):
So what Digital Action is trying to do is to take them to account: to provide safeguards based not on the level of, uh, resources that they get or the business model, but on the level of harm that could happen. So if in Kenya, for example, or in Brazil or in any other country, the

(08:13):
level of harm is huge, then they should invest more in that context, as much as they invest in the US, where they get most of their business from. So that is what Digital Action is trying to do. Because right now, the model follows the money. Where they get advertisements and where their revenue is

(08:34):
coming from is where they invest, ignoring the Global Majority, where, of course, much of their platforms have been taken up by citizens. And the effect is even worse, for obvious reasons, because of the lack of resources to mitigate some of the challenges that are occasioned by that situation.

Laura May (08:57):
No, it, it absolutely makes sense.
And something I was really struck by is when I was reading Chris Wylie's book about Cambridge Analytica, as one of the whistleblowers, and he talked about how this organization had started off experimenting in Africa and trying to influence elections there and trying to, like, stir up different types of partisan violence there.

(09:19):
And so it was almost as this testing ground, I suppose, for this Global Minority based organization. And the consequences have just gone untalked about, right? Because, I mean, we heard about Brexit, we heard about, like, you know, obviously Trump's election in the US as well. Like, oh yeah, this is all because of media manipulation, whatever. But what we don't hear about is the harm in Global Majority

(09:40):
countries, as you've just flagged.

Anne Ikiara (09:43):
That's what Digital Action is trying to amplify. Because we are working with partners. We are a frontier organization. We don't necessarily do the work ourselves, but we like to front organizations in the Global Majority that are doing different things
to make that environment safe.

(10:05):
So there are different people doing different things. There are researchers, there are civic educators. There are other policy people at the intersection of policy and regulation. But Digital Action is the convener. And at the moment we have more than a hundred and forty organizations across the world. And we are having a campaign to make 2024 the Year of Democracy and to make elections safe.

(10:33):
And in 2024, over 65 countries are having elections. And that is the first time in a century where so many people will be having elections, and the level of threats then is heightened. Because if there is no regulation and if there are no safeguards

(10:53):
in that space, then you can see the level of harm in 2024. So we are having a campaign that is being launched on the 15th of September. And we are calling it "Protect people and elections, not big tech", and there are organizations in the space partnering with us to

(11:13):
really make sure that the campaign is very strong and that the big tech companies listen and pay attention to some of the asks that we have.

Laura May (11:24):
It actually sounds really scary.
'Cause I mean, you've just highlighted that misinformation, disinformation, and hate speech had this profound and in fact physical effect in Kenya. And yet now we're talking about 65 different countries which are gonna have elections, which could be affected in similar ways and by similar means. It sounds like we could be hearing about violence, about co-optation

(11:48):
of democracy in countries. Like, it's, it's quite scary what you're talking about.

Anne Ikiara (11:53):
Yes.
It's very, very, yeah.
And that is why our campaign, the Global Coalition for Tech Justice, is convening to really protect people and not the big tech companies, and to call the Metas, the YouTubes, and the Googles to account. To protect, to mitigate that situation in much the same way as they would

(12:15):
mitigate in the Global Minority, to make sure we are all safe in 2024. It's really a big test and it's also a big opportunity for them to show concern and responsibility in protecting democracy.

Laura May (12:32):
And so when you talk about protecting people and not big tech, and
you've mentioned safeguards a few times.
What are the asks?
What are the safeguards?
What could actually protect us?

Anne Ikiara (12:42):
Okay, what could protect us? Some of their policies are aligned to the West; you know, they are specific to the English-speaking context especially. But in other countries, like in Kenya, for example: you just said in the beginning that I speak six languages. I could write hate speech in any of those languages, you know, on Twitter, and

(13:04):
it'll not be flagged unless they have found somebody, or they have context-specific safeguards, so that content moderators really understand that language and the challenges that are specific to the Kenyan context.
So, one ask is for them to make sure that content moderation and

(13:24):
safeguards are context-specific.
And then the other is they should be transparent. Because right now we really don't know what safeguards are in place. We don't know how much money is being spent, or where; we really don't know. So we ask them to be transparent.

(13:45):
You know: we are using this amount of money in the US, for example, and we are using this amount of money to protect Kenyans as well, for example. So they should be transparent, and the resources should match the level of harm anticipated, and not the revenue.
That is another one.

(14:06):
And then they should also operate throughout the election period. Because, like you saw in the US, and like I've given you the example of Kenya, they let their guard down immediately the election happened. And then we are talking about post-election violence.
So they stop moderating.

(14:26):
So, so they should put in measures before, during, and after the elections. They should offer a comprehensive range of tools and measures, adapting to the local context that we have talked about.
And they should also involve governments.
Not in the way of buying them off so that they're silent about the

(14:49):
harm, but partnering with them to make sure that the elections are safe. And not just governments, but also election bodies and civil society. They should partner with us, because we are on the ground and we can point out areas of concern that they can invest in. So in a nutshell, those are some of our asks.

Laura May (15:09):
Yeah, I have so many questions about the asks.
The first one that comes to mind is: you mentioned that they need to put resources into protection not just during election campaigns, but also afterwards, because, as you mentioned, post-election violence. And something that strikes me, 'cause, you know, before we started having this

(15:31):
call, before we started recording, we were talking a lot about gender and racism. And so I guess when I think about this, I think, oh well, yeah, post-election violence is bad, domestic violence is also bad. And so maybe they should have these safeguards and this moderation always. Like, why not dedicate resources to protecting people, not just in the context

(15:53):
of elections, but to protecting people from misogyny online or racism online,
which also lead to violence, right?

Anne Ikiara (16:01):
Exactly.
Misinformation and disinformation is a very wide subject and covers different kinds of concerns. And this is just one of them, but that is what we focus on. But even in the context of elections, it's not gender-blind.
Women candidates, even women election officials, have been targeted with hate

(16:24):
speech that really removes agency and integrity from them as election officials, but also some of that has moved from online to physical harm to themselves. Because really the way they're portrayed in media, in social media,

(16:45):
can sometimes expose them to harm.
And it has happened in several places where women have really been targeted and sometimes even physically harmed, and their families. Even some of the harm has extended to their families. So it is not gender-blind. It's a very gendered concept.
Mm.
Yes.
So that is also something that should concern them, and that is

(17:08):
why it should be context-specific. Because, like, in the context of Africa, for example, and in other Global Majority countries, women are just now getting into elected positions, competing for positions in the electoral space. And it's not yet a very acceptable concept in some areas,

(17:30):
especially in Africa. So women are really targeted, candidates especially, and it's gonna appear very interesting and very annoying, because for the men nobody talks about their private lives.
And, and,

Laura May (17:48):
What they're wearing

Anne Ikiara (17:50):
or what they are wearing.
But for women, somebody will talk about what they wear, who they are married to, and how many children they have. I don't know, who they ever dated. And there are all these things that are really not relevant to the electoral
position that they're looking for.
So that should also be a concern.

(18:10):
But most of these things are context-specific. That is why we insist that they should enable accountability at the level of the context.

Laura May (18:21):
This actually makes me really curious about Rwanda, of all places, because as far as I'm aware, they're the only country in the world that has a majority female government. And so I'm really curious, especially given the context of their history. You know, it's what, nearly 30 years since the genocide. I'm like, I wonder for myself what hate speech looks like in

(18:44):
Rwanda nowadays, around the electoral cycle and around the role of women. And if it's somehow different. Very curious. I mean, I don't know if you know this, I'm just like, ooh, that's so interesting.

Anne Ikiara (18:55):
Rwanda is a very progressive country, and the
rule of law is followed.
I'm not very familiar with that context, but my estimation is that there will always be subtle gender issues in this context, but it may not be as pronounced in Rwanda as it is in other places.

(19:15):
Like, for example, compared to other African countries, Rwanda might be a little bit ahead, but that doesn't mean it's completely absent. It might be there, but it might be more subtle than it is in other countries.

Laura May (19:30):
Hmm.
No, I would be really curious, because, yeah, when I think about the Australian context, and obviously we had Julia Gillard as a woman Prime Minister, and she was just shredded in the media, for, yeah, what she was wearing and her, inverted commas, "lifestyle choices", and all of these other things.
And it was brutal.

(19:51):
You know, this sheer misogyny she faced on a day in, day out basis. And yet people think about Australia as this, you know, developed country, it should be progressive, right? Like, women get into government so it can't be sexist. I was like, well, I've got news for you, buddy.
Like, that's not how it works.
That's not how it works.
Oh my goodness.
I'm gonna leave that, that alone.

(20:14):
There's actually something else I wanted to talk about, which is also difficult to measure, because you referred to this idea of aligning funding and resources to the level of harm done on social media platforms. So how do we measure levels of harm, and particularly potential harm for an election that hasn't happened yet?

Anne Ikiara (20:37):
Yeah, good question.
That is, that is difficult.
And one of the things that we are struggling with is that there's no baseline data, but given the past elections, and given the heat before elections, it is possible to anticipate that indeed

(20:58):
we need to invest heavily here.
Because, like, if I give an example of Kenya, because that is where I come from, elections are usually hotly contested, and it's very clear what the proponents are. So it is possible to estimate that people will be posting comments in their

(21:19):
native languages, or probably sometimes in Kiswahili, and I think the level of investment should follow that trajectory. And it should be properly monitored, so that as it escalates, then also the level of protection follows.

(21:39):
Because once you have a service that is potentially dangerous, then I think you also have the responsibility of mitigating that risk, however big it might be. Yeah, so it's a grey area, admittedly, but, um, big tech companies should have the resources to do their own research and be able to anticipate the

(22:03):
level of investment that is required in their platforms to mitigate. And I don't think it is impossible, because many of the factors are known long before the election takes place. And if they're willing to have partnerships with local civil society organizations that are invested in the local context, and

(22:24):
governments and electoral bodies.
Then it should be possible to really understand and single out the factors that constitute risk, so that they're better able to mitigate, long before the harm happens.

Laura May (22:38):
It's beautifully put.
I mean, yeah, you have the concrete need for local language moderators, but as you've just highlighted yourself as well, people already on the ground in civil society already know what the danger zones are. They already know if something's gonna blow up.
And yeah, by partnering with those organizations, social media platforms

(22:58):
can say, "oh, we do actually need to toallocate some resources here, we do need
do a better risk assessment for here.
Like it absolutely makes sense.

Anne Ikiara (23:06):
Yes.
Exactly.
Yeah.
Risk assessment should happen in every context, and that's actually one of our requirements, one of our asks. For you to make it context-specific, risk assessment must take place in that particular context.

Laura May (23:27):
So, tell me a bit more then about this campaign you're launching.
How do people get involved?
Like how does the campaign work?

Anne Ikiara (23:34):
Okay.
The way it works is that over the past one year, we have been researching and trying to find out the best method to coordinate and cooperate with people. And it's been a very consultative process in which we have talked to different people globally. So in June we launched our website, in which different people and organizations,

(24:01):
both individuals and organizations across the globe, could sign and agree to our regulations, because for any organization, any coalition, there has to be something that is bringing you together. So they needed to agree on our campaign asks. Since then, 140 organizations and individuals have signed on.

(24:26):
We call it the Coalition for Tech Justice. That's the name that we have given it. And then together we are having different activities. We are having the official launch on September 15, the International Day of Democracy; that's when we are having the launch. After that, there'll be different activities to highlight what we are

(24:49):
doing, by different organizations that have already decided to partner with us. And of course tied to that is what I talked about earlier: writing to big tech companies specifically to ask them to make the environment safe, equitably across the board.

(25:11):
And different organizations will have different activities. Even individuals will have different activities, all to create a lot of visibility around the issue of digital harm. And we shall monitor different elections that are happening in 2024, and make sure that we understand the level of harm and the safeguards that are being

(25:35):
put in place, so that then in the year 2025 we'll be having some data to take big tech companies to account, and to ask for policy direction, now based on hard data that we are going to have collected from monitoring the elections.

Laura May (25:55):
Amazing.

Anne Ikiara (25:56):
Yeah.

Laura May (25:56):
And a huge project, a huge campaign.

Anne Ikiara (25:58):
Yes.
Yes.
And you'll be surprised: the campaign is run by four people within Digital Action. Our team is small, but we have the bigger network to front our case.

Laura May (26:11):
Amazing, I love that.

Anne Ikiara (26:13):
Yes.

Laura May (26:14):
And so, something else I'm curious about is the sheer confusion of the social media landscape at this point in time. Obviously we've seen Twitter has become X, with some horrible-looking branding, and it's obviously sort of falling apart. You know, people talking about the dark days of Twitter, the fall of Twitter.

(26:34):
We've seen similar things happen with Reddit, in terms of there was a lot of fuss about the APIs being cut off, apps no longer being used. We've seen migrations to Mastodon servers, to Lemmy, to kbin; we've seen as well the launch of Threads, which, after the first three days, I heard nothing about. Who can we even talk to in this environment?

(26:57):
Like who, who are the people?
What are the correct platforms?
This sounds like such a confusing, huge puzzle.

Anne Ikiara:
Digital Action has written letters to specific people that are responsible for exactly what you have described. We have, through our own networks, identified people who are responsible for

(27:20):
making the platform safe at Twitter, at Google, at YouTube, and at TikTok, and we have written specifically to them. And actually, the deadline for them to respond to us is the 4th of September, in anticipation of our launch on the 15th.

(27:41):
It is not as faceless as it might look, because there are people running those offices. There are people who report to that office every day, and their task is to make the platform safe. So we have written specifically to those people to make sure that they tell us what exactly they're going to do for the 65, more than 65 countries

(28:02):
that are having elections in 2024.

Laura May:
And what does that look like for the decentralized platforms? Like Mastodon, for instance, like Lemmy and kbin, where there's not one person in control or one company in control. For example, for Mastodon, I'm on a server for social scientists,

(28:23):
which is managed by social scientists, and you have to, like, be a social scientist to be accepted. But I mean, there's heaps and heaps of different servers and they all have their own rules. I mean, Truth Social, like Trump's network, is a Mastodon server. And I mean, I'm assuming you can't really write to that server, or whoever's running that server, and say, hey, would you mind just not using hate speech?

(28:44):
Is that cool with you?
Like, so how do you deal with this decentralization issue?

Anne Ikiara (28:51):
We have to find strategies to deal with it, because it's evolving. It's an evolving threat. Every day there is something that is different. So it's something that we should anticipate as we continue, because there are evolving threats every day. That's why this field is very challenging, and it's also very exciting, because you see different things every day, which you might either

(29:16):
anticipate or respond to as they evolve.
So that's another new challenge.

Laura May (29:23):
Yeah, it sounds really difficult in the era of decentralization and sort of fragmentation of the social media landscape. At least with targets like Meta, like Google, you do have, as far as I'm aware, the majority of the world's population on there. So they're pretty good targets for reducing harm in the interim.

Anne Ikiara (29:41):
Yes.
They also have a very, very wide reach.
And in Global Majority countries there is quite a sizeable chunk of the population that has access to those platforms. So, I mean, you put your resources where the harm is greatest and where you can score big wins. Yeah, that's part of our thinking.

Laura May (30:04):
Absolutely.

Anne Ikiara (30:05):
Yeah.

Laura May (30:06):
Okay.
There's actually something else I wanna ask you about, because I've been so curious about your identity as an author and a poet since I saw that on your LinkedIn. I love that that's your LinkedIn tagline. I love that it's there; you know, you have this creative component to your personality, you've got your soul on display. So, so tell me, what kind of things do you write?

(30:27):
What is your poetry about?

Anne Ikiara (30:29):
My poetry is about justice and equity.
That's what I write about.
You're not surprised, no?

Laura May (30:38):
I'm not surprised at all.

Anne Ikiara (30:40):
I write about equity and human rights and democracy.
I'm a child, as you may assume, of parents that experienced colonialism. So there's a bit of that in my poetry. My book, actually: I have a manuscript that is currently going

(31:02):
through editing, that is about my experiences in the nonprofit sector, and the inequalities that exist in that space for people of color, and the colonial aid system structures that follow the same trajectory as colonialism did. I also write about women's rights and gender issues.

(31:27):
That's my passion.

Laura May (31:29):
Incredible.
Absolutely incredible.
I mean, to me it sounds like we would be better off having your writing circulating on social media than disinformation, for sure.
Like.

Anne Ikiara (31:39):
Yes, yes. Yes. I, I hope it'll circulate, uh, soon. It usually happens that disinformation spreads faster than positive messages.
I think that's the way human beings are.

Laura May (31:54):
It's true.
I mean, we've got that negativity bias, right? And threats are more immediate and more important and pressing than
things that make us feel good.
Absolutely.
Yes.
So tell me, if you had a magic wand and you could use it to change one
thing about the digital landscape.

(32:14):
What would you do with your magic wand?
If you could do any one thing?

Anne Ikiara (32:18):
I would make all platforms safe for the 2 billion people that
are going to have elections in 2024.
I would just wave my magic wand and 2 billion people would be safe. There'll be no disinformation, no misinformation, no hate speech,
so democracy would thrive.
People would have their agency, because disinformation robs people of their agency,

(32:43):
because these spaces target messages at you that skew your thinking, and that's of course taking your agency away from you. So citizens in those countries would have their agency, would have the best
leadership, would have the best democracy.
There would not be any hate speech.
There would be serenity in all of the world and we would

(33:05):
interact in those platforms to, you know, have messages of hope and peace and progression, not how to hate on each other and how to make life difficult for each other, but rather to progress. And we'd discuss things that are great, that take us forward,

(33:25):
rather than things that divide us.

Laura May (33:29):
I love that answer.
You know, sometimes, sometimes when I ask people a question like this, they'll be like, hmm, I would change this program to be something different. And you're like, no, no. With my magic wand, I'm gonna cause world peace in the next year.
I love that.
I love that.
And why not, right?
I mean, that's what you're doing with Digital Action. That's, that's the whole end goal, so good on you.

(33:51):
And on the more personal level, do you have any recommendations for us as individuals and as listeners?
Like what should we do if wethink something is disinformation?

Anne Ikiara (34:03):
I would ask citizens, private citizens, not to spread misinformation,
disinformation, or hate speech. Verify information before you pass it on, because the platforms are powerless without us participating in spreading disinformation and hate speech.
So don't spread hate speech.

(34:24):
Verify the information before you spread it, and instead of spreading disinformation, spread the right information that is bringing peace and democracy, and promoting human rights to the world, both in the Global Minority and the Global Majority. All people are the same.

(34:45):
We are all human and I think that's the way we should see ourselves.

Laura May (34:52):
Amazing.
So look, Anne, thank you so much for joining me today. For those who are interested in learning more about your work, whether as a poet or on behalf of Digital Action, where can they find you?

Anne Ikiara (35:04):
Okay, our work is on www.digitalaction.co.
That's where you can findour campaign materials.
You can find our asks, and youcan find our coalition partners.
Thank you so much for having me.
I really appreciate the opportunity to talk about Digital Action and the Global

(35:26):
Coalition for Tech Justice, and how people can partner with us to protect people and elections, and not big tech companies.

Laura May (35:35):
Absolutely.
Thank you so much again, Anne, and for everyone else, until next time, this is Laura May with the Conflict Tipping Podcast from Mediate.com.