Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome, everybody, to It Could Happen Here, a podcast about, I don't know, how things are kind of crumbling and how we can maybe put stuff back together. And today I am excited to talk with a senior, let's see, what is the actual term? I saw it: a Senior Program Strategist at Wikimedia, Alex Stinson. Hello. Greetings. Hi,
(00:33):
it's so good to be here. I'm very excited about our talk today because, I mean, this should surprise nobody: I used to be a Wikipedia editor back in the day. Not shocking at all if you know me. But yeah, we're gonna be talking about what Wikipedia is itself,
(00:53):
and then also climate misinformation and disinformation, and how we can maybe create a better understanding of climate change and its effects across the world, and how digital information works. Those are all topics we talk about often enough, but never within the actual context of Wikipedia as an entity. So I guess
(01:15):
let's just start there, with Wikipedia. For those who maybe use the website but aren't quite sure what it is: how do you actually describe what Wikipedia is? Because it is an interesting, kind of amorphous entity. It's so many things. I think most people are used to thinking about Wikipedia as, like, the fact-checking device. Like,
(01:35):
I have a bar argument with my friends and I pull out my phone and go, yeah, yeah, look at this website. Right. It's a lot of things. It's three hundred language Wikipedias, actually; it's not just one. Each of these communities has its own editorial community. At last I checked, it's like sixty million articles across the languages.
(01:57):
It's really a lot of different content. And a topic can be on each of those Wikipedias, right. And this is important as we start talking about disinformation: because each Wikipedia is edited by people in that language, and it's written by that language community, each article is different and has different perspectives.
(02:19):
Two hundred eighty thousand volunteers are editing every month. So this is a lot of people, right. But the bulk of that's happening on English Wikipedia and some of the larger languages that are spoken across multiple cultural contexts. And then there's also a lot of other content sitting behind Wikipedia. So there's a media repository we
(02:42):
call Wikimedia Commons, and there's a database called Wikidata, which kind of powers those little knowledge graphs on the right side of Google and a whole bunch of other parts of the Internet. Wikidata shows up in Amazon Alexa and all kinds of other places, right. And so we're not just, like, one website. It's
(03:03):
many websites, lots of knowledge, lots of platforms, lots of contexts. And we'll come back to that. Yeah, one really interesting part of it is, like, I don't know, with my personal kind of social leanings, I generally like things that are more decentralized. Other
(03:25):
hosts on the podcast are generally kind of on the progressive left-libertarian spectrum. And one thing I do really appreciate about Wikipedia is, I don't think it's exactly open source, but the way it has decentralized editing and all that kind of stuff. It's just a really interesting model
(03:47):
of, like, what if a lot more stuff worked this way? And I'm not sure how much of a decentralization focus there is, consciously, among people at the foundation and the people who actually run it behind the scenes. Yeah. So Wikipedia grows out of the open source movement and the kind of early days of the Internet, right? This idea that knowledge wants to be free, technology
(04:09):
wants to be free, software wants to be free. Let's use the legal infrastructure to create freedom, right, in that sense. And then there's also the free as in anyone can edit, and the free to do whatever you want out there in the world. People say free as in beer and free as in speech, right. And
(04:30):
those things are always in tension, and they're kind of working together. And as you can imagine, especially when you get outside of multicultural Internet spaces like English Wikipedia, it can get challenging. Like, if you're in Croatia and everyone is speaking Croatian, there's a very small bubble in which to create that Wikipedia, right.
(04:53):
And so it's interesting in that sense. I think there's also another part of Wikipedia that a lot of people don't see, which is the movement behind it. So there are the editorial communities, where people show up and make edits. But because there's this ideology that you're talking about, this decentralized, we-need-to-share-knowledge-and-culture-and-language-on-the-Internet idea, there's also a whole social
(05:14):
movement sitting behind the scenes. And there's a podcast that came out recently, The Wikipedia Story, that kind of captured the essence of that. And it's a lot of people like myself. So I started editing in high school. Yeah, yeah, one of those, like, oh,
(05:35):
I know how to click the edit button and I figured out how to use the Internet, that kind of thing. But for a lot of people, the intuitiveness of clicking an edit button on a piece of open source software to create content is just not there. It's not clear, right. And so you have to organize and invite people in, and so we have a whole movement that does that too. There's about a hundred
(05:57):
and fifty organizations around the world that organize events, work with libraries and museums and educational institutions. And so there's always this kind of interesting dynamic where our values, which is this open software platform stuff, have also lived in our practice, in our outreach: creating change
(06:18):
through society by sharing knowledge and education. And so I think, yeah, it's an interesting dynamic. Yeah, I think that does create a really, oftentimes, beautiful reflection. It can have some dark sides every once in a while, but it is really nice to have the ideology driving it be reflected in the actions of operating it
(06:39):
and spreading it and that kind of thing. So, this is something we briefly touched on already, but I'd like to move on to how climate change and broader social issues are covered on Wikipedia, because you already mentioned that there's not one Wikipedia, there are many, based
(07:00):
on different languages and places. It feels to me like, whenever social issues get covered on Wikipedia, it's going to be in some part a local reflection of whatever is in that area. You know, if there's a white liberal writing articles in New York, it's gonna be different from someone, you know, halfway across the world writing them in, you know, a much smaller country,
(07:22):
let's say, like, Belarus, which is under what I would call a dictatorship. So that's gonna change the nature of what people are making because of that kind of divide. So how does that crop up, and are there any solutions to it? Because, with the decentralized thing, it's like, how much can we impose? Like,
(07:45):
I'm not in Belarus; how much can I impose what I want their Wikipedia to look like? Yeah, there are kind of two or three dynamics you're touching on here. The first is that there's kind of an attention bias. Like, something comes up in the news, and our Wikipedia community, like, people, within minutes of breaking news stories,
(08:06):
are usually, like, editing the page, working to improve it. Right. So if things show up in the, you know, European and American press, it's very likely, especially on something like English Wikipedia, that editors will pick up on it and immediately cover it. And because there are multiple perspectives in that press, usually the kind of ideological multi-sided-
(08:30):
ness, like, works itself out, because there's a lot of eyes and a lot of people who know how to edit there, right. But in a cultural, linguistic, geographic context where there's one set of stories and there's not a lot of diversity, this happens. And I'm going to refer to the Croatian Wikipedia,
(08:52):
because we actually had an external researcher look at Croatian Wikipedia, because part of it had been kind of captured by folks with very ideological leanings, in a way that's excluding others, and this is not good, right. It creates a very one-sided information environment, and it really reflects kind of the news dynamic going on there.
(09:14):
So when breaking news happens, or when a topic like a social issue, and, well, climate change is not just a social issue, right, it's a global, life-threatening issue. When something becomes politicized, it's very easy, especially on smaller language Wikipedias, for a few people to swing the whole perspective on it.
(09:38):
So yeah, there's this breaking news issue, and this is where our kind of organized communities are really important. So the example we want to point to of this working well is in medicine. So our medical community, during the Ebola outbreaks a few years back in West Africa, was able to organize, both in English
(10:00):
and in languages that were accessible for local communities, high quality coverage of the medical content, because it has an impact on people's lives. And so they recruited translators. They thought about, like, what's a simple way to communicate the story in that context, and what do the workers, or the advocates, or whoever on the ground is
(10:22):
working with that crisis, what knowledge do they need? Right? And you see other open technology movements do stuff like this. Humanitarian OpenStreetMap has a similar way of organizing. They're like, hey, there's a crisis happening, let's pull people together from different parts of the world who have the right knowledge or skills and address
(10:45):
the knowledge gap, so you can solve it. It's just, it's complicated. And, you know, we've been trying to address as a movement what we call the gender gap. So there are both fewer women editors and less women's content on many of the wikis, and it's taken years and it's very hard to organize, and even when
(11:08):
there's investment in it, it's challenging to make substantial progress, because there might be contextual issues around it too. And so you can't just drop in on a Central Asian language with a Western perspective and expect to change the culture of the wiki overnight.
(11:28):
You have to engage with it consistently and be persistent and work on it over and over and over again. We are going to take a short break to hear a message from our lovely, lovely advertisers, unless it's ExxonMobil again, but we will be back shortly. Okay, and we're back.
(11:55):
One thing that we cover a decent amount, as part of my job and Robert Evans's job, is disinformation and misinformation and how that type of stuff spreads online, particularly, usually, linked to political extremism or conspiracy theories, or, you know, in that general
(12:16):
kind of bubble. And so what type of climate misinformation has really been festering on the various, you know, Wikipedias across the world? Because we've just been talking about these topics and how and why this happens, but what are the main types of misinformation or disinformation that are most prevalent? Yeah. So the first
(12:40):
is just kind of neglect of content across the various things related to climate. We've identified on English Wikipedia over three thousand seven hundred articles that are directly related to climate change. We don't have a very big editorial community in English on that topic
(13:01):
that's, like, fluent in the science and fluent in the other stuff. And then you go out to the other languages, and some of the languages have like three thousand of those articles, some of them have like two hundred, right. And some of that content was translated several years ago, right, or five
(13:22):
or ten years ago, and, yeah, you know, the climate rhetoric has really changed, and, like, numbers and statistics, all that stuff gets updated every year. So there's a lot to keep up with. Like, reading the IPCC report or looking at any of the consensus science, there's a lot of change. You have to be fluent in, like, science communication.
(13:44):
You have to understand where to look for the information. And it's interesting: my partner is a Spanish language speaker, and she was in a kind of workshop for journalists in Argentina on climate communication, and the workshop was like, oh, you should cite the Guardian, right, even just to kind of understand this climate stuff.
(14:07):
So in a lot of these local language contexts, there aren't even good sources, and the sources they do have are often citing, like, the dominant narrative that's going on in the anglophone news cycle, right, because there's not a lot of climate communication going on. And so there's just a lot of complexity involved in updating that much content
(14:29):
all the time. And so the bulk of the stuff that kind of creeps in is this neglect, right. It's like some dominant idea in the narrative just hasn't been updated, and we need someone to update it. And that's an organizing problem, right. Like, we need more people who are science literate, who speak the local language, who understand how to edit Wikipedia.
(14:51):
And that's trainable. Like, we can do that. Yeah. The reason the neglect matters is that it stops people from making decisions about climate change, because they don't have an accurate sense of what we need to do, right, which is cut the fossil fuels and increase resilience through adaptation, like, actual political change, right. And so
(15:15):
that's just, it's a problem. The other stuff is a little bit more complicated. One of the things that happens is that, as you know, there's been quite a manipulation of the narrative around climate change. There's this really great podcast by Amy Westervelt
(15:37):
about how the fossil fuel industry got its message into schools over the last three years in the US, and, like, that narrative is just so prevalent. And so one of the things on the Wikipedias is that we try to do a balance of positions. If there are reputable sources
(15:57):
kind of describing or analyzing a topic, and this is back to your polarization question too: if there are reputable sources describing a topic, we try to give them equal weight and balance across the article. The problem with climate is that some of the narratives that look like reputable sources
(16:17):
are just pumped out of fossil fuel industry funded think tanks, right, and these are not truthful narratives, right. And so the BBC ran an article two weeks ago on kind of climate denial on some of these smaller language Wikipedias, and what they found was a
(16:40):
lot of these narratives being given equal weight with the climate science. And our community, after that BBC article came out, took a look across all the language Wikipedias' articles, just at the main climate change page, and they found thirty-one Wikipedias that had some of
(17:00):
that, like, equal weighting of bad climate science. Interesting, yeah. And you know, the BBC article only found like five or ten, right; we found a lot more. Yeah, yeah, yeah. And so it's really like these narratives just seep in, and you know, again,
(17:20):
I'm gonna go back to the Croatian example. If your media environment has been locked down by a certain political rhetoric, those narratives might have traveled from the Anglosphere into these other spaces and then gotten stuck, right, and it just keeps getting recycled, and so that
(17:44):
causes delay. And I was listening to your podcast recently about soft climate denial. This is what's happening in other language environments, right: people are rehearsing this misinformation. It seems like a valid position because it's been rehearsed
(18:04):
so many times by folks. Some people who are championing that position are doing so unknowingly, and in the process, we're kind of disconnecting it entirely from the source of the information, and that is just really bad. One interesting thing that I thought of when
(18:25):
you were bringing up sourcing, is how sourcing itself becomes an issue. Like, in the States, there's kind of a joke about when people use just Wikipedia as a source, like they just link the article. But that is the default for so many people when they begin a research project: okay, what does Wikipedia have on it? What are the
(18:46):
sources Wikipedia uses, and kind of branch off from there. It's a very common thing. So I'm not sure how different internet culture will be in other countries. But if they don't have, like, the base sourcing necessary to create a decent article in their home language, then just sourcing from
(19:07):
Wikipedia in the first place becomes so much harder. Because, like you were saying, 'just use the Guardian' is one of the things, and that's not horrible advice, but if it's only from one source, then that's going to change the entire nature of coverage and information on specific topics. Yeah. Yeah, it's just been a really interesting thing that I
(19:27):
never thought of before, is how different countries' Wikipedias, or language Wikipedias, will have different sources. So then getting information from the page is just going to be so different. And, like, yeah, the whole tier of sourcing is just completely changed. Yeah. And I think, like, you know,
(19:49):
in medicine, most medical practitioners expect most of the medical literature to be in a handful of languages, like English and Chinese and that kind of stuff, right, and part of your professional work, and part of saving people's lives, is being able to use those sources. And so if a medical Wikipedia article has a translation from, like, an English article into another language, and you're
(20:13):
distributing that to medical practitioners, and they find the citation and it's in English, they can go follow the source. That's not such a big deal. But in a topic like climate, where the vast majority of the people that have to make decisions on this information do not have access to other languages, maybe their access to English is through machine translation, Google or something
(20:37):
like that. Not having sources in your local language, or just having the sources that were translated from an English Wikipedia article, which happens a lot on these smaller language Wikipedias, is kind of not helpful for climate decision-making. And this is where it's
(20:58):
easy, for example, in a lot of these Eastern European languages or Central Asian languages, for a politically spun news site's opinion about something to kind of creep in at the same level of validity as scientific research, as
(21:21):
the, you know, consensus understanding of the climate crisis. So, how might, I know we talked about training for journalists and people to start editing Wikipedias in their language, but how do we kind of improve climate communication overall, with open access to information and, you know, creating more linguistic diversity and stuff? Yeah. Well,
(21:45):
I think there are a couple of opportunities in this, and then there's some other misinformation I also want to talk about too. But I think the sourcing one is a particularly challenging one. We need, like, more basic, science-based climate communication in
(22:06):
more languages. And I'm not saying just, like, more of the big UN languages, the ones that are kind of colonial, cross-cultural languages like Spanish or French or Arabic, you know, all these languages that have been used across cultures. We also need it in local languages. And we need it to
(22:26):
be evidence-based, and we need it to be audience-based, right. So if someone is searching online in Swahili about how drought is happening in Kenya, right, or Tanzania, or, you know, there's suddenly flooding, or, like, I need to deal with X, Y, and Z adaptation to the climate crisis, which is, by the way,
(22:50):
what all of the Global South is doing right now, right. Like, the Global South is having to adapt to this crisis that polluting countries have made, and we're not actually giving them the resources for the problem that we've caused. Well, it's not even just giving the research. The people who are, like, we want to adapt our society, we're not resourcing the
(23:13):
folks on the ground who have the agency, who have the understanding, who know how to do the research in their context, who know how to do the communication in their context, right. We're not even bolstering their requests for help, right. Like, the UN Climate Conference kind of failed on this adaptation funding, right.
(23:33):
And this is, you know, where a platform like Wikipedia comes in, and kind of approaching this from a knowledge activist perspective, where you're like, there are people who need this knowledge to understand what's happening around them so they can make decisions. You know, yeah, we need those kinds of information.
(23:57):
We need open source knowledge, not just Wikipedia but all of the platforms. And, you know, you all do open source investigation, and you're used to open source software communities. I listened to a couple of your podcasts, and you're kind of constantly speaking back to those open communities that come out of, like, anglophone
(24:19):
software spaces. And we need to acknowledge that, like, we figured out how to do knowledge, but we haven't given out all those tools. We haven't transferred the knowledge of how to do it. We haven't adapted those tools to other parts of the world and other languages. And so just starting to look for these other communities,
(24:41):
asking the people, like, who's ready to organize, and giving them money to go do it, right? These things are really practical. And I think we're often not listening, or we're not looking for that solution. And remember, most of the people having to adapt are in the Global South and
(25:03):
speak other languages. We need to be there in that language if we want the climate crisis to, like, resolve without, you know, destroying people's lives. Yeah, absolutely. Yeah, that's the thing we try to bring up: the people who are going to be initially worst affected are
(25:25):
the people who are already kind of not in the greatest situation in the first place. Like, the areas that are gonna experience the most flooding, the most extreme weather events, all this kind of stuff. It's not starting with somewhere like New York City. It's starting with areas that are already dealing with a lot of local issues, and now this is just something else
(25:48):
on top. And yeah, fixing all of that is, I mean, fixing all of it is impossible. We can only take small adaptive steps to mitigate some of the worst effects. And yeah, I mean, that's stuff that comes up a bunch. But you mentioned you wanted to at least briefly mention some other forms
(26:08):
of disinformation. Yeah. So we've also witnessed a couple of times where something will hit, like, breaking news, or become a political position in a context, and then we will see bad actors show up on Wikipedia and try to manipulate it. I have two examples of those. The first is from about a year ago.
(26:31):
We found a group of accounts editing about some of the inter-Amazonian highways that the Bolsonaro presidency is building through the Amazon, where they were trying to remove the environmental and Indigenous peoples' impact assessments from
(26:52):
the Wikipedia articles. And so basic human rights stuff, basic, you know, healthy-environment things that the government is expected to follow through on, were being manipulated out of the articles for a more pro-economic-
(27:13):
growth narrative. And so, you know, the shift towards this very extreme-right, economic-growth-only version of reality does play out on the wiki. Now, we were lucky that this was fairly transparent, like, fairly easy to see once we
(27:34):
found it. But we had to coordinate across English, Spanish, and Portuguese to address the problem. So we need multilingual communities who are coordinating and talking to each other to address that. The other thing we've seen is, so, I don't know how well you follow the climate movement,
(27:56):
but did you see when Disha Ravi got arrested in India, by chance? I don't think so. So she's a youth climate activist who was part of Fridays for Future India, which is a kind of sister group of the group that formed in Europe around Greta Thunberg, right. And her Gmail account got
(28:23):
attached to a Google Doc, she was just seen active on a Google Doc, that was about sharing social media about the farmers' protests in India, which have been like a real political sticking-point issue. And I had written, so, I'm both a volunteer and a professional who organizes the community, and in my volunteer time I had
(28:46):
written the biography of Disha Ravi, like, months before the Indian government kind of identified her with this social media toolkit. And when she got arrested for something that's just a basic social organizing tactic on social media, the kind of Hindu nationalist social media environment zoomed
(29:10):
in on her Wikipedia article and on all these other social media presences she had, and they tried to silence it, like, okay, we need to delete this article. And fortunately, a group of us were watching the page, and we caught it and were able to stop that. But there's the kind of
(29:31):
flash mob situation that happens a lot now in social media, where it's like, this thing has been polarized, now we need to go attack it. And so, you can imagine, English Wikipedia has a healthy immune system for this kind of stuff. It sees it. It has enough people that it can do that. Yeah. Yeah. But you can imagine, on a
(29:54):
smaller wiki, the narrative could shift and stay permanently shifted quite quickly, yeah, if that happened. And so that's another concern, right. So there's the subtle version, a few accounts just quietly removing things, and then the more active political kind of intervention that happens. And in terms of disinformation, do you
(30:15):
see Wikipedia as being susceptible to intentional disinformation campaigns, of people slowly editing the ideology of articles to push some agenda, whether that be individually, or more of, like, a crowd operation, or even run by people with political power? Like, how
(30:36):
much of a risk do you see of that, with this kind of open source idea, of intentional slow dissemination of disinformation on important articles and stuff? Well, so I think I might reframe your question a little bit. Like, all open source knowledge spaces are susceptible to that, right.
(30:59):
The question is to what degree, and how harmful is it going to be, right? Like, is it very open to this, and will it cause a lot of problems? The bigger language Wikipedias have healthy immune systems; we have a combination of kind of bots that are, like, AI-driven,
(31:20):
that flag bad edits, and then we have a lot of community patrolling happening. And even in some of the smaller communities that have medium-sized editor communities, like Swedish Wikipedia, it doesn't take a lot for that local language community to patrol the pages and be like, oh, okay, these changes are kind of weird, I
(31:41):
can roll it back; like, this doesn't seem like it fits our culture of Wikipedia. The problem is when a language Wikipedia has very few editors and they're not active all the time. And so this is where we need more eyes on the content, right, because it's very easy for a really
(32:03):
small language community to kind of have a little bit of content but never see it maintained. And this is where, as our communities are forming around these languages, a lot of the West African languages for example, our communities are kind of organizing, and we invest in those communities
(32:23):
existing, and in figuring out the governance, and training people how to edit, and getting access to the kind of technical skills to do this. And, you know, we have systems that we're hoping, over the next few years, invest in that resilience, right, like building a code of conduct, making it easier for communities to see this kind of stuff. But it is three hundred languages, right. Yeah.
(32:49):
And it is a volunteer-built system, and you do need a healthy editorial community in order to keep a wiki from, like, drifting too much. So, a good example, and I'm going to reference Croatian again because it's the one we've done research on: it was possible for a few people to push people who are
(33:13):
more in consensus with the global position on various topics out of the wiki. And that's just, we have to find a balance between, like, local language... And this is my personal opinion, right: we need to find a balance between kind of local language sovereignty on
(33:34):
this stuff, and also not, like, radicalizing its topical environment. And we see this particularly on impactful topics, right, like ones that directly affect politics or, in the case of the climate crisis, people's livelihoods and ability to function in society, right. And we just need
(33:57):
to be cautious about that. But, you know, Wikipedia is a common resource, and I think this is really important. The way Wikipedia works is, you know, the Wikimedia Foundation provides the servers, we fund our communities, we support them, we help them work through governance issues. But we need editorial communities to maintain it. That's what
(34:17):
those two hundred and eighty thousand people are doing as volunteers: they're building an editorial practice that makes the content work. And we need that. And so we need, you know, like-minded communities, like the people following your podcast who are like, oh, we need the Internet to be reliable and have accurate information,
(34:39):
to show up. Because if we don't do that, it's, it's really, it's the common resource. We have a decent international listening base as well.
(35:00):
I'm thinking: what would you recommend to people, you know, in different countries, or even people inside, you know, the States, Canada, the UK, who are multilingual? Would you encourage them to at least browse other language Wikipedias and maybe start making edits when they see this type of misinformation popping up? Yeah,
(35:22):
so I have two kind of perspectives on this one. First, look for a local organized community. So we have what are called Wikimedia affiliates. These are about a hundred and fifty organizations around the world. They regularly run events, especially now that we're leaving COVID, increasingly more in-person events. They train folks. Look
(35:43):
for them in your context, and if you need help finding them, you know, find me on Twitter and I can connect you with those communities. And the other part is small edits. So I think a lot of people look at Wikipedia and they think about it like a traditional publishing firm, right? Like, oh, you know, I have to write the whole article.
(36:03):
Yeah, yeah, yeah, I have to be a master. And the secret sauce to all of this is that most people start with one citation, one comma, one typo fix, and they do a handful of those a month, and then they keep coming back, and as you do those small edits, you start reading the content more carefully and fixing the things you can fix. And so I recommend going in to just add one citation.
(36:27):
Like, if you go and add one citation today, that makes life better. Or you fix the communication of a sentence. The other part of it is, you know, I said there are these organized groups. For the climate in particular, I run this campaign called Wiki for Human Rights, which is focused on, well, it's a theme that
(36:48):
we kind of identified with UN Human Rights, on the right to a healthy environment, which is this new human right that has been acknowledged by the Human Rights Council. And we're organizing kind of writing contests and edit-a-thons and trainings for communities to go and look for the human dimension of the climate crisis. So I think when we think about climate communication, a lot of
(37:10):
people go right to the science, like, oh, this is, you know, about how weather systems work and how the atmosphere functions and that stuff. And the content that's more impactful is this, like, human-inflected stuff: how does the climate crisis impact you as an individual, and agriculture in the cities you live in, and the
(37:34):
clothing you buy, and the manufactured goods, or the mine around the corner that's producing water pollution that's gonna harm your children for the next thirty years, right. And that is the kind of stuff that we're encouraging communities to pay attention to. It is more the justice and human rights oriented perspective on these topics. And
(37:59):
your cat is very cute. Yeah, every once in a while they love to take over the camera. And so, yeah, if you follow me on Twitter, I can hook you up with that campaign as well. Yeah. Yeah. Where can people find you online, to learn more information about, you know, the
(38:20):
various topics we've discussed today? So, if you're interested in climate change stuff on Wikipedia: English Wikipedia has a wonderful WikiProject Climate Change. If you search WikiProject Climate Change on Google, you'll find there's a tab at the top that says get started with easy
(38:41):
edits, and that can get you oriented to where you can affect English Wikipedia on this. And, you know, once you find a gap on English, it's easy to find it on other languages. For learning about Wiki for Human Rights, you can search for that, and/or follow me on Twitter: at S A D A D S, Sadads, on Twitter.
(39:04):
We also have a group called Wikimedians for Sustainable Development, who are kind of communicating on Twitter, which is the group that's really focused on sustainability topics more generally. And, you know, the other way to look is: find something you've been reading about in the news, about the climate crisis or sustainability issues. Look it up on Wikipedia. See
(39:26):
if it's missing. If it's not, click the edit button and add a sentence, right. A good example of this: I learned about a park in the center of Nairobi where environmental activists are protesting because some of the big trees were being cut down, Uhuru Park, right.
(39:48):
This came across my Twitter feed; like, I'm not connected to this at the moment, right. But because I had news sources, I had three or four news sources, I could say really simply: in twenty twenty-one, the park came under scrutiny for a renovation that included removing old trees. That's a climate action, right. And I
(40:08):
think, you know, I am constantly overwhelmed by the climate crisis, as are a lot of people. Yeah, yeah. And just being able to tell that little story, like, hey, the decisions people are making are not productive here, right, just gathering that story is important. And what's important
(40:32):
is that Wikipedia plays institutional memory on this, right. I feel like, you know, a lot of activist work is very temporal. It's very in-the-moment, right. And if it doesn't get documented on Wikipedia, the local news sources are gonna get lost in the winds of time. And so I think, you know, if you do
(40:52):
your little activist motion, like, a sentence describing what happened in a moment where resistance was happening, is a huge step forward, right, because it connects the environmental crisis, the climate crisis, human rights issues, to daily lives. Like, people look up this park, probably on Google, because they
(41:13):
want to go there, right, or they read about it because they're like, when was it created? What was that protest that happened there the other day? And if that source isn't there, then it doesn't really exist in their minds. Yeah, it doesn't exist in their minds. And I think that's one of the big issues with the climate crisis, and, you know, it's amplified even worse in
(41:34):
other languages, right: people aren't making that connection. They aren't seeing it around them, and they're not, you know, connecting action to how we address it. That is a really good point. And yeah, I mean, I will encourage everybody to start making small edits. That's what I did for
(41:56):
a long time before I moved into, like, open source journalism and reporting. It's a great way to get started, and it's a great way to just start disseminating small bits of information, because the only thing that we can really do as people is take small steps. We can have, like, an adaptive goal in mind, but you need to take small steps to get there. And that
(42:18):
is a really great way to start influencing the way people think about climate and our situation. Yeah, and I think, too, you know, your podcast kind of appeals to folks who are interested in finding the truth and reality, right, and that investigation is what a Wikipedia article is. It
(42:40):
is, like, one, ten, a hundred editors out there in the world trying to go, what the heck is this topic about? Right? How do I compile my notes in a way that helps other people? And I think, in the face of the climate crisis, Dr. Ayana Johnson says, like, find the thing you're good at, find the thing you're passionate about, and find the thing
(43:02):
that makes you feel good, that is rewarding. And find the thing that actually helps affect the climate crisis, right. And a small edit on Wikipedia meets your kind of knowledge needs. It's very satisfying, because people will read it, and it is incremental change in the right direction, right. People will make decisions on it. Yeah,
(43:26):
I mean, and I guess, I think that probably closes this up for today. Do you have anything else to add? I guess one more plug for your Twitter, so we can get more eyeballs on you and the work that you're doing. Yeah. So, at S A D A D S. It's my long-term handle on the internet, and you
(43:49):
can find me all over the place, and I tweet about Wikipedia and the climate crisis. And we'll link the Wikipedia WikiProject Climate Change page in the description for people to find. Thank you so much for taking time to talk to us all about these topics. I'm really, really grateful to have this
(44:10):
type of knowledge readily accessible to more people, also, you know, in the spirit of Wikipedia. Thank you so much. You can follow us by subscribing to the feed and on Twitter and Instagram at HappenHerePod and Cool Zone Media. See you on the other side, everybody. It
(44:33):
Could Happen Here is a production of Cool Zone Media. For more podcasts from Cool Zone Media, visit our website coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here, updated monthly, at coolzonemedia.com/sources. Thanks for listening.