Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
I do think that we have had some particularly untrustworthy politicians. Donald Trump comes to mind. His relationship with the truth is tenuous at best. One of the key moments in recent history was when we all saw the poor queen sitting on her own at the funeral, and then we find out later they were having a party in No. 10 the night before.
And I think people were just like fed up.
(00:21):
If you impose a rule on Wikipedia that required us to hire thousands of moderators, it kind of breaks the rule of trust. As long as people are interested in having some fun writing an encyclopedia together, I think we'll be here.
Hello and welcome to Ways to Change the World. I'm Krishnan Guru-Murthy, and
this is the podcast in which we talk to extraordinary people
(00:44):
about the big ideas in their lives and the events that have
helped shape them. Today's guest is the man who
created the online resource where the podcast team here
normally go first when doing research on guests coming on to
the show. Wikipedia, the free digital
encyclopedia that just about everyone in the Western world
goes to if they need to quickly know anything.
(01:04):
It's been going for almost 25 years, determinedly non-profit and user-run, but increasingly under attack now by right-wing influencers, Republican politicians in the US, and Elon Musk for being biased with an overly progressive ideology.
And now AI models like ChatGPT are parking their tanks firmly
(01:25):
on Wikipedia's lawn, claiming to be the real fount of all knowledge. Wikipedia calls them conversational chatbots. Jimmy, welcome to the show.
Thank you. How do you want to change the world, having already changed it? Well, I mean, I think that what I've done has changed the world, interestingly enough.
I mean, when I think about how to change the world, I tend to
(01:48):
think very small about taking small steps and experimentation.
You know, in the book we talk about if we want to fix
things, we're going to need to do a lot of experimentation.
And so it's not one thing. It's like, let's try lots of
things. Yeah, I mean, the book is The Seven Rules of Trust: why it is today's most essential superpower.
And there's a series of steps that you lay out in the
(02:12):
book. But I mean, how do you think
that is applied to Wikipedia's experience?
If you think back to its founding, you would, you know,
people used to joke about you, make fun of you.
People would go to incorrect Wikipedia entries and say, well,
look, this is nonsense. This is what happens when you
get anybody and his dog writing Wikipedia. But you've gone from that to, as we say, the place most people
(02:35):
end up looking pretty much every day when they want to know
anything. And that requires a huge leap of
trust. How did that happen?
Yeah, definitely. Well, I mean, that is where a lot of the lessons, you know, for the book came from: the experience of thinking about trust in the context of Wikipedia. I mean, first of all, Wikipedia
(02:56):
has always been very trusting. There was this sort of open
offer to people like come and take part and come and join us
and let's write an encyclopedia together.
And, you know, I had a belief, which has proven to be true, that most people are basically very decent and want to do something productive and useful with their time.
And people really rallied to that call.
(03:17):
And, you know, slowly but surely over the years, we had to think about things like, OK, how do we improve what we're doing? How do we, you know, so it's things like we need to have reliable sources. We need to be really neutral because that's a core component of trust. And so all of these lessons over the years added up.
(03:38):
And then when I look around the world, I see that we're in this crisis of trust. You know, if you look at the Edelman Trust Barometer surveys, you see declining trust in politicians, declining trust in journalism, business, each other. Meanwhile, we've kind of got some good stuff going on with trust in our
(03:59):
community that I think we can share with others.
So the first thing is, trust other people and they will have
a reason to trust you. Yeah.
And I think this is, you know, this is, we all know this pretty
well in our personal lives and in our experience, which is
if you say to somebody, you know, yeah, I'll just trust you
to do that, they'll generally live up to that.
(04:19):
And that's important. What do you think the level of trust is, though, in Wikipedia?
Because, you know, if I think about my own use of Wikipedia, I
use it as a guide. I kind of assume that not everything on there is absolutely right, that it's a curated page in which the emphasis might be right or wrong. I look at my own page and I
think, well, that's not what I would have said about me.
(04:40):
So when I look at other people's pages, I think, well, maybe that's wrong too, you know? Yeah, I mean, definitely. I would say the level of trust probably varies across different topics.
It varies depending on, you know, how people feel about that
topic. You know, if you want to know about some chemical reaction, you probably would
(05:03):
trust Wikipedia to be very, very good, and it probably is.
And if you wanted to know about some controversial issue, well,
how much would you trust it? Well, that partly depends on
you. For better or worse, right?
It may be you don't trust it because sadly you don't like the facts of reality, right?
And so therefore you feel emotionally sort of resistant to
(05:24):
it. Or you may see it and you may
go, actually, this seems a bit biased and actually there seems
to be something going on here. So it varies.
I mean, it's a dialogue, it's a discourse.
I think the important thing for us has always been, you know, we
always need to think about how can we do better?
So if somebody says, oh, Wikipedia, it's been taken over
(05:47):
by a woke mob, that's just not true.
Like that's just simply not true.
If you say Wikipedia has a problem with bias in certain
areas because they follow the mainstream media, which has a
certain bias, etcetera, etcetera.
OK, now that's a conversation we can grapple with, right?
That's very, you know, there's some interesting things to be
had there. But, you know, if some
(06:09):
people say, you know, well, I've actually been getting yelled at a lot on Twitter by UFO believers, right, who are angry that Wikipedia doesn't take them more seriously.
And I'm like, I mean, we're pretty kind about the whole thing. You know, we sort of lay out some evidence and we talk about this and that and the
(06:31):
history and all of this. And yeah, we don't go all in for
your conspiracy theory because it's not justified to do so.
And that's OK. And they also feel surprised. So the Wikipedia community, which is a bunch of geeks really, they tend to be quite smart. They tend to be not particularly
(06:52):
political people. They're really passionate about
sharing knowledge and reliable sources and all of these kinds
of things. And you know, we try to remain
as open as possible. So anybody who sees something in
Wikipedia and they how that doesn't seem right, right, Come
and join us, come and talk to us, come and talk, become one of
(07:13):
us and and like get involved because that's hugely important
in terms of forgetting as human beings at the truth.
You have to have that sort of open curiosity, questioning
spirit to say, OK, let's see, what can we make better here?
But if you go onto any Wikipedia page and you look at the chat that's gone on behind the page, you will see some sort
(07:36):
of discussions and disagreements. Oh yeah, definitely. Going on, yeah. So how and who works out who wins? Or is it just whoever's most recent? I mean, it's a
variety of mechanisms. So it's often, I mean, in one
very narrow sense, it's whoever's most recent.
But in many, many cases, that dialogue produces a consensus,
(07:59):
a compromise, you know. So for example, you know, the article, some people may say, you know, A, and other people say B, and then they're both producing sources. And the article ends up saying, you know, the two leading theories are A and B. And here are the reasons given
(08:20):
for that. So we call it going meta, like
step up a level. We can't take sides.
If it's a legitimate dispute, we can just describe the dispute.
And that actually works really well in most cases.
What kind of behaviour on Wikipedia have you had to change
in order to improve your trust? Well, I mean, I think we always had the same spirit. So we haven't really had to
(08:40):
change our behavior per se. Like, people are basically decent and nice. I would say we had to, you know.
You describe yourself as a pathological optimist. Yes, yes, yes. I am, and I always have to have a few pessimists around to keep me in check. But yeah, so I mean, I can give one example of something that did change. So we have a policy about what
(09:03):
we call BLPs (we have way too many acronyms, like many communities do): biographies of living persons, where if there's something negative in a biography, it has to have a source. And if it doesn't have a source, you should remove it immediately and then go and talk about it on the talk page. You don't leave it in while you
figure out what to do. And that was a change. Like,
(09:24):
before, it would be common, somebody would say, oh, this says this negative thing. Is that right? Let's have a discussion about that. And now we're like, take it out immediately. And, you know, that was a good change. Over the years, we've definitely, I would say, matured in our nuance around a lot of things. So just an example I was looking
at a couple of days ago is the question of, you know, the
(09:49):
names of family members of controversial people who aren't
themselves in the news and the sensitivity around their
privacy. And in some cases you would say,
well, actually, it's quite relevant to the biography.
In other cases, you're like, well, it doesn't really matter
who the parents were because it doesn't seem to have any impact.
So you wouldn't list their parents names.
(10:09):
And, and so those are the kinds of things that we've become, you
know, more nuanced over time as we saw examples.
And we're like, oh, OK, hold on a second, actually, because we
do think for biographies, like neutrality is really important,
even for very controversial people.
You know, sometimes people are like, oh, but how are you
neutral about Hitler? And it's like, it's quite easy
(10:32):
to be neutral about Hitler. And that doesn't mean you describe him in glowing terms.
did. And that's bad enough.
Like, you don't need to add a rant against Hitler on the page.
So, you know, I would say the biggest change, you know, has just been becoming more experienced and more mature over the years. How important is it, do you
think, to the trust of an organization like yours that
(10:54):
it's a not-for-profit? I think it matters.
I think it matters, though, not so much because the reader actually knows or cares. I mean, oftentimes people have no idea. It just, like, magically appears on the Internet. They don't know where it comes from. But it's certainly important for the volunteer community who really pour their hearts into it. I think it's more than for-profit versus non-profit.
(11:17):
I think what's really important is that we are not advertising-driven. We don't have ads on Wikipedia, and if we did, that would lead to certain consequences.
Questions people have, like, oh, is the content sponsored content? You would have questions about that. Well, we would have questions. Like, we are really very intense about user privacy. And so we collect almost no
(11:42):
data. We have very little tracking
data. You know, we don't do all the kinds of things quite normal online in terms of trying to figure out who your readers are so you can target ads at them. We just don't care who they are. Like, great, you know. And the donation model: we are not funded by governments, we're not funded by a handful of major donors. We're funded largely...
(12:05):
I mean, we do have some major donors and that's much
appreciated. But we're largely funded by the
small donors, like, you know, people giving their £20 a month
or whatever it is, £20 a year. That's the backbone of the
support for Wikipedia. And then that gives us the
incentives inside the organization.
We're not thinking about, oh, how do we please our
advertisers? How do we please our wealthy
(12:27):
donors? We say, how do we please our
readers? How do we please the general
public who donate? Because, you know, like, our best fundraising pitch is always, you know, basically: you use Wikipedia all the time, you should probably chip in.
And people go, yeah, you know what?
I love it. I should chip in. And they do. How different would it be, do you think, if you were Wikipedia
(12:47):
billionaire Jimmy Wales and one of the tech bros who suddenly become part of, sort of, the global oligarchy?
Right. I don't know.
That's kind of a weird thing to think about.
You know, it's sort of, I do have a certain role. And actually, I feel like one of the great things about Wikipedia
(13:09):
is, on Internet public policy issues, we often have a voice that gives, you know, regulators and legislators some pause, because it's quite easy to just go, oh, YouTube just wants to make more money, or whatever like that.
But when we say, hold on, you should be careful not to damage Wikipedia, and here are the ways that this regulation could impact us negatively,
(13:31):
They do pause and they say, oh, hold on a second, we don't want
to break Wikipedia. Like that's really important.
We call that the Wikipedia rule: don't break Wikipedia when
you're thinking about Internet regulation.
How do you feel about the power of those tech giants in the
world right now? I'm pretty relaxed about it.
You know, I mean, I think the power is sometimes
(13:51):
overblown, overstated. You know, obviously all big companies have a certain amount of power and so on and so forth. But, you know, I mean, you can think about the power, but
you can also think about them standing on stage at the
inauguration looking slightly like hostages and realizing they felt a need to go there to appease Trump.
(14:16):
That's power. And Trump loves that kind of
power. And it's unfortunate, you know.
But do you see a concentration in the hands of a relatively small group of people of, you know, what most people are looking at, whether it's social media, media, television, all
(14:37):
kind of going into these giants, giving them a huge amount of power.
Yeah. So I mean, I think we can be a
bit nuanced about it and talk about different areas.
So for example, in the USA, a huge number of local TV stations and local news is controlled by a handful of companies, and they
(14:59):
have top-down direction over what gets said and what gets
published. And, you know, it's like
actually they've been increasing that centralized control of the
local news. OK, that's concerning.
Like that doesn't seem like a great thing to me.
Meanwhile, you know, if you think about social media, it's
like, OK, there's centralized control maybe of some aspects of
it, but still it's like it's incredibly diverse.
(15:22):
You know, it's like, YouTube has all kinds of stuff on it, and, I mean, there's videos on there ranting against Google and YouTube and all of that.
And so that's a different kind of question really.
It's not quite, they don't have any control over the path of ideas. And, you know, sort of the antitrust kind of business
(15:42):
issues, those are really probably more about the advertising market and things like that.
But X, Twitter, is the big one that has changed in its character over the years, and it's changed in its character since hundreds of millions of people signed up.
Yeah, yeah, definitely. I mean, it's certainly, you know, Elon Musk has made a pretty clean break with the past
(16:03):
in terms of the way they handle moderation, for example. And it was never great. I mean, I have been upset about Twitter moderation in the olden days.
Like, it was pretty bad. Like you could complain about
somebody attacking you and saying things that are clearly
against the terms of service, and they would kind of not do
(16:24):
anything about it. I mean, I once emailed Jack Dorsey and said, look at this horrible thing, and they say, oh, don't worry, Jimmy, we'll sort that out. Like, sorry about that. And I'm like, it's not really my
point. I know you and I can email you. Like, what about the teenager
who's getting abuse? You know, like your systems
aren't good enough and now they're not just not good
enough, they're completely destroyed.
(16:45):
And so I think that's a problem. And what about the algorithm itself? I mean, you know, I certainly feel, looking at my feed, it's rigged, it's right wing, it's kind of throwing me a certain type of material. How do we as users kind of
detect, identify, know what's going on?
Yeah, I mean, I think it's a real problem.
(17:06):
It's a problem of a lack of transparency. It's a problem of, you know, there's a lot of talk by Elon about anti-censorship and free speech, but there's clearly a bit of a thumb on the scale there that is problematic
in lots of ways. But also I think a lot of users
are leaving. I think one of the reasons
Twitter feels like it's become sort of more right wing is
(17:30):
because a lot of people of different viewpoints have been
like, I'm out of here, like I don't like this place anymore.
I know personally, I still have a Twitter account. I use it some; I use it a lot less than I used to.
And that's not so much about left versus right.
It's just about, it's just unpleasant, like it's no fun.
And you know, you sort of feel like I still need to be
(17:50):
there because, you know, there's an audience there
and all that. So for professional reasons, it,
it still matters, but I don't really feel like engaging.
And also, you know, I've got a few friends on there and people I talk to, and that's fine.
And I actually think one of the biggest things that people can
do is basically say, I don't like this.
I'm not going to use it anymore. It's like, it's not OK to have a
(18:13):
platform that is this sort of horrible. I'd rather go somewhere else. Does it matter if you don't have another platform that is as big? You know?
I mean, you started WT Social. I don't know how big it is. Small. Yeah, it's a pilot project. You know, and lots of other people have tried various things. There's Bluesky, there's, you know, half a dozen other social networks.
(18:37):
They don't have the critical mass and therefore they don't
have the same effects. Yeah, but I and I do think, you
know, that's interesting, but I think we should also remember, I
remember when Myspace looked like it was going to become a
monopoly in social networking and it's completely gone now.
I remember when Facebook was more dominant than it is today.
(18:57):
These things can change. And, you know, with Bluesky or Threads from Facebook, they're not doing that out of a charitable instinct; they see a business opportunity, that Twitter is basically in a very weak position. You know, the advertisers have largely fled; the business model doesn't make sense.
(19:20):
How long can they carry on running a platform that even the
people who are heavy users kind of hate?
It's not a good business. I mean, transparency is one of
your rules, yeah. So how could you apply transparency? If you were in charge of X or Facebook, you know, what would you do? Yeah, I mean, it's a hard problem.
So social media, so one of the things with Wikipedia that makes
(19:43):
things easier is purpose, another one of my rules, which is, we're trying to write an encyclopedia. So we're not a wide-open free speech forum. And I've said that since the beginning.
It's like, yes, we want to have a discourse, a dialogue, but
we're trying to write an encyclopedia.
So it's not the place to come and rant about whatever.
Whereas social media kind of is the place to come and rant about
(20:04):
whatever. And so that means the challenges for moderation are much harder. I mean, I would recommend that they look into more features like the notes feature. So on Twitter now, if somebody posts something, people can fact-check it. And there's a voting process and all that, and it's reasonably transparent and controlled by the community. It's not transparent enough; I'd like to see more
(20:25):
transparency there. And that actually is a good
thing. You know, like, sometimes it
does prevent, like, blatant misinformation from going viral
because, you know, under it, there's a note saying, like,
here's some links to check out. This isn't really true.
And I think they should pursue more things like that because,
you know, the problem with what they were doing before was it
(20:47):
just wasn't scalable. Like, you can't check every
tweet coming in. Like, it's impossible.
Same thing. You know, I have some sympathy
for when Facebook has a challenge around misinformation floating around on the platform. Well, that's just the users talking to each other. And it's not really... I don't think we want a world in which Mark Zuckerberg gets to
(21:07):
decide what's true and what isn't.
At the same time, there's this problem of abuse and a sort of
really bad behavior and toxic stuff.
It's hard, but transparency would help.
And do you think, I mean, you've been asked, I've asked you over the years, you know, should governments get involved and regulate, and you've always said no.
(21:28):
Yeah. Do you still think it's a no, even though social media is so powerful and having such a large effect on our politics?
no. And one of the reasons is it's
really hard to put forward a proposed law or regulation that
would actually solve problems and not just cause problems.
You know, if you said, well, it should be illegal to post false
(21:50):
content, that's really hard, right?
Because somebody's got to decide what's true and what's false.
And in free societies, we don't think that's normally the job of
a government to decide. And it's not even indirectly a
job for the government to penalize a company because they
didn't decide in the right way, etcetera, etcetera.
So, you know, I think you can say, you know, some of the
(22:11):
things like, you know, it's necessary to have some sort of a reporting mechanism and a way to appeal a ban and all those things, which they already do.
the regulation for exactly? I think what I'm much more
interested in is, you know, preserving the ability for
competition. And I think sometimes regulation
(22:33):
can make it harder for new competitors to emerge, like here
in the UK, the Online Safety Act, it has a framework in mind, which is what I call the feudal model, which is, the platform owns the platform and they're like the master, and all the serfs live on the platform and they follow the rules of the master and so on. And it doesn't contemplate a
(22:53):
much more democratic situation where actually the rules are
made by the users. Like this is the Wikipedia model
and they have the tools they need to control the space
they're in and so forth. So if you impose a rule on
Wikipedia that required us to hire thousands of moderators, it
kind of breaks the rule of trust, which is, we trust our community and they trust us and all of that.
(23:16):
And so we have to be really careful that we're not sort of
locking in a clearly not scalable and bad system by
passing regulations that say, well, if you have more than X
million users, you have to do X, Y and Z, even though maybe you
don't have to do these things. Maybe you only have to do these
things if you've got a broken model to start with.
And you couldn't, for example, regulate for neutrality
(23:37):
then? Yeah, I know. I don't see how you could. I mean, you could sort of demand... Because if the algorithm at the moment is biased to the right wing or left wing, right wing on X, couldn't you say, you've got to stop doing that? Let people choose it if they want.
You could, I mean, but I think Elon Musk would say, we're allowing all content that isn't illegal. And so then would a regulation
(23:59):
like that make it possible to say, you know what, on our platform, we actually don't want racism. We want to say, you're not welcome here if you're going to post racist commentary. And someone's going to say, well, like, you're not being neutral. Like, that is a set of ideas, and I want to put forward my ideas and I'm going to write about them. And, OK, well, I think, too bad, go somewhere else.
(24:22):
And I also think that isn't necessarily neutral.
And so regulating for neutrality might make it harder for people
to do that sort of thing. So it's hard.
I mean, I think we can just say, I think we should always apply, when government regulation is at stake, what in US First
(24:42):
Amendment law you would call strict scrutiny. Like, is there any other way to achieve the goal? Is there a compelling interest? You know, those are the kinds of questions, not just, oh, this sort of sucks, let's regulate it, because then you get into really murky areas.
I mean, we've seen in this country like quite an
interesting debate about, you know, Graham Linehan getting
(25:04):
arrested at the airport. And now they've sort of dropped all the charges and apologized. I think they apologized. But anyway, they said they were going to stop doing that sort of thing, which I think is definitely the right answer, because, I mean, whatever, he posted some stupid stuff on Twitter. Fine.
And so it comes to the next sort of lesson for trust, if you like,
(25:29):
after, well, you mentioned sort of having a purpose, which was to say, we are an encyclopedia, when lots of people were saying, well, you're not a real encyclopedia. So how important was it to say, this is what we are, in order to persuade people that that's what you are? Oh yes. Yeah, yeah. Hugely important because, you
(25:50):
know, like the idea, you know, if I say encyclopedia article about the Eiffel Tower, you immediately have an idea of what that is. And we pretty much have the same idea. It should tell the history and where it is and cultural impact. And there should be a picture or two. You know, we know what that's
going to look like. Which made it possible for
people to cooperate in a way. If we said, oh, it's not an
(26:11):
encyclopedia article, it's just an article.
It's just a blog post, it's whatever it might be; then people could be all over the map. You wouldn't be able to settle anything. Like, how do you set rules? What is it?
This is why, you know, that purpose, which was just clearly identified from day one, has enabled the community to sort of
(26:31):
say, OK, right, now we know what we're doing, we know what we're
here for. And, you know, that kind of idea
can be applied in all kinds of contexts.
You know, a lot of companies, you know, it's quite important that they have a mission statement that's not just some fancy words that some consultants came up with; they actually need to understand, like, what are we trying to do
(26:52):
here in this business? Like, what is it? What are the things we do and the things we don't do? What are we specialized in? What are we going to get really good at? Because without that, how do you make decisions? You're just all over the map.
You mentioned the attacks that you're under, you know, over whether there's a woke takeover. Now, you mentioned in the book Elon Musk has had this sort of defund Wikipedia campaign and he wants to set up a rival,
(27:16):
which you think is sort of a bad campaign strategy on his part. But I mean, how do you persuade people, or demonstrate, that you've not been taken over by some...?
I mean, I think the proof is in the pudding. I mean, right now I'm meeting a neutral point of view working group within, you know, within the organization, bringing in, like, we're talking with our research team, who
(27:39):
are coming in with, you know, proposals and ideas. We want to encourage more research into this question. You know, like, what's the best way to research it? You know, because just vague accusations out in the public aren't, you know, like, whatever.
Are they having an effect, do you think, those accusations? I mean, in certain segments.
I mean, I think the truth is most people who use Wikipedia
(28:02):
and read Wikipedia realize it doesn't make sense.
I mean, somebody on Twitter sort of came at me with the classic gotcha question, what is a woman? And, you know, there was this right wing documentary about it. And the big punchline in the end
is that a woman is an adult human female, which is
apparently controversial. And I'm like, well, at
(28:22):
Wikipedia, what it says is a woman is an adult human female. And then further down in the entry we do talk about trans
issues in society and whatever, biology and chromosomes and all
that stuff. And I think most people would
look at that and go, oh, actually this article isn't
crazy woke left. It's it's got some interesting
stuff in it and it sort of addresses the issue in a very
(28:44):
thoughtful and calm way. So I think most people who read
Wikipedia realize like, oh, actually that doesn't make
sense, even if you also might say, oh, well, in some cases, I
read this article. It seemed very biased.
Yeah, OK, that happens too. And so part of my work, you know, these days, is saying to our community, why don't we just double down on neutrality?
(29:05):
Like, let's be really, really thoughtful about it.
Let's have a big conversation about areas where we might be failing in neutrality. Where, you know, one of the valid questions is, you know, are we failing to look at a wide enough range of sources, and where might we do that? And I actually think technology
(29:26):
can help us with that, just by helping us go through articles and say, oh, look, in this set of articles, we only cite, you know, the Guardian, but not the Telegraph.
Yeah. OK, interesting.
Let's take a closer look at that.
It's quite a challenge though, isn't it, sourcing on Wikipedia? Because, you know, what is mainstream media changes, and
(29:49):
there are some media outlets that have got huge numbers of viewers or views which are peddling lies. Yeah, yeah.
No, I mean, I think there is this, you know, we're quite old-fashioned, I would say, in our approach to thinking about, you know, what is low-quality media and how do you
(30:09):
handle that? You know, it has to do with things like corrections policies and, sort of, are the stories inflammatory or are they just reporting in a straight way, and so on and so forth.
And then you have to sort of be thoughtful about that.
I mean, I think we can both say, you know, just to give a very UK
(30:30):
answer, The Guardian is a quality newspaper that is left-leaning and the Telegraph is a quality newspaper that's right-leaning. And they're both legitimate sources, and you wouldn't say of either of them, this is low-quality propaganda crap.
that, but they're mistaken. At the same time, you can look
at, you know, other sources, particularly online, more new
(30:50):
media sources, and some of them are just terrible, like they're
just not good enough. And so, because one of the things you talk about is, you know, the need for civility and not taking sides, do you think it's important to take action against people who break those rules? No, definitely.
Yeah, yeah, yeah, definitely. I mean, I think, I think it's,
(31:11):
you know, when people come to Wikipedia with a warrior
mentality, right? And they clearly are. I mean, we have an expression, 'not here to write an encyclopedia.' And so somebody who comes in who thinks, OK, my job here is to be
an activist and to promote whatever point of view.
I hope you find it kind of uncomfortable and I hope you
(31:33):
actually, with a little bit of experience, go, oh, wow, actually, this isn't a blog, this isn't Twitter, this isn't whatever. This is something different.
And actually I kind of like it. It's actually better.
Like I'm engaging in interesting conversations with people who maybe I disagree with, but I can see as human beings and who are
(31:54):
really nice to me, even though I came in kind of like a jerk.
And that's what you hope for. Some people probably can't learn
that and they just are going to get themselves blocked, but
that's what we hope for. Let's talk about the impact of AI. I mean, you know, I've got kids 18 and 20.
I've seen their progression through reading Wikipedia pages
when they were doing their school homework to being on
(32:15):
ChatGPT. Yeah, even though ChatGPT uses
Wikipedia as a primary source. Huge, yeah.
What do you think is the impact and the threat from AI to the
Wikimedia model? Yeah.
Well, I mean, so far we haven't seen a huge amount of impact.
You know, just to give one example, you know, now Google
(32:38):
has their AI summaries at the top of a lot of search results.
Well, a few studies showed that about 3% of the links in the first ten traditional search results are to Wikipedia, and AI summaries link to us about 6% of the time.
So we get a lot more links by Google, but those links at the
(33:00):
top, people don't click through them nearly as much, because they've already got the answer they asked for. You know, you ask Google, how old is Tom Cruise? It just tells you, and there's the link. But the net effect has cancelled out, and so our traffic is roughly the same. It's just unclear at the moment, but it seems not impactful. And this is very different from
some tech websites and apparently some news websites
(33:21):
are experiencing big drop offs in traffic just depending on
what their subject matter is andthings like that.
But do you think that that's going to carry on?
I mean, you know, couldn't we be moving to a world where Google
isn't the default and that people just go straight to their
ChatGPT app or whichever large language model they end up
using? I think it depends on the use
case, right?
(33:43):
So here's the issue. At least right now... I can't predict what's going to happen in 10 years' time. Certainly, in 10 years' time, if ChatGPT has perfect sort of accuracy on everything, then
yeah, people probably will be like, oh, I don't use Wikipedia
anymore because this is just as good or better.
OK, that's it. You know, that's the future.
(34:05):
It'll be a shame, but you know, that's technology. For the foreseeable future, though, there's the hallucination problem with large language models, which Gary Marcus, a well-known AI researcher, argues, I think persuasively, is not just a temporary bug. It's endemic to how these models work. It's not really possible to fix
(34:25):
it. And certainly there was a lot of hype before GPT-5 came out, and once it came out, people were like, oh, actually it's still full of nonsense, right?
And it hallucinates quite a lot. And it hallucinates on less
popular topics. You know, if you say, you know,
(34:46):
tell me the biography of Tom Cruise, it's probably pretty
decent. I always ask, who is Kate
Garvey? My wife, because she's not a
famous person, but she's known a bit.
And it's pretty much always wrong.
And it's pretty much always plausible.
And that's what we see, you know, at Wikipedia when we look
and investigate this. It's like, you know, this isn't,
(35:08):
it's not good enough. Like, we can't use it to help us
write Wikipedia because it frankly unapologetically lies a
lot. And so at least for now, we
don't foresee that as being a problem.
And also I think we also should remember the use cases are very
different. So if you want to, say, as I did,
(35:29):
use ChatGPT for this, and it was quite good.
You know, I'm renting an RV with my kids and we're going to drive around the American Southwest.
What should we see? Can you make an itinerary?
It makes a great itinerary and it tells you what the things are
you'll see and great. Like that's wonderful.
You wouldn't use Wikipedia for that.
Like you, you might use Wikipedia if you're like, I
(35:50):
actually want to know a detailed history of a certain, you know, National Monument, like how did it come to be designated, and blah, blah, the history of the national parks. But you wouldn't use it to create an itinerary.
You wouldn't use it for cooking, like, give me a recipe for these ingredients. And those are the most popular use cases for these chatbots: you know, people ask a question or they want to
(36:11):
interact with somebody of knowledge.
Super interesting, but not a direct competitor for Wikipedia.
It's not really doing any of your seven lessons for trust
either, is it though? No, I mean, I think this is
something that I think the AI companies should really take to
heart some of these lessons, which is to say like if you
(36:32):
know, if you're saying the next model that comes out, we've
solved the hallucination problemand then it comes out and people
are like, no, you didn't, people will lose trust. And, you know, 'does it do what it says on the tin' is, like, a super important test.
And there's virtually no transparency as well.
Virtually no transparency. Yeah, although in the
(36:55):
case of AI, it's actually very hard because how it processes
the data is all super mathematical in a way that's
impossible to really understand.It's sort of like, you know, how
did you decide your favorite colour is green?
Like you have no idea how you decided that, right?
You can't really explain it. And do you know how much AI is
relying on Wikipedia? A bit. There's a few numbers, you
(37:18):
know. I think in terms of the sheer size of the content, apparently Reddit's number one, because I think there are more words in Reddit, and Wikipedia's number two. But then
if you read millions of Reddit posts, you're probably mainly learning how people talk. And then for factual
(37:40):
information, you're probably mainly leaning on Wikipedia.
But it's really hard to know because obviously they're
feeding in everything they can get their hands on.
But it's, you know, without any doubt, it's huge.
It's like super important for them.
But you're not concerned that the number of Wikipedia users
and editors will reduce as reliance on AI increases?
(38:03):
Because you could get into a sort of a spiral there, couldn't you? Wikipedia gets worse and AI gets worse as a result. Yeah, yeah, yeah.
No, I mean, I think that, you know... when you have those kinds of spirals, there's an equilibrium that you reach.
And what does that equilibrium look like?
I'm not super concerned about it.
I mean, I think part of what gives me the calm is we as a
(38:26):
community, just, we love writing an encyclopedia.
It's our hobby. Like we enjoy doing that, and it turns out, wow, people like it, they read it a lot.
That's great too, right? And it's fun for us.
But we've never thought, you know, at the Wikimedia
Foundation, we don't obsess over traffic numbers.
We don't obsess partly because we're not an advertising driven
(38:46):
website. So like, we don't collect data
and all that. And so, you know, we're just
like, oh, OK, well, great peoplelike us, That's fun.
You know, obviously for the funding, right, it's important
that when people get knowledge, they understand, like, actually
supporting Wikipedia is quite important because it's giving
you this human curated sort of amazing thing for free.
(39:10):
That's great. So yeah, I mean, you have to
think about it a little bit. But at least right now, we're
not seeing an immediate threat. Where in the world are you
blocked? Where do people try and close down Wikipedia, and is there
anything you can do about it? So we are currently only blocked
in China, as far as I know. Things happen day-to-day, but we
(39:33):
were blocked for about 3 years in Turkey and we fought that in
the courts in Turkey. We fought all the way to the
Supreme Court and won. And therefore we're unblocked.
And it's a landmark decision in Turkish law that they can't just
block websites. So great, good, good news.
And we're very popular in Turkey as a result, because I think a
(39:53):
lot of people in Turkey are concerned about authoritarianism
and, and sort of the the path that the government might be
taking. But yeah, currently China. And
what can we do about it? Not much.
I mean, they block it on their own network, obviously.
But we still have users in China.
People use VPNs, you know, lots of places.
(40:15):
You know, that's really important.
There was a time when we used to be filtered in a lot of
different countries where they would block certain pages.
So when we moved to being encrypted, so you go to Wikipedia over HTTPS, like when you go to your bank, rather than just HTTP, and most websites are encrypted these days, then they no longer have the policy option of blocking particular pages.
(40:37):
It's, it's all or nothing. And we actually thought at that
time, well, we'll see what happens.
We think some of these countries may just block us completely.
They didn't. So we're available, you know,
throughout the Middle East and, you know, places you might think
would be uncomfortable with some of the content of Wikipedia.
But I think they made that policy decision because
Wikipedia is just too awesome. You know, like, like people need
(41:00):
it and they use it. And, you know, so maybe they're
not happy that, you know, Wikipedia describes topics about
human sexuality that are very taboo to talk about there.
But, you know, it's like, well, we can't do without Wikipedia. And what's the main cost of Wikipedia? I mean, is it servers?
No, I mean, yes, servers are a part, bandwidth is a part, but
(41:22):
it's really very human, Wikipedia. So we have local chapters that we fund all around the world who are doing outreach work. We partner with galleries, libraries, archives and museums, that sort of thing.
There's dealing with the press, there's keeping the website running, there's accounting, finance, you know,
(41:43):
all the operations of a non profit. But it's surprisingly inexpensive considering, you know, the magnitude of it. I mean, our budget compared to any other top 50 website property is tiny, but, you know, we manage.
And gee, I mean, because the other thing that's striking about Wikipedia is sort of how lo-fi it is, I suppose, you
(42:05):
know, and it hasn't really changed.
Yeah, as a look. We like it.
You know, you get a picture and you get a load of text and do
you, do you think that is something that will change,
should change, you know? I, I mean, I think we, we sort
of did a page redesign which most people probably didn't
notice. I don't know when it was two or
(42:25):
three years ago, there's more white space on the page and sort
of elements that were sort of modernized a little bit.
But you know, you don't want to change that fundamentally, because
it actually really works well for an encyclopedia.
I always joke, you know, we're not going to become TikTok or
something. And, you know, it's sort of,
we'll probably have more, more video over the years and more
(42:46):
sort of animated illustrations of things like the internal
combustion engine. And, you know, you can imagine a few technical changes, but the fundamentals just really work
really well. Like, you know, like Albert
Einstein, here's who he was and here's what he's done.
And there's a picture of him.
Because I've heard you say we'll be here in 100 years. There aren't many institutions or things that you can say confidently
(43:08):
will be here in 100 years. How confident are you?
Yeah, I'm, I'm very confident, right.
We run the Wikimedia Foundation in a very cost-conscious, conservative way. We build our reserves. We've got an endowment fund, a separate non profit, that is about planning for the long term,
(43:29):
about safety in the long term. A lot of our major donors donate
there because they don't want usto just spend their money.
It's more of a legacy kind of idea of like keep Wikipedia
safe. And then because we're community
driven and as long as people areinterested in having some fun
riding an encyclopedia together,I think we'll be here.
(43:50):
You've thought for a couple of years, I suppose, writing this book about trust. What is trust?
What is trust? Well, for me, the thing that defines trust, not as an epistemological definition of trust, but the thing that defines it for me, is a deliberate vulnerability. Like if I trust you, I'm letting
(44:10):
go of control. I'm saying I trust that you're
going to do the thing. I'm not going to control what
you're doing. I'm going to just say I, I'm
going to trust you. So for example, at Wikipedia,
it's like, it's very open for editing.
Like you can come in, you can make your first edit without
getting permission from anyone. And we're extending that trust,
which is, you know, still a bit shocking when we think about it
(44:31):
in the context of Wikipedia. But you know, there are places
here in London you can go in andask for food, and they'll give
you food right up front and not even ask for money until the
very end. And they trust that you aren't
going to just run away. And we call it a restaurant.
And it's not that exciting, right?
Trust is really a fundamental part of human life and human
culture. You know, one of the things we
say in the book is if we were as horrible about trust, if we
(44:54):
weren't trustworthy, broadly speaking, we would have gone
extinct a long time ago. Like we have to cooperate to
survive and we like cooperating so.
Yeah. But do you think trust is sort of almost under threat by the times we live in? You know, contested times, argumentative times, tribal times. Are we trusting people less now than we, well, should?
(45:17):
So the Edelman Trust Barometer survey does show a big drop in trust over the last decades, not just in social media: over decades, trust in journalism, trust in politics and so forth. At the same time, on a day-to-day basis with each other, we mostly still trust each other in all the usual kinds of ways. So there is this moment in
(45:38):
culture where there is a trust crisis.
And you know, I think there's a lot of things we need to do to
get back to a culture of trust. And one of the things we need to do is talk about trust, and sort of insisting upon politicians who are trustworthy is, like, super important.
And, you know, and I think that that is beginning to happen.
(45:58):
I mean. Do you think... you swim in the world of politics, and your wife Kate, who you mentioned, obviously worked for Tony Blair. Do you think politicians are untrustworthy? Well, you know, we can always be
super cynical and of course there's always untrustworthy
politicians around. I do think that we have had some
particularly untrustworthy politicians.
(46:20):
I mean, Donald Trump comes to mind.
I think for most people, he's someone, you know, whose relationship with the truth is tenuous at best.
And people are aware that people voted for him not because he's
honest and trustworthy, but because of other reasons.
And people realize, like, you know, he does say a lot of stuff; he's not necessarily the most
(46:42):
sort of fact-bound guy. But if you're sort of a
modern, you know, you're the Prime Minister, right? You're Keir Starmer now, and you go, well, I read these principles and I think, well, we have a manifesto. We say what we're going to do. We say what we're about. We're pretty transparent. We try and deliver; we don't always get it right, but we try and do what we said we'd do. Why am I not trusted?
(47:04):
Well, I mean, I think there's a long history, right?
And I think, you know, I would say in this country, one of the key moments, a defining moment in recent history, was when we all saw the poor Queen sitting on her own at the funeral, because of COVID restrictions and all of that, the funeral of her husband.
(47:26):
And then we find out later they were having a party in Number 10 the night before. And I think people were just
like fed up with that. They're like, we can't trust
these people. Like they're telling us what to
do. And they're clearly not following the rules themselves. And they lost the election, I think, as much because of reasons like that as for policy reasons and so forth. So I think that was for me a
(47:47):
moment of hope in terms of, yeah.
Are people getting fed up with untrustworthy politicians?
And should we start to demand more in our culture that, you know, when a politician screws up, they apologize? Sometimes they even resign, rather than just, you know, blag their way through it. But, you know, it doesn't
(48:09):
happen overnight. And I do think it's, you know,
with politics, I mean, politics is a brutal sport.
And so there are always going to be people criticizing whatever you do, that's for sure. And so I think politicians have to have a pretty thick skin, but the risk for them in having a thick skin is that they stop listening and they start saying,
(48:31):
you know, well, it doesn't matter if they trust me or not, they're going to hate me anyway.
OK, it does matter. I think we should be vocal about that. It matters. Yeah, things don't always work out, and people make mistakes. And, you know, that one person in that job screwed it up massively. That's OK.
And that's where you are very different, I think, to most
(48:52):
business leaders, if you like, where if you criticize what they do, people get defensive and they say, no, no, no, this is what we're doing. Whereas if you criticize Wikipedia, you tend to say, well, OK, we'll have a look at it.
I try to, yeah, I try to. And I try to encourage us all to
be like that, you know, to say, hold on a second, this isn't working. We need to really think a lot about trust. Jimmy Wales, thank you very much
(49:12):
indeed. Thank you.
Thank you for sharing your way to change the world.
I hope you enjoyed that. You can watch all of these
interviews on the Channel 4 News YouTube channel.
Until next time, bye bye.