Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:20):
Big hello, and welcome to Alternatively, Season One,
(01:04):
episode eight.
Speaker 2 (01:05):
We're doing something tonight that we haven't done on the
show since it was Conspiracy Pilled, which is a live interview,
and we will be interviewing Aaron Bandler about his work
with Wikipedia, and he'll tell us all about it.
Speaker 1 (01:20):
We'll pull him in a minute.
Speaker 2 (01:21):
He's backstage, but we had a couple announcements first. I
want to let you know that we'll do the
interview for about an hour and then we'll say bye
and we'll have a chance to chat with you live.
So, yeah, we'll have a chance to talk about
it behind his back, even though he'll still be able
to listen in. No. I wanted
(01:42):
a chance to really connect with you guys tonight and
talk about things, take the conversation wherever
it wants to go after we're done.
Speaker 1 (01:53):
So let's see.
Speaker 2 (01:55):
Number one announcement that I forgot to say last week
is that we are closing up the merch shop. It's
just costing us more to keep it open than it
is making, and it just doesn't make sense right now.
But we'll keep all the files, so we can bring
it back at some point, and the website is still up.
Website's still up, Liz. What installment of your book
(02:16):
are we on? We're on installment number five. I'm going
to link that in the Discord, but yeah, part five
is up. So it's part five out of twelve of
Planet Eyes. It's my creepy sci-fi.
Speaker 1 (02:28):
It is creepy and you should definitely read it late at night.
Yes, read it, preferably in a room
with lots of windows. I won't specify why.
Speaker 2 (02:40):
And then the other thing is that Rumble just instituted
a change that means that anyone who is a subscriber
to this channel can get access to the Rumble premium
content that we make. It used to be that you
had to be a Rumble Premium subscriber to
get it, and now you just have to be subbed
(03:00):
to the channel, which is way cheaper than being a
Rumble Premium subscriber. But if you're a Rumble Premium subscriber,
you get access to every other creator's premium stuff too.
So I just wanted to let you know that is
the case. But our main paywall
platform is still Locals, and tomorrow night we'll be on Locals
(03:20):
talking about Forrest Frank's broken back miracle.
Speaker 1 (03:25):
Question mark No. Zero.
Speaker 2 (03:28):
So it's been a really, really viral story on Instagram,
with this song Lemonade that he put out while his
back was broken, and then he got prayed for and
his back was healed. And do we think that this
miracle was a real miracle, or was it staged for clicks? Anyway,
that's tomorrow night. But enough of that, enough
of that, we'll get Aaron here and we'll start talking.
Speaker 1 (03:47):
Hey, how are you good?
Speaker 3 (03:48):
How are you guys good?
Speaker 2 (03:50):
Introduce yourself, tell us what you do, why you're here,
what we're talking about tonight.
Speaker 3 (03:56):
So as you said at the beginning, I'm
Aaron Bandler. I'm an investigative journalist. I currently work at
JNS, the Jewish News Syndicate, as a national reporter.
Prior to that, I was with the Jewish Journal of
Greater Los Angeles for almost eight years. I've also been
published at RealClearInvestigations. And, you know, a big
(04:18):
area of focus of mine has been Wikipedia and anti-Israel
bias at Wikipedia. It started back in, actually, all
the way back in twenty eighteen, when
we at the Jewish Journal were tipped off
to it, and, you know, it took some time to
sort of figure out the best way to, you know,
kind of start getting it out there, and we eventually
started publishing a series of articles last year, and that
(04:42):
series has continued over at JNS, and as I said earlier,
I've been published in RealClearInvestigations about it. You know,
Wikipedia in a way is kind of the gift that
keeps on giving in regards to, you know,
newsy stuff, because of how out of control, uh,
you know, activist editors have become on that site.
Speaker 1 (05:02):
Mm hmm. We've talked.
Speaker 2 (05:03):
We talked about this maybe a year ago on the show,
and I'm excited to hear more about it because I
think that there's a I remember back in the day
when it was like never cite Wikipedia, you know, we
don't trust Wikipedia.
Speaker 1 (05:15):
Anybody can edit it.
Speaker 2 (05:16):
And then it passed through a period of time where
it was very, it's been very trusted. It's like one
of the top Google hits for anything you search, and
now it seems to be severely compromised in a lot
of ways.
Speaker 3 (05:28):
Yeah, well, so what's interesting is that I think if
you ask the average person about Wikipedia, they would say, like, oh, no,
no one takes Wikipedia seriously, it can't be cited in research papers.
But there are studies that show that the vast majority
of students begin the research process for their coursework
by looking at Wikipedia. And as you mentioned, Wikipedia is
(05:48):
at the top of every Google search usually, and that's
not a coincidence. There's actually a Wikipedia article that details how Google
and Wikipedia have been collaborating for years. You know, Google
donated three million dollars to the Wikimedia Foundation, or
something to that effect, in twenty nineteen. That's, again,
all in the Wikipedia article on Google
and Wikipedia. And I think that Wikipedia is like
(06:11):
the sixth most viewed website worldwide. It's like
sixth, eighth, you know, somewhere in
that range. And I think basically, even if the majority
of people don't take Wikipedia seriously, we all use it
for information. I mean, everyone
(06:32):
at some point uses Wikipedia in their daily lives. It's
just inescapable, you know, whether you're looking up,
you know, stats about your favorite sports player, or
who played what on, like, you know, a Beatles album or
a Steely Dan album, you know. And
I think for those uses Wikipedia is fine. You know,
the problem is when it comes to contentious topics. It's open,
(06:53):
Wikipedia is an open-source model, and so it becomes
vulnerable to bad-faith actors who have an agenda, you know,
and those people are very good at being able to
figure out the complex, labyrinthine system that is Wikipedia bureaucracy
and then weaponizing it, you know, to sort of justify
putting their narrative onto Wikipedia and disguising it as neutrality.
Speaker 1 (07:16):
Do you think it's an outgrowth of, like, the leftist
activism of the universities to some extent, or is it
completely separate?
Speaker 3 (07:30):
I think that plays a role to an extent,
because with Wikipedia's articles, the only thing editors can really
do is summarize what reliable sources say,
and Wikipedia is very deferential toward academic sources, and we
know that academics tend to be rather biased against Israel.
And of course we've seen that with, you know, what's been
happening on the campuses since October seventh. And so part
(07:52):
of what's been happening on Wikipedia is that anti-Israel
editors have been taking citations from anti-Israel academics,
you know, like Rashid Khalidi, Ilan Pappé, Avi Shlaim, you know,
and they are taking those citations and
(08:12):
pretending that that's the mainstream view, quote unquote, and
therefore it's being used in a neutral voice, an
encyclopedic voice, "wiki voice" is what it's known as on Wikipedia.
And so you see that with, like, the Zionism
Wikipedia page. There's a sentence that says that Zionists
wanted as much land and as many Jews and as
few Arabs as possible. The citations are really almost exclusively anti-
(08:37):
Israel academics like Rashid Khalidi and Ilan Pappé. I
talked about this in my RealClearInvestigations piece, and so
that's a good example of how editors have taken
the views of anti-Israel academics and tried to,
you know, mainstream them. And I think the way that they get away with this is
because Wikipedia has a model called consensus. Basically,
(09:03):
the idea behind it is that if you make a
change to an article and no one disputes it, it's
presumed to have consensus. But if someone does
dispute the change, you're supposed to work out your differences
on what's called the talk page of an article. And
every Wikipedia article has a talk page,
there's a tab that says
"talk" on every Wikipedia page, right next to "article," and you can see these discussions
(09:25):
play out there. And
so if you go to the talk page, you can see the
various discussions editors have had on the page as
they argue over what the article should or should not say.
And so how it's supposed to work is that editors
are supposed to reach a compromise with each
(09:48):
other and then sing kumbaya afterward. But this is the Internet,
you know, so of course fights on
the Internet over contentious stuff don't
always result in compromise. So one of the ways they
try to settle it is they have these formal
discussions called requests for comment, and others like it, where
basically they take a poll of the Wikipedia community on a
(10:11):
specific question, and the Wikipedia community puts in their stated
positions with a rationale, and the rationale has
to make its arguments based on Wikipedia policy, not
personal opinion, Wikipedia policy. And at the end,
someone known as a closer, which is an uninvolved Wikipedian in
good standing, renders a verdict based on the numbers
(10:35):
as well as the strength of the arguments presented, and
so numerically speaking, consensus is usually considered to be
at least two thirds. So that means for
a change, you need at least two thirds of the
Wikipedia community's support, you know, to make a change
to articles that have long-standing content in them. And
so that's why it's hard to get anti-Israel
(10:56):
content changed on Wikipedia articles, because there are enough editors
opposed to it to stop it, basically stonewall it from
being changed.
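For listeners who like to see the arithmetic, here is a minimal sketch of the rough two-thirds rule of thumb Aaron describes. It only covers the numeric side of consensus; on the real site a closer also weighs the strength of the policy arguments, and the function name and threshold here are illustrative assumptions, not Wikipedia software.

```python
# A minimal, illustrative sketch of the rough "two thirds" numeric rule of
# thumb described above. Real Wikipedia closers weigh the strength of
# policy-based arguments, not just raw numbers, so this is only the
# arithmetic side of it. The function name and threshold are assumptions.

def meets_numeric_consensus(support: int, oppose: int, threshold: float = 2 / 3) -> bool:
    """Return True if the share of support !votes reaches the threshold."""
    total = support + oppose
    if total == 0:
        return False  # no participation, no consensus either way
    return support / total >= threshold

# Example: 12 editors support a change, 8 oppose it.
# 12 / 20 = 0.6, below two thirds, so the long-standing text would stay.
print(meets_numeric_consensus(12, 8))   # False
print(meets_numeric_consensus(14, 6))   # True (0.7 >= 0.667)
```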
Speaker 2 (11:05):
So is there a time limit, from when a change
is made, to raise an issue with it before it
becomes consensus?
Speaker 3 (11:13):
That's a good question. I mean, there's no real
hard and fast rule. I guess if it
just kind of stays and no one challenges it,
then it's fine. I mean, maybe
someone puts in an edit,
someone reverts it, undoing it, so they put it
back in, and then it kind of just stops
(11:36):
for a bit. With consensus, they basically consider it
based on what's considered the stable version. In other words,
when editors just stop edit warring over it, or
if discussion kind of peters out
on the talk page, then you sort of have consensus,
you know, or of course, when a verdict is
given for these formal discussion polls,
(11:58):
you know, then consensus is clear. Otherwise it's, you know,
kind of one of those "I know
it when I see it" kind of things,
looking at the edit history and the
talk page discussions.
Speaker 2 (12:11):
Okay, so fairly recently some of these pages were edited
to have an anti-Israel bias?
Speaker 3 (12:20):
Well, the anti-Israel bias
on Wikipedia has been there for years. It just became
more pronounced after October seventh. So a good example
of this is that prior to October seventh, there was
an article that for many years had the title,
excuse me, the title of "Israel and apartheid" or "Israel
and the apartheid analogy," something
like that. But the article was biased because it had
(12:43):
all these allegations about Israel being accused of being an
apartheid state, and those allegations would dominate the article,
and you would have some rebuttals, sure, but, you know,
they were kind of buried down and weren't as well presented.
And so even though it was couched as allegations,
you know, they were clearly trying to
(13:04):
steer the reader to a particular perspective. Sure. After October seventh,
that article is now called "Israeli apartheid" and treats it
as if Israel being an apartheid state
is an objective fact, which of course it isn't.
Those of us who follow the conflict
know that Israel is obviously
(13:25):
not an apartheid state, and that everyone in
Israel has equal rights and so forth. But
they basically take an anti-Israel perspective
and have mainstreamed it, you know,
under the guise of neutrality, and they're able to do
that because, you know, a small cabal of editors were
able to just make the change, you know, and it
(13:47):
was kind of an under-the-radar discussion that happened.
Speaker 1 (13:51):
For those who aren't familiar, perhaps, what would
an apartheid state be? Maybe an example?
Speaker 3 (14:00):
South Africa is usually the one that
is the most well known and best example of an
apartheid state, because it was based on
segregation. The blacks in South Africa
(14:20):
were deprived of equal rights, basically, under
a white government. And so basically, when they're accusing
Israel of being an apartheid state, they're accusing Israel
of essentially being a racist state, being racist
toward the Palestinians, depriving the Palestinians of equal rights.
You know, they're being subjected to segregation,
(14:42):
ethnic cleansing, displacement. That's
how they try to demonize Israel, like
it's apartheid South Africa. And when we
talk about apartheid South Africa,
that was prior to Nelson Mandela, right, and Nelson
(15:03):
Mandela, you know, in South Africa ended apartheid, which of
course was a great thing.
Speaker 1 (15:10):
So would Palestine be considered an apartheid state?
Speaker 3 (15:15):
That's a great question, and I would consider it
apartheid. Well, of course, Palestine technically is not a state,
but the Palestinians there in the West Bank,
you know, they prohibit the sale of land to Jews,
and the penalty is punishment by death, you know, so
that sounds pretty apartheid-y to me.
Speaker 2 (15:36):
There's a lot of big, bad words that are used around Israel: apartheid,
ethnic cleansing, genocide.
Speaker 1 (15:45):
And it's.
Speaker 2 (15:49):
The words are being watered down until they almost have
no meaning anymore. But they're also kind of being turned
on their head, as far as the people doing the
accusing are the people doing the thing half the time.
But they're just repeated over and over and over, and
then they're
Speaker 1 (16:08):
repeated in something like Wikipedia that we don't think heavily about.
I think you're right about Wikipedia, it's not a
highly respected source, but we just don't really, when we
read a Wikipedia article, we're
Speaker 2 (16:21):
not like, oh man, I wonder if this is correct
or not. We just casually accept it. Talk to us about
your work and this whole series you've been doing.
Speaker 3 (16:35):
Yeah. So basically, what I've been doing, it's
been a lot of hours of research, really,
talking to various Wikipedia editors. Most of them
are quoted anonymously in the articles because they are afraid
of retaliation. And so I've been talking to them and
learning more about how Wikipedia works, asking about various
(16:57):
examples and so forth, and just really understanding the
Wikipedia legalese of it, you know, and how
these articles are supposed to look, how they're supposed
to read. And then also getting
outside perspective from experts, you know. Asaf Romirowsky, he's
a Middle East historian who I've quoted quite a bit
in my work, as is Elsisi, who's a vice rector
at a university. They've both been very generous with their time,
you know, for me in this regard. And basically what
my articles do is they show
how Wikipedia's policies
have been bastardized. And then it basically decodes
(17:38):
all the Wikipedia legalese and explains how it's been
turned on its head, basically, you know. So it's sort
of like, I mean, in some ways an article of mine
is almost like a Wikipedia talk page discussion, where, you know,
Wikipedia editors are commenting in my story as to
how the discussion should be going or how the article
should read, you know. But of course they're talking
(17:59):
to me because Wikipedia is so far
gone at this point that, you know, the only way
to really make your voice heard is to
go through the press.
Speaker 2 (18:12):
Yeah, okay. So are changes continuing to be made,
are these pages getting worse and worse and worse?
Or are they kind of static since October seventh?
Speaker 3 (18:27):
I would say they're static. So at the beginning
of the year, a bunch of anti-Israel
editors got banned from editing Israel-Palestine articles
indefinitely, although they can appeal it in twelve months.
So six anti-Israel editors were topic banned, that's
what it's called, as were two pro-Israel editors. And
(18:49):
after that happened, and
also now that there have been letters to Wikipedia from the
DOJ and members of Congress, you know, I think now
people are starting to realize that people are watching
now, so I think it's now become a bit
more static. But I mean, the way I view it
(19:11):
is that because it's static,
the damage has already been done, you know, like
there isn't a lot of support to overturn, you know, some of
the worst anti-Israel content that happened after October seventh.
And then you've also seen things like the example that
I mentioned earlier with the Zionism Wikipedia page and
that sentence that Zionists wanted as much land and as many
(19:34):
Jews and as few Arabs as possible. They put in a
moratorium on anyone discussing that sentence for a
year, in February. And so what that means is that
that line is basically frozen in place until February
of twenty twenty-six. And that happened, you know, it
happened in February, just as the topic bans happened;
(19:55):
you know, there was overwhelming support for a moratorium
to be placed on that sentence. So that sentence
is not going away anytime soon, and even when the moratorium
expires, you know,
come February, it's going to be an uphill battle
to try and undo it.
Speaker 1 (20:12):
Where was that sentence? In the, you said it
was the Zionism article?
Speaker 3 (20:16):
Yeah, it's right there in the opening paragraph.
Speaker 1 (20:19):
Zionism is an ethnocultural nationalist movement that emerged in
late nineteenth-century Europe to establish and support a Jewish
homeland through the colonization of Palestine, a region corresponding to
the Land of Israel in Judaism and central to Jewish history.
Zionists wanted to create
Speaker 2 (20:34):
a Jewish state in Palestine with as much land, as
many Jews, and as few Palestinian Arabs as possible. So by the
by the.
Speaker 1 (20:39):
Time can be disputed, the damage is way done.
Speaker 2 (20:42):
So it is really significant for something like that not
just to be in the article, but to be in
the first couple sentences.
Speaker 3 (20:49):
That's right, because most people don't read beyond the lead section,
especially the opening paragraph. You know, because with
a Google search, oftentimes the first sentence or
two, usually the first sentence, is what,
well, okay. Like, the opening sentence
(21:10):
or two will pop up in a
little box on Google after you search it, right,
and that comes from Wikipedia, and
so that's why those opening sentences are very important,
you know, for what a Wikipedia article says, and
that's why a lot of times these
discussions center around the lead, and typically those opening sentences
(21:32):
and what they should or should not say.
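As a concrete illustration of why those opening sentences matter, here is a small sketch that pulls just the lead section of an article through the public MediaWiki API (the TextExtracts endpoint). The first sentence or two of that extract is roughly what ends up in search snippets and summary boxes. The User-Agent string and the example title are assumptions made for the demo.

```python
# Minimal sketch: fetch only the lead section of a Wikipedia article via the
# public MediaWiki API (TextExtracts). The opening sentence of this extract
# is roughly what search engines and summary boxes surface, which is why the
# guest stresses how contested those first sentences are.
# Assumes the `requests` package is installed; the title is just an example.
import requests

def fetch_lead(title: str) -> str:
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "format": "json",
            "prop": "extracts",
            "exintro": 1,       # only the text before the first heading
            "explaintext": 1,   # plain text instead of HTML
            "titles": title,
        },
        headers={"User-Agent": "lead-section-demo/0.1"},
        timeout=30,
    )
    pages = resp.json()["query"]["pages"]
    return next(iter(pages.values())).get("extract", "")

if __name__ == "__main__":
    lead = fetch_lead("Zionism")
    # Roughly the sentence a search snippet would show.
    print(lead.split(". ")[0] + ".")
```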
Speaker 2 (21:35):
So when you said six anti-Israel and
two pro-Israel editors were topic banned, what's the
reasoning given for that?
Speaker 1 (21:45):
What does that entail?
Speaker 3 (21:49):
Basically, Wikipedia has what's called an arbitration committee.
It's their version of a supreme court. And they had
a case about the Israel-Palestine topic area, just
because tensions were just so high, you know, after October seventh,
and there had been a lot of news coverage
(22:10):
of it too. But for Wikipedia's purposes, they're
just looking at, you know, a lot of editors were
filing complaints against each other. Wikipedia has
its own little court system, and so they have
lower courts where editors can file complaints against each
other, and then admins will, you know, rule
on them. And admins, in these kinds of topics,
can unilaterally ban editors at any moment, you know, but
(22:34):
usually they will wait for other admins
to deliberate and, you know, come to a consensus,
for lack of a better term. But when things
get so heated, you know, and there's so much disruption
happening in a topic area, it goes to ArbCom, their
Wikipedia Supreme Court, and basically their determination was
(22:56):
that these editors weren't editing in a collegial or
neutral fashion, and
that was the reason to put these
topic bans in place. And they're indefinite,
but they can be appealed in twelve months, so in
January they can be appealed, and I guess we'll see
(23:17):
if any of these editors will have any successful appeals.
Speaker 2 (23:22):
So they're not banned from editing anywhere on Wikipedia. They're
just banned from the topic that they're clearly too biased
in to be working on.
Speaker 3 (23:31):
Yeah, I mean, right, too biased
in, or just that they can't conduct themselves in a
collegial manner, because on Wikipedia, civility
is a very important thing. They very much
stress being kind. You know, of course it's the Internet,
and the Internet is not
always a kind place when it comes to contentious topics.
(23:51):
But, you know, so if editors
get too heated, sometimes these sanctions can be
a way of cooling things down, like
taking the temperature down a bit in terms of the discussions in
the topic area. So basically these editors have
to show that they can be productive elsewhere on Wikipedia and
(24:14):
edit collegially and, you know, objectively, and then
they can file an appeal, you know, come January, and be like, hey,
you know, I think, you know, I've had
the time to reflect, I have shown over the past year that I've
been productive in these areas, I've done a lot of
great work here. You know, I think
(24:36):
I'm ready to tackle this subject again, you know, and
then it's up to ArbCom
to, you know, decide if the appeal is worthy
or not.
Speaker 1 (24:47):
Are all these editors doing this for free?
Speaker 3 (24:51):
Well, they're supposed to. I mean, Wikipedia says that
the editors are volunteer editors. Now, paid editing is
allowed on Wikipedia so long as you disclose it. And
I think a lot of times people who engage in
paid editing without disclosing it, you know, they get caught, because
there are some veteran Wikipedia editors who are really good at
(25:13):
sniffing this stuff out, and they can usually tell,
because if an editor, say, is being paid
to edit on behalf of a corporation, and they notice
some random account just pops up on a corporation's Wikipedia
page and it's putting in promotional content, they go, hmm,
that seems a little odd, you know. But
(25:33):
you know, when it comes to something like, you know,
Israel-Palestine, it's a little more, people are
a little more, I'd say, careful about how they do things.
It's like, I mean, look, presumption of innocence, right?
Like, so, you know, I mean,
there's no evidence that ties any of these editors
(25:56):
to, like, you know, propaganda groups or whatever, but for
all we know, they could be, right? Because they're anonymous.
Speaker 2 (26:05):
Like, if somebody is aggressively editing the Microsoft
page to be pro-Microsoft, it's easier to think, oh,
maybe Microsoft's paying them, than with a topic that it is
reasonable to think people might care about without needing to
(26:25):
be paid. And so it'd be a lot easier for
somebody who was being paid to kind of slip in there.
Speaker 1 (26:33):
Is that what you're saying?
Speaker 3 (26:36):
Yeah, I mean, right, basically, yeah, exactly.
They'd have to disclose that they're being paid by Microsoft
to make those edits, and yeah, I mean,
you know, if some editor is, you know,
heaping promotional content onto the Microsoft page just
out of nowhere, people are going to be suspicious.
But if they just slip something in, you know, they're
less likely to be noticed. But with Israel-Palestine especially,
(26:58):
we're not talking about corporations, we're
talking about, you know, a
very complex conflict, we're talking about countries and regions, and so
it's like, how would
you know if someone's being paid, you know,
by a propaganda group, I mean.
Speaker 2 (27:20):
And is it any better if people are
spending their whole lives, like, obsessively editing Wikipedia pages
for free? Like, what kind of person is that?
Speaker 3 (27:36):
You know, that's a great question. I
think when I talk to editors about
how they got involved in Wikipedia, it's usually something
along the lines of: they wanted to check something,
they saw something that was inaccurate and wanted
it fixed. And then
(27:56):
basically something piqued their interest about Wikipedia,
and, you know, once they started editing,
they just couldn't stop. You know, it's
like going down the rabbit hole. I
mean, with Wikipedia, apparently it's not just addicting for editors,
like, Wikipedia rabbit holes are a thing, right, you know?
It's very easy to
(28:18):
read one Wikipedia article after another after another. For the reader,
that's very addicting. But for editors it's addicting too,
because they get addicted to making edits. You know, they
get addicted to the discussions and arguing about the policies,
and, you know, so basically
these editors just grew to fall in love with Wikipedia.
(28:39):
And to be honest, I kind of have too, just
from covering it, you know, because I find it so fascinating.
It's really a world unto itself,
and, you know, you read about how
all these policies work and the procedures. It's just
such a fascinating subject that people just don't know much about,
you know, which makes it all the more disheartening to see
what it's become.
it become.
Speaker 1 (29:00):
You know, I can imagine that the.
Speaker 2 (29:05):
Having power over information could be incredibly addicting.
Speaker 3 (29:10):
Oh for sure, Yes, that's for sure a big part
of it, I would imagine.
Speaker 1 (29:15):
Yeah, this is a side note. When I was in
college ten years ago, there was a story that was
told in our history class about a former student a
few years prior who edited the Archduke Ferdinand page, concerning
(29:39):
his assassination that began World War One, to say that
his assassin had first pistol-whipped a bystander before shooting
Archduke Ferdinand. And this edit stood for a long time.
It ended up caught in a circular citation. So there
(30:01):
was, yeah, there was another article that had quoted it,
and then that was cited. And years later, several years
even after I graduated, so probably a decade after this
edit had been made, I could still find an
article that was citing the Wikipedia article about this thing
that this guy just made up out of his head
(30:21):
for fun as a joke.
Speaker 3 (30:24):
Yep. So what you're describing,
there's a term for that on Wikipedia. It's called citogenesis,
where basically something gets made up on Wikipedia,
it gets cited in an article, and then that article
gets cited on Wikipedia as a source. And
Wikipedia has a whole list of these incidents.
(30:44):
So, to Wikipedia's credit,
it's very transparent and open for the
most part, so you can
find these things on
Wikipedia and how they all transpired. But I think
it just goes to show, you know, how influential
Wikipedia is, because, as we discussed at the beginning of
(31:07):
the show, it's, you know, like
the sixth most viewed website worldwide and so forth,
top of every Google search. Everyone uses
Wikipedia for research. You know, that means that
journalists and academics, you know, students, you know,
with their research papers, they're looking at Wikipedia at
some point. Even if they're not citing it, they're still looking
(31:27):
at what the article says and at the sources and
so forth, you know. And so the fact that this happens,
you know, it shows how influential Wikipedia is, and it
shows how it's like a closed feedback loop between Wikipedia, academia,
and the media. And so
that's the crux of the problem, you know,
(31:48):
that because you have this closed feedback loop between, you
know, three, you know, systemically biased institutions,
it causes this self-sustaining
ecosystem of bias. And it's like, how do
you stop it?
Speaker 1 (32:03):
Speaking of, are you familiar with the issue with the
Scots part of Wikipedia? The guy who made up all
that stuff? So, Variable in the comments will have
to correct me if I'm wrong. Oh, there's a meme in Discord? Okay,
let me pull it up. Because basically my understanding is
there was a guy who, just for a long time,
was just posting stuff in Scots on the thing and
(32:26):
he was just making it all up. And I just
have to type in my password real quick.
Speaker 2 (32:30):
If you can pull up the meme before me, go
for it. It should be in the meme stream, is
what I'm assuming.
Speaker 1 (32:37):
But so he was writing in Scots, yeah,
I think, or making it up. I've suddenly lost the ability to
text the meme stream... there it is. Okay, ah, thank you.
I thought about it earlier and then I totally forgot it. Sorry,
give me a second. How dare you? Okay? Oh, oh yeah, okay,
(33:02):
go ahead. "Almost every article on Scots Wikipedia is written
by one American teenager who does not speak Scots and
is just writing English in an accent. If you're training
a multilingual language model, this fakery might be your
entire training data for Scots. And the telekinesis article is:
telekinesis is a form of movement in objects with your mind."
Speaker 3 (33:24):
That's hilarious. I like that. Yeah, boy, that's great.
Speaker 1 (33:32):
That's amazing.
Speaker 2 (33:35):
So yeah, you're right, it's a closed feedback loop that
creates some of these really funny issues. But like
you said, even someone like me, when I'm doing research
for this show, I will oftentimes look at a Wikipedia
page first, and of course I'm not going to cite
it as, like, this is the end-all
be-all, but it sets the tone of a
conversation and it sets the structure of a conversation in
(33:58):
a way that I think even people who, in the
back of their mind, are like, I don't take this super
seriously, are still probably not fully realizing.
Speaker 3 (34:08):
Exactly yeah.
Speaker 1 (34:09):
I mean.
Speaker 3 (34:09):
And the core of the issue is
that, because everyone uses Wikipedia, no one really
thinks about whether or not what's being said
is true. It's just kind of like,
oh yeah, Wikipedia says this, and then it kind
of gets taken as gospel, and then it spreads
its way throughout the culture. And so again I go
(34:30):
back to that sentence about the Zionists wanting as
much land, as many Jews, and as few Arabs as possible.
Because last year there was a city
councilman in the Bay Area who made
an anti-Israel comment, and when he was called out on it,
(34:52):
he pointed to that sentence in the Wikipedia page as
justification for his remarks. And so now you're seeing how it's
reaching its way out,
it's now being promulgated by politicians too,
you know. I mean, today it's a city councilman,
you know, somewhere in the Bay Area, but
(35:13):
tomorrow it could be a member of Congress,
you know, who knows, right? So that's why Wikipedia is
such an important issue, and it's frustrating how
it's gone under the radar for so long.
I think in the past year it started to get
more attention, thanks to reporting from myself and others, you know,
(35:34):
but it's still not really
on people's radar, and it should be.
Speaker 2 (35:39):
We came across it because we began to realize that,
Speaker 3 (35:44):
Uh.
Speaker 2 (35:45):
For example, if a highly respected academic in a certain
field suddenly decided, I agree with this conspiracy theory and
he wrote a paper about it, put it out there.
if he used to have a highly respectable Wikipedia
Speaker 1 (36:04):
page, as soon as he crossed that line and said
something that he wasn't allowed to say, all of a
sudden somebody would be editing his top two sentences to
go from "highly respected academic in this field"
to "conspiracy theorist," completely tanking his reputation in
(36:31):
just those top two lines, just because he stepped
out of line, he did a "wrong" thing. And so,
I care a lot about Israel and about that whole
mess of stuff in our culture. But I think
this conversation is in a lot of ways a microcosm
(36:53):
of a much bigger Wikipedia problem in general, a
much bigger information problem in general, because if they're doing
this with Israel, they're doing it with everything, right?
Speaker 3 (37:04):
I agree, yeah. I mean, I think it's important to
realize that there are different editors in different topic areas,
you know, and a topic area is a little
like a fiefdom of sorts, where you have
the regulars who kind of dictate the coverage and
so forth. But it's also basically
the same problem, which is that Wikipedia overall has a
systemic left-wing bias. And so you talk about,
(37:26):
you know, yeah, you see a lot
of instances of people's Wikipedia pages where they're accused
of, you know, misinformation, conspiracy theories, you know,
and there are instances where
that's warranted. But then you have
some where it's just from
wrongthink, you know. When it comes to
(37:47):
wrongthink, you know, it's because they found, like,
some media hit pieces calling this person a conspiracy theorist.
And it's like, oh well, you know, the Daily Beast
and Media Matters or whatever else, it could even be
The Hill or the New York Times, they call this person a
conspiracy theorist, so we should too, you know. And there
(38:08):
are some editors who argue that Wikipedia has a policy
called BLP, which stands for Biographies of Living Persons, and
it's supposed to be a policy against defamation, you know,
and the idea is that it's supposed to be
a high threshold to, you know,
turn a person's Wikipedia page into an attack page.
But people still do it all the time, especially when
(38:29):
it comes to right-leaning figures, you know. And part
of it is because it's not hard for them to find
media hit pieces, you know, against right-leaning figures, and
it's like, oh well, the media says this person is
a, you know, is a
conspiracy theorist who promulgates misinformation, so we've got
to say it too, you know. Whereas others I talk
(38:51):
to say, like, you know, we should just
summarize what that person's views are, and if
people dispute the views, we can say that, but,
you know, it's not appropriate for us to be telling
people what to think about those views. A very logical concept, right,
that's how it's probably supposed to be, you know. And if
you look at any left-wing figures'
(39:15):
pages, pick anyone, pick Biden, pick Kamala Harris,
pick Obama, I mean, they're all very
anodyne, you know, and in some cases
very sympathetic. Look at the tone of the pages. It's
just a quick summary of their views.
Take Biden, for instance: if I
(39:35):
recall correctly, Biden's page talks about how he's considered
a moderate and well regarded by his Senate colleagues,
something along those lines. I mean, I haven't looked at
that page in a while, so maybe it's changed a
bit since then. But you can compare Biden's
page to Trump's, and Trump's page talks
about how, oh, you know, Trump has made a lot
(39:57):
of false and misleading statements, you know. I
mean, compare the opening paragraphs of the Trump page to Biden's,
and it's just night and day compared to
how Wikipedia covers the two of them. And the editors are just like, oh,
we're just following what the sources say. You know, sources
say Trump says false and misleading things; sources don't say Biden
says false and misleading things. And part of this
(40:20):
thing gets back to the issue of what's considered a reliable
source on Wikipedia, and again it comes back
to the issue of consensus. So editors decide amongst themselves
what is or is not a reliable source. If they disagree,
they discuss it,
they take these formal polls, and, you know, a closer
renders a verdict. And what has happened is that
(40:41):
all these mainstream media sources are
considered generally reliable, which is the highest
standing, that's the gold standard on Wikipedia. So CNN, the New York Times,
they're all generally reliable, you know, you can cite them
whenever and wherever for the most part. But then
you have sources like the Daily Wire,
the Daily Caller, the New York Post, Fox News, that are
(41:05):
basically considered mostly unusable on Wikipedia. And then you also
have far-left sources like The Nation and Mother Jones
and MSNBC and Al Jazeera that are also considered generally reliable.
So now you have
this very obvious imbalance, you know, where
(41:26):
all these lefty sources are considered kosher on Wikipedia,
but these right-leaning sources are not. And as I
said earlier, Wikipedia editors can only summarize what reliable sources say,
so that alone causes bias, you know,
if the only sources you can
choose from are primarily left-wing sources.
Speaker 2 (41:45):
Do you have any idea what percentage of people who read
a Wikipedia article are looking at the sources with any
sort of critical eye?
Speaker 3 (41:56):
That's a great question. I'm not sure that's ever actually been measured. I
would just guess that most don't. I mean,
Wikipedia has data showing that
around sixty percent
of people don't read beyond the lead section of
an article. So you figure, if that percentage of people
aren't reading past the
lead section, they're probably not reading and looking at the sources,
(42:19):
probably not, right? And look, I mean, I look
at Wikipedia for, like, you know, a summary of
a movie, or, you know, who played in,
like, the World Series back in nineteen
ninety-five or whatever, you know, or ninety-six, whatever, pick
a year. You know, I'm not looking at the sources,
(42:39):
you know, because that's just how people really
look at Wikipedia. They just don't think to do that.
Speaker 1 (42:45):
It's quick and easy, and so most people.
Speaker 3 (42:49):
That's a great way of putting it. Wikipedia is for convenience,
it's for readers' convenience. Anyone needs to know something
about a topic, boom, Wikipedia, right there. Great, I know about
it now, you know. It's... oh no, he froze.
Oh no. He'll come back. You're back, you're back, yay.
(43:10):
So, yes. I don't
know where I got cut off there. But yeah,
it's definitely a convenience thing. That's why
it's so popular, it's that it's convenient, you know. But
the problem is that because it's so convenient, it's
now very easy for anti-Israel and
left-wing bias more generally to be spread to people's
iPhones and homes and desktops, because Wikipedia is everywhere.
Speaker 2 (43:38):
Yeah, I think so many of us, at least in
my, like, Twitter circles, have focused really heavily on how
social media has been controlling the information and how mainstream
media has been controlling the information, with almost no one
talking about how Wikipedia is controlling it.
Speaker 3 (43:56):
Yeah, it's just been under the
radar for so long, and, I mean, maybe it's
because people don't think of Wikipedia.
Maybe it's considered an afterthought, where it's like, oh, Wikipedia,
you know. And I think it's also
a very complex subject too, you know, because
(44:17):
it's very easy for your eyes to glaze over once you start
looking through the policies, and the editors all reference
the policies in this alphabet-soup shorthand and whatever. It's just
kind of like, what? You know, which is why I
took the time to go through all that,
so you don't have to, you know. So yeah,
(44:37):
I mean, that's my theory as to why
Wikipedia has gone under the radar for so long. But
I have noticed that when I post
an article about Wikipedia, it tends to do pretty well on
social media. And again, I'm
not the only one who's published on Wikipedia. I mean,
there's Ashley Rindsberg at Pirate Wires, Shlomit Lir,
a researcher at the University of Haifa. They've done some great work
(44:59):
on this subject that's also, you know, gone viral. So
I think that shows that there is
an audience for this, you know, that people do
want to know more about Wikipedia, and
they're hungry for more information about how Wikipedia works and
how it's become biased. You know, it's just there's not
many of us that can really, you know, dissect it thoroughly.
Speaker 1 (45:23):
If I'm researching on Wikipedia
and I go to an article, what are some red
flags, maybe, to look for that an article might be biased?
Speaker 3 (45:35):
Okay, so one thing you can look
for is an excessive amount of citations. This
is a phenomenon on Wikipedia; there's actually an
essay that discusses it. It's called citation overkill,
also known as ref bombing. And what that means is that
(45:56):
someone is concerned that a claim they're putting
into an article is going to get disputed, so they
put a bunch of citations on it to try and
be like, hey, all these sources seem to say it,
so, you know, don't dispute this, it should be
in here. And so I think an example of this
(46:16):
actually is on the Wikipedia page for Hamas. You know,
there's a sentence or two that talks about
comparing the Hamas charter to the Likud party platform. It's
like two sentences, and I think one of them,
at least at one point, had like seven or eight citations.
Speaker 1 (46:32):
Oh, for, like, one sentence to have seven or eight citations?
Speaker 3 (46:36):
That's excessive, yeah. I think anything more
than like three or four is considered excessive.
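To make that red flag concrete, here is a rough, illustrative heuristic for spotting the citation overkill pattern described above: flag any sentence in raw wikitext that carries more than a handful of ref tags. The threshold of four follows the "more than three or four" rule of thumb from the conversation; the regular expressions are simplifications, and the sample text is invented for the demo, not taken from the live Hamas article.

```python
# A rough heuristic sketch for the "citation overkill" / ref-bombing red flag:
# flag sentences in raw wikitext with more than `threshold` <ref> citations.
# The regexes are simplifications and will not handle every wikitext construct.
import re

REF_PATTERN = re.compile(r"<ref[^>/]*(?:/>|>.*?</ref>)", re.DOTALL | re.IGNORECASE)

def flag_ref_bombing(wikitext: str, threshold: int = 4) -> list[tuple[int, str]]:
    flagged = []
    # Split on sentence-ending punctuation; crude, but enough for a demo.
    for sentence in re.split(r"(?<=[.!?])\s+", wikitext):
        refs = REF_PATTERN.findall(sentence)
        if len(refs) > threshold:
            # Report the citation count and the sentence with refs stripped out.
            flagged.append((len(refs), REF_PATTERN.sub("", sentence).strip()))
    return flagged

# Made-up sample wikitext, purely to show the heuristic firing.
sample = (
    "Several scholars have compared the two platforms"
    "<ref>A</ref><ref>B</ref><ref>C</ref><ref>D</ref><ref>E</ref><ref>F</ref>. "
    "A later claim has a single citation<ref>G</ref>."
)
for count, text in flag_ref_bombing(sample):
    print(count, "refs:", text)
```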
Speaker 1 (46:42):
So each of those is a citation, right? Yeah. Okay.
Speaker 3 (46:47):
Control-F "Likud." I think you can find it. Actually,
I think you had it. I think you had it.
Speaker 1 (46:55):
Oh yeah, I see, I see the one with, there's like
a bunch of others.
Speaker 3 (46:58):
Yeah, I mean, I don't know about that sentence per se.
Speaker 1 (47:00):
But look at that.
Speaker 3 (47:04):
Yeah, so if you Control-F "Likud," how do you spell that,
L-I-K-U-D, so Likud, that's
Benjamin Netanyahu's party in Israel. See, right there:
"Several scholars have compared Hamas's lack of recognition
of Israel to Likud's lack of recognition of a Palestinian state."
(47:24):
Look at all the citations, and I think there were
more at one point. At one point it was like
seven or eight. I guess they trimmed it down. But
look at the citations. One
of them is Peter Beinart, you know, who was
very much an Israel critic,
you know, Noam Chomsky. So
that right there is a tell of bias,
(47:47):
because they're making this very
general point by citing, you know,
very left-wing, anti-Israel figures, and they kind
of hide it by putting in all those citations.
Speaker 1 (48:01):
And I think one of them is a podcast. Right, those are famously very reliable,
I would know. Yeah, this whole paragraph is, like, excessively
cited all over.
Speaker 3 (48:13):
Yeah, so it shows that whoever wrote that was
presumably concerned it would get challenged, and maybe in some cases
they were challenged, and it was like, oh yeah,
I'm going to show you that this is worthy of inclusion
by adding in more sources. So yeah,
that absolutely is a tell. If you see that,
your alarm bells should be going off, like, huh, I wonder,
(48:36):
I wonder what happened here that led to this moment?
Speaker 2 (48:42):
Okay, that's something I would never have known or guessed. Yeah,
I think your instinct is, oh, it has tons of
citations, it must be true. Mm hm.
Speaker 3 (48:51):
Right, it might not be. And also you have to check the citations, because
sometimes the citation just doesn't
justify what the sentence says, or stuff is
left out. There's stuff in the sources
that didn't make its way into the article, like the
article on critical race theory. There are sources that
talk about critical race theory's Marxist roots that aren't
(49:13):
mentioned in the article at all. Well,
maybe it's changed, it's been a while since I
looked at this page. But, yeah,
I mean, there are sources, I'd have to go back
and find them, but at
least the last time I looked at this page, CRT's
(49:35):
Marxist roots were not mentioned, but some of the sources
do mention them. Ah, so okay, well,
I guess that's since been fixed.
Speaker 1 (49:51):
It has probably been fixed since people noticed it.
Speaker 3 (49:55):
Okay, yeah, but at one point it didn't.
Speaker 1 (49:58):
Yeah, And like, so.
Speaker 2 (50:01):
The editors are supposed to be summarizing the
source they cite. But what they're doing, you're saying,
is taking the parts of the source that they like
and leaving out the parts of the source that they
don't like, but citing it as if they summarized the entirety.
Speaker 3 (50:16):
Yeah, basically. Okay, so those are just things you have
to look for. You have to always check the
sources on Wikipedia and what they say.
Speaker 1 (50:24):
Oh, it's almost like we should never have gotten this
convenient with our information. Yeah, right, yeah, because
our culture is losing its grip on truth with this
kind of stuff, with censorship, with AI. It's not just
losing it, it's letting go on purpose.
Speaker 2 (50:43):
That's also true, but even for those of us
who care about the truth. I was just talking
to one of our fans in DMs on Discord,
and he was talking about how he's just kind of
crashing out over the AI thing, of like, I just
can't tell what's real anymore and it has my head spinning.
And even people who deeply care about the
(51:04):
truth and finding it are struggling to parse that out.
Speaker 1 (51:08):
And I don't, I don't have a solution. Do you
have a solution?
Speaker 3 (51:12):
I really don't. That is way above my pay grade.
I just want to say, about AI, since we talked
about AI, ChatGPT: ChatGPT relies a lot on
Wikipedia for its information. But what I think is
interesting, though, is that, and this is a problem that
goes beyond Wikipedia, this is more an AI problem in general,
(51:34):
ChatGPT doesn't always cite its sources. And so
what's happening is that it relies on public information
for its sources, like Wikipedia, news sites, but now it's
taking away traffic from those sources. And so obviously,
you know, news sources, Wikipedia, all these
websites rely on traffic, you know, to pay the bills,
(51:57):
and admittedly Wikipedia is a nonprofit site, but,
you know, you get what I'm saying.
So now AI is sort of taking
the traffic away from these sites, because everyone's just
looking at what ChatGPT says. Yeah, and so it's like,
(52:17):
what is AI going to have long term,
you know, if news sources aren't able to get enough
traffic to pay the bills because people are looking at ChatGPT?
Well, I don't know what the answer is.
I mean, I've heard it suggested that maybe ChatGPT
should be paying for licenses from these
(52:39):
news sources. I don't know. I mean, it's above my
pay grade. But I think as
AI becomes more and more a part of our culture,
you know, these are questions that we need to grapple
with as a society.
Speaker 2 (52:53):
So you're almost saying that it is
possible that AI could kill its own source material by
beating it out of the market, and then
it would kill itself.
Speaker 3 (53:06):
Yeah, I know, that's the irony of it. So,
I mean, look, that's something that's going
to be figured out at some point.
Speaker 2 (53:15):
Wow, that's really... okay. Yeah, there have been so many really
interesting things that we've covered. Have we missed anything that's
important about your work that you want to share?
Speaker 3 (53:26):
I could talk about Wikipedia for hours, honestly. I would
just say that, for people to understand the editing
base of Wikipedia, I've heard that anywhere from two
thirds to three quarters of Wikipedia's editing base is left-leaning,
and it's usually either, you know,
younger Gen Z tech types, or, you know,
(53:49):
weird old Marxist Internet dudes, you know. And actually,
Wikipedia editors are predominantly European; Americans are a minority
on Wikipedia in terms of its editor base. And
so I think once you sort
of realize who all these people are,
it's like, I mentioned the tech types, but
(54:11):
there are also a lot of academics who edit Wikipedia too,
and we know how, you know, very left-wing
and anti-Israel academics are generally. So once you
start to look at Wikipedia through that lens, you start to
understand why it reads the way that it does. And
again, as we've discussed on this show, it's
been under the radar for so long. I
(54:33):
really think there need to be congressional investigations about this. You know,
let's get the hearings going,
you know, let's see Jimmy Wales go up there and
have to defend, you know, that terrible sentence in
the Zionism article, right? You know, I mean, with
the universities, nothing changed on the campuses
until those university presidents got hauled before Congress and had
(54:54):
to answer for what was going on on campus, and
Congress put them on blast in front
of the whole world.
Speaker 2 (54:59):
So yeah, and it's necessary, I absolutely agree
it's necessary. And it also is tough, when something is
supposedly, like, crowdsourced and free speech, to have the government
haul you up and make you answer for your
free speech.
Speaker 1 (55:15):
It gets sticky.
Speaker 2 (55:18):
Yeah, I think our founding fathers had no concept
of, like, the Internet and all of these problems coming
from it, right?
Speaker 3 (55:25):
But I think part of the issue
at hand here is Wikipedia's tax-exempt status, because
it's a nonprofit, and so that's sort
of where some of the DOJ inquiry kind of
stems from. It's like, does its tax-exempt status need to
be reviewed? You know, and so that would be where
(55:48):
it could be within, like, Congress's jurisdiction.
Speaker 1 (55:52):
Yeah.
Speaker 2 (55:53):
Yeah, I think these are sticky issues,
but we shouldn't avoid them just because they're sticky.
Speaker 3 (55:59):
Yeah for sure.
Speaker 2 (56:00):
And yeah, wow, lots of really interesting stuff to think
about and see how it plays out down the line.
Where can our listeners find you find your work?
Speaker 3 (56:13):
Follow me on X at Bandler's Banter; it's the same
handle on Instagram. I'm also on Facebook and LinkedIn.
I'm currently a national reporter for JNS, the Jewish News
Syndicate, so you can read my work there. I also,
as I said, have a byline at RealClearInvestigations,
where I have an article published on Wikipedia that talks
(56:34):
about some of the things that we mentioned here, and
you can also read my past work at the Jewish Journal.
I've done a lot of articles in the Jewish Journal
about Wikipedia and about other topics like anti-
Semitism on the campuses and so forth.
Speaker 2 (56:49):
So yeah, I have linked just about all of that
in the description. So before we have you go, it
is very important, and on topic, I think, that
we play the Wikipedia Hitler game.
Speaker 1 (57:08):
Please tell me you've played this before.
Speaker 3 (57:10):
I haven't. I've heard of it, but I haven't.
Speaker 1 (57:12):
Okay, it's really, really so fun. So for everybody
in chat, if you don't know, the challenge is to
get, using the links inside Wikipedia, from a random page
to Hitler in as few clicks as possible, and you
should be able to get it in under five. So I'll
(57:35):
leave it to you. What page should we start on?
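For anyone who wants to cheat at home, here is a sketch of the same game done by brute force: a breadth-first search over article links using the public MediaWiki API. This is not how the hosts play it on stream; it is just an illustration under simple assumptions, it only follows main-namespace links, and a depth of two or three can already mean a lot of API calls, so treat it as a toy.

```python
# Breadth-first search over Wikipedia article links, in the spirit of the
# "get to Hitler in as few clicks as possible" game described above.
import requests
from collections import deque

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "wiki-click-game-demo/0.1"}

def links_from(title: str) -> list[str]:
    """Return main-namespace article titles linked from the given page."""
    titles, params = [], {
        "action": "query", "format": "json", "prop": "links",
        "plnamespace": 0, "pllimit": "max", "titles": title,
    }
    while True:
        data = requests.get(API, params=params, headers=HEADERS, timeout=30).json()
        for page in data["query"]["pages"].values():
            titles += [link["title"] for link in page.get("links", [])]
        if "continue" not in data:
            return titles
        params.update(data["continue"])  # standard API continuation pattern

def click_path(start: str, target: str, max_depth: int = 3):
    """BFS for a chain of links from start to target, up to max_depth clicks."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if len(path) - 1 >= max_depth:
            continue
        for nxt in links_from(path[-1]):
            if nxt == target:
                return path + [nxt]
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no path found within max_depth clicks

if __name__ == "__main__":
    print(click_path("Bubonic plague", "Adolf Hitler", max_depth=2))
```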
Speaker 3 (57:37):
Oh, honestly, this is not a good
one for me to do. Okay, I've never played
this game before and I have no desire
Speaker 1 (57:45):
to. So okay, okay, well, if that's
the case and you're uncomfortable, we could go ahead
and say goodbye. I apologize.
Speaker 2 (57:57):
It's okay, okay, all right. Thank you so much for
coming on. And oh hey, J Brook in chat.
Speaker 3 (58:09):
All right, farewell, thank you so much for having
me on.
Speaker 1 (58:14):
Of course, ok well that I had a really fun
time with.
Speaker 3 (58:25):
That.
Speaker 1 (58:25):
There's so much about Wikipedia that I've
not known before, that I hadn't thought of
in the past. I do want to play the Wikipedia
Hitler game. Chat, or Liz, would you like to give
me a page to start on? The bubonic plague? Are
(58:48):
you sure that we haven't done this one before? We've
probably done this one before. I mean, like, no, we'll
do it. Do you know who you're talking to here?
We'll do it.
Speaker 2 (58:56):
Okay, let's see, so we've got bubonic plague. Usually I
try to.
Speaker 1 (59:03):
Look for are we trying to race against each other?
Are we trying to race against each other? We can well,
not so much race, but I usually try to look
for a oh who, okay, I'm gonna I'm gonna link
to the World Health Organization. Okay, I am going to
(59:24):
click European Economic Development. Oh, that's probably better than mine. Potentially,
I have. That's my next thing that I'm also clicking
because I found it account. And then let's see the name.
(59:48):
Where's Hitler? You're so bad at this? All right? Chat?
What do you think Abby got to it first? So
even though I could in the same metal clicks, she
wins that one? What what page? What page should we do?
We'll play maybe one or two more. I think we
should do feet in honor of Annie, Feet in honor
(01:00:09):
of Annie, but Karma Kats's fascism. That's gonna be a
fast one. No, no, it's way too easy. Feet. Okay,
I'm sharing, Yes, I'm sharing my screen. You can actually
get to it pretty fast from Penguins.
Speaker 2 (01:00:26):
Just from penguins? I've done... I think you probably got
from penguins straight to Germany somehow, I remember. Yeah,
it was very fast.
Speaker 1 (01:00:36):
I think... Let's see, and this one's hard. There's a
lot of just... ah, Maryam, I definitely do. I'm
(01:00:58):
looking at penguins; you can definitely get from penguins
directly to Argentina. Oh, that's how. Just checking to see
if there's any other more promising things.
Speaker 2 (01:01:06):
Okay, I think we're gonna go from feet to the
Māori or Mari or whatever, the Māori tribe.
Speaker 1 (01:01:13):
Yeah, and then we will... Western culture, Western... your mom.
Polynesian languages. I can also click on France instead of Australia.
(01:01:37):
Which do you think would take me faster? Probably France.
Did you go from feet? No, no, I started with penguins.
I apologize, a different race, but I should have started
with feet. We'll do that next time.
Speaker 2 (01:01:52):
We'll all start the same way. Instead of continuing
to skim for something better, I will do Western culture. Wait,
this is feet to Māori and then Māori to
Western culture. So that's two clicks.
Speaker 1 (01:02:09):
Is it cheating if I do Control-F for certain words? Yes,
I won't do that. I'm pretty sure it is. Hellenistic
Judaism is potentially the one I should click on
next. Slater says he made it from a random Barbadian
cricketer to Hitler in four clicks. You're supposed to start
(01:02:32):
with a random article from the English landing page. Okay,
it's more fun when somebody challenges you to a page. Also,
Variable gifted five subscriptions to, I think, you guys in
chat, and I got one? Thank you. Oh, so now
I am subscribed. Let's see who... Annie got one, Carmie
(01:02:53):
Cat got one, Jay Brooks got one, Melody got one,
and Lorna Dude Eleven got one. Thank you, thank
you, Variable, I really appreciate that. Thank you. The drones
go straight to their houses. Yeah, yeah, for now. Okay, I
should just go to Hellenistic Judaism because that's gonna
(01:03:14):
take you there pretty fast. Ooh, this leads to the
Black Death. I'm just gonna get sidetracked and go there.
Actually, let's see, history. We could also choose another
subject that's more random to see how fast we can
get it. So instead of Hitler, be like, how fast
can you get from point A to platypus? Because it's
going to be longer, but you have to be more
creative, and bonus points, people are more comfortable and we
(01:03:38):
don't have to talk about Hitler. It's up to you.
Sorry, I forgot how many clicks I'm on.
I think I'm on my first click. I'm just still
trying to...
Speaker 2 (01:03:50):
I'm trying to get like a reference to Germany or something,
a reference to... your mom. All right, Western culture.
We're gonna go Hellenistic Judaism. That's three clicks. Okay,
late antiquity, really. You know what, this was probably
really dumb, because this is too old. This one is
(01:04:12):
potentially too old to link to anything, skill issue.
What if I start on feet now and see... oh,
oh, I saw.
Speaker 1 (01:04:20):
I saw something. Hang on, okay, I saw something on the Holocaust.
Hang on, wait, this one, I saw it
in the Holocaust in Greece. Hm, okay. I'm
(01:04:43):
really hoping I can do this in two clicks, but
I don't know if that is going to happen.
Come on. Oh, this is way harder than... Look
what I found. Okay, so I did not find
Hitler directly, but I found Nazi Germany. Do you think
that's gonna be a good thing to click on?
Speaker 3 (01:05:04):
Yeah?
Speaker 1 (01:05:04):
You can probably... I could click... found him. Okay, I
was on... sorry. I went penguin, uh, Fritz,
Nazi Germany, what's-his-face... weird. Okay, I'm going
to go to genocide. Oh, okay, and then perpetrators. Perpetrators,
(01:05:28):
let's see.
Speaker 3 (01:05:30):
Man.
Speaker 1 (01:05:30):
I don't know if it is actually gonna link. Abby
is revealing herself to be... I don't know why this
feet one was really hard, guys. You reckon? That one
costs five, and then Hitler's going to be in here,
I just haven't seen his name, so six. So
that's technically more than you're supposed to do.
So we have UFOs as a suggestion. That's way too easy.
Speaker 2 (01:05:52):
We'll show you, I'll show you. All right, this is
the last one: UFOs, unidentified flying objects.
Speaker 1 (01:06:05):
Another Rumble Rant, also from Variable, five dollars. Also,
if you click on the first link of an article
over and over again, you will always end up on
the philosophy page. We'll have to test that. I'll test
that next. We'll be doing that starting at Hitler. Well,
actually, you should wait to do it so
that we can do it showing it... your mom. Uh,
cultural, probably, after World War Two. UFOs goes
(01:06:26):
directly to World War Two. Oh okay, so that's pretty fast. Yeah,
and Hitler's right there. So all right, let me
show you. So you said, if you click... let's start
with Hitler, because that would be really funny. Uh, if
you click the first link, you said, it always goes
to philosophy. That's what he said. Yeah, but is it
the first link, like the very first one you see?
Because, like, we're talking about that. I think it's the
(01:06:47):
first in the article, right. So, Adolf Hitler: the first
link is Hitler disambiguation. Your mom... is about... oh, there's
a link. So the first link in the article proper,
let's do that, where it's starting out after, so instead
of that introductory stuff. That's... yeah, that's linguistic, okay,
in the Nazi period, and then German State, and then
(01:07:13):
German, and then West Germanic language, and then Germanic, and
then Indo-European language family, language family, languages, communication, information, concept, rules, proposition, philosophy,
(01:07:36):
language, 'for other uses, see Philosophy'. We got it.
How did you get... where did...?
Speaker 3 (01:07:45):
Where?
Speaker 1 (01:07:46):
Did I...? Okay, you hit it: philosophical, 'for other
uses, see Philosophy'. Yeah, and then I hit Ancient Greek.
Ancient Greek is the first link in philosophy. But we're
already on philosophy. Oh, I got excited and kept going.
I forgot the target because I was just swept up
in the moment. That is incredible. I did not realize that.
Oh, I wish we had played this one with him.
(01:08:07):
Let's do this again, but with a different starting point.
Let's try tree frog. Okay, here we go. Frog, semi-aquatic, biology,
scientific study, scientific theory, natural world, space, three-dimensional, geometry,
(01:08:36):
Greek language, Greek language, Indo-European language, Modern Greek. Oh,
okay, yeah, you're right. Endonym, your mom's... oh okay, let's
say name, name, entity, entity, exists, real, well, reality, aggregate,
(01:09:02):
aggregate data, data, data values, semiotics. Sorry, I'm getting excited. Semiotics, semiosis,
semi... Ancient Greek. Oh, are we back to Ancient Greek?
But, uh, I feel like from Greek language, instead of clicking
on... whoa, I go for Indo-European language. Oh, you're right,
(01:09:24):
because this is the language self-reference.
You do have to fiddle with it a little bit
to make sure you don't get in a recursive loop,
clicking on the thing that's just the same as
the thing. But rules, okay, premise, proposition, trying to think
(01:09:45):
of language. I have an idea for choosing a... Someone in chat
says they made it to philosophy from the Barbadian
cricketer in twenty clicks. Philosophical, and there
we go. I'm doing a random topic generator to make
this fair. And the chosen topic is... I'm just going to click
a random link on your computer. Hopefully it
(01:10:06):
won't... Oh, marionette, marionette. I had one made
in Mexico, a marionette. I think I still have it,
but it's been... okay. So, what a marionette is: puppet,
puppet, first one, mythical figure, supernatural, nature, repeated, repeated, just
(01:10:28):
a little off. Or scientific method, scientific method, empirical evidence, proposition,
philosophy of language, philosophical. There we are. That was pretty fast.
That was pretty fast. Okay, we're gonna play this for
(01:10:48):
the rest of the night. No, okay. You guys don't love...
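The "keep clicking the first link and you land on Philosophy" trick demonstrated above can be tested with a short script. This is a minimal sketch under some assumptions: it uses the MediaWiki parse API to fetch rendered article HTML, and a rough heuristic for "first link in the article body", namely the first /wiki/ link in a body paragraph that is not inside parentheses. Hatnotes, italicized terms, and the other edge cases of the usual rule of thumb are not handled.

```python
# Rough sketch: follow the "first link" of each article and see whether the
# walk reaches the Philosophy page. Heuristic only; hatnotes, italics, and
# other edge cases of the usual rule of thumb are not handled.
from urllib.parse import unquote
import requests
from bs4 import BeautifulSoup

API = "https://en.wikipedia.org/w/api.php"

def first_body_link(title):
    """First /wiki/ link in a body paragraph that sits outside parentheses."""
    data = requests.get(API, params={"action": "parse", "page": title,
                                     "prop": "text", "format": "json",
                                     "formatversion": 2}).json()
    soup = BeautifulSoup(data["parse"]["text"], "html.parser")
    for paragraph in soup.find_all("p"):
        depth = 0  # parenthesis nesting depth seen so far in this paragraph
        for node in paragraph.descendants:
            if isinstance(node, str):
                depth += node.count("(") - node.count(")")
            elif node.name == "a" and depth == 0:
                href = node.get("href", "")
                if href.startswith("/wiki/") and ":" not in href[len("/wiki/"):]:
                    return unquote(href[len("/wiki/"):]).replace("_", " ")
    return None

def walk_to_philosophy(title, max_steps=30):
    """Repeatedly follow the first link; stop at Philosophy, a loop, or a dead end."""
    path = [title]
    while path[-1] != "Philosophy" and len(path) <= max_steps:
        nxt = first_body_link(path[-1])
        if nxt is None or nxt in path:
            break
        path.append(nxt)
    return path

if __name__ == "__main__":
    print(walk_to_philosophy("Tree frog"))
```

It will not reproduce the on-screen clicks exactly, since the real convention also skips italicized and red links, but it is usually enough to see the convergence toward Philosophy that the hosts stumbled onto.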
Speaker 2 (01:10:52):
One thing I've been thinking about a lot, going back to
serious things, is what we do about truth,
what we do about the apprehension of truth. It is becoming
more and more difficult to evaluate people's claims, and you
see something over and over again. And I've found sometimes,
even though I know what I believe and I know
what I stand for, I will see a claim over
(01:11:14):
and over and over again, and it starts to get
into me in a way. Or I'll be treated a
certain way over and over and over again on the internet,
and I'll catch myself getting ashamed of something that
I know I'm not actually, you know, I don't
want to be ashamed of it. It doesn't align with
my beliefs. It's just the repetition is getting
(01:11:35):
into me. And that's something that Hitler talked about, of
if you repeat something long enough and often enough and
all of that, I'm not quoting him correctly at all,
And I don't.
Speaker 1 (01:11:45):
Know. I can definitely tell you aren't quoting him correctly, but that.
Speaker 2 (01:11:48):
You can make somebody believe it. And so it's the
repetition of lies. It's the bad sources, it's people believing
their heroes and whatever their heroes say. Even though their
heroes have an incentive structure of money and clicks, they
don't have an incentive structure that necessarily points to honesty.
(01:12:09):
Integrity isn't necessarily rewarded in these incentive structures, and we
are very quickly, as a society, losing all discernment,
losing all wisdom when it comes to truth. But also
with AI and with AI video and AI images, it
is becoming harder and harder to know what is real,
(01:12:30):
even if you're seeing it with your eyes. And I
think that more and more we're going to have a
retreat back to people only believing what they see with
their own eyes in real life, because the written word
is suspect. Photographs are suspect, videos are suspect. There was
one I saw the other day that really surprised
me, because I had shared it. It was a video of
(01:12:50):
an undersea creature, and I got a notification from Wikipedia later,
not Wikipedia, from X later, that it had been Community
Noted, and I was like, what? So I went, and
the community note said, this is an AI video
that was developed off of an image. So it was
a front-facing image of this weird fish. Yes, it
(01:13:12):
was a weird fish thing, and so the AI video
was developed from the front-facing fish. But when it
turns side to side, what the AI guessed
it looked like from profile was completely wrong. And I
was like, I would never have known that from the
video it looked completely real because it was based on
something real.
Speaker 1 (01:13:31):
Yeah, because we've seen images of that kind of fish before.
Speaker 2 (01:13:35):
Right, and so it's just, even something so
small where I would never have thought to question it.
And you could argue that it doesn't particularly matter, but
our connection to truth and reality, and that
grounding of our senses and what we can know
to be true... more and more, I think people
are falling back, even me, even as intentional as I'm
(01:13:58):
trying to be. I think we're falling back on just,
I don't know what to believe, so I'm going
to believe what I want to. I'm going to believe
what already confirms my priors. I'm going to believe this
and that. And it's tough to vet source material. It's
tough to evaluate things, especially if it's something that you
don't want to hear.
Speaker 1 (01:14:14):
Yeah, because you don't want to hear the alternative.
Speaker 2 (01:14:16):
Yeah, yeah, Annie is completely right. She says, I feel
like lots of arguments I see in real life and
online are based on the two parties believing different news sources,
kind of like at school when the little kids are like, well,
my dad says that he can lift a million pounds,
and the other person's like, well, my dad says that
that's not possible. Right, yeah, yeah, you're fighting over
(01:14:39):
source material. And one thing that really disturbs
me is that you can demonstrate proof positive that a
source has lied, and people will keep quoting it and
citing it and trusting it.
Speaker 1 (01:14:53):
And there's like a.
Speaker 2 (01:14:56):
A stubbornness in the way that people stick to their
sources, even though those sources have been proven to
lie over and over and over, and they'll be like, well,
I really like this influencer, I'm going to... you know,
they made a mistake one time, or no, they never lied,
or... there's just a dissonance. And I think
in a lot of ways, it's people not caring about
(01:15:19):
truth anymore, not caring about reality anymore, wanting to believe
what they want to believe, not wanting a source to
actually tell them what to believe. But like I said earlier,
you can deeply care about the
truth and still be struggling to apprehend what it is
and to parse it out. And it's just
a troubling thing, and these are troubling times. Yeah,
(01:15:42):
you almost have to default assume that anyone on the
internet is lying to you. Except for us, we would
never, like, we would never ever ever. Yeah, you have
to default assume, you have to. I think it
should force people to be more discerning, it should force
people to be more thoughtful, more careful, more intentional,
to slow down and evaluate things. But instead I think
(01:16:06):
that people just go to more and more convenience.
So, like, well, if I can't tell what the truth is,
I'm just not gonna... it's too much work.
Speaker 1 (01:16:14):
That's my butler's job. I'm just gonna believe whatever I
want to. I'm gonna believe my truth. Well, I
think what I've seen people do is, often they'll
be like, this one YouTuber has this theory
of everything and you should go to them. And so
they've outsourced their thinking to one specific person that they
filter it through. They're like, well, this person does
the fact-checking for me, so I'll just think what
they think. Right, and not to put anyone
(01:16:34):
on blast, I'm not going to name any names, but
somebody had kind of reached out saying, like, I
rely on you guys to tell me what
sense to make of things, to
tell me, make it work, make it
make sense, make it slide into place. And I
(01:16:55):
will... My thought on it was just, I appreciate it,
I'm honored, absolutely, but you shouldn't be outsourcing your, you know,
connection with truth to us to lead you forward,
if we can be helpful in keeping you.
Speaker 2 (01:17:15):
Connected with truth, absolutely, but yeah, I think that it
hurts your brain. It really hurts your brain when you
have a hard time figuring out what the truth is.
And it is so tempting to go lean on somebody
to just tell you what to think at any given time,
somebody you trust, who's just going to go, here are
the talking points, here are the lines, here's what you believe,
(01:17:38):
and then to walk away with it. And the only
thing you can do that with is scripture. The only
thing you can do that with is God. So when
you are flailing about and needing, needing.
Speaker 1 (01:17:51):
Some help, pray and open your Bible and read. That's
what I will tell you to do. And keep listening
to us, because we're like... you know how cleanliness is
next to godliness? Yes, this podcast is next... No, I'm
not going to blaspheme. We take showers at
least once a month, whether we need them or not,
so we're... yeah, is that what we're talking about?
Speaker 2 (01:18:11):
Anyway, a little bit of a shorter show tonight, guys.
I am very tired, and there's nothing wrong with
being short; I've been short on sleep lately. I think that
tomorrow night's episode might be a little bit longer than
typical.
Speaker 1 (01:18:27):
No, no promises.
Speaker 2 (01:18:28):
But I think I pulled together all the notes today,
and there's a significant number of reels with a
good amount of time on each of them, and
there's a bunch of stuff I want to discuss around
this perhaps-miracle of Forrest Frank's back being healed. So
I'm really excited to talk about it. I hope you
guys can be there with us on Locals.
Speaker 1 (01:18:49):
Can I just have it on record? The name Forrest Frank
is oddly upsetting, and I can't explain that. You can't
explain it? I don't think so. How about
I pray about it and figure it out and have something
to say about it tomorrow? Okay, will do.
Speaker 2 (01:19:03):
Good night, everyone, love you, thank you so much for
being here. And thanks again to Aaron Bandler for coming on.
He talks very fast; I love when somebody loves a
topic and they can just go on about it forever.
Speaker 1 (01:19:15):
Very fun. All right, good night, Cup of flo