
June 20, 2019 68 mins

In Episode 67, Robert is joined by Sofiya Alexandra to discuss how YouTube is, in fact, a bastard.

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Mm hmm, what, severing my tumors? I'm Robert Evans, host of Behind the Bastards, the podcast where we tell you everything
you don't know about the very worst people in all
of history. Uh. Here with my guest Sophia, co-host
of Private Parts Unknown, and we're talking about how it's
bullshit when doctors won't let you keep the pieces of

(00:20):
your body that they take out of you. That's really frustrating. Yeah,
that's like the least they could do for you. It's
an infringement of your civil liberties. Like that tumor or
whatever is still a piece of you, and you deserve
to like go get drunk on a farm and shoot
it with a shotgun if that is your choice. Ah. Yeah,
I wanted to just keep it forever to kind of
always like point to it and be like, yeah, I

(00:41):
beat you a little bit. And they won't let me fucking do it. They wouldn't let me keep my breast cancer tumor and my chemo port, which, I'm like, that was part of me for a year, you know? Why? That's so frustrating. Like, okay, this message is going out to Sophia's doctor. Kudos on the cancer curing

(01:04):
blah blah blah, dick move not letting her keep her tumor. And I'm very angry about this. Uh, please, a write-in campaign, listeners, just contact my doctor at... I'm just... we're gonna make... not important. We're gonna make Sophia's tumor her legal possession again. It's gonna be Make Sophia's Tumor Sophia's Again. Yeah,

(01:27):
there we go. Well, today's subject has nothing to do with tumors or cancer. Other than, you could argue, today's subject is a cancerous tumor metastasizing in the body politic of our nation. Wow. Talking about YouTube. That's a beautiful

(01:48):
metaphor for a website that most people just use for
jerking off. Hey, jerking off and not paying for music. Oh, that's true. That's true. There's one other thing: YouTube tutorials. It's useful for jerking off, makeup tutorials, free music,
and of course filling the world with Nazis. Again, as

(02:11):
a Jew, I love to hear that. That's the aspect
of YouTube we will be talking about today: its Nazi-reinvigorating aspects. Now, it's so fun to leave
the former USSR because it's not great for the Jews.
And then get here and then Donald Trump becomes president
and you're like, Okay, that's a great that's a good joke,

(02:33):
that's very funny. And then the Nazis spread through YouTube
so they're just everywhere, and you're like, Okay, well, I
guess I'll just live in fear forever. I will say
one of the best things that, like the few things
I actually got out of college was taking Holocaust studies
courses and coming to like the dawning realization, like, a kid who was raised in like a Republican household, where

(02:55):
like everything you heard about the Holocaust was how awesome it was that American soldiers stopped it, like reading about history and coming to like the gradual realization, like, oh, it's always sucked to be Jewish everywhere, like everyone's killed these people. Like, oh my god, it didn't start with the Nazis, like reading about what
happened in Czarist Russia, the Khmelnytsky massacres, which

(03:18):
killed like seven thousand people, and like, yeah, this shit has not been good for us for a long time. And now we're talking about digital pogroms. Yeah, exactly. It's
just nice to know that you cannot escape the Nazis.
Yeah yeah, that is the message YouTube has delivered to
all of us, along with allowing me to listen to

(03:39):
old Kris Kristofferson concerts for free. Um, hey man, motherfucker made some great music. All right, I'm gonna start with my prepared remarks, if that's okay. Please. In March of two thousand sixteen, Microsoft unveiled a new chat bot to the serried denizens of Twitter.

(04:00):
The bot was an experiment in what Microsoft called conversational understanding. Tay, the chatbot, would engage in discussions with real people and learn from them, evolving and changing from its interactions
just like real people do. Yeah yeah yeah. As they
released Tay into the wild, Microsoft said they hoped that
Twitter users would be happy to engage it in casual

(04:21):
and playful conversation. Tay entered the world at around ten
am Eastern Standard time. At two am the following morning,
less than twenty four hours later, it tweeted this: "Bush did nine eleven, and Hitler would have done a better job than the monkey we have now. Donald Trump is the only hope we've got." It just took that little

(04:42):
time to learn. But you knew it was a bad move when they opened it up. You knew, it's kind of America. One of the surprising stories or arcs of the last decade is Microsoft going from like this evil corporation in everyone's eyes to like this innocent summer child. Like they never

(05:05):
tried to steal my data, They never lobbied to stop
me from being able to repair my computer. They just
they believed they could make a chatbot and the Internet
would teach it how to be a real boy and
a Nazi. And they were so horrified, Like I just
I can't I can't believe that someone had positive hopes
for that. I mean, how many people have you met

(05:26):
in life online that you would think that that was going to end up well? I think it's because Microsoft's team were all old heads. Like it was a bunch of guys in their fifties who like didn't know the Internet as anything but like a series of technical things. They weren't active Twitter users or whatever. They didn't go on the 'Gram. Well, you learn very quickly that if you upload a video of yourself

(05:47):
doing stand-up, how many "you look like a kike"s you're gonna get right away. I mean, the learning curve is... yeah, and it's a learning curve a lot of companies have had. I can remember back in two thousand twelve when Mountain
Dew decided to let the internet vote on the name
of a new soda flavor, and 4channers flooded in. Before long, the top vote getter was "Hitler did nothing wrong," which, I will admit, rolls right off

(06:12):
the tongue. It would be kind of interesting to see that soda marketed in the 7-Eleven. But yeah, especially when you picture the fact that, like, Sprite's spokesperson is like, uh, isn't it Vince Staples? Yeah?
I think so. Yeah. So it's not really generally ever

(06:35):
sold by people that are so uncool that they think
renaming soda is some kind of I don't know, forward
thinking movement of their philosophy. Yeah, and in both of
these cases, Tay and Mountain Dew's uh new soda flavor

(06:57):
vote, if you'd gone to either of us in two thousand twelve or in early two thousand sixteen and said, we're going to do this, what do you think will happen? I think we both would have said, I think every listener of this podcast would have said, oh, it's gonna get real Nazi, like immediately, like it's going to turn into a Nazi, because that's just what people on the internet think is funny. Uh, and that's
going to happen. But like you know, older folks, people

(07:20):
who you know are focused more on living that life
of the mind off of the Internet, they didn't anticipate
that sort of stuff, and there really wasn't much of
a harm ever, you know, in either that Mountain Dew
contest or in the Tay chatbot. Like, Tay was a public-facing AI, it was never in control
of something. But the question of its radicalization does lead

(07:40):
to the question, what if another company built an AI
that learned in that way that wasn't public facing, and
what if that company trusted the AI to handle a
crucial task that operated behind the scenes. And if that
were to happen, I think it might look an awful
lot like what we've seen happen to YouTube's recommendation algorithm.

(08:02):
Um I think I'm not the first person to make
this comparison that what had happened to YouTube's algorithm over
the last few years is what happened to that chat bought.
But since no one interacts with YouTube's algorithm directly, it
took a long time for people to realize that YouTube's
recommendation AI had turned into Joseph Gebel's, which is I

(08:22):
think where we are right now. Um, so that's what today's episode is about. Yay! I'm glad that it's not about dead babies, because I know you love to do that shit to me. It does get in a little bit on hurting babies. No, are you kidding me? A bit? Stop getting me here under false pretenses. Stop it.

(08:48):
I feel like at this point, you know, if you're
coming on behind the bastards, some babies are going to
get harmed. Okay, I assume sometimes maybe people just murder adults. That's what I was hoping for coming in today. I was like, there's still more adult murder than child murder. But no, we always have to get minors involved if it's me, don't we, Evans? Well, the murders involved in

(09:10):
this episode were all adults. The molestation involved in this episode involved children. So much! That's a step... so much. No, the Georgia Tann one had child murder and child molestation. Well, now it's adult murder and child molestation... well, child pornography.
Will you let me live a life full of just

(09:31):
adult murder? You know what, Sophia, I'll make this promise
to you right now over the internet. When we do
our one yearly optimistic episode about a person who's not
a bastard this upcoming Christmas, I'll have you on as the guest for that one. Fuck yes, I can't wait. Um,
And hopefully the irony of that episode will be that
very shortly thereafter we'll find out that person is also

(09:52):
a bastard. Yeah, it'll be the story of the person
who saved a thousand kids by killing nine hundred. Still, it's like, net gain. That will be exactly your pitch when that happens. I'm already googling. Yeah, you're like, how can we... Let's get back to YouTube. As I write this,
the Internet is still reeling from the shock waves caused

(10:14):
by a gigantic battle over whether or not YouTube should
ban conservative comedian, and I put that in air quotes,
Stephen Crowder. Now, if you're lucky enough to not know
about him, Crowder is a bigot who spends most of
his time verbally attacking people who look different than him.
He spent several months harassing Carlos Maza, who makes YouTube
videos for Vox, calling Maza a lispy queer, and
a number of other horrible things. Uh. Crowder has not

(10:36):
explicitly directed his fans to attack Carlos in real life,
but Crowder's fans don't need to be told to do that.
When he directs his ire at an individual, Crowder fans
swarm that individual. Uh. Carlos is regularly bombarded with text messages, emails, tweets, etcetera,
calling him horrible names, demanding that he debate
Stephen Crowder, telling him to kill himself, doing all the
kind of things that sociopathic internet trolls like to do

(10:58):
to the targets of their ire. Now, Carlos, on Twitter,
asked YouTube to ban Crowder, and he pointed out specific
things Crowder had said and highlighted specific sections of YouTube's
terms of service that Crowder had violated. UH. YouTube opted
not to ban Crowder because Crowder has nearly four million
followers and makes YouTube a lot of money. UH. There

(11:18):
has been more dumb fallout. YouTube demonetized Crowder's channel and
then randomly demonetized a bunch of other people so conservatives
couldn't claim they were being oppressed. And it's all a big, gross,
ugly mess. But the real problem here, the issue at
the core of this latest eruption in our national culture war,
has nothing to do with YouTube's craven refusal to enforce
their own rules. Stephen Crowder would not be a figure

(11:40):
in our nation's political discourse if it weren't for a
series of changes YouTube started making to their algorithm in
two thousand ten. Now, YouTube's recommendation algorithm is, you know, the thing that recommends the next video that you should watch. It's
why if you play enough music videos while logged in,
YouTube will gradually start to learn your preferences and suggest
new music that you often like. It's also why teenagers

(12:02):
who look up the Federal Reserve for a school report
will inevitably find themselves recommended something that's basically The Protocols of the Elders of Zion with better animation. Oh my god. Yeah, yeah, it's both of those things. Well, but the animation is good. Okay, they're putting some money behind that anti-Semitism. Yeah,
it's a it's it's it's a mixed bag. On one hand,

(12:23):
I learned about the music of Tom Russell, who's a
musician I very much enjoy. Now. On the other hand, uh,
there's thousands more Nazis, so really, a pretty even exchange, I'd say. Yeah, fair mix. Yeah, it's a good trade. Yeah. Now, I do
really like Tom Russell's music, but that's not the important thing.

(12:45):
Will Tom Russell be offended? Okay, let's make sure he's fine. Yeah. Now, YouTube's recommendation engine was not always
a core part of the site's functionality. In the early
days of YouTube, in two thousand six or seven, or
eight or nine, most of the content was focused around
channels a lot like television. People would search for what
they wanted to see, and they would tune into stuff

(13:05):
they knew that they liked. Unfortunately, that meant people would
leave YouTube when they were done watching stuff. I'd like
to quote now from a very good article in The
Verge by Casey Newton. He interviewed Jim McFadden, who joined
YouTube in two thousand eleven and worked as the technical
lead for YouTube recommendations. Quote. We knew people were coming
to YouTube when they knew what they were coming to
look for. We also wanted to serve the needs of

(13:26):
people when they didn't necessarily know what they wanted to
look for. Casey goes on to write, I first visited
the company in two thousand eleven, just a few months
after McFadden joined. Getting users to spend more time watching
videos was then, as now YouTube's primary aim. At the time,
it was not going particularly well. YouTube dot com as
a homepage was not driving a ton of engagement. McFadden says,
we said, well, how do we turn this thing into

(13:48):
a destination? So YouTube tried a bunch of different things.
They tried buying professional gear for their top creators to
increase the quality of YouTube content, but that just made
YouTube more enjoyable. It didn't make the service more addictive.
So in two thousand eleven, they launched Leanback. Now, Leanback would automatically pick a new video at random

(14:08):
for you to watch after you finished your old video.
Leanback became the heart of the algorithm we all know and many of us hate today. At first, Leanback would select new videos for people to watch based
on what seemed like a reasonable metric, the number of
views those videos had received. So if more people watched
a video, it was more likely to wind up recommended
to new people. But it turned out Leanback didn't

(14:29):
actually impact the amount of time spent on site per user.
So in two thousand twelve, YouTube started basing recommendations on
how long people spent watching videos, so its engine switched from recommending videos a lot of people had watched to
recommending videos people had spent a lot of time on. Now,
this seemed like a great idea at first. According to
The Verge, nearly overnight, creators who had profited from misleading

(14:51):
headlines and thumbnails saw their view counts plummet. Higher quality videos,
which are strongly associated with longer watch times, surged. Watch time on YouTube grew fifty percent a year for the next three years.
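To make that shift concrete: ranking by raw view count rewards whatever gets clicked, while ranking by watch time rewards whatever keeps people watching. Below is a minimal, purely illustrative Python sketch with invented video names and numbers; it is not YouTube's actual code or data, just the general shape of swapping the ranking signal.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int                 # total view count
    avg_watch_seconds: float   # average time a viewer spends on the video

candidates = [
    Video("misleading clickbait", views=2_000_000, avg_watch_seconds=35.0),
    Video("long-form commentary", views=300_000, avg_watch_seconds=1_500.0),
]

# Pre-2012 style: surface whatever has the most views.
by_views = max(candidates, key=lambda v: v.views)

# Post-2012 style: surface whatever is expected to keep people watching longest,
# approximated here as views multiplied by average watch time.
by_watch_time = max(candidates, key=lambda v: v.views * v.avg_watch_seconds)

print(by_views.title)       # "misleading clickbait"
print(by_watch_time.title)  # "long-form commentary"
```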
So that sounds great, right? Not that evil yet. Horrible. Let's read the next paragraph. During this period of time

(15:12):
really quickly. Yeah, I was waiting for you to start talking so I could interrupt you. No, I wanted to know if part of the Leanback algorithm was that they would just automatically play "Lean Back" by Fat Joe.
If that had been the YouTube algorithm, the numbers, that's

(15:32):
what they would do after any video you would watch.
If that had been what had happened, Sophia, we would live in a paradise. Climate change would have been dealt with. The president would be a being of pure light. There would be peace in Ukraine and Syria. It would be a perfect world. If only, if only YouTube's Leanback had been leaning people back to the music video

(15:55):
for "Lean Back." I mean, that's a chill-ass jam. That is a chill-ass jam. There would be no Nazis in twenty nineteen if that's the change YouTube had made. It's true, Fat Joe transcends the boundaries of country, religion, skin color, anything. You could have saved the world, YouTube,

(16:17):
if you just pushed Fat Joe on a welcoming nation, into the longing arms of a nation. Yeah, God damn, I wish... I wish that's the path things had taken. Tragically, it's not. Now, during this period after Leanback was instituted, Guillaume Chaslot was a software

(16:38):
engineer for Google. I'm sorry, Guillaume Chaslot. Chaslot. It's spelled C-H-A-S-L-O-T. I think I'm pronouncing Guillaume right, because I found some pronunciation guides for the name Guillaume, um, but I've not found a pronunciation guide for C-H-A-S-L-O-T. I think he's a French guy.

(16:58):
I think I'm pronouncing it sort of correct, Guillaume Chaslot. But I'm doing my best here. To get dommed in the evenings? Yeah, yeah, Guillaume Chaslot for sure, stuffy bank owner. But in this case he's actually an engineer whose expertise is in artificial intelligence, uh, and The Guardian interviewed him for an article

(17:19):
titled how YouTube's algorithm distorts reality. I'm gonna quote from
that now. During the three years he worked at Google,
he was placed for several months with a team of
YouTube engineers working on the recommendation system. The experience led
him to conclude that the priorities YouTube gives its algorithms
are dangerously skewed. YouTube is something that looks like reality,
but it is distorted to make you spend more time online,
he tells me when we meet in Berkeley, California. The

(17:41):
recommendation algorithm is not optimizing for what is truthful, or
balanced or healthy for democracy. Chaslot explains that the algorithm never stays the same. It is constantly changing the weight it gives to different signals: the viewing patterns of a user, for example, or the length of time a video was watched before someone clicks away. The engineers he worked with were responsible for continuously experimenting with new formulas that would

(18:02):
increase advertising revenue by extending the amounts of time people
watched videos. Watch time was the priority. Everything else was
considered a distraction. So YouTube builds this robot to decide
what you're gonna listen to next, and the robot's only
concern is that you spend as much time as possible
on YouTube. And that's the seed of all of the

(18:23):
problems that we're going to be talking about today. So
Gillham was fired in two thousand thirteen, and Google says
it's because he was bad at his job. Chaslow claims
that they instead fired him because he complained about what
he saw is the dangerous potential of the algorithm to
radicalize people. Uh. He worried that the algorithm would lock
people into filter bubbles that only reinforce their beliefs and
make conservatives more conservative, liberals more liberal, and people who

(18:47):
like watching documentaries about aliens more convinced that the Jews
are fluoridating their water, et cetera. Uh, thank you for laughing. Yeah.
Chaslot said there are many ways YouTube can change its
algorithms to suppress fake news and improve the quality and
diversity of videos people see. I tried to change YouTube
from the inside, but it didn't work. YouTube's masters, of course,

(19:09):
had no desire to diversify the kind of content people saw.
Why would they do that if it meant folks would spend less time on the site? So in two thousand fifteen, YouTube integrated Google Brain, a machine learning program, into its algorithm.
According to an engineer interviewed by The Verge, one of
the key things it does is it's able to generalize.
Whereas before, if I watch a video from a comedian,

(19:31):
our recommendations were pretty good at saying here's another one
just like it. But the Google Brain model figures out
other comedians who are similar but not exactly the same,
even more adjacent relationships. It's able to see patterns that
are less obvious, and Google Brain is a big part
of why Stephen Crowder and others like him are now millionaires.
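The "adjacent relationships" that engineer describes are the kind of thing an embedding model captures: channels that share an audience end up close together in a vector space, so a nearest-neighbor lookup surfaces creators who are similar but not exactly the same. This is only a toy sketch with hand-written vectors and hypothetical channel names, assuming cosine similarity as the distance measure; it is not Google Brain's model.

```python
import numpy as np

# Toy channel embeddings. In a real system these are learned from
# co-watch behavior, not written by hand.
channel_vectors = {
    "comedian_a":   np.array([0.90, 0.10, 0.00]),
    "comedian_b":   np.array([0.85, 0.15, 0.05]),
    "politics_x":   np.array([0.70, 0.20, 0.60]),
    "cooking_show": np.array([0.00, 0.90, 0.10]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def similar_channels(seed, k=2):
    """Return the k channels whose embeddings sit closest to the seed's."""
    seed_vec = channel_vectors[seed]
    others = [(name, cosine(seed_vec, vec))
              for name, vec in channel_vectors.items() if name != seed]
    return sorted(others, key=lambda pair: pair[1], reverse=True)[:k]

# comedian_b ranks first, but the loosely related politics_x channel
# scores high enough to get surfaced too: the "adjacent" recommendation.
print(similar_channels("comedian_a"))
```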
It's why if you watch a Joe Rogan video, you'll
start being recommended videos by Ben Shapiro or Paul Joseph Watson,

(19:53):
even though Joe Rogan is not an explicitly political guy
and Ben Shapiro and Paul Joseph Watson are. It's why, for years, whenever conservative-inclined people would start watching, say, a Fox News clip critical of Obama, they'd wind up being shuffled gently over to InfoWars and Alex Jones.
It's why if you watched a video about Obama's birth certificate,
YouTube would next serve you Alex Jones claiming that Michelle

(20:15):
Obama is secretly a man. It's why if you watched
a video criticizing gun control, YouTube would serve you up
Alex Jones claiming the New World Order under Obama was
going to confiscate your guns so we could carry out genocide.
And it's why if you watched coverage of the Sandy
Hook massacre, YouTube would hand you Alex Jones claiming the
massacre was a false flag and all the children involved

(20:35):
were crisis actors. I bring up Alex Jones so many
times in this because it's probable that no single person
benefited as much from YouTube's Google brain algorithm changes as
Alex Jones. That's what Guillaume Chaslot seems to think. In February of two thousand eighteen, he tweeted this: the algorithms I worked on at Google recommended Alex Jones' videos more than fifteen billion times,

(20:59):
to vulnerable people in the nation. Yeah. Fifteen billion. That's the scale of this thing, because it recognizes that people who are going to start, like, watching just sort of a conservative take on whatever issue, gun control, uh, the Sandy Hook shooting, uh, fluoride in the water or whatever, people who might just want, like, a Fox News take on that.

(21:21):
Alex Jones is much more extreme. But because he's much
more extreme, he's like compelling to those people, and if
you serve him to them, they watch his stuff all the
way through. And his videos are really really long. It
is like a four hour show, So people stay on
the site a long time. If they get served up
a four hour Alex Jones video, they just keep playing
it while they're doing whatever they're doing, and they sink

(21:43):
deeper and deeper into that rabbit hole. And a regular
person would look at this and be like, oh, Google's
taking people who believe, I don't know, that a flat tax is a good idea, and turning them into people who think that fluoride is turning frogs gay and that Sandy Hook was an inside job, and that's a bad thing. But YouTube's algorithm didn't think that way. It just thought like, oh,

(22:04):
as soon as these people find Alex Jones videos, they spend fifty percent more time on YouTube. So I'm just gonna serve Alex Jones up to as many fucking people as I possibly can. And that's what starts happening. So that's where we are in the story right now, and

(22:24):
then we're going to continue from that point. But you know,
what it is time for next, Sophia? No, tell me. It's time for products. And maybe, maybe, I'm not gonna make promises, I'm not gonna write checks my ass can't cash here. But maybe there's services your ass can cash. Well, my

(22:46):
ass is all about products. I hope it's a chair
company that comes up next. Otherwise that's a non sequitur. I hope it's a Squatty Potty. It's probably gonna be dick pills, because we just signed a deal with Dick Pills. I'm very, very proud of our Dick Pills sponsorship. It's not even a job. I love... I love selling dick pills. I can

(23:09):
see you're hard right now. I can just see your head, the head of your body, not your penis head. But I can tell you're hard from the pills. Thank you. You have a very, uh, taut dick energy. You know, thank you. That is what this show aims to present to the world. Um, you know, I said, speaking on the

(23:32):
subject of YouTube, when we filled out our ad things, I won't sell brain pills because I don't want to be like Paul Joseph Watson or Ben Shapiro. But I will a hundred percent sell dick pills. And it's mainly so that I can say the phrase dick pills over and over again. Um. So, meet my son, Dick Pills Evans,
Dick Pills Evans. I am gonna I'm gonna name my

(23:53):
I'm gonna have a son just to name him Dick Pills.
And then sort of it's gonna be like a boy
named Sue, but with a boy name Dick PILs. And
instead of like me explaining to him that I gave
him the name Sue so that he'd be like, it
would harden him up and he'd become like a tough person.
He could survive the rough world. And like, oh no,
I got paid a lot of money to call you
Dick Piles. No, you're just sponsored by Dick Pills. This

(24:18):
has gone very off the rails. Sophie, is this a good idea? No. She's doing a hard no. Hard no. Okay, well, speaking of hard products... We're back. We're back, and Sophia

(24:42):
just said the sentence, "We got to mold our own genitals at the Dick Johnson factory," or Dog Johnson factory... Doc Johnson. I loved that sentence, which is why I brought us back in mid-conversation from the ad break, because that's a wonderful sentence. Um, I have that Instagram story saved on my Instagram. I want to get that

(25:03):
sentence tattooed on my back where some people would have Jesus. Like, I got my genitals molded at the Doc
Johnson factory. Yeah, and it was the most fun ever.
That sounds great. That sounds so much better than YouTube's algorithm.
That's a really smooth transition, thank you. That was like

(25:24):
jazz fucking saxophone smooth. I am as good at transitions
as dick pills are at getting you... Exactly, exactly. Fun. Yeah, Hims. Good times. Okay. So, you know, one of the big sources for this podcast and one of the big sources for the articles that have covered the problems

(25:46):
with YouTube's algorithm is Guillaume Chaslot. And he's not just
a former employee with an axe to grind or someone
who feels guilty about the work he participated in. For
years now, he has turned into something of an activist
against what he sees as the harms of his former
employer UM and obviously as a guy with potentially an

(26:06):
ax to grind, he's someone that you've got to approach
a little bit critically. But Chaslot hasn't just, like, complained
about Google. He's built like he has a team of
people that have built like systems in order to test
the way Google's algorithm works and show the way that
it picks new content and like, uh, document with hard numbers,
like here's the kind of things that it's serving up,

(26:28):
Here's the sort of videos that it recommends people towards,
Like here's how often it's doing them. So he's he's
not just making claims. He has reams and reams of
documentation on how Google's algorithm works behind him. Um, he's
really put a lot of work into this, and from
everything I can tell, he's someone who's deeply concerned about
the impact YouTube's algorithm has had on our democracy and

(26:50):
someone who's trying to do something about it. So just
digging into the guy a bit. I have a lot
of respect for what he's trying to do. Um. In November of two thousand sixteen, shortly after the election, while we were all drinking heavily, uh, Guillaume Chaslot published a Medium post titled "YouTube's AI was Divisive in the US Presidential Election."

(27:12):
In it, he included the results of a study he and a team of researchers conducted. Um. They were essentially trying to measure which candidate was recommended the most by YouTube's AI during the presidential election, and the code that they used to do this and all of the methodology behind it is available on the website. If you're someone who knows how to do the coding, you can check it all out. But they're very transparent, uh, he says. Quote: Surprisingly,

(27:34):
a Clinton search on the eve of the election led
to mostly anti Clinton videos. The pro Clinton videos were
viewed many times and had high ratings, but represent only less than twenty percent of all recommended videos. Chaslot's research found
that the vast majority of political videos recommended by YouTube
were anti Clinton and pro Trump, because those videos got

(27:55):
the best engagement. Now, Chaslot explained that because Google Brain
was optimized to maximize time users spent on site or engagement,
it's also happy to route people to content that, say, proposes the existence of a flat earth, because those videos improve engagement too. Guillaume found that searching "is the earth flat or round?" and following Google's recommendations sent users to flat

(28:17):
earth conspiracy videos more than ninety percent of the time.
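Chaslot's methodology, published through his AlgoTransparency project, boils down to a simple loop: start from a search result, repeatedly follow the top up-next recommendation, and tally what you land on. Here is a rough sketch of that loop with the scraping and labeling steps stubbed out as placeholders, since the real project crawls YouTube's pages and labels results by hand; none of this is his actual code.

```python
import random
from collections import Counter

def get_up_next(video_id):
    """Placeholder: the real study scrapes the up-next recommendations
    YouTube shows for a video. Here it just returns fake IDs."""
    return [f"{video_id}-rec{i}" for i in range(3)]

def classify(video_id):
    """Placeholder for hand-labeling a video, e.g. 'flat' vs 'round'."""
    return random.choice(["flat", "round"])

def crawl(seed_video, hops=4):
    """Follow the first up-next suggestion for a few hops and tally labels,
    which is the general shape of the 'is the earth flat' experiment."""
    tally = Counter()
    current = seed_video
    for _ in range(hops):
        recs = get_up_next(current)
        current = recs[0]           # always follow the top recommendation
        tally[classify(current)] += 1
    return tally

print(crawl("seed-search-result"))
```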
So if you're wondering why flat earth has taken off as a conspiracy, it's because simply asking the question "is the earth flat or round?" so much of the time leads you to videos that say it's flat. Homie, that's how come all those basketball players think the earth is flat! And also... what? Yeah,

(28:41):
you can see in your head how, like, that change happens. It's like some guy's having a conversation with a friend who is kind of dumb, and it's like, no dude, you know the Earth's flat. And you're like, what? That's bullshit. And you type "is the Earth flat" into YouTube, and then it serves you up a four-hour documentary about how the Earth is flat, and yeah, it's uh, probably

(29:02):
your first mistake is typing it into YouTube. It's probably not the place you want to get that answer. Yeah. No. But it's not like schools in America teach people critical thinking or how to functionally do research. It's like going to Yahoo Answers to be like, am I pregnant? Which happens all the time. The answer is yes. If you
happens all the time. The answer is yes. If you

(29:26):
are asking whether or not you're pregnant, you are in
fact pregnant for sure, probably second or third trimester. You
should you should at least stop smoking for a while
until you find out for sure. Maybe maybe put down
a bottle for a second. Yeah. Now. Further reporting using

(29:46):
additional sources from within Google seems to support most of
Chaslot's main contentions. In fact, it suggests that he, if anything, understated the problem. Chaslot left YouTube in two thousand twelve, and while he knew about Google Brain, he did not know about a new AI called Reinforce that Google had instituted, I think in two thousand fifteen, into YouTube. Its existence was revealed by a New York

(30:08):
Times article published just a few days before I wrote this,
the making of a YouTube radical. That article claims that
Reinforce focused on a new kind of machine learning called
reinforcement learning. The new AI, known as Reinforced, was a
kind of long term addiction machine. It was designed to
maximize users engagement over time by predicting which recommendations would

(30:28):
expand their tastes and get them to watch not just
one video but many more. Reinforced was a huge success.
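The distinction the Times is drawing, optimizing not for the next click but for engagement over time, is the core idea of reinforcement learning: pick the option whose estimated long-run reward is highest, even if its immediate payoff is lower. The sketch below uses invented numbers purely to illustrate that difference; it is not the Reinforce system, just the general principle.

```python
# Two kinds of content a viewer could be steered toward. The rewards are
# invented stand-ins for minutes watched in a single session.
immediate_reward = {"familiar": 10.0, "edgier": 6.0}

# Invented probability that the viewer comes back for another session.
return_prob = {"familiar": 0.5, "edgier": 0.8}

def long_run_value(state, horizon=20):
    """Expected total watch time over many sessions, discounting each
    future session by the chance the viewer keeps returning."""
    value, p = 0.0, 1.0
    for _ in range(horizon):
        value += p * immediate_reward[state]
        p *= return_prob[state]
    return value

# A greedy, single-session optimizer picks "familiar" (10 > 6).
# A long-horizon optimizer picks "edgier", because it keeps people coming back.
for state in ("familiar", "edgier"):
    print(state, round(long_run_value(state), 1))
```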
In a talk at an AI conference in February, Minmin Chen, a Google Brain researcher, said it was YouTube's most successful launch in two years. Sitewide views increased by nearly one percent, she said, a gain that at YouTube's scale can amount to millions more hours of daily watch

(30:49):
time and millions more dollars in advertising revenue per year.
She added that the new algorithm was already starting to alter users' behavior. "We can really lead users toward a different state, versus recommending content that is familiar," Ms. Chen said.
It's another example of, like, if you take that quote
out of context and just read it back to her

(31:10):
and say, Ms. Chen, this sounds incredibly sinister when you're talking about leading people towards a different state. Like, excuse me, ma'am, um, are you in fact a villain? A super... you sound like a supervillain. Sounds like this might be evil. Is this a James Bond villain situation? Yeah,

(31:33):
it's that moment no one ever has in the tech industry. That "Are we the baddies?" moment. Where we're like, oh, we're addicting people to our service, is that maybe bad? Are we the Nazis? This whole time I thought we were the Americans? Nope. Now, YouTube claims that Reinforce is a good thing, uh, fighting

(31:55):
YouTube's bias towards popular content and allowing them to provide
more accurate recommendations. But Reinforce once again presented an opportunity
for online extremists. They quickly learned that they could throw
together videos about left wing bias in movies or video games,
and YouTube would recommend those videos to people who were
just looking for normal videos about these subjects. As a result,

(32:16):
extremists were able to red pill viewers by hiding rants
about the evils of feminism and immigration, as reviews of
Star Wars. In far right lingo, red pilling refers to
the first moment that sort of sets someone off on their journey towards embracing Nazism. And so, prior to Reinforce,
if you were looking up I want to see gameplay
videos about Call of Duty, or I want to see

(32:37):
your review of Star Wars The Force Awakens, it would
just take you to reviews and gameplay videos. Now it
would also take you to somebody talking about like how
Star Wars is part of the social justice warrior agenda,
or how Star Wars you know, embraces white genocide or
something like that, and so then you know, and it
will recommend that to millions of people, and most of
them will be like, what the fuck is this bullshit?

(32:58):
But a few thousand of them will be like, oh
my god, this guy's right, like Star Wars is part
of a conspiracy to destroy white men, and then they'll
click on the next video that Stefan Molyneux puts out, or they'll go deeper down that rabbit hole, and that's how this starts happening. Uh. Star Wars is a conspiracy, though, just to take your money. That's all it is.

(33:18):
Not to take your money. It's like any other conspiracy
that involves movies. The only thing is to take
your money. Yeah, not to destroy white people. They want
white people because white people spend the most money on
Star Wars. Yeah. If they killed that, that's... that's the number one customer. That's killing your whole customer base. It's like how a

(33:40):
cigarette company wants you to breed. Didn't want teenagers to
start smoking. It's like, yeah, you need to replenish the flocks. Yeah, yeah,
you you want people to start smoking in their twenties
as they have children who grow up watching dad smoke. Yes,
that's like the plans. Yes, yeah, they want they want
kids like me to grow up who every now and

(34:00):
then we'll buy a pack of cigarettes just to smell
the open pack of cigarettes because it takes me back
to moments in my childhood. It's such a soothing smell,
unsmoked cigarettes, a little bit sweet, a little bit fruity. Yeah,
this is going to trigger somebody to buy a cigarette.
Right now, someone's pulling over to a 7-Eleven. Fuck it. Yeah,

(34:23):
And I feel terrible about that. And they're like, also, I just bought dick pills. They're like, I don't know what's happening to me. Buy dick pills. Fucking is good for your health. It's good for your heart. Uh, it's great. Fucking is all benefits. Uh, cigarettes are almost all downsides other than the wonderful smell of a freshly opened pack and looking really fucking cool. Yeah, well they

(34:46):
do make you look incredibly cool. Unbelievably cool. So yeah, nothing looks cooler than it. Damn it. Now something does. Smoking a joint looks cooler. You're right, smoking a joint does look cooler. And the coolest thing of all: smoking a joint on a unicycle on a yacht. Wow, you just

(35:06):
took it to another level. I just would want to
see how good your balance is. Yeah, one of
our many millionaire listeners is going to message me tomorrow
being like, my husband tried to smoke a joint while
riding a unicycle on our yacht and now he's dead. You killed the love of my life. And, or

(35:27):
we'll get some dope fan art of you on a unicycle
smoking a joint on a yacht. Yeah, burning a fat one.
Speaking of fat ones, The New York Times interviewed a
young man who was identified in their article on radicalization as Mr. Cain, and Mr. Cain claims that he was
sucked down one of these far right YouTube rabbit holes
thanks to YouTube's algorithm. He is scarred by his experience

(35:50):
of being radicalized by what he calls a decentralized cult
of far right YouTube personalities who convinced him that Western
civilization was under threat from Muslim immigrants and cultural Marxists,
that innate IQ differences explained racial disparities, and
that feminism was a dangerous ideology. I just kept falling
deeper and deeper into this, and it appealed to me
because it made me feel a sense of belonging, he said,
I was brainwashed. There's a spectrum on YouTube between the

(36:12):
calm section, the Walter Cronkite, Carl Sagan part, and crazy town,
where the extreme stuff is, said Tristan Harris, a former
design ethicist at Google, YouTube's parent company. If I'm YouTube
and I want you to watch more, I'm always going
to steer you toward crazy town. Um, and I will
say I'm very hard on the on the tech industry

(36:33):
regularly on this podcast. It speaks well of a lot
of engineers that the most vocal people in trying to
fight YouTube's algorithm are former Google engineers who realized what
the company was doing and like stepped away and have
been hammering it ever since, being like we made a
Nazi engine. Guys, like we weren't trying to, but we
made a Nazi engine, and we have to deal with that.

(36:55):
Ring the alarm on this one. Yeah, I gotta, really gotta ring the alarm on this one. You know, so at Google... I worked at Google for two years. I didn't know that. Yeah, what did you do? My job title won't explain what I did. But basically it was like a quality... Uh, yeah, it has nothing to do with anything. But

(37:15):
basically I got to, um, in Russian, like, help build, um, a binary engine that can, um, well, like... train it, not build it, train it to be able to tell whether something is, um, a restricted category or not, like something is porn or not, gambling or not, that kind of stuff. So, um, yeah, it was. It was crazy.
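What Sophia is describing, training a binary model to flag whether a piece of content falls into a restricted category like porn or gambling, is a standard text-classification setup. A minimal sketch with scikit-learn and invented toy data follows; it is not Google's tooling, data, or actual pipeline, just the general shape of a binary restricted-category classifier.

```python
# Minimal binary "restricted category" text classifier sketch (toy data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "win big at our online casino tonight",   # restricted (gambling)
    "place your bets now, instant payouts",   # restricted (gambling)
    "how to bake sourdough bread at home",    # allowed
    "beginner yoga routine for mornings",     # allowed
]
labels = [1, 1, 0, 0]  # 1 = restricted, 0 = allowed

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# With real training data this generalizes far better; on this toy set it
# should flag the gambling-like text and pass the recipe.
print(model.predict(["best casino bets this weekend"]))   # likely [1]
print(model.predict(["easy bread recipe for beginners"])) # likely [0]
```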

(37:40):
Well that sounds different. Uh yeah, I saw some of
the most fucked up stuff on the internet, you know,
like I've reported child porn before. Then you will have a lot to say in this latter part, because we do talk about content moderators for a little bit, so I might be asking you a couple of questions about that at the end. Yeah, yeah, yeah. Now, that New York

(38:03):
Times article in full disclosure actually cites me in it
because of a study that I published with the research
collective Bellingcat last year, where I trawled through hundreds and hundreds of leaked conversations between fascist activists and found
seventy five self reported stories of how these people got
red pilled. In that study, I found that thirty four
of the seventy five people I looked at cited YouTube

(38:25):
videos as the things that red pilled them. Um, I'm
not the only source on this though. The New York
Times also cited a research report published by a European
research firm called VOX-Pol. They conducted an analysis of
thirty thousand Twitter accounts affiliated with the far right, and
they found that those accounts linked to YouTube videos more
than they linked to any other thing. So there's a

(38:48):
lot of evidence that YouTube is the primary reason why, if you look at people who were researching the KKK and neo-Nazis in America, in two thousand four, two thousand five, two thousand six, a big gathering would be twenty people, and then in two thousand seventeen, four or five hundred of them, however many it was, showed up at Charlottesville. Like,

(39:10):
there's a reason their numbers increased so much over a
pretty short period of time, and it's because these videos
made more of them. Um. And there's there's a lot
of evidence of that. So while Google is raking in
more and more cash and increasing time spent on site,
they're also increasing the amount of people who think Hitler
did nothing wrong. Um. And that's that's the tale of today.

(39:32):
So Mr. Cain, the New York Times source for that article, claims his journey started in two thousand fourteen when YouTube recommended a self-help video by Stefan Molyneux. Mr. Molyneux, who is a great candidate for an episode of this podcast. But in short, he's a far-right YouTube philosopher and self-help guru who advises his listeners to cut all ties with their family. He runs a community called Freedomain Radio that some people accuse of being a cult that
(39:53):
Radio that some people accused of being a cult that
you know, tells people to cut off contact with their family. Yeah, no cool club is going to be like, hey, please join us, but also never speak to anyone you love ever again. Hey, never talk to your mom again. Like, that's not... that's not how a cool club starts, you know. That's

(40:13):
always how a cult starts, that's always bad news. Yeah, yeah, cool clubs say, never talk to the cops again, which cool clubs absolutely do. Now, Molyneux has been on YouTube
since forever, but his content has radicalized sharply over the years.
At the beginning, he identified as an anarcho capitalist, and
he mostly focused on his ideas about how everyone was

(40:35):
bad at being parents and people should cut ties with
toxic family members. In recent years, he's made... Bro, just call your dad. Call your dad, bro. You probably need to have a convo. Yeah, you guys probably just need to talk some feelings out. Maybe you'll calm the fuck down. I
don't know. Like, I don't want to say... like, there actually are a lot of people with toxic family members, so they do need to cut them out of their lives,
(40:56):
which I think is part of why Molini was able
to get a following. Like there's not nothing in what
he's saying. There's a lot of people who have fucked
up family backgrounds and you get told like, well, you
just need to make things right with your mom, And
it's like, no, if if your mom like sent you
to gay conversion therapy, maybe cut all ties with her forever.
I totally agree. No, no, no, I'm not trying to

(41:16):
say that. What I'm trying to say is that, for he himself, uh, to pursue a life where you tell people to cut contact off with their family, you clearly have unresolved issues with your family. And if you resolve those by, say, calling your parents and talking to them... I'm not saying you have to make up with them. I'm saying,
not saying you have to make up with them. I'm saying,

(41:36):
somehow get closure for yourself so then you don't spend
the rest of your life trying to get people to
quit their families. Yeah, that just like seems... yeah, you got some shit to deal with, bro. Um. But you know, Molyneux didn't stay on that sort of thing. Like,

(41:56):
he made a switch over to pretty hard core nationalism,
particularly in the last two years. There's like a video
of him where he's in um uh Poland during like
a far right march to commemorate like Poland's uh like
independence day, and he like said, like starts crying and
has like this big realization of how like I've been
against nationalism and stuff for years and I realized it

(42:19):
can really be beautiful, and like, the unsaid thing is, like, I realized that white nationalism can be beautiful, and that like, uh, instead of, you know, being an independent libertarian type, I'm gonna focus on, uh, fighting for my people, which is like white people and stuff like that. That's how Stefan Molyneux is now. Like, he's essentially a neo-Nazi philosopher
at this point, and he spends most of his time

(42:41):
talking about race and IQ and, you know, talking about how black people are not as good as white people. Like, that's the thrust of modern-day Stefan Molyneux. He also believes global warming is a hoax, so maybe nobody should have much respect for Molyneux's own IQ. Um, but a lot of people get turned onto Stefan's straight-up fascist propaganda because of their interest in Joe Rogan. Uh.

(43:05):
Rogan has had Stefan on as a guest several times, and YouTube has decided that people who like Rogan should have Stefan's channel recommended to them. This may be why Mr. Cain saw Molyneux pop into his recommendations, which is what he credits as radicalizing him in two thousand fourteen. Um,
so yeah, he wound up watching like a lot of

(43:25):
members of what some people call the intellectual dark Web,
Joe Rogan, Dave Rubin, guys like Stephen Crowder and of
course Stefan Molyneux, uh, and over time, like he went
further and further and further to the right, until eventually
he starts watching videos by Lauren Southern, who is a
Canadian activist who's essentially, like... he called her his fascist crush,

(43:48):
like his fashy bae. So, like, by like two thousand sixteen, this guy who starts watching Joe Rogan and, like, gets turned on to Stefan Molyneux's videos about global warming as a hoax and IQ and race, by two thousand sixteen, he's like identifying a YouTube Nazi as his fascist, like, crush. Like, that's how this proceeds for this dude. And that's

(44:10):
a pretty standard path. But you know what's not a
standard path? No, what? The path that our listeners will
blaze if they buy the products and services that we
advertise on this program. You seem like your breath has
been taken away by the skill and ingenuity of that transition. Truly,

(44:32):
there was nothing I could add. It was a perfect,
perfect work. I'm the best at this, I'm the best around.
Nothing's going to ever keep me down. Yeah, I'm not
able to put a bumper sticker on a Rolls Royce.
You know what I'm saying. The bumper sticker is gonna
say I got my genitals molded. Yeah, I want that

(44:53):
bumper sticker. It's actually a hologram. And like, when you look at it one way, I'm wearing a skirt, and when you look at it the other way, you see my vagina mold. It's really cool. Man, you put a lot of thought into it. So that's
quite a bumper sticker. And you know, I have thought
for a long time that what traffic is missing is

(45:16):
explicitly pornographic bumper stickers. Like if truck nuts are okay,
why isn't that? Seriously, it's actually a lot more pleasant
to look at than truck nuts. Yes, yes, nobody actually
likes truck nuts. Uh no one? All right, Well this
has been a long digression. Yeah, let's... let's products. We're back. Boy,

(45:44):
howdy, what a day we've had today. Mm. So at this point, YouTube's role in radicalizing a whole generation of fascists is very well documented, but YouTube is sort of stuck when it comes to admitting that they've ever done anything wrong. Seventy percent of their traffic comes from the recommendation engine.

(46:04):
It is the single thing that drives the platform's profitability
more than anything else. Back in March, the New York
Times interviewed Neal Mohan, YouTube's chief product officer. His responses
were pretty characteristic of what the company says when confronted
about their little Nazi issue. The interviewer asked, I hear
a lot about the rabbit hole effect where you start
watching one video and you get nudged with recommendations towards

(46:26):
a slightly more extreme video and so on, and all
of a sudden you're watching something really extreme. Is that
a real phenomenon? To which Neal responded: Yeah, so I've
heard this before, and I think that there are some
myths that go into that description that I think it
would be useful for me to debunk. The first is
this notion that it's somehow in our interests for the recommendations to shift people in this direction because it boosts watch time or what have you. I can say categorically

(46:47):
that's not the way our recommendation systems are designed. Watch
time is one signal that they use, but they have
a number of other engagement and satisfaction signals from the user.
It is not the case that extreme content drives a
higher version of engagement or watch time than content of other types. So he basically has a blanket denial there. Uh yeah, that's a huge, just like, blanket. Now, we

(47:08):
don't do that. No, that doesn't happen. Doesn't happen. And he goes on, it's a bit of a rambling answer, and later in his answer, Mohan called the idea of a YouTube radicalization rabbit hole purely a myth. The interviewer, to his credit, presses Neal Mohan on this a bit more later and asks if he's really sure he wants to make that claim. Mohan responds: What I'm saying is

(47:28):
that when a video is watched, you will see a
number of videos that are recommended. Some of those videos
might have the perception of skewing in one direction or
you know, call it more extreme. There are other videos
that skew in the opposite direction. And again, our systems
are not doing this because that's not a signal that
feeds into the recommendations. That's just the observation that you
see in the panel. I'm not saying that a user
couldn't click on one of those videos that are quote

(47:49):
unquote more extreme, consume that, and then get another set
of recommendations that sort of keep moving in one path
or the other. All I'm saying is that it's not inevitable.
So because everybody doesn't choose to watch more extreme videos,
there's no YouTube radicalization rabbit hole. Yeah, and also he's kind of acknowledging there that it does happen. Yeah, nothing
is inevitable. I mean except for like death and whatever.

(48:11):
You know, it's just to be like, yeah, yeah, no,
it's not. A meteor rite could hit your house before
you get to click on the video that turns you
into a Nazi, So of course it's not inevitable. Yeah,
just be like, it's not a hundred it's not true.
Is not a good answer when someone A percentage of
our users die of heart before the next video plays, Yeah,
pretty high percentage of people. Yeah, that's not what we're

(48:35):
asking Neil. Now. The reality, of course, is that Neil
Mohan is uh, shall we say not entirely honest. I
think I wrote a damn liar in the original draft,
but I'm not sure where the legally actionable line is.
Big Video. Yeah, in the pocket of Big Video. For just
one example, Jonathan Albright, a Columbia University researcher, recently carried

(48:57):
out a test where he seeded a YouTube account with a search for the phrase "crisis actor." The up-next recommendations led
him to nine thousand different videos promoting crisis actor conspiracy theories.
So again, someone who heard the term and wanted to
search for factual information about the conspiracy theory would be
directed by YouTube to hundreds of hours of conspiratorial nonsense
about how the Sandy Hook shooting was fake. Now, I'm

(49:21):
gonna guess you remember last year's mass shooting at the
Marjory Stoneman Douglas High School. Um, by the Wednesday after
that shooting, less than a week after all of those
kids died, the number one trending video on YouTube was
David Hogg the Actor, which is obviously a video accusing
one of the kids who's been most prominent of being
a crisis actor. According to a report from AdAge, it

(49:43):
and many others claimed to expose Hogg as a crisis actor. YouTube eventually removed that particular video, but not before it amassed nearly two hundred thousand views. Other videos targeting Hogg remain up. One that appears to show Hogg struggling with his words during an interview after the shooting suggests it's because he forgot his lines. YouTube autosuggested certain search terms that would lead people directly to the clips. If a person typed "David Hogg" in the YouTube search bar midday Wednesday,

(50:06):
for example, some of the suggestions would include exposed and
crisis actor. When reporters asked YouTube how that video made
it to the top of their coveted trending chart, YouTube
explained that since the video included edited clips from a
CNN report, its algorithm had believed that it was a
legitimate piece of journalism and allowed it to spread as
an authoritative news report would. Um. So again, that's their...

(50:29):
That's their, like, justification. Like, we couldn't have known that
this was fake news because it was fake news that
used clips from a legitimate news site. So like we're
clearly not at fault here for the fact that we
let a robot select all these things and no human
being watched the top trending video on the site at
the moment to see if like it was something terrible. Um. Also,

(50:50):
that's bullshit. Yeah, yeah, that's total bullshit. Now, Nazi propaganda and conspiracy theories aren't the only things that
spread like wildfire on YouTube. Of course, pedophilia is also
a big thing on the site. Yeah. Yeah, this is
where we get to that part of the story. So
this broke in February of twenty nineteen when a YouTuber named
Matt Watson put together a video exposing how rings of

(51:12):
pedophiles had infested the comments sections for various videos featuring
small children and used them to communicate and trade child porn. Now,
this report went very viral and immediately prompted several major
advertisers to pull their money from YouTube. The company released
a statement to their worried advertisers informing them that they
had blanket banned comments for millions of videos, basically removing

(51:32):
comments from any videos uploaded by or about young children.
I'd like to quote from NPR's report on Watson's video.
Watson describes how he says the pedophile ring works. YouTube
visitors gather on videos of young girls doing innocuous things
such as putting on their makeup, demonstrating gymnastics moves, or
playing Twister. In the comments section, people would then post
time stamps that linked to frames in that video that
appeared to sexualize the children. YouTube's algorithms would then recommend

(51:56):
other videos also frequented by pedophiles. Once you enter into
this wormhole, there is now no other content available, Watson said.
So it might seem at first like this is purely
an accident on YouTube's part, like that cunning pedophiles figured
out that there were like they could just find videos
of young kids doing handstands and stuff and use that

(52:18):
as porn and trade it with each other, right, which
would not necessarily be like, how could we have predicted this?
It's just these people decided to use innocent videos for
a nefarious purpose. But that's not what happened, or at
least that's not all of what happened. So in June
three researchers from Harvard's Berkman Klein Center for Internet and

(52:38):
Society started combing through YouTube's recommendations for sexually themed videos. Uh.
They found that starting down this rabbit hole led them
inevitably to sexual videos that placed greater emphasis on youth.
So again, that's maybe not super surprising. You start looking
for sexy videos, you click on one, and then the next video, the woman in it is going

(52:59):
to be a younger woman, and a younger woman, and
a younger woman. But then at a certain point, the
videos suggested flipped very suddenly, until, and I'm going to
quote the researchers here, YouTube would suddenly begin recommending videos
of young and partially clothed children. So YouTube would take
a person who's like just looking for adult, like, videos

(53:22):
of like an exotic dancer dancing or whatever, like videos
of attractive young women dancing, and then YouTube would start
showing them videos of children doing like gymnastics routines and
stuff. Like, that's the algorithm being like, I bet you'll like child porn. Like, that's literally what's happening here, which I didn't realize when I first heard the story, that,

(53:42):
like that's YouTube. That's not just pedophiles using YouTube in
a sleazy way, because pedophiles will always find a way
to ruin anything. That's YouTube crafting new pedophiles. Yeah, it's
a system. That's... yeah. I wonder if it's like that
with violence too, if you look up a violent thing,
if it keeps recommending more violence, because that seems like

(54:06):
and hate, like that would happen. When I worked for Google, like, the sensitive categories, the restricted categories are, you know, violence, hate, gambling, porn, child porn.
I think there's even a messed up thing about that,
because one of the problems that like, people who document
war crimes in Syria have had is YouTube blanket banning

(54:26):
their videos because of violence, and then like you have
evidence of a war crime and then it's wiped off
of the Internet forever because YouTube doesn't realize that this
isn't like violence porn, this is somebody trying to document
a war crime. Um. It's made it really hard to
do that kind of research. It's yeah, their response is

(54:46):
always so terrible. Um. Anyway, The New York Times reported quote, So,
a user who watches erotic videos might be recommended videos
of women who become conspicuously younger, and then women who
pose provocatively in children's clothes. Eventually, some users might be
presented with videos of girls as young as five or
six wearing bathing suits, or getting dressed, or doing a split.
So yeah, in its eternal quest to increase time spent

(55:08):
on site, YouTube's algorithm essentially radicalized people towards pedophilia. And
to make matters worse, it wasn't just picking sexy videos
that people had uploaded with the intent of them
being sexy. Because it was sending children's videos to people,
it started grabbing totally normal home videos of little kids

(55:29):
and presenting those videos to horny adults who were on
YouTube to masturbate. The report suggests it was learning from
users who sought out revealing or suggestive images of children.
One parent The Times talked with related in horror that
a video of her ten year old girl wearing a
bathing suit had reached four hundred thousand views. So, like,
parents start to realize, like, wait, I uploaded this

(55:51):
video to show her grandma. There are supposed to be like
nine views on this thing. Why have four hundred thousand
people watched this video of my ten year old? And
it's because YouTube is trying to provide people with
porn because it knows that will keep them on the
site longer. That's fucking wild. Yeah. After this report came out,
YouTube published an apologetic blog post, promising that responsibility is

(56:13):
our number one priority and chief among our areas of
focus is protecting minors and families. But of course that's
not true. Increasing the amount of time spent on site
is YouTube's chief priority, or rather, making money is YouTube's
chief priority, and if increasing the amount of time spent
on site is the best way to make money, then
YouTube will prioritize that over all other things, including the safety

(56:35):
of children. Now, there are ways YouTube could reduce the
danger their site presents to the world. Ways they could
catch stuff like propaganda accusing a mass shooting victim of
being an actor, or people's home movies being accidentally
turned into child porn. Even if they're not going to
stop hosting literal fascist propaganda, content moderators could add human
eyes and human oversight to an AI algorithm that is

(56:57):
clearly sociopathic. And earlier this year, YouTube did announce that
they were expanding their content moderator team to ten thousand people,
which sounds great. Sounds like a huge number of people,
only that's not as good as it seems. The Wall
Street Journal investigated and found out that a huge number
of these moderators, perhaps the majority, worked in cubicle farms
in India and the Philippines, which would be fine if

(57:20):
they were moderating content posted from India or the Philippines,
but of course these people were also going to be
tasked with monitoring American political content. Now, Alphabet, Google, does
not disclose how much money YouTube makes. Estimates suggest that
it's around ten billion dollars a year and maybe increasing
by as much as forty percent per year. Math is not

(57:42):
my strong suit. I'm not an algorithm, but I did
a little bit of math, and I calculated that if
Google took a billion dollars of their profit and hired
new content moderators, paying them fifty-thousand-dollar-a-year salaries,
which I'm willing to guess is more than most of
these moderators get, they could afford to hire twenty thousand
new moderators, tripling their current capacity.
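Spelling that arithmetic out, using the episode's own rough figures (the revenue estimate and the fifty-thousand-dollar salary are assumptions from the discussion above, and the ten thousand is the moderator count YouTube announced):

```python
# Back-of-the-envelope check of the figures quoted above.
# These are the episode's rough estimates, not audited numbers.
estimated_revenue_per_year = 10_000_000_000    # ~ $10 billion per year (estimate)
hypothetical_moderation_budget = 1_000_000_000 # $1 billion carved out of that
assumed_salary = 50_000                        # assumed $50k/year per moderator
current_moderators = 10_000                    # the team size YouTube announced

new_moderators = hypothetical_moderation_budget // assumed_salary
print(new_moderators)                                           # 20000
print((current_moderators + new_moderators) / current_moderators)  # 3.0, i.e. triple the capacity
```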

(58:03):
Realistically, they could hire fifty or sixty thousand more moderators and still be making
billions of dollars a year as one of the most
profitable services on the Internet. But doing that would mean
less profit for Google shareholders. It would mean less money
for people like Neil Mohan, the man who has been
YouTube's chief product officer since two thousand eleven, the man
who has overseen nearly all the algorithmic changes we are
talking about today, the man who sat down with The

(58:25):
New York Times and denied YouTube had a problem with
leading people down rabbit holes that radicalize them in dangerous ways.
I was kind of curious as to how well compensated
Mr. Mohan is, so I googled "Neil Mohan net worth."
The first response was a Business Insider article: "Google paid
this man a hundred million dollars. Here's his story." So

(58:47):
that's cool. Yeah, ooof, And I can tell you from
being a moderator, Um, I worked on a team where
everybody did what I did in a different language. So
I did this in Russian, and next to me was
someone doing it in Chinese and Turkish and all
of the languages, I mean not all, but
a significant number. Yeah, and um, I can tell you

(59:09):
that we were hired as contractors for only a year.
Very rarely would you ever be doing a second year
because they didn't want to pay you the full benefits,
like you know, you don't get health insurance and whatever,
all the perks that you would get from being a
full time Google employee. And the thing about what we
did is you got exposed to a lot of fucked

(59:31):
up stuff, Like you know, the videos and stuff that
I've seen are like some of the worst the internet
has to offer, like beheadings or someone stomping a kitten
to death in high heels, like crazy shit, and it
would really make you sick. And they like give you
free food at Google, and you like wouldn't be able
to eat sometimes because you would be so grossed out.
And it's not like they... That's why you're only there

(59:54):
for a year. Also, not just that you wouldn't be
able to get full benefits, but also because they are
okay with wasting your mental and physical energies and then
letting you go and then just cycling through new people
every year, because, um, rather than investing, you know, in
employees that are full time, making sure they have, uh,

(01:00:17):
you know, access to mental health care and stuff like that,
and um, you know, making that job be something that
they take more seriously considering how important it is. Well,
and that's part of what's really messed up is that
like it's fucking Google. Like, if you look at the
people who are, like, actually coding these algorithms and stuff,

(01:00:39):
I guarantee you those people have on-site therapists they
can visit, they have gyms at work, they get their lunches.
I mean, we all worked in the same building, but,
like, I couldn't, you know, I couldn't go get a
free massage during... It's like, you know, you have a
CrossFit trainer on site and shit like that, for sure
you get incredible perks. And the whole point, what
I thought was kind of ironic about what we were saying,

(01:01:01):
is like the whole thing is to try to make
you stay on YouTube. But when you work for company
like Google, their job is to try to make you
stay at Google. So you know, the reason you're getting
all these benefits and stuff and like free food and
gym and massage and whatever is because they want you
to stay and work forever. But they don't want you

(01:01:23):
like that. That stuff, to me... Exactly, like, and that's
a very telling thing from Google's perspective, because they are
saying that the people who are
coding these algorithms that increase the amount of time people
spend on site, that is important to us. And so
we will do whatever it takes to retain these people.
But the people who make sure that we aren't creating

(01:01:44):
new pedophiles while we make money, the people who are
responsible for making sure that Nazi propaganda isn't served up
to, like, influenceable young children via our service. Those
people aren't valuable to us because we don't care about that,
so we're not going to offer them healthcare. Like if
if Google really was an ethical company, and if
YouTube cared about its impact on the world, someone whose

(01:02:07):
job... There's nothing less technical or less valuable about
what you're doing. Being able to speak another language fluently,
being able to understand if content propagating on their site
uh is toxic or not. That's a very difficult, very
technical task. If they cared about the impact they had
on the world, the people doing that job would be

(01:02:28):
well paid and would have benefits and would be seen
as a crucial aspect of the company. But instead it's
sort of like, if we don't have someone doing this job,
we'll get yelled at. So we're going to do the
minimum necessary and we're going to have most of the
people doing that job be working at a fucking cube
farm in India, even though we're expecting them to moderate
American content and to understand all of our cultural nuances

(01:02:50):
and whether or not something's toxic. Like, that's so fucked up.
And also considering the fact that, like, ads are the
reason that they hire content moderators, not because they care
about the content necessarily. It's that it would be a
huge like mistake if say, an ad for Huggies was

(01:03:14):
served on a diaper fetish website. You know, they want
something in place where the page knows, the algorithm knows
not to serve that, even though it seems like a
good match because the word diapers is repeated and blah blah blah,
you know what I mean. So it's really less about...
it's more about keeping the advertisers happy and making
the most money than it is about, yeah, ensuring that

(01:03:39):
the Internet is a less fucked-up place.
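As a rough illustration of that ad-adjacency point, here is a toy sketch; it is not how Google's ad systems actually work, and the keywords and categories are invented. Keyword relevance alone would happily pair a diaper ad with a diaper-fetish page, so a separate sensitive-category check has to override the match.

```python
# Toy sketch of brand-safety filtering (illustration only; real ad systems
# are far more complex, and these keywords and categories are invented).
SENSITIVE_CATEGORIES = {"violence", "hate", "gambling", "adult"}

def keyword_score(ad_keywords, page_keywords):
    """Naive relevance: how many keywords the ad and the page share."""
    return len(set(ad_keywords) & set(page_keywords))

def can_serve(ad_keywords, page_keywords, page_categories):
    """Refuse to serve the ad on sensitive pages, however relevant it looks."""
    if SENSITIVE_CATEGORIES & set(page_categories):
        return False
    return keyword_score(ad_keywords, page_keywords) > 0

# On keywords alone, a diaper ad "matches" a diaper-fetish page...
print(keyword_score(["diapers", "baby"], ["diapers", "fetish"]))         # 1
# ...and only the sensitive-category check keeps it from being served there.
print(can_serve(["diapers", "baby"], ["diapers", "fetish"], ["adult"]))  # False
```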
This gets to one of the things, like, when I
get in arguments with people about the nature of capitalism
and what's wrong with the kind of capitalism
that we have in this country. I think a lot
of people who, like, just sort of reject anti-capitalist
arguments out of hand do it because they think that

(01:03:59):
you're saying, oh, it's just wrong to make money,
it's wrong to have a business that, like, makes a profit,
is, like, the issue. That isn't the issue. Like, this
company, Google, could be making billions of dollars a year,
uh, and still be one of the most profitable sites
of its type, still make a huge
amount of money, and have three times as many people

(01:04:21):
doing content moderation, and all those people have health care.
But by cutting corners on that part of it, because
it doesn't make them more money, it just makes the
world better, they make more money, and it's worth more
to them to increase the value of a few hundred
people's stock than to ensure that there aren't thousands of
additional people masturbating to children. Like, that's what

(01:04:44):
I have an issue with, with capitalism. Like, that's...
you can make a profit without also selling your
fucking soul. Yeah, we could have YouTube... should be banned... like I can get recommended new musicians that I like.
We can all watch videos to masturbate to, without more

(01:05:05):
people being turned into pedophiles and Nazis. That's not a
necessary part of this, Like it's just because corners are
being cut. Yeah, it just shows what the value
of our society is, what the values of
our society are. Yeah, they've literally said, like, three
billion dollars a year is worth more to us than

(01:05:26):
god knows how many children being molested than fucking Heather
Heyer getting run down at Charlottesville, than there being Nazis
marching through the streets and advocating the extermination of black people,
of LGBT people, of whatever, like, which is again part
of why so many Google employees are now speaking out
and horrified because like, they're not monsters. They don't want
to live in this world any more than the rest of

(01:05:48):
us do. They just didn't realize what was happening because
they were busy focusing on the code and the free massages,
and then, like the rest of us, they woke up
to a world full of Nazis and pedophiles. Uh. Yeah,
I feel like you're looking at me to make a
joke now, and I feel like I don't know, this

(01:06:10):
got real serious. I'm more just tired. We're all tired.
It's a very tiring world we live in. Yeah, well,
Sophia, do you want to plug your plugs? Fuck, I
mean, not really, I just want everyone to go and get

(01:06:30):
a hug. You know, everybody go get a hug, but Jesus, Yeah,
but also, uh, I am to be found on the
sites we hate, you know, what a fun thing to plug.
I'm available on Twitter and Instagram. That's Sophia, and I have

(01:06:53):
a podcast about love and sexuality around the world that
I co-host with Courtney Kocak. It's called Private Parts Unknown.
Check that out. Check out Private Parts Unknown. I'm also
on the sites we hate. Behind the Bastards dot com
is not a site that we hate, but it's where
you can find the sources for this episode. Um, IwriteOK

(01:07:14):
is where you can find me on Twitter. You can
find us on Twitter and Instagram at Bastards Pod. Um,
that's the... You can buy t-shirts on TeePublic dot com,
Behind the Bastards. Uh, yep, that's the episode. Go, uh, go,
go find YouTube's headquarters and yell at them, scream at
their sign, take pictures of their company, and wave your fists. Uh.

(01:07:39):
If you work at YouTube, quit. It's not worth it.
I mean, the more whistleblowers the better. Yeah. Yeah, quit
and go talk to the New York Times or some,
some fucking body. Uh, yeah. Also, um, one random thing
that's positive is, if you want, there's a lot of
videos of trains on YouTube I have discovered, of just trains

(01:08:01):
passing by. Trains and fails. Yeah. I think you will
find it very soothing. First, you'll be like, what the
fuck, a video of a train that's twelve minutes long?
Guess what, that will soothe you. Soothe your ass. Or,
if you're more like me, watch videos of people skiing
and then failing to ski. I mean that's if you
want to laugh. Yeah, yeah, I feel like YouTube's algorithm

(01:08:28):
is going to take you from train videos to train
fails really fast. Um, oh boy, yeah, shit. I don't
know, now that, now that I know about the
rabbit hole, I'm afraid that there's a way to connect
trains to children that I have not thought of. Oh no,
I'm not even gonna make any further comments. We should

(01:08:49):
get out before this gets to a spot
