Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Wrecked him. Damn near killed him. That was the punchline
to a joke without the rest of the joke, because
I think we can all put things together now as adults.
I'm Robert Evans. Yeah, hi, Jamie, this is Behind the Bastards.
We talk about bad people on this podcast. Um. And
that was a little a little bit of levity at
(00:22):
the start of it before we get into depressing shit again.
Some abstract levity, some abstract levity. Yeah, pieces of levity
that one can assemble into comedy. Um. Yeah, very nice. Yeah,
I mean comedy. You know, it is just a series
of things that you put in the correct order. It's
like a deconstruction of comedy, like when people take apart
(00:43):
a sandwich and then serve it on a plate in a fancy restaurant. I gotta be honest. Anytime someone says it's a deconstruction of comedy, it's the least funny shit you'll ever hear in your entire life. Like
he's deconstructing the medium, and it's usually just like some
some guy. It's yeah, some guy. Yeah, it's never any good,
but you know it is good. Jamie, what Facebook's peanut butter?
(01:07):
I was like, this can't be a transition to Mark Zuckerberg, No,
because nothing about him. Just talked about this delicious lunch
he had as I eat, I had a great lunch
and I'm eating a peanut butter and you know what,
I'm content, Sophie is eating peanut butter. I ate a
delicious lunch, love for you, had a couple of chips,
(01:32):
and most important, most important, we're all going to get
back to my favorite thing to do with my good
friend Jamie Loftus, which is talk about the extensive crimes
of Mark Zuckerberg. Oh yes, I changed shirts between episodes.
Robert, and I have a little Mark Z with me. Yeah you do, you've got your, your Mark Z shirt. Yeah,
(01:53):
my favorite, my favorite Mark quote: that you
can be unethical and still be legal. That's the way
I live my life. Ha ha. It is amazing. And
he really, I mean there's been a lot said about him,
but the man sticks to his guns. He lives by
this credo to this very day. Yeah. Yeah, yeah, you
(02:15):
know who else sticks to their guns, Jamie? Whom? The death squads of the various dictatorial political candidates who use Facebook to crush opposition and incite race riots. That was
a transition. That was a transition. So Jamie, all right,
I'm gonna start. It's time to start the episode, and
gonna start with a little bit of a little bit,
(02:36):
little, well, bit of an essay here. So once, once upon a decade or so ago, I had the fortune to visit the ruins of a vast Mayan city in Guatemala, Tikal. Um. And the scale of the architecture
there was astonishing. If you ever get the chance to
visit one of these cities, you know, in Guatemala or
in Mexico, it's really worth the experience. Um, just again, the
(02:58):
size of everything you see, the precision of
the stonework. Um, it's just amazing. And one of the
things that was most kind of stirring about it was
the fact that everything that surrounded it was just hundreds
and hundreds of miles of dense, howling jungle. UM. So
I spent like an afternoon there and I got to
sit on top of one of these giant temple pyramids,
drinking a one-liter bottle of Gallo beer and
(03:20):
staring out over the jungle canopy and just kind of
marveling at the weight of human ingenuity and dedication necessary
to build a place like that. Sounds metaphorical. And while
I was there, Jamie, I thought about what had killed
this great city and the empire that built it, Because
a couple of years earlier, really not all that long
before I visited, theories had started to circulate within the
(03:41):
academic community that the Mayans had, in the words of
a NASA article on this subject in two thousand nine, killed themselves through a combination of mass deforestation and human-induced climate change. A year after my visit, the first
study on the matter was published in Proceedings of the
National Academy of Sciences. I'm gonna quote here from Smithsonian magazine. Researchers from Arizona State University analyzed archaeological
(04:06):
data from across the Yucatan to reach a better understanding
of the environmental conditions when the area was abandoned around
this time. They found severe reductions in rainfall were coupled
with a rapid rate of deforestation as the Mayans burned and chopped down more and more forests to clear land
for agriculture. Interestingly, they also required massive amounts of wood
(04:27):
to fuel the fires that cooked the lime plaster for
their elaborate constructions. Experts estimated it would have taken twenty trees to produce a single square meter of cityscape. So,
in other words, the Mayans grew themselves to death, turning
the forests that fed them into deserts, all in the
pursuit of expansion. It's a story that brings to mind
(04:48):
a quote from the great historian Tacitus writing about Augustus
Caesar and men like him: solitudinem faciunt, pacem appellant.
They make a desert and call it peace. That's what
he's saying about Augustus Caesar and the Emperors like him.
They make a desert and call it peace. That's That's
(05:09):
one of those sundial phrases. I think a more accurate summation of the two hundred years of peace that Augustus Caesar created than what Mark put out.
make a desert and call it peace. Now, I read
that quote for the first time as a Latin student
in high school, and I saw it referenced in relation
to Mark Zuckerberg in a Guardian article covering that New
(05:30):
Yorker piece we quoted from last episode, and the title of that New Yorker article was "Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?" So democracy: a free and open society where numerous viewpoints are tolerated, cultural experimentation is possible, and evolution is encouraged. These are the things that have made Facebook's success possible. It could not have come about
(05:53):
without them, um, and outside of a culture that embodies
those values. And now that Facebook's member count is closing
in at three billion, the social network is doing what
all empires do. It's turning the fertile soil that birthed
it into a desert. And as it was with the Mayans,
all of this is being done in the name of growth.
(06:14):
Katherine Losse was an early Facebook employee and Mark Zuckerberg's speechwriter for a time. In her memoir The Boy Kings, which is what you call it, I figured... I don't know, that's a good title, um, she lays out what she
saw as the engineering ideology of Facebook. Quote, scaling and
(06:35):
growth are everything. Individuals and their experiences are secondary to
what is necessary to maximize the system. Mark Zuckerberg and
thus Facebook have held a very consistent line since day
one of the company operating as an actual business, and
that line is that Facebook's goal is to connect people,
but this was and always has been a lie. The
(06:57):
goal is growth, growth at any cost. In two thousand seven,
Facebook's growth leveled off at around fifty million users. At
the time, this was not unusual for social networks, and
it seemed to be something of a built in natural
growth limit, Like maybe fifty million is about as much
as a social network can get unless you really start
juicing the results. Um, and fifty million users? That's a
(07:21):
very successful business. You can be a very rich person
operating a business. You can call it a day. That's
a great thing to accomplish. MySpace Tom was thrilled with that. Yeah, my God, bless Tom. MySpace Tom, who hasn't done a goddamn problematic thing, is now traveling the world taking photographs of the world that
(07:42):
Mark Zuckerberg is destroying. I mean, he might be the
only person worth hundreds of millions of dollars that I'm
fine with not taking the money back, right, Like, let Tom,
You're fine, Like go keep doing your things, use the
money to be boring. It feels like he is just
used his fortune to be boring. What is I remember
it took like five months for me to get my
(08:03):
MySpace deleted. So I don't remember anything about MySpace. That's my favorite thing about MySpace, is I've forgotten everything about it but the name MySpace. Oh, for sure MySpace was not good. But did
I learn about a lot of goth music on it?
Now, the middle school angst from not being in somebody's top eight. But oh my god, PC4PC,
(08:26):
I did a lot of PC that I'm my friend,
you are so pretty PC four PC, And you know what,
you know what? No one used MySpace for genocide. Organizing militias to show up at the site of protests and shoot Black Lives Matter activists was not done with MySpace, and I suspect Tom
(08:48):
would have had an issue if it had been. I think, so,
Well, I don't, I don't know about Tom, but the fact that he's kept his fucking mouth
shut since getting rich and going off to do whatever
he does makes me suspect that he's a reasonable man.
So Facebook hits this growth limit and it kind of levels off a bit. Um. And again,
(09:09):
it's a very successful business in two thousand seven, but
it's not an empire, and that's what Mark wanted. That's
the only thing Mark has ever wanted um in his
entire life. And so he ordered the creation of what
he called a growth team, dedicated to putting Facebook back
on the path to expansion, no matter what it took.
So the Growth team quickly came to include the very
(09:30):
best minds in the company, who started applying their intellect
and ingenuity to this task. One solution they found was
to expand Facebook to users who spoke other languages. And
this is when what we talked about last episode began: the
company's heedless growth into foreign markets. Obviously, at no point
did anyone care or even consider what impact Facebook might
have on those places. Yeah, I'm gonna quote from The
(09:52):
New Yorker again. Alex Schultz, a founding member of the
growth team, said that he and his colleagues were fanatical
in their pursuit of expansion. You will fight for that inch,
Alex said, You will die for that inch. Facebook left
no opportunity untapped. In two thousand eleven, the company asked
the Federal Election Commission for an exemption to rules requiring
(10:12):
the source of funding for political ads to be disclosed. In filings, a Facebook lawyer argued that the agency should
not stand in the way of innovation. Oh okay, it
doesn't seem like an innovation to me, real loose interpretation
of the word. You know, I get this to an extent.
So the other day I was drunk driving my 4Runner,
(10:34):
UM and I was I was, I was shooting at
some targets I'd set up in the trees, and the people in the neighborhood I was doing this in said, oh,
for the love of God, you're you're please, you're endangering
all of our lives. Um, And I said, you're standing
in the way of innovation because I was innovating what
you can do drunk in a 4Runner with a Kalashnikov,
(10:57):
and they got in the way of that. I understand
that I have to really heat up a pan and
put it on someone's face just to innovate the art
of what you you innovate their skins. Yeah, and and really,
people standing in the way of that innovation. Am I
supposed to make progress in hurting people's faces? Yeah? I'm
(11:17):
a fan of how Pol Pot innovated the capital city of
Cambodia by forcing everyone out of it and then killing
hundreds of thousands of people. It's just, it's innovative. The
way all of this is just horrific and like, I
don't know, like I'm talking about something else, but it's
like the language of Silicon Valley applied to the two
(11:38):
genocidal situations is just so... it makes my fucking... I mean, I just peeled all my skin off. You have to
you have to agree that Hitler was an innovator. He
innovated so many things. He really did change. He changed
the game. Yeah, he absolutely changed the narrative from there
not being a war in Europe to there being a
(11:58):
war in Europe. That's called disrupting, Honey. He didn't disrupt it.
He disrupted the shit out of the Polish government. Oh my god, doing this dinner wrong. So Sandy Parakilas, who
joined Facebook in two thousand and eleven as an operations manager,
(12:21):
paraphrased the message of the orientation session he received as
we believe in the religion of growth. Um, the religion
of growth is what he was told when he joined
the company. That's what it was called. Not only horrifying, but like, could you sound like more of a sniveling loser? Okay. He said, quote, the growth
(12:46):
team was the coolest. Other teams would even
try to call subgroups within their teams the growth X
or the Growth Y to try and get people excited. Um.
And in the end, Facebook's finest minds decided that the
best way they could I know, I know, yeah, I'm excited.
(13:07):
I'm horny, I'm ready to go. I mean, with
with that kind of narrative. I love to hear it.
I love you do love to hear it. So in
the end, Facebook's finest minds decided the best way they
could further the great god of growth was for
Facebook to become a platform for outside developers. Um that
this was the way to really really get things going again.
(13:28):
And you will remember the start of this period when
Facebook made this change. This is when like what had
once been a pretty straightforward service for keeping up with
your friends from college was suddenly flooded with games like
Farmville and a bunch of like personality tests and ship
that period of time, the mom's fucking drone struck Facebook
by coming down with Farmville, sending you five trillion invitations,
(13:51):
leaving your like high school choir concert to go harvest strawberries. Yeah,
And making a bunch of fucking money for Facebook. Whatever form these apps took, their main purpose was the same,
which was to hoover up all of your personal data
and sell it for profit. Yeah. I've given my social security number to Farmville, and that's just a fact. Yeah.
(14:11):
And the only person you should give your social security
number to is me. I do encourage all of
our listeners to find my email, um and just just
email me your social uh yeah, just to just do
your tips line. Yeah. Yeah, it's like, it's like that thing from that documentary about Keith Raniere, who we also did episodes on, The Vow. It's your
(14:32):
collateral. Send me your social security number, so I'll know
that you really care. Um. Yeah. So Facebook's employees kind
of realized very quickly after this change was made and
these developers start flooding the service with all of their
ship um that the company's new partners were engaged in
some really shady behavior. One worker who was put in
(14:53):
charge of a team ordered to make sure developers weren't misusing user data immediately found out that they were. And I'm gonna quote again from The New Yorker here. Some
games were siphoning off users' messages and photographs. In one case,
he said, a developer was harvesting user information, including that
of children, to create unauthorized profiles on its own website.
Facebook had given away data before it had a system
(15:14):
to check for abuse. Parakilas suggested that there be an
audit to uncover the scale of the problem, but according
to Parakilas, an executive rejected that idea, telling him, do you really want to see what you'll find? No. Which,
look I can identify with that too. I recently had
an issue where I left a bag of potatoes and
the top counter of my of my kitchen for I
(15:37):
don't know, somewhere between four and seven months, and when
I took them I didn't want to. I knew something
was wrong. I knew something was wrong up there because
of the flies and the strange smell, but I didn't
want to look into it because I didn't want to
see the extent of the problem. And when I finally did,
I regretted learning what an issue I had made for
myself and my home. You are really since the last
(16:00):
BO you've become very prone to a metaphor. I am
I am a living metaphor. You you are living out
an innovative. I innovated those potatoes. You distruct those potatoes.
Works with a family magic. It's okay. We don't need
to We don't need to talk about what a problem
(16:21):
my life has become. Um. Parakilas told the New Yorker reporter, quote, it was very difficult to get the
kind of resources that you needed to do a good
job of ensuring real compliance. Meanwhile, you looked at the
growth team and they had engineers coming out of their ears.
All the smartest minds are focused on doing whatever they
can do to get those growth numbers up. Now, Jamie
(16:45):
Jamie Loftus, I happened to read this quote while I
was struggling to work in the midst of unprecedented wildfires
that devastated a huge amount of the state of Oregon
um and made our air quality the worst in the
world for a while. On the very day I read
that article, four of my friends and journalistic colleagues were
held and threatened at gunpoint by militiamen who had taken
(17:07):
to the streets of a town very near Portland in
the middle of an evacuation. Because viral Facebook memes convinced
them that Antipho was starting the fires. Around the same
time that this was happening, that my buddies were getting
held at gunpoint because they were not white people. Uh,
and a militia thought that was suspicious. Around that same time,
(17:29):
a tweet went viral from a Portland resident and a
former Facebook employee named Bo Ren. She posted a picture
of the city blotted out by thick, acrid clouds of
smoke and wrote, my dad sent me this view from
my childhood room in Portland. It hit me that we
have been wasting our collective intelligence in tech, optimizing for
(17:50):
profits and ad clicks. Huh. Glad you got on that page, Bo. Well, glad. We... I mean, sometimes
it just takes something to put it all in perspective,
wouldn't you say, like your home burning down? Yeah? Sometimes,
(18:10):
and the militias minutes from your door. Yes, un-fucking-believable
militias that organized on Facebook. Yeah, I mean the story
went pretty viral about it. They weren't harmed, um yes,
um yeah. The two people I knew best who were
there were Sergio Olmos and Justin Yau, who
are both wonderful reporters, But yeah, it was. It was
(18:33):
not lost on me that I think of the four
people who were there, three of them were not white people. Um,
and that some of the white reporters had a much
easier time. Interesting things about militias you learn? Anyway, Do
you think it makes you think? Now? I thought that
quote was interesting? Um. Anyway, opening interesting, disruptive, disruptive, like
(18:58):
the fires, like malicious. The thing has made me think
when I could think, Yeah, I threw up in my
d n mask walking down the street, awesome. Yeah, I've
been jogging and doing pull ups in a gas mask. Um,
just half naked in a gas mask in my front
(19:20):
lawn like a normal person. You're the only person I
know who would have seen this as a as a
possible outcome. And for that, Um, for that, I thank
you and I curse you. Yeah. So anyway, opening Facebook
up to developers made a shitload of money and membership grew,
and from Mark's point of view, everything was going great.
(19:41):
But Katherine, Katherine Losse, his speechwriter, saw a lot
of the same problems Parakilas had seen, and in her
memoir she writes: the idea of providing developers with a massive platform for application promotion didn't exactly accord, I thought, with the site's stated mission of connecting people. To me, connection with another person required intention. They have to
(20:02):
personally signal that they want to talk to me, and
vice versa. Platform developers, though, went at human connection from
a more automated angle. They churned out applications that promised
to tell you who had a crush on you if
you would just send an invitation to the application to
all of your friends. The idea was that after the application
had a list of your contacts, it would begin the
(20:23):
automated work of inquiring about people's interests and matching people
who were interested in each other. Soon, developers didn't even
ask you if you wanted to send invitations to your friends.
Simply adding the application would automatically notify all of your
Facebook friends that you had added it and invite them
to add it too, using each user as a vessel through which invitations would flow virally, without the user's consent.
(20:47):
In this way, users' needs for friendship and connection became
a powerful engine of spam, as it already was with
email and on the Internet long before Facebook. The same
"we'll tell you who has a crush on you if you just send this email to your address book" ploys were familiar to me from Hopkins, when spammers would blanket the entire email server with emails in a matter of hours, spread virally by students
(21:07):
gullibly entering the names of their crushes and their crushes' email addresses. So this was the... this was the
start of Facebook making choices for its users, choices that
were based on what would be best for the social network,
which was keeping people on the site for as long
as possible. The growth team saw that proactively connecting people
to each other worked out really well for Facebook's bottom line,
(21:30):
even though sometimes, for example, people who had been horribly
abused and raped by their spouses were reconnected to those
spouses who they were hiding from and had their personal
data exposed to them, a thing that happened repeatedly. Um
and still happens repeatedly, you know. But that's a small
price to pay for growth. You gotta
(21:52):
fight for that inch. And sometimes fighting for that inch means
connecting abused women to the men who horribly injured them.
That's like Mark talking to Priscilla when they're trying to
conceive a child. He's just like, you gotta fight for
my inch, honey, you gotta fight for it. Oh. Mark
Zuckerberg is incapable of talking during sex um. He lets
(22:13):
out a high pitched hum that is only audible to crickets.
He doesn't. Yeah, he's sort of got a Kendall situation
going on, whereas he just has a sex lump that
gets really hot. Yeah. Yeah. She has to actually withdraw
the semen from inside his sack using a needle. I think,
(22:34):
actually put in a little a little hold on, hold on,
hold on, okay, on the subject holding my vomit but
in a syringe and then suck and then and then
she just has it. And then she just has it.
And if you want to have the emotional equivalent of
Mark Zuckerberg, seem no, that's not that's not fair to
the products or services. It's not anyway. Here they are vomit.
(23:04):
We're back, Okay. So in two thousand ten, Facebook launched
Facebook Groups, which would allow just about anyone to create
a private walled-off community to discuss just about anything,
including fascism, white genocide, or the need to gather a
militia together and use it to kill their political enemies.
(23:24):
If you're a regular listener of my show, you know
the next part of this story. From about two thousand
ten to two thousand and sixteen, the United States saw
an astonishing leap in the number of active hate groups.
For some perspective, just from two thousand fifteen to two
thousand twenty, the SPLC estimates there was an increase in the number of hate groups nationwide. Um. All of this
(23:44):
growth was mostly spurred on by social media, and Facebook
was one of the main culprits. And they knew they
were, too. They didn't admit it openly, but internally
they were talking about it from pretty early on. And
I'm gonna quote now from a report in The Wall Street Journal. A two thousand sixteen presentation that names as its author a Facebook researcher and sociologist, Monica Lee, found extremist
(24:08):
content thriving in more than one third of large German
political groups on the platform swamped with racist, conspiracy minded,
and pro Russian content. The groups were disproportionately influenced by
a subset of hyperactive users. The presentation notes most of
them were private or secret. The high number of extremist
groups was concerning the presentation says worse was Facebook's realization
(24:31):
that its algorithms were responsible for their growth. The two thousand sixteen presentation says that sixty four percent of all extremist group joins are due to our recommendation tools. Yeah, and that
most of the activity came from the platform's Groups You Should Join and Discover algorithms. From the presentation: our recommendation
(24:55):
systems grow the problem. Oh, okay. Well, I mean, as long as the word growth is in the sentence, I think that's good enough. Growth is in there? You're good. Is growth in there? So really, where we're growing and what the consequences are? Not really worried about it. Yeah. It's
just like when I'm in my 4Runner, drunk as shit on mezcal and firing a Kalashnikov. All that
(25:15):
matters is forward movement. It doesn't matter if that forward
movement is driving through the trailer that a family lives in.
What matters is that I'm moving forward and shooting and drunk.
You're right, thank you? Wow a judgmental statement. I'm innovating
homeownership there. I mean, this is another example of just
(25:37):
you know, Facebook innovating people's interests, Like hey, do you
do you enjoy this? I'm trying to think of the
old Facebook groups that you used to be able to join,
like ten years ago, where it would be like science
is my boyfriend, and it's like enjoy the Smithsonian Institute.
Like groups would be like Class of Two Thousand Twelve,
(26:00):
stuff like that. With like early I mean, it's like,
I mean, obviously very much in the same line of
algorithmic thinking as YouTube, where it's like, oh, did you
enjoy this, like collage of Gerard Butler images? How about
a man sitting in his 4Runner whispering conspiracy theories
for three hours on end? Yeah, that's just growth. I
(26:22):
love growth. I love growth almost as much as I
love everything that I do with the Toyota 4Runner, um, hammered in a trailer park. Yeah, that's, that's the real... I'm innovating the trailer parks near my house with
a Toyota and a rifle, just changing the narrative around it,
changing the narrative around it, um to screaming mainly. So yeah,
(26:47):
throughout right, So throughout two thousand sixteen, uh, and particularly
in the wake of the election, a lot of Facebook
employees began to increasingly express their concerns that the social
network they were pouring their lives into might be tearing
the world apart, because again, most of these are
very nice and intelligent people who don't want to live
on a planet dominated by nightmarish dictatorships and a complete
(27:08):
collapse in the understanding of truth that allows, for example,
viral pandemics to spread far longer than they should have,
because people don't have any sort of common conception about
basic reality UM as a result of the influence of
social media. They don't like that. Like, the people who work at Facebook are kind of bummed out about contributing
(27:28):
to that UM. One observer at the time reported to
The Wall Street Journal, there was this soul searching period
after two thousand sixteen that seemed to me this period
of really sincere, oh man, what if we really did
mess up the world in two thousand sixteen? Yeah, yeah,
I love that We're going from in the forties, like
(27:49):
like the scientist who does this, who does the same thing,
going, now I am become Death, the destroyer of worlds,
an appropriate comment for the thing that he had done,
and then something honestly equivalent in its destructive potential UM.
But the response this time because everything is tacky. Now, oh,
what if we messed up the world, we might have
(28:10):
fucked this up? God, like, starting to think we've severely fucked up the planet... never mind. Like, yeah, this is
why Aaron Sorkin is still working is because people are
saying shitty stuff in shitty ways. Yeah, I don't you
(28:30):
should cut that out. No, no, let's keep it. We should never cut out criticizing history's real villain, Aaron Sorkin, who I call the Pol Pot of cable television. Yeah.
This soul searching did not extend to Mark Zuckerberg, who
after the election gave the order to pour even more
(28:52):
resources into Facebook groups, marking that feature out as emblematic of what he saw as the future of his site. He wrote a six-thousand-word manifesto in two thousand seventeen which admitted to playing some role in the disinformation and bigotry flooding the body politic. So he's like, yeah,
we did, we had we had something to do with it. Um.
He also claimed that Facebook was going to start fighting
(29:14):
against this by fostering safe and meaningful communities. From CNBC, quote:
Zuckerberg noted that more than a hundred million users were
members of very meaningful Facebook groups, but he said that
most people don't seek out groups on their own. There
is a real opportunity to connect more of us with
groups that will be meaningful social infrastructure in our lives.
Zuckerberg wrote at the time, if we can improve our
(29:35):
suggestions and help connect one billion people with meaningful communities,
that can strengthen our social fabric. Since then, fascinating use
of the word meaningful, meaningful, meaningful, meaningful. What happened next
was terrible and predictable and meaningful, Jamie, very meaningful. A
flood of new users got introduced and even pushed into
(29:57):
extremist groups on Facebook. The changes he insisted upon have been critical to the growth of QAnon, which
was able to break containment from the weird parts of
the Internet and start infecting the minds of our aunts
and uncles, thanks mostly to Facebook, which took no action
against it until like a month or two ago. Within
two years, Facebook hosted thousands of QAnon pages
(30:18):
with tens of millions of collective members. I'm gonna quote
now from an NBC News investigation on the matter. Facebook
has been key to QAnon's growth, in large part due to the platform's groups feature, which has also seen a significant uptick in use since the social network began emphasizing it in two thousand seventeen. There are tens
of millions of active groups, a Facebook spokesperson told NBC
(30:38):
News in two thousand nineteen, a number that has probably
grown since the company began serving up group posts in users' main feeds. While most groups are dedicated to innocuous content, extremists from QAnon conspiracy theorists to anti-vaccination advocates have also used the groups feature
to grow their audiences and spread misinformation. Facebook aided that
growth with its recommendations feature, powered by a
(30:59):
secret algorithm that suggests groups to users seemingly based on interests and existing group membership. And growth. And growth. Yeah,
it's funny. One of the things I like about this NBC report, which is partly authored by Brandy Zadrozny,
who has done a lot of great work on this subject,
um is uh, they kind of talk about how profitable
(31:21):
spreading dangerous fascist content is for Facebook. Quote. A small
team working across several of Facebook's departments found a hundred
and eighty five ads that the company had accepted praising, supporting,
or representing QAnon. According to an internal post
shared among more than four employees, the ads generated about
twelve thousand dollars for Facebook and four million impressions in the
(31:42):
last thirty days. Well you have to imagine, like they
have to if if they're doing the math of what
I mean, it has to be financially profitable because it
has to offset the cost of the pr hits that
they know that they're going to take for shit. So there, and again, it's just assigning... yes, assigning a value to, uh, lives and brains. Yeah,
(32:04):
which is a good thing to do. Seems reasonable, Yeah,
it seems fair to me. Yeah. Um so yeah. Outside Facebook,
the only people who really noticed what was happening initially
were a handful of researchers that studied extremist groups. Um
and it wasn't really until like two thousand nineteen that I realized Facebook groups specifically were a problem.
(32:25):
It was obvious that Facebook was the issue, but the, um... It wasn't until a Facebook group kept threatening to kill me
for two years. Yeah that did happen to you? Huh?
That did happen to me. Yeah. If you, uh, listener, heard my My Year in Mensa podcast... What were you doing there? Thankfully, the people threatening to kill you were
(32:45):
just members of Mensa who I trust are not competent
enough to pull off an assassination. I mean, don't challenge them,
but let's hope so. No, this... I'm throwing down the gauntlet. They're like, no, no, no, I don't think they could
do it. Yeah, sorry, so, um yeah. I didn't really
(33:06):
grasp the scale of the problem with Facebook groups in
specific until two thousand nineteen when I started really looking
into the Boogaloo movement UM, and it was kind of
camouflaged because there was just so much fascist content everywhere
on Facebook, um, that the fact that groups in specific were driving a lot of the expansion of fascism in
this country kind of got lost in the noise. But
(33:27):
there were other researchers who started to realize this early on,
and workers inside Facebook realized what was happening right away,
and in two thousand eighteen they held a meeting for Mark and
other senior leadership members to reveal their troubling findings. From
the Wall Street Journal quote, a Facebook team had a
blunt message for senior executives. The company's algorithms weren't bringing
(33:48):
people together, they were driving people apart. Quote. Our algorithms
exploit the human brain's attraction to divisiveness, read a slide from a two thousand eighteen presentation. If left unchecked, it warned, Facebook would feed users more and more divisive content in an effort to gain user attention and increase time on the platform. So that presentation went to the heart
(34:11):
of a question dogging Facebook almost since its founding, does
its platform aggravate polarization and tribal behavior? The answer it
found in some cases was yes. In some cases! I mean, I guess that's technically accurate. In some cases, yeah. So Facebook,
(34:31):
in response to this meeting, starts like a massive internal
effort to try to figure out like how its platform
might be harming people, and Mark Zuckerberg in public and
private around this time, started talking about his concern that
sensationalism and polarization were being enabled by Facebook. And to Mark's credit, he made his employees do something about it. That phrase... yeah,
(34:53):
a little bit to his credit. It's okay, we'll take
away the credit in just a second. Um so quote.
Fixing the polarization problem would be difficult, requiring Facebook to
rethink some of its core products. Most notably, the projects
forced Facebook to consider how it prioritized user engagement, a
metric involving time spent, likes, shares, and comments that for
(35:13):
years had been the lodestar of its system, championed by Chris Cox, Facebook's chief product officer at the time and a top deputy to Mr. Zuckerberg. The work was carried out over much of two thousand seventeen and eighteen by engineers and researchers assigned to a cross-jurisdictional task force dubbed Common Ground, and employees in newly created integrity teams embedded around the company. Integrity teams
(35:36):
sounds good to me. It sounds reliable. It sounds like
they made sure that integrity was accomplished via teamwork. Yeah. Yeah,
So the Common Ground team proposed a number of solutions,
and to my ears, some of them were actually pretty good.
One proposal was basically to um uh like to to
kind of try to take conversations that were derailing UH groups,
(36:01):
like conversations over hot-button political issues, and excise them
from those groups. So basically, if if a couple of
members of a Facebook group started fighting about vaccinations in, like, a group based around parenting, the moderators would be
able to make a temporary subgroup for the argument to
exist in so that other people, like the Zoom breakout room. Yeah,
so that other people wouldn't be exposed to it. Which, I don't
(36:22):
know if that's a great idea, but it was something.
Another idea that I do think was better was to
tweak recommendation algorithms um to give people a wider range
of Facebook group suggestions. Um. Yeah. But it was kind
of determined that doing these things uh would probably help
with polarization, but would come at the cost of lower
(36:42):
user engagement and less time spent on site, which the
Common Ground team warned about in a two thousand eighteen
document that described some of their own proposals as anti
growth and requiring Facebook to take a moral stance. Guess
how that all went. Yeah, Mark Zuckerberg almost immediately lost interest. Um.
(37:03):
Some of this. A lot of this is probably due
to the fact that it would harm Facebook's growth, But
another culprit that, like employees who talked to the Wall
Street Journal and other publications repeatedly mentioned is the fact
that he was all but heard about how journalists were
reporting on Facebook because after the Cambridge analytic A scandal,
they kept writing mean things about him. Mr Well, Mr
(37:26):
Marc always has to ask himself, what would what would
bad haircut Emperor do? And that the haircut Emperor wouldn't
you know, wouldn't slow down? And this ship absolutely not.
One person was familiar with the situation told The Wall
Street Journal, the internal pendulum swung really hard to the
media hates us no matter what we do, So let's
just batten down the hatches. By January, Mark's feelings
(37:49):
had hardened enough that he announced he would stand up
quote against those who say that new types of communities
forming on social media are dividing us. According to the
Wall Street Journal, people who have heard him speak of late, at least, say he argues social media bears little responsibility
for polarization. Now there may be an additional explanation for Mark's shifting opinions on the matter that go beyond being
(38:11):
just greedy and angry about bad press. And that explanation
is a fella named Joel Kaplan. Do you know Joel Kaplan?
You ever heard of this? Dude? I don't know this
Joel Kaplan character. Well, in short, he's the goddamn devil.
Um long, he's the guy that Facebook hired to head
up US Public Policy in two thousand eleven, and he
became the VP of Global Public Policy in two thousand fourteen.
(38:33):
And Joel was picked for these jobs because, unlike most
Facebook employees, he is a registered Republican with decades of
experience in government. This made him the perfect person to help the social network deal with allegations of anti-conservative bias. With as little empathy as possible, I'm sure. Yeah. In
two thousand and sixteen, there's all these rumors that Facebook
is like censoring conservative content that are proven to be untrue,
(38:56):
but the rumors go viral on the right, and so
everyone on the right forever assumes that they were true.
Um And basically, Joel becomes increasingly influential after this point
because he's Mark Zuckerberg's best way out of angering the
right wing, which you actually can't not do because they're
they're always angry and will just yell about everything until they get to kill everyone who isn't them, because that's
they get to kill everyone who wasn't them, because that's
what life finds a way for that. So Joel was
a policy advisor for George W. Bush's two thousand campaign
and a participant in the Brooks Brothers riot, which is
the thing that was orchestrated by Roger Stone to help
hide a bunch of ballots in Florida. He was a part of that. That's the guy who's basically running
(39:39):
Facebook's response to partisanship right now. I have a physical
reaction to that. It's awesome, so upset. He worked in
the White House for basically the whole Bush administration, and
in two thousand and six he took over Carl Rob's job.
So if you visualized Joe Kaplan, he's the guy you
get when you can't get car Old Rove anymore. He's
(40:02):
Mr Carl Rove wasn't when the worst person in the
world is like, I can't do this job anymore, Joel
Kaplan's like, I got you. I got you, famous monster.
Like, I'm trying to scrub some shit over here, infamous
piece of shit Karl Rove. Don't worry, I will continue
your good work. I am Joel Kaplan, and now I
(40:25):
basically run Facebook. And if you google him, Google has
him listed as American Advocate. Yeah, he is an advocate of
I was like, I was like, again, I guess more
specific don't enjoy his face, just put it out there.
Joel is one of the most influential voices in Mark
(40:48):
Zuckerberg's world, and he was one of the most influential
voices in the entire company when the common Ground team
came back with their suggestions for reducing partisanship. As policy chief,
he had a lot of power to approve these new changes,
and he argued against all of them. His main point
was that the proposed changes were, in his words, paternalistic um.
He also said that basically babying people. He also said
(41:10):
that these changes story, yeah, be a daddy story. I
can't there daddy stories that end in a genocide. Robert, Well, God,
if it makes you feel any better, all the genocides
that this is going to lead to, heaven happened yet?
Oh okay, well there you go. He yeah. So. Joel
(41:31):
also said that these changes would disproportionately impact conservative content
because it tends to be bigoted and divisive. Since the
Trump administration was at this point regularly tossing threats at Facebook,
this had some weight. Quote from The Wall Street Journal: Mr.
Kaplan said in a recent interview that he and other
executives had approved certain changes meant to improve civic discussion.
(41:52):
In other cases where proposals were blocked, he said he
was trying to instill some discipline, rigor, and responsibility into
the progress as he vetted the effectiveness in potential unintended
consequences of changes to how the platform operated internally, The
vetting process earned a nickname eat your Veggies No, which
sounds paternalistic to me. Actually sounds like the beginning of
(42:15):
a daddy story that ends in a genocide. Wow. Okay,
we'll get back to Joel Kaplan in a little bit.
For now, we need to talk some more about the
problem of violent... we want to talk about
how the problem of violent extremism on Facebook groups got
completely out of control. So this summer, which was marked
by constant militia rallies, the explosive growth of the Boogaloo movement,
(42:36):
numerous deaths as a result of violent far right actors
showing up at protests with guns, Facebook finally took action
in late September to ban militias from using their service. Because they have to be balanced, they also banned anarchists
from Facebook at the same time, even though anarchists have
not been tied to any acts of fatal terrorism in
recent memory. Um, because you know, you gotta placate the
(42:59):
right wing, because they're the only ones who matter. So let's, let's ban the anarchists who have been
spending the last four years trying to lay out the
individual actors and groups who are members of these militias
that are doing stuff like taking over checkpoints and holding
my friends at gunpoint. We wouldn't want the folks
who were keeping track of them to be able to
use Facebook. That's the wrong kind of disruptive, you see.
(43:21):
That's the wrong kind of disruptive and advocating. You know, that's very similar to what the dude in that trailer said when I was driving my 4Runner through his trailer and shooting towards his children. Not at. Um. And I'll
tell I'll tell you what I told him. What did
What did you say? I'm an innovator, Mark, I don't know.
(43:42):
That didn't really tie into it worked for me. I
could see it. I could see it in kind of
an Ozark-y kind of way. I could see. Yeah. Yeah,
so, uh, Mark, by the way, is on record declaring that Facebook is a guardian of free speech, which is one of the things he said when he noted that he was refusing to fact-check political ads.
(44:04):
And so, anarchists who want to talk about operating a
communal garden or you know, share details about dangerous militias
are the same as militiamen baying for the blood of protesters.
But political candidates spreading malicious lies about protesters who are
being assaulted and killed based on those lies. That is fine.
Back to Facebook growth or anything like that. Let's get
(44:28):
back to Facebook's integrity teams and their doomed quest to
stop their boss from destroying democracy. So the engineers and
data scientists on these teams in chief like mainly like
the guys who are working on the news feed. Uh
they yeah, they accorded to the Wall Street Journal arrived
at the polarization problem indirectly. Um asked to combat fake news, spam, clickbait,
(44:51):
and inauthentic users, the employees looked for ways to
diminish the reach of such ills. One early discovery: bad
behavior came disproportionately from a small group of hyperpartisan users. Um. Now,
Another finding was that the US saw a larger infrastructure
of accounts and publishers that met this definition on the
far right than the far left, UM and outside observers
(45:12):
documented the same phenomenon. The gap meant that seemingly apolitical actions, such as reducing the spread of clickbait headlines along the lines of you won't believe what happened next, uh,
it meant that, like, doing this stuff affected conservative speech
more than liberal speech. Now, yeah, and obviously this pissed
off conservatives. The way that Facebook works means that users
(45:33):
who posted an engage with the site more have more influence.
The algorithm sees if you're posting a thousand times a
week instead of fifty, it likes that engagement because engagement
means money, and so it prioritizes your content over the
content of someone who posts less often. This means that
a bunch of networks of Russian bots and hyperactive users like Ian Miles Cheong, who's a fascist troll who lives
(45:55):
in fucking Malaysia and tweets about how, like, everybody needs to have a gun that they can use to shoot Democrats even though guns are illegal in his country, and, like, did very recently... anyway, total piece of shit... that these pieces of shit, um, who
are actively attempting to urge violence, and who have urged
violence and caused death mobs in other countries. It means
(46:18):
that these people, because they're just shotgunning out hundreds of
posts per day, will always be more influential than local
journalists and reporters who are trying to bring out factually
based information, because it's better for Facebook for a stream
of lies to spread on their platform than a smaller
amount of truth. Yeah. And it also lends itself to
(46:38):
just never like to be releasing contents so quickly that
you couldn't possibly disprove or fact check things fast enough,
because there's just it's just a bullshit machine. Yeah uh.
And you know, Facebook's teams found that most of these
hyperactive accounts were way more partisan than normal Facebook users, and were more likely to appear suspicious, like to engage
(46:58):
in suspicious behavior that suggested either a bunch of people
working in shifts or that they were bots. So these, these
these integrity teams did like the thing that has integrity,
which was they suggested their company fix the algorithm to
not reward this kind of behavior. Now, this would lose
the company a significant amount of money. And since most
of these hyperactive accounts were right wing in nature, it
(47:19):
would piss off conservatives. So you can imagine how this
idea went over with Joel Kaplan. Uh. Since Mark was
terrified of right wing anger, he tended to listen to
Joel about these sorts of things. Joel's daddying and the Eat Your Veggies policy review process stymied and killed any
movement on halting this problem. So, well, how do we
(47:41):
feel about that? We feel we feel great, We feel good.
Glad that's in charge, Glad everyone's eating their veggies. I mean,
even just the dystopian nature of like mobilizing these teams
to be like, hey, I've ruined the world. Do you
think you could stop it before it blows up? Because
this is going to be a real pr issue. Why
(48:02):
would you do that? Best of luck to the team.
There was another case where because basically the only way
to combat this stuff is to have another person Mark
Zuckerberg respects or is at least scared of yelling at
him, um, or you know, talking politely to him. The daddies of the world, saying the opposite of whatever Joel Kaplan is saying.
(48:23):
And there thankfully was someone like that at Facebook. They hired, in two thousand seventeen, Carlos Gomez Uribe, who was the former head of Netflix's recommendation system, which has obviously made a lot of money for Netflix. So this guy,
Carlos Uribe is a big important get for Facebook. UM.
So he gets on staff and he immediately is like, Oh,
this looks like we might be destroying the world, and
(48:45):
so he starts pushing to reduce the impact that hyperactive
users had on Facebook. Um and one of the proposals
that his team championed was called Sparing Sharing, which would
have reduced the spread of content that was favored by
these hyperactive users. UM and this would obviously have had
the most impact on content favored by far right and
(49:06):
far left users. Um And number one, there's more far
right users on Facebook than far left, so that was
going to disproportionately impact them. But the people who mainly
would have gained influence were political moderates. UM. Mr Uribe
called it the Happy Face. That's what he called
this plan, and Facebook's data scientists thought that it might
actually like it might actually help fight the kind of
(49:29):
spam efforts that Russia was doing in two thousand sixteen,
But Joel Kaplan and other Facebook executives pushed back. Because, yeah, and they didn't want to say... because, you know, with Mr. Uribe, you couldn't, like, you had to be careful arguing with him. So instead of saying this will be bad for money or it'll make the right angry at us, Joel Kaplan invented a hypothetical Girl Scout troop, and he asked what
(49:52):
would happen if the girls became Facebook super sharers as
part of a cookie selling program. That sounds like a
metaphor you would do at the beginning of an episode. Yeah,
he was like, basically like, what if these Girl Scouts made a super successful account to sell their cookies? Like,
we would be unfairly hurting them if we stopped these
people who are baying for the deaths of their fellow
citizens and gathering militias to their banner. Okay, I hear you,
(50:16):
but what about fictional Girl Scouts? Yeah, it's awesome. So the debate between Mr. Uribe, um, and Joel
Kaplan eventually did make it to Mark Zuckerberg. He had
to make a call on this one because both of
them were kind of big names in the company. Uh.
Mark listened to both sides and he took the coward's
(50:37):
way out. He approved Uribe's plan, but he also said they had to cut the weighting by eighty percent, which
mitigated most of the positive benefits of the plan. Um. Yeah.
After this, Mark, according to the Wall Street Journal quote,
signaled he was losing interest in the effort to recalibrate
the platform in the name of social good, asking that
(50:59):
they not bring him something like that again. Neat. Two hundred years of peace, Mark. That has big two hundred years of peace energy. Yeah.
In two thousand nineteen, uh Mark announced that Facebook would
start taking down content that violated specific standards, but would
(51:22):
take a hands off approach to policing material that didn't
clearly violate its standards. In a speech at Georgetown that October, he said, you can't impose tolerance top down. It has
to come from people opening up, sharing experiences and developing
a shared story for society that we all feel we're
a part of. That's how we make progress together. So
(51:43):
you know, it's like that is just such a wild
way of saying, like, I don't feel I am accountable
for this, and once again I'm going to delegate this
to the users, the people whose brains I'm actively ruining.
You know what makes progress harder, in my opinion, Jamie? Products and services? No, no. When fascists are allowed to
(52:05):
spread lies about disadvantaged and endangered groups to tens of
millions of angry and armed people because your company decided
sites like The Daily Caller and Breitbart are equivalent to
the Washington Post. This is something Facebook did when, at
Joel Kaplan's behest, it made both companies Facebook news partners.
These are the folks that Facebook trusts to help them
(52:26):
determine what stories are true. Um, they get money from Facebook,
they get an elevated position in the news feed. Um. Yeah.
On an unrelated note, earlier this year, Breitbart News shared
a video that promoted bogus coronavirus treatments and told people
that masks couldn't prevent the spread of the virus. This
video was watched fourteen million times in six hours before
(52:47):
it was removed from Breitbart's page. They removed it presumably
because it violated Facebook policy, and Facebook has a two
strike policy for its news partners sharing misinformation within a
ninety day period. When Mark was asked why Breitbart got
to be a Facebook trusted partner while spreading misinformation about
an active plague that was killing hundreds of thousands of Americans.
(53:08):
Mark held up the two strike policy as a shield. Quote.
This was certainly one strike against them for misinformation, but
they don't have others in the last ninety days. So
by the policies we have, which by the way, I
think are generally pretty reasonable on this, it doesn't make
sense to remove them. That's pretty great, Jamie, that's pretty awesome.
(53:30):
But you know what's even better? Unethical but still legal, ha ha. What's even better about this
is that Breitbart absolutely violated Facebook policies more than two
times in ninety days, and it was covered up. That's
what's even better. Yeah, you have to imagine Breitbart is
violating Facebook policies multiple times a day, like, that Kaplan
(53:54):
had someone hide it. Yeah, that is such... I mean,
it's awesome, it's awesome. I'm going to read actually by
citing an incredible report by BuzzFeed, who, by the way,
all credit to BuzzFeed. BuzzFeed, and I think you know BuzzFeed,
And I've cited a number of great articles, including that
one from The Wall Street Journal, which is really important.
(54:15):
BuzzFeed has probably been of all of the different media
companies the most dedicated in, like, hounding Facebook like a
fucking dog with a groin fetish. I don't know how to.
I'm very proud of BuzzFeed's reporting on Facebook. Thank you
for keeping on this one, y'all. Good work from my head.
(54:35):
But yeah, I'm gonna quote from this report on the
fact that uh Facebook fraudulently hid the fact that one
of their information partners was violating their own policies and
spreading disinformation about an active plague. And then, and
then you need to take an adbreak, just you know,
I'll take an ad break now. Well, we'll get we'll
get we'll get to this afterwards. Because if there's one
thing that prepares me to hear about how democracies, both
(55:00):
in the nation I live and around the world are
being actively murdered for the profit of a man who's
already a billionaire, there's one thing that makes that easier
to take. It's products and services. It's the sweet lullaby
of a product or service. Ah, Nothing nothing keeps me
going gets me intellectually hard like a product or a service.
(55:21):
I want to be surrounded. I want to die surrounded
by my most beloved products and services. I have a
feeling that you will, because there's a good chance that
a horrible wildfire will sweep through the city you live in.
And sorry, that's getting too dark mine too, maybe, yeah, yeah,
as it to say, I'm like, hey, as long as
we're on the same same page there, that's great, And
it's okay if we make it out of that fire.
(55:43):
Facebook cool. And sure there's lots of armed and misinformed
militias waving guns wildly in the areas we attempt to
evacuate through. Well, as long as my death will have
been completely in vain. Yes, that's what Facebook promises for
all of us, and that's what products and services promise
for all of us. Here we go, Oh, all right,
(56:06):
we're back. So we're talking about how Facebook covered up
the fact that Breitbart was repeatedly spreading disinformation that
should have gotten them removed as a trusted partner quote
from BuzzFeed. Some of Facebook's own employees gathered evidence they
say shows Brightbart along with other right wing outlets and figures,
including Turning Point USA founder Charlie Kirk, Trump supporters Diamond
and Silk, and conservative video production nonprofit Prager University has
(56:29):
received special treatment that helped it avoid running a foul
of company policy. They see it as part of a
pattern of preferential treatment for right wing publishers and pages,
many of which have alleged that the social network is
biased against conservatives. On July, a Facebook employee posted a message
to the company's internal misinformation policy group noting that some
misinformation strikes against Breitbart had been cleared by someone at Facebook,
(56:53):
seemingly acting on the publication's behalf. A Breitbart escalation marked
urgent, end of day, was resolved within the day, with
all misinformation strikes against Breitbart's page and against their domain
cleared without explanation, the employee wrote. The same employee said a
partly false rating applied to an Instagram post from
Charlie Kirk was flagged for priority escalation by Joel Kaplan,
the company's vice president of Global Public Policy. Now, the
(57:17):
whole article itself details just a ton of other instances
of this, and it's all incredibly shady. I'm not going
to go into all of it in tremendous detail because
we are running out of time. But if you read
the article, it's extremely clear that Joel Kaplan is directing
Facebook to actively violate the company's own policies in order
to keep right wing bullshit peddlers spreading lies on the
(57:38):
platform for profit. Kaplan has faced no punishment for this,
although his behavior did provoke outrage from employees in
Facebook's internal chat system. Applied to Daddy, that's how it goes.
Facebook employees have been getting angrier and angrier at this
sort of thing throughout the year. Remember back in May
when President Trump posted this message to Twitter and Facebook
(58:00):
quote, There is no way, zero, that mail-in ballots
will be anything less than substantially fraudulent. Mailboxes will be robbed,
Ballots will be forged and even illegally printed out and
fraudulently signed. The governor of California is sending ballots to
millions of people. Anyone living in the state, no matter
who they are, how they got there, will get one.
That will be followed up with professionals telling all of
these people, many of whom have never even thought of
(58:20):
voting before, how and for whom to vote. This will
be a rigged election. No way. I do remember that,
Robert. Twitter, too. Again, it's unbelievable. Like, like, the mildest
level of credit I could possibly give someone, that level of credit.
Twitter fact checked the president's tweet, which was not nothing,
(58:42):
and that's all I'll say about it. Marginally, that does
not qualify as nothing. Again, that qualifies as the most
responsible action that a social media CEO took. Mark,
on the other hand, refused to let his employees do
anything similar, allowing the president's flagrant misinformation to circulate
on his network. This enraged employees, and they got angrier
(59:03):
when his, when the Looting Starts, the Shooting Starts post
was left up. They created a group in Workplace, their
internal chat app, called Let's Fix Facebook, parentheses, the company.
It now has about ten thousand members. One employee started
a poll asking colleagues whether they agreed quote with our
leadership's decisions this week regarding voting, misinformation and posts that
(59:25):
may be considered to be inciting violence. A thousand respondents
said the company had made the wrong decision on both posts,
which is more than twenty times the number of respondents
who said otherwise. So Facebook employees after this staged a
digital walkout, uh, and they, like, changed their Workplace
avatars to a black and white fist and called out
(59:45):
sick, uh, en masse, hundreds of them and stuff. Um,
and you know, yeah, I'm gonna quote from BuzzFeed again here.
As Facebook grappled with yet another public relations crisis, employee
morale plunged. Worker satisfaction metrics measured by micro-pulse
surveys that are taken by hundreds of employees every week
fell sharply after the ruling on Trump's looting post. According
(01:00:06):
to data obtained by BuzzFeed, on June one, the
day of the walkout, about forty five percent of employees
said they agreed with the statement that Facebook was making
the world better, down twenty five percentage points from the
week before. That same day, Facebook's internal survey showed
that around forty four percent of employees were confident in
Facebook leadership leading the company in the right direction, a
(01:00:27):
thirty percentage point drop. Responses to that question have
stayed around the lower mark as of earlier this month,
so pretty significant drop in faith in the company from
its employees. Um, and yeah. Zuckerberg, the ultimate decision maker,
according to Facebook's head of communications, initially defended his decision
(01:00:47):
to leave Trump's looting post up without even hiding it
like with a warning, like Twitter did. Mark stated, quote, Unlike Twitter,
we do not have a policy of putting a warning
in front of posts that may incite violence, because we
believe that if a post incites violence, it should
be removed, regardless of whether or not it's newsworthy, even
if it comes from a politician. So you have to
wait for there to be violence, and study it, and then
(01:01:08):
be like, oh, it turns out that post was actually
very bad and we should take it down. The amount
of bodies that he needs attached to
do a single thing is staggering. Four days later, Mark
backtracked. From BuzzFeed, quote: In comments at a companywide
meeting on June two that were first reported by Recode, Facebook's
founder said the company was considering adding labels to posts
(01:01:31):
from world leaders that incite violence. He followed that up
with a Facebook post three days later, in which he
declared black lives matter and made promises that the company
would review policies on content discussing excessive use of police
or state force. What material effect does any of this have,
one employee asked in Workplace, openly challenging the CEO. Commitments
to review offer nothing material. Has anything changed for you
(01:01:53):
in a meaningful way? Are you at all willing to
be wrong here? Mark didn't respond to this, but on
the sly, nearly a month later,
he posted a clarification to his remarks, noting that any
post that is determined to be inciting violence will be
taken down. Employee dissatisfaction has continued to swell over the
course of the summer. One senior employee, Max Wang, even
(01:02:13):
recorded a twenty four minute long video for his colleagues
and BuzzFeed, in another article, has all the audio
for this. It's worth listening to. In the video, Max
outlines why he can't morally justify working for Facebook anymore,
and he's a pretty early employee. I think his video
quotes at length from books on totalitarianism by Hannah Arendt,
who was one of like the great scholars of the Holocaust. Yeah.
(01:02:36):
He shared the video on Workplace with a note that started,
I think Facebook is hurting people at scale. Yes, yes, yeah, yes,
it is, absolutely, yeah. Yeah. Like Emperor Augustus, who had
members of his own family killed for disobedience, Mark did
(01:02:57):
not like being questioned and, gasp, disapproved of by
his own employees. On June eleventh, he hosted a live
Q and A where he delivered a message to employees
who were angry at his enabling of hideously violent fascist rhetoric.
This is, a lot of this is, like, I think, in
response to, yeah, the killings and such. I've been very
worried about the level of disrespect and in some cases
of vitriol that a lot of people in our internal
(01:03:19):
community are directing towards each other as part of these debates.
If you're bullying your fellow colleagues into taking a position
on something, then we will fire you. Mm hmm, well,
uh, good. You know, the, the amount of consistency, I mean,
you gotta appreciate that. I'm really glad that employee,
I mean, just spoke directly about it, because it's like, at
(01:03:41):
what point, truly, what do you have to lose, Like
I guess except for your life depending on how Mark
Zuckerberg wants to go about it. But I mean, it's
I don't know, it's it is. It is so frustrating,
even though it's like I don't know what else to
do other than, you know, whatever, some, some shit in Minecraft.
But because people are continually waiting for this person and
(01:04:04):
this company to act in the best interest. It's like,
it's not when has it ever happened? Name a time,
even in the face of like the most brutal public disapproval.
There's too much. It's amazing. As you're saying all this,
and as I just finished the thing that I'm saying,
a Bloomberg story just dropped. Like as we were recording
(01:04:25):
this episode, I'm just gonna read you. I haven't read
the story. I'm just gonna read you the title. Facebook
accused of watching Instagram users through cameras. No fucking rules.
Oh my god, it's so good. Have we talked about
that before though, because I have. I've had that issue
with Instagram before where I'll close out Instagram and then
(01:04:48):
you'll see the uh, the little section of your iPhone
in the top left where where it indicates that you're
being recorded. Uh, it goes, it turns, like, when I... Listeners,
let me know if you've had a similar issue. Sometimes
when I close Instagram, it looks like my phone just
stopped recording, but it goes away really quickly. It's like
(01:05:09):
in a millisecond that it's... It happens all
the time. So that is not shocking at all. It
bury doo. I don't know what I'm going to actually
title this Um, I don't know what I'm actually gonna
title this episode. The working title that I started this
under was Mark Zuckerberg needs to be tried in the
(01:05:30):
Hague and hung in public until dead. But I don't
think legal is gonna let me go with that title.
I think it's clickable. I think it's clickable as Yeah,
I think you'd get great engagement on that. Um, I
mean that's what he'd want. Yeah, we may have to
go with a different title. I mean, I'm not urging
illegal behavior. I'm urging that he be tried in
(01:05:51):
the International Criminal Court and then, once convicted, hung
by the neck until dead for his crimes, um,
which is what you do when a world leader commits genocide,
right right, Yeah, it's true, But I probably won't title
the episode that. I don't know. I mean, I'm glad
you put it out there, though, let's let's not take
(01:06:12):
it out of the running. Yeah, there's a number of
other options. I mean, Mark Zuckerberg continues to disrupt two
hundred years of peace. There's so many options. I can't
wait for the two hundred years of peace that only
involved dozens of wars. Yeah, I mean, if
the two hundred years of peace began in two thousand
(01:06:33):
and four, imagine how much peace we have to look
forward to. I think it's similar to the amount of
peace I brought to that trailer park. There, this, this, this,
you're you're you're a little sick. Oh, you're a little
sick of it. I know it, I know it. Well, we're
(01:06:53):
all gonna be fine. Fine, we're gonna be great. Yeah,
you want to plug some shit? Yeah, yeah, thank you
for disrupting my life and inner sense of peace once again. It's
always being disrupted. It's always been... You've always
(01:07:17):
been a huge disruptor. And uh yeah, you can follow
me on Twitter or Instagram, which is watching me right now. Uh.
And then if you want to contribute to a candidate
that I love, uh, Fatima Iqbal-Zubair, we're doing a
live read of the Twilight script this Friday evening, five
(01:07:41):
pm Pacific. That sounds very exciting. It is something to
do to distract yourself from the void. See you there,
see you there, and I am going to be uh um,
(01:08:03):
you know, he's lost. Doing the thing that I normally do,
which is staring into the Abyss and going, hey, hey,
quit being an abyss. You're really bumming us all out, Abyss.
And the Abyss is like... You and the Abyss have
great chemistry. We do, we do. And the Abyss has
made me a lot of money, a lot of money,
which is something that I feel very very conflicted about.
(01:08:26):
The Abyss is rich. That's the thing that Nietzsche missed
is sometimes when you stare into the Abyss you get
a six figure salary because it's incredibly profitable to talk
about the Abyss on a podcast. Yeah, the Abyss has
facial recognition software and it's pretending to be me elsewhere. Yeah, Jamie
the Abyss Loftus, that's what I've been called. You can
(01:08:51):
follow us on Twitter and Instagram, where you're probably being
watched, at BastardsPod. You can follow Robert on Twitter
at IwriteOK. You can buy stuff from our
TeePublic store and also the Bechdel Cast TeePublic store, uh,
where Jamie designs all the artwork for that and it's
amazing. And yeah, I think I covered everything. Wash your hands,
(01:09:16):
wear a mask. Yeah, Robert, did I show you my
bedazzled bolt cutters. I'll send you a picture of them
my bolt cutters. No, I would love to see your bedazzled
bolt cutters. Yeah, I have. I have a pair of
bolt cutters that are still usable but also mostly covered
in rhinestones. I'll send it to you. Yeah, that's
(01:09:38):
the episode. Hell yeah,