
June 5, 2020 47 mins

We are living through pretty surreal times... Most of the country is still shut down due to COVID-19. Over the last few days we’ve seen protests and riots across the country following the police killing of George Floyd, an unarmed black man. All while Facebook faces a crisis of its own: internal and external revolt in response to the company’s inaction towards President Trump’s inflammatory posts.

Barry Schnitt was Facebook's Director of Communications for four years, and his recent blog post criticizing the company's stance on free speech has gotten a lot of attention — especially since Facebook employees, even former ones, are normally so tight-lipped. He talks to Laurie about why both current and former employees are speaking out.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio. What's happening now
is the culmination of things that have been happening for
a long time. You know, for a really long time.
I've been looking at the impact that Facebook has been

(00:21):
having on the world with some dismay. And actually, I've lost some sleep over it, you know, lying in bed at night, looking up at the ceiling and kind of thinking, what have I done here? We're living
through some pretty surreal times. Most of the country is
still shut down due to COVID nineteen. Over the last

(00:44):
few days, we've seen protests and riots across the country
following the police killing of George Floyd, an unarmed black man.
All while one of the biggest tech companies on the planet, Facebook, is dealing with its own turmoil. It's not a coincidence. It boils down to this: many believe President Trump's use of social networks makes situations like these worse. In this case,

(01:08):
critics say some of his latest social media posts could
incite violence and spread misinformation. Both Twitter and Facebook have
taken different approaches. In some cases, Twitter has chosen to
restrict or add warning labels to the president's more inflammatory posts.
Facebook CEO Mark Zuckerberg has chosen a hands-off approach. He
says it's not the company's place to restrict free speech,

(01:30):
and these posts don't violate Facebook's policies. But this comes
down to a larger and incredibly important conversation about the
limits of free speech and the role that tech companies
play in civil discourse, at a time when the stakes could not be higher. Over the last few days, employees
at Facebook have staged virtual walkouts. They voiced their opposition

(01:52):
to the policy publicly, some have even quit. Now, I've been covering Facebook for almost a decade. I've interviewed Mark Zuckerberg several times. This is a company that keeps things
pretty close to the vest. Employees don't often speak out
this vocally, So what we're seeing here, to use an
overused word lately, is unprecedented. And yes, I know we

(02:16):
just wrapped up season one of the show, but this
is simply too important to ignore. So my guest today
on this bonus episode of First Contact is Barry Schnitt.
He was Facebook's director of Communications for four years from
two thousand and eight to twelve. These were pivotal years
for Facebook, and he just published an article on Medium
criticizing the company's position. The company was facing very different

(02:39):
challenges back when he worked there. But I think his
perspective is interesting and important, especially as other people are
voicing similar thoughts. I'm Laurie Segall, and this is First Contact. First of all, how are you doing? I'm well. How are you? I'm good. I'm, you know, look, I'm as

(03:01):
well as can be. Um, we call this First Contact.
And I was trying to think because I've covered Facebook
for a really long time, and you worked at Facebook
during four very very important years. So I don't remember
if we've been in contact, but I feel like we
must have had a contact at some point. Yeah, I'm

(03:21):
I have to believe that at some point I sent
you a statement about some privacy controversy or something at
the very least. But you know, I'm sure you were
dealing with so many people there, and I was dealing
with so many journalists, you know. But I know I
knew your name and I know who you are for sure. Well,
and so take me back to your role. I mean,
you were at Facebook for four very important years. Right

(03:42):
back, it was two thousand eight to twelve. What was your role there? Yeah,
so my boss at Google, Elliot Schrage, moved over to
Facebook and he said, you know, we're doing some really
interesting stuff here and you should come. So I did,
and my role was in communications and some public policy
work around privacy, safety, security, and content issues. And so

(04:07):
what that ended up meaning was pretty much every controversy
and crisis that Facebook dealt with during that time. I
was a spokesperson for it, and I was working behind
the scenes to try and figure out not just what our communications response was, but also what our
kind of substantive response was to it. Yeah, it was
almost like, because I remember covering those days, like Facebook

(04:29):
was growing at lightning speed during that time, and there
were so many things that were happening with privacy, with
the switch to mobile, Facebook was, you know, going public,
launching a mobile app. It was just acquiring all these companies. Um,
you were kind of on the front lines. Yeah for
some of that, for sure. Yeah. And it

(04:49):
was a very exciting time and lots was happening, and
it was also a very stressful time. But you know, I'm proud
of a lot of the work that we did then. Yeah,
and part of why I wanted to have you on today: you wrote this piece on Medium, you know, speaking on Facebook's policies on free speech, and what a
fascinating moment to be having this conversation, and what an

(05:11):
important moment to have this conversation. UM. A lot of
this is happening in the news now, but it almost feels like we've been coming to this moment for a very long time. You know, I think over the
last couple of weeks, it was a decision Zuckerberg made
to not you know, put a warning label on a
Trump post. Jack Dorsey for Twitter made the decision to

(05:34):
put a warning label on it, and it's really put into focus, I think, some of these issues that a lot of folks are pretty concerned about at the heart of social media. And what's very unique for our listeners is
like having covered this company for a really long time,
people don't really speak out. It's very rare to see

(05:54):
people collectively speak out and really say things and kind of organize and come together as employees at the company. It's a very tight-lipped company, and we're really
beginning to see almost like a sea change of behind
the scenes people beginning to talk about these things. So
maybe we can start with what, you know, what was
the premise of what you wrote, and why did

(06:14):
you decide to write it? Well, yeah, there was actually a lot of soul-searching for me, you know.
And, uh, I think you described it pretty accurately, in that what's happening now is the culmination of things that have been happening for a long time. And it's not just about Trump, um, at least not for me. But you know, for a

(06:36):
really long time, I've been looking at the impact that
Facebook has been having on the world, um, with some dismay. And actually, I've lost some sleep over it, you know, lying in bed at night, looking up at the ceiling and kind of thinking, what have I done here? Because, you know, I joined Facebook with the idea of, you know,

(06:57):
changing the world for the better. And I think, you know,
what you're seeing in terms of employees and former employees
speaking out is because they don't just care about Facebook,
but they care about the world and that's why they
worked at Facebook. And when they're seeing the potential that Facebook has to negatively impact the world, they want to do something about it. And they feel like Facebook can do something about it. And that

(07:19):
was part of the premise of my writing is that
you know, Facebook, in the time that I worked there,
in the time before and the time since, has overcome
you know, tremendous challenges. And every single time they rise to it and overcome, you know, whether it's facing Google, which was a behemoth, or MySpace, which at the time was a behemoth, or, you know, you mentioned the change to mobile. You know, these are things that they took from nothing and made a tremendous

(07:42):
success out of them. And I think they have the
opportunity to do that here with you know, actually being
a force for you know, information and for understanding and
for truth. But they're not doing that right now. And so my goal was actually to try to
rally them to that end because I know that they
have the ability to come up with some really innovative
solutions that could have an impact on the world. And

(08:03):
I think they don't necessarily need to restrict free speech. And indeed, I think that's a false choice; those are the words I used in my writing, and I believe that strongly. Yeah, I mean, can you give
us your argument about because you know, I know Zuckerberg really,
more so than any tech founder I know, really digs his heels in when it comes to this argument on

(08:24):
free speech. You know, when Trump was posting, um, it was a post that said something like, when the looting happens, the shooting happens, or something, and people were very concerned this was going to promote violence.
But Zuckerberg's argument is that Facebook will not be
the arbiter of truth, um, and that it's a slippery slope.
This has always been the company's argument, um, and it's

(08:47):
not changing now. What you say in this piece is that a lot of things have changed since the time that you guys drew up those community standards. That words have more meaning and are more powerful in a different way, because a lot of things have changed. Can you explain that to us? Yeah. So,
you know, in two thousand eight, you know, there were

(09:08):
a lot of discussions about how to handle speech on Facebook, and the main conclusion was, you know, Facebook is going to have a hands-off approach to it, and I think that made sense in two thousand eight. You know, one, there were the professional arbiters of truth, and I believe the press are those and have been for centuries, who were much stronger and had much more distribution. I think Facebook was growing, but

(09:31):
still relatively small, and that's changed, you know. And Facebook was not a source where people looked for news and information. You know, it's a place where you looked for photos from your friends or, you know, funny memes, etcetera. And all of that has changed, you know, dramatically, in that, you know, newsrooms have been decimated, you know, the economics of news have changed dramatically.

(09:52):
Facebook is a news and information source for literally billions
of people. And I don't think a decision made with
the variables in two thousand and eight still holds in
two thousand twenty, especially if you look at the consequences.
You know, everything from Brexit to elections here in the US, to elections around the world, to health information during this

(10:15):
pandemic are all being threatened by disinformation that is found
and hosted by Facebook. And I just don't see how
you can look at the consequences of the decision you
made more than a decade ago and see how dramatically
bad they are and say, yeah, that was definitely still
the right decision today. Right, I mean, it's

(10:36):
a pretty powerful statement to have been an employee somewhere
years ago, right, and to say that you've been losing
sleep over decisions you made. I mean that is a
pretty powerful thing to say. Yeah. What are you losing sleep over, specifically? Well, I mean, I can bring it to, you know, just a few weeks ago. You know, if you saw this Plandemic video,

(10:56):
which was a very slick, highly produced piece of gross
misinformation about the current pandemic, and it was liked two
and a half million times on Facebook, which means it
was seen by many more people. And so that's literally
millions of people who were misinformed about a current health crisis. Now,

(11:17):
I believe that many of those people will make a
decision based on what they saw on that video through
Facebook that will be detrimental to the health of themselves
or their family. You know, I don't think it could get more serious than that: people will die because of something
they saw on Facebook. And yeah, you're right, I lost

(11:37):
sleep over that, and I continue to. You know, you talked before about a key moment, and you mentioned intent bias in this piece. When you're talking through some of these issues, can you explain that? Yeah. So,
I mean when I worked at Facebook and other tech companies,
you know, you build these products, and you think you know what they're gonna do, but you never
(11:59):
think you know what they're gonna do, but you never
really know until they're out in the world and people
are using them, and they always use them in ways
that you didn't intend. And sometimes that's good and sometimes
that's bad. But you know your intention is good. You know,
you had the intention to give them free speech or
allow them to share, or allow them to create, etcetera.
And so your intent was good. And then you look
at the outcomes and hopefully there are some good outcomes

(12:21):
and there are bad ones, but because of your intent, you focus
more on the good outcomes than the bad. And I
think that's been happening at Facebook for years, and I
think it's happening today, in that you see them tout the stories of success, you know, whether it's selling products or the organizing of groups. And I provide examples. You know, they touted this Sisterhood of Truckers, which I think

(12:41):
is amazing. But at the same time, you know, Mark
looks at all of that, and then he says, I
don't see how Facebook could have impacted an election, Like
he's too smart to not see that those things are
the same. You can't have all of this good and
organizing and people changing their minds in a good way
and then not have the same thing happen in a
bad way. So then, having worked closely with him, and

(13:04):
having worked closely at, I would say, the company, and being kind of the lead on communications, a lot of it crisis communications: you say he's too smart for that, then what do you think it is? Well, you know,
I haven't talked to Mark in you know, nearly a decade,
and so I can't know what's in his head. But
you know, the two things that my guesses are: one

(13:26):
is the intent bias. I think he continues to look
at all of the good that Facebook is doing in
the world, and I agree with him that there is
a lot of good, and he says that it's more
than the bad. I would say that's not good enough,
and I would say the bad is continuing, is growing,
and that that ratio is changing in a way that I think is bad for the world. And two, you know,
when I worked at Facebook, there was something that Mark

(13:47):
used to say a lot, and he had a chart and everything, and he would continually say we're one percent done. And that was in response to people being too conservative, and people working to protect what we'd already achieved rather than taking risks to achieve what was left. And I feel like maybe for
him and others at the company, that ratio

(14:09):
has changed, and that maybe they're feeling like there is
more to protect than there is to achieve for Facebook,
and and that Facebook maybe is too important in the
world to risk. But I would argue that if
we save Facebook but the world burns, that we've made
the wrong trade off. Okay, we've got to take a

(14:31):
quick break to hear from our sponsors. More with my guest after the break. It's a nuanced argument, too, right,

(14:51):
Like it'd be too easy to simplify this and say, well,
you should take this down. Right Like you look at
Jack Dorsey, you know, he's in the line of fire with the president right now. And now, having put some of these labels on Trump's tweets, people are calling on Twitter to put labels on all these other world leaders' tweets, you know, and we're wondering, well, why

(15:13):
does this get a label? And this doesn't get a label?
And for me, having covered tech throughout the years, it certainly seems like sometimes you get very confused about who's
making the decision, why the decision gets made, and those
standards don't seem to always apply in the same way,
and they change quite a bit. So with Facebook and

(15:34):
the argument of it's a slippery slope, can you see
it from the other side, Like, how do you weigh
that argument right now, in this current moment, that
this could lead to censorship, it could lead to a
lot of other unintended consequences for the platform. Yeah, well,
the slippery slope, it's such a funny, you know, buzzword that, you know, people invoke in all kinds of situations. And I've heard it in meetings for, you know, decades,
(15:57):
of situations. And I've heard it in meetings for you know, decades,
and I've come to realize that in a lot of
ways it's a cop out. In some ways it means, yes,
we know the right thing to do in this situation,
but we're not going to do it because we don't
know the right thing to do in some of these
future situations. And if that's the case, you know, that again feels like a cop-out. You know,

(16:17):
I know that the decision that I'm going to make
here is wrong, but that's okay because I don't know
what to do about these other situations. And I make
the argument that Facebook is too smart, has too many resources,
you know, too much innovation, too much technology to just
use a slippery slope as a way of not doing the work to figure out: yes, we know the

(16:39):
right thing to do, and let's work towards figuring out
what the right thing to do on these other situations
will be, and make those decisions as we go,
and it will be hard, and there will be some
inconsistencies and they will make mistakes. But I think all
of that is less bad than the current situation, which
is rampant misinformation, rampant divisiveness, rampant incitement of violence,

(17:03):
and I think it's worth the risk. You know,
It's interesting because when Zuckerberg testified in front of Congress
for the first time, I remember I was there and
day one was kind of the senators just asking like
random questions about the Internet and it was kind of like, uh,
you know, I remember everybody was kind of like, Okay.
The takeaway here is also that you know, the government

(17:24):
needs to educate themselves when it comes to technology. But
day two, I remember thinking, I was like, day two is really interesting, because you had Zuckerberg in front of a lot of House members who were all asking him about taking down content (this was, I think, a couple of years ago now, right), and who were all talking about, did they have a liberal bias? And

(17:48):
you know, now we have another backdrop, which is regulation and the power of these big tech companies. And before this whole pandemic happened, we had the conversation about, are these companies monopolies? Which, you know, I think we've pushed aside just a teeny bit, because there are so many other huge things happening in the world. But these companies are under a
lot of pressure right now. And I make no judgment

(18:11):
either way. I think, you know, with Zuckerberg, having interviewed him many times and seeing some of this stuff, I don't get the sense that it's all just political, right, or that he's only doing it all for the money. I actually think if you meet him, it's really different. But
do you think that part of this could be outside
forces too? I mean, and not just with this, but

(18:31):
with a lot of the decisions that Facebook is making
and a lot of the pressure that they're under right now. Yeah, no, I think there are outside forces at play here. And that's kind of what I was meaning to talk about with the company working more to protect than to
take risks to achieve, is that I think those outside
forces are real and scary. You know. I don't think

(18:54):
any company wants to draw the ire of the president.
I mean, I think he's shown that he will use
whatever government levers he has at his disposal to make things difficult. I think Jeff Bezos is an example. You know, the work that he's doing against Section two thirty, that, you know, has been described as the twenty-six words that created the Internet. He basically wrote an executive order to try to rescind that, you know. And

(19:17):
that would be a dagger to the heart of all
Internet companies, I believe, and so I don't think it's
the right answer, but it's an example of not wanting to get on the bad side of the president
in this current administration, and I think that is playing
very much into some of the decisions that the company
is making. And I don't think they're being honest with themselves, or certainly with the public, that that's what's

(19:39):
at play. You know, I think it's important to say, and we should mention, that a lot of other employees are speaking out. Some have resigned, some are speaking out kind of behind the scenes. We just obtained a letter that folks wrote. It was a lot of early employees that wrote it. You know, all these employees, and I think this is

(20:00):
a really important point, aren't saying Facebook is terrible, Facebook is bad. I'll read a little; I thought this was so powerful. This is from the letter that a lot of early employees, and I'm assuming many who you worked with, got together and collectively wrote. And correct me if I'm wrong, this was a letter to Mark, right, that they wrote, and it was published in

(20:22):
the New York Times, but it was kind of more of a petition talking about the standards. They wrote: As early employees on teams across the country,
we authored the original community standards, contributed code to products
that gave voice to people and public figures, and helped
to create a company culture around connection and freedom of expression.
We grew up at Facebook, but it is no longer ours,

(20:43):
you know. I think that's such a, I gotta say, and maybe this is me being a little inside baseball as someone, you know, who's looked at this company since, I would say, two thousand nine or probably around two thousand ten. But, you know, seeing a lot of these names, and I want our listeners to understand the names at the bottom of these, and even you, you led comms. Like, no one

(21:06):
at Facebook spoke like this or spoke out like this
for a very long time. So I think it's a
really big deal that people are really beginning to question some of these decisions. I agree. You know, I know most of those people. There are a lot of people smarter than me who signed that letter, and they're dear friends. And, in fact,

(21:27):
just coincidentally, I sent my, you know, draft post to one of them, said, hey, I'd love your feedback, and they said, well, that's a coincidence, we happen to be working on something of our own. But yeah, I mean, I think collectively my experience of lying awake at night thinking, you know, what have I done, is not unique. And I think, you know, seeing the
impact that Facebook has had on the world and and

(21:49):
being proud of it for a very long time and
then having that gradually become, you know, forms of shame and dismay, is pretty powerful. And, you know, these are not, you know, people who are cranky critics, the kind of people who always criticize Facebook about both doing too much and too little,
being too far to the right and too far to

(22:09):
the left. You know, these are people who joined the
company willingly and poured their hearts and souls into it
for years and are really shocked and dismayed about, not what it's become, but the impact it's having on the world. And I make a distinction there, because I don't think it's become evil. You know, I got that response from some people, you know, Facebook is evil, and I said, well, you

(22:31):
know, that's not a solution. You know, if you have a proposal, I'm all ears. You know, let's provide some action.
But I do think Facebook is not understanding the negative
impact that it's having on the world. And I think
it is not paying enough attention to potential solutions and
certainly not putting enough effort towards developing those solutions. And

(22:52):
I wish I knew the exact answer, but I don't. But I do know that Facebook has the ability to come up with those answers. And I think that's part of what's so dismaying to people who used to work at the company, is that we know that Facebook can rise to these challenges and has, you know, limitless possibilities, and for some reason it feels like they're not trying to do that, and we just

(23:13):
don't understand why. Well, what do you think a solution is? I know they've created projects for journalists, they have an outside, almost editorial board for content now. So they have done quite a bit over the last few years in response to some of the criticism they've received on content decisions, with everything happening in journalism. I mean, what do you

(23:34):
think is the solution? Well, I'll answer the opposite question first. I'll tell you what I don't think is a solution. You know, I know Facebook somewhat, and I'll tell you one thing that I believe, which is that Facebook doesn't outsource things that are really important to the company. Never has, never will. And everything that you mentioned is effectively

(23:54):
outsourcing. The journalism project: hey, here's some money, you guys go do some interesting work. Fact-checking: hey, third parties, here's some money, go do some interesting work. The board: hey, third parties, why don't you make some decisions for us, because we can't make them. And the amounts of money
that they put towards these efforts sound big, but they
are rounding errors in the Facebook universe. And

(24:16):
I think if they were serious about a solution to misinformation,
to the incitement of violence, and just to coming up
with a new way to treat content, that they would
do something internal. They would devote engineers to it. And that's, you know, having worked at Facebook, the number one signal for whether Facebook thinks something is important: how many top engineers are working on it.

(24:38):
And for everything that you described, I would argue the number is roughly zero. And that's probably a little hyperbole, because there are definitely engineers working on these related problems.
But the things that they're touting as potential solutions to
this are not actual solutions. They are band-aids. And, you know, as I write, I think we're actually hemorrhaging truth and civility on Facebook, and

(25:01):
these are a start, but they're just at the margins. And I think they need to devote significant resources. I propose, you know, kind of a
symbol in my writing that they just suspend their stock buyback, which they've committed another fourteen billion to doing. That's the kind of resources that this is
going to take. And I don't think they literally need

(25:23):
to find fourteen billion dollars. They have the money. Mostly
what they need to find is the will. And again,
I don't know exactly what they need to do, but
I know they need to commit to doing it, and
that's not even something they're willing to do thus far.
You know, it's interesting. I remember interviewing Facebook's former head of security, Alex Stamos, who was there, for context, during the election interference; it

(25:45):
was their team that discovered Russian influence. And something he said to me, this is for a documentary I did on Facebook, something he said to me was, you know, for a very long time, they had more engineers on the growth team, the unit was bigger and the building was bigger, than for the security team. So I think
that's an interesting point you make about engineers. And

(26:07):
to give a sense, like, these are human problems, right, and you're talking about technology. But the real problem, and this is not just Facebook, this is maybe beyond Facebook, for a lot of the bigger tech companies in the time I covered them as they grew into these huge companies, is that I don't think they anticipated. When you talk about

(26:28):
intent bias, I think there was an inability to look
at the messy, complicated human problems that would happen in
some capacity, right. Yeah. And, you know, the intent is important, because I think their intention is good and their hypotheses were not crazy, right. I mean, what
Mark says is, you know, we think people should decide, right,
But you know, I think we can agree there is

(26:50):
truth and there are lies. You know, there is civil discourse and there's inciting violence. And I think we would
all agree that truth should get more distribution and attention
than lies, and we should agree that civil discourse should
get more attention than inciting violence. And what Mark would say is, yes, and people will figure that out and they will decide for themselves. But, A, that's not

(27:11):
what's happening; and, B, some people aren't equipped to discern; and, C, you know, there are very powerful forces that are deliberately trying to trick them into thinking that one is the other. And for Facebook to see all of that and throw up its hands and say, no, we're just free speech and the people decide, I think is wrong.

(27:34):
And that's just what I'm trying to get them to realize.
Right, looking at this letter that a lot of the early employees, and, you know, some of the early employees, these are early architects of Facebook, but there are all sorts of people who signed this, who co-created this letter, and we'll put it in our show notes.
I would suggest people read it just because whether you
agree with it or not, it's really an interesting look

(27:57):
at, I think, how people are viewing this moment in time and the implications. You know, what some of these folks said in the letter was: since Facebook's inception, researchers have learned a lot more about group psychology and the dynamics of mass persuasion. You know, we understand the power words have to increase the

(28:19):
likelihood of violence. You know, I remember being at CNN. I was outside when the bomb was pulled out of the building. This was, like, I would say, a year ago or something. Someone had sent this, this bomb had ended up in the mail room. Thank God for the incredible security at the time that found this. But, you know, it had stemmed from

(28:40):
I think, posts and tweets, and I remember thinking, like, oh my God, this actually happened, you know. It started there, and the threats had started there,
and, you know, then I was watching a bomb pulled out of our building where I had been for ten years. And maybe as someone who has spent my adult career covering tech, it was just such a moment for me, thinking, like, wow, the

(29:02):
implications can be very real-life. So I thought that line in this piece was really interesting. Yeah,
you know, that's a good point. You know, this
is not academic or theoretical. Um. You know, it's happening
every day. People are being radicalized, you know, based on
what they're seeing on Facebook. And by the way, it's

(29:23):
not just Facebook. I wrote about Facebook because I worked there.
I know more about the company. You know, the same could be said for pretty much every technology company that hosts user-generated content. You know, I think
Twitter has come up with unique solutions for Trump's tweets,
but there's a lot of work that they need to
do, you know, around abuse and around misinformation, etcetera.

(29:44):
You know, and so I just don't know how to tell them what to do. But yeah, this is not happening in a faraway place. This is not some, you know, dystopian future that we can imagine. You know, people are walking into a pizza parlor with an assault rifle because they believe that it is a child sex ring, you know,

(30:04):
with a presidential candidate, you know, the Pizzagate thing. It's absurd, but it's literally happening. People are planting bombs at CNN because they believe, based on what they've seen on Twitter and Facebook, that, you know, you guys are the root of all evil. And that's something that we really need to take more seriously. Okay,

(30:27):
we've got to take a quick break to hear from our sponsors. More with my guest after the break. I
guess the question for me is, I mean, this is

(30:50):
such a tight-lipped company, right? You know, even covering Facebook, it's a fascinating company, and it has so much impact, and it has completely transformed the world. But this is not a company where employees, like, freely tweet about how they feel, or, I should say, post on Facebook about how they feel. That is, you know,

(31:11):
that's not something that we've seen. We're in the midst of a global pandemic. We have protests and real, real anger,
and you know, rightfully so in this country and around
the world. Um, given what's happening with the racial divisions
and racism, and I think looking at the fact that

(31:31):
people at Facebook, and hearing what I'm hearing from kind of former employees and employees about, really, that turmoil behind the scenes. What do you think it
is about now that's you know, causing people to maybe
take the risk to say something when maybe they wouldn't
have before. Well, I think it's two things. One, that

(31:52):
it's not theoretical, you know, anymore. It's not academic. You know, we're seeing it's not isolated. It's not just, you know, one incident of a crazy person that you could dismiss. You're seeing it, you know, simmering across the country and around the world, just people being incited to violence and radicalized. And two, you know, you hit on all of it.

(32:13):
The stakes are so high. We are literally in the
middle of a global pandemic. You know, nearly half a
million people have died. The smart people who know about
viruses say many more people will die, likely in the fall.
And in the meantime, Facebook is providing health misinformation to them. You know, as I write,

(32:35):
you know, the only way that the stakes could be
higher is if we were on the brink of a
world war, and I don't think we are right now.
But I don't see the logical conclusion of this being a lasting peace, right? You know, there will be some violent outcome of all of this if it is not checked in some way. And

(32:56):
in the meantime, in the short term, a lot of
people will make health decisions that will be detrimental to
their life, and Facebook will be complicit in it. And I think that's just something that's got to change.
You know, Facebook has always come under fire. Throughout, by the way, as someone who's been on the other side of it, as a journalist who has asked very hard questions,

(33:17):
interviewing Zuckerberg right in the midst of Cambridge Analytica and during some of the harder moments in the company, the company has always, you know, I feel like they've played defense for a very long time, and so there is, I think, a certain mentality around that, knowing that you're going to get criticized, but you kind of just keep going if you have the mission.
I think that's in the DNA of Facebook, if I

(33:38):
could kind of define it in any way. Do you think that this time is any different, that they'll listen to some of the former employees, or, you know, maybe because it's more people behind the scenes? I know how much Zuckerberg does value the people he works with. The short answer is, I don't know. You know, I am,

(34:01):
having said all the things that I say and believe about Facebook, I am almost a decade removed from the company. I still have a lot of friends there. But, you know, I do think the opinion of the employees is very highly valued, and that is something that in the past has moved the company. There was a transcript that I read of an all-company meeting earlier this week, and it seems like, you know,

(34:23):
for the most part, at least the vocal people are very against the current stance of the company. I am sure that is weighing on the leadership. Whether it makes a difference, I don't know. Whether the external pressure will make a difference, I don't know that either. The one thing that's unique about the external pressure right now is that it is so divided. In

(34:45):
most cases when I was at Facebook, there were cases
where people were, you know, on both sides of an
issue telling us we were wrong. But mostly it was
a united front telling Facebook it was wrong. You know,
you're doing the wrong thing on privacy, you're being too open, you know, you're not taking down this objectionable content.
But this is a case where people on the right

(35:07):
are saying you're censoring too much and you're taking too
active a role in content, and people on the left
saying you're not taking an active enough role. And when it's divided like that, I don't know how you make the calculation for, you know, which is the path of least resistance. And I do think Facebook has made that calculation in the past. And right now, because

(35:30):
the forces of you know, leave the content alone are
in power, I worry that they will make the decision
that that is the path of least resistance. In this letter, too, these employees said: Facebook isn't neutral and it never has been. Making the world more open and connected, strengthening communities, giving everyone a voice. These are not neutral ideas. Fact-checking is not censorship.

(35:53):
Labeling a call to violence is not authoritarianism. I mean,
covering a lot of these companies, it was for so long: hands off, we're neutral, we're not responsible, we're not media companies. There's always been this tension for the last decade in a lot of these companies. As you talk about those twenty-six words that saved the Internet, based off

(36:14):
of Section two thirty, right, which makes it so these companies, to a degree, aren't liable for certain content.
But it certainly seems like we're seeing a shift, in that words have more meaning and have consequences that are far-reaching, and the stakes seem incredibly high, and, you know, I think the debate is certainly

(36:36):
open. And, you know, I keep looking at Jack Dorsey and what he's kind of walked into as well, and now all the calls to do all these other things, and where are they going to draw the line? So it certainly feels complicated. Yeah,
it is. And I tried to address that, in that, you know, I let them know, let Facebook know, you know, I know this is not an easy problem.

(36:58):
I know it's going to be hard, but to try to give them the courage to do it and do something about it. And I think the writing in the letter from my former colleagues and my friends is brilliant, and those insights are so spot-on, and I hope Facebook listens.
I thought what you said about Facebook's strengths being its

(37:20):
weaknesses as well, which is, you know, it has always
been the story of technology, right, which is it can
do such incredible things, and it can also do such
terrible things too. And we always just walk this fine line,
and it can get incredibly murky, you know. And I
think what we've seen over the last couple of years, maybe even before then, really, you know,

(37:40):
these ethical issues that I think a lot of these people are working through, sometimes better than others. Yeah,
I agree. And, you know, for Facebook, the biggest strength is the connections that they've created and fostered and facilitated between literally billions of people. And
there's a lot of benefit to that. But you know,

(38:03):
we're also realizing that it creates some vulnerabilities, and there are evil forces in the world that are exploiting them. And I just think there's more that Facebook can and should do about it. And I hope they will.
Um, on a personal note, you have been at Facebook, you've been at Pinterest, you've been a

(38:24):
part of these companies that really have kind of shaped people and behaviors and whatnot. What is your takeaway on people? Wow, that's a really broad question. What's my takeaway on people? Yeah, you've kind of been in the fray. You've been in it, you've watched things, you've built things, you know, you

(38:45):
were in the line of fire at Facebook. Pinterest is a more delicate company, I would say, knowing that company; it's a more delicate culture too. But, you know, you've just had such extraordinary, I would say, experience kind of being in these places and being on the front lines. Well, yeah,
I mean, what I've observed in those companies and

(39:05):
what I felt is that, you know, most of the time, the overwhelming amount of time, people want to do the right thing and they want to change the world for the better. And that is at least a large part of what's motivating them. I believe it's what's motivating Facebook, I believe it's what's motivating Mark. I believe, unfortunately, in

(39:28):
some cases they are blind to the consequences of the
decisions they're making. I believe they are not giving enough weight to the bad outcomes, you know, as I state. But I do believe these people are good and they're trying to do good and they want the best for the world. And I think that's why, you know, to your point, that they're usually silent, but they're not
being silent right now because they are seeing that that

(39:51):
intention is not being realized, and in fact, quite the opposite, they may be actually damaging the world, and they don't want to, and they want to do something about it. And my whole goal was to try to give them some ammunition and maybe put some form around their thoughts and ideas to move the discussion forward towards some action.
And I hope it gets there. And you think action

(40:13):
would be labeling more of this content? Well, I think
it would be not saying we're taking a hands off
approach to content. I think it would be taking some
responsibility for the content. And you know, again, there are
lies in the world and there are truths, and having
a hand in making sure the truth is more seen
than the lies, I don't think is a bad thing.

(40:36):
And I don't think it's censorship. There is civil discourse
and there's inciting violence. I don't think taking an active
hand and saying this civil discourse is of more value
than this inciting violence is a bad thing. And that's just a bridge that they haven't been willing to cross, and I'm urging them to cross it.
And I don't know whether that means one will be labeled.

(40:57):
I don't know whether that means the distribution will be throttled on one and surged on another. I don't actually know the solution, but they need to cross that bridge first, and commit to having those outcomes be an actual goal. You know, free speech is not an outcome. Free speech is a means to an end. But the end right now is damaging the world.

(41:18):
So how do we get to an end that actually
makes the world better? And that's the thing that I'm
hoping they'll get to. What do you say to the folks, you know, any time some people come and speak out at Facebook or whatnot? I saw an executive, I think it was Dan Rose or something, say, like, you know, these are just early people who have no connection, or something like that. He was saying,

(41:41):
you know, they don't even know the nuances or complexities of this argument, you haven't been there in a while. So, to defend yourself, Barry, what do you say to the executives or the people who say, well, you haven't been there in a while, you don't know? Do you get to say something? Well, so, I know Dan Rose,
and I think he was having an emotional reaction to

(42:02):
feeling that his life's work is attacked and people he cares about and is loyal to are attacked. And I don't think that tweet was actually reflective of him. I think he more meant, you know, kind of what you're saying, is that, you know, you guys have not been at the company for a long time, you don't know the content of the discussions, they have been deep and endless, and you should trust the people that

(42:23):
work there. And to that, I would say, I do trust the people that work there. But I think you are not realizing the consequences of those decisions. I am
sure they were thoughtful, I'm sure they were endless, I'm
sure they were as deep as they could possibly be,
But I believe they were wrong and I believe the
evidence supports my position and not yours. And you have

(42:44):
a responsibility to do something. And even more than that,
I think you have the ability to do something. And
I think it could be a tremendous success and opportunity
for the company, and more importantly for the world, if they would seize it. And I urge you to
do so. And, you know, so when you talk about the decisions people at the company are making, I feel like I couldn't end this interview without saying,

(43:06):
you know, I mean, I think it was reported that
there was one person of color, one woman of color
in the room when one of the decisions was made on the Trump post. You know. So when we
talk about and this is a larger conversation, we talk
about the decisions that these folks are making. Who are
the people making these decisions, and are they diverse, with a diverse group of perspectives? I think that's something we

(43:28):
have to hold onto. I think we're seeing it
come to a head now in this moment, you know,
I mean, there's just not enough different voices and perspectives.
And this isn't just Facebook; I mean, all of Silicon Valley has a massive problem. Yeah, I agree with that, and as you say it, I wish I'd brought it up in my writing. But, you know, it's diversity across every dimension: race, religion,

(43:53):
socioeconomic status, education. You know, the people who work at Facebook and are making the decisions are highly educated, highly sophisticated. They're not seeing this stuff in their feeds
because they're not posting it, and the people that they
associate with know the difference. But lots of other people
are seeing it in their feeds and they don't know

(44:15):
that it's a lie. They don't know that it's inciting violence in some cases, you know. And I think if more of the leadership team had the perspective of the people who are seeing all of the content on Facebook, based on their race, their religion, their economic status,
you know, where they come from, I think they would

(44:35):
have a different opinion of the impact because they're not
seeing it themselves because they're not exposed to it. And
then, I guess, last question: have you had other folks from inside the company reach out to you after kind of speaking out and writing? Do other people share your feelings? Yeah, I mean, you know, the
response has been fairly overwhelming. You know, I didn't know

(44:57):
if anyone would listen or care. It seems like people are and do, which is just gratifying. And I would say the most common response that I got from current employees, former employees, and actually even people who never worked at Facebook but feel a connection to it in some way, probably because they love the
product and have been on it for years, is some

(45:20):
form of thank you for writing this. You encapsulated some
feelings and thoughts that I've had for a while. And so I am one person who, you know, had my own ideas, but it seems like they are shared by a lot of people.
And I hope that gives them some weight with

(45:42):
Facebook, and that they do something, because I want them to, and I know they can, and I think they should. And if they did, I think it could be tremendously valuable to the world. And last question,
I promise. You know, you were leading the charge in comms during some of these very intense situations. Privacy was a huge one back when you were there.

(46:05):
So you kind of went into the line of fire. What advice would you give Mark right now? Oh man. What advice would I give Mark right now? Well,
I tried to give some of it, you know, without
naming him in my writing. And I think it is
to pay attention to the outcomes, not just your intent,

(46:26):
and to have courage against the critics who have power to limit and damage your business and your company, and to have faith in your ability to do something really remarkable for the world in a new way, in a way that you haven't before, and not just enabling free expression, but an outcome of

(46:47):
actually informing people and improving their knowledge of the world
and their understanding of the world, and enabling them to
make the right decisions about it. And I think that's
something that he has the ability to do, and I urge him to do it. First Contact is a production of

(47:11):
Dot Dot Dot Media, executive produced by Laurie Segall and
Derek Dodge. This episode was produced and edited by Sabine
Jansen and Jack Reagan. The original theme music is by
Xander Sane. First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.