
June 5, 2020 47 mins

We are living through pretty surreal times... Most of the country is still shut down due to COVID-19. Over the last few days we’ve seen protests and riots across the country following the police killing of George Floyd, an unarmed black man. All while Facebook faces a crisis of its own: internal and external revolt in response to the company’s inaction towards President Trump’s inflammatory posts.

Barry Schnitt was Facebook’s Director of Communications for four years and his recent blog post criticizing the company’s stance on free speech has gotten a lot of attention — especially since Facebook employees, even former ones, are normally so tight-lipped. He talks to Laurie about why both current and former employees are speaking out.

Show Notes

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot
Dot Dot Media and iHeartRadio.

Speaker 2 (00:09):
What's happening now is the culmination of things that have
been happening for a long time. You know, for a
really long time. I've been looking at the impact that
Facebook has been having on the world with some dismay,
and actually I've lost some sleep over it, you know, laying
in bed at night, looking up at the ceiling and
kind of thinking like, what have I done here?

Speaker 1 (00:36):
We're living through some pretty surreal times. Most of the
country is still shut down due to COVID nineteen. Over
the last few days, we've seen protests and riots across
the country following the police killing of George Floyd, an
unarmed black man, all while one of the biggest tech
companies on the planet, Facebook is dealing with its own turmoil.

Speaker 3 (00:57):
It's not a coincidence that it boils down to this.

Speaker 1 (01:01):
Many believe President Trump's use of social networks makes situations
like these worse. In this case, critics say some of
his latest social media posts could incite violence and spread misinformation.
Both Twitter and Facebook have taken different approaches. In some cases,
Twitter has chosen to restrict or add warning labels to
the president's more inflammatory posts. Facebook CEO Mark Zuckerberg has

(01:25):
chosen a hands off approach. He says it's not the
company's place to restrict free speech, and these posts don't
violate Facebook's policies. But this comes down to a larger
and incredibly important conversation about the limits of free speech
and the role that tech companies play in civil discourse.
At a time when the stakes could not be higher,

(01:47):
over the last few days, employees at Facebook have staged
virtual walkouts. They've voiced their opposition to the policy publicly.
Some have even quit.

Speaker 3 (01:56):
Now.

Speaker 1 (01:56):
I've been covering Facebook for almost a decade. I've interviewed
Mark Zuckerberg several times. This is a company that keeps
things pretty close to the vest. Employees don't often speak
out this vocally. So what we're seeing here, to use
an overused word lately, is unprecedented. And yes, I know
we just wrapped up season one of the show, but

(02:18):
this is simply too important to ignore. So my guest
today on this bonus episode of First Contact is Barry Schnitt.
He was Facebook's director of Communications for four years from
two thousand and eight to twenty twelve. These were pivotal
years for Facebook, and he just published an article on
Medium criticizing the company's position. The company was facing very

(02:39):
different challenges back when he worked there. But I think
his perspective is interesting and important, especially as other people
are voicing similar thoughts. I'm Laurie Segall, and this is
First Contact.

Speaker 3 (02:55):
First of all, how are you doing?

Speaker 2 (02:57):
I'm well, how are you?

Speaker 1 (02:59):
I'm good. I'm, you know, look, I'm as well as can be.
We call this First Contact.

Speaker 3 (03:05):
And I was trying to think, because I've covered Facebook
for a really

Speaker 1 (03:08):
long time, and you worked at Facebook during four very
very important years. So I don't remember if we've been
in contact, but I feel like we must have had
a contact at some point.

Speaker 2 (03:20):
Yeah, I have to believe that at some point
I sent you a statement about some privacy controversy or
something at the very least. But you know, I'm sure
you were dealing with so many people there, and I
was dealing with so many journalists, you know. But I
knew your name and I know who you
are, for sure.

Speaker 3 (03:36):
Well, and so take me back to your role.

Speaker 1 (03:38):
I mean, you were at Facebook for four very important years.
Right back, it was two thousand and eight to twenty twelve,
that's right.

Speaker 3 (03:47):
What was your role there?

Speaker 2 (03:48):
Yeah, so my boss at Google, Elliot Schrage, moved over
to Facebook and he said, you know, we're doing some
really interesting stuff here and you should come. So I did,
and my role was doing communications and some public policy
work around privacy, safety, security, and content issues. And so
what that ended up meaning was pretty much every controversy

(04:11):
and crisis that Facebook dealt with during that time. I
was a spokesperson for it and was working behind the
scenes to try and figure out not just what
our communications response was, but also what our kind of
substantive response was to it.

Speaker 3 (04:25):
Yeah.

Speaker 1 (04:26):
It was almost like, because I remember covering those days,
like Facebook was growing at lightning speed during that time,
and there were so many things that were happening with privacy,
with the switch to mobile. Facebook was, you know, going
public, launching a mobile app. It was just acquiring
all these companies. You were kind of on the front lines.

Speaker 2 (04:46):
Yeah, for some of that, for sure. Yeah, and it
was a very exciting time and lots was happening,
and it was also a very stressful time. But you know,
I'm proud of a lot of the work that we
did then.

Speaker 1 (04:57):
Yeah, and part of why I wanted you on today:
you wrote this piece on Medium, you know, speaking on
Facebook's policies on free speech. And what a fascinating
moment to be having this conversation, and what an important
moment to have this conversation. A lot of this is
happening in the news now, but it almost feels

(05:18):
like we've been coming to this moment for a very
long time, you know. I think over the last couple
of weeks, it was a decision Zuckerberg made to not
you know, put a warning label on a Trump post.
Jack Dorsey for Twitter made the decision to put a
warning label on it, and it's really put into focus.
I think some of these issues that a lot of

(05:39):
folks are pretty concerned about, at the heart of social media.
And what's very unique for our listeners is like having
covered this company for a really long time, people don't
really speak out. It's very rare to see people collectively
speak out and really say things and kind of organize
and come together, employees at the company. It's a very

(06:01):
tight lipped company, right and we're really beginning to see
almost like a sea change of behind the scenes people
beginning to talk about these things. So maybe we can
start with what, you know, what was the premise of
what what you wrote and why'd you decide to write it?

Speaker 2 (06:16):
Well, yeah, there was actually a lot of soul
searching for me, you know, and I think you
described it pretty accurately, in that what's happening
now is the culmination of things that have been happening
for a long time. And it's not just
about Trump, at least not for me. But you know,
for a really long time, I've been looking at the

(06:39):
impact that Facebook has been having on the world with
some dismay, and actually I've lost
some sleep over it, you know, laying in bed at night,
looking up at the ceiling and kind of
thinking, like, what have I done here? Because you know,
I joined Facebook with the idea of, you know, changing
the world for better And I think you know, what

(07:00):
you're seeing in terms of employees and former employees speaking
out is because they don't just care about Facebook, but
they care about the world and that's why they worked
at Facebook. And when they're seeing the potential that Facebook
is having to negatively impact the world, that they want
to do something about it, and they feel like Facebook
can do something about it. And that was part of
the premise of my writing is that you know, Facebook,

(07:23):
in the time that I worked there and the time
before and the time since, has overcome, you know, tremendous
challenges and every single time they rise to it and overcome,
you know, whether it's facing Google, which was a behemoth,
or MySpace, which at the time was a behemoth, or
you know, you mentioned the change to mobile. You know,
these are things that they took from nothing
and they made a tremendous success out of them. And
I think they have the opportunity to do that here

(07:46):
with you know, actually being a force for you know,
information and for understanding and for truth. But they're not
doing that right now. And so my goal was actually
to try to rally them to that end because I
know that they have the ability to come up with
some really innovative solutions that could have an impact on
the world. And I think they don't necessarily need to restrict
free speech. And I think that's a false choice and

(08:09):
that those are the words I use in my writing
and I believe that strongly.

Speaker 1 (08:13):
Yeah, I mean, can you give us your argument? Because,
you know, Zuckerberg really, more so than
any tech founder I know, really digs his heels in
when it comes to this argument on free speech. You know,
when Trump was posting,
it was a post that said something like, when the
looting happens, the shooting happens or something, and people are

(08:34):
very concerned this was going to promote violence. But Zuckerberg's
argument is that Facebook will not be the arbiter of
truth and that it's a slippery slope. This has always
been the company's argument and it's not changing now. What
you say in this piece is that a lot of
things have changed between the time that you guys drew

(08:55):
up those community standards and now,
and that words have more meaning and are more powerful
in a different way, because a lot of things have changed.

Speaker 3 (09:03):
Can you explain that to us?

Speaker 2 (09:05):
Yeah, So, you know, in two thousand and eight, you know,
there were a lot of discussions about how to handle
speech on Facebook and and the main conclusion was, you know,
Facebook is going to have a hands off approach to it,
and I think that made sense in two thousand and eight.
You know, one, there were the professional arbiters of truth,
and I believe the press are those and have
been for centuries. You know, they were much stronger and had

(09:26):
much more distribution. I think Facebook was growing, but still
relatively small, and that's changed, you know. And Facebook was
not a source where people looked for news and information.
You know, It's a place where you looked for photos
from your friends or you know, funny memes, et cetera.
And all of that has changed, you know, dramatically in
that you know, the press has you know, newsrooms have

(09:49):
been decimated, you know, the economics of news have changed dramatically.
Facebook is a news and information source for literally billions
of people. And I don't think a decision made with
the variables in two thousand and eight still holds in
twenty twenty, especially if you look at the consequences. You know,
everything from Brexit to elections here in the US, to

(10:12):
elections around the world, to health information during this pandemic
are all being threatened by disinformation that is found and
hosted by Facebook. And I just don't see how you
can look at the consequences of the decision you made
more than a decade ago and see how dramatically bad
they are and say, yeah, that was definitely still the

(10:33):
right decision today.

Speaker 1 (10:35):
Right. I mean, it's a pretty powerful statement
to have been an employee somewhere years ago, right, and
to say that you've been losing sleep over decisions you made.
I mean, that is a pretty powerful thing to say. Yeah,
go back: what are you losing sleep over, specifically?

Speaker 2 (10:50):
Well, I mean, I could bring it to, you know,
just a few weeks ago. You know, if you
saw this Plandemic video, which was a very slick, highly
produced piece of gross misinformation about the current pandemic, and
it was liked two and a half million times on Facebook,
which means it was seen by many more people, And

(11:12):
so that's literally millions of people who were misinformed about
a current health crisis. Now, I believe that many of
those people will make a decision based on what they
saw in that video through Facebook that will be detrimental
to the health of themselves or their family. You know,
I don't think it could get more serious than people
will die because of something they saw on Facebook. And yeah,

(11:37):
you're right, I lost sleep over that, and I continue.

Speaker 1 (11:39):
To. You know, you talked about, when we spoke before,
like, a key moment. And you mentioned intent bias
in this piece. When you're talking through some of these issues,
can you explain that?

Speaker 2 (11:52):
Yeah, so, I mean when I worked at Facebook and
other tech companies, you know, you build these products and
you think you know what they're going to do, but
you never really know until they're out in the world
and people are using them. And they always use them
in ways that you didn't intend. And sometimes that's good
and sometimes that's bad. But you know your intention is good.
You know you had the intention to give them free

(12:12):
speech or allow them to share, or allow them to create,
et cetera. And so your intent was good. And then
you look at the outcomes and hopefully there are some
good outcomes and some bad, but because of your intent,
you focus more on the good outcomes than the bad.
And I think that's been happening at Facebook for years,
and I think it's happening today and that you see
them write the stories of success, you know, whether it's

(12:34):
selling products or the organizing of groups, and I provide examples.
You know, they touted this Sisterhood of Truckers, which I
think is amazing. But at the same time, you know,
Mark looks at all of that, and then he says,
I don't see how Facebook could have impacted an election.
But he's too smart to not see that those things
are the same. You can't have all of this good

(12:55):
and organizing and people changing their minds in a good
way and then not have this same thing happen in
a bad way.

Speaker 1 (13:02):
So then, having worked closely with him, and having worked
closely at, I would say, the company, being
kind of the lead on communications, often crisis
communications.

Speaker 3 (13:11):
You say he's too smart for that. Then what do
think it is?

Speaker 2 (13:13):
Well, I, you know, I haven't talked to Mark in
you know, nearly a decade, and so I can't know
what's in his head. But you know, the two things
that are my guesses: One is the intent bias. I
think he continues to look at all of the good
that Facebook is doing in the world, and I agree
with him that there is a lot of good, and
he says that it's more than the bad. I would

(13:35):
say that's not good enough, And I would say the
bad is continuing, is growing, and that that ratio is
changing in a way that I think is bad for
the world. And two, you know, when I worked at Facebook,
there was something that Mark used to say a lot,
and he had a chart and everything, and he would
continually say we're one percent done. And that was in
response to people being too conservative and people working to

(13:58):
protect what we'd already achieved rather than taking risks
to achieve the ninety nine percent that was left. And
I feel like maybe for him and others at the company,
that that ratio has changed, and that maybe they're feeling
like there is more to protect than there is to
achieve for Facebook, and that Facebook maybe is too important

(14:19):
in the world to risk. But I would argue that
if we save Facebook but the world burns, then
we've made the wrong trade off.

Speaker 1 (14:30):
Okay, we've got to take a quick break to hear
from our sponsors. More with my guest after the break.
It's a nuanced argument, too, right? Like, it'd be too

(14:52):
easy to simplify this and say, well you should take
this down, right Like you look at Jack Dorsey, you know,
he's in the line of fire with the president right now.
And now having put some of these labels on Trump's tweets,
people are calling on Twitter to put labels on all
these other world leaders' tweets, you know. And we're wondering, well,

(15:12):
why does this get a label and this doesn't get
a label? And for me having covered tech throughout the years,
it certainly seems like sometimes you get very confused about
who's making the decision, why the decision gets made, and
those standards don't seem to always apply in the same way,
and they change quite a bit. So with Facebook and

(15:34):
the argument of it's a slippery slope, can you see
it from the other side, like, how do you weigh
that argument right now in this current moment that this
could lead to censorship, it could lead to a lot
of other unintended consequences for the platform.

Speaker 2 (15:49):
Yeah, well, the slippery slope, it's such a funny, you know,
buzzword that you know, people invoke in all kinds of situations.
And I've heard it in meetings for you know, decades,
and I've come to realize that in a lot of ways,
it's a cop out. In some ways, it means, yes,
we know the right thing to do in this situation,
but we're not going to do it because we don't

(16:09):
know the right thing to do in some of these
future situations. And if that's the case, you know that again,
that feels like a cop out. You know, I know
that the decision that I'm going to make here is wrong,
but that's okay, because I don't know what to do
about these other situations. And I make the argument that
Facebook is too smart, has too many resources, you know,

(16:31):
too much innovation, too much technology to just use a
slippery slope as a way of not doing the work
to figure out, Yes, we know the right thing to do,
and let's work towards figuring out what the right thing
to do on these other situations will be and make
those decisions as we go. And it will be hard,
and there will be some inconsistencies and they will make mistakes,

(16:51):
but I think all of that is less bad than
the current situation, which is rampant misinformation, divisiveness, rampant incitement
of violence. And I think it's worth the risk.

Speaker 1 (17:06):
You know. It's interesting because when Zuckerberg testified in front
of Congress for the first time, I remember I was
there and day one was kind of the senators just
asking like random questions about the Internet, and it was
kind of like a you know, I remember everybody was
kind of like, Okay.

Speaker 3 (17:21):
The takeaway here is.

Speaker 1 (17:22):
Also that you know, the government needs to educate themselves
when it comes to technology. But day two I remember
thinking this, Barry, I was like, day two is really
interesting because you had Zuckerberg in front of a lot
of House members who were all asking him about taking
down content, and who were all, this was, you know,
I think, a couple of years ago now, right,

(17:43):
but who were all talking about do they have a
liberal bias? And you know this is now we have
another backdrop, which is regulation in the power of these
big tech companies. And before all this whole pandemic happened,
we had the conversation about are these companies monopolies, which
you know, I think we've pushed that
aside just a teeny bit, because there's so many

(18:03):
other huge things happening in the world. But these companies
are under a lot of pressure right now. And I
make no judgment either way. I think, you know, with Zuckerberg,
having interviewed him many times
and seen some of the stuff, I don't get the
sense that it's all just political, right, or that he's
only just doing it all for the money. I actually
think if you meet him, it's really different. But do

(18:27):
you think that part of this could be outside forces too,
I mean, and not just with this, but with a
lot of the decisions that Facebook is making.

Speaker 3 (18:34):
And a lot of the pressure that they're under right now.

Speaker 2 (18:37):
Yeah. Yeah, no, I one hundred percent think there are outside
forces at play here. And that's kind
of what I was meaning to talk about with the company
working more to protect than to take risks to achieve,
is that I think those outside forces are real and scary.

Speaker 1 (18:52):
You know.

Speaker 2 (18:53):
I don't think any company wants to draw the ire
of the president. I mean, I think he's shown that
he will use whatever government levers he has at his
disposal to make things difficult. I think Jeff Bezos is
an example. You know, the work that he's doing against
Section two thirty, which has been described
as the twenty six words that created the Internet. He
basically wrote an executive order to try to rescind that,

(19:16):
you know, and that would be a dagger to the
heart of all Internet companies, I believe, And so I
don't think it's the right answer, but it's an example
of not wanting to get on the bad side
of the president in this current administration. And I think
that is playing very much into some of the decisions
that the company is making, and I don't think they're

(19:36):
being honest with themselves or then certainly with the public
that that's what's at play.

Speaker 1 (19:41):
You know, I think it's important to say that you,
and, we should mention, a lot
of other employees, are speaking out. Some have resigned,
some are speaking out kind of behind the scenes.

Speaker 3 (19:52):
We just obtained a letter that folks wrote.

Speaker 1 (19:54):
It was a lot of early employees that wrote it, you know.
All these employees, and I think this is a really
important point, aren't saying Facebook is terrible, Facebook is bad.
I'll read a little; I thought this was so powerful.
This is from the letter that a lot
of early employees, and I'm assuming many who you worked
with, for sure, got together and collectively wrote. Like,

(20:17):
this was a letter, and correct me if
I'm wrong, a letter to Mark, right, that
they wrote, and it was published in the New York Times,
but it was kind of more of a petition.

Speaker 3 (20:25):
For talking about the standards. But they wrote.

Speaker 1 (20:28):
As early employees on teams across the country, we authored
the original community standards, contributed code to products that gave
voice to people and public figures and helped to create
a company culture around connection and freedom of expression. We
grew up at Facebook, but it is no longer ours,
you know. I think that's so powerful, I gotta say.
And maybe this is me being a little inside baseball
as someone you know, who's looked at this company since

(20:52):
I would say two thousand nine or, probably, around
twenty ten. But that's, you know, seeing a lot
of these names, and I want our listeners to understand
the names at the bottom of these. And even you, right,
you led comms; like, no one at Facebook spoke like
this or spoke out.

Speaker 3 (21:09):
Like this for a very long time.

Speaker 1 (21:10):
So I think it's a really big deal that
people are really beginning to question some of these decisions.

Speaker 2 (21:17):
I agree. You know, I know most
of those people. There are a lot of people smarter
than me who signed that letter, and they're dear friends.
And in fact, just coincidentally, I sent my, you know,
draft post to one of them and said, hey, I'd love
your feedback, and they said, well, that's a coincidence. We
happen to be working on something of our own. But yeah,

(21:38):
I mean, I think, collectively, my experience of laying
awake at night thinking, you know, what have I done,
is not unique. And I think, you know, seeing the
impact that Facebook has had on the world and being
proud of it for a very long time, and then
having that gradually become, you know, forms of shame and
dismay, is pretty powerful. And seeing, you know,

(21:59):
These are not you know, people who are cranky critics,
you know, the kind of people who always criticize Facebook
about both doing too much and too little, being too
far to the right and too far to the left.
You know, these are people who joined the company willingly
and pour their hearts and souls into it for years
and are really shocked and dismayed about, not what it's
become, but the impact it's

(22:20):
having on the world. And I say I have a
distinction there because I don't think it's become evil. You know,
I got that response from some people, you know, Facebook
is evil, and I said, well, you know, that's not
a solution. I'm all ears, you know,
let's provide some action. But I do think
Facebook is not understanding the negative impact that it's having

(22:42):
on the world, and I think it is not paying
enough attention to potential solutions and certainly not putting enough
effort towards developing those solutions. And I wish I knew
the exact answer, but I don't. But I do know
that Facebook has the ability to come up with those answers.
And I think that's part of what's so dismaying to
people who used to work at the company, is that

(23:03):
we know that Facebook can rise to these challenges and
has, you know, limitless possibilities. And for some reason,
it feels like they're not trying to do that, and
we just don't understand why.

Speaker 3 (23:15):
What do you think a solution is?

Speaker 1 (23:17):
I know they've created projects for journalists. They have an
outside almost editorial board for content now, so they have
done quite a bit over the last few years in response to some
of the criticism they've received on content decisions, with
everything happening in journalism. I mean, what do you think
is the solution?

Speaker 2 (23:36):
Well, I'll answer the opposite question first. I'll tell you
what I don't think is a solution. You know, like
I know Facebook somewhat. And I'll tell you one thing
that I believe is that Facebook doesn't outsource things that
are really important to the company, never has, never will,
And everything that you mentioned is effectively an outsourcing. The
journalism project: hey, here's some money, you guys, go do

(23:58):
some interesting work. Fact checking: hey, third parties, here's some money,
go do some interesting work. The board: hey, third parties,
why don't you make some decisions for us, because we
can't make them. And the amounts of money that they
put towards these efforts sound big, but they are rounding
errors in the Facebook universe. And I think if they
were serious about a solution to misinformation, to the incitement

(24:21):
of violence, and just to coming up with a new
way to treat content, that they would do something internal.
They would devote engineers to it. And that's, you
know, having worked at Facebook, the number one
signal for whether Facebook thinks something is important: how many
top engineers are working on it. And for everything that you described,
I would argue the number is roughly zero.

(24:43):
And that's probably a little hyperbole, because there are definitely engineers
working on these related problems. But the things that they're
touting as potential solutions to this are not actual solutions.
They are band-aids, and you know, as I write,
you know, I think we're actually hemorrhaging truth and civility
on Facebook. And these are a start, but they're just

(25:04):
at the margins, and I think they need to devote
significant resources. I propose, you know, kind of a symbol
in my writing that they just suspend their stock buyback,
which they've committed another fourteen billion to. That's
the kind of resources that this is going to take.
And I don't think they literally need to find fourteen

(25:24):
billion dollars. They have the money. Mostly what they need
to find is the will. And again, I don't know
exactly what they need to do, but I know they
need to commit to doing it, and that's not even
something they were willing to do thus far.

Speaker 3 (25:36):
You know, it's interesting.

Speaker 1 (25:37):
I remember interviewing Facebook's former head of security, who was
there, for context, when their team
discovered Russian influence. He was there for the election interference:
Alex Stamos. And something he said to me, this was
for a documentary I did on Facebook.

Speaker 3 (25:54):
Something he said to me was, you.

Speaker 1 (25:56):
Know, for a very long time, the growth team, they
had more engineers at the growth team. It was bigger
and the building was bigger than for the security team.
So I think that's an interesting point you make about
engineers and to give a sense like these are human problems, right,
and so like you're talking about technology. But the real
problem, this is not just Facebook, this is maybe beyond

(26:16):
Facebook and for a lot of the bigger tech companies
in the time I covered

Speaker 3 (26:20):
them, as they grew into these huge companies.

Speaker 1 (26:25):
is that I don't think they anticipated, when you talk
about intent bias, I think there was an inability to
look at the messy, complicated human problems that would happen
in some capacity.

Speaker 2 (26:37):
Right. Yeah. And you know, the intent is important because
I think their intention is good and their hypotheses were
not crazy, right. I mean, what Mark says is, you know,
we think people should decide, right. But you know, I
think we can agree there is truth and there are lies.
You know, there is civil discourse and there's inciting violence.
And I think we would all agree that truth should

(26:57):
get more distribution and attention than lies, and we should
agree that civil discourse should get more attention than inciting
violence. And what Mark would say was, yes, and
people will figure that out and they will decide for themselves.
But A, that's not what's happening; B, some people
aren't equipped to discern; and C, you know, there are

(27:19):
very powerful forces that are deliberately trying to trick them
into thinking that one is the other. And for Facebook
to see all of that and throw up its hands
and say no, we're just free speech, and will the
people decide I think is wrong? And that's just what
I'm trying to get them to realize.

Speaker 1 (27:37):
Right. Looking at this letter that a lot of the
early employees, and, you know, like, some of the early employees,

Speaker 3 (27:42):
These are early architects of Facebook.

Speaker 1 (27:44):
But there are all sorts of people who signed this,
who co-created this letter, which we'll put in our
show notes. I would suggest people read it just because,
whether you agree with it or not, it's really an
interesting look at I think how people are viewing this
moment in time and.

Speaker 3 (28:02):
This inflection point and the implications.

Speaker 1 (28:05):
You know, what some of these folks said in the letter was: since Facebook's inception, researchers have learned a lot more about group psychology and the dynamics of mass persuasion. You know, we understand the power words have to increase the likelihood of violence. You know, I remember being at CNN. I was outside when the bomb was pulled out of

(28:27):
the building. This was, I would say, a year ago or something. Someone had sent this bomb and it had ended up in the mail room. Thank God the incredible security at the time had found it. But you know, it had stemmed from, I think, posts and tweets, and I remember thinking like, oh my god, this is actually happening, you know. And it started there, the threats had started there. And, you know,
(28:49):
then I was watching a bomb pulled out of our building, where I had been for ten years. And maybe as someone who had spent my adult career covering tech, it was just such a moment for me, thinking like, wow, the implications can be very real life.

Speaker 3 (29:05):
So I thought that line in this piece was really interesting.

Speaker 2 (29:10):
Yeah, you know, that's a good point. You know, this is not academic or theoretical. You know, it's happening every day. People are being radicalized, you know, based on what they're seeing on Facebook. And by the way, it's not just Facebook. I wrote about Facebook because I worked there; I know more about the company. You know, the same could be said for pretty much every technology company that

(29:32):
hosts user-generated content. You know, I think Twitter has come up with unique solutions for Trump's tweets, but there's a lot of work that they need to do too, you know, around abuse and around misinformation, et cetera. And so I just don't know enough to tell them what to do. But yeah, this is not happening in a faraway place. This is not

(29:53):
some, you know, dystopian future that we can imagine. You know, people are walking into a pizza parlor with an assault rifle because they believe that it is a child sex ring, you know, involving a presidential candidate, you know, Pizzagate. It's absurd, but it's literally happening. People are planting bombs at CNN because they believe that,

(30:14):
based on what they've seen on Twitter and Facebook, you know, you guys are the root of all evil. And that's something that we really need to take more seriously.

Speaker 1 (30:27):
Okay, we've got to take a quick break to hear from our sponsors. More with my guest after the break. I guess the question for me is, I mean, this

(30:50):
is such a tight-lipped company, right? You know, even covering Facebook, it's a fascinating company, and it has so much impact and has completely transformed the world. But this is not a company where employees generally freely tweet about how they feel, or, I should say, post on Facebook about how they feel. That is, you know,

(31:11):
that's not something that we've seen. We're in the midst of a global pandemic. We have protests and real, real anger, and, you know, rightfully so, in this country and around the world, given what's happening with

Speaker 3 (31:26):
The racial divisions and racism.

Speaker 1 (31:27):
And I think, looking at the fact that people at Facebook, and hearing what I'm hearing from former employees and employees about that turmoil behind the scenes: what do you think it is about now that's, you know, causing people to maybe take the risk to say something when maybe they wouldn't have before?

Speaker 2 (31:50):
Well, I think it's two things. One, it's not theoretical anymore, you know, it's not academic. You know, we are seeing it's not isolated, it's not just, you know, one incident of, you know, a crazy person that you could dismiss. You're seeing it, you know, simmering across the country and around the world, just people being incited to violence and radicalized. And two, you know,

(32:12):
you hit on it: the stakes are so high. We are literally in the middle of a global pandemic. You know, nearly half a million people have died. The smart people who know about viruses say many more people will die, likely in the fall. And in the meantime, Facebook is providing health misinformation to them. You know,

(32:32):
as I write, you know, the only way that the stakes could be higher is if we were on the brink of a world war, and I don't think we are right now. But I don't see the logical conclusion of this being a lasting peace, right? You know, there will be some violent outcome of all of this if it is

(32:53):
not checked in some way. And in the meantime, in the short term, a lot of people will make health decisions that will be detrimental to their lives, and Facebook will be complicit in it. And I think that's just something that's got to change.

Speaker 3 (33:09):
You know.

Speaker 1 (33:10):
Facebook has always come under fire throughout, by the way. As someone who's been on the other side of it, as a journalist who's asked very hard questions, interviewing Zuckerberg right in the midst of Cambridge Analytica and during some of the harder moments in the company.

Speaker 3 (33:22):
The company has.

Speaker 1 (33:23):
Always you know, I feel like they've played defense for
a very long time, and so there is I think
a certain mentality around that and knowing that you're going
to get criticized.

Speaker 3 (33:33):
But you kind of, like, just keep going if you have the mission.

Speaker 1 (33:36):
I think that's in the DNA of Facebook, if I
can kind of define it in any way.

Speaker 3 (33:41):
Do you think that this time is any different, that they'll listen to some of the former employees, or, you know, maybe because it's more people behind the scenes?

Speaker 1 (33:50):
I know how much Zuckerberg does value the people he
works with.

Speaker 2 (33:57):
The short answer is, I don't know. You know, having said all the things that I say and believe about Facebook, I am almost a decade removed from the company. I still have a lot of friends there. But, you know, I do think the opinion of the employees is very highly valued, and that is something that in the past has moved the company. There was a transcript that I

(34:19):
read of an all-company meeting earlier this week, and it seems like, you know, for the most part, at least the vocal people are very against the current stance of the company. I am sure that is weighing on the leadership. Whether it makes a difference, I don't know. Whether the external pressure will make a difference,

Speaker 3 (34:37):
I don't know.

Speaker 2 (34:38):
The one thing that's unique about the external pressure right now is that it is so divided. In most cases when I was at Facebook, there were cases where people were, you know, on both sides of an issue telling us we were wrong. But mostly it was a united front telling Facebook it was wrong: you know, you're doing the wrong thing on privacy, you're being too open,

(35:00):
you're, you know, not taking down this objectionable content. But this is a case where people on the right are saying you're censoring too much and taking too active a role in content, and people on the left are saying you're not taking an active enough role. And when it's divided like that, I don't know how you make the calculation for, you know, which is the path of least resistance.

(35:23):
And I do think Facebook has made that calculation in the past. And right now, because the forces of, you know, leave the content alone are in power, I worry that they will make the decision that that is the path of least resistance.

Speaker 1 (35:40):
In this letter, too, that these employees sent, they said: Facebook isn't neutral and it never has been. Making the world more open and connected, strengthening communities, giving everyone a voice: these are not neutral ideas. Fact-checking is not censorship. Labeling a call to violence is not authoritarianism. I mean, covering a lot of these companies, it was for so long,

(36:00):
hands off, we're neutral, we're not responsible, we're not media companies. There's always been this tension for the last decade with a lot of these companies, as you talk about those twenty-six words that saved the Internet, based off of Section 230, right, which makes it so these companies to a degree aren't liable for certain content. But it certainly seems like we are seeing a shift

(36:24):
and that words have more meaning and have consequences that are far-reaching, and the stakes seem incredibly high. And, you know, I think the debate is certainly open. And I keep looking at Jack Dorsey and what he's kind of walked into as well, and now all the calls to do all these other things, and where are they going to

(36:45):
draw the line?

Speaker 3 (36:46):
So it certainly feels complicated.

Speaker 2 (36:50):
Yeah, it is. And I tried to address that, you know, and let them know, let Facebook know: I know this is not an easy problem, I know it's going to be hard, but to try to give them the courage to do it and do something about it. And I think the writing in the letter from my former colleagues and my friends is brilliant, and those insights are so spot-on,

(37:15):
and I hope, I hope Facebook listens.

Speaker 1 (37:17):
I thought what you said about Facebook's strengths being its weaknesses as well, which is, you know, it has always been the story of technology, right? It can do such incredible things, and it can also do such terrible things, and we always just walk this fine line, and it can get incredibly murky, you know. And I think what we've seen over the last couple of years, or maybe even not just the last couple

(37:38):
of years, even before then, really, you know, these ethical issues that I think a lot of these people are working through, sometimes better than others.

Speaker 2 (37:48):
Yeah, I agree. And, you know, for Facebook, their biggest strength is the connections that they've created and fostered and facilitated between literally billions of people, and there's a lot of benefit to that. But, you know, we're also realizing that it creates some vulnerabilities, and there are evil forces in the world that are exploiting them.

(38:11):
And I just think there's more that Facebook can and
should do about it, and I hope they will.

Speaker 3 (38:17):
On a personal note.

Speaker 1 (38:19):
You have been at Facebook, you've been at Pinterest, you've been a part of these companies that really have kind of shaped people and behaviors and whatnot. What is your takeaway on people?

Speaker 2 (38:34):
Wow, that's a really broad question. What's my takeaway on people?

Speaker 1 (38:38):
Yeah, you've kind of been in the fray. You've been in it, you've watched things built, you've built things, you know, you were in the line of fire at Facebook. Pinterest is a more delicate company, I would say, knowing that company; it's a more delicate culture too. But, you know, you've just had such extraordinary, I would say, experience kind of being in these places

(38:59):
and being.

Speaker 3 (38:59):
On the frontlines.

Speaker 2 (39:01):
Well, yeah, I mean, what I've observed in those companies and what I've felt is that most of the time, the overwhelming amount of time, people want to do the right thing and they want to change the world for the better, and that is at least a large part of what's motivating them. I believe it's what's motivating Facebook. I believe it's what's

(39:23):
motivating Mark. I believe, unfortunately, in some cases they are blind to the consequences of the decisions they're making. I believe they are not giving enough weight to the bad outcomes, you know, as I state. But I do believe these people are good, and they're trying to do good, and they want the best for the world. And I think that's why, you know, to your point, they're usually silent,

(39:46):
but they're not being silent right now because they are
seeing that that intention is not being realized and in fact,
quite the opposite. They may be actually damaging the world
and they don't want to and they want to do
something about it. So my whole goal was to try
to give them some ammunition and maybe put some form
around their thoughts and ideas to move the discussion forward

(40:08):
towards some action, and I hope it gets there.

Speaker 1 (40:12):
And you think action would be labeling more of this content?

Speaker 2 (40:16):
I think it would be not saying we're taking a hands-off approach to content. I think it would be taking some responsibility for the content. And, you know, again, there are lies in the world and there are truths, and having a hand in making sure the truth is more seen than the lies, I don't think is a bad thing, and I don't think it's censorship. There is

(40:38):
civil discourse and there's inciting violence. I don't think taking
an active hand and saying this civil discourse is of
more value than this inciting violence is a bad thing.
And that's just a bridge that they haven't been willing
to cross, and I'm urging them to cross it. And
I don't know whether that means one will be labeled.
I don't know whether that means the distribution will be

(40:59):
throttled on one and surged on another. I don't actually know the solution, but they need to cross the bridge first and commit to having those outcomes be an actual goal.
You know, free speech is not an outcome. Free speech
is a means to an end, but the end right
now is damaging the world. So how do we get
to an end that actually makes the world better? And

(41:22):
that's the thing that I'm hoping they'll get to.

Speaker 3 (41:25):
What do you say to the folks?

Speaker 1 (41:26):
You know, anytime some people come and speak out at
Facebook or whatnot, I would say, yeah, I saw an executive.
It's like, I think it's Dan Rose or something, say like,
you know, just early people who have no connection or
something like that. He was saying, you know, don't even
know the nuances or complexities of this argument.

Speaker 3 (41:44):
You haven't been there in a while.

Speaker 1 (41:46):
So, to that defense, what do you say to the executives or the people who say, well, you haven't been there in a while, you don't know? Do you get to say something?

Speaker 3 (41:57):
Well?

Speaker 2 (41:57):
So I know Dan Rose, and I think he was having an emotional reaction to feeling that his life's work is attacked and people he cares about and is loyal to are attacked. And I don't think that tweet was actually reflective of him. I think he more meant, you know, kind of what you're saying: you guys have not been at the company for a long time, you don't know the content of the discussions. They have

(42:20):
been deep and endless, and you should trust the people that work there. And to that, I would say, I do trust the people who work there, but I think you were not realizing the consequences of those decisions. I am sure they were thoughtful, I'm sure they were endless, I'm sure they were as deep as they could possibly be. But I believe they were wrong, and I believe the evidence supports my position and not yours. And you have

(42:44):
a responsibility to do something, and even more than that,
I think you have the ability to do something. And
I think it could be a tremendous success and opportunity
for the company and for the world more importantly, if
they would seize it. And I urge you to do so.

Speaker 1 (42:58):
And, you know, when you talk about the decisions people at the company are making, I feel like I couldn't end this interview without saying, you know, I think it was reported that there was one person of color, or one woman of color, in the room when one of the decisions was made on the Trump posts.

Speaker 3 (43:15):
You know.

Speaker 1 (43:15):
So when we talk about, and this is a larger conversation, the decisions that these folks are making: who are the people making these decisions, and are they a diverse group with diverse perspectives? I think that's something we have to hold on to, and I think we're seeing it come to a head now in this moment, you know. I mean, there's just not enough different voices

(43:36):
and perspectives. And this isn't just Facebook; all of Silicon Valley has a massive problem.

Speaker 2 (43:42):
Yeah, I agree with that, and as you say it, I wish I had brought it up in my writing. But, you know, it's diversity across every dimension: race, religion, socioeconomic status, education. You know, the people who work at Facebook and are making the decisions are highly educated, highly sophisticated.

(44:03):
They're not seeing this stuff in their feeds because they're not posting it, and the people that they associate with know the difference. But lots of other people are seeing it in their feeds, and they don't know that it's a lie. They don't know that it's inciting violence in some cases, you know. And I think if more of the leadership team had the perspective of the people

(44:25):
who are seeing all of the content on Facebook based on their race, their religion, their economic status, you know, where they come from, I think they would have a different opinion of the impact, because they're not seeing it themselves, because they're not exposed to it.

Speaker 3 (44:42):
And then I guess last question.

Speaker 1 (44:44):
Have you had other folks from inside the company reach
out to you after kind of speaking out and writing?
Do other people share your feelings?

Speaker 3 (44:51):
Yeah?

Speaker 2 (44:51):
I mean, you know, the response has been fairly overwhelming. You know, I didn't know if anyone would listen or care. It seems like people did, which is just gratifying. And I would say the most common response that I got from current employees, former employees, and actually even people who never worked at Facebook but

(45:12):
feel a connection to it in some way, probably because they love the product and have been on it for years, is some form of: thank you for writing this, you encapsulated some feelings and thoughts that I've had for a while. And so I am one person

(45:33):
who, you know, had my own ideas, but it seems like they are shared by a lot of people. And I hope that gives them some weight with Facebook, and that they do something, because I want them to. I know they can, I think they should, and if they did, I think it could be tremendously valuable to the world.

Speaker 3 (45:53):
And last question, I promise.

Speaker 1 (45:56):
You know, you were leading the charge in comms during
some of these very intense situations. Privacy was a huge
one back when you were there, so you kind of
went into the line of fire.

Speaker 3 (46:08):
What advice would you give Mark right now?

Speaker 2 (46:11):
Oh man. What advice would I give Mark right now? Well, I tried to give some of it, you know, without naming him, in my writing. And I think it is to pay attention to the outcomes, not just your intent, and to have courage against the critics who have power to limit and damage your business and

(46:33):
your company, and to have faith in your ability to
do something really remarkable for the world in a new way,
in a way that you haven't before, and not just
enabling free expression, but an outcome of actually informing people
and improving their knowledge of the world and their understanding
of the world, and enabling them to make the right

(46:54):
decisions about it. And I think that's something that he has the ability to do, and I urge him to do it.

Speaker 1 (46:59):
First Contact is a production of
Dot Dot Dot Media executive produced by Lori Siegel and
Derek Dodge. This episode was produced and edited by Sabine
Jansen and Jack Reagan. The original theme music is by

(47:21):
Xander Singh. First Contact with Lori Siegel is a production of Dot Dot Dot Media and iHeartRadio.