
February 10, 2020 60 mins

Yael Eisenstat is trained to analyze an argument from all sides. During her career at the CIA and State Department, she had tea with suspected extremists and sat at tables with people who were programmed to hate her. But the biggest challenge of her career didn’t come from a covert operation… it came when she stepped into Facebook Headquarters to head up their election integrity efforts. This is her story.




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio. Like, with the CIA, you can't talk about anything classified, which is essentially an NDA. They never said you can't say anything negative about us. The non-disparagement at Facebook was so strong, I would have never been able to talk about

(00:21):
even why I was hired, what I was supposed to do. And honestly, they hired me because of my voice, and it took me a very serious process to find my voice after leaving a top secret world. There was no way I was going to let this company silence that. Yael

(00:45):
Eisenstat is not someone who's going to sugarcoat it. She'll tell you exactly how she feels. She's comfortable walking right into the center of a debate. But what makes her different is that she's trained to analyze an argument from all sides. Yael grew up in Silicon Valley, but she left to go travel around Africa and play guitar, and

(01:07):
eventually she returned there, but this time as a CIA analyst.
She's had tea with suspected extremists. She's been in war
rooms and sat at tables with people who were programmed
to hate her, but she found commonality in their humanity.
The biggest challenge of her career didn't come from her

(01:27):
work at the CIA. It came when she stepped into
Facebook's headquarters in Menlo Park to head up their election
integrity efforts. She says her goal has always just been
to bring different people to the table and tackle tough issues,
But at Facebook, she says, she didn't feel like she
was given a seat. This is her story. I'm Laurie Segall, and this is First Contact. Welcome to First Contact.

(01:52):
This is a podcast that explores the people and technology that are changing what it means to be human. And, Yael, you have such a fascinating background. I'm gonna just say a bunch of the really cool things that you've done, just so we can get them out of the way, and then we're gonna go and talk about all of them. But you worked in public service for thirteen years. You

(02:12):
worked in the CIA on counterterrorism, so I think there are things you can tell me and things that you can't tell me, because you'll shoot me a look that nobody who's listening will be able to see, and that look says, don't you dare ask. Yeah, the if-I-tell-you-I'd-have-to-kill-you kind of thing. You served as National Security Advisor to Vice President Joe Biden at the White House, and then you moved into

(02:32):
the private sector, and you worked in corporate social responsibility as a strategist at ExxonMobil. But I think the reason you're here with me today is because you made an entrance into the tech community, and you were the Global Head of Elections Integrity Operations at Facebook for all of six months, from June to November. So there's

(02:57):
so much to kind of dig into. But first, let's talk about our first contact. Yeah, we just had our first contact recently. It happened late at night, right? It did. How would you describe it? How would a former CIA agent talk about our first contact? Okay, first I'm gonna say not agent, but officer.

(03:19):
So you know how there are those few people in life that, when they tell you you need to meet someone, you just say yes? Those are few and far between, because I usually don't say, like, sure, yeah, intro, without knowing who it is. But a mutual friend of ours texted us both at, like, eleven o'clock at night and said, Laurie, Yael, you need to know each other, right,

(03:40):
no last names, just Laurie, Yael. So I texted you back, and we did have a nice text messaging chain going on. I was like, is it inappropriate that I'm messaging her at midnight on, like, a Saturday? But then we decided maybe that's why John thought we should know each other, because we're both late night people and apparently are okay with texting at midnight. Totally. So that was our first contact. So

(04:02):
I want to get to all the stuff you did at Facebook, but let's start with your background. You grew up in Northern California, right? You grew up in Silicon Valley, but you did not immediately go into tech. As a matter of fact, you kind of, like, shunned tech to a degree. Yeah, I mean, I never thought I'd go into tech. I grew up in Los Altos Hills, I think before all the tech bazillionaires discovered Los Altos Hills.

(04:25):
But yeah, tech was never really my interest. I mean, I even grew up in a time where, like, the owners of Atari lived down the street and tested their games out on us, and, like, everyone in our school's parents were, like, CEOs of something, like Hewlett-Packard or so. It was around us constantly, but nobody in my family has a tech background, and it wasn't my interest

(04:45):
at all. I've always been very globally minded, you know. When I was fourteen, I think, around that time, I do remember telling my parents that I wanted to explore the world, and I just didn't want to be in this bubble. At the time, you know, I was going to high school in Palo Alto, and just down the street was East Palo Alto, and it was the murder capital of the country, and I wanted to write a story for my school

(05:06):
newspaper, and nobody wanted to hear about it in the media. I was like, there's something going on here that does not align with who I am. And my mom, my mom and dad, let me go off and spend my sophomore year of high school overseas, and I just got the international bug and just wanted to work on international affairs and foreign policy and get out of the Bay Area. I guess I kind of

(05:27):
want to jump right into the CIA portion of your career, because you spent a lot of time in Africa doing that. So how did that happen? So I moved to D.C. for grad school, and it was the late nineties, and again, at that time, I just knew I wanted to be involved in international affairs. It was this weird sort of post-Cold War, pre-

(05:47):
September-eleventh time period, where most of us didn't know exactly how that was all going to play out and what that meant, but we knew that the world was, like, open, and let's run around and see how these different countries are all going to get along and where the US's role is going to be. So those were the things that really drew me. I actually thought I'd end up at USAID or the State Department, running around doing development or foreign policy work. They both had

(06:10):
hiring freezes. So yeah, it was weird: this, like, weird lefty California girl with an international relations degree, and the next place to throw in your resume was the CIA. What was that like? You go in for the interview, I mean, can you just take me through it? You're this weird lefty California girl who just, like, spent some time in Africa, you're really interested in this kind

(06:31):
of stuff, you go in for an interview at the CIA, what was it like? I mean, as far as I remember, I went in and just had this really cool, casual conversation with one person for, like, an hour about Africa. And I'm sure we were doing more than just shooting the breeze. He was, I'm sure, assessing how I analyze situations. But to me, it just felt like we were talking about African politics for an hour. And

(06:54):
then I went home and thought, well, that was really weird. I was just inside the CIA. And I thought that would be my one and only day, and went home and went out with a bunch of friends that night, and they called me the very next morning. So, if I'm reading between the lines, you went out and totally just, like, stayed out to the wee hours, and then

(07:14):
you got a call from the CIA the next morning, very early, and they're like, you're hired. I mean, it's, well, then it's the whole process, but, like, we want to make an offer. Yeah, and then you have to go through this whole security clearance process. It's, like, a full year before you actually start working. But yeah, and then there you go. Yeah. And so what kind of stuff did you do? I mean, you've

(07:35):
written about it, you can write about it now, so I'm assuming you're able to talk about some of the stuff you did. But what was your day-to-day? I'm sure every day was completely different, yes, but what were your days like? Yeah, so it is true, I've written some of the things, and for the rest of my life, I am obligated to have them review things I write if it's about my time there, and I abide by

(07:56):
all those rules. So I'll talk about the things that I've already had approved. But yeah, I started out working as an analyst, mostly on African affairs, on political and leadership issues. By the time the whole processing and the whole year was up, it was November two thousand, so it was the week of the elections, but at a time where we didn't know. I don't know if

(08:18):
you remember the Florida debacle, but so I joined the CIA, President Clinton is still in power, that week we have an election, and we don't know if it's going to be Gore or Bush next. That was my first week at the agency, and then within a year of starting, September eleventh happens. So up until September eleventh, I had been mostly working as an analyst on African issues, and for long

(08:38):
periods after that as well. But we all got drawn into, like, supporting different task forces. And I mean, I did some of this, like, was on the Afghanistan Task Force, meaning up at, like, four in the morning helping coordinate intelligence for that stuff, and, like, doing cartwheels down the hallways of Langley trying to stay awake. Just weird, weird stuff like that. But then went back to working on Africa,

(09:01):
and then in two thousand and four I actually went over to the State Department and got in as a Foreign Service officer, and so then I went overseas for a few years with the State Department and went to Kenya for a few years. I mean, you seem to me the kind of person that seems very trustworthy. Maybe this is a weird assessment, I haven't known you that long, but I do feel like I

(09:24):
could take you into a party where I didn't know anyone, and you could work a room even if you didn't want to, even if, like, you secretly wanted to hurt yourself. That's amazing. Like, do you know what I'm saying? Like, I feel like that is you. This is my assessment as a journalist, right? Really, this is, like, my personality profile I've done on you in the couple of times I've met you. And so this is what I'm

(09:45):
trying to relate to what your work overseas was like, right, and some of the things I've seen you write. Like, part of your job, I'm assuming, was to sit across from people that maybe weren't supposed to like you, in really hard situations, maybe sometimes life or death, not just parties like I'm mentioning. I mean, could you give us some specifics of, like, the crazy stuff that you were doing

(10:08):
out there? Yeah, actually, I will say my two years in Kenya were really completely career shifting for me. So I was a political officer when I showed up, which doesn't sound that sexy: I'm going to show up, I'm gonna work on political issues. The ambassador, probably because I had spent years before that

(10:29):
in the CIA, probably assumed that I was this big security person, and decided to give me what was called the security portfolio. So I ended up doing a lot of the counterterrorism work, but I also was in charge of, I mean, we don't like to call it hearts-and-minds work, but a lot of the sort of hearts-and-minds-type stuff we were doing in some of the more remote areas of Kenya. So

(10:51):
what was really amazing about that is there weren't really marching orders, like, here's how you do this. And I mean, it was, like, this young, single, Jewish, female civilian who just had to figure out how to go make friends in communities, particularly along the Somalia border, particularly mostly

(11:12):
Muslim communities. And it was really amazing, actually, because I just spent a lot of time, you know, sitting and drinking tea and listening to people's stories and building friendships and relationships. And at the time I didn't know: why am I doing all this? Am I doing this right? Am I doing this wrong? But in the end there was a method to the madness. And some of these

(11:32):
communities were really vulnerable communities, right on the cusp of, like, the areas that were susceptible to outside influence from, well, now we know them as al-Shabaab, but at the time, from whatever influences might have been happening in Somalia. I can't say that I won over all these communities single-handedly, but I can say that I put in a

(11:53):
lot of time really just trying to build bridges. How do you sit across from someone who might be a suspected terrorist? You're a Jewish woman. You know, you've talked about sitting across from someone who, until you sat across from them, thought Jews had horns, right? Like, what was your way of sitting across from these people and creating that bridge?

(12:14):
How did you, how do you do it? Well, so to be clear, it's not like I was sitting across from people that we knew were definitely hardened terrorists who would commit violent acts. But part of it is also, you start by building friendships and relationships with really influential people in these neighborhoods, in these villages, in these areas. I wasn't coming with anything, right? I wasn't coming with

(12:35):
offers of money. I was really coming to say, hey, I just kind of want to understand you guys more. And I know that sounds weird, but I think that's what made it work. Yeah. Do you have anyone specific that you go back to, that you remember, that impacted you from those days? Yeah, there was one that I think a lot about now as well.

(12:59):
There was one moment where I was asked to host this town hall, and I had been told, after I already got there, that one individual was going to be there. And this person had spent a few years in jail; the Kenyans had arrested him on suspicion of a few things, including harboring the terrorists behind the US embassy bombings that

(13:19):
had happened. So I had followed this man's trial for a few years. I mean, he was one of the people that, as the counterterrorism part of my work, I was really responsible for understanding and following through what was happening there. So hearing that he would be there was the first time I really had to think about what am I going to do. How am I going to engage? Do I pretend not to

(13:40):
see him? Is he going to stand up and yell at me in front of the whole group? Like, I was nervous. And we got there, and he sat there quietly the entire time, and other people from the community were asking questions. And at the end of it, he raised his hand, and so I braced myself for what was coming, and he gave this, like, super passionate speech,

(14:02):
thanking me for being so willing to listen to his community, and just talking about how engagement like this is the most important thing we can do to overcome mistrust. And it was really weird: this man made me feel at ease. And more importantly, he demonstrated to his whole community, because this whole community is, for all sorts of reasons,

(14:23):
really susceptible to outside influence, he demonstrated to them that he doesn't view me as the devil and that this conversation was important. And then he came up to me afterwards and he shook my hand. And it doesn't mean that I think he's a wonderful person now; it doesn't mean any of that. And to be clear, some of the charges against him were always a little

(14:44):
bit unclear. So this isn't, like, the most hardened terrorist that we know killed people; that would be a different situation. But that interaction is actually what I first wrote about, the first time I ever sort of spoke out publicly about, wow, I used to be able to actually engage with people who are that different from me and have

(15:05):
real conversations. And I think about that a lot now, when I see that I kind of struggle speaking to Americans anymore about things I disagree about. So I'm trying to figure out the model, going back to, what did that look like? Why was I able to talk to him, but now I don't want to engage with people anymore? What has changed? We're going to take a quick

(15:28):
break to hear from our sponsors, but when we come back, we'll hear why, after seventeen years of hiding it, Yael spoke publicly about her past as a CIA analyst. Also, do you like what you're hearing? Make sure you hit subscribe to First Contact in your podcast app so you don't miss another episode. Now, to kind of put on

(16:01):
my tech hat: a lot of this we can put against the landscape of social media, right? Because you couldn't have had a more human experience when you were doing your work, your public service work, right? And then in the meantime, you have social media, and this incredible thing happening over the last decade of social media, which

(16:22):
is supposed to connect all of us, and something has happened where we have less of an ability to speak to each other than ever, it seems to me. And you at one point made the shift and started speaking more openly about some of these things. A lot happened in between, because we'll get into the fact that you actually went and worked for Facebook. But what got you

(16:43):
interested in looking at technology and its role in civil discourse? Sure. So when I was back in the US, I mean, I had left government at this point, I was already starting to watch this breakdown in civil discourse ahead of our last election. So it was probably around then, and I'm watching this, and I'm watching people just getting more and more angry, and

(17:08):
just this inability to even talk to people anymore. And to me, it just felt like something really different. And again, yes, anger and polarization have always been a part of the US, and the media, especially with the advent of the twenty-four-hour cable news networks, has always pushed this and the outrage machine

(17:29):
and all those things. But I couldn't quite put my finger on it yet. But something was very different, and it actually started to concern me, to the point where I completely stopped paying attention to all these threats I had worked on overseas and started really actually thinking that the biggest threat to the future of our democracy as we know it is actually ourselves, and it was actually this breakdown in civil discourse. And so I started writing about it

(17:53):
and started digging in a bit, and that's when tech conferences and whatnot started inviting me to speak. And I remember first going, why would I speak at a tech conference? I'm not a technologist. And what was so interesting is, when I would start to speak at conferences, it sounds so simplistic, but I would always come back to: are you guys at all engaging with

(18:16):
people who are not like-minded? That, to me, is such a concern, how we are engaging less and less and less with people who are not like-minded in every possible way. And so, you've always had to be, like, very, very careful, even though you seem to me like someone who just, like, really wants to say certain things. I'm sure it was really great for

(18:37):
you when you kind of came out and, kind of, like, publicly outed yourself, right, from the CIA. Like, how did this come about publicly? I'm sure that for you, like, being able to have a voice must have been kind of great. It was great and terrifying at the same time. Because, as you're gathering, I am not shy about stating my opinions, particularly if those opinions

(18:59):
are about something that I'm truly concerned about, and concerned about for something more than me. But talking about my CIA past was never part of the plan. And I mean, I worked really hard to reinvent myself as a private citizen. And yeah, so I don't want to go down a political road here, but some people might remember

(19:19):
that the very first thing President Trump did on his first day in office was to go give a speech at the CIA, and he gave a speech in front of the Wall of Stars, and there were many, many things about that speech that really terrified me. I mean, he stood in front of the Wall of Stars and bragged about his inauguration numbers. That was where that whole scandal happened. Also, one of those stars behind him was a colleague of

(19:41):
mine who had died in service. So that was the night I just got really upset and realized I have a voice here. But why was that? Can you tell us why? I mean, obviously it's personal because it's a colleague who died, but, like, watching a president talk about that, can you explain to us what was so personal about that to you? Well, there were a few different elements. First, there was,

(20:05):
leading up to the elections, a president who had been denigrating public servants and particularly the intelligence community. And the funny thing is, I'm not some great defender of the CIA. I am critical of many things from the institution's past, but I do believe in our democratic institutions, and so I had already been concerned: you have a president

(20:25):
who is purposely denigrating these institutions for his own purposes, because these are the institutions that could probably hold him accountable in the future. So that was the first piece. It was one thing when he was a candidate, but the fact was that the first thing he did as president was to go there and make a bunch of statements that were both disrespectful and scary. The disrespectful ones

(20:49):
were standing in front of the Wall of Stars, which for the CIA is, like, hallowed ground. And presidents in the past would go give talks at the CIA, but they're not public, they're not for media, they're not for publicity. They are to thank the men and women of the agency for the work they do that the public will never know about. And he stood there again

(21:11):
with this backdrop as a prop and bragged about his inauguration numbers, and then pointed to everybody in the crowd and made a joke: who here voted for me? Like, those are the types of things a dictator asks. To ask the intelligence community, who here voted for me? Like, these are the little things that maybe some people

(21:31):
didn't see how dangerous they were, but to me, they were really foreshadowing what was to come. So, all of those things combined, I knew I had a voice and a background and a platform that not everybody gets to have. And I also knew how to write about it in a way that, when you read the piece, you don't know if I'm a Democrat

(21:53):
or a Republican, or conservative or liberal. You just read the piece and recognize what service means and why this was so disrespectful. So yes, I wrote a piece in the New York Times that night, an op-ed about his speech at the agency. So again, I was trying to actually put a human face on an organization that, love it or hate it, is still full of human beings,

(22:15):
many of whom really want to protect this nation. So that's why I wrote that piece. But yeah, then all of a sudden, my seventeen-year secret was out. How did it feel? It was terrifying. It's funny, I was very scared up until that moment. More scared than many things in my life that should have scared me.

(22:35):
Like, I've run around in dangerous places. I've had guns drawn on me, like, whatever. That's just who I am. But outing my secret, which shouldn't need to be secret, I wanted it to not be a public talking point. That was scarier. But then the next day, you know what, it was like a cloud was lifted. It was like a weight was lifted. I suddenly found

(22:57):
my voice. My friends did not disappear. People didn't judge me terribly because of the way I decided to speak. You thought your friends would disappear knowing that you were, I mean, yeah, I don't know. You think that people are gonna judge you, people are gonna, could be ex-boyfriends saying, you lied to me. Well, yeah, I did.

(23:17):
I did, like, but just this stigma in your head? And yeah. Well, so I want to talk about Facebook, because I've spent a large part of my career covering Facebook and seeing, you know, the good, the bad, all of it, and all the nuance that comes along with it. So I think it's really interesting, you having your background.

(23:39):
I think the company needs people that didn't grow up in Silicon Valley, that have different backgrounds. So I can imagine they would have jumped at having someone who has a different perspective, who spent a lot of time, you know, humanizing, having these face-to-face conversations with folks. So talk to us about how you landed at Facebook. I think you said something like, I'm sure Mark Zuckerberg

(24:00):
didn't set out to destroy democracy. I think you said something like that, and it got picked up, or something, like, you said it on a podcast, and the next thing you know, you get a call from Facebook to potentially come work for the company, and you ended up going to work for the company. Explain to us, you know, what that was like, and some of your expectations, and what you hoped to accomplish going in.

(24:20):
Sure, yes. I was doing a podcast interview, and so on that podcast, I said, I don't think Mark Zuckerberg set out to destroy democracy, but I question who he has at his decision-making tables, and I guarantee you it's no one with my background. And what I meant by that is, all the focus is on having computer scientists and engineers and CFOs and financial people and money people

(24:45):
and people who know how to basically cover all the legal sides of governance; they are always at the table. People who I consider like myself, maybe you would call me more of a cost center than a revenue generator, because I'm the person who's saying, have you thought about how this is going to affect that, have you thought about this. Those people, I don't believe, are usually at the table.

(25:07):
And so that's what I meant by that comment. And then, you know, during the interview process, I was very, very clear: I use your platform, I like Facebook, I like what it's offered me, and I like the fact that I can keep in touch with my friends around the world; I've lived all over the world. But the few things I asked were, first of all, is there support for me to be here? Because I've done this

(25:29):
before, at Exxon. I had gone in as the outside-the-box hire, completely different from the rest of the team, who was asked to help the company think through their corporate social responsibility strategy differently. I've been there, and you don't walk into a role like that unless you know there's actual support for that person. Otherwise you're just being set up to fail. So I

(25:51):
was very clear: is there support? Do you guys really want this? What does my team look like? I wasn't begging for the role, and they said everything I wanted to hear. I still pushed back, but they said it all: you can hire your team, you will tell us what you need, you're going to build a strategy for this team, all of that. And it was actually one minute after

(26:11):
Zuckerberg's hearing ended on the Hill that day, the famous Senate hearing. I was there. So, one minute after that one, the famous how-do-you-make-money hearing. For our listeners: the senator asked, well, how does Facebook make money? And there's, like, this epic pause. You just want to, like, I don't know, face-palm. It's like, well,

(26:32):
we sell ads. This is, like, in and of itself, like, the moment when people started understanding the lack of knowledge, to a degree, that Washington had, but also those famous words, we sell ads. You know, it was one minute after that hearing ended that the recruiter called me back and said, well, we've all agreed, we really

(26:52):
want you, we've created a new role for you. And it was this big, shiny title, and it was Head of Global Elections Integrity Operations, which was a lot to sort of digest. I think I was silent at first, and then she asked if I was still there. Like, what a moment to be offered that job. I mean, let me get a little inside baseball with folks. Like, to be offered that role at a time where Facebook

(27:15):
is under this much scrutiny for what was happening with Russia and ads and all this kind of stuff, at a time when Zuckerberg's testifying for the first time on Capitol Hill. I mean, that's an extraordinary title and an extraordinary role. Just, anyway, okay, so back to you. So I was silent at first. And again, just to clarify one more time, it was within Business Integrity, which, as

(27:36):
you said, handles the ads; that means it's figuring that part out. And obviously, this was the moment where, if I meant what I said, I had to say yes. If I meant that I truly believe this company, as is, is one of the biggest threats to our democracy, and they are asking me to play a part in helping

(27:57):
figure out how to change that, I had to say yes. So I did, and I started, I think, in June. And so you came in, and day one, which is orientation, I'm assuming is pretty different than the CIA, right? Totally different orientations. There's lots of

(28:20):
snacks. For those who don't know, the Facebook fifteen is real; I put on fifteen pounds working there. But it was all very happy and very upbeat. And I mean, the average age would definitely have been at least ten years younger than me, but there were some older. And I do remember just sitting there feeling like, well, this is not my first rodeo, this is not my first job. It

(28:44):
felt a little bit like a cultish indoctrination, a bit. But, I mean, in a way, any company has to kind of do that on day one, right? Make you really... In what sense? It was just all positive and very upbeat and very exciting, and a lot of the sort of, this is your company now, implying that all of you are equally powerful here, bottom-up culture,

(29:08):
which I did not actually find to be true at all when I worked there, but they like to continue to propagate that sort of talking point. So that was day one. And then day two was my first conversation with my boss. And I specify this timing because I don't know what I could have possibly done wrong yet;

(29:30):
I don't even know that I was logged on to the system yet. And on day two, my boss let me know that, oh, things have changed, I'm changing your title; you're just a manager. And I was like, I'm a manager of what? And all sorts of weird things: we don't use 'head of' here, so that was an inappropriate title. I was like, really? I could probably name a

(29:52):
lot of people at Facebook whose title is 'head of.' I also have my offer letter; it's what it says. So immediately, on day two, all power was stripped from me, and it just went downhill from there. Yeah. I mean, I don't really care what my title is. I care about, am I being empowered to actually help this company think

(30:15):
differently about the future of how they're going to deal
with political interference, with manipulation, with all the things that
were happening, And it's just something I was never really
given the opportunity to do. We're going to take a
quick break, but when we come back, Yael explains why
she's actually allowed to speak openly about these issues. Well,

(30:36):
she didn't sign the papers that would have stopped her. More after the break. And also, a lot of work
goes into making every episode of First Contact. So if
you like what you're hearing, I would love for you
to leave a review. What do you think was the

(31:05):
core issue? Like you walked in saying this company is
a threat to democracy, why? Then you were on the inside, right? Like, give me your CIA assessment. Like, what could you
say going into the war room of Facebook? What would
you say were the biggest issues you saw? So there
were a few different things. Some of it, unfortunately, was
just the function of who I was hired to work for.

(31:27):
Like I'll be very honest because I don't want to
overstate this case. I don't want to say that everyone
at Facebook is this way. In fact, quite a few
people that I worked with at Facebook were very excited that I was there, both some of the more senior
people and a lot of the more junior people. Some
of the more senior people would say to me, when
are you taking over this role? So that we can

(31:48):
really move this forward. I will say that part of
it was a function of the office I was hired
to work into, But that said, there were lots of
other things as well. Part of it is making and
this is not making any excuses. I have lots of
criticism about the company. Part of it is the messiness
of a company that just doesn't seem to want to
grow up, Like these are not children anymore. Right, this

(32:10):
is a company that has the most profound impact on
how over two billion people communicate around the world. Right,
This idea that the first person that throws spaghetti at the wall should still be the person who gets to beta test something is ridiculous. This is a company that needs to
grow up, and now they claim they're in their teenage years.
I'm sorry, you have the biggest impact in the world

(32:31):
on how we communicate. I think you skip your teenage years and you become an adult. So some of it
was just that it was just the chaos of still
having this culture of Okay, they took break things off
of the move Fast and break things, but the move
Fast was still there. I mean I did do work,
I just was drowned in chaos all the time. To

(32:53):
keep me away from actually doing what I was hired
to do? What does that mean? I repeatedly would say,
why am I not in the meetings about the things
I was hired to come do? And I would say this,
and I would put it in writing. So there's a
meeting today to talk about the upcoming midterm elections and
exactly what we are or are not going to do on

(33:15):
situation X and political advertising? Why am I not in
that meeting? And other people would always say, why aren't
you in this meeting? And the boss that I worked
for would just not let me go to any of
these meetings. So there was that, But yet I would
be sent all over the world to go do things
that I wasn't quite sure what I was doing. We
go to India, we come back from India, and the

(33:37):
example of the move fast thing. I remember we were
all talking about one particular thing we were going to
incorporate into our political ads process for India, and I
was the first person to answer the email and said, well,
here's what I saw. Great, that's what we're doing now.
I would love to think it's because they thought I
was so smart. It was because I was the first
one to answer. If that makes sense. That's the move

(33:59):
fast thing, and it's funny. Actually, one of the criticisms
about me when I received feedback was that I don't
answer my emails fast enough. Just to be clear, it's
not we're not talking weeks or months. I always answered
within the day. So I remember sitting down with my boss.
I said, so I don't answer my emails fast enough.
She's like, no, people are starting to get frustrated that

(34:19):
you don't answer these critical questions fast enough. And I said, well,
these critical questions, the answer will actually affect the lives
of human beings in these countries that we're making these
big decisions about. So when you ask me that question,
I am going to talk to everyone who's worked
on this before. I'm going to figure out what all
the different options are. I want to make sure I'm

(34:41):
coordinating across the different silos at Facebook so that when
I give you my answer, it's really well thought out
and that might take two or three hours. Well, that
wasn't fast enough. So that was a major issue, like
what kind of critical questions are we talking? I mean, I'm going to be a little bit careful. Though it's important to know, the reason I speak about

(35:01):
this is because I would not sign the non-disparagement agreement. Yeah. I mean, I think that's really important for
folks to know, because people should know. And I know
this having spoken to and had to speak to people anonymously.
When people leave these tech companies, you know, they are
asked to sign papers in order for them to get
severance and money and all this kind of stuff. They
are not allowed to say anything. You refuse to sign

(35:23):
the papers, right? Yeah, and sorry, I realize that sounds like a dramatic shift, but there's a difference between an NDA, a non-disclosure, which is why I'm not going to get into the details of, like, some of the questions. So yes, of course we all sign NDAs, but NDAs, and I'm almost hesitant to go into too much detail, I don't want Facebook to change this, but NDAs generally

(35:44):
are about intellectual property, about not stealing company secrets. They're
very much written for engineers, mostly. On your way out, if they're going to try to offer you a severance, they give you the same paperwork, and they make it seem like they're giving you the same paperwork. But if you read it carefully, there's an extra section, which is the non-disparagement. And the non-disparagement, I have never

(36:06):
seen anything like that before. Even the CIA didn't make
me sign anything like that when I left. Like, with the CIA, it's you can't talk about anything classified, which is essentially an NDA. They never said you
can't say anything negative about us. The non disparagement at
Facebook was so strong I would have never been able
to talk about even why I was hired, what I
was supposed to do. And honestly, they hired me because

(36:32):
of my voice, and it took a very serious process to find my voice after leaving a top secret world.
There was no way I was going to let this
company silence that. So that was sort of an aside.
So I did not sign a non-disparagement agreement, which is why I can talk about some of these things, but also why you see me self-selecting when
going through actual details of like what were the questions

(36:54):
in those emails? So, I mean, I don't know, they would be details like, we're thinking through what the policies will be for this particular election. And although I wasn't in charge of policy per se, I was on the operations side of how we will actually, you know, do this. I was on the strategy part of that. It's convoluted.
Facebook just has way too many people working on the

(37:15):
same thing, which is part of the problem. Um, to be honest, like, there are probably fifty people who would tell you she wasn't the head of elections integrity, this person was. Right? And I would say, well, maybe this
is the type of product we should use for identification
to verify political advertisers in this country. It doesn't mean

(37:36):
I have gone through every single possible piece of research
to figure out if that is absolutely the right thing
to do, and what are the pros and what are
the cons. Well, this is what we heard when we
were in this country, and it was like, great, that's
what we're using. I was like, well, what if the government doesn't... There's still more questions. So after that experience,

(37:56):
I slowed down my responses a little bit. And listen, actually, when I was at the White House, I know that sounds like a weird digression, but when I was at the White House, when I was working on the national security team for Vice President Biden, I remember I only got two pieces of advice from somebody there when I first started, that was my whole job training, and one of them

(38:19):
was don't ever give an answer to the vice president
if you don't know the answer, which is like CIA one oh one: you do not give an answer
until you have absolutely exhausted every possible way to make
sure that's the best answer. And so apparently me trying
to do that for Facebook, a company that has such
a profound impact on the world, meant I was too slow. Wow.

(38:44):
So I just think, personally, I think this rush to
go dominate every other space. Personally, I would ask the
company to fix your core product before you rush to
dominate every other space. And I know you have to
be fast in the sense that things are happening in
real time, But if I take three hours to answer

(39:05):
an email instead of five minutes, don't you think that
is worth it? If at the end of the day
we're talking about how this is going to impact a
country's transparency ahead of their elections, what do you think
now looking at the political ad problem, what do you
think is the core of the issue? Is it micro-targeting? I know we speak about, like, this idea of, should political ads be banned? Or is the issue more,

(39:28):
this idea of micro-targeting to certain populations? What do
you think it is? So it's interesting because if you
really think about it on the advertising side, it's actually,
if I were to look at a macro level, of
the things I'm concerned about about Facebook, political advertising is
actually not the thing I'm most concerned about. That said,
not only is it very important, but it is actually

(39:48):
the issue that in my opinion, really shows who Mark
Zuckerberg is willing to be in his reaction to this
latest controversy over the last few months. What do you
mean so the idea of should politicians be able to
lie in in ads? Right? That is what sparked the
latest controversy that everyone's talking about, and that's what sparked
me to finally stand up and write a piece about

(40:10):
my time at Facebook. Now, the easiest, most gut reaction
is to say they should just not allow political advertising
on their platform. And that was actually one of the
questions I was asked when they interviewed me for the
job to begin with, do you think we should stop
allowing political ads? Because they did, and I didn't know
I was being interviewed for an elections job yet because

(40:32):
the job didn't exist yet, and I remember saying during
the interview, I don't think we should ban political ads.
Well, I wasn't there yet. I don't think you should
ban political ads because I think about things in a
global scale, and I'm thinking about the countries where the
government's actually have a monopoly over information, over news, over TV.

(40:52):
And if you ban political ads, you are actually giving
the power to the people who already have the most
access to information or to the control over information. And
so now we get to this question, and I still
don't think banning political ads is the answer. Listen, unfortunately,
and this is something the Democratic Party and all the
people on the Democratic side screaming about what to do

(41:14):
now are going to have to really think about very
carefully whether you like this tool or not. If you
were to ban political ads right now, you would absolutely
be handing all of that power to the incumbent, because
the incumbent already has the organic reach, already has the following,
already has all the data. But separate from that, what
do I actually think the bigger problem is? It is
the idea that it is the ability to micro target

(41:40):
us down to a level where... I mean, I know that a lot of your listeners are more tech savvy.
But sometimes I like to describe this in a way
that two people who know nothing about tech will understand.
And so I'll always point out like two people and say,
you guys probably both live in the same city, maybe
you even live across the street. But the two of you,
if you both use Facebook, there is a very good

(42:02):
chance you are seeing two totally different versions of a
political ad from the same candidate because of the human
behavioral data they have gathered on you and the way
you've been targeted by that particular campaign. So if the
two of you, who live across the street from each other,
received two totally different versions of truth, how can the
two of you debate this candidate at all? How can

(42:25):
the two of you debate what you do or do
not think is right ahead of going and voting. You
can't because you don't even have the same version of truth.
And it's one thing to say in organic content, But
you are taking Facebook is taking money for ads. They
are selling these tools that are giving people who are
the most sophisticated at this the most incredible information warfare tools.

(42:48):
They're giving people the ability to hyper-target us. It's
one thing to say, wow, you seem to really like
fashionable boots based on what you're wearing right now, which,
by the way, Lori's wearing really fashionable boots right now.
And I know you live in New York, and I
know like these five other things about you. So we're
going to serve you ads that are going to match

(43:08):
those boots. That's one thing. Those same tools that personalize
and customize things for you so that you're seeing the
ads that make you happy. Are we sure those same
tools should apply to political discourse? Are we sure that
those same tools should allow us to see totally different
versions of truth when it comes to the most important
tenet of a democratic society, which is the voting booth,

(43:32):
which is voting our leaders into power. So it's
funny when you hear Mark Zuckerberg say I don't want
to fact check political leaders and political candidates because it's
not my job to fact check them. Okay, but you're
going to provide them tools to be able to send
you and me two totally different versions of truth, and
you're going to take money for that. Like, how do

(43:53):
you reconcile that? I mean, I remember when everything happened
with Cambridge Analytica, thinking, like, God, the question is also, when does micro-targeting turn into manipulation? Like
how far can we go, especially with like the amplification
of a lot of this stuff. I think that's super
interesting and a much more nuanced question than whether we should ban political ads or not. Yeah. And I mean

(44:15):
in a beautiful, perfect world in the future, maybe. Right now, I think we need to give all candidates the same tools because unfortunately that's where we are.
So I don't think we should ban it all together,
but I do think we should limit some of the
more dangerous things that are distorting reality, rewarding the most salacious content, allowing the most hate-filled lies.

(44:37):
You know, then people start sharing ads, so then it
basically becomes organic content, and now the most outrageous ones
go viral because that is how the entire system is built,
right, to incite us, to engage us, all of those things.
So when it comes, again, to selling you boots, I really don't care. When it comes to selling you who to vote for, I do. So, you were

(45:00):
a national security advisor to Biden, so I want to role-play for a second here. You have been on the inside of Facebook for six months, right? And just like in the past, where you would go into these zones and you would sit with people and have tea and talk and assess the situation, so now let's role-play. Now I

(45:20):
want you to give me your assessment of the situation,
having been in for six months, of what's going
on there and what you think needs to happen. Where
do we stand on national security? You said it was
the greatest national security threat. Where do you think we stand?
What's the assessment? I think there are lots of people
in the company who actually do, in their core, believe
that what they're doing is the right thing and want

(45:41):
to do the right things. So I want to start
with that. So you started with the good news. I started with the good news. Thank you. You're welcome. That said,
The biggest thing that I saw there is that everything
that they seem to be doing was about pushing responsibility
onto others, you know, whether it's I even think the

(46:02):
content moderation board that they're setting up is a bit
about pushing responsibility onto someone else instead of proactively taking
responsibility for the thing you built and for how it's
being used. So there's a lot of whack-a-mole approaches, right? Like, I do think with some of the security stuff, they're really trying. I'm sure Nathaniel Gleicher,
who I think is a great guy, is really trying
to do what he can on the security side. But

(46:24):
a lot of it, especially on the election side of it, is whack-a-mole, right? It is, okay,
you told us this is bad, Maybe we need to
go take this down. Oh is this hate speech? Should
we take it down? Should we not take it down?
None of those things are addressing the core, underlying, systemic
issue of this platform, and that is it is a
platform that is using our human behavioral data to segment

(46:49):
us more and more into these tribes of who we are,
that are putting us in these little different buckets in
order to target us with ads, and in order to
do that, you have to keep us engaged, like figuring
out every possible way to nudge, to push us, to
persuade us, to manipulate us, to keep our engagement there.

(47:10):
And so all these whack-a-mole approaches, none of
them are addressing the systemic issue, which is the business model.
Did you have those conversations when you were inside? Did
you ever bring that up? I mean, just one time, and one time only, with someone very senior, did I. We were traveling and I said to the person, I was like, okay, I get it, we're doing all this whack-a-mole stuff right now, and we're trying to figure out

(47:30):
how to deal with this election. But do you ever,
like just actually sit back and have you actually sat
back and had the conversation about who does this company
actually want to be? Who do we want to be
in this space? And this person admitted that, no. So no,
I mean I don't remember ever having those conversations. And
I'm not saying those conversations don't happen. They didn't happen

(47:53):
for me. But as long as your company continues to
report out user engagement metrics so that Wall Street will
continue to reward you, then you are not in any
way actually listening to what so many experts on the
outside are telling you is actually your core problem. And

(48:14):
to me, that's the biggest thing. Every conversation I was a part of was always sort of like that: what's the quickest way, what can scale, what's, you know, easy to scale. Well, you know what, when
you're talking about elections, and when you're talking about election
interference and political manipulation and all of that, it might
not scale because the way a Russian wants to interfere

(48:37):
in the elections in the US is going to be
a very different situation than the way somebody wants to
inflame ethnic tensions in India. And the idea that I
have to come up with a solution that scales globally
is part of the problem. But you know, it's funny.
Also there's this, you know, like people like to joke
about people who always say they're busy. Sometimes it's because

(48:57):
they like to say they're busy. So at Facebook, I
really felt like everyone always wanted to say how overwhelmed
they were. And I just remember, second day, actually second or third day, I was having lunch with a group of vets, so it's like a group of vets and former CIA officers. We're having lunch and we were all laughing about this, and I said, you know, people keep coming up to me saying, oh my gosh, you must be

(49:19):
so overwhelmed right now, this must be the most overwhelming
job in the world for you. And then we all
went around the circle and told told our laughing stories
because this has happened everyone, it faces every one of
us in Facebook. And I was like, no, you know
it's really overwhelming is when you know there's a US
hostage and you have a certain amount of time to
like coordinate the intelligence and try to find that person
in that country and help the US you know military

(49:42):
go and rescue that person. That's overwhelming. Like, this is not overwhelming. And each of them would, like, make a joke about that. They would be like, you know what's overwhelming? When you're at war in Iraq. And so
it was that culture of like always chaos, always running around,
everything's constantly, well, things move fast around here. Like, oh,

(50:04):
we changed your job title because things move fast around here. Again,
things don't have to move this fast. Yes, there are
people who have to respond to things quickly, no question. But
who's slowing down and having that strategic conversation of rather
than constantly being in PR threat mode, can we actually
listen to some of these people from the outside who

(50:25):
are screaming at us about what the actual underlying causes
are? And how much money is enough money? I want to ask, why did you leave? Were you pushed? Were you fired? Was it a choice? It happened after six months, so I can imagine... All of the above. Okay,
a few months in, after a particularly egregious moment, particularly

(50:47):
egregious, and I won't go into that particular detail.
I finally went to HR. And I know in a
company like this, when you go to HR, that HR
protects the company, not you. Um and I knew that,
but I wanted to document it. So I went to HR.
I told him exactly what was happening, and then I said,
you guys hired me to help you work on one

(51:08):
of your biggest challenges here. You hired me because you
know how much I care about this and the experience
I bring. Let me do the job you hired me
to do. I am still deeply passionate about helping this
company think through this work, but I cannot do it
in the environment you're setting me up in. So either

(51:29):
you find somewhere else in this company for me to
do the work you hired me to do, or I walk.
So did I quit? No, because then they find the
reason to push you out two months after you have
that conversation. So I don't know. Did I kind of
say I was going to quit and then they figured
out how to push me out? I guess I got
fired or I quit. I don't know. How did it feel? Oh?

(51:52):
It was, it was nasty. The way the last day went was particularly nasty. But listen, I'll
never be that person who can say I didn't at
least try. And even now listen, me speaking up about
some of these things is still about me trying, and
so I will at least be that person who can

(52:13):
at least say I tried. I would say one piece
of advice I would give to them is to engage
more with their critics. I do think they're under this
like, constant we're-under-fire mode. Nobody understands us. I will be honest, it is a little bit cultish. They're right, it is a little bit. You're constantly, just constantly under this, you know, tech utopian, we're connecting

(52:36):
the world and that's a good thing, and that's a
good thing, and that's a good thing. And we're so
mission oriented because we're connecting the world. I didn't see
a lot of self reflection of maybe we're not doing
the right thing here, And I get that it's very
hard to say that to yourself and then reconcile that
you still work at a place. So I don't even
blame people for that, but I do think it would be

(53:00):
really worthwhile to engage with some of your biggest critics,
because if your biggest critics are criticizing you while trying
to highlight how you could do better, it's not because
they just love criticizing. It's because for some of us,
we fundamentally want you to get this right. Did you
think your work at the CIA kind of properly prepared
you for your time there? Not at all, which is

(53:22):
so funny. I mean, it should have. So here's the
other thing: it should have, if they had allowed me
to do the job the way I wanted to. Because
the CIA analyst is going to come look at the past,
assess the mistakes that were made along the way, go deep,
dig into all the evidence out there, and then think
through how do we not make these mistakes again? Then
go through all the different scenarios and plan out

(53:43):
all these scenarios. So that is the way I would
think you should approach a role like this. Unfortunately, it
was made very clear to me by a number of people.
Don't actually look backwards when you come into this role.
Don't actually make anybody feel that they're a person who
is responsible for doing anything wrong. Don't actually blame anybody for
anything they did in the past. So you're telling this

(54:06):
former CIA officer to come into this role, but not
to look under the rugs and not to help try
to figure out how we got here. And yes, there
are some people at Facebook, in some of the research shops, that are, like, diagnosing some of the mistakes in the past. But it was very clear to me that
I was not supposed to talk about anything they've ever
done wrong. So no, I mean, my CIA training would
have prepared me to look at everything we had done

(54:28):
wrong and then try to figure out how to not
do that again. That was messaged to you? By a few people, people I like. Even before I accepted the job, a friend of mine who was in a pretty big role there, that was that friend's number one advice during the interviews: make sure you don't kind of say anything that makes anybody feel too defensive. Don't actually bring up the Cambridge Analytica thing or the past or

(54:50):
the mistakes. Wait, what? Wow. Yeah, you know, having met you briefly and having looked at some of your stuff, it seems like you have been on this constant journey to find your voice, maybe not talking about it behind the scenes
and having this whole you know, like talking to people
your whole career and taking things in and then having

(55:12):
you know, this moment where you come out and say
you have something to say and you're going to say
it publicly for the first time and that's going to
have a lot of personal impact on on you and
your relationships. And then you know, going and joining Facebook
and it just seems like you have something to say
and you've given up money to say it because I know,
having covered technology and knowing people who don't sign those

(55:32):
agreements give up oftentimes a lot of money. As you're sitting here now, um, what do you want to say? Wow, that's a great, but really big question. Um, right
now, in this moment, I just want people to realize... There's so much noise and so

(55:54):
much debate, and it's so ideological right now, and none
of these things, like the whole freedom of speech versus censorship framing, none of these is the right nuanced way to talk about these things. For me, it's what
are your core values? You still have a choice. Is

(56:16):
it more important for you to protect the future of
not just our democracy, but of our cognitive capabilities, of
our abilities to see the humanity in our fellow citizens,
of all these things? Can we get to a place
where that is as important as profit? And if we can,
then we can get to the conversation of how do

(56:37):
we make that happen? And to do that we need
to get out of our silos. But, I know, there's so much talk about diversity in tech or the
lack of diversity in tech. I would really want people
to consider, in addition to racial and sexual diversity and socioeconomic diversity, also cognitive diversity and diversity of experience.
If you continue to just surround yourselves by people who

(57:00):
also are engineers and computer scientists and, you know, investors, and people completely steeped in either law or money, why aren't you bringing more people to the table who have fought in war zones, who have actually worked in refugee camps as humanitarian assistance workers, who have seen the real

(57:24):
world impacts. Bring those people to the table and have
them help you think through things differently. I still don't
see enough of that in any of these companies. What
have you learned about people in your work? What have
I learned about people? I think, and this is gonna sound so cliché, but it's empathy. If you cannot actually

(57:48):
look a person in the eyes, learn their story, and
empathize with where they are coming from, then you won't
get to the point of being able to figure out
how to find solutions to things that you're both struggling
with and recognizing that most people, not all, but most
people who you're now being pitted against and being told to hate and screaming at on Twitter. If the two

(58:10):
of you could just sit down and have a conversation,
you'll realize that at the end of the day, you
probably have some shared values, or shared ideals or shared dreams,
and so it's really getting back to how do we
empathize with people. There's no question Facebook has done much
good in the world, that it's connected billions of people.

(58:32):
It's the company line, but it's also a reality. There's
also no question that this company has some major issues.
As someone who's covered Facebook for a long time. I've
heard a similar narrative from people who have left. Many people,
not just Yael, have said that they've had trouble raising hard questions within. I've heard this rhetoric many times: certain

(58:55):
people in charge just don't want to hear bad news.
I think that one of the only ways we move
through these complicated issues in tech is by surrounding ourselves
with people who don't think like us, who don't look
like us, who don't necessarily agree with us. In the
news industry, rigorous debate behind closed doors about stories and
how we should cover them is common. Most of the time,

(59:17):
we come out stronger for considering all sides. Now, I
know this debate gets a lot of lip service in
Silicon Valley, but does the talk match the action? How
do things actually play out? Who's at the decision making table.
There's a difference between wanting to do good and fostering
an environment for people who have different backgrounds and different

(59:40):
viewpoints to actually have an impact. This will be one
of the biggest challenges for the tech industry moving forward.
We know this is an issue, we recognize it, but
how do we create an environment that allows people who
aren't in the Silicon Valley bubble to impact policy, impact product,
and most importantly, try their hand at bettering humanity. I'm Laurie

(01:00:02):
Siegel and this is First Contact. For more about the
guests you here on First Contact, sign up for our newsletter.
Go to First Contact podcast dot com to subscribe. Follow me.
I'm at Lorie Siegel on Twitter and Instagram and the
show is at First Contact Podcast. If you like the show,
I want to hear from you. Leave us a review
on the Apple podcast app or wherever you listen, and

(01:00:23):
don't forget to subscribe so you don't miss an episode.
First Contact is a production of Dot dot Dot Media.
Executive produced by Laurie Siegel and Derek Dodge. This episode
was produced and edited by Sabine Jansen and Jack Reagan.
Original theme music by Xander Singh. First Contact with Lori
Siegel as a production of Dot dot Dot Media and

(01:00:45):
I Heart Radio.