
February 17, 2020 62 mins

When the lights went out in his home, and his children asked if it was “Russia,” Facebook's former chief security officer Alex Stamos knew he was bringing his work home. In a candid interview, Stamos opens up about what it was like when his team discovered Facebook had been compromised by Russia, and the personal implications of being at the center of one of the most significant attacks on technology... and democracy. Plus, hear what Alex had to say when asked whether he uncovered spies within Facebook during his time there and why he worries foreign spies have infiltrated every major US tech company.

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio. I expect that every major U.S. tech company has at least several people that have been turned by at least China, maybe Russia, probably Israel and a couple of other U.S. allies.

(00:22):
Did you ever uncover anybody that had been turned? I'm not going to comment on that. Can you comment a little bit? What I'll say is there's a lot of weird things that happen at the companies where it's very difficult to figure out why they happened. Imagine
being at the center of an event so big that

(00:44):
it changed the way all of us think about social
media and democracy. Alex Stamos was the chief security officer
at Facebook for three years. I met him two months
into the job. His job started with fighting ISIS and
ended with an exploration into Russia, disinformation, and society at large,
questioning the responsibility of technology in our democracy. But before

(01:08):
we get into all that, I want you to get
to know Alex for a second. He has spent his
career dealing with the darkest the Internet has to offer.
At the same time, he's been learning how to balance family,
his career, and this larger sense of responsibility. Oftentimes he's
felt tension between those characteristics that define him as a person.

(01:30):
Can all three coexist peacefully? What do we sacrifice to
do the work we believe in? And is there a
point in our lives where we realize that that work
isn't actually possible in the role we're in, even though optics may say otherwise? These are all questions Alex navigated
at his job at Facebook. Now that his journey there
is over, here are some of his takeaways from the

(01:52):
other side. I'm Laurie Segall and this is First Contact. So Alex, I do remember our first contact. Because I've been listening to your podcast, I've been trying to figure it out, so that I embarrassingly can't mention it. Was it at South by Southwest? No, it was not, the other one went better. It was in Las Vegas.

(02:13):
It was right when you started at Facebook, and it
was when Facebook was throwing parties for security at Black Hat,
which is this hacker conference, like a security conference in
Las Vegas. So I remember it was like a really
fancy party and I think there's like a pool and
like lots of like nerdy people and like someone ushered

(02:35):
you over to me, me the journalist, because I was at CNN at the time, and they were like, you have to meet Alex Stamos. He's the new head of security at Facebook. And there were like little cabanas with
like people inside having conversations. And I remember that was
my first contact with you and you were new on
the job, right, it must have been like yeah, so

(02:56):
that would have been August. I started in June at Facebook, and that was the Facebook party, the last one we threw there, because I actually killed it after that one. I was about to say, now Facebook is putting a lot of money into security and election integrity, but I don't think it's going to parties in Las Vegas for Black Hat. No, actually it was a
bit of a controversial call in the team, but after

(03:17):
that party I got rid of it because there's just
a lot of behavior. I mean, the problem is if
you hold a big party in Vegas, you incentivize people
to act in a way that's not very professional. And
there are some things that happened there that I didn't
think were compatible with the kind of inclusive brand we were trying to build at Facebook security. And for folks
who don't know you, I mean, you have a really
strong reputation in the security community, and you also are

(03:41):
known for speaking your mind and um, which I think
probably makes it very interesting when you kind of wade into big corporations. And also, before you were at Facebook,
you were at Yahoo, um where there was a whole
big security breach, and you have testified before, and so
you were also at Facebook, and it is your team
who discovered Russian influence. So you just kind of like

(04:03):
kind of step into it right like you're in these
extraordinary historical moments when it comes to security. Yeah, a friend of mine called me the Forrest Gump of infosec, that wherever I go, there happens to be some kind of interesting geopolitical disaster unfolding. I'm not sure it
was totally a compliment by him, but it's not completely inaccurate.
He must have a high tolerance for stressful situations. Well

(04:25):
apparently not, because I got out right, that's very fair. Um.
And actually, speaking of how you got out, that was August,
let's fast forward to our I would say, almost last contact,
which was August, and it was it was around your
last day at Facebook, And I remember I was doing
a documentary on Facebook at fifteen and you had made

(04:49):
the decision as the chief security officer to leave the company,
and I just remember, how do I say this, Like
we had known each other long enough that you kind
of let us in to talk to you. We did some
interviews around it, um and it was a really emotional
moment you decided to leave the company amidst everything going on,
And my last contact with you, really, like on camera

(05:12):
was that last day where you were like heading in
for that last moment. It was emotional. Yeah. You know,
one of the interesting things about Silicon Valley is people
really do bring their whole selves to work, but then
work also comes home with them and becomes their whole self.
And you know, leaving a place like Facebook is a

(05:33):
little bit like you're dying a bit, right, because you
you end up after a couple of years there that
most of your social interactions are with your coworkers, that
so much of your identity is tied up in the
work you're doing, and so when you quit the job,
it's a little bit like being excommunicated from the family.
Not just that you're going to have to change things around. Now.

(05:53):
I think being gone, I've I've been very happy to
be able to be in the situation where now I
can work on some of these problems and step back
from the myopic view that you get inside the companies.
And so I think now that I've had a little
more than a year to think about it, that all
encompassing feeling that you get from inside the companies is
one of the reasons that they've failed to react properly

(06:16):
in some of these cases, because you end up really
living in a bubble. But yeah, it was a very
emotional time, and it was also you know, it's a
little scary to walk away from a you know, well
compensated corporate job to go take an academic gig right
when you're supporting the family, and so you know, there's
there's a little bit of trepidation there, but it's all
worked out in the end. Yeah. I remember like sitting
on your back porch and being like, this is a

(06:36):
big pay cut you're taking. I was like, Laurie, are
you a jerk for like saying that on his last day?
But I was like going through the numbers or whatever else,
Like I was like, this is a very big pay
cut. As you're like looking at my mortgage on Zillow.
You know, I didn't do that, but you know, I
think it says a lot about the conviction and like
the idea that you weren't leaving to go take another

(06:57):
chief security officer job somewhere out there, a security job
somewhere else like that, that's a big deal, you know, yeah,
you know, I think one of the reasons I was
interested in leaving is that there's some really interesting, big
problems that we're dealing with right now about the relationship
between tech and people's lives, the relationship between tech companies

(07:19):
and governments, who's in charge of online speech, how do
we respect the rights of individuals while also protecting people
from disinformation and bullying and harassment. Like these are really
humongous problems, and most of the problems are being solved
inside the companies right now, and they're being solved by

(07:39):
thoughtful people, but people who are living within a bit
of a bubble, and who are incentivized to think about
these problems only in the context of their specific companies
position right their position in the media, their regulatory risk there,
what what is good for their shareholders, and very very
few people have been able to leave that environment and

(07:59):
then come outside and kind of talk about the base problems.
And so there, I think was a unique opportunity from
a timing perspective to be able to do that, and
I found that really attractive. You know, there's there's not
a lot of ex-CSOs, and certainly not people who have worked at a company that has control over the speech of two billion to two and a half billion people. Very few of those kinds of people can

(08:20):
step out into a position where they can kind of
truthfully talk about what the issues involved are, which are
numerous and from my perspective, much more complicated than how
people have generally been talking about in the media, which
is, it was nice to be able to put myself in a position where I can kind of honestly criticize the company, but then also push back in situations where people are
pretending that some of these problems are actually quite simple,
because dealing with them like they're simple, it's going to

(08:44):
end up in a situation where we're going to create
more problems than we fix. And I want to get
into a lot of that stuff. You know, for folks
who don't know, can you just explain? It was your team that discovered Russian influence, and you are kind of on the front lines of what has now become this fight over, like, democracy and social media and this larger,
you know, more fundamental question about this role of tech

(09:06):
and society. It was your team that really discovered it, right,
And I can't take a lot of credit here. I
was very lucky to work with some really incredible people.
We had a threat intelligence team whose entire job it
was to study what governments were doing on Facebook platforms
to possibly cause harm. One of the core issues going
into twenty sixteen is our focus was not on things

(09:28):
like disinformation. Our focus was on traditional cyber warfare. And
during the run up to the election, we had people
who were dedicated to tracking activity by APT twenty eight
a.k.a. Sofacy, a.k.a. Fancy Bear. So this is the group of hackers at work inside of the GRU, the main intelligence directorate of the Kremlin,

(09:48):
and the people who are tracking that group saw mysterious
activity related to the U.S. election, and we ended up turning that information over to the FBI. We know kind of now that the FBI and the White House had kind of knowledge of a number of things going on, and some of that didn't make it to the people who really needed to know about it.

(10:09):
But you know, that was kind of our part in
the run up to the election, and then after the election,
I was part of a couple of rounds of investigation
of what is the fake news crisis? And then is
there any evidence of Russian interference, especially within the advertising
system on Facebook? And so I did get to participate
in those and supervise the team and spent most of

(10:33):
kind of trying to answer those questions. I mean, and
you take that home with you too, right? Like, I mean, to a degree, because I was at your house as part of this. I mean, you had, like, in your office, like, biographies of, like, the Russia crap. I mean, like a Putin biography. Didn't you say you were like waking up in the middle of the night? I mean,
like you take that kind of stuff home with you.
I can imagine. Yeah, you know, one of the interesting

(10:55):
things about that kind of job is you're just dealing
with the downsides of your products, right? So Silicon Valley's kind of overall, the feeling inside these companies, the perfect distillation of that is Steve Jobs's famous keynote where he introduced the iPhone, right? Like that feeling of limitless possibility

(11:15):
of technology is good. Technology brings wonder into people's lives.
That is the feeling that pervades these companies. And then
you've got me and my team and a couple of
other teams at Facebook who just wallow in misery all day, all right, like dealing with not just disinformation, but people trying to attack the platform, people trying to send malware to each other, to defraud each other, to sexually

(11:37):
abuse children, to perform human trafficking, you know, terrorists who
are trying to use the platform to either celebrate or organize
terrorist attacks. Like we would just spend all of our
time on the most horrible parts of humanity as reflected
on the products that our companies build. And so it
kind of separates you a little bit from
the rest of the company, right because the rest of
the company is about we're making the world open and

(11:59):
connected and we're bringing them together, and it is only
a good thing that more people have used our product.
And then you're the person that comes in as a
Debbie downer of like, well, if another hundred million people
use our product, this is how much child exploitation is
going to go up, right, and everybody's kind of like,
oh my god, who invited Stamos to this meeting? I'd
get that a lot like I would walk into a
room and people are like, you can see their faces

(12:20):
fall, like, oh my god, this can't be good. Right, have you seen that SNL thing, there's like this SNL Debbie Downer sketch. It's like, you're that, you're the person. Oh yeah, right. Yes, feline AIDS is the number one killer of household cats. Womp womp. Wow. Yeah, that's exactly what it was like being me, um. And so
this cloud follows you, and it it does follow you home, right,

(12:40):
because it's just, it's hard. The thing that really followed me home was the child stuff, right?
So you know, I had two areas of responsibility. One
was the traditional information security, right, people trying to break
into Facebook, steal data, steal money, that kind of stuff.
But then we also had the safety responsibility, and that
included a dedicated child safety investigations team with some incredible,

(13:04):
very dedicated, very skilled people that I was very lucky
to work with. But working with them and supporting them
and seeing their output really makes you, kind of. It's very hard to do that during the day, and then to go home and hug your kids, right? To see like the incredible depths of depravity and horribleness that happen in the rest of the world, and then

(13:24):
go home and your kids are like, hey, Daddy, how's
it going. Like It's just it's very hard not to
bring that with you. And so that's the kind of
thing that makes you wake up at three am and
check your phone. It took me about a year to detox,
right to me, about a year before I can sleep
through the night without worrying that if my phone vibrated
that somebody had died or there was, you know, an attack,
that could be disastrous for our users. Did your

(13:45):
children ever, um, did they ever sense anything? Did they
ever say anything? Um? Yeah, I mean I think I
think they've sensed the difference since then. I think my son,
um, my eldest, who's twelve, said something along the lines of, Dad, you're a lot happier now, which, I
cried when he said that. Um. You know, it's it's tough,

(14:07):
it's it's tough to deal with that kind of stuff,
but it's also important, and I feel, you know, a
little bit of a failure for not being able to
do more of it. Right, My hope is I can
still have impact even though I'm not directly putting my
hands on those problems. Now. I remember the one thing
he said to me right before we left, on your
last day, and I go back to that because I
don't think people get to see how human you guys

(14:28):
all are and some of these and some of these
moments people can yell about Facebook is doing this and that,
but like when you're at your home and you have
your children playing with their legos or playing piano, and
you're about to walk away from a big job and
for this thing that you believe in and you know
that that I kept. I guess I kind of go
back to that because I want people to understand what

(14:49):
that looked like. Um, you know, I think people don't
understand, like, you know, your small, not-scary dog, which I actually thought it was very ironic that the chief security officer who's like protecting, like, billions of people has a not very scary... that we have a goldendoodle instead of a German shepherd. Yeah, I was like, that's the protection. There's some kind of metaphor: she sits there barking and then is completely

(15:09):
ineffective at stopping things. I think one of my critics
will point that out. Yeah, right, But I remember saying
something about how when the lights went out once or something,
your kids thought it was Russia or something like that. Yeah, no, we just had a power outage in our neighborhood, and one of my kids asked, Dad, is this the Russians? And I realized I was bringing perhaps a
little too much, you know, because you're never really off right.

(15:31):
So if I even if I got home at seven pm,
the odds of having two conference calls that night were
pretty good, and so they would overhear a lot more
than I knew, you know, especially that kind of intense
summer when we were investigating these issues and had yet to announce it, you know, and we were kind of on the knife's edge of what's our level of
responsibility here? What do we tell the world? How do

(15:51):
we tell the world, how are they going to react?
It was a very tense summer, and I think we effectively canceled all our summer vacations,
and you know, I think the kids definitely noticed that one,
and I think you're speaking about it broadly, but I
also think, you know, it sounds like you had wanted
to be a little bit more transparent about some of
these things. And I'm sure there was some internal conflict
for you, as the person who likes to say things
out loud and who's kind of like a little bit

(16:13):
of a bulldog in this not being able to say
fully what was happening and this and that, and I
can imagine that summer itself was tense. Yeah, a lot
of things have changed in the last couple of years
about the expectations of what these companies should do and
what they should say publicly. So when I started the
job, one of the things that struck me was

(16:33):
the huge gap in how much proactive policing is done
by tech giants and how much the public understands that
that's happening. And the way that that was really becoming
obvious was through our work fighting ISIS, right. So like when I started, kind of the number one content safety issue at Facebook was not disinformation, it was that ISIS,

(16:56):
unlike some of their predecessors, was digitally native that they
had young millennials who often lived in Western countries, people
we called jihobbyists who would spend all day creating
and spreading content on social media to recruit people to
come fight for them in Syria, or trying to celebrate attacks,
or you know, in some cases, you know, threatening the

(17:18):
lives of service members and their family members and such.
And so that was kind of the big thing, and
it struck me, as we are dealing with this problem,
how far over our skis we were, and that you know,
we built a dedicated counter terrorism investigations team. We caught
a number of terrorist attacks. You know, there's been several
terrorist attacks that have been stopped where the FBI or

(17:39):
some other law enforcement agency takes credit in a press release,
but it was really Facebook. It was really our counterterrorism investigations team that found it and turned it over to law enforcement. And the fact that we were
kind of hiding all that because to admit that bad
things were happening was just so far out of kind
of how communications work at these companies. That was kind
of shocking to me. And then that hit the fan

(18:00):
with the Russia stuff, obviously, because you know, we knew
about the GRU activity in twenty sixteen, we
had turned it over to the government. That was kind
of the standard thing, that's how you handle it. And
looking back, clearly, if we had come out and said
these are the kinds of things we're seeing, it
would have been incredibly politically controversial, but it also would

(18:22):
have massively inoculated the company against what ended up happening. And so yes, there was a lot of disagreements on that kind of thing. The famous example is our team wrote a white paper about what we knew about GRU activity that we released in the spring of seventeen,
and there was a big back and forth on whether
we would name Russia or not. And at the time,

(18:42):
the policy team at Facebook was trying to kind of
live with the reality of a Trump presidency and did
not want to be pulled into this. Like, the term that's bandied about inside of Facebook a lot is, let's not break into jail. Explain that. What do you mean? Right, that, like, let's not create a communications moment
where we're making a controversy ourselves right now. Our argument

(19:05):
on the security team was we were already deep into
this controversy, right like, there's no way to avoid it.
So we might as well just be honest, and so
we ended up publishing what we knew, but like the
word Russia was removed, which was really controversial, and we
ended up throwing in, like, the compromise was a footnote in
which we said that the Director of National Intelligence report
was compatible with ours, which is us saying, yes, it's Russia.

(19:27):
And then privately, when I briefed Congress on that report,
I told them, yeah, it's Russia. We know it's Russia.
But you know, those are the kind of situations in
which you know, the world's really changed, right, Like the
expectation at the time was, you know, the U.S. government is run by competent people who are good actors.
If we give them this information, they will eventually release
it in a way that's appropriate. And in a situation

(19:49):
where you no longer trust the administration or the administration
may be tied into some of this activity, it blows
away all kind of the ways the companies operated, and so now they're becoming much more independent actors on the geopolitical stage. They are pointing fingers.
They are saying these countries are doing bad things and
perhaps changing history and it's kind of a crazy place
to be, but that's where we're at now. We're going

(20:14):
to take a quick break to hear from our sponsors.
But when we come back, Alex explains why he's got
a list of countries he won't go to. Also, if
you like what you're hearing, make sure you hit subscribe
to First Contact in your podcast apps so you don't
miss another episode. So there's life on the outside, and

(20:46):
now you're on the outside, and you kind of have
this perspective, how are you feeling now? I'm feeling good,
But also, what happens inside the companies is you end up with this view that is completely reactive, that
it's all about the last emergency, and you never have
the ability to kind of take a step back and
to think about how did we get here right? And

(21:07):
so you know, and I am sure this is inside
all the big tech companies, inside of Google, inside of Twitter,
is that you're constantly dealing with this media cycle and
this regulatory cycle and this emergency. And now that I've
been out, it's become much clearer to me the kinds
of long term issues that the companies have allowed to
exist and they haven't tackled proactively and in both like

(21:29):
their relationship with their users, but especially relationships with governments.
And it feels good to be able to kind of
talk about that a little bit more. Yeah, and so
tell me a little bit about what you're doing here
and what kind of things you're teaching. Yeah, so, and
so my work here I kind of break into three areas.
So I've got teaching, research, and then policy work. So on the teaching side, our group is teaching two classes. One

(21:51):
is an introduction to cybersecurity for non-CS majors. I wanted to call it Hacking for Poets, but the registrar kicked that back. But it's called the Hack Lab, and it's offensive security techniques for lawyers and MBAs and people in the international policy master's here, and it's to give people who are going to be in the business world, going to be in government, hands

(22:13):
on experience hacking stuff. And the reason I'm doing
that class comes out of the experience I had. I
can't not sound like a jerk telling this anecdote, so
I'm just going to accept that. But I was in
a meeting in the White House, uh, you know, because
there was something going on and I was summoned to the White House with some other folks during the Obama administration, and we were having a discussion with people in the National Security Council,
and all of the technologists had come from Silicon Valley,

(22:37):
and every single person on the other side of the
table is a lawyer, and it was almost impossible for us
to talk to one another. And so I kind of
realized that, you know, there are lots of people who
are now in a position of responsibility where they have
to do this work, where it's impractical for them to ever have gotten real good hands-on experience in cyber. So
is there a way we can bridge that gap? And
so that's one of the classes I'm teaching. The other

(22:59):
is called Trust and Safety Engineering, and so that's a CS class for computer scientists to effectively deal with the fact that Stanford keeps on graduating twenty-two-year-old mostly guys who go out and build products and don't understand all the bad things that have happened in the past. Right,
they might think, oh, yeah, I'm gonna build this mobile
app and you're gonna be able to anonymously send photos
to an infinite number of women. What could possibly go wrong? Right?

(23:21):
You're like, well, here's the list of the twenty things that have gone wrong every time somebody's built an app that sends photos. Right? Um, and so in that class we cover lots of pretty heavy topics, right. So we talk about disinformation, we talk about hate speech, bullying and harassment, we talk about suicide. We have two whole lectures on child sexual abuse, on the different kinds of child sexual abuse, not just

(23:41):
the trading of child pornography, but sextortion, which is a whole class of issues. We talk about terrorism and
the terrorist uses of the Internet. So it's not an
uplifting class. You basically talk about all the bad things
that could happen when you build out technology. This is
like the class. Is it like a class of unintended consequences? Exactly? Yeah,
So that they will understand and that they can not
make the same mistakes my generation made right when they

(24:03):
build these apps, that they understand the kinds of things
that have happened in the past. Because also for all
these issues, there are responses. It's you don't just have
to rub gravel in your hair and cry every night,
like there actually are things you can do around these issues.
But if you're not thinking about that proactively, what'll end up happening is you won't deal with them until after you ship it. Like we're seeing this right now with TikTok, right,

(24:24):
which is like, you know, this new emergent social network,
they are speed running every mistake Facebook made in its
first ten years, right, Like every problem they're dealing with
is something that one of the bigger social networks have
had to deal with. But because there's no good repository
of knowledge of all these issues, it's effectively impossible for
them to expect to preventively try to design their systems

(24:44):
to stop it. Um, did I see if there's a
list of countries you're banned from? Or was that a
joke that you said, Uh, it's kind of a joke.
I mean, nobody's keeping one list, but I do joke that.
You know, you see those RVs that will have, like, fans of the Nebraska Cornhuskers, they'll have a map of every state they've seen the Cornhuskers
play in. I have the opposite map of countries that

(25:05):
I probably should not travel to just because I've I've
either been involved through Yahoo or Facebook in dealing with them,
or we've published something, you know, at Stanford. Does that ever scare you? I've got like one call in my career, from like after going to too many like Black Hat and DEF CON things, from someone with like a
modulated voice, and I was scared for like weeks. I
can't even imagine you must get all sorts of crazy stuff,

(25:27):
Like have you ever worried about your safety? Much less now? Yeah,
when I was in the corporate jobs, I got death
threats and sometimes physical letters and stuff, so you know,
my mail was being opened at the office and we
had the physical security people, executive protection people, check out our house and try to help scrub mentions of our address and where our kids go to school

(25:49):
and stuff. So that's definitely a problem. And this hasn't
happened to me, but there's now a big problem in Silicon Valley of swatting of executives. So a couple of my friends have had the cops called, SWAT teams arriving at their house, because you know, somebody called with a fake ID and said that, I'm so-and-so, I've murdered my family. And one right here in Palo Alto, which, it's unfortunate that Palo Alto PD actually did not handle it

(26:11):
very well, which is unfortunate, right in the middle of Silicon Valley. But that's
an issue that a lot of people have been facing,
and now it seems that some of that's being driven from some of the kind of white supremacist groups, who are not happy with the companies cracking down and are taking it
out on individual executives. It's interesting because this has been
going on for a while, like, um, you know, there's
been a lot of anger over them cracking down on

(26:34):
speech and all sorts of stuff that, but this the
latest of Silicon Valley executives being and for people to
know what squatting is, it's just it's someone calling it
a fake threat and then the police show up at
your door thinking you've done something terrible, almost ready to
shoot you essentially, like and it's a it's a psychological thing,
right right, they're trying to say, you're not saving your
own home. And you know, there's at least one example

(26:56):
of somebody dying of a swatting that was related to
a video game dispute where the police ended up shooting
an unarmed person. So yes, it is. You know, that's
the kind of thing you're dealing with if you're at
the companies right now. Yeah, we had Adam Mosseri, the CEO of Instagram, on, and it's happened to him. Um,
interesting to hear that it's happening to other folks, you know.
It just it seems like there's a lot of anger

(27:18):
towards Silicon Valley right now and towards the executives here
in every sense, and that is manifesting itself offline in
some capacity. Yeah. In my case, like the worst thress
I got were after a story about isis that listed
three people, like had quotes from three people, and it
was me and Mark and Cheryl. It's like only one
of those people does not have full time security. So

(27:41):
you know, so some stuff happened around that time, but
I'd rather not go into it. Yeah, scary stuff, yeah,
I mean messages and such. But nothing that we thought
was actually high risk. I mean, that's something that ISIS kind of specialized in, was trying to intimidate people remotely when there was actually very little risk of something happening. But if you look at the tech companies,

(28:02):
because you know, there was also a mass shooting at YouTube that could have ended up much worse. And so
since that incident, the physical security at the tech companies
has become pretty significant to all of them. Well, it
certainly seems like there's this moment that we're in and
the time I've covered technology, it just almost feels like the tension is reaching like a peak.

(28:23):
And maybe it's because of everything else happening around, you know,
what's happening in society and what's happening um with the
power of Facebook and Twitter and Google, and I increasingly
see there's like some tension between the media and tech.
I mean, it really feels like there's just this moment
where there's some real cracks and it's manifesting itself in

(28:44):
a pretty ugly way. Yeah, I think, just like a
lot of these other issues, this is a situation where
divisions in society are being reflected online, right? That if you end up in a really polarized world where people dehumanize their political opponents, the place where that is easiest and where you can get the most amplification is

(29:05):
going to be online, which then puts these platforms in
the place of being the refs of what is allowable speech.
And for a while, while they're trying to be as
hands off as possible, they were not seen as active players.
But we've clearly crossed the Rubicon where both the media
and in the US both political parties, but then political
parties throughout the world now consider working the refs to

(29:27):
be a critical part of their plan. And so you
see a lot of the criticism is legit, and a
lot of it is specifically about trying to change the
behavior by these incredibly powerful platforms, which is why you
also see the platforms trying to find ways that they
can get out of that. Right, So, if you look
at like Mark Zuckerberg's big announcement of moving as much

(29:48):
of Facebook to encrypted end-to-end communication as possible, that
is a huge uplift for privacy. It is clearly a
reaction to the privacy laws that have been passed. It
is also a move that would take Facebook out
of the job of moderating people's speech in many cases. Um,
and so I think you're gonna continue to see companies

(30:09):
look for ways that they can effectively lock themselves out
of having to make these decisions. Is there an unintended consequence?
Like, I understand that there's this debate: if we don't want our, you know, tech companies, and I don't really want Mark Zuckerberg in particular, deciding what should stay and what should go. But I'm noticing this trend that you're talking about, which is, you know, now they're, as fast as they possibly can, going to say

(30:31):
oh, it's this, we're installing this board to do this, and we're doing this and this, but still, the problem is on the platform. So like, is there going to be an unintended consequence for this move of kind of saying like, oh, this isn't on us, this is on X, Y and Z? Yeah, there will be
many unintended consequences for sure. This is actually one of
the research projects we're running in our group at Stanford

(30:52):
is trying to understand what impact does moving all this
to distributed or encrypted networks have on safety and are
there things you can do to mitigate some of that risk. So,
for example, you know, WhatsApp: the day that WhatsApp turned on end-to-end encryption is a very proud day, it was probably the largest uplift in privacy in human history. Never before have so many people been given

(31:15):
the ability to communicate with one another in a way
without powerful corporations or governments seeing their communication. So a
huge privacy win. That privacy also directly contributes to WhatsApp being used to spur ethnic violence in places like India or Sri Lanka. It also prevents WhatsApp from policing the sexual abuse of children, it stops WhatsApp

(31:37):
from policing malware, and so there has been a
real impact, and I think this is one of the
things I'm glad to be able to be on the
outside that now I can talk about this balance of equities.
Is that you can't both ask these companies to make
the platforms totally safe and then also ask them to
respect everybody's privacy. There's a really hard trade off here,

(31:59):
and one of the problems of the current debate is that people kind of want it all, and when you try to build incentive structures for the companies that aren't attached to reality, like what's happening right now, then you end up with the companies looking for these radical solutions. So for Facebook, it's end-to-end encryption. For Twitter, it's
distributed systems. Right. Jack's announced that Twitter might become some
distributed thing, which is another way of saying we won't

(32:21):
be able to control what people tweet. Right, I mean
it's kind of a scary place to be when um,
you know, we had Yancey Strickler on, who is the founder of Kickstarter, and he talked about this dark
forest theory where a dark forest is where all these
animals are there but just no one can see them,
but they all don't come out because they're so afraid,
and and then the only room that's left is for

(32:41):
predators or something. And he talks about, you know, as
social networks are going to get more private and this and that, the town square is going to get less and less crowded and we're only going to have these voices that are more extreme who remain, and that was an interesting metaphor. And then I also think some of the stuff you're talking about with, like, I don't know, it does seem like Silicon Valley in general is becoming even more libertarian or something, like this push

(33:04):
right now feels more and more um, I don't know,
reactionary like and it does seem like there's been there's
this inability to have some kind of debate of where
we fall and how we fall, and because it doesn't
feel like any of the sides actually understand or speak
to each other in a way that is coherent understandably,
so because Silicon Valley hasn't exactly done the best job

(33:26):
of explaining themselves. You've been inside one of these tech
companies, where you tried to explain yourself and you couldn't.
And the government hasn't exactly done the best job of
understanding technology, as we all saw when Zuckerberg testified for
the first time. So it certainly seems like a bit
of a mess. It does, And you're right that the
companies have never really talked honestly about the trade offs here, right, Like,

(33:47):
we have promised people that you can have it all, that we'll respect your privacy and we'll stop all the
bad guys, that you can have this product for free,
you don't have to pay for it, and therefore there's
no downside for that, which obviously then you have to
do things like run ads and have ad platforms. And
so, talking about the trade-offs is important because we've ended up in a
situation where it is this incredibly important area of public

(34:09):
policy discussion where the people who have the loudest voices
don't actually understand the equities involved. You know, people understand
what the trade-offs are in tax policy or healthcare
policy and the like, but they don't understand the tradeoffs
in tech policy, and so as a result, they end
up asking for things that are completely contradictory. So like
a great example of this is GDPR, the big

(34:29):
European privacy law, which both tells the companies that they
have responsibilities to protect user data, which is something I
agree with. They also tell companies that user data belongs
to the users and they have to let the users
take the data with them anywhere, which is a thing
that sounds good, but it turns out those two things
are completely in conflict, right. The current

(34:51):
ability to take data out of a Google or Facebook
ends up with a hundred times more sensitive data than anything that was leaked during the Cambridge Analytica scandal, and that is required by the European Union for the companies
to operate there, and eventually somebody will figure out how
to monetize that feature and there will be a massive
privacy scandal. But in this case the companies are gonna say, well, sorry,
the Europeans made us do it, and so like that

(35:12):
kind of not understanding how the rubber hits the road and the equities involved, it started in Europe, but we're starting to see it in the U.S.
as well. What do you think about I mean, I
know the debate is right now all about how they're
treating political ads and misinformation. What is your take having
been on the inside. Yeah, And so the things you're
trying to balance here is you're you're trying to balance

(35:36):
not allowing the platform to be abused with how much
power you want these companies to have to control political speech, right? So,
first off, a lot of the public debate about this
is really twisted by people's fascination with CDA 230, the law that protects Internet platforms from intermediary liability for speech. CDA 230 is effectively irrelevant in

(35:57):
the political speech debate because in the United States, political
speech is protected by the First Amendment, even incorrect political speech or outright lies, and so since there's no underlying civil liability, CDA 230 is irrelevant. So I just want to dismiss all of that discussion. There's been
a couple of celebrities who keep on talking out of
turn about stuff they don't really understand, and it's really
mixing up the conversation. But on the political ads debate,

(36:19):
I think from my perspective, the reasonable way forward is
one, to put limits on how much targeting you can do,
about how finely grained you can cut up the electorate
with your political ads. When you think about what are
the things we want out of online political ads, obviously we want them to be honest. We probably don't want them to be super negative. We want them to be a bit about the issues, not about personalities, and we want
(36:40):
bit about the issues, not about personalities, and we want
them to be as universal as possible. Right, And this
is one of the big differences between, I'd say, a television ad, which you can show on one TV station but you're probably hitting a huge audience, and an online ad where you can show it to fifty people. Is that with
the online ads, because they're cheap, because there are

(37:02):
things that you can generate automatically, and because of the
targeting capability, you end up with political actors creating thousands
and thousands of ads that say different things to different
groups of people. And I think that is a really
dangerous direction for us to go, is for the political
class to break up America into all of these tiny
micro segments and to be different people to all these

(37:24):
different segments, and that means, you know, the candidates, but
also the political action committees and the dark money groups
who can do this. And so I think to address that,
the best thing would be a limit, a floor, of what is the fewest number of people you can show an ad to at any time. And in a presidential campaign, that might be like a congressional district, so that's roughly six hundred thousand people, or you might, you know, have

(37:44):
some arbitrary number ten. You know, there's no good empirical
evidence about what number it should be. I think we
should just try something and then we can iterate, Right,
But like the important thing is to make it so
that you're not cutting people up into tiny segments. And this
isn't just a partisan issue. One of the things I
really don't like about this discussion is it's all about Trump.
But like we've got to be thinking about how do
we want politicians to act for a long period of

(38:05):
time in the post Trump era. And even in this election,
it looks like Michael Bloomberg is building the most impressive
online advertising capability in the history of mankind. Right, So
this isn't just a partisan issue. It shouldn't be seen
as such. We should think about how do we want
both parties and eventually both candidates to act um And
so that's I mean that that's one limit. And then

(38:26):
on the ads themselves, I don't
like the idea of Facebook or any other trillion dollar
tech company deciding what is correct or incorrect political speech.
I think a reasonable standard would be that if you're
making a claim about an opponent, that has to be
based in fact, right, but that we shouldn't police claims
about positions in themselves. So if Donald Trump says I'll

(38:48):
make Mexico pay for the wall, then that's something that doesn't
get fact checked. But if he says my opponent is
a criminal and is about to get arrested, then that's
something that that campaign has to show some kind of
factual basis for, or the ad is banned. And I
think that would encourage the campaigns to run ads that
are about their own positions, that are about the topics
in hand, instead of just calling names and pointing fingers,

(39:11):
And it would also reduce some of this kind of
risk in the election. But I don't think we should
move to a world where things are totally fact checked.
We're going to take another quick break, But when we
come back. We've talked about disinformation, but what about actual
spies inside of tech companies? We ask Alex about spies
inside of Facebook. And if you have questions about the show, comments,

(39:34):
really anything, you can text me on my new community
number five four oh three zero. So a lot of

(39:58):
the other tech companies, they have all taken a much
lighter stance on political ads, whether it's banning them or
doing different types of things. I think that the issue
that I'm really interested is like the micro targeting. I
just think like, I think the micro targeting thing is
really important to kind of dig into. And I think
that was the one thing after Cambridge Analytica that I
was just like, who knows how effective Cambridge Analytica actually was.

(40:20):
But the larger issue was this idea of the ability
to micro-target people and manipulate people at this grand scale without kind of the outside world being able to be like, no, that's wrong, or you know, and these spaces seem
really dangerous. And so what's good when it comes to
a political ad, or, you know, what's good when it comes to selling a product? Micro-targeting for trying to sell someone deodorant or something might not be
(40:41):
trying to sell someone deodoran or something might not be
good when you're trying to micro target and manipulate and
change someone's mind when it comes to democracy, which is
I think kind of this other whole debate that that
doesn't seem to be getting as much attention. I think
when it comes to this, Yeah, and the technique that
everybody is thinking about, that we should be
really worried about is this thing called custom audiences. Right. So,

(41:01):
this is the ability that exists on on many different
online platforms to effectively upload a spreadsheet of these are
the people I want to show this ad to, right.
So let's say you're the Trump campaign and you you
have all of these lists, and you want to advertise
to people you think are pro-gun in Michigan, and you've gotten information from the NRA. And so

(41:22):
you upload the list of NRA members in Michigan, and then Facebook matches that list to Facebook accounts and then only shows the ad to those Facebook accounts, right.
So that is the... there's equivalents on Google and Twitter
and a bunch of other ad networks. That is the
kind of functionality that I think we should have limits around.
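[A minimal sketch of the custom-audience mechanism Alex describes above, with invented names and data structures; it is not any platform's actual API, just a model of "upload a list, match it to accounts, show the ad only to matches," plus the kind of minimum-audience floor he proposes later in the conversation.]

```python
# Hypothetical illustration of the custom-audience flow described above.
# All names, structures, and thresholds are assumptions made for clarity;
# real ad platforms typically match on hashed emails or phone numbers.

def build_custom_audience(uploaded_contacts, platform_users):
    """Match an advertiser-supplied contact list against platform accounts.

    uploaded_contacts: iterable of hashed contact identifiers from the advertiser.
    platform_users: dict mapping hashed contact identifier -> internal account id.
    Returns the set of matched internal account ids.
    """
    return {platform_users[c] for c in uploaded_contacts if c in platform_users}

def can_show_ad(viewer_account_id, audience, min_audience_size=1000):
    """Serve the ad only to matched accounts, and only if the matched audience
    clears a minimum-size floor (1000 is an arbitrary placeholder, not a real
    or recommended value)."""
    if len(audience) < min_audience_size:
        return False  # audience too finely targeted; refuse to serve at all
    return viewer_account_id in audience
```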
And to me, that is the real thing that needs
to be paid attention to, coming out of the Cambridge

(41:44):
Analytica scandal. So to be completely honest, the Cambridge Analytica scandal is massively overblown. The media loves
to talk about it without really understanding what happened. Cambridge
Analytica was a scam, right, so this is an advertising company. I'm media, and I feel like I understood what happened. Did I not just say that? No? Who knows, actually, like how effective Cambridge Analytica was,
but what they did and what they were able to

(42:05):
do in kind of the loopholes they were able to
take advantage of. So anyway, just so as the media,
I politely and respectfully push back, present company excepted. So look, Cambridge Analytica did not get Trump elected, Cambridge Analytica did not make Brexit happen, right. They were a scam based upon this crazy idea about psychometric profiling, yada yada. The only reason

(42:26):
people care about Cambridge Analytica is they got a bunch
of their data from stealing it from Facebook. Right. There
are a dozen or two dozen more Cambridge Analyticas that
have much more thoughtful models for how they do this,
and they're not dumb enough to have a huge PR campaign. So, like, one of Cambridge Analytica's basic
problems is they went out and they made the YouTube
videos with their like Bond villain accents, and they took

(42:49):
credit for Trump and took credit for Brexit, and so
people believed their ads, but they're just showmen. So these
other companies are silent. They don't even have websites, they
don't have domains. You can't figure out who they are.
And they don't steal their data from Facebook. They legally
buy it from Acxiom and Experian and TransUnion and a dozen other data brokers. And so they're building much

(43:09):
better models than Cambridge Analytica did, but they're doing the
same kind of idea of we're gonna target very small
groups of people. And so I think that is what
we've got to focus on, is we've got to focus
on how do we limit the capability to do that.
You know, if somebody wants to target, these are the twenty people that are going to buy a Toyota Tundra
in this county and they show them an ad for
a Toyota dealership, I think you could argue that that's

(43:32):
a reasonable thing. It's just like direct mail, right, Um,
But in the political aspect, we don't care about different
Toyota dealers showing different ads to different people, but we
do care about one politician being a different person to
thousands of different segments. And I think that is a
real dangerous direction of kind of flooding the zone with
hundreds of thousands of ads that are very hard for

(43:52):
the media to check, for opponents to check, and the like. Go with me here, so why? I mean, I think that's an important point. So why, what, you know, they just want to appeal to different audiences and show different sides of themselves. So I'm only gonna push you to help people understand. So why is that so bad? Right, because they're not thinking about,
this is a position that's important because of polling. What
they're doing is they're having a computer auto generate ads.

(44:13):
So you have a bunch of different messages and a
bunch of different photos, a bunch of different quotes, and
you feed them into a computer and the computer generates
thousands and thousands of potential ads, and then it runs
each of the ads with a fifteen-dollar, ten-dollar, twenty-dollar budget, right, and then it cross-tests those ads across all these different segments, and then the computer sees, okay, great, this ad did well with this segment, and then
it puts the rest of the money behind just showing

(44:34):
those ads. So you're effectively mass-producing propaganda that is then A/B tested against individuals, and then money is only being put behind the winners, and no human is part of it, right? Like, you can't say there's any part of this that's about real persuasion. It's just about what mathematically did people click on. And I think that's just a really dangerous direction for us to go. What percentage of Facebook's

(44:57):
stance on political ads do you think is partisan, or is political? I mean, like, what they're doing. You'll
hear Zuckerberg argue this is for free speech, and I
and I'm a believer that Zuckerberg believes very strongly in
free speech, that it's not all about ad revenue. That
he does believe really in free speech, you know, to

(45:17):
whatever degree. But you know, do you believe that Facebook
stance on political ads is partly political? So I don't
think it has anything to do with money. I think
the percentage of revenue that's political ads is probably in the
single digits. And that is even an overestimate because every
time you see an ad on Facebook or Google, it is because that ad won a very, very quick auction.

(45:39):
And so if political ads are taken out, that space
will be filled with the deodorant ads and the car ads and such. So they've taken an extreme,
a pretty extreme stance. So is it because of free
speech or is it because they're in a very sensitive
moment politically where you have a lot of politicians coming
down on Facebook. There's a lot of sensitivity around this
world of the election. You know, what, what do you
think it is? So this is I mean, this is

(46:00):
my read from the outside. I think it is political
in that you know, there is a long history of
the Democratic Party being close to Silicon Valley, right, the
vast majority of executives at these companies are very socially progressive.
They are naturally aligned with Democrats. And during the Obama administration there was this movement of people between D.C. and
the companies and the like and a kind of close relationship.

(46:22):
And since then the relationship has soured significantly, and now they end up in a situation where they're seen as hated by both major political parties. The Democrats hate the companies, possibly because they believe they're monopolists, that they're too powerful,
that they are helping Trump get elected. They say that
the companies knew Russia was doing this. I've I've heard
kind of crazy radical theories about effectively that like Russia

(46:44):
called Mark Zuckerberg and said run our ads, which is
just ridiculous. But you know, there's all these theories blaming
tech for bad things on the Democratic side and the
Republican side. They hate tech companies for what they see
as censorship of their views and suppression of their views and bias towards the Democratic side. Facebook ends up being hated by both sides, right. And so I think they are making an explicit decision that, in a world where the Democratic

(47:07):
demand is you can't exist because you're too big and
powerful and we're gonna break you up, and the Republican
demand is carry our ads. That's what you're going to choose, right,
And so I think there's been kind of a cynical
decision to lean much more into the Republican side of
the aisle because the Republicans have been very good at
working the refs. Because the other thing that's been kind
of underdiscussed is there has been a concerted effort by

(47:29):
Senate Republicans and by the Trump administration to do investigations
to bad-mouth the companies. You know, the Trump administration has like five or six investigations open of just Facebook. They've got ones open on Google and Twitter. William Barr doesn't really care about, you know, some of the issues he's raising. This is the Trump administration
throwing a brushback pitch of we control the Department of Justice,

(47:50):
we control the SEC, we control the FTC, we have
the ability to make your life hell. And I
think that has been effective. And again, I don't think
that's necessary because I don't believe the political ad issues
are actually as partisan as people make them out to be. Right,
because of the Bloomberg move, there's a bunch of Democratic donors that are putting a lot of money into digital ad technology and the like, and so I think it actually will

(48:13):
be a much closer fight than Trump versus Clinton was
in twenty sixteen. But that being said, that's not the perception.
The perception is any changes around political ads will only
hurt Donald Trump and only help Democrats. Uh, and so because of that, they're moving to try to neutralize the criticism from the right. And the left has honestly walked into it, because the Democratic Party

(48:34):
is making noise about things like CDA 230 and stuff that's now getting amplified by Josh Hawley and Ted
Cruz and other Republicans who you know. This should have
been a warning to Elizabeth Warren when Ted Cruz endorsed
her tweets about Mark Zuckerberg. Like, Ted Cruz is not
doing that because he actually has the same position on monopoly.
He's doing that because he's looking for levers to try
to keep the companies from enforcing the rules on speech

(48:57):
that is seen as pro-Trump. And I think, unfortunately, there's probably been a very cynical decision to lean into that. Did you feel it was political at all when you were inside? Well, certainly, I mean, the concern of looking political was a huge deal, right. Like
the company never wanted to make a decision that looked political.
But did it act political? What do you mean? Like, did it make decisions that were based on political purposes

(49:21):
or not wanting to upset one or the other side? Oh, yeah, I mean, I think decisions are definitely made to try to stay as neutral as possible, and that has become obvious to folks on the outside, which is why it's now seen as an effective path to work the refs, right. Like, the best example of this was what was called the Trending Topics debate, where there was

(49:42):
kind of a BS article about quote unquote editors inside of Facebook suppressing conservative views. So these were a handful of people who worked on a product that was used by barely anybody, like the little trending topics box, and it turns out that had almost no engagement at all. So their ability to impact what people were looking at on Facebook was actually tiny. But the right completely blew this

(50:05):
up into a huge deal, and that was very smart, because it ended up that in the run-up to the election, there were things the company could have done around fake news, around account verification, around people's amplification, and stuff on advertising, steps that I'm sure were not taken because they did not want to look like they were putting the thumb on the scale on behalf of Clinton. Were you ever worried

(50:26):
about, like, spies inside Facebook? Yes, absolutely, that was definitely a concern, and it continues to be a major concern in Silicon Valley. Increasingly so, or just always? I mean, it certainly was my concern for my entire time, and we did a lot of work around kind of internal security. But I think, I think it's... I mean, what kind of stuck out to you? What? Well, the truth is that

(50:47):
on cyber issues, at least the tech companies are playing
the same game as the major US adversaries, right. So
a lot of things changed in Silicon Valley in two
thousand nine, which was when the People's Liberation Army of the People's Republic of China was caught breaking into Google and thirty-five other companies. So we actually just had the anniversary of all these things. The people who

(51:09):
worked on this at Google have come up with cute little souvenirs. I still have to go pick mine up, and so I got to be involved in that as an incident responder at a number of companies. Since then,
the companies invested a lot into repelling attackers from Russia,
from China, from Iran and the like, and at least
they're in the same category, right. So if you're the
Ministry of State Security of the Chinese or you're the

(51:31):
GRU or SVR, you are taking a risk trying to break into Facebook or Google, or Microsoft or Amazon, or a couple other companies at that top tier. You're taking a risk by trying to break into them with cyber. On human intelligence, we are children compared to these organizations, right. We are children compared to the FSB
and the SVR, which are the descendants of the KGB,

(51:52):
which are the descendants of Czarist Russia's intelligence services. Right, like, countries have been planting spies or turning people using various forms of leverage for hundreds and hundreds of years, and tech companies are very open. They hire people of all kinds of backgrounds, with all kinds of citizenships. We don't have security clearances, we don't put

(52:15):
people through lie detector tests, we don't have single-scope background investigations or lifestyle policies, all of the things that in the U.S. government are used to prevent spies. And still they have spies. So if the NSA can't keep them out with all of that capability, with eighteen-year-old Marines with machine guns and polygraphs, then certainly the tech companies can't. And the best example of this that has come up happened at Twitter. Now, it

(52:37):
was fascinating. The country that got caught was the Kingdom
of Saudi Arabia, right, so the Saudis were paying off Twitter employees to spy on people. But I expect that every major U.S. tech company has at least several people that have been turned by at least China, maybe Russia, probably Israel and a couple of other U.S. allies. Did you ever uncover anybody that had been turned? Um,

(52:58):
I'm not going to comment on that. It's certainly an area in which we did a lot of work and investigation. Can you comment a little bit? What I'll say is there's a lot of weird things that happen at the companies where it's very difficult to figure out why they happened, if that makes sense. So maybe there
wasn't like a solid yes, but there definitely wasn't a solid no. So that would be suspicious activity that you

(53:21):
uncovered in your time at Facebook, right. And so you end up, at the companies, you have to design your internal systems to detect that. You have to limit user data access, you have to classify the pieces of data that are really critical, like fine GPS location, like people's exact GPS location, their private messages. Those things are more important than IP addresses or public messages, and

(53:42):
so you have to have systems to look for the abuse of data internally. Out of curiosity, what does suspicious
activity of a potential spy look like? Um, that's a
great question. So what you probably want to do is
you probably want to have an understanding of who potential
targets are. So let's say I was advising a new startup.
Let's not say I was at Facebook. Let's say

(54:03):
you're starting up and you believe you're going to have
a big company that's gonna have a lot of interpersonal communication.
I'd say one of the things you're gonna want to do is you're gonna want to identify high-risk targets, and you're gonna want to have special kind of monitoring of their accounts and access. You're gonna want to look at the
patterns of data access. So if you're a software engineer
and you need to debug a problem, then you're going

(54:23):
to have to look at data that's tied to the exact problem. You're not going to end up looking at random accounts, and so you should be able to see that somebody opened up a bug. They were assigned this task. This task is tied to this one account.
They access that account, that's legit. If somebody just sits down and all of a sudden they look at five accounts in a row, and they're all not related to a task that they normally do, that's the kind of

(54:44):
thing that should set off an alert and a human investigation.
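To make the pattern Alex describes concrete, here is a minimal sketch of that kind of check, assuming a hypothetical access log and task list; none of these names come from Facebook's actual systems. The idea is simply to flag any read of account data that isn't tied to a task assigned to that engineer, and to always flag reads of accounts designated as high-risk targets.

    # Hypothetical sketch only; field and function names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class AccessEvent:
        engineer: str    # who read the data
        account_id: str  # whose data was read

    # Accounts tied to tasks/bugs currently assigned to each engineer.
    assigned_accounts = {"eng_42": {"acct_1001"}}

    # Accounts designated as high-risk targets and given special monitoring.
    high_risk_accounts = {"acct_9001"}

    def needs_review(event: AccessEvent) -> bool:
        """True if this access should set off an alert for human investigation."""
        tied_to_task = event.account_id in assigned_accounts.get(event.engineer, set())
        if not tied_to_task:
            return True  # access with no assigned task behind it
        if event.account_id in high_risk_accounts:
            return True  # any touch on a high-risk target gets a second look
        return False

    # An engineer suddenly reading five unrelated accounts in a row:
    events = [AccessEvent("eng_42", f"acct_{n}") for n in range(2000, 2005)]
    print(sum(needs_review(e) for e in events), "accesses flagged for review")

The lookup itself is trivial; the point Alex is making is that the alert has to reach a human who actually investigates.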
Interesting, very interesting. I mean, increasingly so, I can only imagine that's got to be a pretty big problem, or increasingly a problem. Probably always a problem. That's something that people have to pay attention to now. Yeah, I
mean it's the big Silicon Valley companies are not the

(55:05):
most powerful intelligence agencies in the world, but they're in
the top twenty, right. Like, in terms of capability, if you're Google or Facebook or Microsoft or Amazon, I think I'd put those four at the top. Those four companies can know what's going on in the world, both in the cyber domain and in the physical world, based on people's communications, as well as lots and lots of state-level actors can, and so that means that they

(55:27):
have to be incredibly careful of how they use that power,
and they have to protect that power from other people
who are looking for some level of asymmetry. What do
you think is the biggest security weakness that everyday people
still have in their tech lives? Oh, by far, nothing is close to this. It's the reuse of passwords. The
fact that the vast majority of people in the world
use the same password or a slightly modified password on

(55:48):
every single website. Nothing is, like, there's nothing even close to
how much privacy violation happens, how much damage to people
happens because of that problem. And so for all these things,
I mean, people are worried now about, like, getting hacked by MBS and malware on their phone and, you know, Chinese spies and stuff. Like, the number one thing that
any normal person can do is they can download a

(56:09):
password manager. There's three or four good ones, and they
can go and generate random passwords for all the different
sites in their life. And so you have one good
password to log into your password manager, and everything else has a random one. Or if you're more old school, get a little black book and write all your passwords
down in the book, because a hacker can't, from six thousand miles away, reach into your purse or your pocket and steal that book, right, which is what's happening

(56:31):
with people's passwords.
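For a concrete picture of what a password manager is doing under the hood, here is a tiny sketch using Python's standard secrets module to generate an independent random password per site; the site names are just placeholders, not a recommendation of any particular tool.

    # Minimal sketch: one independent random password per site, never reused.
    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

    def random_password(length: int = 20) -> str:
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    # Example site names are placeholders.
    for site in ["examplebank.com", "examplemail.com", "exampleshop.com"]:
        print(site, random_password())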
So funny that, like, out of everything happening, you're like, just don't use the same password. Yeah, no, it's by far the biggest one. At Facebook, we caught about five hundred thousand accounts every single day where a bad guy came in with the right username and password, and we detected this is actually a bad guy, they're not the person who's supposed to be in. Half a million a day. Like, there's all these data breaches and stuff,

(56:52):
there's nothing close to the amount of data breach that happens every single day just due to stolen passwords being reused.
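Facebook's actual detection logic isn't public, so purely as a hypothetical sketch of the idea Alex is describing, a login that presents the correct password can still be scored as a likely account takeover from other signals:

    # Hypothetical illustration only; signal names and weights are made up.
    def looks_like_takeover(login: dict) -> bool:
        score = 0
        if login["new_device"]:             # never-before-seen device or browser
            score += 2
        if login["impossible_travel"]:      # location jump too fast to be real
            score += 3
        if login["password_in_breach"]:     # credentials appear in a known dump
            score += 2
        if login["many_accounts_from_ip"]:  # same IP trying many accounts
            score += 3
        return score >= 4

    login = {"new_device": True, "impossible_travel": False,
             "password_in_breach": True, "many_accounts_from_ip": True}
    print(looks_like_takeover(login))  # True: step up verification, reset the password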
Um, I want to end on something you said to me. I want to go back to that drive that we took on your last day, um, to Facebook. I thought you said something really interesting, because we were driving to Facebook, as it was your last day, and you said, I

(57:13):
think the company is lasting, but I think also that the 101 is littered with the old headquarters of companies that thought they'd be around forever. That's a constant reminder of Silicon Valley, the death and the
rebirth of those products and these organizations. Nobody can get
too comfortable and too arrogant about how long anybody can
be around and how long you can be on top.
Do you think Facebook lasts? I don't know. I think

(57:37):
the most accurate criticism of the big tech companies is
that they've effectively locked out competition. I think that the competition issue is big, and it's because they've been able to use their access to huge amounts of cash, but then also an understanding of what people want to do online, to either buy or copy their smaller competitors,

(57:58):
and so the vision of all of these dead companies, you know, that kind of... Famously, part of Facebook's campus used to be Sun's campus, and the back of the Facebook sign is the Sun Microsystems logo. It has not
been painted over as kind of a reminder of the
turnover in Silicon Valley. I think that's a good thing.
I think it's good for these companies to grow and die.
I think that's one of the reasons why Silicon Valley

(58:20):
has been so competitive with the rest of the world
is because we have this ecosystem. But it does start
to feel like some of the big companies have been
able to block out their smaller competitors through a number
of means, and so I think that is an area
that we have to work on. Yeah, I go back
to this idea of a little bit of hubris and
that filter bubble you spoke about at the beginning of this interview, because it would be too

(58:42):
simple to say all these people are bad. It'd be
too simple to say that this place is just here to do evil. Because both of us have
been in this a long time because we cared about technology,
we cared about people. Um. But there has been a
very insular mentality in a very specific way at some of these bigger companies. Um. I've covered Facebook very closely

(59:02):
and they missed a lot. You know, there's a
lot of stuff they missed. And so I think it's
really interesting that you're now like on the outside and
that you thought that this is where you can make
the most impact. I mean, I actually think it's kind
of extraordinary sitting here and saying like you had this
big role within a company where you could impact over
two billion people, but you thought you could make a
bigger impact on the outside. I mean that is pretty extraordinary. Well,

(59:24):
I think part of the root problem for the tech companies is there's bubbles within bubbles, right. So each corporate campus is a bubble, but then the real decision makers at the top are living in an even smaller bubble, right, where they are surrounded by folks who have gotten to the place they are because they've been successful in
hitting certain metrics. And those metrics are all about growth

(59:45):
and users enjoying the product and advertising revenue and all
those kinds of things. And so almost all of the problems that tech companies faced over the last three or four years, almost all of them, somebody inside the companies understood what was happening. It just didn't make it to the right people. And so, you know, if I had advice for my friends who are still there, it's building a culture where people can be Cassandras and can

(01:00:10):
say a storm is coming, this is a problem we
have to deal with, and that's taken seriously, and that's
bubbled up to the top, and that changes how you
build products, and it changes how you approach the world.
That is the most important thing they can do if
they want to survive. Silicon Valley is now a corporate

(01:00:35):
machine with billion dollar implications and an incredibly human impact.
But there's anger and frustration and people retreating into their corners.
You've heard our last two guests on this show frustrated
they couldn't speak more openly within their own companies. And
there's another layer, concerned about saying anything controversial at a

(01:00:55):
time when tech is in the national spotlight. I worry
that people will go even farther into their corners, that
Cassandras really can't come forward. Now, of course, this isn't just tech. Speaking hard truths to power, raising red flags, not a new concept. Okay, we're going to try something new.
What do you think? Do you have something to say?

(01:01:17):
What hard truths are you struggling with? One of my
favorite parts of the job is getting to hear from people.
So text me. We just launched a community phone number
for First Contact. And yes, I mean this when I say it actually comes directly to my phone, so reach
out to me. The phone number is zero three zero,

(01:01:38):
that's a U.S. number. I'm Laurie Siegel, and this is First Contact. For more about the guests you hear on First Contact, sign up for our newsletter. Go to First Contact podcast dot com to subscribe. Follow me. I'm at Lorie Siegel on Twitter and Instagram, and the

(01:02:00):
show is at First Contact Podcast. If you like the show, I want to hear from you. Leave us a review on Apple Podcasts or wherever you listen, and
don't forget to subscribe so you don't miss an episode.
First Contact is a production of Dot dot Dot Media.
Executive produced by Lorie Siegel and Derek Dodge. This episode
was produced and edited by Sabine Jansen and Jack Reagan.

(01:02:21):
Original theme music by Zander Singh. First Contact with Lorie
Siegel is a production of Dot dot Dot Media and
I Heart Radio.