
November 7, 2019 · 46 mins

More than 2.1 billion people use Facebook or one of its services like Instagram and WhatsApp every day. Lately though, the company that started out with the noble vision of “making the world more open and connected” is facing some serious questions about the part it plays in a lot of harmful activities like spreading misinformation, mishandling its users’ personal information, and increasing the deep divisions of our already polarized nation. On this episode of Next Question, Katie shares her recent headline-making interview with Facebook COO Sheryl Sandberg at the Vanity Fair Summit in Los Angeles. It’s a tough, no-holds-barred conversation that gets to the heart of the question on everyone’s mind: is Facebook doing enough to protect its more than 2 billion users and our democracy?

 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Next Question with Katie Couric is a production of iHeartRadio and Katie Couric Media. Hi everyone, I'm Katie Couric and welcome to Next Question, where we try to understand the complicated world we're living in and the crazy things that are happening by asking questions and by listening to people who really know what they're talking about. At times, it may lead to some pretty uncomfortable conversations, but stick

(00:24):
with me, everyone, let's all learn together. More than two
point one billion people use Facebook or one of its
services like Instagram or WhatsApp every single day. That's
nearly one third of the entire world's population. But recently

(00:46):
the company has gone from the brilliant brainchild of a
Harvard dropout named Mark Zuckerberg to one of the most
controversial companies on the planet. He was recently grilled on
Capitol Hill by members of Congress concerned about the platform's increasing footprint in almost every aspect of our lives. Sure,
Facebook can bring communities together, help you share photos with

(01:09):
your family, and even start movements, but it can also
unfairly impact elections, spread misinformation, create a safe space for
child pornographers and white supremacists, invade our privacy, exploit our
personal information, and increase the deep divisions of our already
polarized nation. That's quite a laundry list, isn't it? And

(01:31):
with the election fast approaching, you may be wondering if
it might be deja vu all over again, and worried that,
to borrow a phrase from the nineteen sixty six movie
The Russians Are Coming, The Russians are Coming, not to
mention China and other foreign powers, and the company's recent
decision not to fact-check political ads led to a

(01:53):
heated debate on social media between Zuckerberg and Democratic presidential
candidate Elizabeth Warren, who said the platform had become a quote disinformation-for-profit machine, and she even placed an ad on Facebook saying Zuckerberg was supporting Trump for president
to test if it would be removed. It wasn't. Meanwhile,

(02:13):
more than two hundred and fifty of its own employees
signed an open letter warning that the ad policy is
quote a threat to what Facebook stands for. So I
was impressed that the company's COO, Sheryl Sandberg, was willing
to sit down with me recently at the Vanity Fair
New Establishment conference in Los Angeles. She's been with the

(02:35):
company since two thousand eight and has played a pivotal
role in shaping both its culture and its business strategy,
leading it to more than twenty two billion dollars in
profits last year. She's also an advocate for women in
the workplace with her two thousand thirteen book and organization
Lean In. And I got to know Sheryl after her husband, Dave,

(02:57):
died unexpectedly in two thousand fifteen. She reached out because
I too had lost my husband at an early age.
Sheryl wrote a book about her experience, called Option B,
and I interviewed her for that back in two thousand seventeen.
If you're interested, you can find that interview in my feed.
Our recent conversation at the Vanity Fair summit got a

(03:18):
lot of attention, and I thought it made sense to
share it with all of you on my podcast. So
my next question for Sheryl Sandberg: is Facebook doing enough to protect its more than two billion users and our democracy? Or is it time to unfriend Facebook? Sheryl, thank you

(03:39):
for being here. We have a lot to talk about,
as you know, so let's get right to it. We're
just over a year from the election. Three hundred and
seventy-eight days to be exact, but who's counting? Yeah. But I think the way Facebook addresses and fixes the platform that was used in two thousand and sixteen is seen

(04:00):
as a major, critically important test. I know certain measures
have in fact been implemented, for example, thirty five thousand
moderators looking for fake accounts and suspicious patterns. Mark Zuckerberg
announced new safeguards like labeling media outlets that are state controlled.
But do you believe that's enough? I mean, do you
really seriously believe that we won't witness the kind of

(04:21):
widespread interference we saw in two thousand sixteen. Well, we're
gonna do everything we can to prevent it. Um. I
do think we're in a very different place. So if
you think back to two thousand sixteen, we had protections against state actors,
but when you thought about state actors going against a
technology platform, what you thought of was hacking the Sony emails,

(04:41):
the DNC emails, stealing information. And that's what our defenses were really set up to prevent, and so were everyone else's. What we totally missed, and it is on us for missing it, and everyone missed this, was not stealing information but going in and writing fake stuff. That was a totally different threat, and our systems weren't set up to deal

(05:02):
with it. So the question is as you're asking, what
are we doing going forward and how are we going
into the election? And how did we do in two thousand eighteen? We're in a totally different place. The FBI has a
task force on this. They didn't have anyone working on it.
Homeland Security is working on it, all the tech companies
are working together, because when you try to interfere on

(05:22):
one platform, you try to interfere on another. In two thousand sixteen, we didn't know what this threat was. In two thousand seventeen, we did one takedown. In the last year, we did fifty. And I read a shocking number. You took down more than two point two billion fake accounts in a three-month period. That's right. We take down millions every day. So thirty-five thousand

(05:45):
moderators, is that even enough? Given, I mean, two point two billion is almost the number of people who are on the platform. So the moderators are looking for content. The fake accounts are being found with engineering. That's the only way to find those fake accounts. Most of those are found before anyone ever sees them. And fake accounts
are a really important point here because everything that was

(06:08):
done by Russia in two thousand sixteen was done under a fake account.
So if you can find the fake accounts, you often
find the root of the problem. And so we are
now taking down millions every day, almost all of which
no one has seen. You talked about disrupting fifty individual
campaigns from multiple nation states so far. But what about

(06:28):
domestic threats. Facebook's own former security chief Alex Stamos has said, quote,
what I expect is that the Russian playbook is going
to be executed inside of the US by domestic groups,
in which case some of it, other than hacking, is
not illegal. My real fear, he says, is that in two thousand twenty

(06:48):
it's going to be the battle of the billionaires of
secret groups working for people aligned on both sides who
are trying to manipulate us at scale online. So what
is Facebook doing to defend the platform against this
kind of domestic threat. It's a really good question, because
things are against our policies if they're fraudulent or fake accounts,

(07:10):
but people can also kind of deceive. Again, if you
look at where we were and where we are, the
transparency is dramatically different. So you look on every page
on Facebook, you can now see the origin of where
the person is. So if someone has a page that's called, I don't know, US Whatever, but they're from Ukraine, it's clearly marked. If you look at

(07:32):
our ad library, which we didn't have last time, you
can see any political ad running actually anywhere in the
country or in most places of the world, even if
they're not targeted to you. So before, if they were
trying to reach you, you could see it, but you
couldn't see anything else. Now you can see everything. And
we rolled out a Presidential Ad Tracker so that you
can see the presidential campaigns much more holistically. So with

(07:55):
the transparency measures we have, people should be able to see who is behind what they're seeing. We're trying to get rid of the fake accounts, and for the ones that are legitimate, whether they're run domestically or globally, make sure people understand who is behind them. But then why did Facebook announce its decision not to fact-check political ads last month? I know the RAND

(08:15):
Corporation actually has a term for this, which is truth decay.
And Mark himself has defended this decision even as he
expressed concern about the erosion of truth online. So what
is the rationale for that? And I know you're gonna
say we're not a news organization. We're a platform. I'm
not going to say that, but it's a really important question,

(08:38):
and I'm really glad to have a chance to take
a beat and really think about it and talk about it.
So one of the most controversial things out there right
now is what ads do we take? What ads do
others take? And do we fact check political ads? And
it is a hard conversation and emotions are running very
high on this. I also sit here realizing it's however
many days you said before the election. So the ads

(09:00):
that are controversial now? We have not even seen the
beginning of what we're going to see. There are going
to be a lot of controversial ads and controversial speech.
So why are we doing this? It's not for the money. Let's start there. This is a very small part of our revenue, five percent or something. We don't release the numbers,
but it's very small, very small, and it is very controversial.

(09:21):
We're not doing this for the money. We take political
ads because we really believe they are part of political
discourse and that taking political ads means that people can speak.
If you look at this over time, the people who
have most benefited from being able to run ads are
people who are not covered by the media so they
can't get their message out otherwise, people who are challenging

(09:43):
an incumbent, so they are a challenger, and people who have different points of view. That's been true historically. And so we also have this issue that if, let's say, we took political ads off the service, we would still have all the issue ads. So I'm running an ad on gender equality, I'm running an ad on another political issue. Those ads are much, much, much
bigger in terms of scope than the political ads, so

(10:06):
you would have every voice in the debate except the
politicians themselves. So instead, what we're doing is as much
transparency as possible. Every ad has to be marked by
who paid for it. We're doing verification to make sure
the people are who they say they are. And that ads library I started talking about is really important because you can't hide. You can't run one ad in one state, one ad

(10:28):
in another, one ad to one group, one ad to another.
Anyone can go into that library and see any ad
that any politician is running anywhere. Well, this is what
Vanita Gupta wrote, the former head of the DOJ Civil Rights Division, in Politico. Simply put, she wrote, while major news organizations are strengthening fact-checking and accountability, Facebook is saying,

(10:50):
if you are a politician who wishes to peddle lies, distortion, and not-so-subtle racial appeals, welcome to our platform. You will not be fact-checked; you are automatically newsworthy. You're automatically exempt from scrutiny. So I know Vanita, and I've had a chance to speak to her since she posted that, and I think the debate

(11:10):
is really important. I've had a chance to work with
her on our civil rights work. We've taken a lot of feedback from her already and continue to. What she was writing there was not only about ads, it was really
about content on the platform. So taking a step back,
here's what we do. When you write something. We have
a very strong free expression bent. We think it's very
important that we judge as little as possible and let

(11:33):
people express themselves. But we don't allow just anything on the platform.
If something is hate, terrorism, violence, bullying, you know, hate against protected classes, it comes down, we take it off; voter suppression too. If something is false, misinformation, fake news, we don't take it off. We send it to third-party fact checkers. If they mark it as false, we mark it as false.

(11:56):
If you go to share it and it's marked as false,
we warn you with a pop up and we say,
do you want to share this it's been marked as false?
We dramatically decrease its distribution, and we show related articles. How can you possibly do
that with two point seven billion users? How can you
possibly keep up with all the content that's being produced

(12:20):
on Facebook and distributed and shared, etcetera? We can't fact-check everything. We're not trying to fact-check everything or send everything to third-party fact checkers at all. We
prioritize in terms of what's going most quickly. So when
something is growing really quickly, it gets referred, it goes
to the top of the heap for sending to fact checkers.
And these are really news links. You know, if you're

(12:40):
a bad example because you're a media journalist, but you know,
if my sister writes a post about her kids and
her dogs, which she does all the time, that's not
getting fact-checked. That said, the challenges of scale here
are really important, and in a lot of the areas
where we are reluctant to weigh in, it's because we
know we can't do this well at scale, so we

(13:01):
have to rely on other sources. I think one of
the most important things we're rolling out in the next
year is our Content Advisory Board. We understand that there
are real concerns with the amount of power and control
we have that right now we are the ultimate arbiters
of what stays on our service, and so we're setting
up a content review board. The final charter has just
been released. We've consulted with over a thousand experts around

(13:24):
the world, and there are going to be forty people appointed, and by next year they're going to start hearing cases. They don't report to me, they don't report to Mark. It means that if you disagree because something was pulled down and you think it should be up, or if you disagree because we are letting something from someone else run that you don't think should, you have a place to go,
and we're going to abide by their decisions. Since two

(13:44):
thirds of people get their news and information now from
social media, do you have any responsibility in your view
to at least attempt to make sure that the news
on your platform is factual? Because oftentimes I've heard, well,
we're a platform, we're not a publisher, right, and so
we're basically the pipes. So where do you see your

(14:07):
responsibility in terms of that? So we do think we
have a responsibility for fake news and misinformation. Would you still say you're not a publisher? Well, what would you call it? So that is a complicated thing and it
means different things to different people. Here's what we are.
We are a technology company. A lot of things are
published on us. But what I think when people ask
that question, they're wondering if we take responsibility for what's

(14:30):
on our service. And my answer to you is yes,
we're not a publisher in the traditional sense because we
don't have editors who are fact checking, but we take
responsibility and what we've done on misinformation has decreased people's interactions.
Stanford just published a study; they're down by more than half since two thousand sixteen. It's not perfect, we're not able to fact-check everything.

(14:50):
But we had no policies against this in the last election,
and you fast forward to today. I think we are
in an imperfect but a much stronger position. Let's talk
about the free speech rationale at Georgetown. Mark used Martin
Luther King Jr.'s name in his defense of free speech
on Facebook, but King's daughter, Bernice tweeted, I'd like to

(15:11):
help Facebook better understand the challenges that MLK faced from
disinformation campaigns launched by politicians. These campaigns created an atmosphere
for his assassination. And then Sherrilyn Ifill, as you know, president of the NAACP Legal Defense Fund,
called his speech quote a profound misreading of the civil

(15:32):
rights movement in America and a dangerous misunderstanding of the
political and digital landscape we now inhabit. It was a
controversial speech, and I think the civil rights concerns are
very real. Um. In terms of Bernice King, you know, her father's legacy, I know her. I actually spoke to

(15:53):
her after that tweet, totally scheduled separately. She's coming to
Facebook tomorrow and I'm going to be in your chair interviewing,
and then I'm hosting her for dinner tomorrow night. And
what I told her is what I'll say to you,
which is that I was grateful she published. We would
have liked her to post on Facebook, not just tweet,
but we were grateful she spoke out because this is
the dialogue we want to have. And she actually tweeted

(16:13):
again this morning that she heard from Mark and is
looking forward to sitting down and talking with him about civil rights. She's smooth, isn't she? I mean, these are just facts. She tweeted it; you can check. Again, to my friend Bernice, we'd like you to post on our platform too.
this is the dialogue, right, there's a lot of disagreement.

(16:35):
Civil rights and protecting civil rights are hugely important to Mark,
hugely important to me. I'm personally leading the civil rights
work at Facebook and we'll continue to do that. And
while we don't agree with everything, and there was certainly
disagreement over some of Mark's speech, there were other things
that we've done because we've listened and learned from them
over the last year, that I think they feel really
good about. We've taken much stronger steps on hate, looked

(16:58):
at white nationalism and white separatism because they informed us
of it. We've really come down on a very strong
policy on voter suppression. We are taking down voter suppression
as hate. If you publish, you know, the polls are open on Wednesday, not Tuesday, we're taking that down because it's as important to us as hate. And that's all based on that work. So why is voter suppression more

(17:19):
important than voter misinformation. It's not. It's not more important.
It's just a question of how we handle it when
we have misinformation. What we believe is that unless it's hate or going to lead to real-world violence, we need to let the debate continue. We dial the distribution massively down. As I said, we don't want things to
go viral. We mark them as false, but then we

(17:42):
publish related articles. Here's the other side of the story.
We think that's how people get informed that it's the
discourse about the discourse. It is Mark giving a speech and Bernice King disagreeing with it publicly, and it's that dialogue that matters. Whereas if it's hate or if someone's really
going to show up to the polls the wrong day,
we just want to take it off our service. And

(18:04):
this is really hard because one person will think something
is clearly wrong, they really disagree with it, and maybe we do too, but someone else thinks it's their free expression,
and so these lines are going to continue to be
really hard to draw. Do you really think that people
use Facebook as an opportunity to look at both sides

(18:24):
and to see something when it's corrected, or don't you
think that people are getting stuff in their feed that
is really just affirmation of what they already believe? And I'm so glad you
asked this because there's actually really strong data here and
no one understands this. So when you think about your
contacts in the world, psychologists say you have what's called

(18:44):
your tight circle of contacts and then your broader circle
of contacts. So you basically can keep in touch with five to seven people. That's your mom, your daughters, your husband John, the people who you know where they are. What Facebook enables you to do is keep in touch with many
more people. Without Facebook, without social media, without Instagram, Twitter,
you won't hear from your college friends or the people

(19:06):
you grew up with that often. So if you compare
people who do not use social media to people who do,
the people who use social media see much more broad
points of view because if you don't use social media,
you go to maybe one or two news outlets. They
have one particular point of view. You read one or
two newspapers, and that's it. On Facebook, you will see,

(19:27):
on average, some of the news you see will be from another point of view, which means it's not half and half, but it is a broadening of your views. And that's something that I don't think we've been able to explain so that other people really understand. And the reason for that is, if you go to your news feed, you don't see like half blue and half red. You just see more from the other side than you

(19:50):
otherwise would. So it is unequivocally true that Facebook usage
and usage of social media shows you broader points of view,
not narrower points of view than you would see otherwise.
And that's something no one understands. When we come back,
we take a deep dive into the rise of deep fakes,
Facebook's role in the increasing polarization of our country and

(20:11):
what the consequences should be if the company doesn't put
the proper safeguards in place for the presidential election. Let's talk
about the free speech argument, which came under attack earlier
this year when Facebook decided not to take down that
doctored video of House Speaker Nancy Pelosi. Her speech was

(20:34):
slowed down, it made her appear to be slurring her words,
so that people thought she was drunk. You defended the decision
by saying, we think the only way to fight bad
information is with good information, and you said it had
been marked as false. But at that point, Sheryl, it
had been viewed two point five million times. So isn't
the damage already done at that point, like when you

(20:56):
do a correction in the newspaper two days later in tiny print on page two. And studies have shown if you see the false story enough and the correction fewer times, then the false story actually stays in your head. Not to mention, another study by MIT found that fake
news spreads seventy times faster than real news on Twitter.

(21:18):
So I guess, isn't the current standard operating procedure on
videos like this a case of too little, too late?
I think with the Pelosi video, and we have said that publicly, our fact checkers moved way too... not the fact checkers, the process. The process for getting it to them and getting it back moved way too slowly, and we've made a change in how we
do that to prioritize things that are moving quickly and

(21:40):
massively cut down the review time. In that case, we
should have caught it way earlier. We think you're right,
and we want our systems to work more quickly, because now
the technology allows people to appear that they're doing something
and are saying something other than what they're actually saying.
I mean, how do you keep up with all of
those things? Well, deep fakes is what you're talking about.
It is a new and emerging... And that's what I meant, deep fakes. Yeah. And it's a new and emerging area,

(22:02):
and it is definitely one that we don't believe we
know everything about because we don't even know what they're
gonna look like. Here's what we know. We know we're
gonna need to move way, way, way faster. We know
we're going to need very sophisticated engineering to detect them
in the first place. We also know that the policies
themselves are hard to set right, and so this
is an area where we know we moved too slowly

(22:24):
with the Pelosi video. We are trying to move faster.
But we're also setting up working groups and AI working
groups to try to develop the technology that will help
us identify these in the first place. I wanted to
ask you about Joe Biden because I know he's cut
down substantially on his Facebook ad spending because he wasn't
seeing very good return. Some strategists have speculated that his

(22:45):
message is too centrist and lacking the inflammatory red
meat content that does so well on platforms like Facebook.
Are you concerned that you are creating an environment where
the most aggressive, inflammatory, tribal content is what sells.
I know you addressed that briefly in saying that people
get different points of view, but certainly these people seem

(23:09):
to gravitate towards that kind of content. I mean, I
think that's true across political discourse. I think it's a
problem we face. I think you see it in rallies.
I think you see it in the debates. I think
the problem of people making more inflammatory statements and people
rallying to those, particularly as things get more polarized, is
a real problem. I don't think it's unique to us.
But do you think you've contributed to the polarization in

(23:31):
the country. Um, I think everything has contributed. I do
think Facebook is, I think, held accountable for that. Well,
I think we have a responsibility to help make sure
people get real information, to help participate in the debate,
and make sure that people can see other other points
of view. So I think, but are they getting real information?
if they are getting the most aggressive, inflammatory,

(23:54):
in other words, sort of more moderate points of view,
they're not as provocative. They don't stoke outrage as
much as some of this other content. Look, I think
that's true. I think you see it in rallies too.
I think you see it on social media. I think
you see it in rallies: when does the crowd cheer? I think you see it in the debates. But I
think here's what matters. What matters is that we want
people to be able to communicate, express themselves. We want

(24:16):
people to register to vote and stay in the political process.
What I will most worry about is if people start
opting out. So one of the things I'm proud of
that Facebook has done is we registered over two million
people to vote, and on Facebook, when you turn eighteen,
we basically say happy birthday, and you should register to vote.
We have a really easy tool that lets you find

(24:38):
your local representative. Most people don't know who their local
representative is. So yes, I worry about all that, but
we also worry about core engagement to making sure that
people don't just opt out, but stay engaged, that they vote,
that they know who their representatives are, they know who
they're voting for, and they participate in the debate. Mark said
recently in a leaked audio from an internal Facebook meeting

(24:59):
that if Elizabeth Warren becomes president and tries to
break up the company, it would be an existential threat
and Facebook would go to the mat. What does that
mean exactly, go to the mat? We'll have to see.
But, I mean, what we'll have to see is what this is. This is
about whether or not Facebook should be broken up. And

(25:21):
that's a really important question. I think we're facing it.
I think all the tech companies are facing it. And it's interesting. What do you think about the fear of that? Well, I don't know if it's the biggest fear. I just think it would... Would you be okay if it was broken up? Well,
we don't want Facebook to be broken up because we
think we're able to provide great services across the board.

(25:42):
We think we're able to invest in security across the board.
So do you invest enough in security? Across the board, we invest a lot. We're investing much, much, much more.
have hired an extra thirty five thousand people, We've put
tremendous engineering resources, and we're doing things like red teams,
asking what do we think the bad guys would do
and how would we do it. So we're never going

(26:03):
to fully be ahead of everything. But if you want
to understand what companies care about, you look at where
they invest their resources. And if you look back three
to five years and you look at today, we've totally
changed when we invest our resources. And my job has
changed too. If I look at it, I've been at Facebook
eleven and a half years. For the first eight or so,
I spent most of my time growing the company and

(26:25):
some time protecting the community. We always did some protection, but now that's definitely flipped. My job is majority building the systems that protect and minority growing. And so we're
definitely changing as a company. We're in a different place
across the board on all of these things. Do you
think you're changing enough, fast enough? I hope so. We're trying.
We're definitely trying. I mean, I think it's about not

(26:47):
just the current threats, but the next threat. The question
we ask ourselves every day is, okay, we know what happened in two thousand sixteen, and now we're going to work to prevent it.
What is the next thing someone is going to do?
And that's going to take a lot of thought and
a lot of cooperation across the board. Do you see
breaking up Facebook as the existential threat Mark Zuckerberg described?

(27:08):
And how are you feeling about Elizabeth Warren these days?
So I know Elizabeth Warren... And would you support her if she's the Democratic nominee? I mean, I'm a Democrat. I have supported Democratic nominees in the past. I imagine I will support the Democratic nominee if it's Elizabeth Warren. I mean,

(27:29):
I'm not in the primary right now. I think that's
a good place for us to be, and so I'm
not going to let you drag me into the primary.
But I am a very well understood Democrat. I was
a supporter of Hillary Clinton. I have spoken for many
years about my desire for my daughter and yours to
see a woman as president. And so I'd like... That sounds like a yes. I'd like that. Not just here,

(27:51):
I'd like that all over the world. I have this
really funny story from a friend of mine in Germany
whose son, I love this, said to his mother when he was five, I can't be chancellor. And she said, why not? He said, well, I'm not a girl. Because of Angela Merkel, because the only chancellor he has ever known was Angela Merkel. That's pretty good. You've said yourself that you have to get two thousand twenty right. What should be the

(28:13):
consequences if Facebook doesn't. I mean, I think we have
to earn back trust. Trust is very easily broken. It
is hard to earn back. I think we have to
earn back trust. I think we need deeper cooperation across
the board. We are arguing for regulation in some of
these areas, including things that would impact foreign interference, and

(28:36):
I think the consequences to us will be grave if
we don't. What is it? What does that mean, the consequences will be grave? I think it would further erode trust. I think people will have less faith
in the platform and our services. People are continuing to
use our services. That's trust we need to earn back,
not just with what we say, but what we do.
And it is about finding those syndicates and taking them down.
It is showing that we can cooperate across the board,

(28:58):
on both sides of the aisle, in Congress and around
the world to find the things that threaten our democracy.
What can other people do to help Facebook solve some
of these problems? Well, thank you for the question. I mean,
I think there's a lot of things. So one of
the things that makes us very different than where we
were years ago is I think pretty radical transparency. So,
for example, our community standards are completely public. We go

(29:21):
public every week or so, I think every two weeks, with here's some of the decisions we're making, and we take feedback. We're publishing a transparency report; by next year we're gonna do it every quarter, just like earnings, because it's just as important to us as earnings. It has: here's all the stuff we took down, so here's how many billions of accounts, and that's where that number comes from.

(29:42):
Here's how much terrorism content, Here's how much hate speech,
and then how much of it did we find before
it was reported to us? So what that report shows
is, for ISIS and Al Qaeda content, what we take down, we find it before it's reported. For hate speech, we're in the mid-sixty percentages now. That's more than double where
we were a year and a half ago, but it

(30:03):
still means that thirty percent of the hate speech we
take down has to be reported to us, which means
someone has seen it. And so it's whack-a-mole in a way, though, Sheryl, in that everything you take down, something pops up in its place. How can you ever really get control over this? Well, it is like whack-a-mole, right? We take something down. I mean, right now, as
you and I have spoken on this stage, many

(30:24):
people have posted things. Our job is to build technology
that takes that down as quickly as possible, and have
enough human staff that they can take down the rest
really quickly. It is whack a mole, but it is
the price of free speech. We have services that two point seven billion people are using. That
means that there's going to be, you know, all the

(30:45):
beauty and all the ugliness of humanity. And our job,
and it is whack a mole, is to get as
much of the bad off as quickly as possible and
let the good continue. And the only way to get
rid of all of it is to shut down all
of these services. And I don't think anyone's really for that.
What about temporarily shutting them down so you can fix
the problems? Would you ever do anything like that? I

(31:06):
don't think the temporary shutdown would fix the problems because
we have to be in the game to see what
people are doing to build the systems to shut it down.
But the point is people have speech now. Like, if
you think about my childhood, right, I grew up in Miami.
I went to public school. If I wanted to say
something to the world, I had no opportunity to do it. Couldn't get on your show. No one was... No. Seriously,

(31:26):
you weren't going to take me as a guest. But hypothetically, I couldn't get on. I could write an op-ed to the local paper; they weren't going to take it. People did not have voice, full stop. Now,
that was a world that people actually felt pretty comfortable in, and you could fact-check everything. Fast forward to today: whatever services get shut down, you can post somewhere,

(31:49):
which means that everyone has voice, which means that things
are not fact checked. Now that doesn't mean we don't
have responsibility. We do, but we are in a fundamentally
different place where people around the world have voice. And
as hard as this is and as challenging as it is,
I so deeply believe in that world, so deeply.

(32:09):
There's a friend of mine backstage who went to my high school; our high school teacher found a kidney donor on Facebook, because she could publish and she could reach people in a way she never could before. We
just announced that two billion dollars have been raised by
people on Facebook for their birthdays
and their personal fundraisers. Does that mean everything on Facebook

(32:31):
is good? Of course not. But you can't shut this
down without shutting down a lot of good, and I don't think that's an acceptable answer. And so we're going
to fight to get the bad off and let the
good keep happening. And I think there is a lot
of good out there. When we come back a look
at the alarming psychological effects of social media on our kids,

(32:52):
whether it's time to take a second look at Lean In in light of the Me Too movement, and I'll ask Sheryl about her legacy. Let's talk about kids and social media. This isn't so good.

(33:12):
The addictive nature of social media is just one concern.
But as you know, I know you have two kids
twelve and fourteen. Now, depression is up dramatically among young people,
and the suicide rate of adolescent girls is up one
hundred and seventy percent after two decades of decline. And as
you know, the leading explanation is the arrival of smartphones

(33:35):
and social media. So, as a parent and someone who
has been a powerful voice for women, how do you
respond to that terrifying statistic and the bigger question, what
can be done about it? We take this really seriously.
I take it seriously as a Facebook executive. I take
it seriously as a mom. So it turns out that
all uses of phones, all uses of social media, are

(33:58):
not equal. There are some that are actually quite good
for well being, and there are some that are not
as good. So when you are actively consuming, when you
are sharing, when you are messaging, when you are posting, liking,
you're interacting with people, that's fairly positive. When you are
more passively consuming, that is more negative. And so we
made a very big change to the Facebook algorithms in January.

(34:21):
And what about Instagram as well? Yeah, and Instagram we're
working on as well. But we dramatically dialed up the friends and family sharing and dramatically dialed down the more passive consumption. On self-harm, our policies are very strict. We do not allow any glorification of self-harm; we don't allow any encouragement.
We do allow people to post about their experiences, and

(34:44):
that has been very important. We've worked really hard to
develop automated tools, so if you post something that looks
like you might be about to self-harm, we will automatically flag it and surface phone numbers and helplines. We've had a
tremendous response from this, and if we think there's imminent danger,
we refer it to local law enforcement, and many people

(35:04):
have actually been saved by this. The other thing... Well, that's sort of not addressing the problem of addiction, of, you know, comparison being the thief of joy. Let me finish some of the other things we're doing, because these are all really important, and I'm conscious that this clock is beeping at us. So they're gonna give me a little extra time? Okay, then I can slow down. So one of the other things that

(35:30):
happens is, you know, social media can be considered by some to be a place where you're supposed to have the perfect life, the perfect body, a real issue for teenage girls, which you and I have talked about. We're really trying to go against that. We ran a campaign that's very popular on Instagram with real men and women with real body types talking about that. We've

(35:50):
worked with the national suicide awareness lines on this. We're working with the WHO on mental health. We're also... I think the answer is almost always technology. So
one of the things I think is great. We have
a comment warning now that we've been rolling out, where
our automatic filters detect that you might be posting something
that's not nice, we will do a pop-up and
say do you really want to post that? And again

(36:13):
we're seeing a tremendous response. We also have abilities to
restrict people to prevent bullying, so that you know, if
someone were bullying you, you can restrict them. They won't
know you're restricting them, and if they comment on your post,
no one else can see it. And so these issues are
real and we have to work hard on building the
technology, and that technology is part of the answer. There are so many

(36:34):
huge challenges. And how difficult is it, Sheryl, truly, to address any of these when solving them in some ways works against your business model? You know, one critic said Facebook has priced itself out of morality, and I'm just
curious if implementing some of these changes is bad for business.

(36:55):
So on this, I'm really pretty proud of our track
record. If you look back a number of years and
you listen to our earnings calls. So earnings calls are
exactly what people are worried about. They're directed at investors.
It's our quarterly report. If you actually watch our earnings calls, we are spending as much time talking about
the measures we take on safety and security as we
are about our business growth. Easily. We actually said many

(37:18):
quarters ago, this is so important to us that we
are going to make massive investments and change the profitability
of our company by making real resource investments. And we
have to the tune of billions and billions of dollars,
and we will keep doing it. We've taken action after
action after action that is better for protecting the community
than it is for our growth, and we're going to

(37:38):
continue to do that. Mark has said it over and
over again. I have said it over and over again.
Let me ask you about Mark testifying before the House
Financial Services Committee in a hearing focused on Facebook's plans to launch a new digital currency called Libra. Given the massive breach of trust the public has experienced with Facebook
selling personal information through third parties, is it realistic to

(38:00):
expect the world to embrace a cryptocurrency initiative like Libra,
given that protecting personal financial data really is next level
in terms of the need for security. And I understand
you were supposed to testify, but you had kind of
a testy exchange with Maxine Waters when you were up
on Capitol Hill or somewhere. Can you tell us what happened.

(38:22):
We have a lot of respect for Maxine Waters and the work she's done, and we've worked really closely with her committee. It was her choice to have Mark testify, and that's obviously something we respect. But what happened between you two? Just answer the question, if you don't mind. On Libra,
what we have said is that we are working on
a digital currency. I think it's really important to think

(38:42):
about how many people in the world are not financially
included in the banking system. By the way, not a shock.
Most of those are women. Women pay huge remittance fees.
If you go to work as a domestic worker in
another home in another country, you're sending back money and
you're paying larger fees if you're a woman. And there
are people who are unbanked. They work in the fields

(39:03):
and their money can be stolen by anyone, and women
are the most vulnerable. So I think there are really
good reasons for a digital currency to exist, and I
think they will be good for a lot of people.
That said, we've been very clear that we're not launching
this until we have regulatory approval. It's not a Facebook project.
The currency itself is set up as an international nonprofit that

(39:25):
we are part of. I know that we wanted to
have a moment to talk about Lean In and some
of the research that you have found about the discomfort
men feel mentoring and spending time alone with women. This
is something that greatly concerns you. And what can we
do about the increasing unwillingness of men to mentor their

(39:46):
female colleagues and tell us a little more about that research. Well,
it's really important because, look, the Me Too movement, which you and I have had a chance to talk about, is so
important because women have faced too much harassment for too
long and I think we're in a better place, but
we're certainly not protecting everyone we should. That said, we
have to worry about the unintended consequences. So what our

(40:06):
research shows, this is Lean In and SurveyMonkey, is that sixty percent of male managers in the United States right now are nervous about having a one-on-one interaction with a woman, including a meeting. We do a show of hands in the audience: who's promoted someone you've never met with? Just in case you can't

(40:28):
see there are no hands. If you cannot get a meeting,
you cannot get a promotion. A senior man in the world today is nine times more likely to hesitate to travel with a junior woman, and six times more likely to hesitate to have dinner with a junior woman, than with a junior man. So who's getting the travel? The men. Who's getting the dinners? The men. And who's gonna get promoted? The men. Which is what was happening before, and

(40:51):
we talk a lot about that. It's absolutely the case that you promote the people you know better. Now, I think everyone should be able to do all of these things with everyone. You should be able to have a meeting, keep the door open if you want. Travel does not mean a hotel room; travel means a public airport. Dinner does not mean your flat; dinner means a restaurant. We
have to be able to do all of this. But

(41:11):
what we really want men to understand is that if
you're not going to have dinner with women, don't have
dinner with men, group lunches for everyone, make access equal,
because if we don't make access equal, we're never going
to move these numbers at the top, and women today
have seven percent, seven percent, of the CEO jobs. Before we go, I want to ask you, because we talked

(41:32):
about Lean In prior to Me Too, and given the
systemic failures of so many organizations that we've seen that
have tolerated sexual misconduct and harassment and silenced women through NDAs, do you think, in retrospect, given the very
real revelations that have surfaced as a result of the
Me Too movement, lean in might have put too much

(41:53):
of the onus on women to change instead of getting
a lot of these screwed-up companies to change? Well, one of the problems with the phrase lean in is you can really oversimplify it without actually reading the book itself. But if you actually read what we've written and the work my foundation has done, what we've always said is that we want it to be okay for women to be ambitious, and we want companies to

(42:15):
change and fix things, and it has to be both. It's actually pretty interesting: if you say the sentence he's ambitious, it's pretty neutral or positive. He's going to get the job done. She's ambitious, that's a negative. And that is still true today. Look at the use of the word bossy. You know, go to the playground anywhere, I promise, LA or anywhere, this weekend, and you

(42:37):
will see a little girl get called bossy. And you walk up to her parents, her parents probably did it, and you say, big smile on your face, that little girl's not bossy; that little girl has executive leadership skills. No one says that. No
one says that because we don't expect leadership from girls,
and so we have to fix that problem. And that

(42:59):
means companies have to change, culture has to change, and
women have to feel free. Now, I have one more question, but I know it's time to wrap. Was that my final question? Getting back to all the controversies, I mean Facebook,

(43:20):
My last question, no, I'm curious, because I just wanted to end this conversation, Sheryl. Given all the controversy Facebook is facing, clearly in the crosshairs, I mean, the company people love to hate: since you
are so associated with Facebook, how worried are you about

(43:41):
your personal legacy as a result of your association with
this company. I think I have a really big responsibility
here for a company I love and believe in. I really believe what I said about people having voice.
I really know that when I was growing up, I
had no ability to reach anyone, and most people in

(44:01):
the world didn't, and social media has changed that. There
are a lot of problems to fix, and we did
a great job, in front of this audience, talking about a lot of them in this interview. They're real and I have
a real responsibility to do it. But I feel more
committed and energized than ever because I want to fight
to preserve the good. Because I met a woman not

(44:21):
so long ago who for her birthday raised four thousand
dollars for a domestic violence shelter that she volunteers at,
and crying, she told me I saved two women from
domestic abuse. I never could have done that before Facebook,
and so there are really big issues to fix, but
I am so committed to giving people voice and giving

(44:42):
people a way to react that I just want to keep doing the work. I feel honored to do it and committed to fixing the problems. I want to fix them. All right. Well, they're definitely gonna kill me if I don't stop now. Sheryl Sandberg, thank you. Thank you.
After we were done, Sheryl and I exchanged emails.

(45:02):
She told me this was the toughest interview she had
ever done, but complimented me on being so well prepared.
She was incredibly gracious about the whole thing. Meanwhile, about
a week after our conversation, Twitter CEO Jack Dorsey announced
Twitter was banning all paid political ads globally. Facebook, though,
is still sticking with its policy, at least for now.

(45:26):
Thanks so much for listening everyone. If a weekly podcast
isn't enough of me, you can follow me on social media: Facebook, Instagram, and Twitter. And if you feel like you're drowning in a sea of news and information, sign up for my morning newsletter, Wake Up Call, at Katie Couric dot com because, as they say, the best

(45:49):
part of waking up is Katie in your inbox. Sorry, Folgers, that was pretty bad, wasn't it, everyone? Thanks again for listening, and I can't wait to be in your ear again next week. Next Question with Katie Couric is a production

(46:12):
of iHeartRadio and Katie Couric Media. The executive producers are Katie Couric, Lauren Bright Pacheco, Julie Douglas, and Tyler Klang. Our show producers are Bethan Macalooso and Courtney Litz. The supervising producer is Dylan Fagan. Associate producers are Emily Pinto and Derek Clements. Editing is by Dylan Fagan, Derek Clements, and Lowell Brolante. Our researcher is Barbara Keene. For more

(46:35):
information on today's episode, go to Katie Couric dot com and follow us on Twitter and Instagram at Katie Couric. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.