Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
I've just received a call from Secretary Clinton. She congratulated us.
It's about us on our victory, and I congratulated her.
What's the first thing that crossed your mind when you
(00:21):
realized Donald Trump was going to win the presidency? I
think I was just shocked, like the rest of the
country and the rest of the world. I was at
home with my girlfriend in San Francisco, and we were
just watching the returns come in. It took me a
while to process it all too, honestly. But Bobby Goodlatte,
who's the founder of a political startup called Open Vote,
(00:44):
he immediately started thinking about the role the tech industry
may have played before even the last swing states were called.
Before Trump gave his victory speech, Bobby was writing up
a post on Facebook that would end up getting a
lot of attention. So I was watching the returns come in, and very soon it dawned on me what was about to happen. I think I posted around nine o'clock
(01:05):
on election night. A lot of people were far from that realization at that point. Obviously, uh, the elections did not go the
way I had hoped. I think a lot of us
on the losing side here are kind of going through
the kind of classic stages of grief. You know, maybe
one of those first kind of instincts is anger. Bobby
(01:26):
was angry at a very specific force in the election.
Highly partisan news outlets. Their fuel, he said, was social media, and especially Facebook. He said, quote, sadly, news feed
optimizes for engagement. As we've learned in this election, bullshit
is highly engaging. He said it should be a wake-up call. Bobby's post was notable because he used to
(01:49):
work at Facebook. When he said that Facebook was what
fueled some of the news sites that propelled Trump to win,
he was talking about the company that he poured his
heart and soul into for four years. The debate that
followed on Bobby's Facebook wall, publicly viewable, featured a lot
of influential people in Silicon Valley. There were a bunch of people who seemed to sympathize with his view, and
(02:13):
there were also a couple of executives at Facebook who shot back. Yeah. And what caught my eye was this tense exchange between Bobby and Andrew Bosworth, this longtime
executive at Facebook. He also helped create the news feed. Bosworth wrote, News Feed isn't perfect, but it is at least as, or more, diverse than the alternatives which dominated consumption
(02:34):
in the late nineties and early aughts. And this exchange
that unfolded was a little window into the conversation that
erupted all over Silicon Valley. It's the thing that we've
been all talking about in the aftermath of the election.
I'm Aki Ito. And I'm Sarah Frier. And now,
(02:57):
after the biggest electoral upset in recent history, we're gonna
be doing some soul searching with the tech industry. And
the question we're asking today is this, do the tech
companies that guide our news diet, so this is Facebook, Twitter,
and Google, do they need to take that responsibility a
lot more seriously. Specifically, people are talking about two things.
(03:20):
One is the fake news stories that may have helped
sway people's opinions about Trump and Clinton, and how social
media helps spread that misinformation to a lot of people.
The second point has to do with the way that
Facebook shows some things over others in your news feed
and how that filtering may be blinding you to news stories
from the opposite side of the political spectrum. Yeah, these
(03:41):
are long term problems that have just come into focus
in the last couple of weeks, and we're certainly not
going to solve them today, but we're going to try
to explain what's going on. And there's a lot at
stake here. As we were furiously getting this episode together,
even President Obama chimed in. In an age where, ah, there's
(04:02):
so much active misinformation and it's packaged very well, and
it looks the same when you see it on a
Facebook page or you turn on your television, where some
overzealousness on the part of a US official is
(04:25):
equated with constant and severe repression elsewhere. If everything ah
seems to be the same and no distinctions are made,
then we won't know what to protect, we won't know
what to fight for. So Sarah, let's set the scene here.
(04:52):
This is the first election where the majority of people
in the US are getting their news from social media,
with two-thirds of Facebook users saying they get their news there, according to a Pew study of US adults.
And these were campaigns, especially Donald Trump's, that were run primarily through social media, not just through the traditional channels
of the press and television ads. And some of these
(05:15):
stories had a clear agenda, sometimes based on just complete
outright lies, and they turned out to be super popular. Yeah,
BuzzFeed actually ran this analysis of election stories on Facebook
a couple of days ago and found that the top
performing one was that the Pope endorsed Donald Trump, which
of course never happened. That got almost a million shares, reactions,
(05:38):
and comments, and all of the top fake news stories
outperformed the top stories from reputable news organizations. It's
incredible the top fake news stories did better than the
top stories from like the New York Times and traditional
news outlets. That's, that's really something. It's meant that Brooke Binkowski and
(05:59):
her staff have been working around the clock. Brooke's been
at the front lines of combating these fake stories during
this election cycle. She's the managing editor of the fact
checking site snopes dot com. So how have you seen
the proliferation of fake news grow over the last few years.
(06:21):
If it has grown, it has definitely grown. It's become
sort of this beast unto itself. Although I don't want
to give the false impression that it did not exist
before two years ago. I mean, rumors and rumormongering and
fake news has been around for a long time. Satire
has been around for a long time. But I think
the difference now is that it's easily monetized and people
(06:43):
are finally catching onto the fact that you can get
outrage clicks and fear clicks and make a lot of
advertising money in the process. I mean, it adds up
so much if you have enough websites and you have
enough pages and enough aggrieved and fearful people. Here are
some fake stories she has had to debunk: that a
van full of illegals went from polling place to polling
(07:06):
place to vote for Hillary. What about this one? Chelsea
Clinton's apartment has a secret hospital where Hillary got treatment
after her fainting spell on nine eleven. Or there's one
that says Donald Trump personally sent out a plane to help
some Marines get back home after Operation Desert Storm in
the nineties. Okay, so this is probably a good time
(07:30):
to explain how Facebook's news feed actually works. How Facebook
shows you some stories but not others. Well, Facebook is
making sure that they're showing you content you're likely to
be interested in. That's really all they care about. They're
using signals based on what you like. Even the articles
that you read that you don't actually click like on
(07:51):
or share, maybe you just hover on them a little
longer than other posts. Facebook's algorithm notices and adjusts to your preferences. That has been gamed by fake news purveyors because they realized that you can really harness the power of fear and outrage in particular. And also, by the way, that's not limited to fear and outrage. Puppies and women's
(08:11):
boobs also make a lot of money, sometimes kittens. So
there's all those kinds of things, and they've managed to
sort of get it to an art form, you know,
the keywords that make people angry and have them share stuff.
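To make that mechanism a bit more concrete, here is a minimal, hypothetical sketch of engagement-based ranking, the general idea Brooke is describing. This is a toy example, not Facebook's actual News Feed code; the signal names and weights are invented for illustration.

```python
# Toy illustration of engagement-based ranking (not Facebook's actual algorithm).
# Each post gets a score from signals like likes, shares, and hover time, and the
# feed shows the highest-scoring posts first, regardless of their accuracy.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    hover_seconds: float  # how long readers linger on the post without clicking

def engagement_score(post: Post) -> float:
    # Hypothetical weights: heavily shared outrage bait rises to the top.
    return 1.0 * post.likes + 3.0 * post.shares + 0.5 * post.hover_seconds

feed = [
    Post("Local zoning board meets Tuesday", likes=12, shares=1, hover_seconds=2.0),
    Post("You won't BELIEVE what they're hiding", likes=40, shares=95, hover_seconds=9.0),
    Post("Puppies reunited with their owner", likes=80, shares=30, hover_seconds=6.0),
]

# Sort the feed by predicted engagement, highest first.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.title}")
```

Ranked this way, the sensational post wins even if it's false, which is exactly the incentive being described.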
So it's with this backdrop that the former Facebook product designer Bobby Goodlatte posted what he posted on the night of the election. And in addition to attracting rebuttals
(08:35):
from some of Facebook's executives, Bobby's post also caught the attention of an outspoken guy in Silicon Valley. At the time, he was halfway around the world. Election night, I was in Portugal for the Web Summit conference with
a bunch of other people from technology in San Francisco
watching, you know, watching the election. Were you guys
(08:56):
in a bar, like, what? Yeah, we were in a series of bars. We kept trying to find a place to watch it that was still open and, uh, you know, still had, like, a TV with audio. So
it was it was an adventure on its own. That's
Justin Kan, a partner at Y Combinator and the founder
of Twitch, which has millions of people streaming video games
(09:18):
as they play them. I saw Bobby's post, and I
saw also the post of a bunch of Facebook execs
vehemently denying it right there, saying it's better. They were
very much rationalizing what they had created, right They were saying,
it's better that everyone can just consume, you know, one
of thousands of news sources. I don't agree, right, I
don't agree with that. And that night I think it
(09:40):
was very eye opening to see it like we are
um creating something that we don't really know what the
full consequences are, but we are basically bifurcating our
realities into you know, you're living in the factual reality
that you want to now. I think that night was
really when I started thinking about this. And the next day,
Justin was at the conference that he was in Lisbon
(10:02):
for, and he was with this other investor called Dave McClure. He's the founder of 500 Startups, which invests in and
helps young startups grow. Justin hadn't slept much and tensions
were high. We had all been watching the election up until five a.m. the morning before, and Dave and I, we were on a panel together, supposed to be about ego,
(10:24):
and Dave got up and he just started screaming about the fake news. Pissed right now? What is wrong with you? What is wrong with you if you're not pissed right now? Technology has a role in that. We communicate, we
provide communication platforms for the rest of the country, and
we are allowing it to happen, just like the cable
(10:44):
news networks, just like talk radio. It's a propaganda medium.
And if people aren't aware that they're being told, if they're being told a story of fear, that they're being told a story of other, if they're not, like, understanding that people are trying to use them to get into office, then yes, people like Trump are gonna take office. And it's on us, it's our responsibility as entrepreneurs, as citizens of the world to make sure that does not happen. And
(11:07):
at first I was like really shocked, he threw his water bottle at the audience. But then I realized, like, you know, I started talking about it with friends of mine in the days following, and we started doing
some research. You can't quantify the effect it's had on
votes necessarily, but it can't not have made some difference, right?
And the fact is it's a nonpartisan issue. I think
people making decisions based on fake news is not a
(11:29):
good idea, right. I think everyone can agree
that that's actually going to be bad for American society
and discourse in general. Justin, Dave and Bobby weren't the
only people thinking about this. After the election, a lot
of stories came out essentially blaming Facebook for helping these
fake news stories go viral, and critics claim that some
(11:49):
of these stories may have even tipped the American electorate
in favor of Donald Trump. The pressure was mounting, and conveniently,
Facebook CEO Mark Zuckerberg was scheduled to appear at a conference hosted by Techonomy. I tuned in. You know, personally,
I think, uh, the idea that, you know, fake
(12:11):
news on Facebook, of which, you know, it's a very small amount of the content, uh, influenced the election in any way, I think is a
pretty crazy idea, right, and it's um. You know, voters
make decisions based on their lived experience, right, I mean
they, you know, kind of one part of this that I
(12:32):
think is important is we really believe in people, right,
and that they can be like you don't generally go
wrong when you trust that people understand what they care
about and what's important to them and uh, and you
build systems that reflect that. Okay, So the founder of
(12:55):
Facebook is saying that Facebook didn't have an impact on
the election, right, which is kind of different from what they tell their advertisers.
But Dave McClure, the investor who threw his water bottle
at the conference in Lisbon, he said it wasn't right
for Zuckerberg to deflect responsibility for this. I mean, whether or not they're a media company, they certainly have a responsibility
(13:16):
in terms of allowing inaccurate news reports to have substantial impact.
There are certainly news stories on there that are clearly fabricated, that were being seen by hundreds of thousands, if not millions of people, and to suggest that that's not going to have an impact on an election is just very disingenuous. The remark, to suggest that Facebook isn't influential in terms of any kind of content, is preposterous. I mean, when
(13:40):
when Facebook is promoting numbers in the billions of people
that are using its products and daily usage numbers that
are substantial portions of the population, to suggest that they
do not have influence is just patently unbelievable. Later on
his Facebook wall, Zuckerberg noted that 99 percent of stories in
(14:00):
the news feed aren't fake, but one former Facebook employee had some qualms with that claim. Judd Antin, who now works for Airbnb, did some quick napkin math to find that one percent of stories being fake is actually kind of big. It could mean twelve million people see a fake news story every day. And he notes that if one percent of
(14:21):
the articles in the New York Times or for that matter,
Bloomberg were wrong, that would be absolutely unacceptable. And behind
all of this is this, Facebook isn't a news organization, right.
It doesn't write stories the way you and I do, Sarah.
And Facebook doesn't want to be the arbiter of what
is and isn't news, what is and isn't true on
the Internet. Yeah, they want to leave it to their
(14:43):
users to report fake and misleading content. If they were
to get too close to deciding, that could open the company up to criticism. They'd have to define the truth and
they're afraid of appearing biased. Yeah, And if you compare
that to Google and Twitter, Google's Sundar Pichai has come
out and said fake news could be a problem, and
Twitter has come out and suspended a lot of high
(15:05):
profile white power, nationalist, extremist accounts. But Facebook? Yeah, like
earlier this year, there were some reports that their trending
topics editors were biased against conservatives, and the company made
a big deal of inviting conservatives to their Menlo Park
headquarters for a meeting with Mark Zuckerberg himself, and then
(15:26):
Facebook ended up firing their staff, deciding to rely on an algorithm for the trending tool. And based on what I've been hearing internally, that whole ordeal sparked a lot of debate about what Facebook's role should be in news,
and after the election, that debate has intensified a lot. Okay,
so that's part one of Facebook's role in the election
(15:48):
according to its critics. We're going to go into another
criticism in a minute, but before we do that, I
tweeted the other day to get our listeners take on
the role of social media in this election, and what
we got was a voice memo from Mustafa, a listener who
lived in Egypt during the Arab Spring protests, proving that
this issue is pretty global. Hello, I am Mustafa. I
(16:15):
am an Egyptian working in tech in Berlin. We had
a very similar situation in Egypt. We had our revolution
as part of the Arab Spring, and then what happened
is the counter revolution started spreading fake news about key
revolution figures and events. The regime in Egypt even created
(16:38):
what we now call electronic militias, which are groups of
people the regime employs to spread fake news, and therefore trending algorithms and the news feed on Facebook can pick
these up and amplify them even further. Okay, we suffered
(17:01):
from this a lot, and we still suffer from it. Personally,
I lost hope that anything could change. I realized that
people only hear what they want to hear. I decided
to leave the country more than a year ago, and
earlier this year, I deleted my Facebook account because I
(17:21):
just I can't take it anymore. And so it's the
case with what's going on in the US. I'm literally
seeing it as a flashback to what we've been having
for the past five years in Egypt. Yeah, that's incredibly
sobering and really sad um. I think that takes us
(17:44):
to the second criticism of Facebook. Yeah, here's justin again.
The second thing, which I think is going to be
more controversial, is whether Facebook should take steps to make
us see a more centrist point of view. I think
we are moving towards a bifurcated society where people see
different sets of facts, right, whether it's left and right,
(18:04):
whether you live in the cities or you live in
a rural area. And the problem with that is that
if we can't mutually agree upon some common ground, we
don't have much of a country, right. And so I
think it's very important as a society takes steps to
get out of our echo chambers and understand the other side,
the other team. And I think Facebook has a role
because it's very hard for individuals to do that. You know,
(18:26):
it's hard to go and seek out things that don't
confirm your own biases. It's hard for people on the left,
it's hard for people on the right. This has become
another popular observation over the last two weeks. But Mark
Zuckerberg, he disagrees with this too. You know, regardless of, um, what leaning you have on Facebook politically, or what your background is, the, you know, all the research would
(18:50):
show that, you know, almost everyone has some friends who
are on the other side. That means that the media
diversity and diversity of information that you're getting through a
social system like Facebook UM is going to be inherently
more diverse than what you would have gotten from, you know,
watching one of the three news stations UM and sticking
(19:10):
with that and having that be your newspaper, your TV
station UM twenty years ago and even more now. The
research also shows something which is a little bit less inspiring.
We studied not only UM people's exposure in news feed
to content from different points of view, but then what
people click on and engage with, And by far the
(19:31):
biggest filter in the system is not that the content
isn't there, that you don't have friends who support the
other candidate or that are of another religion, but that
you just don't click on it and you actually tune
it out when you see it. So if Facebook started
feeding you centrist content like Justin suggested, Zuckerberg thinks maybe
you wouldn't click on it anyway. Yeah, he kind of
(19:53):
sounds like a parent trying to get his kids to
eat more vegetables, when us kids, we actually want chocolate.
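To put Zuckerberg's exposure-versus-clicks distinction in concrete terms, here is a small, hypothetical calculation. The numbers are entirely made up; the point is only to show the two different measures he's separating: what a feed shows you versus what you actually click.

```python
# Toy numbers illustrating the "biggest filter" Zuckerberg describes: the reader.
# Each entry is (leans_opposite_side, clicked). All values are invented.
feed = [
    (True, False), (True, False), (False, True), (False, True),
    (True, False), (False, True), (False, False), (True, True),
]

shown_opposite = sum(1 for opposite, _ in feed if opposite)
clicked_total = sum(1 for _, clicked in feed if clicked)
clicked_opposite = sum(1 for opposite, clicked in feed if opposite and clicked)

print(f"Opposing-view stories shown:   {shown_opposite} of {len(feed)}")
print(f"Opposing-view stories clicked: {clicked_opposite} of {clicked_total} clicks")
# Exposure looks diverse (half the feed) while clicks stay lopsided (a quarter).
```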
So should we talk about solutions? Yeah, here's, here's Brooke Binkowski's take. Remember, she's the editor from Snopes dot com.
It's important to note here that she actually doesn't think
Facebook or Twitter or any of the other Internet platforms
(20:15):
were to blame for the election's outcome. She said, that's kind of like, you know, a crazy person
holding a megaphone and blaming the megaphone instead of the
crazy person. But she said she has some ideas for
how to make things better. So let's just say I'm
on Facebook, I have tons of money. I would assemble
a really good editorial team to start fact checking. Um.
(20:36):
I mean, it's not easy. You need resources and time, you need man-hours, or woman-hours I suppose. I would be sure to have journalists in there vetting the
information and misinformation, and I would have more oversight over
what they were curating as well. I mean, I appreciate
the idea, and I know that computers can help and
algorithms can help, and there are ways to ease it.
(20:59):
But like, you can't just rely on algorithms. I don't
think Facebook is gonna like that proposal. Well, I mean, sorry, Facebook,
but you know, and I and and I say this
as a defender of Facebook. If they're going to try
to be anything other than just a social network,
if they're really going to be I mean, I know
(21:21):
they already are, but if they're going to be more
serious about becoming a news source, then they're going to
have to vet and fact check and build a team
to do so. Well, Justin's suggestion, as well as Dave's, was to come up with ways to ban or at
least limit the spread of fake news sites from Facebook altogether.
And Justin felt this new sense of responsibility to make
(21:41):
sure that he himself wasn't contributing to these problems either.
What will you do differently starting today? Well, my first
step was I need to identify these issues and talk
about them and maybe write about them. Second step, I'm
not sure. I think we need to, on the jobs side,
(22:02):
we need to think about how we're going to create
jobs and transition people for things that are going to
be like eliminated in the short to medium term. Um,
I think a lot more thought needs to go into
that in terms of I guess the information bubble, I
think we're gonna have to think about whether I really
(22:23):
want to invest in or fund a company that is
just giving people what they want and effectively displacing, you know, their access to more centrist information. And
I think the answer today is no, whereas the answer,
you know, before election day might have been, well, you know,
I think it will grow very fast, so maybe it's
worth investing in anyways. So I guess that's one actionable step.
(22:49):
So Sarah, after a lot of this kind of commentary from Justin and basically everyone else, this was the
topic of conversation over the last few days, Mark Zuckerberg
came out with an update on his Facebook wall. He
seems to have had a change of heart about how
much he wants to reveal about his plans. Here he
(23:10):
actually goes really, um, really in depth into the number
of things he's thinking about for solving the fake news problem.
He does admit that they have a problem with fake news. Um,
sort of a one eighty from what he said earlier. Um.
The points he he notes are the same kind of
suggestions that I've been hearing in the tech community, you know,
(23:32):
fixing Facebook's tech so that they can better detect fake stories,
using third parties like Snopes to help them, warning people
on stories before they read or share them that they
may be fake. Um. And a lot of these things
do have pitfalls, Like you know, he's still relying on
the community to determine what is fake. And you know,
(23:54):
based on what I've heard from the people I've talked to, people tend to mark things as fake even when they just disagree with them. Um, so this is not going
to be an easy problem to solve still, um, but
at least he's getting a little closer. One thing I
love about this post though, is he says he wants
to disrupt fake news economics, and that's him admitting
(24:19):
that Facebook is responsible for fake news economics. Um, that
these posts exist because they are rewarded by his current algorithm. That's right. These are the teenagers in Macedonia that BuzzFeed has reported on, who are making tons and
tons of money out of these fake outrage stories that
(24:39):
go viral on Facebook. Or a couple of guys in
California that the Washington Post profiled this weekend that are
just like writing as many conspiracy posts as they can
for the money. Um, this is really an area where Facebook can make moves. And, you know, Zuckerberg, I
want to read a little bit of what he says here. Um.
(25:01):
He says, Normally we wouldn't share specifics about our work
in progress, but given the importance of these issues and
the amount of interest in the topic, I do want
to outline some of the projects we already have underway.
So these are things that will be happening. Yeah. Yeah,
It sounds like these discussions were already taking place inside
Facebook even before he came out with this post, but
(25:22):
given the tenor of the conversation over the last two weeks,
he had no choice but to actually come out and
and say, yes, we hear you, we are working on it.
When I spoke to Bobby Goodlatte the other day, the author of the critical Facebook post, it was clear
that he had softened his take. He said that after
(25:43):
doing some soul searching of his own, he gained a
lot of sympathy for the good engineers of Facebook, to
whom he owes gratitude for being able to see pictures
of his baby niece on the regular. Facebook's problems, he said,
are the same problems we just have in our culture
at large. You can't change the underlying reality here. All these
systems are built on top of the kind of real,
(26:06):
the real connections we have in the real world. Yeah.
I think a lot of people are looking for fingers to point right now, and I think, you know, the answer to a lot of this, it's more complicated than that. And that's it for this week's Decrypted. And
(26:27):
what are your thoughts on Silicon Valley's impact on the election?
You can tweet at me at Aki Ito Seven and I'm at Sarah Frier, or you can send a voice memo to our producer Pia Gadkari at p g a d k a r i at Bloomberg dot net. We
might play your thoughts on a future program the way
we played Mustafa's voice memo today. We're still
(26:49):
a new program, and we'd love your help in spreading
the word. Please subscribe to our show on iTunes or
wherever you get your podcasts, and leave us a rating
and review. This episode was produced by Pia Gadkari, Magnus Hendrickson, and Liz Smith. Alec McCabe is head of Bloomberg Podcasts. We'll see you next week.