
November 22, 2018 44 mins

A piece in the New York Times examined how Facebook navigated a tricky situation through questionable strategies in the wake of multiple scandals. What did the journalists find out and how has Facebook responded?




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Get in touch with technology with tech Stuff from how
stuff works dot com. Hey there, and welcome to Tech Stuff.
I'm your host, Jonathan Strickland. I'm an executive producer and
I love all things tech and this is one of
those topics where things are pretty complicated and are unfolding

(00:26):
as I sit down to record this episode, but I
thought it was important enough to actually address it. In
November two thousand eighteen, The New York Times ran a
story written by five reporters, and it was six thousand
words long. The focus of the story was Facebook
and how executives at the social media company have tried
to respond after a series of scandals and accusations that

(00:50):
muddied Facebook's public image. And it was a pretty eye
opening report and it's ended up causing a lot of
people to yell at each other, or at least it
has given a lot of ammunition to the yelling arguments
that are happening in Washington, D.C., and Silicon Valley.
So the scandals in question are some of the

(01:13):
things we've talked about in previous episodes of Tech Stuff
this year, like the signs that Russian agents and hackers
were creating numerous accounts in order to steal information and
spread propaganda and misinformation, and to generally undermine the democratic
process in the United States. It also included the Cambridge
Analytica scandal, which is a separate thing. I covered that

(01:35):
in an earlier episode. In that mess, the public discovered
that an enormous amount of personal data had been mined
by this political analytics company, Cambridge Analytica, often without the
users' consent. Then there's the ongoing problem of hate groups
and hate speech proliferating on the social network. So this
New York Times piece was really an investigative look into

(01:57):
how Facebook top brass have responded to these problems, and
spoiler alert, it ain't great. The piece goes through the
process of Facebook becoming aware of Russian hacker activity on
the platform. Alex Stamos, who was then the head of
security over at Facebook, initiated an internal team to look

(02:18):
into suspicious activity. In December, Mark Zuckerberg sort of dismissed
the idea that fake news on Facebook was having some
sort of effect on the political process or had in
fact played any role in the election. Stamos was worried
that this wasn't exactly true and so he ended up

(02:41):
meeting with Zuckerberg and Sheryl Sandberg, who is the chief
operating officer over at Facebook. Now, according to the New
York Times article, Sandberg was furious with Stamos and said
that his investigation had opened up the possibility that Facebook
could be held accountable or liable for this stuff. But
ultimately the company chose to expand this investigation in an

(03:05):
internal project that was called Project P. The P stood
for propaganda. I should also point out that Sandberg responded
to the New York Times piece after it ran in
a blog post, and she denied the suggestion that she
wanted to avoid or slow down any internal investigations into
Russian interference. She specifically wrote, quote, Mark and I have

(03:29):
said many times we were too slow, But to suggest
that we weren't interested in knowing the truth, or we
wanted to hide what we knew, or that we tried
to prevent investigations is simply untrue. End quote. So that's
going on. But in April, Facebook would publish a paper

(03:50):
about the subject. However, in this paper, the word Russia
was mysteriously absent. It was about interference, but
the company did not name Russian hackers as the perpetrators.
Joel Kaplan, who is Facebook's vice president of global policy
and a former deputy chief of staff for Republican

(04:12):
US President George W. Bush, had argued behind the scenes
against Facebook taking a more firm and definitive stance. He
said it would open the company up to accusations that
it was anti-Republican and biased towards Democrats, and so
he said, in order to avoid that, let's not lay
out that it's Russian hackers that were

(04:33):
attempting to sway elections specifically in favor of Donald Trump.
So how did things get so bad so fast for
the company? Well, the piece maintains that Mark Zuckerberg, you know,
founder and CEO, and Sheryl Sandberg had focused on personal
projects rather than critical operations at Facebook, and they had

(04:56):
handed off those important responsibilities to subordinates. The New York
Times journalists cite numerous current and former executives who indicated
that there was a lot of delegating going on and
not enough oversight. So there were a lot of people
who were given a great deal of leeway to do

(05:16):
their jobs, and as a result, people probably stepped a
little further out than what Zuckerberg or Sandberg would have preferred. Now,
when the various scandals all rose to a certain level
and public opinion was really beginning to shift against Facebook,
someone over at Facebook, and it's not clear who yet

(05:37):
as of the recording of this podcast, made the decision
to go on the offensive and hire an opposition research
firm called Definers Public Affairs. Opposition research is a really
nice way to describe the technique of researching an opponent,
typically a political opponent, in an effort to dig up

(05:57):
dirt or compromising information so that that information can be
used against that opponent. This information can be used to
discredit or weaken the person. And this technique is not new.
It's actually ancient. It was used in ancient Rome in their
republic more than two thousand years ago, so this has

(06:17):
been around for quite some time. The term opposition research
is a bit more modern, but the underlying principles are ancient.
And just to be clear here, this is a tactic
that's been used by politicians from all political parties. This
is not something where someone should say, oh, only Republicans
do that. No, no; all political parties, at some point

(06:39):
or another engage in opposition research at some level, and the
question is when does it go from being a legitimate
political strategy to an unethical one. And it's a pretty
gray line. It's ugly as well. Politics tends to
be pretty ugly, and in this case, Facebook was

(07:00):
starting to employ a tactic that had been used in
politics and now was going to be used in business.
So the reason Facebook hired Definers in the first place
was to help monitor news stories about Facebook so that
the executives would be aware of the general public opinion

(07:21):
about the company. In October, Facebook would expand this to
direct Definers to specifically focus on the story about Russian
hackers on Facebook and how it relates to the manipulation
of the American public in the lead-up to the
twenty sixteen election. They said, forget all the other stories,

(07:41):
really focus on these and see where that narrative is going.
The more the Facebook security team investigated the Russian hacker issue,
the bigger and more impactful it was turning out to be.
So it was sort of an attempt by Facebook to
stay ahead of what the public narrative was about
this whole thing and to get a better grip on

(08:03):
exactly what had happened before someone else found out and
then put Facebook on the defensive. So Facebook was really
concerned that this increased focus on the company would possibly
lead to government intervention in the form of regulations. Now,
generally speaking, big companies are not huge fans of regulations.

(08:25):
By definition, regulations limit what a company can do, and
since from a very high level perspective, the purpose of
a publicly traded company is ultimately to make money for shareholders,
limitations are generally viewed as a bad thing. They tend
to also require that a company invest money in various

(08:46):
processes and procedures, which means there's less money to go
toward profit. So again, the more costs you have, the
less attractive you tend to be towards shareholders. So all
of this requires a bit of mental gymnastics to separate
out what would typically be considered ethical, as in, what
is the right thing to do and what is considered

(09:09):
good business practices. Those two questions often arrive at very
different answers. A lot of businesses try
to go an amoral route, not an immoral one.
They're not trying to do something that is antithetical
to morals, but rather to remove morals from the question

(09:34):
entirely, as much as you can. That isn't every business,
and certainly I don't think there are very many businesses
that do it to the fullest extent. But you see
a lot of companies try to ignore certain ethical questions
if those ethical questions are inconvenient in the pursuit of profit. Now,

(09:56):
to be fair to Facebook, the situation is incredibly
complicated. I don't wish to say that there was
a very simple choice to be made and Facebook went
the wrong way. That is far too simplistic for what
was going on. Facebook executives have understandably, I think, argued
that Facebook is a platform, not a publisher. There is

(10:19):
a difference. As a platform, the company is not responsible
for the type of stuff people will post to that platform.
The idea is that the company is agnostic and disinterested. They
provide the venue, they do not provide the script, in
other words. So that gives Facebook a bit of protection:
if someone were to post something really awful on that platform,

(10:43):
the company can enjoy a bit of protection. It's related
to a concept called safe harbor: the idea that
if you provide a place for people to put stuff,
you actually are not liable if someone puts something illegal there.
You were providing a service in the sense of
a place for people to go and do things. The

(11:05):
other person who put the illegal thing, they're the
ones who broke the rules. They should be held liable,
not you as the service provider. But then the problem
is Facebook doesn't take a completely hands off approach when
it comes to people posting stuff on the platform. For
one thing, the company has designed algorithms so that users

(11:26):
see some content, but they might not see other content
from their friends. And I'm sure if you've used Facebook
you've had this experience. Maybe you missed out on a
post that a lot of other people are talking about,
and it's not that you were excluded, it's just that
Facebook's algorithm didn't share that post with you, so you
didn't see it. Or maybe you posted something and you

(11:49):
were surprised that more of your friends didn't respond to it.
And again, it may very well be that Facebook just
didn't display your post in people's feeds. So Facebook algorithms
in part determine what you see. Generally speaking, posts that
get more interaction or engagement tend to be seen by
more people. Facebook's algorithm tends to favor those. So if

(12:11):
a post gets a lot of likes, if it gets
a lot of shares, if it gets a lot of comments,
it tends to raise the visibility of that post, and
it tends to show up in more people's news feeds.
Well, news flash: posts that get a lot of engagement tend
to be very emotionally charged and controversial because they tend

(12:32):
to invite people to either chime in and say, yeah,
you're totally right for that controversial perspective you've posted, or
you are way off base and you are a jerk
face for putting such a controversial post up on your
news feed. All of that engagement just drives the visibility
of that post and makes it even more visible, which

(12:52):
invites more people to participate, which again boosts the visibility,
and so you start to get these sorts of toxic
posts rising to the top. Then there are promoted posts.
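Before we get to promoted posts, here is a minimal, hypothetical sketch of the engagement-weighted ranking idea just described. To be clear, this is not Facebook's actual News Feed code; the Post fields and the weights below are invented purely for illustration. It only shows the general shape: more likes, comments, and shares mean a higher score, and a higher score means more visibility.

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# The field names and weights are invented for illustration; this is
# not Facebook's real scoring formula.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int
    shares: int
    comments: int
    promoted_budget: float = 0.0  # money paid to boost the post, if any

def engagement_score(post: Post) -> float:
    # Comments and shares are weighted above likes here because they
    # represent stronger engagement; the exact numbers are made up.
    organic = post.likes * 1.0 + post.comments * 3.0 + post.shares * 5.0
    # A paid boost simply adds on top of the organic engagement.
    return organic + post.promoted_budget * 10.0

def rank_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    # Highest-scoring posts are shown first; low-scoring posts may never
    # appear in a given user's feed at all.
    return sorted(posts, key=engagement_score, reverse=True)[:limit]

feed = rank_feed([
    Post("friend_a", "vacation photos", likes=12, shares=0, comments=1),
    Post("friend_b", "inflammatory hot take", likes=40, shares=25, comments=90),
])
print([p.author for p in feed])  # the controversial post ranks first
```

Under any scheme shaped like this, emotionally charged posts that rack up comments and shares float upward on their own, and the hypothetical promoted_budget field hints at the paid side of the equation, which is the next piece.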
So by paying money, people, organizations, and companies can boost
the visibility of a post. They pay Facebook, and Facebook
makes sure that that post will show up on more

(13:14):
news feeds. So that muddies the waters too, because Facebook,
as it turns out, isn't just an empty stage where
anyone can get up and say anything they want and
be heard by all the people who happen to be
in the room at that time. It's not an even
playing field, and it tends to favor inflammatory, controversial posts

(13:37):
and people who have money to spend. And again, understandably,
Facebook didn't want to take on the mantle of publisher
at that time. They didn't want to accept that as
their responsibility because that would mean the company would need
to monitor posts and potentially step in to censor problematic users
and accounts. That would require a lot of investment on

(14:01):
the part of Facebook, and this would be an ongoing expense.
They would have to keep on policing their service. And
it's a big service, so that's a huge job. And
since again the purpose of business is to make as
much money for the owners or shareholders as possible, additional
expenses are generally undesirable. I've got a lot more to

(14:24):
say about this whole subject, but first let's take our
own quick break to thank our sponsor so we can
pay our expenses. In addition to monitoring the news about Facebook,
Definers began to dedicate resources to deflecting some of the

(14:48):
blame for Russian involvement by trying to steer the conversation
to target some of Facebook's rival tech companies, namely Google
and Twitter. Definers, along with two other companies that it
shares space with, those two companies being America Rising, which
is a political action committee, and NTK Network,

(15:10):
which is a news network with a right-wing slant
on news items, were all collectively using the same office space
and some of the same staff, and so that brings some
questions in there. NTK Network would publish stories
that were pro-Facebook and anti-Facebook-competitor during this time,

(15:30):
so they were trying to steer public opinion to try
and take some of the heat off of Facebook itself
and put it on some of its competitors. Well before
the problems that would lead to Zuckerberg having to sit
in front of Congress, activists had been accusing Facebook of
allowing various oppressive governments around the world to co-opt

(15:52):
the platform in order to spread propaganda or to identify
people that those governments considered a threat in order to
silence them or eliminate them. When Facebook was responding
to mounting criticisms, including that famous session in which Mark
Zuckerberg would appear in front of Congress to provide
answers or explanations regarding Facebook user data, Russian interference, and more,

(16:16):
there was a simultaneous problem of people protesting the company
and its practices. So this is going on around the
same time. Activists were calling for oversight or regulations, which
Facebook definitely did not want to have to deal with,
and so Definers got the directive to go do some
digging on the activists in an effort to discredit them. Now,

(16:37):
one particular group that was becoming a thorn in the
side of Facebook was called Freedom from Facebook. Now, this
digging included an attempt to link those activists to a billionaire
financier named George Soros. You've probably heard that name if
you've been paying attention to the political news in the
United States. George Soros has backed a lot of different causes.

(17:01):
A lot of them are democratic causes for countries in Europe.
He's backed a lot of philanthropic causes in the United States,
but hasn't put as much money in direct democratic races
here in the US as he has in Europe. Soros
would actually create a philanthropic agency. It is called
the Open Society Foundations, and Soros funded

(17:26):
this organization with eighteen billion dollars of his own money.
According to Bloomberg, Soros is currently worth about eight billion dollars.
So, Soros was born in Budapest in nineteen thirty, and he
comes from a Jewish family. His family wasn't really religious,
they were non observant Jews. But growing up in Budapest

(17:47):
in the nineteen thirties, you know, with World War Two right
on the horizon, it was a stressful time. He
then would move out of Hungary. He moved to the
UK and then moved to the United States, and
he built his wealth. His enormous amount of wealth and
his support of various liberal causes, not just in the

(18:09):
United States but elsewhere, particularly in Europe, has evolved to
the point that some conservatives, not all, tend
to accuse Soros of trying to manipulate government policies around
the world through his wealth, and those accusations often come
from the more extreme fringes of the conservative movement, and

(18:32):
those accusations sometimes carry with them anti-Semitic messaging. This
notion of wealthy Jewish people trying to control world events
is a frequent anti-Semitic message that has been
repeated numerous times, and it is an insidious

(18:53):
and dangerous message to send out there. It fuels a
lot of hate groups. So at the same time that
the Definers were starting to link George Soros to this
activist group, Freedom from Facebook, the Freedom from Facebook group
went and did something pretty boneheaded and wrong. They

(19:16):
attended a House Judiciary committee meeting where a Facebook executive
was testifying about corporate policies, and the activist group held
up a sign and that sign showed Mark Zuckerberg and
Cheryl Sandberg as a two headed octopus with its tentacles
encircling the globe. And that's more than a little problematic.

(19:38):
Both Zuckerberg and Sandberg are also Jewish, and the depiction
of an octopus grasping the globe has been used as
an anti-Semitic messaging method before, and so this particular
sign could legitimately be viewed as being anti-Semitic, as
being bigoted against Jewish people. Now, whether or not

(19:59):
the activist group intended to express anti-Semitic views or
not is up for debate, but either way, Definers was
able to take that protest and leverage it against them.
They contacted the Anti-Defamation League and they said, this
is a terrible thing. This should not stand. The group
needs to apologize, taking some of the attention away from

(20:21):
Facebook and putting it onto Facebook's critics. And again, I'm
not defending Freedom from Facebook for doing that sign.
It was a dumb thing to do, and at best
it was dumb. At worst, it was racist. So Definers
takes that protest and leverages it against them. Simultaneously, Definers

(20:42):
is trying to link George Soros and his money to
that activist group, which is doubly weird, right? On
one hand, this Definers group is reaching out
to the Anti-Defamation League and saying, look at this
anti-Semitic messaging that this activist group is sending out;
that is unconscionable. At the same time, this same group

(21:05):
is trying to link George Soros, who has been frequently
accused of engaging in various anti-conservative causes and
who has himself been the target of anti-Semitic messaging;
there have been a lot of

(21:27):
anti-Semitic messages that specifically target George Soros. So they're playing
both sides at the same time, essentially is what I'm saying.
They're saying, how dare this activist group send out
this anti-Jewish message, while meanwhile they're fanning the flames
of anti-Jewish sentiment by linking Soros to that group.

(21:50):
By the way, Soros's philanthropic organization would later say
it had not provided any financial support to Freedom from Facebook,
so the claims were spurious to begin with;
they weren't even true. The New York Times piece also
goes into detail about Facebook executives and their relationships with
various politicians, and there are many executives at Facebook who

(22:13):
have worked on political campaigns or held government jobs in
specific administrations over the years. Several of them, Sandberg included,
are very close friends with top lawmakers and have leveraged
those relationships throughout the whole affair. Sandberg would testify in
front of the Senate Intelligence Committee, and leading up to
her appearance, the Facebook team lobbied the committee chairman, Richard

(22:36):
Burr, to stick to the topic of election interference and
not to press Sheryl Sandberg on other issues related to Facebook,
like user privacy or the Cambridge Analytica scandal, and Burr
agreed to that. He said, we should really focus just
on the election interference side, so that took some pressure
off of Facebook. Facebook also lobbied to include competitors in

(23:01):
this same hearing. They said, well, we're going to come forward,
but you should also really get someone from Google and
someone from Twitter. And since we're sending Sandberg our chief
operating officer, the people those companies should send should also
be very high ranking executives. Burr agreed to that too,
and he invited both of those companies to send in

(23:22):
comparable executives to appear before the committee. Twitter's Jack Dorsey
did so; he showed up. Google did not send anyone,
so Google's absence became a topic of scorn among the
committee members. People were saying, well, you know,
this looks really bad for Google not to be here.
And that also helped take some of the heat off

(23:42):
of Facebook and also Twitter because they were there, so
they were able to come out of it looking a
little bit better because Google didn't show up. Oh,
and Definers got involved in this part too. According to
that New York Times article, Definers gathered information about all
the senators on this committee and then sent that information

(24:05):
to various journalists. That included information about how much each
senator had spent on Facebook ads during various campaigns, as
well as which tracking tools the various senators' websites used
on visitors to those websites. The message was pretty
clear: if those senators were to really go after Facebook,

(24:28):
the journalists had information that could lead to questions for
the senators. They could ask the senators, you know,
you really chased Facebook down and you argued about privacy,
but it turns out you're using a tracker on your
website to track information about people who visit your website.
So how can you accuse them of being
bad about privacy when you are gathering data? Or you're

(24:51):
arguing about Facebook, but at the same time you've spent
a huge amount of money on Facebook to advertise your campaign.
How can you be so critical of that company? It
was all meant to kind of add pressure to the senators,
and it's essentially saying something like, sure, right now,
it's politically advantageous to go after Facebook because the public's

(25:13):
opinion is turning against this social media site. But
let's talk about all the ways you've leveraged Facebook to
get where you are, and as I said before, politics
gets real ugly. Shortly after The New York Times published
its article about Facebook's strategies to manage those crises, Zuckerberg
announced that his company had severed ties with Definers. He

(25:34):
and Sandberg both said that they were unaware that Definers
had been retained on behalf of Facebook. Sandberg said she
should have been aware of it, but she wasn't until
this article came out, and they said it was probably
someone in the communications department who had hired Definers, and
they just didn't realize it, they being Zuckerberg and Sandberg;

(25:54):
they didn't realize that had happened. According to TechCrunch,
a number of Facebook's communications staff have ties to
Matt Rhoades. Matt Rhoades is the founder of Definers, and
he had previously run the election campaign for Mitt Romney.
So Andrea Saul, who serves Facebook as the director of

(26:15):
Policy Communications, also worked for Rhoades in two thousand eleven
and two thousand twelve. Facebook spokesperson Jackie Rooney likewise had worked
on the Romney campaign as chief of staff. Another member
of the corporate communications team named Carolyn Glanville also worked
on the Romney campaign as the deputy communications director, and

(26:35):
then Joel Kaplan may have worked with Matt Rhoades while
Kaplan was deputy chief of staff under George W. Bush.
So there are plenty of people who could have initiated
bringing on Definers. Now, I want to be clear, I
don't mean to suggest that the communications department at Facebook
has a particular political bias, or if they do have

(26:55):
a particular political bias, that that bias affects how
they perform their jobs. I don't know that that's true.
I hope it's not. I'm assuming the department is largely
just doing what most corporate departments do, which is to
act in the best interests of the company, rather than
to push any particular political philosophy. The only reason I

(27:17):
point out the relationships is because I think Zuckerberg's explanation
that he did not hire Definers, but someone in the
communications department did is probably true because there were so
many people who had relationships with the founder of Definers.
One of the things I think is really interesting is

(27:38):
that when Zuckerberg appeared before Congress, the founder, who
I don't think always views his company the same way
as fellow executives do, said that perhaps regulations might be
inevitable for platforms like Facebook. Now, this was not quite
the same thing as saying regulations would be a good thing.
He said he thinks they'd be unavoidable. So you

(28:02):
can read that as saying Zuckerberg says yes, they're necessary,
or he's just saying there's no way we're going
to avoid it in the future. Zuckerberg also warned Congress, however, saying, quote,
I think a lot of times regulation puts in place
rules that a large company like ours can easily comply with,
but that small startups can't. End quote. Sandberg would say

(28:26):
essentially the same thing in closed-door meetings with various lawmakers,
and she said Facebook was already changing policies to follow
new best practices to make sure it was doing more
to police the content on Facebook, but that regulations, if
they were made formal, could end up hurting smaller platforms.
The New York Times reports that some of the officials

(28:48):
were a little skeptical of that messaging, understandably so. It
sounds like they're saying, oh, no, we can handle
this just fine. We're good; what we're worried about are
these smaller companies that don't have our resources. It doesn't
strike me as being totally genuine, because I'm not convinced
that Facebook is that concerned with smaller businesses. That's

(29:12):
based upon pretty much every action I've seen Facebook take
over the last decade. I've got more to say about
this article and its implications, but first, let's take another
quick break to thank our sponsor. On top of these

(29:34):
stories was one coming from CNBC about Kevin Systrom. Systrom
co-founded the company Instagram with Mike Krieger. In two
thousand twelve, Facebook acquired Instagram for one billion dollars, a
princely sum. Systrom stayed on for several years, heading up
Instagram within Facebook, but in September, Systrom left Facebook. His

(30:01):
departure was likely mostly tied to how Facebook has involved
itself with Instagram over the last several months, and how
it has changed the way Instagram photos show up in
Facebook feeds. There were a lot of arguments that said
that Facebook's approach was watering down the value of Instagram.
Systrom and Krieger reportedly felt that Facebook was really interfering

(30:23):
too much with their work and that the decisions being
made were ultimately hurting growth. So a month after leaving Facebook,
Systrom would say at a press conference, quote, no one
ever leaves a job because everything's awesome. End quote. But
he didn't go into a lot of detail. More to
the point of this episode, recently, Systrom said that it

(30:46):
is important for social media companies to be policed well
and that misinformation and harassment is a growing concern. He
even referenced deep fakes, which I talked about in a
very recent episode. But of course that's just one way
someone could misrepresent a person, from faked video footage
to faked audio footage or recordings, I guess I should

(31:07):
say, to photoshopped images, to smear campaigns. There are tons
of different ways for people to be pretty darn awful
to each other and to also reach a huge audience
to boot, because social media platforms have a very broad reach.
This goes beyond Facebook. Obviously, Facebook is easy to talk
about because the platform is so darned huge, but these

(31:28):
same tactics work on other social media platforms as well.
I don't mean to say that Facebook is the only
one that is vulnerable to this sort of thing. Specifically,
Systrom said at the conference, quote, you start to realize
how important it's going to be for the future of
the world that we police these things well, that we
take it very seriously and put real resources against solving

(31:52):
the problems now that you're at this scale. End quote.
Now, to be clear, Systrom wasn't necessarily calling for outside regulation,
but rather the need for policing the platforms, which could
come from within. It would not have to be a
formal set of laws. His point, though, was that it
is necessary whether it's internal or external, and that the

(32:15):
added expense of monitoring users and responding quickly in the
event of someone trying to spread lies or harass others
is absolutely critical. US Senator Mark Warner's office published a
paper describing a regulatory scheme for social media platforms after
Zuckerberg's appearances in front of Congress. This proposed policy covered

(32:37):
stuff like media literacy programs that are aimed at helping
people so they can determine if the information they are
encountering online is legitimate or if it's fake. It also
called for more funding of military and intelligence agencies so
that they can focus on misinformation campaigns from other countries
that are aimed to affect domestic politics. Essentially, the policy

(32:58):
was saying, we've gotten pretty good at detecting hacking attempts
and infiltration attempts. You know, not flawless, but we're
aware of a lot of the tricks people use in
order to infiltrate systems. What we're not good at is
combating these misinformation campaigns, and we need to put money
aside to get better about doing that. The policy also

(33:19):
calls for social media platforms to do more to ensure
that the accounts made on those platforms are in fact
legitimate and not just run by a bot. If they
are run by a bot and it's all on the
up and up, it should be labeled as such so
that users aren't misled into thinking that a bot account
represents a real, human-like person. It also calls for

(33:40):
platforms to be held legally liable for failing to take
down posts that include stuff like quote defamation, invasion of privacy,
false light, and public disclosure of private facts end quote. Also,
the companies would be held accountable if they failed to
take down fabricated video or audio if a victim had
secured a necessary judgment regarding the sharing of that content

(34:05):
and they also pointed at the European Union's General Data
Protection Regulation, or GDPR, rules. I covered
that in an episode earlier this year. That would put
in place some pretty extensive privacy protections for Internet users if they
were to try and copy that. The paper itself wasn't
a draft of any sort of legislation. It wasn't a

(34:26):
proposed law. It was more of a broader policy suggestion
and a call for a discussion about those topics that
could lead to more actionable plans. The authors of the
paper admit that the ideas they propose may have flaws,
that they may not fully understand the situation or the
implementations that they're suggesting, and that in some cases the

(34:49):
proposed solutions might even undermine the goal that the solutions
were meant to achieve. I appreciate that they're very forthcoming
about this because one of the big problems we saw
in the initial congressional hearing with Mark Zuckerberg was that
a lot of these politicians are not exactly clued in
to the way social media works. Not a big surprise.

(35:12):
There's a fairly, let's call it, statesman-like nature to
the Congress in the United States. That's a good way
of saying a lot of them are old and are
a little out of touch, at least when it comes
to the technological side of things. So this paper was

(35:32):
really meant more as a call to action to get
an official stance on how it would be best to
approach social media platforms as they play increasingly important roles
in the way people get and share information. The paper
did not call for Facebook or Google or any other
big company that plays in this social media space to
get broken up into smaller companies. That is something that

(35:55):
some activists have called for. They argue that these companies represent an
effective monopoly in various industries, and that as a result,
they have been able to dictate the conversation and sort
of bully their way into favorable positions and favorable treatment
from the government. The general consensus to the policy paper

(36:15):
that I saw was that it would likely not make
much headway, and the only real chance it would have
of getting any real momentum would be if there had
been a really big shift in Congress after the two
thousand eighteen midterm elections, and there was a pretty
big shift. The Democratic Party picked up more than enough
seats to take control of the House of Representatives, but

(36:36):
the Senate still has a Republican majority, so it's hard
to say if this is going to see any progress.
It may come down to partisan lines, where depending
upon which side proposes it, the other side might strike
it down not because of the merits of the ideas,
but because the other side suggested it. Because again, politics

(36:59):
get ugly, and sometimes politicians act like big old babies,
where if it's not their idea, it can't be a
good idea. And again, I apply this to both sides, y'all.
I have my own personal political beliefs,
but I have no illusions that the party that I
support is any better about that than the other party.

(37:21):
Facebook, meanwhile, has changed its policies in many ways to
police content more effectively. It launched a new report called
the Community Guidelines Enforcement Report, which goes into the policing
efforts that Facebook is engaged in, including how many fake
accounts it has deleted. According to the initial report, Facebook
deleted one point five billion fake accounts in just six months.

(37:44):
After the New York Times piece, Mark Zuckerberg announced that
the company will produce a report like that every quarter,
rather than, you know, annually or semi-annually. Zuckerberg also
published a four thousand, five hundred word outline or blueprint
on how the company is going to move forward with
content moderation. In that piece, Zuckerberg wrote, and I quote,

(38:05):
one of the biggest issues social networks face is that
when left unchecked, people will engage disproportionately with more sensationalist
and provocative content. This is not a new phenomenon. It
is widespread on cable news today and has been a
staple of tabloids for more than a century. At scale,

(38:25):
it can undermine the quality of public discourse and lead
to polarization. In our case, it can also degrade the
quality of our services. Our research suggests that no matter
where we draw the lines for what is allowed, as
a piece of content gets close to that line, people
will engage with it more on average, even when they

(38:47):
tell us afterwards they don't like the content. So how
is Facebook going to respond to that? How are they
going to put a cap on that? According to Zuckerberg,
they're going to train AI models to recognize when
a piece is sensationalist, when it is either fake news
or it's misrepresenting the facts, or it is inflammatory on

(39:11):
purpose, and that the AI will then automatically be able to
remove those items, which seems kind of interesting, especially since
we just finished all those pieces about how AI is
not infallible. But the alternative would be to employ human
beings to go through billions of posts every day, which
doesn't seem like it's particularly realistic either.
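To make that idea a bit more concrete, here is a tiny, hypothetical sketch of moderation by classifier score. The thresholds and the single violation_score are invented for illustration; Facebook has not published its models or cutoffs, and Zuckerberg's blueprint also talks about reducing the distribution of borderline content rather than only removing it, which is what the middle tier below represents.

```python
# Hypothetical sketch of score-based content moderation.
# The thresholds and the single violation_score are invented for
# illustration; this is not Facebook's actual system.

def moderate(post_text: str, violation_score: float) -> str:
    """violation_score is assumed to come from a trained classifier,
    where 1.0 means a clear policy violation and 0.0 means clearly fine."""
    if violation_score >= 0.9:   # over the line: remove the post
        return "remove"
    if violation_score >= 0.6:   # borderline: keep it, but reduce its distribution
        return "demote"
    return "allow"

# A sensationalist-but-not-quite-violating post gets demoted rather than
# removed, countering the tendency of near-the-line content to earn the
# most engagement.
print(moderate("SHOCKING claim you won't believe...", violation_score=0.72))
```

Of course, the hard part is the classifier itself, which is exactly where the concern about AI not being infallible comes in. Zuckerberg also said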

(39:36):
that the company would seek out an independent oversight body
to review any appeals made by people who had their
content removed from the platform. Zuckerberg doesn't expect that such
a body will be ready to go until the end
of twenty nineteen at the earliest. But the goal here
is to create an entity that can review these appeals
and do so objectively and remove the possibility that Facebook's

(39:58):
algorithms are behaving with a bias against certain groups. So
let's say it's conservative news. If the news items are
not against Facebook's policies, if they are objective, you know,
they are fact-based, they're not misrepresenting things, and they're
not inflammatory, but they're still getting removed, then conservative groups

(40:19):
could legitimately say, hey, your algorithms are targeting us based
upon our political stance, but we're not lying, we're not
misrepresenting the truth. This board would be able to review
the appeals and say, you know what, You're right, that
piece is completely legitimate. We're going to allow it on Facebook,
or they might say, I see what you're saying, but

(40:42):
this piece violates our policy because X, Y, and Z.
As of the recording of this podcast, this story is
still unfolding and a lot of people are angry, including
a lot of politicians. And it's likely that Facebook is
going to have to wade through a lot more political
scrutiny in the near future. So I'll probably have to
revisit this sometime in the future,

(41:02):
but I wanted to cover it now because it is
a very fresh story and it brings up a lot
of very interesting questions, because if Facebook's a publisher, at
what point is it able to, or at what point should
it, step in to moderate things? I mean, it's a
private company or publicly traded company, but it is a company,

(41:25):
not a government, so it can choose what can and
cannot be shown on its platform. That's been the case forever.
Facebook has been able to do that all the time.
It's just they haven't really enforced it a whole lot.
I'm very curious to see how this unfolds, because you're
gonna see different groups react in different ways. There are going
to be civil rights groups that might say there's some

(41:46):
freedom of speech problems here. There are going to be
other groups that say you're not doing enough because there are
still problematic posts being made on your platform. It's gonna
be a rough, uneven road, I think, for a while.
But I'm glad to see that some serious discussion is

(42:08):
being held about these issues. It's a shame that it
seems to be largely in response to this exposé
piece from the New York Times. You would hope that
people would take these initiatives without that kind of public pressure,
but sometimes that's what it takes. Anyway. That's what's going

(42:28):
on so far. We will revisit this sometime in the
future if there's more to say about it. And
I hope you guys have enjoyed this episode. I hope
you're all having a great holiday week for those of
you who are listening to this when it publishes, and
I look forward to talking to you again soon. If
you have any ideas for episodes, why not visit our

(42:51):
website tech Stuff podcast dot com. You can learn more
about the show there. You can email us at tech
Stuff at how stuff works dot com. I'll get those messages.
You'll also find ways to contact me via Twitter or
Facebook up on that web page. Don't forget to go
visit our store. It's over at t public dot com
slash tech Stuff. You can buy merchandise there. Every purchase

(43:12):
goes to help the show. We greatly appreciate it, and
don't forget. We also are nominated in the I Heart
Radio Podcast Awards. You can go to the I Heart
Radio Podcast Awards website. You can log in, you can
vote up to five times a day for your favorite podcast.
You can even dedicate all five votes for tech Stuff

(43:33):
if you are so inclined. And if you vote enough
and tech Stuff wins, I will have no choice but
to go up onto a very large stage in front
of a whole lot of fancy people and accept an award.
And that is terrifying. So if you want to scare me,
vote for tech Stuff and I'll talk to you again

(43:56):
really soon. For more on this and thousands of other topics,
visit how stuff works dot com.
