Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in
(00:36):
the news. One of the recurring stories is Facebook, and
particularly criticisms of Facebook that have arisen from a large
body of material leaked by a whistleblower to the press
and to Congress. This series of leaks has, of course,
major implications for the company, but it also has implications
for the Oversight Board associated with Facebook, the independent institution
(01:01):
that has the job of reviewing the company's most controversial
content decisions and correcting them if it believes they've gotten
them wrong. Regular listeners of the podcast know that I'm
interested in the Oversight Board, but I would take it
a bit further. I am personally interested in the Oversight Board,
having been one of the people who helped come up
(01:22):
with the idea for it in the first place, and
who worked with Facebook to bring it into reality. For
full disclosure, I also want listeners to know, if you
don't already, that I continue to advise Facebook on free expression related issues. To discuss the business of the Oversight Board, however, I am not the right person to do the talking, because I am not on the
(01:43):
Oversight Board, and the Oversight Board is its own independent institution. For that purpose, we're very fortunate to welcome to today's show Jamal Greene. Jamal is the Dwight
Professor of Law at Columbia Law School. He's an expert
on constitutional decision making and the author of an important
book called How Rights Went Wrong. He's one of the
(02:05):
co-chairs of the Oversight Board, in which position he
has a crucial insider's view of the work of the board,
its purposes, its functions, its limitations, and the challenges it faces. Jamal,
thank you so much for being here. Let's start with
(02:27):
the Oversight Board, of which you're one of the chairs,
and tell me how you decided to take that job
when it was offered to you. Oh gosh, when the team
at Facebook that was making the initial choices reached out
and kind of explained the idea, it sounded like an
opportunity both to try to make the world slightly better
(02:48):
and also something that aligned with my own personal interests
and my professional interests. One of the things I
do in my day job is think a lot about
how to balance and optimize different kinds of rights and
how to think about how rights interact with different kinds
of institutions. So it had a professional interest for me,
and given how important what Facebook does in terms of
(03:10):
content moderation is, it seemed like an opportunity to contribute to
making the world a little bit better. So that's why
I decided to do it. Any regrets? No, no regrets. Challenges, of course, but I knew there would be challenges, so I wouldn't say there are any regrets. Good, I'm
happy to hear that, since my own involvement with the
Oversight Board went back to the point where there weren't
(03:31):
any chairs. In fact, there was only the idea for
the thing. I want to talk about your work on
rights and its relationship to some of the themes that the Facebook Oversight Board faces. But I want to first
start with transparency, which is, on the one hand, one
of the core rationales for the Oversight Board in the
first place. It exists in some sense to push Facebook
(03:54):
to be more transparent in its decision making and to make
its own decisions in a way that is transparent in
the sense of revealing reasoning and logic. Transparency, though, has
also been a great challenge for Facebook, to put it somewhat overly politely. I mean, the company has been
very badly buffeted by a whole series of leaks which
(04:17):
have gone to major newspapers, to Congress, to probably the
Federal Trade Commission, all of which seem primarily structured, not solely,
but primarily structured, around this idea that the company hasn't
been transparent enough in aspects of its decision making. So
I'm wondering what you think the Oversight Board can do
(04:39):
to push Facebook towards greater transparency that hasn't already been
done by this leaking process. Well, I think there's a
couple of different ways in which one can talk about transparency,
and Facebook has that issue along several dimensions. The first
thing to say is, you know, we care about transparency
(04:59):
to the degree that the particular activity touches on lots
and lots of people in a way that one thinks
that a company or institution needs to be held
accountable in some way. Right, So if Facebook had no
reach at all, it was just a sort of private
company making its own decisions, we wouldn't care that much
about transparency, just as we don't care that much about
(05:21):
transparency for, you know, the person who makes our kids'
toys or something. But it's because Facebook's reach is so
broad, and now in a global sense so broad, and touches on basic social policy, when it comes to vaccines, or when it comes to political elections, and all sorts of things that Facebook policies can have an impact on. That's
(05:42):
where the demand for transparency comes in. It's in a
sense proportional to the reach of the company. And so
when we say something like Facebook has not been transparent enough,
I think the right way to understand that is to
say that its transparency is not on par with its reach.
And I think that's a legitimate problem for the company.
If we think about leaks, right, a lot of the leaking
(06:03):
has been related to internal research that Facebook has conducted and the degree to which it has responded to that internal research.
And that's a certain kind of transparency problem that I
think the board is not centrally focused on, although the board is certainly interested in what the company's doing. The
ways in which what the board does can be most
(06:24):
directly relevant to Facebook's transparency is that Facebook makes lots and
lots of decisions without fully explaining why it's making those decisions. Right, So,
when it comes to content, people's content gets taken down,
other content gets left up. There's not much of an
opinion written about it. Facebook barely gives reasons for why
it does what it does. It seems like it's acting
(06:45):
inconsistently in a variety of ways. And so what
the board can do is open up that decision making
process to show what kinds of trade offs are being made,
to make recommendations and in some cases binding decisions on how those tradeoffs should be made, and to do it in the open: we write opinions, we publish those opinions,
We tell people exactly what our sources of information are,
(07:07):
what exactly we're weighing, what the kind of governing, I'll put this in quotes, "law" is. So what are the resources we're relying upon to make those decisions, in a way the company just hasn't. And I think that that goes directly to how fairly the company treats its users. Again, there's a broader question of the transparency of any institution that's wielding this much power. What the board does
(07:29):
is one element of that. So I both want to
explore the distinction that you're offering, Jamal, between two different
kinds of transparency and then also maybe ask you whether
they might have more in common than you're suggesting. So
I hear you saying that what the oversight board does best,
and I agree with this, is focusing on, let's call it, reason-giving transparency. Right, this is the idea that
(07:51):
when the company makes a decision about some piece of content,
it needs to follow principles and it needs to reveal
transparently what its reasoning process was, and often they don't.
And since the oversight board itself exercises that kind of
transparency in making its own decisions, because you guys tell everybody: here are the principles that we invoked, here
are the facts we relied on, here are the moral
(08:12):
ideas that were relevant to our analysis, and here's the
kind of conclusion that we reached after weighing these factors.
You can urge Facebook to do something similar. So I
get that side of the distinction. And then I hear
you suggesting that there's also a kind of transparency of
you know, how much does a company disclose about its
internal research about the consequences of its actions? And so
(08:33):
first let's just see if that is the distinction you're
pushing here. So, yes, that is the distinction I'm pushing.
I'll be curious what the follow up question is, because
I do think that there is a relation between them,
but I'll let you ask it. Yeah, to be collaborative.
I mean, I think that those are in some sense different.
But if you think about, you know, Facebook having access
(08:55):
to internal research that suggested maybe some of its policies
were having a negative effect on users, I think the
public call is not merely that that fact should have
been known, but that people want to know that Facebook's
decision making process where it decided to continue these services,
or that it decided to tweak them in certain ways
(09:15):
rather than eliminate them all together, was a reasoning process
that they can hear. So I mean, in that sense,
the decision making is always you take certain facts, you
have certain values, and you try to bring those together
and make a decision, and then if you're being transparent
about your decision making, you tell people why you did that.
And so I actually, in the end, think that that's
(09:37):
actually more similar to the kind of transparency and decision
making that you're talking about. I don't think it's just
that people were saying, well, they should have told us
about this research. I think they're saying they should have
taken this research into account in making their decisions. And
when Facebook says, which is sort of all they've said
so far, oh, don't worry, we did, people's response to
that is to say, well, how do we know you
did that? Right. And had they been more transparent about
(10:00):
their decision making process, I think they could have said, well,
this is how we took that into account, or this
is how we didn't take it into account. I do
think it's the case, certainly that part of what the
board does when it sees a decision that's made by
Facebook or increasingly the board has weighed in and will
weigh in on actual policies that Facebook is implementing and considering.
(10:21):
One of the things we care about is why did
the mistake happen? And when we say a mistake, we
use a few different resources in figuring out what counts as a mistake. Right, So we think about whether Facebook has applied its community standards accurately. We think about whether Facebook
is acting consistently with how it understands its values, and
(10:45):
we think about international human rights law and the
norms associated with the international human rights system to which
Facebook has committed itself. It may well be that there
are decisions that Facebook makes that are inconsistent with those
values and norms. And one of the things that the
Board tries to do and has been trying to do
(11:05):
in its decisions is not just make an up-or-down decision on a piece of content, but to say why was this mistake made. And in order to do that, it
may sometimes be the case, right that we need to
know more about what inputs there were into particular kinds
of content decisions. And I think the Board can help, in asking Facebook questions about that, in exposing what answers it gives to us and when it refuses to give answers to us,
(11:27):
and in pushing Facebook to be more clear about what influences its decision making. Right. So, one of
the recommendations the Board has made that Facebook has taken
up is to be more clear about when governments make
requests of Facebook to remove content. Right. So that's relevant
to transparency as well. Right, So, there are things the
Board can do that get at that deeper transparency. What
(11:49):
I meant to say is that there are some very
deep issues about who governs us, on which, at the moment,
the Board's jurisdiction and scope are limited. There's no question
about that. There's very important progress that the Board can
and does make in trying to get Facebook to
treat its users better or more consistently, to start trying
(12:10):
to get at what's influencing its decisions. But there is
always going to be a deeper question about private companies engaged in far-reaching activities, and those, I think, are questions for sort of all of society: lots of different kinds of institutions, the Board, but also whistleblowers and journalists and researchers and civil society organizations that are equally if
(12:34):
not better situated to get at them than the Board itself.
Let me ask you about one point where there was
some overlap between what the Board said in its recent
transparency reports and what the whistleblower's materials disclosed, and that was the program sometimes called cross-check, through which Facebook
initially was trying to address just a relatively smaller number
(12:57):
of distinctive users with respect to the newsworthiness of what
was being posted on the platform, but that extended to
cover really a very large number of users, many more
than Facebook had acknowledged, and it seems many more than
Facebook told the Oversight Board when the Oversight Board asked
point blank about this in the course of the Trump
(13:20):
deplatforming decision. What can you say about that, and what did the Oversight Board say about that recently? So I
can say a bit about the sort of background here,
but I'll note that Facebook has given to the board
a policy advisory request on how to structure its cross-check program. And I wouldn't want to say too much
in advance of deliberating about that and getting more information
(13:42):
about it about exactly what problems there might be with
cross-check or what the right way to resolve those problems might be. But cross-check is this system that Facebook has
in place, or had in place, in which it exposes
certain users to additional layers of review, ostensibly on the
theory that they don't want mistakes to be made with
(14:03):
respect to certain users. And the board asked in the
Trump decision about this program, and Facebook said that it was only used for a small number of users, and it turns out it's a few million.
Facebook said to the board later that that was a
small number in relation to the number of people on Facebook,
which is of course true. But it's true in a
kind of lawyerly way. So what the board said in
(14:25):
its recent report. I can say, since we're both law professors, that you're using the word lawyerly in the negative sense of that term. That's right. So,
if you're dealing with someone in an adversarial posture, as
lawyers often are, right, sometimes if they ask you a
question you answer it in the most narrow possible way.
You might still be truthful, but in some ways it's
misleading if you're being very narrow about it. But if
(14:46):
you're you know, talking to a friend of yours and
they ask you about some piece of information, it would
be strange to be excessively narrow about that. And what I would say is that Facebook, in not being, I think, fully forthcoming, treated this in too adversarial a way. Right. So when we ask them for information,
we think that they should give us the full context
(15:08):
and try to be as helpful as they can in
providing the board with information. And we told them as much, right. So,
going forward, Facebook has promised that it's going to be
more contextual in the way in which it responds to
information requests, and that's going to be, I think, very helpful for the board as it tries to do its job better, because you don't always know what you don't know, right,
(15:29):
and so understanding better exactly how the cross-check program works can be helpful in deciding whether it's being applied fairly in a particular case. Jamal, I want to turn
to the core of the Oversight Board's job, which is
decision making, and here I'll be really curious to hear
from you about how your distinctive approach to decision making
(15:52):
plays out. You published an amazing book this year called
How Rights Went Wrong, and that book, in turn drew
on some of your earlier scholarship, which I and others
read in the course of trying to think about how
the Oversight Board should make its decisions in the first place.
So some of your approaches, I think, were maybe already baked in before you got there. But I wonder if
you would start by just saying something about your distinctive
(16:14):
view of how courts or bodies that are sort of
like courts, like the Oversight Board, should decide cases where
there are reasonable arguments on both sides. Sure, and I'm
happy to talk about this with a couple of caveats right,
one being that it's not just my approach, right, I
think I have a particular angle on it, but it
(16:38):
is a pushback against the way in which US courts
and often US thinkers about rights tend to think about rights,
and less of a pushback against some global standards. And the second thing I'll say is we are a collaborative board, right,
And so if I were writing all of the opinions
just by myself, they might look a little bit different
than opinions that twenty members have to more or
(16:59):
less agree on them. The general point is that when
we're talking about rights conflicts, rights conflicts are very often,
not always, but very often conflicts in which people have
reasonable disagreement about how to apply a set of more
or less shared values at a high level of generality,
but they disagree on how to apply them in the
(17:21):
particular case. When we're in that situation, it's not that
helpful to pick out just a few rights that we
think are important and essentialize them so that they are
applied kind of absolutely whenever they're invoked, because that will tend
to silence one or the other side of these rights conflicts.
(17:42):
And I think in the US we tend to do
this with the First Amendment. So the moment someone invokes speech,
there's a battle over saying who has the speech right
or whether there's a speech right or not. And because
we know the stakes of that battle are extremely high,
you just sort of win if you get to
say that you have a speech right. And what I
try to urge in the book is in freedom of
(18:06):
speech cases, as in many others, that there is tremendous institutional variation in the ways in which speech might be affected. Right,
So a purge of all people who are opposed to
the government in which you put them in prison is
extremely different from, let's say, a university deciding how to
regulate the speech of its students, or a platform deciding
(18:28):
who is going to be able to amplify their content
and spread it around the world. These are all forms of regulation of speech, but they're in very, very different contexts, and
that in some of those contexts we have to think
more carefully about the various other values that we think
are important than we do in others of those contexts. Right, So,
we care about national security, but we don't care about
(18:49):
it so much that we allow purges of our political enemies.
But we care about hate speech, and we care about amplification of misinformation, and those are values that we might think are sufficiently important to put some kinds of restrictions on who has access to certain kinds of platforms. And so even though they're all speech cases, we have to think carefully about what's
(19:10):
on the other side of the balance, depending on the
institutional context. Can I ask you a philosophical question around that,
Jamal, that I find myself struggling with very much right now,
and I don't think there's a simple answer to it.
It has to do with the boundary between misinformation on
matters that you and I would probably agree have a
fact associated with them, and misinformation on matters that have
(19:35):
a fact associated with them but have become so politicized that they become a stand-in for somebody's political beliefs and values. Maybe climate change would be a good example.
I think you and I both think that there is
a science of it, and the scientists are doing their
best to get at it. They might be right, they
might be wrong, but they achieve consensus and they follow their process.
(19:57):
But when people argue about climate change and whether it's
man-made, a lot of people are using that argument,
which is nominally an argument about facts, as a kind
of stand in for their political points of view. And
once that happens, the differences in what people are saying
could be put into the box of misinformation if we're
(20:19):
confident that we know what the science says, and in
that case, maybe it's not so important to preserve different
points of view, or they could be put into the
category of political argument about political identity and about what
should be done in the future, and that's really important
and would probably deserve a lot more protection. So I
deliberately didn't choose COVID because it's too close to home
(20:40):
and too controversial, but climate change is still pretty darn important.
So I guess I'm wondering what you think about that.
And again, it's not that I think there's a particularly
right answer to it. I just think it's a hard problem. Yeah, no doubt.
I think it's definitely a hard problem. I don't think
it has a sort of abstract answer. What I would
say is a couple of things. So one is the
(21:02):
bare fact that something is false perhaps should not engage
or enrage or excite us as much as whether something
that is false is leading people to do something that
is more materially harmful. So, to go back to that, COVID misinformation might be in a slightly different category than something like climate change, where the latter might
(21:25):
be influencing policy in some way, but in a somewhat
indirect way. Whereas if someone really does think that if
they inject bleach into their veins, or if they take
some off-label drug that might hurt them, that's in
a different kind of category in terms of the immediacy
of harm. And that's important, right, because it's
very hard, as you say, for these philosophical reasons to
(21:48):
sort of adjudicate these things in the abstract, but when
we are able to connect them to more concrete harms,
that affects how we feel about regulating them, even if
we can't resolve the philosophical issue that you just raised. Right, So,
there are various forms of misinformation and ways in which
we mislead each other. There's a long spectrum from pure
truth to pure lie, and we're often somewhere in the
(22:10):
middle of that in our political discourse. So I tend
to think that really the only productive way that a
regulator of some kind can respond to that is to
try to focus on direct and concrete harms. We'll be
right back. Jamal, on your Oversight Board, one
(22:38):
of your co-chairs is Michael McConnell, a retired federal judge and also a law professor. Do people like him, or people who come out of one particular legal system, find it relatively simple and seamless to shift to a more overtly balancing approach, in your view? Or is there a kind of sense of cultural clash or cultural difference behind the scenes?
There are cultural differences. The Board is a very diverse
(23:01):
institution along many dimensions, including the legal traditions that people
are associated with, or whether they're associated with legal traditions at all. I think that's a strength of the Board, in that it doesn't become sort of overly lawyerized. The deliberative model that I'm personally most familiar with is, you know, the faculty meeting or maybe the law school workshop,
(23:24):
which is a particular kind of culture sort of bouncing
ideas off each other, challenging people fairly directly. And I
do think we all take that into the deliberation room, which turns out to be a Zoom room, when we're going to talk about cases. And again,
I think that's a strength. Everyone who's joined the board has joined it knowing that
(23:45):
this is a collaborative enterprise. You're there for a reason, and what you're bringing is valuable to the room. But we're also trying to reach
a decision and trying to reach a certain degree of consensus.
And I've certainly seen cases where people lodge strong objections
and then they say, Okay, we had a discussion, my
position lost, and now I'm on board. I think that
(24:07):
that's been very healthy, very active, and I actually want
us to be able to try to model that for
people who aren't on the board right that when you
disagree about things, you hash it out, you have respectful disagreement, and you reach a decision. You move
on to the next fight. What's been, Jamal, the most surprising thing that you've experienced while working on the Oversight Board? Gosh,
(24:31):
that's a hard question. What's the most surprising, because there's
been a few surprising things, I think. Well, give me several. I mean, actually, one of the reasons, again full disclosure, that I'm asking you
is I have a kind of nose pressed against the
glass feeling sometimes about the Oversight Board, you know, like
having dreamed the thing up, pushed for it, and then
(24:52):
decided that I was so close to the company through
the process of building it that I shouldn't serve on it.
I sort of hoped that people whom I hugely
respect and trust, like you, would go off and do it.
But I don't have a feeling for the minute to
minute of what it's like from the inside, and it sort of kills me. So I'm actually really curious to get at it. What have been various things that weren't what
(25:13):
you would have expected? So I'll name two things.
So one is that the work of the board is
not just the work of the board members. Right, So
we have a staff. The staff is excellent. Thomas Hughes
is the director. I hadn't thought very carefully about the
staff because I sort of had this idealized vision of
you get a case and then you sit in your
office and you think carefully about it, and
(25:35):
then you just come to a view. Right. But
the day-to-day operation of the board, the amount
of research that has to go into particular cases, the
complexity of writing these opinions and making sure we get
them right, a million other things having to do with
how do we work with Facebook to try to implement decisions,
how do you actually set it up technologically in terms
of security and privacy and the legal aspects of it,
(25:56):
and just the size and quality of the staff, I
think is one thing that I had not anticipated or
hadn't thought carefully about before I took the job, but
it's completely essential to what we do. The other is
a point about Facebook, which is just the complexity of
the company which I think I hadn't fully grasped. It's
not just that sometimes there's a right hand and a
(26:17):
left hand and they're doing different things, but it's twenty
five different hands, right, and they're all doing different things.
And there's a lot of internal diversity at Facebook in
terms of whether people think the company is doing the
right thing or the wrong thing, about its power structure, its platform,
and I think there's a perception of the company that
it's just sort of Mark Zuckerberg is sitting on a
(26:37):
throne just making decisions for everyone. It's a
complicated place. Speaking of it being a complicated place, Facebook
as we know it just changed its overall name to
Meta or Meta Platforms, and that means this company is
going to do many, many more things in the broadly
speaking, virtual reality space. Is the Oversight Board's charter written
(27:03):
to give the Oversight Board supervisory power or authority over those kinds of undertakings, beyond the product called Facebook? So
I'd have to go and take another look at the charter.
But my belief as I sit here is that the
charter is very much connected to the platform as it
exists today, and that if at some point in the future,
(27:24):
the Board and Facebook were going to decide that the Board was going to extend into other Facebook products, that would require some change to the governing documents
of the Board. As I sit here today, I think
all of us are waiting to see what exactly the
company means by its entry into the virtual space. But
(27:46):
at the moment, there's not much for the Board to
say about that. What do you think would be a
good measure? You know, we're now a year into the
life of the Oversight Board, what would be a good
measure in another year or two in your mind as
to whether the work you are doing is having the
kind of impact you hope it will have. I think
the Board has already had some significant impact on Facebook. I
(28:11):
think the culture of the company now knows that it
has to justify a number of the kinds of decisions
it makes relating to users to the board. But in
terms of how one would measure it from the outside,
I think engagement with the board, right. Part of the
board's challenge has been that the board is an independent entity.
It's structured to be an independent entity. Facebook doesn't control
(28:32):
the board, right, but Facebook created the board. And I think, as a matter of public perception, just
as people have children and then the children become their
own independent people, I think the board as a new
institution is still working towards, and has to work towards,
making clearer the degree to which it's truly an independent entity.
(28:54):
And that means engagement with lots of people and institutions that
are not necessarily connected to, or in league with, or
sympathetic to Facebook. For example, the board is talking to former Facebook employees, including Frances Haugen, because we want
to learn more about how to do our jobs better.
The Wall Street Journal reporting, I certainly, and I think
(29:15):
we've done this officially too, celebrate that, which is
to say, we celebrate learning more about the company through
other institutions as well. Right, So, the board is collaborative
with other modes of accountability. It's complementary to other modes
of accountability, including government. Right. I mean, the Board doesn't have a position on whether or
(29:36):
to what degree the government should regulate social media platforms.
That's to say that we are part of a larger
ecosystem in which we're all trying to make very powerful,
very important companies as responsible as we can make them.
And so again, the measure of that is who's engaging
with us, who trusts us, who relies on us, who
(29:58):
reads our opinions, who contributes to our policy recommendations, and
who gives public comments. Do people trust this as an
institution that can make a difference, that's listening to them,
that's open to engaging with them in a transparent way.
And I think we're on the road to doing that,
but there's more work to be done. Jamal, what should
(30:18):
I be asking you about the operation or future or
reality of the Oversight Board that I haven't asked you? Well, I think an important question for a new institution is: what's the biggest challenge going forward? I think implementation is one
of the biggest, just in the sense that we make
binding decisions on individual cases. We've made seventeen of them.
We've had five hundred thousand appeals. There's a billion pieces
(30:40):
of content on Facebook every day, and we've made seventeen decisions. Right,
so in some sense, right, how can you make an impact with just seventeen decisions? Well, of course, you pick
the cases that you're going to make decisions about, based on whether they can have a more far-reaching impact. But then once you
(31:01):
make a decision on that individual piece of content, the
question is how can that decision spread across the company?
And that often will require engineering changes, changes in the
way the community standards are implemented by both machines and
by humans. It might require changes in specific
(31:22):
policies that the company has. We have to have
a certain degree of independence from the company in making
the decisions we make, but then in how to actually
implement them on the platform, there's no way around having to collaborate with Facebook in some sense, and so
that requires, you know, a tricky balance of not just
figuring out what the right answer is in terms of
(31:43):
implementing decisions, but also you know, figuring out what Facebook
really can and really can't do. Since we're not engineers, you know, how much do you take "this is really expensive and really complicated" as an answer? That is hard, right. And there's no way around that being hard other
than you build expertise over time, you build relationships over time,
(32:04):
You bring expertise on board, onto our staff,
onto our board so that we're not just relying on
Facebook's technical know-how. And that's going to be something
that is an ongoing issue. And if you think about it,
just to underscore your point about it taking time, you know,
the Supreme Court of the United States has ordered the
federal government, it's ordered states, it's ordered institutions within states
(32:27):
to do very complicated things in its history, and sometimes
they pull it off. Sometimes they don't. Sometimes they're dragging
their feet because they have bad intentions. But sometimes it
really is just instrumentally very difficult to operationalize the commands
of the court. And so that creates a kind of
dialogue over time, where both sides gain expertise and develop some
(32:51):
degree of trust, even alongside the possibility of occasionally being
adversarial to each other. And I would say that's normal
for any entity that's engaged in oversight and the body
that it's overseeing, in the same way that in a sense,
the US Supreme Court exercises a kind of constitutional oversight over the rest of our government. So that's to say
that you're off to the right start, and as you say,
(33:11):
it's going to take time and effort to get it right. Yeah,
I think, if there's a final point I would emphasize about the board and measuring its impact, it's that it
will take time. We live in a culture, certainly in
which the news cycle lasts a day or two
days or a week at most if it's a really
important story. But building an institution is a longer project. You know,
(33:34):
we talk about moving the ship of state, but, you know, the ship of Facebook. Moving that ship is going
to take time. That doesn't mean that you get an
infinite leash on moving that ship right, and so you
work as quickly as you can. There are a lot
of issues with Facebook, and figuring out what to prioritize, what's going to take six months versus what's going to
(33:56):
take five years, is part of the challenge, and as
you say, we're off to the right start and there's
a lot of work to be done. Jamal, I want
to thank you for this very wonderful and frank conversation.
I also want to thank you for your academic work,
which has taught me a lot. And I want to
thank you specifically for taking on the challenge of being
one of the chairs of the Oversight Board. It would
(34:17):
not be the same Oversight Board without you. And the
reason that I think it has a chance to make
meaningful contributions is precisely because people like you have agreed
to take on its work. So thank you. Thank you, Noah. We'll be right back. There are twenty members of the
(34:42):
Oversight Board, but the reason I particularly wanted to speak
to Jamal is that his academic expertise is precisely in
how hard decisions should be made by bodies like the
Oversight Board. What's more, the chairs of the Oversight Board,
of whom Jamal is one, exercise disproportionate power relative to
the other members when it comes to setting the agenda
(35:04):
for the Board and deciding what kinds of important decisions
it ought to make. Speaking to Professor Jamal Greene about
the Oversight Board was a kind of split screen experience
for me. On the one hand, I had the pleasure
of hearing a true expert on decision making talk about
how he makes decisions and how the institution that he
(35:26):
is helping to shape thinks about those decisions. Similarly, on that same screen, I was hearing the perspective of one
of the chairs of the Oversight Board talk about what
its job is and what it needs to be, about
what it's done well so far, and the challenges that
it faces in the future. Over on the other screen
(35:47):
was my sense of wonderment, shock, and to be honest,
a slight feeling of the surreal to realize that an
institution that I helped imagine and create is actually up
and running, and that what it does has absolutely nothing
to do with me or anything I say about it.
In that respect, I am thrilled by what the Oversight
(36:07):
Board is doing, but also nervous on its behalf, sort
of in the way you would be nervous for anything
that you participated in creating that goes off on its own.
You want the institution to do well and be independent,
but of course you also wish it would do exactly
what you want it to do in every context and in
every element. My ultimate takeaway from the conversation with Jamal
(36:31):
is that the oversight board is going to go its
own way. It is going to continue to assert authority
over Facebook's decisions. It is going to continue to press
Facebook to try to be more transparent, but it's also
going to have to grapple with the limitations of its
own design. As an oversight board that can give guidance
and advice to Facebook on a case by case basis,
(36:54):
and can tell Facebook what to do specifically when Facebook asks, but is not designed to, nor has the
capacity to fundamentally transform the way the company does business.
For that kind of transformation, Facebook, like other companies, will
have to act on its own and on the basis
of its own conception of the public interest and the
(37:16):
interests of itself and its shareholders. Until the next time
I speak to you here on Deep Background, breathe deep, think
deep thoughts, and try to have a little fun. If
you're a regular listener, you know I love communicating with
you here on Deep Background. I also really want that
communication to run both ways. I want to know what
(37:38):
you think are the most important stories of the moment,
and what kinds of guests you think it would be useful to hear from more. So I'm opening a
new channel of communication. To access it, just go to
my website, Noah dash Feldman dot com. You can sign up for my newsletter and you can tell me exactly what's
on your mind, something that would be really valuable to
(38:00):
me and I hope to you too. Deep Background is brought to you by Pushkin Industries. Our producer is Mo LaBorde, our engineer is Ben Tolliday, and our showrunner is Sophie Crane McKibben. Editorial support from Noahm Osband. Theme music by Luis Guerra at Pushkin. Thanks to Mia Lobel, Julia Barton, Lydia
(38:22):
Jean Kott, Heather Fain, Carly Migliori, Maggie Taylor, Eric Sandler,
and Jacob Weisberg. You can find me on Twitter at
Noah R. Feldman. I also write a column for Bloomberg Opinion,
which you can find at Bloomberg dot com slash Feldman. To discover Bloomberg's original slate of podcasts, go to Bloomberg
dot com slash podcasts, and if you liked what you
(38:44):
heard today, please write a review or tell a friend.
This is Deep Background.