November 14, 2023 45 mins

Speech has probably never been freer in the world than it is today: Multiple venues – especially social media – allow people’s perspectives to take flight fluently, globally, and frequently. The culture of free speech is also under steady and ever more sophisticated assaults, perhaps because its ubiquity is threatening to any person or institution that holds an opposing viewpoint. The very thing that makes speech so free right now – ease of motion – is, perhaps, what also makes it more threatening. Jameel Jaffer is an attorney and the director of the Knight First Amendment Institute at Columbia University.

Episode Transcript

Speaker 1 (00:01):
Welcome to Crash Course, a podcast about business, political, and
social disruption and what we can learn from it. I'm
Tim O'Brien. Today's Crash Course: Free Speech Versus Censorship. I'll
take a leap and say that speech has probably never
been freer in the world than it is today. Multiple venues,
especially social media, allow people's perspectives to take flight fluently

(00:25):
globally and frequently. Pick your format: print, audio, video, and images,
for example, and you can easily put ideas in front
of huge audiences, potentially. The culture of free
speech is also under steady and ever more sophisticated assaults,
perhaps because its ubiquity is threatening to any person or

(00:48):
institution that holds an opposing viewpoint. The very thing that
makes speech so free right now, ease of motion, is
perhaps what also makes it more threatening. And I'll say
that if speech feels threatening, the solution isn't to bottle
it up, as Supreme Court Justice Louis Brandeis once advised
almost a century ago, and I quote, the remedy to

(01:12):
be applied is more speech, not enforced silence. But we
are awash in efforts to enforce or encourage silence in
our current chaotic era. Everything from education and public health
to political opinion, religion, and art has offered fodder for
attempted censorship. Joining me today to discuss free speech and

(01:34):
efforts to corral it is Jameel Jaffer, an attorney who
is also the director of the Knight First Amendment Institute
at Columbia University. The institute deploys what it describes as
strategic litigation, research and public education to defend free speech
in a digitally driven world. Welcome to Crash Course, Jameel.

Speaker 2 (01:54):
Thanks so much, happy to be here.

Speaker 1 (01:59):
So just start us off a little bit. Talk about how it came to pass
that free speech has become the focal point of your
own professional life.

Speaker 2 (02:07):
Well, I was a lawyer at the ACLU for almost
fifteen years starting in two thousand and two, and I
focused mostly on national security cases. So this was obviously
right after nine eleven, and we were doing a
lot of work relating to detention and interrogation surveillance, and

(02:28):
it turned out that a lot of those cases were
incidentally First Amendment cases or free speech cases. So in
the course of doing work on national security issues, I
ended up litigating a lot of transparency cases where we
were trying to get information about, for example, what was
going on in the CIA's black sites. Then I litigated a

(02:49):
bunch of cases involving the denial of visas to foreign
citizens who had been invited to speak inside the United States,
and those two turned out to be First Amendment cases.
A lot of cases involving access to the courts, and
then some cases involving the free speech and freedom of
association implications of government surveillance. And so I was approaching

(03:11):
all those cases as cases about national security and human rights,
but they all ended up turning on these questions about
the First Amendment, or at least free speech. And so
I became a First Amendment lawyer almost by accident. And
I'd been at the ACLU a long time, again, almost
fifteen years, and I got a call from Columbia, which

(03:31):
had decided with the Knight Foundation to set up this
institute here to focus on digital age free speech questions.
That was twenty sixteen. Now we've been doing this for
seven years and it's now a real organization. We have
about twenty five people, including thirteen or fourteen litigators. We
bring strategic litigation, We host and commission research, and we

(03:55):
have a growing public education program as well.

Speaker 1 (04:00):
So did I sort of oversell or improperly describe the dynamic
that's afoot in our lives right now? You know
this idea that the digital revolution, and with it, the
advent of social media, has made free speech freer perhaps
than ever before, while at the same time making it
so front and center in people's lives that it appears

(04:24):
to people with opposing viewpoints to also be more threatening than
ever before? Or is that the wrong way to think
about it?

Speaker 2 (04:30):
I think that's a legitimate way of thinking about it.
It's not the only way of thinking about it. So
it might depend what you mean by free speech, right.

Speaker 1 (04:37):
Just to clear that up, I guess I would say
the ability to speak freely and the ability to speak
without being censored.

Speaker 2 (04:44):
You know, I think it also depends what you mean
by censorship. Let me tell you why I'm sort of
resisting this frame. So you're definitely right that social media
in particular has democratized speech so that now anybody who
wants to comment on matters of public concern or, for that matter,
matters of private concern, can do it without the kinds
of gatekeepers that were always in the way twenty years ago. Right,

(05:06):
You no longer need the permission of CBS or The New
York Times to speak to a broad audience. There are
lots of people who are doing that right now on
social media without any mediation at all, and in many
ways that's been an amazing thing for free speech. You know.
It means that we can hold government officials and other
powerful private actors to account much much more easily. So

(05:29):
you know, in that sense, yes, absolutely, speech is freer
now than it's ever been. On the other hand, we
do have these new gatekeepers, the social media companies themselves,
that play a very large role in determining what speech
gets heard online, which ideas get traction, which speakers are
allowed to speak. All sorts of new technologies pose new

(05:49):
kinds of threats to free speech. We have a case
against a spyware manufacturer whose technology was used to hack the
phones of Central American journalists. That's the kind of threat
to press freedom that nobody even contemplated ten or twenty
years ago. So I would say it's complicated. There's a
sense in which you're certainly right, but that's not the
only way to look at the facts here.

Speaker 1 (06:11):
I'm sure it's not the only way to think about it,
Jameel. That's why I wanted to kind of do a
reality check with you. And as it happens, the Supreme
Court itself is wrestling right now with trying to understand
this interplay between digital platforms and free speech and then
the intervention of opposing parties and how that speech is
conducted, essentially. And there are a couple of imminent Supreme Court

(06:36):
hearings afoot. Those actually might have taken place by the
time we air, but I wanted to talk to you
about those. In the first one of them, a California
school board blocked parents on their own Facebook page because
the parents had left posts complaining about racism at the school.
And in that specific case, the court is trying to

(06:59):
come out on whether or not the school board has
a right essentially to limit parents speech if the parents
are criticizing the institution. I think that's a sort of
a thumbnail of what's at stake here. Tell me how
you view that particular case and what's in play there.

Speaker 2 (07:15):
So these are representative of a broader class of cases
involving the use of social media by public officials, and
as you know, public officials now use social media often
as their principal means of communicating with the public and
with their constituents in particular, and the result is that
some public officials social media accounts have become really important

(07:39):
public forums, like forums for discussion of public policy. I mean,
Trump's account, I think was the paradigm here. People used
to go to Trump's Twitter account to hear the views
of the president, to engage with those views, to engage
with other citizens about what the president had said. There's
a lot that you could learn from President Trump's Twitter

(07:59):
account you couldn't learn anywhere else. And so that Twitter
account took on this democratic significance kind of like you know,
a city council meeting or a school board meeting, or
a legislator's town hall, but you know, on steroids, almost
like Trump was standing at the front of the room
and there are millions of citizens assembled in front of
him who were listening to him, talking back to him,

(08:22):
talking to one another. You know, that's one way to
think of what that social media account was. And you know,
Trump is unique in many different respects, but many other
public officials now use their social media accounts in basically
the same way. And so this question of what status
these accounts have under the First Amendment is a really
important one. If there's a school board meeting, you can't

(08:44):
get kicked out of it just because you disagree with
what the school board thinks. And if you go to
a city council meeting and you complain about racism at
city schools, the city council can't kick you out just
because they don't like what you're saying. And so the
question is what happens if a public official effectively kicks
you out of his or her social media account, you know,

(09:05):
blocks you from accessing the account. I think that when
people first come to this set of questions, they
go, this is trivial. But given the significance that these
accounts now have to our democracy, it's actually a really
important question. When can a public official block a citizen
from participating in that democratically important space? And that's the

(09:25):
question that's before the court.

Speaker 1 (09:26):
Wait, but before you go on here, I think
we should clarify something. You mentioned that school officials
have their own, say, personal accounts. They may also have
individual accounts as a representative of the local government or
a local institution, and then there also might be an
institutional account, yes? So there are different classes of accounts

(09:47):
actually that come into play. I would presume that a
local official with a personal account is free to let
anyone on or off that personal account that they want
to, if it's in their capacity as an individual.

Speaker 2 (09:58):
Yeah.

Speaker 1 (09:59):
You see members of Congress, for example, expressly saying this
is my personal account and this is my federal account
as a politician. In the school board case in California,
my understanding is the parents were posting on a school
board account, not an account representing any individual, either in
their capacity as a local official or as an individual,

(10:20):
and that they were kicked off the school board account
or blocked from it, essentially. Is that correct?

Speaker 2 (10:26):
I mean, all that's correct, And conceptually you're right that
nobody's saying that the First Amendment should apply to a
public official's personal account. But what counts as a personal
account is actually a complicated question, because you know, Trump
used to say that his account was personal, but he
used that account to do the work.

Speaker 1 (10:44):
We'll come back to Trump later, because he's very sui generis in
a lot of ways and important in this debate and discussion.
But I want to focus in on what the Supreme
Court right now is looking at in these two cases.

Speaker 2 (10:55):
Yeah. Even with these two cases, though, you know, the
question is when does an account reflect the exercise of
state power? Because when the account reflects the exercise of
state power, it's subject to the constraints of the First Amendment.
But you can't answer that question about whether an account
reflects the exercise of state power without actually looking beyond

(11:17):
the label. It's not enough that somebody says this is
a personal account, or you know, I also have an
official account. You got to look at how the account
is used. And it's one of the things the Supreme
Court is going to have to struggle with is how
do you draw this line? Because on some of these
accounts you've got a bunch of photographs of cats, and
then you have, you know, a legislator saying if you
have comments about my proposal to do X, then please

(11:39):
write to my office. So it's a combination of things.
And how do you decide is this subject to the
constraints of the First Amendment or not? That's a hard question.

Speaker 1 (11:47):
The other case that the Supreme Court's looking at in
the docket that I've referenced to you right now is
a case in Michigan where a resident was blocked
from the city manager's Facebook page after the resident
complained about the locality's response to the COVID nineteen pandemic.
I'm assuming the same things that we just discussed in

(12:10):
the school board case in California are at play in
this Michigan case. Again, Can a local entity block a
resident from expressing himself or herself on a social media
platform that's affiliated in some way with the local government.

Speaker 2 (12:26):
Yeah, now that's right. I mean these two cases that
you just mentioned, the court is really focused on the
question of whether the accounts reflect the exercise of state power,
and the court didn't grant cert, so the court hasn't
agreed to consider the question of what those constraints might be.
So it's conceivable that the court says in these two cases,

(12:48):
in both of these cases, the public officials social media
accounts were exercises of state power, and therefore the First
Amendment applies. But this question of what it means when
the First Amendment applies is not actually presented by these
two cases and is going to have to be addressed
by the lower courts in the first instance, and you
could imagine a rule that says, well, public officials can

(13:11):
block their constituents for all sorts of reasons, like, for example,
for spamming them, but they can't block them based on
viewpoint alone. Like that would be one possible First Amendment rule,
but we're not going to get that kind of rule
out of the Supreme Court this term. It's going to
be for the lower courts to address that first.

Speaker 1 (13:29):
In this collection of things that the Supreme Court is
looking at, another one that's intriguing to me is they're
going to consider a case involving content moderation on social
media platforms and what protections the platforms themselves, as private entities,
should enjoy around how they moderate what appears on their
own sites, whether it's Facebook or Twitter. And we'll talk

(13:51):
more about this as we go on in the conversation
because this is also kind of, I think, ground zero
of our current debate about the new digital world and
free speech. But talk a little bit about what the
Supreme Court is looking at in that case, what responsibilities
private entities have over content moderation.

Speaker 2 (14:09):
Yeah, So the cases we just talked about are cases
about the government as speaker, right, where you have public
officials wanting to use social media themselves, and they're speakers
in that context. These cases that you just brought up
are cases involving the government as regulator, where the question
is what limits the First Amendment places on the

(14:30):
government's power to control or influence the content moderation policies
of the platforms, And these two cases involve laws out
of Florida and Texas, both passed in twenty twenty one.
Both of the laws impose what are sometimes called must
carry obligations on the platforms. So the Florida law, for example,

(14:52):
requires the platforms to carry the speech of political candidates
as well as prohibits them from taking down the speech
of media organizations on the basis of the content of
the media organization's articles. And then the Texas law prohibits
the platforms from taking down speech on the basis of viewpoint.

(15:13):
So both of these laws impose again what are called
must carry obligations on the platforms that require them to
publish speech that they might not want to publish. And
both laws also require the platforms to notify users whose
speech is taken down. So if Facebook decides that one
of your posts violates a term of service, then Facebook

(15:35):
is required under these laws to tell you that they've
taken the speech down and to explain why they've taken
it down. So those are the laws, and the question
before the Supreme Court is does the First Amendment permit
the government to impose those kinds of must carry obligations
on the platforms? And does it permit the government to
require the platforms to notify and provide explanations to their

(15:59):
users in the way I just described. So those are
the questions, and they turn out to be really complicated
First Amendment questions, in part because the precedents that we
have don't involve social media. The precedents we have, you know,
sometimes involve newspapers, and then there's a question of how
far do those precedents that were decided in relation to
newspapers go when we're talking about this very different medium.

(16:23):
So for that reason, these two cases are complicated.

Speaker 1 (16:26):
And because the technology platforms themselves have worked mightily
to claim that they're not publishers, that they're merely technology platforms,
even though in my opinion they do act as publishers
in the world we live in right now, I
think it's a smoke screen that the tech companies have
thrown up, because to moderate more would be more expensive.

(16:46):
That's an extra expense they don't want to take on. Describing
themselves as a publisher brings them into a different potential
regulatory regime. Describing themselves as publishers puts a different onus
on them legally and exposes them to new liabilities.
If they embrace the definition of publisher, it's more expensive
and more complex to run their businesses. And so they

(17:09):
insist that they're merely technology platforms and they're simply offering
people a place to express themselves. But as we've seen,
when technology platforms, I think, hide behind that label to
a certain extent, they don't perform the kind of gatekeeping
role you sometimes want in a complicated era in which
propaganda and disinformation exists alongside free speech and facts.

Speaker 2 (17:31):
Yeah, I think that there's no doubt that you're right
that the social media companies have tried to have it
both ways. You know, they sometimes say that we're effectively
merely conduits for our users' speech, and we can't be
held responsible for what's on our platforms because all of that,
you know, has been written by other people and we're
just kind of the infrastructure. They have, I would say,

(17:53):
if not abandoned that talking point, certainly drifted
considerably far away from it, and in these cases before
the Supreme Court, their argument is actually just the opposite.
Their argument is that the social media platforms are just
like newspapers for First Amendment purposes. We also exercise editorial

(18:14):
judgment when we decide what content can be on our platforms.
You know, when we decide that misinformation needs to be labeled,
or when we decide that speech that glorifies violence needs
to be taken down. Those are editorial decisions and their
editorial decisions within the meaning of the First Amendment. And
for the same reasons the newspapers are protected, and to

(18:34):
the same extent the newspapers are protected, we're protected too.
That's the argument they're making. That's sort of the first
step of their arguments in these cases, is that we're
just like newspapers. And the second step is, for the
same reasons Congress couldn't tell a newspaper to carry speech
it didn't want to carry, Congress can't tell us, or
legislators can't tell us, in this case Florida and

(18:55):
Texas can't tell us what to carry. And for the
same reasons that legislators couldn't require newspapers to explain why
they did or didn't publish any particular article. We can't
be required to explain to our users why we took
down their posts. So you're absolutely right that in other contexts,
the platforms have tried very hard to disavow any responsibility

(19:19):
for the content on their platforms. In this particular context,
they're running in exactly the opposite direction and saying that
we're no different from newspapers and are entitled to the
same constitutional protection as newspapers are.

Speaker 1 (19:33):
Okay, on that note, I want to take a quick
break and hear from one of our sponsors, Jameel, and
then we'll come right back and continue this conversation. We're
back with Jameel Jaffer, director of the Knight First Amendment
Institute at Columbia University. Jameel is a free speech warrior. Jameel,

(19:55):
let's step away from the specifics of the Supreme Court
cases we've been talking about in the prior segment and
just talk philosophically for a minute about what place the
values or virtues of free speech have traditionally occupied in
American life. Why is this thing that we call free
speech protected in the Constitution, Why is it constantly debated

(20:18):
in our public life. Why are you and I talking
about it right now?

Speaker 2 (20:22):
Yeah, I mean, I think that a big part of
the answer to that question is that free speech and democracy,
or free speech and self government are very very closely connected, right,
and democracy is core to our self conception in the
United States. That's sort of what defines our society,
its democratic character. But if you're going to have

(20:43):
a government that is answerable to the people, then the
people need to have the freedom to debate government policy,
they need to have access to information, and it's the
First Amendment that guarantees those things. And so one way
to think about the First Amendment, or the core purpose
of the First Amendment, is that it's intended to create

(21:06):
the conditions that are necessary to sustain democracy, and that,
in fact, is how most free speech theorists have thought
about it, at least for the last fifty years. You know,
I think people don't always know this, but the First
Amendment as we understand it today is actually very young.
It grew out of opinions that Oliver Wendell Holmes and

(21:27):
Louis Brandeis wrote beginning in nineteen nineteen, so just a
century ago, and those opinions were dissents and concurrences initially,
and then over time they sort of moved over into
the majority. But most of the rules that we think
of as fundamental to the First Amendment today were established

(21:48):
by the Supreme Court in the nineteen sixties and seventies
through cases like New York Times versus Sullivan, which insulates
news organizations from most defamation claims, Brandenburg versus Ohio, which
holds that even extreme forms of political advocacy are protected
by the First Amendment unless they amount to incitement, or

(22:09):
cases like the Pentagon Papers case, the nineteen seventy one
case that held that the government couldn't obtain a prior
restraint against the newspapers for publishing a secret report about
the Vietnam War. Like those cases were decided fifty years ago,
and those cases really defined the First Amendment as we
understand it today. So all this is very very new,
but those cases from the nineteen sixties and seventies really

(22:30):
positioned democracy at the heart of the First Amendment. They
really saw the purpose of the First Amendment as again
kind of creating the conditions that would make self government
and democracy possible. And so now when you think about
extending the First Amendment to new spheres, one question that
you might begin with is what would serve our democracy

(22:52):
in this new sphere. What rules relating to free speech
would be best for our democracy in this new sphere, like,
for example, the sphere of, say, social media. So that's
one way to approach these questions, and I think it's
the way that's most consistent with the way that the
Supreme Court approached these questions in this formative period in
the nineteen sixties and seventies.

Speaker 1 (23:11):
And yet, even with the precedents that you've referred to
and the sort of legal architecture that's been built around
free speech, it still gets contested daily in plenty of
venues outside of courtrooms. You've mentioned Florida already in the podcast.
Florida has been sort of on the cutting edge of
asserting, I think, state involvement in different forms of speech.

(23:35):
You know, the state government in Florida has intervened around
how the history of slavery and the African American black
experience in the United States should be taught. They've intervened
in things around what the K through twelve curriculum should
look like. It's empowered parents at a very microcosmic level,
to essentially police libraries for texts that are acceptable or

(23:56):
unacceptable to sometimes just one parent in a community.
How do you see that? How do you see some
of these things that have been going on in Florida?
How do you see that shaping this current battle we're
having now over defining both the nature of free speech and
the parameters surrounding it.

Speaker 2 (24:16):
Yeah. I would say first that those cases that you
just described only underscore how important First Amendment protections are.
There really is a kind of authoritarian impulse behind some
of those policies. Those policies are intended to restrict the
ideas that the public has access to. And the point

(24:38):
of the First Amendment, and the point of a lot
of the precedents that I just described from the nineteen
sixties and seventies, is to take that power out of
the hands of government to make sure that we the
people get to decide which ideas are worthwhile and which
ones aren't. And these laws that you just described,
these kinds of regulatory interventions, I

(24:58):
think are completely inconsistent with that principle. And so I
would say that some of these First Amendment protections are
going to get tested in cases involving those regulatory interventions
that you just described. But I still have confidence that
the courts will uphold those principles that sort of define
the First Amendment. And again, one of those principles is
just that it's not up to the government to decide

(25:20):
which ideas are worthwhile. That's a power that the Constitution gives
to ordinary citizens.

Speaker 1 (25:26):
What about private entities' ability to shape what is and isn't acceptable? Obviously,
the First Amendment was erected essentially to protect individuals from
government censorship. It doesn't stretch with the same kind of
robust vigor into private enterprises as it does in how
it operates as a watchdog against government censorship. Nonetheless, again,

(25:48):
this digital era we're in has really put all
of this in stark relief, which leads me to Elon
Musk and Twitter. Musk bought Twitter. When he bought it,
he described himself as a free speech absolutist, and he
said that one of the reasons he was purchasing Twitter
was because he felt people's free speech, particularly conservative free speech,
was being circumscribed. Since he took it over, I, as a

(26:11):
Twitter user and a Twitter observer, think it's become this
sort of carnivalesque car crash of mismanagement and misinformation,
And in fact, I think Musk has acted to make
it easier for disinformation and a kind of hysteria to
take root on Twitter that wasn't there before. It was
there in bits and pieces, but it's very center stage now.

(26:33):
How do you think about the responsibilities that are on
private owners in this digital era in terms of making
sure that everyone has access from both sides of the
aisle politically, and that good factual information, as opposed to
disinformation and propaganda, flows freely across sites.

Speaker 2 (26:53):
I share your view of what's happened to Twitter. I
think that it's too bad because I think Twitter used
to play really important role in underwriting public discourse. I
don't really see it playing that role now, in part
because of the pathologies that you just described. I would separate, though,
the question of the social media companies' ethical responsibilities, and

(27:15):
I do think that there are ethical responsibilities in the
same way that media organizations have, you know, a kind
of journalistic set of ethics. Social media platforms should also
be thinking about what their ethical responsibilities are. But I
would separate that question from the question of what the
government should be doing to influence or control the content
moderation policy of the platforms, because it's possible that most

(27:37):
of the work that we need done here has to
be done not through regulation, but through the development of
platform ethics. One concern I have with the cases that
we were just talking about in the Supreme Court, these
Florida and Texas cases, is that the laws that these
two states have passed I think are largely unconstitutional. I
don't see the Supreme Court coming to the conclusion that

(27:59):
the social media platforms don't have First Amendment rights.
I don't see the Supreme Court upholding these laws that
impose quite onerous must carry obligations on the platforms for
no articulated reason. But I am worried that in striking
down these laws, the Supreme Court might write those opinions

(28:19):
so broadly that those opinions foreclose other legislation in the
future that might be narrower and more justified by legislative
findings and more closely connected to legitimate democratic goals. I
do think that there is a role for governments to
play in this sphere. I think that some form of
transparency mandate would be a good thing. You know, requiring

(28:42):
the platforms to be more accountable to the public and
to researchers and to regulators about the decisions they're making
would be a good thing. Some version of a notice requirement,
I think would be a good thing. I think it
makes sense that, you know, when people are kicked off
these platforms that have gatekeeper powers with respect to
public discourse, it makes sense that they should have to

(29:03):
explain their decisions. And I worry that Florida and Texas's
laws will, for good reason be struck down, but struck
down in terms that are so categorical that the court
will foreclose much more sensible legislation that might be proposed
next year or the year after. That's my worry about
those particular cases.

Speaker 1 (29:23):
The COVID lockdowns and the COVID era have also introduced
an interesting new development, I think, or highlighted maybe one
that pre-existed, but this idea around the extent to
which the government is allowed to police digital platforms for
bad information around say healthcare and public health that if
it is false, could be threatening to the well being

(29:46):
of individuals, but that obviously also can run up against
individuals' desire to present their own views about a public
health crisis, or the efficacy of government recommendations during a
public health crisis, whether it's asking for vaccinations or whatever it
might be. That's also been playing out in a very
intense way in recent years, in a way that I

(30:07):
didn't think it had in the past, and I was
wondering how you think about that issue.

Speaker 2 (30:12):
Yeah, I mean, I guess two things. So one is authority.
I mean absolutely, you know, the government has information and
insight that the public lacks on issues relating to public health.
It's obviously crucial that the CDC be able to share
that information with the public, important that government agencies be

(30:32):
generally trusted. I think that the lesson from the last
few years, though, is not that we need to clamp
down on misinformation about public health or ensure that only
the government's views are heard. I think the lesson from
the last few years is first that the platforms have
a kind of ethical responsibility to their users to ensure
that their users are hearing information from trustworthy sources, but

(30:57):
also that the platforms have an obligation to ensure that
there is space for dissent. You know, the government does
have this special expertise, but that doesn't mean the government
doesn't get things wrong. Sometimes the government gets things wrong
in good faith, and sometimes government officials, for whatever reasons,
decide to mislead the public about something or the other.

(31:18):
And part of the reason we create space for dissent
is because the fact that dissenters are allowed to voice
their views is one of the things that gives legitimacy
to the government's views. Right. We're willing to trust the
government in part because dissenters are allowed to have their say,
and we trust that when dissent is persuasive, it'll eventually

(31:39):
have the effect of forcing the government to change its
own views or its own policies. So I think you
need kind of both of these things. You need the
platforms to ensure that their users are given access to
trustworthy speakers, but also they need to make sure that
there's room for dissent. And I think that the way
that platforms can do that is by responding to what
they think of as misinformation with labeling rather than suppression.

(32:04):
I think labeling is a much much better solution to
the problem of public health misinformation than suppression. It's much
better for Facebook to just stick its own speech on
top of what it believes to be misinformation, and it can
say we don't think this is accurate. If you want
an accurate view, go to the CDC's website. That is

(32:25):
an appropriate way for Facebook to respond to speech that
it thinks of as dangerous misinformation. If Facebook responds with
suppression rather than labeling, the effect is to give those
speakers a kind of monopoly on public discourse, and
also to disable the kind of dissent that, for one
thing, might turn out to be right, but for

(32:47):
another, the kind of dissent that actually ends up legitimating
the government's views. The fact that the dissent is there
is one of the reasons that we are willing to
trust the CDC, because we know if the CDC gets
things wrong, people will say so. Other scientists will say,
the CDC got this wrong, and here's how I know
it got it wrong, right. So I think that's why

(33:07):
I favor labeling over suppression. I don't think it would
make sense to give the government the power to make
misinformation unlawful. And I say that for a number of reasons.
One is that what is or isn't misinformation is always
a contested thing. There's no way to draw that line
in a way that will be seen as politically legitimate.
Another is the government often gets things wrong even when

(33:29):
it's operating in good faith, and still another is that
the government doesn't always operate in good faith. Those are
all reasons why it would be a bad idea to
go down the road of giving government officials the power
to suppress misinformation. And I will say just one more
thing about that, which is that when people propose that
government officials should be given that authority, they always have

(33:51):
in mind that the government officials who will be exercising
that authority are people like them. And you cannot have
any confidence that the people who are going to be
exercising that governmental authority tomorrow will be people like you,
even if they are people like you today. So that's
still another reason to reject that possible purported solution to

(34:13):
the problem of misinformation.

Speaker 1 (34:15):
Okay, Jameel, let's take another break and then we'll come
right back. We're back with Jameel Jaffer, and we're talking
about free speech. Jameel, we talked earlier in the show
about Donald Trump as a sort of avatar for a
lot of the issues that have arisen around free speech
and the uses and potential abuses of social media platforms.

(34:38):
In the era we're in, Trump has actually made free
speech a shield for himself. Recently around some of the
court cases that have been directed against him, he has said,
in reference to his involvement in the January sixth insurrection that resulted
in a violent clash at the Capitol and an attempt
to overthrow the election result and interfere with

(34:59):
the election counting on that day, that efforts to prosecute him
are assaults on his own free speech. And I think this raises
an interesting thing in the free speech debate that's worth clarifying,
which is, you can protect speech in all of its forms,
even hate speech is often protected legally. But there's a
difference between speaking freely and inciting violence or inciting a crime,

(35:25):
isn't there.

Speaker 2 (35:26):
Yeah, No, that's right. It's sometimes hard to separate these things.
But if you protected all speech, then you know, you
would presumably protect the person who says attack to their
attack dog, and that would be self defeating, and so
you kind of have to separate out speech that is
part of criminal conduct. I think that if you look
at the indictments of Trump, there's a lot of speech

(35:49):
in there, a lot of what the government is relying
on in accusing Trump of criminal activity is speech. I
don't think that is in itself a First Amendment problem.
People are prosecuted for conduct that involves speech all the time.
Incitement is one example, Fraud is another example. Solicitation of

(36:09):
criminal conduct is another example. You know, those are all
situations where all that the person did is speak, but
they spoke as part of a course of criminal conduct.
So that's a line that's often difficult to draw. But
it's the...

Speaker 1 (36:24):
Line between free expression and being a cog in criminal conduct.

Speaker 2 (36:29):
Yeah, exactly, because criminal conduct often involves speech. So that's
going to be a challenge for the government in these cases.
But I don't think the mere fact that the indictments
accuse or list episodes in which Trump is alleged to
have said this or that, you know, I don't think
that is in itself a reason to think that these
indictments are a First Amendment problem.

Speaker 1 (36:49):
Another troubling, poignant issue in the news right now is
the Gaza conflict that's given rise to all sorts of
debates around free speech, Muslims accusing Jews of being anti Muslim,
Jews accusing Muslims of being anti Semitic. This has taken
root on campuses now around the country. In the US,
the debate about who's in the right and who's in

(37:10):
the wrong in this particular conflict, and in some recent incidents,
students who've either come out as being pro Palestinian or
have said that they don't have an issue with what
I think are some of the grotesque measures Hamas took
in its attack on average Israeli citizens have come under
sanction from their own universities, and from outside donors to the

(37:33):
universities who think the students have gone beyond the pale.
My view of this has been that, however wrong some
of the students might be in the way that they're
describing what's occurred or what they're advocating for, they
are still students on a campus, and if you start
sanctioning them for their speech, you get into very tender territory. Obviously,
disagree with me if you want, but I did want

(37:55):
to put this thing up in front of you because
I think it's also another very public, poignant reminder of
some of the fault lines and difficulties that surround free speech.

Speaker 2 (38:06):
Yeah, I do think that there are some difficult free
speech questions here, but for the most part, they are
not First Amendment questions, right, They are questions about free
speech culture. So, for example, when a donor says to
Harvard University, I used to give you hundreds of millions
of dollars, and because you haven't condemned the students who

(38:27):
didn't vociferously enough condemn the Hamas attacks, I'm going to
withhold future donations. I think that the students had a
right to say what they said, the university had a
right to respond in whatever way it did, and the
donor has a First Amendment right to respond in that
way too. Now, those are the easy questions. The First
Amendment questions are easy. Harder questions are about free speech culture.

(38:51):
It does make me very uncomfortable to see donors putting
this kind of pressure on universities to condemn their students.
You know, one prominent hedge fund manager was running these billboards
at campuses around the country accusing some of the students
of being anti Semitic, plastering their faces and names and
home addresses on these billboards. Again, I think that those

(39:13):
actions are probably lawful. I mean, I don't know all
the details, but based on the description I just gave you,
the actions are probably lawful, but they do seem inconsistent
to me with the basic principles of an open, free
speech culture. I don't see that as, you know, a
kind of legitimate form of counter speech. Instead, those billboards
are an attempt to intimidate and coerce students into giving

(39:36):
up their First Amendment rights, stopping students from participating in
public discourse about, you know, an issue whose importance everybody recognizes.

Speaker 1 (39:45):
Since we're talking about campus life, Jameel. Some data or
studies have suggested that faculty members are getting punished or
fired for speech or expression more frequently in recent years
than they have historically. It's not clear to me how
that breaks down if it's faculty members on the left
getting censured by institutions on the right, or faculty members

(40:08):
on the right getting censured by administrations that are
more left leaning. But it does seem to be increasing
regardless of where the ideologies line up. And I'm wondering
what you think about that. Do you think it's actually
become more ubiquitous and apparent now than it has in
the past? And what are your thoughts about that, if so?

Speaker 2 (40:24):
Yeah, I don't know the statistics, but it
does certainly feel like academic freedom is under a special
threat right now, not just with these sanctions being imposed
on professors who say controversial things, but there are these
attempts around the country to restrict the ways that teachers,

(40:46):
public university faculty, teach. We have a case in Texas
where we're challenging a law that restricts public university faculty
from teaching with TikTok or studying TikTok. There are the
interventions you mentioned earlier involving critical race theory, or motivated
anyway by the perception that critical race theory has

(41:07):
kind of taken over schools. So there are all these
efforts to chill the speech of public university faculty. And
it goes beyond universities as well, you know, high schools
and elementary schools too. These efforts to really kind of
narrow the ideas that students are exposed to, and narrow
the options that teachers have to teach their students, even

(41:28):
you know, restrict the books that students can read, and
all of that I think is a matter for real concern.

Speaker 1 (41:34):
You know, the standard we're talking about here, more broadly,
I think, is that airing contentious views is a virtue,
and disagreement about those views is healthy, especially on campuses.

Speaker 2 (41:45):
Yes, absolutely, I would say it goes even beyond that.
The whole point of a university is to create a
space in which people can really consider ideas freely, can
pursue ideas to their limits, can explore ideas even if
they're controversial or unpopular. I mean, that is the point
of the university. If a university can't do that, then

(42:05):
you know you've really undermined its, you know, entire purpose.

Speaker 1 (42:09):
I always like to wind the show up, Jameel, by
asking people what they've learned. What do you know now
as an attorney and an advocate steeped in issues surrounding
free speech that you didn't know when you first embarked
on your legal career.

Speaker 2 (42:24):
I would say that I know that these issues are
more complicated than they seem at first. You know, you
come to free speech, or certainly I did, with a
pretty two dimensional understanding or one dimensional understanding of the
First Amendment and the concept of free speech, that the
whole point is to prevent the government from censoring us right.
And it's not that that's wrong, but it turns out

(42:46):
to be much more complicated, much more complicated, because there
are legitimate questions about when something should count as censorship.
There are legitimate questions about should we be worried only
about the government or should we be worried about private
actors too. There are legitimate questions about the purpose of
the First Amendment. You know, we talked a little bit
about self government and democracy, but there are also completely

(43:09):
plausible theories of the First Amendment that center other values
like individual autonomy or truth seeking or accountability. And if
you think those are the values that the First Amendment
should care most about, then your First Amendment is going
to look a little bit different than a First Amendment
that is focused principally on democracy. And ultimately, there's no

(43:31):
right answer to those questions, or maybe a better way
to say it is that the only way we can
figure out what's right is to figure out what works.
We have to think about, you know, what kind of
society is this going to create? And do we like
that society? So those are really hard questions, and you
come to this for the first time you think it's
just a matter of stopping the government from censoring people,

(43:52):
and again, not incorrect, but not complete either.

Speaker 1 (43:57):
This has been such a great conversation, Jameel, but we're
out of time unfortunately. Thank you for joining us today.

Speaker 2 (44:03):
Thank you so much.

Speaker 1 (44:05):
Jameel Jaffer is the director of the Knight First Amendment
Institute at Columbia University. You can find him on Twitter
at Jameel Jaffer. Here at Crash Course, we believe that
collisions can be messy, impressive, challenging, surprising, and always instructive.
In today's Crash Course, I learned that the digital revolution
has upended so many things that even free speech and

(44:28):
how we define it, enforce it and build our laws
around it is also in motion. What did you learn?
We'd love to hear from you. You can tweet at
the Bloomberg Opinion handle at Opinion or me at Tim
O'Brien using the hashtag Bloomberg Crash Course. You can also
subscribe to our show wherever you're listening right now, and
please leave us a review. It helps more people find

(44:50):
the show. This episode was produced by the indispensable Anna Mazarakis,
Julia Press, and me. Our supervising producer is Mo Hendrickson,
and we had editing help from Sage Bauman, Jeff Grocott, Mike
Mietze and Christine Vanden Bilart. Blake Maples does our sound
engineering, and our original theme song was composed by Luis Guerra.

(45:12):
I'm Tim O'Brien. We'll be back next week with another
crash course.