Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
[MUSIC]
>> Eryn Tillman (00:07):
Welcome,
my name is Eryn Tillman,
an Associate Director at the Hoover Institution, and we'd like to welcome
you to today's webinar organized by the Hoover Institution Center for
Revitalizing American Institutions, also known as RAI.
Today's session will consist of brief opening remarks from our panelists and
a facilitated discussion with our moderator, followed by a period where our panelists
(00:29):
will respond to questions from audience members.
To submit a question,
please use the Q&A feature located at the bottom of your Zoom screen.
We will do our best to respond to as many questions as possible.
A recording of this webinar will be available at hoover.org/rai within
the next few days.
RAI operates as the Hoover Institution's first-ever center and
(00:50):
is a testament to one of our founding principles: Ideas Advancing Freedom.
The center was established to study the reasons behind the crisis in trust facing
American institutions, analyze how they are operating in practice, and
consider policy recommendations to rebuild trust and increase their effectiveness.
RAI works with and supports Hoover Fellows as well as faculty,
(01:11):
practitioners, and policymakers from across the country to pursue evidence-based
reforms that impact trust and efficacy in a wide range of American institutions.
To date, our webinar series has covered topics related to transitions of
the executive branch, trust in elections,
how polling helps us understand what is on the minds of Americans, and
the experiences of conservative students on American college campuses.
(01:34):
Today we'll be exploring regulations and restrictions on speech in other countries
that may impact Americans' right to free expression.
And with that, it gives me great pleasure to introduce today's moderator,
Eugene Volokh, the Thomas M. Siebel Senior Fellow and an affiliate at the Center for
Revitalizing American Institutions at the Hoover Institution.
Eugene is one of our nation's leading legal scholars.
(01:56):
During his 30-year tenure as a faculty member of the UCLA Law School,
he taught First Amendment law, copyright law, criminal law, tort law, and
firearms regulation policy.
He's a member of the American Law Institute and
of the American Heritage Dictionary Usage Panel, and the founder and
co-author of the Volokh Conspiracy, a leading legal blog.
His work has been cited in more than 350 court opinions,
(02:18):
including 10 Supreme Court cases, as well as in more than 5,000 academic articles.
He's also filed briefs, mostly amicus briefs, in more than 200 cases, and has
argued in over 40 appellate cases in state and federal courts throughout the country.
We're thrilled to have Eugene leading the Hoover Institution's program on
Free Expression and grateful that he has agreed to lead today's conversation.
(02:41):
Eugene is joined by Jacob Mchangama, the founder and
Executive Director of The Future of Free Speech.
He is a research professor at Vanderbilt University and
a senior fellow at the Foundation for Individual Rights and Expression (FIRE).
He has commented extensively on free speech and human rights in
outlets including the Washington Post, the Wall Street Journal, the Economist,
(03:02):
Foreign Affairs, and Foreign Policy.
Jacob has published in academic and peer-reviewed journals including
Human Rights Quarterly, Policy Review, and Amnesty International Strategic Studies.
He is the producer and narrator of the podcast Clear and Present Danger: A History of
Free Speech and the author of the critically acclaimed book Free Speech: A History from
Socrates to Social Media, published by Basic Books in 2022.
(03:26):
Now I'll hand it off to Eugene and Jacob for today's conversation.
>> Eugene Volokh (03:31):
Thank you very much,
Eryn. Jacob, thanks so much for
participating in this.
It's a tremendously important topic, and no one better than you to address it.
So actually, before we get into the substance, Jacob,
tell us a little bit more about your background studying free speech law.
And in America we say First Amendment law,
(03:51):
which, on the one hand, there are many technical problems with that.
There are statutes that protect free speech,
common law rules that protect it or restrict it, and such. But the other problem is
the First Amendment governs, what, about 4% of the world's population.
Free speech, on the other hand,
(04:12):
is a universal principle treated differently in different places.
So it's very important to study it on an international basis, and
you've been doing it for a long time.
So tell us a little bit about your background in this area.
And by the way, just to make clear, Jacob is from Denmark, which I hear
(04:32):
is a foreign country, which is one of the places the First Amendment doesn't run, and
it's part of Europe, where naturally one wants to study more about the rules there.
He now is in America and knows a lot about American free speech law too, but
he has been studying the free speech rules in a variety of countries,
(04:53):
including outside of both Europe and America, for a long time.
So tell us a little bit about your experience both in Denmark and
now in America, dealing with all these things.
>> Jacob Mchangama (05:03):
Thank you so
much, Eugene.
It's an honor and a privilege to be part of this.
You're right, I was born and raised in Copenhagen, Denmark.
And I think for a very long time, well into my 20s,
I did not think much about free speech.
It was something that I took for granted.
(05:24):
It was like breathing the air.
Denmark is a very secular, liberal country, and free speech was essentially
a battle that was won; it wasn't really threatened by anyone.
And then, as some of you might remember, 20 years ago, a Danish newspaper
published cartoons of the Prophet Muhammad, which led to a sort of geopolitical
(05:44):
crisis where suddenly Denmark became the epicenter of a conflict
over the relationship between the values of free speech and religion.
And I think that particular conflict, and sort of the global fallout
from that, which I think we're still living with to this day,
(06:04):
really set me down the rabbit hole of free speech,
maybe radicalized me in a free speech direction, and
I haven't been able to extract myself from the rabbit hole since then.
I'm a lawyer by training, spent some years in corporate and commercial practice,
but then became the director of legal affairs of a Danish think tank.
(06:27):
I founded a think tank 10, 11 years ago in Copenhagen,
focusing on these issues, but since 2020 I have been focusing
on global freedom of expression almost exclusively.
And two years ago I set up The Future of Free Speech,
an independent think tank at Vanderbilt University, where we study and
(06:52):
advocate for what we call a resilient global culture of free speech.
So looking both at laws that protect free speech, but
also norms that have to do with free speech, which I think
is quite relevant to the topic that we're discussing today.
(07:13):
Because American free speech is not necessarily undermined
legally by foreign laws or norms; it might be more
the practical exercise of free speech by Americans that can
be impacted by international and foreign developments.
>> Eugene Volokh (07:35):
So
that's very helpful, thanks very much.
And it's a nice segue into sort of the substantive question, which I
might frame a little bit like this.
So I think many Americans, and maybe many people in other countries as well,
think of the world as divided into sovereign nations.
(07:56):
And the Danes have their own rules and
maybe Europeans as a whole have their own rules.
And we may agree or disagree, and we may sometimes send our vice president over
to complain about them to foreign countries.
But we understand that foreign countries have their own lawmaking,
their own constitutions, we have our own.
So we're just going to peacefully coexist, with us having our own free speech
(08:20):
rules that affect what Americans may say and what Americans will hear.
And Europeans have their own, and South Africans have their own.
And the Chinese have very restrictive views of free speech, but
that's okay, cuz that's between them and their people.
But it's, of course, more complicated than that, and
(08:42):
that's what you're gonna be telling us about.
Tell us how it is that Americans' speech, right,
may be affected by foreign free speech rules, and perhaps even vice versa.
>> Jacob Mchangama (08:58):
Yeah, so
I mean, let's start in the 1990s.
So this, I think, was a time of free speech optimism.
It was driven by a techno-utopian ideal of a free and open Internet.
I think these ideals were sort of built into Section 230 of
the Communications Decency Act.
(09:18):
They were cemented by the Supreme Court's 1997 decision in Reno v. ACLU,
which declared the Internet a unique and
wholly new medium of worldwide human communication.
And there was an Internet Freedom Agenda, which was a bipartisan pillar of
US foreign policy, where Washington and Silicon Valley worked
(09:39):
hand in hand to promote ideals of free speech around the world.
And for a while, foreign democratic governments and
sort of oppressed populations around the world embraced
the Internet Freedom Agenda as a harbinger of liberation and progress,
culminating in the Arab Spring, which was powered by social media.
(10:05):
But that was then.
So since then, we've had two polarizing presidential elections and a pandemic.
We've had Brexit, we've had a refugee crisis in Europe.
And Americans are now deeply divided over whether social media protects or
undermines free speech, and what, if anything, should be done about it.
(10:25):
But outside the US, the mood has also shifted.
I think a lot of democracies have soured on the American model of Internet freedom,
which they associate with Silicon Valley platforms enabling the viral spread
of content that is often illegal under their own laws or
corrosive to their values and democracies.
(10:47):
This was actually something that was predicted by Tim Wu and
Jack Goldsmith in their book from 2006, Who Controls the Internet?,
which at the time seemed a bit contrarian and against the zeitgeist of
free speech optimism, but today looks depressingly prescient.
(11:09):
And over the past decade, governments around the world have launched
regulatory efforts to tame what is sometimes called the online Wild West.
And some of these efforts may have consequences for
free speech in the US, even if Americans still enjoy the strongest
(11:30):
constitutional protection for free speech in the world,
because it is increasingly foreign and
global standards that these platforms employ.
So we actually saw an example of this just last week.
So back in 2020, I think, Facebook, now Meta,
(11:50):
created an Oversight Board, which can issue binding rulings on content
moderation appeals from users and also make policy recommendations to Meta.
And one recent case, decided last week, involved a Facebook video showing
a transgender woman being confronted for
using the women's restroom at a US university.
(12:13):
And so a US woman filmed this video and she questioned this
transgender woman's presence, saying she feels unsafe.
And the caption of the video said,
male student is using the women's bathroom, why is this tolerated?
There were users who found that this video violated Meta's hateful conduct policies.
(12:39):
Meta disagreed.
This was appealed to the Oversight Board, and the board sided with Meta.
But it did not interpret Meta's content policies
by using First Amendment analogies.
Instead, it applied international human rights law,
specifically the International Covenant on Civil and Political Rights, the ICCPR.
(13:05):
And in the ICCPR, on the one hand,
Article 19 protects free speech while it allows for certain restrictions.
But its Article 20, Paragraph 2, goes further.
It actually mandates the prohibition of certain types of hate speech.
Which, of course, is very different from the US Supreme Court's Brandenburg
(13:27):
v. Ohio test of incitement to imminent lawless action likely to produce
such outcomes, under which the video,
if you were to use an analogy, would have been protected.
Now, in this particular case,
international human rights law supported free expression.
So the American user's video was upheld.
(13:50):
But in other cases, it has not.
The Oversight Board has used human rights law to rule against
anti-immigration speech.
It has also found Holocaust denial to be
unprotected using international human rights law.
It has also relied on decisions from the European Court of Human Rights.
And the European Court of Human Rights offers no protection
(14:15):
to hate speech and even permits restrictions on blasphemy.
To be clear, of course,
US platforms have a First Amendment right to set their own content rules.
Section 230 protects them broadly from legal liability when enforcing them.
But I think there's good reason to think that international pressure is
(14:36):
pushing American companies towards less speech-protective policies and
stricter enforcement than they would otherwise have chosen.
After all, these are global platforms.
Their terms of service and content policies are generally universal;
that's at least the ideal.
And so when most of your users are outside the United States, where speech laws
(15:00):
are more restrictive,
it makes business sense to align with the dominant regulatory climate.
This is especially so when the alternative is facing huge fines, national bans,
criminal investigations, or even the arrest of local employees.
And this is all stuff that has actually happened under laws like
(15:21):
Germany's now-repealed NetzDG, India's IT rules,
Brazil's judiciary-led campaign against fake news, and,
most prominently, the European Union's Digital Services Act.
At The Future of Free Speech, the think tank that I run, we've tried to sort of come up
with some data that can maybe sort of give us a picture of the impact,
(15:45):
even though it's difficult to make sort of any causal claims.
But in 2023, we did this report where we looked at the development
of hate speech policies of major social media platforms.
All of them American.
So Facebook, Instagram, X, YouTube, etc.
And our findings show that platforms have significantly expanded their hate speech
(16:10):
policies over time, both in content and in the range of protected characteristics.
So what began as relatively narrow bans on overt racist or
hateful speech has grown into sort of including harmful stereotypes,
conspiracy theories, and so on.
And since 2020,
(16:30):
the average number of protected characteristics has more than doubled.
And this creates a clear risk of over-censorship to avoid fines.
This is something that is borne out by reports that we've done looking at
deleted comments on Facebook and YouTube in France, Germany, and Sweden.
(16:52):
Where we found that between 90 and 99% of the deleted comments were perfectly
legal, and most of them were not even sort of offensive, which is a subjective term.
Many of them were not even about controversial topics.
So that suggests that laws like the NetzDG that was passed in Germany, and
(17:14):
the Digital Services Act, incentivize platforms to remove legal content.
And if they do so by changing their terms of service or
their universal content policies on hate speech,
that then also has an impact on users in the US.
(17:36):
And given that the practical exercise of free speech of
most Americans is predominantly carried out on social
media platforms, that obviously then has an effect.
And I briefly mentioned the European Union's Digital Services Act, which might
(17:57):
be the most ambitious and sweeping attempt to regulate the online ecosystem.
The DSA is hotly debated; some see it as a huge step towards progress,
others see it as a censorship machine.
It does include some positive developments like transparency,
(18:18):
improved appeals processes, stronger user protections.
But I think it also presents a risk to freedom of expression with its
notice-and-action system, its obligations for very large online platforms and
search engines to mitigate systemic risk,
and with the European Commission acting as a regulator.
(18:41):
And the European Union has been very clear in saying that its ambition is for
the Digital Services Act to serve as a global model for online regulation.
It was marketed as a global gold standard for
platform regulation, which is a nod to the so-
(19:03):
called Brussels effect, where the European Union,
because of its huge market and its expertise in regulation and
its enforcement mechanisms, is able to essentially
set global rules that are likely to be adopted by private companies and
(19:26):
also emulated by other countries around the world.
And some of you might remember that a former commissioner,
Thierry Breton, a very assertive Frenchman who was in charge of the DSA,
sent a number of threatening letters to US platforms
(19:49):
warning that their content moderation violated the DSA.
The most infamous example was when he sent a letter to Elon Musk just before he
was about to livestream a discussion with then presidential candidate Donald Trump,
warning that that might potentially violate the Digital Services Act.
(20:12):
So that was obviously something that provoked a lot of Americans, saying,
how dare a European bureaucrat try to sort of have
a say in how an American platform facilitates free
speech directly relevant to an upcoming US election?
(20:37):
I think even within Brussels, there was a realization that this was a step too far.
I also think that the current administration is openly
hostile to the DSA.
So the impact of the DSA may be less pronounced
than under the Biden administration.
(20:58):
But under the Biden administration, there were meetings between the White House and
the European Commission where they sort of sent out press statements that
aligned their approach to fighting disinformation and so on.
And there was a hope in the EU that the US would voluntarily adopt parts of the DSA,
like risk assessments and independent audits of social media platforms.
(21:24):
So all this is to say that, I think,
there has definitely been a real impact on US platforms, and
thereby US users, when it comes to the DSA.
I think it's uncertain to what degree it will have an impact.
I think with the new administration in place, it's likely to
(21:48):
be less direct, and there's likely to be less of an incentive for
US platforms to follow the Brussels playbook.
So that might be my initial comments.
>> Eugene Volokh (22:00):
So, again,
that's tremendously helpful.
It's important to realize the First Amendment restricts
the government; it doesn't restrict social media platforms.
Free speech is more in danger from the government than from social
media platforms.
But at the same time, as you point out, for most people, practically speaking,
(22:22):
what they can say to the public is chiefly dictated by the social media platforms,
because they don't have other mechanisms for doing that.
So it's important not to ignore the influence that massive
platforms have on free speech.
And of course, you point out it's not just, well,
(22:43):
is it government action or is it private company action?
It's private company action often pressured by the need
to comply with government orders, albeit perhaps from foreign governments.
>> Jacob Mchangama (22:56):
And I think this is
one of the gray areas, because this concept
of jawboning: when does government pressure reach such a level where you say,
well, this is no longer just the government having a reasonable,
strong interest in what goes on on platforms,
(23:18):
saying, hey, we think what you posted is wrong or false or
dangerous, which I think is not unreasonable, for
a government to take an interest in what kind of information is being spread.
But where does that cross the line into where you say, well,
(23:42):
if you don't remove it, it might have consequences for
you, and where it sort of becomes state action?
And I think some of these laws incentivize, or
at least facilitate, non-transparent jawboning, or
at least they create a risk where you have sort of processes
(24:06):
that are not very transparent,
that are not well described, and that creates mechanisms that
provide powerful regulators possibilities to sort of jawbone
platforms to address content that might not even be illegal.
(24:26):
And of course, one of the developments that facilitates
this is the centralization of social media, right?
It's a very different online ecosystem that we inhabit now than if we go back,
say, 15 years, when the blogosphere was dominant, right?
(24:49):
If we were back in the days of the blogosphere,
a much more decentralized online system,
you could have a blog with a million users, but very few people did.
And the government would very likely not take a strong interest in
how that blog handled content and how it moderated user comments,
(25:13):
because its impact on the entire online ecosystem was very limited.
That's a very different proposition when you have centralized
platforms where billions of users essentially share and access information.
And that can have a completely different systemic impact on the online
(25:36):
ecosystem of information and ideas than a blog.
>> Eugene Volokh (25:40):
Right, so one way of, or
one thing one might wanna think about here, is recognizing that there
are basically 200 sovereign countries, more or less, in the world.
What can one do to protect their rights, to control what happens within
their borders, but keep that from spilling over outside their borders?
(26:03):
Of course, some countries say, well, American free speech norms
have been spilling over into our country for a long time.
We need to stop that too.
But now, of course, Americans may be concerned about that as well.
So one thing that I know sometimes happens is there's a court order that,
(26:25):
let's say, requires the removal of defamatory material, but
the defendant doesn't remove it.
Maybe the defendant is outside the jurisdiction, or hard to find, or
whatever else.
And then people send that order to Google and say,
deindex those pages, remove them from Google search results.
(26:45):
It could be defamation, could be under this right to be forgotten
that European countries have developed, or under other kinds of restrictions.
And my understanding, and I've studied this matter some, but
maybe things have changed.
But my understanding has long been that Google tries to give effect
(27:05):
to those orders, even when it's bound by those orders, on a national basis.
So if there's an order that comes in from a French court that says this material
has to be removed, it removes it for people who are viewing it from France.
Or maybe, if the order is based on European law and
(27:26):
purports to apply throughout the EU, it removes it for people from the EU.
On the other hand, Americans continue to access those things.
So, for example, if somebody gets an order against my blog,
my blog writes about various court cases.
In the process it mentions the names of people involved.
Occasionally I get requests or even demands that I remove posts about people.
(27:52):
I say, well, if it's accurate, or if it's an accurate report of a court decision,
I'm not going to do it.
But imagine that somebody who's a citizen of France
gets an order along those lines.
In France, Google, my understanding, at least from all I've seen, is only going to
make it effective for people who are apparently accessing it from France.
Now, you could imagine some countries saying, no, that's not good enough.
(28:16):
We want to protect our citizens against being defamed or
having their private information disclosed all over the world.
So we're going to demand that you do it on an international basis, a worldwide basis.
And one way we'll make it stick is, if you have assets in our country,
we will just seize those assets or
arrest your people until you act on what we're saying throughout the world.
(28:41):
But while I think that's happened at times, my sense is that's not the norm.
So Google does indeed try to enforce these foreign orders
in different ways depending on where the user is located.
So one question is, is that your sense as well, that there is some degree of this
(29:02):
kind of geolocation and geolimitation of these foreign orders in practice,
at least by some big tech companies, or whether, on the other hand, no, in fact,
courts are pressuring even Google to block search results throughout the world.
And then the next question is, if Google is indeed doing this,
(29:23):
is it something that might be reasonable to demand of other social media, or,
excuse me, Google is not quite a social media platform here, but
other Internet companies?
Saying, look, do what you can to try to make sure that foreign restrictions
don't bleed over onto American users, and perhaps vice versa.
>> Jacob Mchangama (29:46):
Yeah, so
an interesting example of this is that
shortly after Russia's invasion of Ukraine,
the European Union put a number of state-sponsored
Russian media outlets on a sanctions list, which meant that
(30:06):
their right to broadcast within the European Union was banned.
But it also told Google and social media platforms
that they had to de-index search results and
remove content, taking away the ability to spread these state-
(30:27):
sponsored Russian media outlets' content on their platforms.
So this was something that Google implemented within
the European Union but not in the US.
So content that was part of this order did not affect American users,
but it did affect European Union users' ability to access Russian state-sponsored media.
(30:54):
It's still in place.
And the sanctions list has been expanded since then.
And there are various other cases.
I mean, sometimes, and I guess it differs from platform to platform,
sometimes the platforms will fight these orders, looking at,
(31:16):
for instance, international human rights norms or even national laws.
And I guess, maybe, they will also look at, is this a democratic country or not?
And I'm sure they also take into account market and business interests.
(31:36):
So I don't have sort of a good overview of
whether there's a consistent line in this.
But you can go and look; some of the various platforms have transparency
reports, and that's also a requirement under the Digital Services Act.
(31:59):
But the data is not always very useful, because it
doesn't give you all the details.
[COUGH] So you're absolutely right that this geo-blocking is a feature.
One of the problems with this can be if you say, well,
(32:20):
we don't want to fight with the European Union over hate speech, for instance.
So what we will do is we will just adopt
a hate speech policy which is more expansive,
more restrictive, than what follows even under German law.
(32:41):
Germany probably has the most restrictive hate speech law in Europe.
So the response of YouTube or Facebook could just be to say,
we don't want to have these running battles.
So instead we just say our hate speech policies are just much broader.
(33:02):
And that is where we were, at least until very recently,
when we saw that Facebook has changed its hate speech policy.
It's no longer called a hate speech policy, it's now called hateful
conduct, and has become, I think, slightly less restrictive.
(33:25):
But that's the pernicious consequence of this: you're
incentivized to just say, well, we don't want to risk not being in
compliance with the Digital Services Act or national laws, because
under the DSA you could risk fines of up to 6% of global turnover.
(33:45):
Under the NetzDG, this German law, I think fines were up to 50 million euros.
So if you're Google or Facebook, do you want to go to bat for neo-Nazis, or do you
want to be good friends with the European Commission and the German government?
You probably want to be good friends with the latter.
(34:06):
And so you just adopt more speech-restrictive terms, and
then you err on the side of removal.
That makes more sense.
So that's one of the dangerous consequences of this, I think.
>> Eugene Volokh (34:25):
Great, thanks very much.
So there are some questions that have come in, and
one of them is also a question I had, so I'm particularly pleased to ask this.
So, it looks like European governments are pressuring social media
platforms into doing certain things that affect Americans.
(34:49):
One logical entity to try to step in and
protect the rights of Americans would be the US government.
And there are at least two ways they can do that.
One is they could try to order the platforms not to comply with European law.
That could put the platforms in a difficult position.
But maybe if we push harder than the Europeans, they'll go our way.
(35:12):
Now, given the NetChoice decision from last year,
it seems likely that, at least as to certain things, the platforms could say,
hey, we have a First Amendment right to remove certain material,
at least from the main feeds.
Maybe it's different as to whether they can remove it from users' own pages, but we
(35:32):
have the right to not, kind of, promote certain materials as part of our main feeds.
And just because Europeans are pressuring us to do this doesn't take away our right
vis-a-vis the US government to do this.
So but maybe there's still some room for the US government to act.
But the other possibility,
of course, is the US government can try to pressure the Europeans and say,
(35:56):
look, this is not acceptable to us, and we will use what leverage we have to say,
look, you can't pressure US companies to do that.
At the very least, you have to accept when US companies try to geolocate,
try to make sure that they follow your orders only
(36:19):
in your countries and not in our country.
And maybe you shouldn't be pressuring them even in other ways, because that's
contrary to our public policy.
You're trying to enforce your policies,
we're trying to enforce our policies about protecting Americans' free speech rights.
So let's see who's tougher, let's see who's got more weight to throw around in
(36:41):
this kind of situation. Or, perhaps another way of putting it,
let's see if we can work out an arrangement, a deal of some sort.
Is your sense that something like that is happening?
Is there some agency within the federal government that is looking out for this?
Or is it something that the government, either under the Biden
administration or the Trump administration, has had little interest in?
>> Jacob Mchangama (37:02):
I think that
under the Biden administration,
I think there was a time, especially after 2016,
when there was a real paradigm shift in how social media,
the role of social media, was viewed in democracies: that
they were no longer these positive forces for
(37:25):
uninhibited and robust speech supporting democracies.
They were instead these malicious actors that spread disinformation and
hate speech, undermining democracy.
I think that, especially among Democrats,
there was a view that more needed to be done.
(37:47):
And I think they were envious, maybe.
[LAUGH] I remember Hillary Clinton congratulating
the European Union when the DSA was adopted.
On the other hand,
it's quite clear that Republicans have sort of had different gripes.
(38:07):
They have tended to say,
well, these social media platforms censor conservatives.
And I think it's quite clear, you saw that partly in JD Vance's
speech in Munich, that the Trump administration thinks that
European governments are restricting free speech in ways that are,
(38:30):
to quote Vance, shocking to American ears.
And they also view not only the free speech restrictions,
but generally, I think, the European Union's attempt to
regulate US tech companies as essentially a move,
(38:51):
a geopolitical move, that they want to counter.
I'm not sure how much free speech is the driving force of this, given what
the current administration itself is doing on free speech internally within the US.
I don't think it has a particularly strong position in lecturing
(39:13):
the Europeans, given what has been going on in the past 100 days.
But there have definitely been these noises coming
out of the administration that laws like the Digital Services Act
are an attempt by the European Union to silence free speech within America.
(39:38):
But you're also right that in many countries there's been this
other complaint, that America is colonizing, or
engaging in a sort of cultural imperialism with, its free speech norms,
imposing them on the rest of the world.
And I think there's some truth to that.
(39:59):
I think it's a particularly benign form of cultural
imperialism that I wholeheartedly support.
So for the past 80 years or so, the US has not had a perfect record.
But if you go back to 1941, to FDR's Four Freedoms speech, he talks
(40:21):
about these four freedoms that he sees as the basis of a new global world order.
And the very first one is freedom of speech for everyone,
everywhere in the world.
And his widow, Eleanor Roosevelt, fought a very principled, laudable fight for
speech-protective standards in international human rights conventions,
(40:46):
fighting bitterly against Soviet attempts to include bans on hate speech and
laws against disinformation. And I think, generally, America, as I said,
has had free speech as an important part of its foreign policy.
And I think that up until 10 or 15 years ago,
(41:09):
Europe, Western Europe, and the US were aligned.
The differences between the US and the European free speech traditions were
relatively minor when you compared them to other parts of the world;
yes, we might ban Holocaust denial, we might have restrictions on
free speech, but both agreed:
(41:31):
you have the right to criticize the government,
you shouldn't have political prisoners, and so on. And so the US and
Europe were essentially fighting the same fight.
And then they could nibble around,
they could have minor differences on where to draw the line,
but those differences have become much more significant
(41:54):
now in the age of social media and the online world.
And so I think the US and Europe have drifted further apart on those standards and
how to regulate them.
>> Eugene Volokh (42:08):
Makes sense,
I think you're quite right on that.
There are a couple more questions. One of them I think I know the answer to;
let me offer it and then see your reaction. It may be a relatively simple
thing to answer, but actually important to keep in mind.
So the question is, how can any country enforce its court orders in another
country where it has no jurisdiction and no enforcement mechanisms?
(42:32):
In effect, aren't the rulings in many countries merely an exercise in futility,
due to the cost to enforce them?
If it is possible to do so,
has the US agreed to enforcement of the Digital Services Act?
And my understanding is the answer to that is chiefly that many of these large
social media platforms have assets and employees in foreign countries.
(42:52):
So, yes, if Denmark decides to go after me because I
had posted something that it disapproves of,
it can't really do much except make it very dangerous for
me to visit Denmark, maybe even to visit the rest of Europe, but still not that much.
(43:14):
On the other hand, if Facebook, Meta, has an office in Denmark, or
Google has an office in Denmark, then they could say,
unless you do what we want in America, we will seize your assets in Denmark or
elsewhere in Europe where our writ does run, or maybe even arrest your employees.
Am I right in understanding that?
>> Jacob Mchangama (43:33):
Yeah, yeah, no, and
this has happened in India, I think X,
Google and Facebook, at least two of those, have had
the situation where their offices have been raided by police and
employees sort of detained or arrested.
(43:55):
Many countries have what you might call hostage laws.
So you can only operate in that country if you have a physical presence there.
You need to have a designated CEO of a kind.
And this is obviously a way to exert pressure,
which makes the enforcement much more efficient.
(44:16):
Rather than just writing a letter to a headquarters in Palo Alto, where
Mark Zuckerberg or whoever can just say, well, what are you going to do about it?
How many battalions do you have?
So that's definitely something that has become a feature.
(44:41):
And this is also why countries like Brazil,
the largest democracy in Latin America, a significant market for
social media companies, the European Union, a huge market, and
India, of course, the largest democracy in the world, have much more pull.
My father is from the Comoro Islands in East Africa,
(45:05):
a small island nation with maybe 500,000 inhabitants,
one of the poorest countries in the world.
If the Comoro Islands were to adopt a Digital Services Act,
I don't think that Mark Zuckerberg or Elon Musk or
anyone else would quake in their boots;
(45:26):
they would not comply with anything coming out of it.
So obviously, the more geopolitical muscle you have as a country,
the better your bargaining chips and the better options you have for
sort of trying to enforce your own laws, even vis-à-vis
the big global tech companies.
(45:48):
And remember that Elon Musk, self-professed free
speech absolutist, for a while was defying Brazil,
but he ended up essentially caving and
sort of giving in to demands from the Brazilian judiciary.
(46:09):
So when you have these hostage laws,
to be a bit polemical, in place, and
when social media companies have significant assets in a country,
those laws suddenly have much more bite and teeth.
>> Eugene Volokh (46:27):
Right, well,
so another question asks whether
there may be some technical solutions to this.
Although the interesting question is what the market will think of these
technical solutions.
So let me read it.
Governments and organized interests seem likely to pursue their political goals
by seeking speech suppression from centralized content moderation.
(46:49):
Because, yeah, if you either jawbone or threaten retaliation,
threaten arrest, against the management of a company, then you can get quick results.
So the question is, are decentralized options possible?
For example, Bluesky is based on protocols,
(47:11):
not platforms; I believe Mastodon was framed much the same way.
The model looks toward competition over content moderation rather than centralized
moderation.
What do you think about the prospects of the Bluesky model of social media?
>> Jacob Mchangama (47:28):
Yeah, I think
decentralization is an important step
forward.
But it's interesting: Bluesky, I think, recently complied
with court orders from Turkey to block a number of accounts in Turkey.
And of course, Turkey is a country that has very,
(47:49):
very restrictive limits on speech.
One where President Erdogan has been leading a very illiberal country,
which is now, I think, turning fully authoritarian, and
where he has often used restrictions on online speech to cement his rule.
(48:09):
So that means even Bluesky is not immune to this development.
But I do think a more decentralized ecosystem
of social media is a way forward, and
also one where maybe users have more control over what
(48:29):
type of content they want to be confronted with.
Because, for instance, when you look at the more
speech-restrictive policies of US companies,
it's not only because of pressurefrom foreign governments in Europe.
It's also that a lot of Americans disagree about where the limits should be drawn.
(48:53):
Various interest groups in the US say, hey, we don't like anti-Semitic speech.
We don't like Islamophobic speech.
Facebook, you should remove this.
Facebook, you should remove that.
But if you provide users with more control over what type
of content they want to be confronted with,
maybe if you allow them to curate their own third-party-developed feed,
(49:18):
then you reduce the demand for centralized and
more restrictive content moderation, and the supply will maybe also drop.
I think that's an interesting and not implausible theory.
The central problem, I guess, is that most people use social media,
(49:39):
not for political purposes, not because they're heavily
invested in debates and controversies like this, but
because they wanna connect with family or share cat videos.
And so for them, the ease of centralized platforms is really appealing;
(50:00):
it would take a lot for them to give that up.
You really need to be heavily invested to sort of say, well,
I want to do away with the ease of having everything in front of me and
opt instead for various decentralized models that require
a lot more on the part of the user.
(50:20):
And that, unfortunately, I think is an impediment to this.
But of course, there are technological innovations, and I think a lot of people
are working on this, to sort of say, we can develop the best of both worlds.
We can have decentralized social media platforms where it is also
(50:42):
much easier for users to avail themselves of all the benefits.
But it is a huge ask of people who just use them to connect with friends or
family, who are not particularly interested in more political speech.
>> Eugene Volokh (51:01):
Great, thanks very much.
So there's another question that is a little tangential to this,
but like all tangents, it touches, right?
And I think it's maybe relevant; it's about Section 230.
So America has this broad protection, beyond probably what
(51:24):
is required by the First Amendment, under Section 230,
which basically says that social media platforms are,
generally speaking, not legally responsible for
material posted by their users that is defamatory or invasive of privacy.
(51:49):
And as a result, there's a lot more posted
that is defamatory or invasive of privacy.
In theory, you can go after the people who posted it, but
in practice it may be very difficult.
My understanding is that that is not the norm in many other countries, and
(52:11):
that in fact you can go after the platform if you can show
that something is defamatory.
Although, of course, that may lead to over-chilling, as the platform
tries to avoid litigation risk by proactively removing too much.
What's your view of Section 230, and whether it makes sense both in the US and
(52:34):
whether it's something that other countries should be adopting as well?
>> Jacob Mchangama (52:40):
Well, I'm just
finishing revisions of the manuscript of
the new book on the future of free speech, which I'm co-authoring with Jeff Kosseff,
who wrote the treatise on Section 230.
So, not surprisingly, I buy into Jeff's, I think, convincing argument for
why Section 230 really has been essential to online free speech, and
(53:06):
is one of the reasons why the US has
been a world leader, completely dwarfing Europe, for
instance, when it comes to innovation in
platforms that depend on user-generated content.
(53:26):
That would be very difficult if you did not have those protections in place.
Imagine trying to operate a Facebook
without Section 230: you'd be heavily incentivized to try to
remove much more content, and that would sort of defeat the purpose
of platforms that depend on user-generated content.
(53:50):
Now, both the E-Commerce Directive in the European Union and
the Digital Services Act do shield platforms, to
a certain extent, from intermediary liability.
But it basically says that if you have received a notification under
(54:11):
the Digital Services Act, which could be from a national authority
but could also be from a trusted flagger, about illegal content,
then you should expeditiously remove it.
And if you do so, then you're not liable.
But also, it's not like there's a general monitoring obligation.
(54:35):
So the DSA does not mean that Facebook or
X has to, ex officio, monitor all content on their platforms;
only when they've been made aware of illegal content
do they have an obligation to examine it and
(54:55):
then remove it if it's illegal.
So there is a qualified protection, but not as extensive as Section 230.
So I think Section 230 is a better
way forward than the alternatives.
(55:16):
But I think it's important for free speech advocates
to acknowledge that it also means that a lot of ugly
stuff is put out there that can have harms and costs.
I'm generally very skeptical of laws against disinformation.
(55:37):
I also think that the threat from disinformation has been hyped into sort of
an elite panic.
But it's also true that disinformation can at times lead to very serious harms.
It can be a serious problem for democracies.
It can make collective action much more difficult,
(55:57):
if populations are cleaved into two polar-opposite
blocs that just don't share any set of facts.
I happen to believe that when I look at the harms and
the costs, or the harms and the benefits,
(56:18):
I come down on the side where I think the benefits outweigh the harms.
And when you think about, if we legislate, what will be the consequences of that?
Will a law make polarization go away?
(56:40):
I think that's quite naive.
I think that's a cure worse than the disease.
I think James Madison wrote beautifully about that in the Report of 1800,
criticizing the Sedition Act.
He recognized that free speech comes with these harms, but
that the effects of censorship, if you like, are much worse.
(57:01):
So that's where I come down on it.
>> Eugene Volokh (57:03):
Right,
well, very helpful.
Thanks very much.
Let me just close with one last question, which also is a question I have, but
one that builds off of a question asked by a member of the audience.
It's basically, what are we gonna do about it?
The question is, are there active discussions about any proactive or
affirmative approaches to push back against this pernicious threat?
(57:27):
So you've identified, I think, a very serious problem.
What's the solution?
>> Jacob Mchangama (57:33):
Yeah, so I think,
let's take a look at the Digital Services Act.
It would have been interesting if the Digital Services Act had focused on
transparency and researcher access, for
instance, rather than sort of notice-and-action and systemic risk.
That would have allowed us to sort of say, okay, we have this presumption among
(57:56):
some, that social media platforms are awash in illegal content and disinformation.
But is it true?
To what extent is it true?
So I think transparency is something that can
help us have a more qualified discussion about these issues.
(58:16):
Also, as we discussed earlier,
I think decentralization is part of the solution.
Daphne Keller and Francis Fukuyama have written about the potential of middleware.
The idea is that you have third parties coming in and
saying, we can develop content moderation systems that users may opt into, and
(58:42):
then give users more options, and that could then be adopted
by Facebook rather than Facebook being in charge of it.
This is another model that I think could be interesting.
But I think essentially we need to uphold a strong culture of free speech.
We have to be really aware of the benefits.
(59:05):
I think we tend to take the benefits of free speech for granted.
I think, look at what's going on in Brazil.
I think that's quite frightening.
Look at what's going on in India, and also, frankly, in Europe.
Think about the fact that recently a journalist from a
(59:25):
right-wing newspaper was sentenced to seven months' suspended prison and
fined for a doctored meme of the Interior Minister
of Germany holding a sign saying "I hate free speech."
And this interior minister is someone who has actively reported several
(59:48):
people to the police for violating speech restrictions.
I think that is a dangerous development, and I think even more so
the fact that authoritarian states around the world are obsessed with
controlling the online sphere.
This goes for China, it goes for Russia.
(01:00:11):
And I think the fact that these regimes are so obsessed with controlling
the online sphere tells you something very significant.
That is, authoritarian states know full well
the benefits of online free speech to a free and open society.
And I don't think democracies or
their populations should lose sight of that.
>> Eugene Volokh (01:00:35):
That's an excellent
closing for our conversation.
Jacob, thank you so much for joining us.
Eryn, thank you so much for introducing us and helping organize this.
And thanks more generally to the Hoover Institution for putting all this together.
(01:00:57):
So Jacob, many thanks, and I much look forward to many further conversations.
>> Jacob Mchangama (01:01:02):
Likewise.
>> Eryn Tillman (01:01:03):
Thank you, Eugene,
thank you, Jacob.
What a great discussion.
We really appreciate you, the audience, for your participation and good questions,
and the events team, for all your work to put this together.
And I wanna let the audience know that this recording will be available
on the Hoover event webpage in about three to four business days.
And our next webinar is only one week away.
(01:01:23):
It will focus on what the academy can do to build strategic competence,
with a particular eye on the importance of reinvigorating history at the
post-secondary level.
Hoover Fellow Stephen Kotkin will moderate a conversation with Lt. Gen.
H.R. McMaster next Wednesday, May 7th, from 10 to 11am Pacific Time.
And you'll find in the chat a link to our RAI webinar series webpage.
(01:01:48):
You can visit that to sign up for the next session,
access recordings of previous webinars, and
subscribe to our RAI newsletter to receive updates on upcoming events.
Have a wonderful rest of your day, and thank you again for joining.
[MUSIC]