
February 29, 2024 · 52 mins

Unlock the mysteries of online speech regulation with Kabrina Chang and Marshall Van Alstyne as they dissect the debates at the heart of Moody v. NetChoice and NetChoice v. Paxton. This conversation cuts through the legal jargon, promising to enlighten you on the intricacies of Section 230 and its pivotal role in the growth of social networks. Delve into the very fabric of digital communication law, as we peel back the layers of content moderation and explore the protection it affords to both social media companies and their users.

Imagine a world where every tweet, comment, or post you see is unfiltered—this episode contemplates such a scenario as we scrutinize the possible outcomes of a high court ruling concerning free speech on social media platforms. Our guests, armed with years of expertise, help us navigate the choppy waters of private enterprise versus public discourse, dissecting the responsibilities platforms hold as modern public squares. The session also reveals how adopting decentralized marketplaces and user-selected filters could empower listeners and revitalize the marketplace of ideas.

As we sail towards the horizon of this thought-provoking journey, we confront the looming question: Should social media be regulated as public utilities or should they remain private entities with editorial control? This episode doesn't just analyze the potential legal ramifications; it also evaluates the economic principles that could redefine governance in the digital realm. Our experts, Chang and Van Alstyne, provide an enlightening perspective on the delicate balancing act that lies ahead, ensuring the integrity of our online communities while fostering a robust environment for free speech. Join us in this riveting discussion that promises to shape your understanding of today's digital society.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
J.P. Matychak (00:15):
Hello everyone, and welcome to another episode of the Insights at Questrom podcast. I'm JP Matychak. Joining me, as always, is my co-host, Shannon Light. Shannon, how are you? Ah, I forget to bring you up. How are you doing, Shannon?
I'm good, thank you.
Excellent. Well, I know we're both excited for this topic. It's incredibly interesting, with wide-reaching implications, and

(00:39):
incredibly timely, which is always good. So today's topic is an important one, in the sense that it can have a lot of implications for online speech. Earlier this week, the United States Supreme Court heard oral arguments in the cases of Moody v. NetChoice and NetChoice v. Paxton. The two cases stem from disputes over Republican-backed laws in

(01:04):
Florida and Texas, passed back in 2021, that looked to restrict social media companies from moderating content on their platforms. NetChoice is one of several tech groups that represent some of the world's largest platforms, like Facebook and X. They filed lawsuits against the states, citing these restrictions as violations of their First Amendment rights.

(01:26):
So we're joined by two great guests to help us make sense of these two cases and the potential impact of the SCOTUS decisions. Kabrina Chang is Clinical Associate Professor of Markets, Public Policy and Law. Professor Chang's research focuses on employment matters, in particular social media and how that impacts employment and

(01:47):
management decisions, and on corporate social advocacy. Her work has been published in academic journals, in news outlets such as The New York Times, Bloomberg, and The Boston Globe, and in magazines such as Forbes and Harvard Review. Kabrina, welcome back to the show.

Kabrina Chang (02:01):
Thank you for having me.

J.P. Matychak (02:03):
So also joining us is Marshall Van Alstyne, Allen and Kelli Questrom Professor in Information Systems. He is one of the world's foremost experts on network business models and co-author of the international bestseller Platform Revolution. He conducts research on information economics, covering such topics as the economics of speech markets, platform

(02:25):
economics, intellectual property, social effects of technology, and productivity effects of information. He's been a major contributor to the theory of two-sided networks, taught worldwide, and to the theory of platforms as inverted firms, applied in antitrust law. More recently, Marshall was the recipient of a $550,000 grant from the National Science Foundation to study

(02:46):
misinformation and technology-aided societal structures to decrease the adverse impacts of fake news.

Marshall Van Alstyne (02:53):
Thank you, JP. Glad to join you.

J.P. Matychak (02:57):
Excellent. So I want to start by talking a little bit about a foundation, if we could level set on a topic that came up in a cursory way in the oral arguments. It may not apply here immediately, but I think it's important just to level set on where we currently are with the statute.

(03:18):
I want to talk about Section 230 of the Communications Decency Act. Kabrina, can we start with you, just to explain to us what Section 230 is, from a legal standpoint, from a statute standpoint?

Kabrina Chang (03:32):
Sure. So Section 230, as you said, is part of the Communications Decency Act. The CDA was passed in the mid-90s, essentially to combat child pornography online, and while Congress was debating what the wording of the CDA would be, at the same time

(03:55):
there's this case happening called Stratton Oakmont v. Prodigy. If our listeners have ever seen The Wolf of Wall Street, that's Stratton Oakmont; it's a story about Stratton Oakmont. So it was back when there was Prodigy, and Prodigy had a chat room, a financial services investment chat room,

(04:17):
and people had posted on that chat room comments about Stratton Oakmont that Stratton Oakmont thought were defamatory, like that they were frauds and criminals, which, you know, we later learned a lot about them. So Stratton Oakmont sued, saying: Prodigy, you're liable for this. And Prodigy defended, saying: I'm not liable for this.

(04:40):
We didn't post it, and it's third-party content. This was before 230 was around, so this was all happening at the same time, in a short period of time. And the court said: well, hold on, Prodigy. Actually, you're more than just a mere platform. Your user agreement says that you reserve the right to take

(05:02):
down posts that are harassing or insulting, so you are exercising editorial control. If you are getting the benefit of editorial control, you're going to get the risk of editorial control, and that risk is that you're going to be liable for defamation. So that case came down, and legislators were, you know, worried that we would never grow this social

(05:26):
media industry if every platform had to be worried about being liable for everything that was said on their platform. So the CDA was written, again, all in a very short period of time. When the CDA was passed, the ACLU and a bunch of librarian networks sued, saying the CDA violated the First

(05:51):
Amendment, because you can't tell people what they can and cannot post. So the CDA was essentially gutted, except they amended it to include Section 230. It was gutted of all of the restrictions regarding pornography and obscenity and stuff like that, but the liability protection for social media was the one

(06:14):
thing that was preserved, essentially in reaction to the Stratton Oakmont case. So it has that sort of a little bit of a dramatic history, but 230 is really the only thing that has survived from that. Yeah.

Shannon Light (06:29):
And Marshall, you've done quite a bit of research lately that's talked about Section 230. What can you tell us about the impact it's had on these platforms and the companies that own them?

Marshall Van Alstyne (06:38):
Well, the impact. There are two sections, or two portions, of Section 230, two different components. One of them is that they're not liable for what users post. That's the first portion of it. But they're also not liable for their own editorial decisions. That's allowed organizations like TikTok, like Facebook, like Instagram, to grow almost without bound, because, in some

(07:00):
sense, whatever happens on there, they're unrestricted and not liable for it. And you can also appreciate the magnitude of the task: how would you moderate 500 million messages a day? That's almost impossible. Maybe it's clear that this has enabled the internet economies we know of today. Especially, user-generated content is generated by users and the platforms are not liable for it. But at the same time, that also means we get a lot of information pollution and some things that they don't necessarily want to propagate. You're probably familiar, of course, with the whistleblower testimony of Frances Haugen before Congress, and perhaps Facebook was promoting speech that you wouldn't want to

(07:43):
promote. So it's another interesting question. They've been shielded, then, from their own editorial choices in a way that a newspaper would not be.

J.P. Matychak (07:52):
Very interesting. Okay, so that's a good foundation to take the conversation to these two particular cases. So let's shift gears a little bit. And, Marshall, I want to start with you, because I think it's important to lay the foundation as to what was happening in the world at that time, when these two laws were quickly pulled

(08:13):
together and passed. So if you could talk a little bit about the circumstances that brought about these two laws that were passed in Florida and Texas, and the specifics of these laws, before we shift to the actual arguments in these cases.

Marshall Van Alstyne (08:28):
Well, I will say you're asking the economist about the laws; I would defer to Kabrina on the law. But to give you a little bit of context, a lot of this happened in the wake of the insurrection, and a lot of it after Trump got deplatformed. So to give you a little more specifics: in Texas, the law is basically trying to ban viewpoint discrimination.
A lot of folks on the right feel that conservative voices have been suppressed, that they have been censored, and in fact, as

(08:57):
some evidence, once Elon Musk took over Twitter, they released the Twitter Files, which show that in fact there had been some suppression of conservative speech. In particular, it was the New York Post story around Hunter Biden's laptop; it had been suppressed on Twitter and Facebook as a potential Russian disinformation campaign. Similarly, in Florida, after Trump got deplatformed, that law

(09:20):
was written in such a way as to make it extremely difficult for platforms to deplatform politicians. The law is written such that you can't deplatform either journalistic enterprises or folks running for office. So again, in effect, if you want to misbehave, run for office and you can't be deplatformed. There's an interesting element:

(09:40):
the upshot is what's now before the Supreme Court. It's this interesting question of whether platforms should be treated as common carriers, like AT&T, which has to carry everything and for which it's not liable, or be treated more like publishers, which make editorial choices and therefore bear the consequences of those editorial choices.

(10:01):
At the moment, they have editorial choices but no liability, and so there's the question again: should we treat them as common carriers or as publishers? That's the core of the issue.

J.P. Matychak (10:12):
So let's now fast forward to these two cases. We have NetChoice, the group that's representing a number of different tech companies, filing a lawsuit against the attorneys general of Florida and Texas, citing these laws as unconstitutional, in violation of First Amendment rights. Can you talk to us, Kabrina, a little bit about the central

(10:36):
arguments in these cases andparticularly around this free
speech issue?
And yeah, I'm going to play aclip, but I want to do it after.
Maybe we chat first, ok sure.

Kabrina Chang (10:49):
So both parties are making First Amendment claims, the states and NetChoice. Essentially, the states are saying that social media is a public square, and as a public square, like Boston Common or a public street, you cannot censor

(11:10):
information based on political view, and a bunch of other things. Interestingly, there was a case from 2017 in the US Supreme Court called Packingham v. North Carolina, in which North Carolina passed a law restricting registered sex offenders from accessing social media.
They addressed the First Amendment, but only insofar as

(11:36):
saying the law in North Carolina was written so broadly that it's unconstitutional. It's not narrowly tailored to achieve the goal that they want to achieve, a goal that everyone agreed was a good one.
However, there's some interesting language in the Packingham case that says it's not just a First Amendment right to speak on social media, it's a First Amendment right to

(11:58):
access it; that everyone, most people in 2017, get their news, communicate, learn, and participate in the marketplace of ideas there, even if it's texting ridiculous things or tweeting whatever kind of extreme viewpoint you might have.

(12:18):
So that case is out there, and it's a pretty important case for this. So the states are saying social media is essentially a First Amendment space. The companies, however, are saying: no, we're not, we are private companies. And NetChoice would take issue with Marshall saying it's censorship.

(12:39):
It's not censorship, it's editorial control, because only the government can censor. So the companies are saying: we are private businesses, we can exert editorial control. And PS, if you're passing a law that tells us we have to post things, that's a violation of our First Amendment rights. It's called compelled publication.

(12:59):
You cannot compel speech.
We are private individuals withFirst Amendment rights and
forcing us to say things throughour business is a violation of
our First Amendment rights.

J.P. Matychak (13:11):
So I want to touch on this public square thing, and I want to go to a clip from Justice Jackson.

Justice Jackson (13:19):
Back for a minute on the private versus public distinction. I mean, I think we agree that the government couldn't make editorial judgments about who can speak and what they can say in the public square. But what do you do with the fact that now, today, the internet is the public square? And I appreciate that these companies are private companies.

(13:43):
But if the speech now is occurring in this environment, why wouldn't the same concerns about censorship apply?

NetChoice (13:53):
So two reasons, Your Honor. I mean, one is, I really do think that censorship is only something the government can do to you, and if it's not the government, you really shouldn't label it censorship. It's just a category mistake. But here's the second thing. You would worry about this if websites, like the cable

(14:14):
company's internet, had some sort of bottleneck control where they could limit your ability to go to some other website and engage in speech. So if the way websites worked was somehow that if you signed up for Facebook, then Facebook could limit you to only 19 other websites, and Facebook could dictate which 20 websites you

(14:37):
saw, then this would be a lot more like Turner.

J.P. Matychak (14:43):
So directly to your point, right? Yeah. And Marshall, your thoughts on this, and this categorical mistake? To both of you, actually: it's not censorship, it's just a categorical mistake?

Marshall Van Alstyne (14:57):
He's relabeling a rose here. It's just one form of censorship by another word, whether it's done by a government or a private institution, so I wouldn't accept that argument at all. The other thing we have to be careful about is that in so many of these cases you're always encouraged to do counter-speech. So in an earlier era, what would you have done? You would have set up your own printing press to reach

(15:18):
a separate audience. The problem is, in this internet economy we're now dealing with network effects and other really strong monopolistic-style markets. So take, for example, the market power of anyone trying to set up a social network. Try to set up a competing social network now to reach another audience.
Another element that's somewhat different: influencers bring

(15:41):
their own audiences. If the platform is to interpose itself between you and your audience, that's a form of censorship. So it's not fair for him to reclassify it as a category error when, in fact, folks are simply reaching their own followers.
And so the challenge that they have in making that argument is that, yes, they're exercising their own First Amendment rights,

(16:04):
but they're exercising their First Amendment rights over your First Amendment rights, and that's the challenge. You still ought to be able to reach the audience that you brought to the platform.

Shannon Light (16:16):
So let's continue this conversation on free speech and listen to two clips, the first from Chief Justice Roberts, and the second from Justice Kavanaugh, talking about the First Amendment.

Chief Justice Roberts (16:31):
So you began your presentation talking about concern about the power, market power, and ability of the social media platforms to control what people do. And your response to that is going to be exercising the power of the state to control what goes on on the social media

(16:53):
platforms. And I wonder, since we're talking about the First Amendment, whether our first concern should be with the state regulating what we have called the modern public square.

Justice Kavanaugh (17:09):
In your opening remarks, you said the design of the First Amendment is to, quote, prevent suppression of speech, end quote. And you left out what I understand to be three key words in the First Amendment, or in describing the First Amendment: by the government. Do you agree? By the government is what the First Amendment is targeting.

State of Florida (17:33):
I do agree with that, Your Honor, but I don't agree that there is no First Amendment interest in allowing the people's representatives to promote the free exchange of ideas. This Court has recognized that as a legitimate First Amendment interest.

Shannon Light (17:50):
So, Kabrina, based on what we're hearing, how might the High Court's ruling reshape the legal landscape of free speech and online speech?

Kabrina Chang (18:00):
That's a great question, and really difficult to answer. Because if the states win, and social media platforms are a public square that has to be treated like government property, quasi-government space, what does that mean for other

(18:26):
businesses? It's a slippery slope.
Where do you stop? And social media: we on this podcast might be thinking Facebook and WeChat and Instagram and X, but social media has a definition, about creating a user profile, being

(18:46):
able to post things, being able to communicate. So that could be a lot of things. That could be some websites, retail websites, where you can post reviews and interact with other people who posted reviews. So it's a significantly slippery slope when you think about what they are talking about when they say social media.

(19:08):
Because that's one of the problems with the Florida law: the Florida law is so broad that the justices are wondering if it includes Uber, and can Uber not pick up a customer because of what they think the customer's political views might be? So that is something that really would have to be very

(19:29):
specifically worded and narrowly tailored. If NetChoice were to win,
and they're private companies, you do have this strange reality of a very small handful of companies sort of controlling what we see and, to a great extent, what we do and how we feel, and that is really an unsettling

(19:54):
feeling.

Marshall Van Alstyne (19:56):
So let me jump in with a thought on that. I think Kabrina has identified a genuine problem, and I think lost in this debate is the voice of the listener. All too often this is positioned as the free speech rights of the speakers versus the free speech rights of the platforms. But the listeners also have a right. It goes all the way back to Frederick Douglass, to people

(20:18):
saying slavery wouldn't stand up if people could actually talk about it. If you can't censor it, then you can endorse it, then you can support it; and the listeners, if they can hear contrarian voices, they can hear what they want. Then you get different outcomes, and the listener's voice has been left out of this. One of the things that John Roberts was talking about initially in your recording was the market power of the

(20:42):
platforms, and using the state to intervene to correct the market power of the platforms. The problem is both are wrong. We don't want the state to control our speech, and we don't want private enterprise to control our speech. So what I would argue as an economist is that we need to get back to a point where we can create decentralized marketplaces where no one is in control.

(21:04):
Let me give you one stepping stone toward a possible solution, though. It's partly involving jurisprudence, partly involving legislation. But imagine you, as a user, had the right to choose any filter that you wanted. You could choose BBC or Consumer Reports or Breitbart or Fox News, as you wanted. Then the infrastructure could be protected, but the users can

(21:28):
choose the filters and the algorithms that they want, and you could create a genuine marketplace of ideas where the users are choosing those things, and not the government, and not Mark Zuckerberg, and not Elon Musk. You've got a balance of choices between the speakers and the listeners, creating a truer marketplace. So I think that might be one way, and again, I haven't heard

(21:49):
this as part of the discussion, but we need to elevate the rights of listeners and create a marketplace on top of these platforms. We can go into some of the details of how we do that, but I think that's a better approach to the problem.

J.P. Matychak (22:01):
So I want to... and I think that this is a bit of an underlying theme within this whole public entity, public utility, common carrier piece that kept being talked about. So let me play a clip real quick from Justice Gorsuch and his questioning of the state

(22:26):
solicitor from Florida.

Justice Gorsuch (22:31):
You've analogized to common carriers, and telegraphs in particular. Why is that an apt analogy here, do you think?

State of Florida (22:40):
I think it's an apt analogy, Your Honor, because the principal function of a social media site is to enable communication, and it's enabling willing speakers and willing listeners to talk to each other. And it's true that the posts are more public, but I don't think that Verizon would gain any greater right to

(23:02):
censor simply because it was a conference call. I don't think that UPS or FedEx would gain a greater right to censor books because it was a truckload of books as opposed to one book, and so the analogy is indeed apt.
And there's been talk of market power. Market power is not an element, I think, of traditional common carrier regulation, and indeed some entities that are regulated

(23:25):
as common carriers, like cell phone providers, operate in a fairly competitive market.

J.P. Matychak (23:31):
So I think this gets to a little bit of what you're talking about, Marshall, this whole notion of: are these news agencies? And look, I'm not a legal scholar, right, and I'm not an expert in misinformation and platforms. But as I listened to these arguments and heard this public utility versus

(23:52):
private entity, I really started to get confused a little bit, and a little bit just questioning my own thinking, to put it that way. Because I started thinking, well, if they really see themselves as editors, are we who post on

(24:14):
there all sort of freelance journalists, getting to post our stuff, but then they get to kind of say what we see, what you don't see, and take that editorial control? Or are they more like a facilitating platform for us to share these ideas?

(24:34):
And it does. You really can see the arguments on both sides of this issue. You really can. So, thoughts as you listened to this piece of the argument, the public utility versus the private company?

Kabrina Chang (24:58):
I don't think it's any of the above, and I think your confusion, or your struggle, is a legitimate struggle, right? You know, The New York Times or Breitbart, they have editors. Mainstream news media has fact checkers, and they have standards and everything else. TV has the FCC, so Janet Jackson gets fined. Social media is none of the above. And I really am interested

(25:23):
to hear Marshall's take on this. My take on this is: you don't have Verizon or The New York Times with engineers manipulating the platform every day. Like, every time we look at these websites, they are manipulating what I see. So it's not me using Verizon to call JP, where Verizon is neutral.

(25:46):
Technology here is not neutral the way Verizon is neutral. That's not what's happening on Instagram and Twitter. They are manipulating us every single time. They have engineers using neuroscience to keep us addicted, to keep us scrolling, to hit us with dopamine to keep us on there, because that's how they make money. The revenue model incentivizes us to stay on there, and the

(26:10):
divisiveness drives engagement. Engagement drives ad revenue. So it is not a neutral platform; it is not a highway; it is not a phone line. I think that is a misleading argument, and it ignores the very business model of social media as it currently is, which is what I think these laws are potentially trying to address in

(26:34):
some way.

J.P. Matychak (26:35):
You can talk about 'you're censoring me' and this and that, but I think that's core to some of this argument. That is how it is currently. You do have this model, but is that the model that should be?

Kabrina Chang (26:49):
Even the laws in Texas and Florida do nothing about the science behind the manipulation; they just address the content. So I just don't think it's an apples-to-apples comparison. And it is a legal argument that has been going around and around in other cases, where plaintiffs are trying to impose liability

(27:12):
on social media companies for physical injury that they've sustained from content online that has moved into real life.

Marshall Van Alstyne (27:22):
So let me jump in with two quick thoughts. The first thought is just picking up on the arguments of the attorney here. They're wrong about some of these arguments. They say there's no monopoly power. If you go back in the history of AT&T, they were broken up because they had monopoly power, and that was one of the reasons we got the common carrier rules. You want everyone to be able to have access. Kabrina is also completely right:

(27:44):
these are not neutral conduits. It's not like a phone line, it's not like a telegraph channel. They are indeed manipulating for private gain. There's a wonderful phrase describing what they do: if it's enraging, it's engaging. And they use the machine learning algorithms to promote the most enraging, most engaging content. They are allowing individual users to light fires so they can

(28:07):
pour on gasoline and sell ads while people watch as the neighborhood burns. So it's a rough model to work on, and they are in fact engaged in policing content. But back up a moment. We also do need somebody to police the content. We need them to take down terrorist recruiting, sex

(28:29):
trafficking, pro-suicide content. We need that kind of thing. Somebody needs to take responsibility for that. As an economist, I like to reposition this as a different kind of problem. I would call it a pollution problem, because of all these externalities. These are the damages that occur off-platform. It's insurrections that occur off-platform. It's lynchings that occur off-platform. It's the loss of herd immunity that occurs off-platform.

(28:51):
When Zuckerberg spoke before Congress, he said, and I quote, we didn't take a broad enough view of responsibility. Well, that's clearly an example of an externality. Now here's my explanation as to why we're at such an impasse, and the good news is it then leads to some solutions. My view of why we're at such an impasse is that we do need

(29:12):
content moderation, but the design of Section 230 is such that no one is responsible for the pollution problem and no one can be held accountable for it. If you're in print or broadcast, you are liable for your editorial decisions. In social media, you are not. You're not liable for the user's content, and you're not

(29:32):
liable for your editorial decisions.
So we have two proposals to try to fix this kind of problem, one of them legislative, one of them for the courts. The first is to make this decentralized, so it's not one individual party. I don't want Elon Musk, or Mark Zuckerberg, or the head of TikTok choosing with his machine learning algorithms

(29:53):
what I get. That's why I want a true marketplace in listener-decided algorithms and listener-decided filters. They could be open source ones that you yourself could modify, and then you'd get a true exchange. What you then need to do is to reattach the liability to the editorial decisions. So if we separate the infrastructure, of course we grant it complete immunity, the same way the common carriers

(30:15):
have it now, but we reattach the liability to the editorial decisions, so that if there are lies and defamation happening, someone can in fact be held liable. The technologist then objects, of course: how do you deal with the 500-million-messages-a-day problem? Once you see it as a pollution problem, it's actually really

(30:38):
easy. You hold them to a flow rate of accountability. If a factory is putting out dioxin, you hold it to a flow rate, not to every molecule. If a doctor checks your cholesterol, he or she takes a blood sample; they don't take every drop of blood. That's not possible. The way you do it is to hold them to a flow rate of pollution, and then you can deal with 500 million, a billion messages

(31:02):
daily. But if you reattach liability to editorial decisions, then we have someone that's responsible for the pollution problem, and we can in fact hold them accountable. And so I think this combination, in which we preserve the absence of liability for the infrastructure, as with common carriers, and we create a marketplace

(31:23):
on top where anyone can have the filters that they want, but those filters do attach editorial decisions on a flow rate, can actually help solve the problem. But this is a combination of court decisions and legislative decisions to get us to a marketplace that becomes self-cleaning, as opposed to having government or Elon Musk do it.
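As a rough, hypothetical illustration of that flow-rate idea: audit a random sample of the day's messages the way a doctor samples blood, rather than inspecting every message. The sample size, the violation classifier, and the cap below are illustrative assumptions for this sketch, not a proposed legal standard.

```python
import random

def audit_flow_rate(messages, is_violation, sample_size=10_000, seed=42):
    """Estimate a platform's 'pollution rate' from a random sample.

    messages: the day's posts (could be 500 million; we never scan them all)
    is_violation: a classifier or human-review function returning True/False
    Returns the estimated violation rate and a 95% margin of error.
    """
    rng = random.Random(seed)
    sample = rng.sample(messages, min(sample_size, len(messages)))
    violations = sum(1 for m in sample if is_violation(m))
    rate = violations / len(sample)
    # Normal-approximation 95% confidence margin for a proportion.
    margin = 1.96 * (rate * (1 - rate) / len(sample)) ** 0.5
    return rate, margin

def exceeds_cap(messages, is_violation, cap=0.001):
    """Hypothetical accountability rule: liable only when the estimated
    rate is confidently above an agreed flow-rate cap."""
    rate, margin = audit_flow_rate(messages, is_violation)
    return rate - margin > cap
```

Because the audit cost scales with the sample size rather than the total message volume, the same procedure works unchanged at 500 million or a billion messages a day, which is the scale objection the flow-rate framing answers.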

Shannon Light (31:43):
And I think this is a good time to listen to an exchange that Justices Alito and Kagan have with the attorney representing NetChoice.

Justice Alito (31:54):
The Florida law covered Gmail?

NetChoice:
The Florida law, I think, by its terms could cover Gmail, all right.

Justice Alito:
So does Gmail have a First Amendment right to delete, let's say, Tucker Carlson's or Rachel Maddow's Gmail accounts if they

(32:15):
don't agree with his or her viewpoints?

NetChoice:
They might be able to do that, Your Honor. I mean, that's obviously not something that has been the square focus of this litigation, but lower courts...

Justice Alito:
If they don't, then how are we going to judge whether this law satisfies the requirements of either Salerno or overbreadth?

(32:39):

NetChoice:
So it's, you know... again, I think it's the plainly legitimate sweep test, which is not synonymous with overbreadth. But in all events, since this statute applies to Gmail, if it applies at all, because it's part of Google, which qualifies over the threshold, and it doesn't apply to competing email services that provide identical services, that alone is enough

(33:02):
to make every application of this statute unconstitutional.

Justice Kagan:
I mean, how could that be? Go ahead.

Justice Kagan (33:08):
How could that be, Mr. Clement? It's not unconstitutional to distinguish on the basis of bigness, right?

NetChoice (33:14):
It is when you're regulating expressive activity. That's what this Court said in Minneapolis Star. So the statute in Minneapolis Star was unconstitutional in all its applications.

Justice Kagan (33:24):
You're saying: suppose there were no issue here of subterfuge, that they were trying to get at a certain kind of media company because of their views, and the only issue was it's not worth it to regulate a lot of small sites.

(33:47):
You know, we only want to go after the big sites that actually have many millions of users. You think that's a First Amendment violation?

NetChoice (33:58):
I do.
The way you're asking the question suggests you think that's a harder case than the one I actually have before you.

Justice Kagan (34:04):
I think it's a little bit of an impossible case to say you can't go after big companies under the First Amendment.

Shannon Light (34:13):
So, on that, Marshall, how do you see this classification impacting the way social media platforms moderate content?

Marshall Van Alstyne (34:22):
I'll be candid: as an academic, I have never liked thresholded tests of that sort, because you'll get really exotic behavior on either side of the boundary. You know, and I think that's perhaps not the... the bigness test alone is not a good test. What we would want would be the equivalent of market power

(34:42):
tests, of consumer welfare tests. Is harm occurring? Is there anticompetitive behavior? Is there suppression of speech? Those, I think, are better ways to look at the problem than bigness alone. So I think that argument is not a legitimate argument, Kabrina.

Kabrina Chang (35:01):
Yeah, and in much of First Amendment jurisprudence, size doesn't necessarily matter. And in fact, in other First Amendment cases, like Citizens United, you know, the Supreme Court was very clear in saying in their majority that this business has a First Amendment right, just like we human beings

(35:24):
have First Amendment rights. And if we're talking about money, as in Citizens United, the fact that they have more money than me is not part of the First Amendment analysis.

J.P. Matychak (35:35):
Okay, so I want to talk now a little bit about the options that are before the court. One of the things that seemed to be a recurring theme throughout the oral arguments was just how broad this was.

(35:56):
I want to play two clips, one from Justice Sotomayor and the other from Justice Barrett.

Justice Sotomayor (36:06):
This is such an odd case for our usual jurisprudence. It seems like your law is covering just about every social media platform on the Internet, and we have amici who are not traditional social media platforms, like smartphones and

(36:32):
others, who have submitted amicus briefs telling us that readings of this law could cover them. This is so, so broad. It's covering almost everything.

Justice Barrett (36:48):
So Florida's law, so far as I can understand it, is very broad, and we're talking about the classic social media platforms, but it looks to me like it could cover Uber. It looks to me like it could cover just Google search engines, Amazon Web Services, and all of those things would look very different. And Justice Sotomayor brought up Etsy. It seems to me that there are now...

(37:10):
Etsy has a feed recommended for you, right, but it also just has shops for handmade goods that you can get. It looks a lot more like a brick-and-mortar marketplace or flea market than a place for hosting speech. So if this is a facial challenge, and Florida's law indeed is broad enough to cover a lot of this conduct, which is

(37:34):
farther away from expression than these standard social media platforms, why didn't you then, in your brief, kind of defend it by pointing out: look, there's all this other stuff that's perfectly fine that Florida covers. We don't want, you know, some person who wants to sell their goods on Etsy to be suppressed because it's, you know,

(37:55):
handmade goods that express a political view.

J.P. Matychak (38:00):
So let's start. I have two questions, one being: what is a facial challenge? And then, two, let's talk through the options that the justices have with these two particular cases.

Kabrina Chang (38:15):
Sure, and I think they are related. So when you're challenging a law as unconstitutional, it generally takes one of two approaches: a facial challenge or an as-applied challenge. An as-applied challenge would be something like: hey, we're Instagram, and your law is unconstitutional as it applies

(38:38):
to our speech, and here's our speech; all this other stuff might be fine, but these sections, as they apply to me, are unconstitutional. And usually when you have a challenge like that, because I'm Instagram and I'm saying that as applied to me it's unconstitutional, I have some evidence and I have a record. If

(39:01):
you're doing a facial challenge, what that means is: you wrote this law, and this whole thing's unconstitutional. It could be sections of it too, but looking at this law, it's unconstitutional. And so you are just challenging the law as it's written, not as it actually is applied to a person, and that's fine. It's just a slightly different kind of legal argument, and it's

(39:23):
a different evidentiary record, as is so often the case with facial challenges. Well, in this case, I think what the justices are saying in terms of a facial challenge is: we don't have a lot of evidence about how these laws actually play out, because you're just challenging them as they were just written and passed.

(39:45):
We have no application to an individual person and how that impacted them in their life or their business. And so I think, listening to the oral arguments, several of the justices were a little bit concerned that they didn't have a tremendous evidentiary record on which to base a decision.

(40:06):
So, leading into the second question about how this could play out: one option is they're going to remand it. They're going to send it back to Texas and back to Florida and say, build up an evidentiary record, because we don't have enough here to make a good decision on whether or not

(40:27):
this law as written is unconstitutional.
I think also, what they're getting at with Florida is, when you're talking about the First Amendment, the law has to be what's called narrowly tailored. It has to address the speech that you're talking about; it has to address the speech that you mean and

(40:47):
nothing more. It has to be narrowly tailored to do just what you want it to do. And I think what Justice Kagan was saying was: this is not. This is overbroad. They kept mentioning overbroad. So they could remand it, and it sounded like some of the justices wanted it remanded so they could get a better evidentiary record.

(41:09):
I mean, the other thing they could do, like most appellate courts, is they could overturn the decision of the lower court for a variety of reasons. They could also remand it: they could make a decision and then remand it back down and say, have another hearing consistent with the decision we just told you, and the reasons we think you got it

(41:31):
wrong the first time.

J.P. Matychak (41:35):
As I understand it, right now these cases are before them because injunctions were sought by the groups, by NetChoice, and courts granted the injunctions until they could hear the case. So in this sort of going back down and getting more evidence, do the injunctions stay? Do they vacate the injunctions and

(41:58):
allow these laws to go into place so that they can get the evidence into the record? How do you see that potentially playing out?

Kabrina Chang (42:06):
Well, so they could still keep going with the facial challenge. You know, to get a preliminary injunction, one of the elements is showing a likelihood of success on the merits of the underlying case. So there's something there, but it's a likelihood; they have to show other things, but there's a likelihood of success, so it's not a ton of evidence.

(42:28):
So if they were to send it back down, I would think that they would still keep the injunction in place and ask for more development.

Marshall Van Alstyne (42:43):
Okay, one or two other quick thoughts on this. You can appreciate how hard this problem is. You know, the Texas and the Florida laws are relatively similar, and it's interesting that the Fifth Circuit and the Eleventh Circuit went in opposite directions, one upholding the Texas law and one rejecting the Florida law. So the circuit courts actually came to different decisions, and they're having a really hard time. The categorical issues here, you know, speech on Etsy, speech on

(43:05):
Uber, speech on these other things: are you doing viewpoint discrimination? Well, does this include pro-ISIS statements? Does it include pro-Israel statements? Do you really want states or governments making those choices? Those are going to be challenges they're going to face under First Amendment law that are really very hard to meet.

(43:25):
But then there are the broader questions of common carrier versus needing some sort of editorial control to police the truly illegal content, and who then has the right to do that. This is going to be a tough set of choices for them.

J.P. Matychak (43:42):
Yeah, and I think that's a good segue. Shannon, you had a final question you wanted to ask. Yes, well, quasi-final, because I have one more.

Shannon Light (43:51):
If these laws are allowed to stand, I imagine these decisions could have a far-reaching impact on the role of social media platforms in society. Can you maybe lay those out for us?

Marshall Van Alstyne (44:04):
Well, the consequences of one or another decision are immense. You know, you can play it out in a couple of different dimensions, and I'll be honest, as an economist I do not feel comfortable calling how the Supreme Court is going to rule one way or the other, so I don't want to make any predictions on that. But if you were to fork the decision: suppose, for example, platforms are ruled to be common carriers. Then they're going

(44:25):
to have to carry pretty much everything.
And interestingly enough, you know, there could be a number of responses. There's someone at Stanford who had a marvelous observation. They said, okay, one thing you could do is simply cut off service in Texas and Florida and see what happens. Or you could give them the firehose of porn and spam and pro-ISIS and pro-suicide content that they've asked for, because

(44:48):
that's literally what the law has required the platforms to do.
At the other extreme, if they are granted their rights to edit the speech, then they would continue pretty much unimpeded as before. It would be interesting to see. I would be surprised if there were no caveats whatsoever, although

(45:10):
there was another interesting pair of cases. There were Gonzalez v. Google and Twitter v. Taamneh earlier, you know, trying to do this, and the platforms were allowed to remain immune from accountability for terrorist speech, where recruiting and other things had taken place on their platforms. So under those decisions I imagine they might still be able to continue, because that's in some ways even more harmful than

(45:34):
some of the political speech at the moment that we've heard so far.

Kabrina Chang (45:38):
But also, and I wanted to add to what Marshall was saying: if the decision is whether or not social media platforms are a public square, you know, the government can still regulate speech. Right? The government regulates speech all the time.

(45:59):
We can't incite imminent lawlessness; there are hate crimes, I mean. And commercial speech is regulated. So it's not that it would be without regulation; it would just be the regulation that we see, which some people are not happy with. You know, I mean, the Westboro Baptist Church

(46:21):
protesting at funerals of veterans: there are very few people who are happy to see that, but that's what our First Amendment does. So it's not like there's no regulation. There's some, but it is not what Facebook and YouTube do

(46:43):
currently with taking down the torture and pornography and that kind of content.

J.P. Matychak (46:44):
And it seems like that was something that was brought up on occasion during the oral arguments, this whole notion that there are probably some areas of content where we can all agree we don't want that out there. But it goes back to this whole other thing: when it's not narrowly defined, how do you enforce

(47:05):
this?

Kabrina Chang (47:05):
And that's the part that... well, to Marshall's point, who is 'we' exactly? We can all agree, but I don't know; there's someone posting that. And with Marshall's idea of getting to pick our own filters, who is 'we'?

J.P. Matychak (47:21):
Yeah, so Marshall, as an economist, has abstained from taking a guess. You, as a lawyer and a legal expert, where do you think this falls? The likelihood of it being remanded or dismissed?

Kabrina Chang (47:43):
I listened to much of the oral argument, and the questions were all over the place, and there was absolutely no... I could see no consistency with sort of traditional conservative versus liberal ideology. And you know, the justices, with most of their decisions, it's always a mix-and-match; it's not really an

(48:05):
ideological divide on many of their cases. But I couldn't figure it out. It would not surprise me if it were remanded. I don't know if that's what they'll do, but I wouldn't be surprised. But that's really as close as I'm going to get.

Marshall Van Alstyne (48:22):
And let's say, I don't want to call what I think the Supreme Court will do, but I can say what I would prefer to see happen; that I'm entirely willing to take a stand on. So again, what I would like to do is just bear down on a couple of economic principles. I don't want monopoly coverage of speech, and a lot of social networks have undue market power, so I want to see decentralized

(48:43):
markets where no one has control. I don't want to see pollution problems in speech where no one's held accountable for lies and defamation. We need to restore some accountability for things that have happened. So, again, some of the proposals I would like to see put in place, and they may be legislative as well as judicial, would be to preserve immunity for the infrastructure, as we

(49:03):
have for the common carriers, but to restore some of the liability for editorial decisions, and do so, again, as we said, on a flow-rate basis, which then makes it easy to handle at scale, so you can actually then recover the scale issues. And so we get a true marketplace where it's decentralized, listeners can get the information they want, you

(49:23):
can get clean information, you can get the information that you want, and so no particular party is controlling it.
But we also clean up the pollution at the same time. So we solve the monopoly problem and we solve the pollution problem at the same time. That would be the economist's answer to this rather difficult challenge.

J.P. Matychak (49:42):
It certainly is. It's incredibly complex, and it's making us all, I think, think about, you know, what we say, how we say it, where we say it, and what the underlying motivation is for how these platforms operate. So I want to thank Marshall and Kabrina for joining us today.

(50:03):
Thank you both so much. This was great. I certainly learned a lot more; hopefully our listeners did as well, and Shannon as well.

Marshall Van Alstyne (50:12):
It's been a real pleasure. Thanks for having us.

Kabrina Chang (50:16):
Thanks for having me.

J.P. Matychak (50:18):
Well, that's going to wrap things up for this episode of the Insights at Questrom podcast. I'd like to thank again our guests: Kabrina Chang, Clinical Associate Professor of Markets, Public Policy and Law, and Marshall Van Alstyne, Allen and Kelli Questrom Professor in Information Systems. Remember, for additional information on this show, our previous shows, and additional insights from Questrom faculty

(50:39):
on the world of business, visit insights.bu.edu. For my co-host, Shannon Light, I'm JP Matychak. So long.