
October 1, 2025 62 mins
The European Union’s Digital Services Act applies to digital platforms and service providers offering services to users in the EU, regardless of where the company is based—including U.S. companies.
EU officials contend the Digital Services Act is needed to protect democracy from misinformation, disinformation, and hate speech online. Regulators in Brussels promise it will create a safer digital space by holding platforms such as Google, Amazon, Meta, and X accountable for policing these categories. Service providers that fail to comply risk fines of up to 6% of global annual revenue, restricted access to the EU market, or suspension of operations.
House Judiciary Republicans recently issued a report warning that European regulators could use the Digital Services Act to chill speech, suppress political dissent, and establish a global censorship regime. By contrast, House Judiciary Democrats argue the Digital Services Act includes procedural safeguards, judicial oversight of content moderation, and democratic accountability within the EU.
Will the Act make Brussels the new “sheriff of the digital public square”? Could it export European hate speech laws—which have at times been used against individuals peacefully expressing their views—beyond Europe? And what steps can governments, companies, and citizens take to safeguard free expression online?
Join the Federalist Society for a discussion with experts on the EU, the Digital Services Act, and freedom of expression as we consider whether the United States should support—or oppose—the Act.
Featuring:

Stéphane Bonichot, Partner, Briard Bonichot & Associés
Dr. Adina Portaru, Senior Counsel, Alliance Defending Freedom International
Dr. John Rosenthal, Independent scholar and journalist
Berin Szóka, President, TechFreedom
Moderator: Prof. Maimon Schwarzschild, Professor of Law, University of San Diego School of Law

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Welcome to FedSoc Forums, a podcast of the Federalist Society's
Practice Groups. I'm Nate Kaczmarek, Vice President and Director
of Practice

Speaker 2 (00:08):
Groups at the Federalist Society.

Speaker 1 (00:10):
For exclusive access to live recordings of FedSoc Forum programs,
become a Federalist Society member today at fedsoc.org.

Speaker 3 (00:18):
Great. Well, hello everyone, and welcome to this Federalist Society
virtual event. My name is Caroline Bryant. I'm the Deputy
Director of Networks at the Federalist Society. Today we're excited
to host this webinar on the Digital Services Act and
global free speech. Our moderator today is Professor Maimon Schwarzschild.

Speaker 4 (00:38):
Professor Maimon.

Speaker 3 (00:39):
is a professor of law at the University of San
Diego and an affiliated professor at the University of Haifa. After
our speakers give their presentations, we will turn to you,
the audience, for questions. If you have a question, please
enter it into the Q&A function at the bottom of
your Zoom window. We will do our best to answer
as many as we can. Finally, I'll note that,

(01:00):
as always, all expressions of opinion today are those
of our guest speakers, not the Federalist Society. Professor,
thank you so much for joining us, and the floor
is yours.

Speaker 4 (01:10):
Thank you very much, Caroline. There has been major
legislation in Europe very recently governing, or attempting to govern, media,
especially digital media, with significant implications for the freedom of expression
and of public debate, not only in Europe, but with possibly
very serious implications for media platforms and for the public

(01:30):
at large in the US as well. Some of this
legislation originates in Britain: the UK's Online Safety Act of twenty
twenty-three and an older but still pending Communications Act
are examples. But there's even more important
legislation emanating from the European Union in Brussels as well.

(01:52):
The EU's Digital Services Act, the DSA, applies to major digital
platforms offering services to users in the EU, regardless
of where the company is based, including US companies, which
indeed is most of them. EU officials say that the
DSA is needed to protect democracy from misinformation, disinformation, and

(02:16):
hate speech online. Service providers that fail to comply with
EU requirements face fines of up to six percent of
global annual revenue, which could easily run into hundreds of
millions or even billions of euros and dollars. In Washington,

(02:36):
in Congress, the House Judiciary Committee very recently issued a
report by its Republican majority warning that European regulators could
use the DSA to chill speech, suppress political dissent, and
establish a global censorship regime. By contrast, House Judiciary Democrats

(02:56):
say the EU law includes procedural safeguards, judicial oversight of content moderation,
and democratic accountability, at least within the EU. Will the
EU law make the Brussels bureaucracy the new global censor
of the digital public square? Could it, along with other

(03:17):
policies in the EU, and perhaps in the UK as well,
export European hate speech laws, which already are used against
individuals peacefully expressing their views, beyond Europe and to the US?
We are very fortunate to have a panel of very
knowledgeable people to explain the EU law and policy and

(03:38):
to explore the implications for American online media, for access
to information and to public debate for Americans, and what,
if anything, the American legal and public policy response should be.
So let me introduce our panelists. Stéphane Bonichot is
an avocat, a barrister, in France. He has a double

(03:59):
degree in French and German law from the Universities of
Cologne and the Sorbonne in Paris. He also has
a master's degree from the London School of Economics.
He's a member of the law firm of Briard Bonichot &amp; Associés
in Paris, and he's very knowledgeable about the
law in Brussels. Doctor Adina Portaru is senior counsel for

(04:23):
the Alliance Defending Freedom International, based in Brussels. Doctor Portaru
engages frequently with EU institutions and practices before the European
Court of Human Rights in landmark human rights cases. Portaru
is admitted to the Bucharest Bar and is the editor
of the twenty twenty book A Precious Asset, analyzing religious

(04:46):
freedom protection in Europe. The book is published by Kroos
Press and available online; by all means look it up.
Berin Szóka is president of TechFreedom, a think
tank that has studied internet law and regulation since he
launched it in twenty eleven. He has a juris doctor

(05:08):
degree from the University of Virginia and a master's in
European law from the University of Paris. He is writing
a PhD at Dublin City University, comparing how US
and EU regulation affect online platforms and how
each system is vulnerable to abuse. Finally, John Rosenthal

(05:32):
is an independent scholar and journalist with a background in
political philosophy, who has been covering EU politics for a
wide variety of leading media and journals for at least
two decades. He has taught on the faculties of Rutgers
and Colorado in this country, as well as holding an
academic post in France. He has a PhD from the New

(05:53):
School for Social Research in New York and did graduate
work at the University of Chicago and at the Free
University in Berlin. He is fluent in German and French and gets
by in other European languages as well. He is the author
of an important and lively new essay from twenty
twenty-five entitled Make Speech

(06:15):
Free Again, on the US and the field of censorship.
Welcome to all. I'd like to turn to Maître Bonichot
first. Stéphane, could you give us a brief
overview of the DSA, the Digital Services Act?

Speaker 5 (06:34):
Yes, thank you, professor.

Speaker 6 (06:37):
Hello everyone. So I want first to thank the Federalist
Society for organizing this panel about the European Digital Services
Act and how it deals with free speech. First,
what is the Digital Services Act? In the European system,
you have a difference between a regulation and a directive. A

(07:01):
directive sets goals, not details. A directive lays down objectives
that all European countries must achieve, but each country decides
how to implement them in its national law. A regulation
is much stronger: it is directly binding, and member states
cannot change its content; it applies as it is written.

(07:26):
The Digital Services Act is not a directive. It is
a regulation, which is directly binding on all member states.
So this is something very important in European law. Now,
why do we have this new regulation, which has applied since
twenty twenty-four? The Digital Services Act wants

(07:50):
to contribute to the proper functioning of the internal
market for intermediary services by setting out harmonized rules for
a safe and trusted online space. To summarize, this regulation wants
to protect fundamental rights in the online area. This regulation

(08:13):
sets rules for intermediary services. But what does that mean?
In the regulation, intermediary services are defined like this:
internet providers, hosting services like clouds, online platforms, and
very large online platforms or very large online search engines.

(08:34):
So there are four categories of companies which are concerned
by this regulation. When we speak about very large online
platforms or very large online search engines, it means
companies which have more than forty-five million monthly
users in the EU, so these are very big companies.

(09:00):
Now that we have explained what the DSA is and why we have
it, let's talk about the content of the DSA.
The intermediary services have mainly four obligations according to the DSA. Firstly,
they have to provide a contact point in Europe and,
if located outside Europe, a legal representative. Secondly, they have to
publish clear terms and conditions for users. Thirdly, they have

(09:24):
to set up a notice and action mechanism for illegal content.
And fourthly, very large online platforms have to mitigate systemic risks.
And this is one big point: what are systemic risks?
We will speak about these systemic risks. This is also
defined by the regulation: this is everything which concerns hate speech, misinformation, disinformation,

(09:48):
fake news, and that kind of content. But this is
not defined in a more precise way, so we'll maybe
discuss now this and the consequences of this

Speaker 5 (10:04):
lack of definition.

Speaker 6 (10:06):
One last very important thing about the DSA is
that it applies to all companies around the world. It
doesn't matter whether the company is located in Europe
or in the US. From day one, if they offer
services here in Europe, they have to follow the rules,
and if not, the Commission can investigate and

(10:33):
eventually decide to impose fines on the companies which
don't respect the rules.
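Purely as an illustration of the structure Maître Bonichot has just described (four categories of intermediary services, with the systemic-risk duty attaching only above the forty-five-million-user threshold), here is a minimal, hypothetical sketch in Python. The names, function, and layout are assumptions for illustration only, not the regulation's actual text or any official tooling.

```python
# Hypothetical illustration of the four DSA service categories and obligations
# discussed above. The 45 million monthly EU user threshold comes from the panel;
# everything else (names, structure) is illustrative, not the regulation's text.
VLOP_THRESHOLD = 45_000_000  # monthly active users in the EU

BASE_OBLIGATIONS = [
    "provide an EU contact point (or a legal representative if based outside the EU)",
    "publish clear terms and conditions for users",
    "operate a notice-and-action mechanism for illegal content",
]

def classify(service_type: str, monthly_eu_users: int) -> tuple[str, list[str]]:
    """Return a rough DSA category and the obligations mentioned in the discussion."""
    obligations = list(BASE_OBLIGATIONS)
    if service_type in ("online_platform", "search_engine") and monthly_eu_users >= VLOP_THRESHOLD:
        category = "very large online platform / search engine (VLOP/VLOSE)"
        obligations.append("assess and mitigate systemic risks")
    else:
        category = service_type  # internet provider, hosting service, or smaller platform
    return category, obligations

# Example: a platform with 50 million monthly EU users also picks up the systemic-risk duty.
print(classify("online_platform", 50_000_000))
```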

Speaker 4 (10:42):
Very helpful. Doctor Portaru, what are the implications of all
this for freedom of speech as applied, first of all,
in European Union countries themselves?

Speaker 7 (10:51):
Right. So, as was mentioned before, the scope and
the effect of the DSA on free speech are
global, wide, and long-lasting. We think of the DSA
not only as a blueprint for so-called speech moderation

Speaker 8 (11:02):
In Europe, but for the entire world.

Speaker 7 (11:05):
Now, as was mentioned before, the DSA requires platforms
to remove illegal content, which it broadly defines as anything
that is not in compliance with EU law or the
law of any member state. You can find this definition
in Article three (h) of the DSA. The issue is
that, given the variety of anti-speech laws throughout EU countries,

(11:26):
the DSA allows the most speech-limiting laws in any
individual country to restrict speech across the entire bloc and
even worldwide.

Speaker 8 (11:34):
So what happens in this scenario? Under the Act?

Speaker 7 (11:37):
The European Commission can impose crippling fines of up to
six percent of global annual turnover on platforms that refuse
to limit content, which could amount essentially to billions of euros.

Speaker 8 (11:48):
The Commission can.

Speaker 7 (11:49):
Also restrict access to a platform within the EU or
suspend its operations, showing the massive power that the DSA
gives over private companies.

Speaker 8 (11:59):
And since the companies

Speaker 7 (12:00):
are threatened with huge fines if they do not limit
enough content or censor enough speech, and there is no
penalty whatsoever for censoring too much speech, what do we expect these companies

Speaker 8 (12:10):
will end up doing over time?

Speaker 7 (12:12):
Going back to the issue of extraterritoriality, I
just want to give an example that comes from the
practice of my organization, ADF International: the case of Finnish parliamentarian
Päivi Räsänen. Six years ago, she posted a picture of
a Bible verse and expressed her Christian views on sexuality
on X. She was criminally prosecuted for alleged hate speech

(12:33):
and has unanimously been acquitted.

Speaker 8 (12:35):
In two trials.

Speaker 7 (12:37):
The issue is, however, that the state prosecutor appealed the
case again, and shockingly, in her case, in which she
faces trial for posting online, she is facing a new
trial before Finland's Supreme Court during October of this year.
This did not happen after the DSA's entry into force. But
if it had, because

Speaker 8 (12:56):
We're speaking about a post which.

Speaker 7 (12:58):
happened online on X, the content that Päivi posted in
Finland could be assessed under the legal standards of any
country of the European Union. Some of the people listening
to us know that in Germany, for example, it is
illegal to insult a politician. Let's say you have a
political commentator, a satirical content creator in the US, criticizing

(13:21):
Ursula von der Leyen, who is a German politician but also
happens to be the President of the European Commission. Now,
that kind of content could be challenged, could be flagged
as potentially illegal content in Germany according to German law.
So we speak about content that is posted in
the US or any other country on a US platform

(13:42):
that gets challenged according to German law and potentially can
be taken down in Germany, and potentially, if
we have a bit more time to discuss it, in
the entire European Union and outside the European Union. We
have cases before the Court of Justice of the European
Union which clearly state that there is an open
door for worldwide takedowns based on existing regulation of

(14:06):
digital content online.

Speaker 4 (14:09):
Doctor Szóka, to put it a little bit proleptically: EU officials
say the DSA is no threat to legitimate free expression.
I must say, in my experience, EU officials are
very reluctant to appear in any forum where they might
confront knowledgeable critics of the DSA. So when they say
the DSA is no threat, they seldom seem to say
it in any forum where they're likely to be challenged. Nonetheless,

(14:33):
are they right that it's no threat to freedom of
expression and debate?

Speaker 2 (14:40):
I'm not a doctor, but I have to be precise.

Speaker 4 (14:44):
I suggested that proleptically. You're studying for a doctorate,
so we can assume you're going to get there.

Speaker 9 (14:53):
So there are a few different concerns here,
and we should try to unpack them separately. First of all,
we have to be clear that the
DSA doesn't make anything illegal. That is a task for
member states, and I think it's important to note
here, for everyone in the audience, that the Union has only

(15:15):
the powers that are conferred upon it. So the DSA
doesn't confer any powers to make content unlawful. That task
and power remains in the hands, as I said, of
each member state's parliament. The next question that
was raised is about whether, yes, it's true, the DSA
requires removal of content that is unlawful, and then the

(15:38):
question becomes what is the geographic scope of that removal order.

Speaker 2 (15:43):
I will speak mostly to.

Speaker 9 (15:45):
not this question, but it is not my understanding that
those removals have to be effective throughout the entire Union.
I think that they, in fact, are supposed to be
proportionate and applied in the country in which
the speech is unlawful. But that's not
primarily what this debate is about, and I don't think
it's primarily what Americans are interested in or what the House

(16:08):
Judiciary Committee report was about. The hearing that happened
before the committee two or three weeks ago was
rather about the claim that the DSA requires takedown of
speech, and specifically censorship of certain disfavored political speech, outside
of Europe. So that's the question that I have focused on,

(16:30):
and if you want a good, thorough analysis of that,
I would refer you to the letter that I helped
organize, which was led by Martin Husovec, who has written
the best article on this topic. He is, I think it's safe to say,
the leading scholar of the DSA. His
book, Principles of the DSA, will answer your questions comprehensively,
and his article on this topic, The DSA's Red Line,

(16:50):
is really the authoritative thing to read. And it's notable
that that piece wasn't cited at all in the Republican
staff's House Judiciary Committee report. They merely assert both that
the DSA has a geographic effect outside of Europe and
that it requires removal of specific speech, and as
our letter explains, neither of those things is true. So

(17:13):
with respect to unlawful content, that's one category. The
Commission has said explicitly that the DSA
has effect only inside Europe. But that's also not specifically
what we're mostly talking about here today. We're talking about
what happens to lawful content that might be disfavored.

Speaker 2 (17:38):
In that area. I think there is a problem.

Speaker 9 (17:41):
There is a gap: the DSA doesn't say this explicitly,
but as our letter explains, it's actually quite clear that
the DSA does have this red line, as Martin puts it,
that it may not be used in a content-specific way.
And here it's important to unpack more specifically what we're
talking about in the obligations of the DSA. So we

(18:01):
referred earlier to the risk mitigation obligation in Article thirty-five.
So if you are a very large online platform
or search engine, you have an obligation to mitigate systemic risks,
but that obligation is secondary, in terms of the timing
of things, to your broader obligation under Article thirty-four

(18:21):
to assess systemic risks. And this is where I think
the problem arises, because that obligation is quite broad. I mean,
it includes all the things that have been mentioned, like disinformation,
and the categories of systemic risks

Speaker 2 (18:34):
are very open-ended.

Speaker 9 (18:35):
They include risks to electoral processes and civic discourse, and
I find it troubling that those terms are left so open-ended.
But the Article thirty-five obligation,
which is the one that actually has teeth, what
you have to do about systemic risks, is much narrower.
And this is where the red line comes in. It
And this is where the red line comes in. It

(18:57):
cannot possibly be interpreted to require a site to take
down particular lawful content, to restrict lawful content. And the
reason is, as we explain our letter, is threefold. The
Commission has never been granted the power to do that. Second,
it's very clear under European fundamental rights law that if
you are to restrict free expression, which that would it

(19:19):
would be a restriction upon free speech of some users,
that under European law must be explicit and the Digital
Services that does not say that explicitly. And then finally,
there is this question about the allocation of decision making
between the Commission and the Parliament that it sort of
checks and balances. Once again, there the Parliament did not

(19:42):
confer that power. So what does that leave us with?
It means that, yes, you have a very broad risk
assessment obligation under Article thirty-four,
but under Article thirty-five, whatever obligation you have, whatever
the Commission might fault you for before it imposes penalties,
it can't be content-specific. It has to be
content-agnostic, and so that could include things like how

(20:04):
you design the registration mechanism of your service, or what
the interface looks like. The law is deliberately open-ended
about those things. It did not do what
digital laws have previously done, like the privacy law, the GDPR,
which was very prescriptive. Instead, it leaves these things open

(20:25):
and it's up to the company to decide how to
mitigate whatever risks it has identified. And then, importantly, there's
a safeguard: before the Commission brings an enforcement action,
it has to say exactly what it thinks you did
wrong and what you should do differently. And when it
does that, the Commission bears the burden of showing that what
it's proposing is proportionate, so it has to go through

(20:47):
this analysis, and we haven't seen that yet. The Commission
did bring an enforcement action against X regarding the adequacy
of Community Notes, which maybe we could talk about, and importantly,
the Commission hasn't acted on that part of the investigation.
And I think this is why: because I think they
have a very difficult time explaining what it was that
X should have done differently that would be consistent with

(21:09):
that red line. The last thing I would say here
is that it is in some ways understandable that people are
very concerned about the DSA, because if you are outside
of Europe and outside of European law, all you
know about the DSA may be what you heard Thierry Breton
say about it. Thierry Breton tried to use the DSA
to censor Donald Trump's speech when Elon Musk interviewed Trump.

(21:33):
Thierry Breton, who was then the commissioner responsible for enforcing
the DSA, in a letter and on Twitter, suggested that
he could bring enforcement actions against X under the DSA
merely for hosting a conversation with Trump. That's absurd, that
obviously crosses the DSA's red line, and he was fired

(21:53):
for that. He was condemned by me, by other civil
society groups. We put in a letter at the time
saying this is not what the DSA authorizes. And
before he could formally be fired, after having been condemned
by all of his fellow commissioners, he resigned before
the President of the Commission could fire him. So this

(22:15):
is just to say that, yes, these laws can always
be abused. Jawboning, threats not based on law,
is always a problem. That's happening here in the United
States right now as we speak. In the United States,
there's no correction for this. The administration is perfectly willing
to jawbone companies to do things that are unlawful.
In Europe, when that happened, the commissioner responsible was fired,

(22:35):
and so to that extent, I think the European system
is working better than the US system.

Speaker 4 (22:40):
Well, let me put a question mark there to you.
Is it a defense of the DSA to say that
the DSA itself, or Brussels, doesn't make anything illegal, that
it's the member states who do so? There are now
twenty-seven states; until a few years ago there were

(23:03):
twenty-eight. There are twenty-seven states in the EU.
The most restrictive of them is going to be quite
restrictive of freedom of expression and free speech, even if the
EU is enforcing in principle with respect only to that state.
But practically, once something is improper to post in State A,

(23:28):
it's going to be very difficult to make it available
and visible in State B without satisfying State A that
you're keeping it off the internet. So the most restrictive
state is the state whose law Brussels is going to enforce.
Is it anything reassuring for us to be told that
Brussels itself isn't making these laws, that it's the member
states that are doing so?

Speaker 9 (23:53):
Well, again, that's a distinct concern from the extraterritorial effect
and the censorship issue. And again, I don't think that
that's accurate, and I would refer you to Martin's work.
I think that the DSA itself requires those removals to
be effective in the states in which the law applies, because,

(24:13):
and the reason is very simple, it would not be
proportionate to require otherwise. I mean, where the law applies in
a state, the takedown is proportionate because, say, Germany's laws require
it; where that is not the case, it isn't. Otherwise you could
imagine some harder cases where, for example, there might

(24:33):
be speech regarding an election in Romania, and the
question is what happens in other states?

Speaker 2 (24:41):
But I think, you know, we just saw that happen.

Speaker 9 (24:42):
We just had this exact fact pattern play out, and
what we saw was that Romania was able to enforce
its laws regarding misinformation about its election inside Romania and
not outside Romania, and the voting patterns reflected that: Romanians
living outside Romania got very different information because they were
not subject to restrictions issued by the Romanian government.

Speaker 4 (25:06):
Doctor Rosenthal, let me turn to you. Is it possible
technically to take down something that's illegal in State A
and still make it available, in other words, not affect
its visibility and its prominence and its availability, in
States B through Z?

Speaker 10 (25:27):
Yeah, that's a good question. I'm not a tech person;
I'm the last person in the world who should be
regarded as a tech person. However, the proof of the
pudding is in the eating, and we can look at
the so-called DSA transparency reports that all the
major platforms falling under the DSA are required to

Speaker 5 (25:46):
Publish.

Speaker 10 (25:49):
I believe it's once a year, actually more frequently, and
those reports show that certain content is in fact geo-blocked.
I think that's what Berin is referring to. So it's not
really removed per se; the content is not removed per se,
it's just geo-blocked, presumably in the relevant jurisdiction. However,

(26:13):
if you look at X's first, and I actually think they're
published a couple of times a year, at least in
the case of X, if you look at X's first DSA
transparency report, actually there are more removals than blocks. And there's
a third category, which to me is the most important category,

(26:35):
and that I don't believe we've touched upon yet, at
least not in much detail, and that's visibility restriction. And that,
I think, is the greatest problem in terms
of the global effects of the DSA. So if there's
a block, that block will be geo-specific. If there's a takedown,

(26:56):
at least it's something that's visible and that we know about.
But already in the codes which preceded the DSA, most
notably the Code of Practice on Disinformation

Speaker 5 (27:08):
from two thousand and twenty-two,

Speaker 10 (27:13):
which was the strengthened version (there was already a twenty-eighteen
version), there's already talk of restricting content which
is not illegal. So illegal content has to be at
least blocked in the relevant jurisdiction; maybe it doesn't have
to be, but it may in fact be removed all

(27:33):
around Europe and all around the world; we know there
are cases of that because we see that in the
DSA transparency reports data. But there's the other option, or
the third option, merely of restricting not illegal content, but
restricting so-called legal but harmful content. And that's where,
to my mind, things get extremely sticky, extremely complicated, and

(28:00):
you see the real extraterritorial consequences. But the problem is,
again, we don't know when it's happening. If you go
back to the COVID period, when so-called COVID misinformation
or disinformation was being censored, I would say, contrary
to the narrative that's taken hold in the States, and
(28:21):
I think even in DC and even among members
of the House Judiciary Committee, the driving force behind the
censorship of COVID disinformation or misinformation, and that in quotes, was

Speaker 5 (28:37):
Never the Biden administration.

Speaker 10 (28:39):
We know that, among other things, because it already began
when Donald Trump was still president in twenty twenty.

Speaker 5 (28:47):
It already began. The driving force was already.

Speaker 10 (28:52):
the EU, the European Commission. And the driving force, even
though it was not yet law, was already the
DSA, because the DSA was already proposed as law
in December twenty nineteen. In mid twenty twenty,
the European Commission creates a program for so-called monitoring

(29:13):
or combating COVID-19 disinformation, and that's created under the
aegis of the Code of Practice on Disinformation. All the
signatories of the Code of Practice on Disinformation are required
to report back to the Commission, to
provide monthly data, later bi-monthly data, on what they're
taking down and on the accounts that they're suspending. So

(29:36):
you have actual data from X saying, okay, in July
twenty twenty we removed so many items of
content and we suspended so many hundreds of accounts. But
that was also something that we were seeing, okay.
You could see the misleading labels, for instance, on posts on Twitter,

(30:00):
and you can see when an account is suspended. But
sort of the default option now for dealing with disinformation
is not removal, precisely because it's not illegal. The default
option is, in order to mitigate the systemic risks that
Stéphane talked about, you have to limit the virality of

(30:20):
that so-called misinformation. And that's something which X and
all the other platforms are supposed to be doing algorithmically,
and what's being done algorithmically we don't see, and that
to me is the main problem. It's happening, and it's
presumably happening all around the world, but we

Speaker 5 (30:37):
Can't see it.

Speaker 2 (30:39):
Well.

Speaker 4 (30:41):
So, first of all, what's the difference between illegal speech
and harmful speech? That seems to be a distinction that
the DSA is making. And I take it your suggestion
is that illegal speech has to be formally removed, and
it's publicly ascertainable what it is that's being removed.

(31:08):
Harmful speech doesn't have to be removed, but its visibility
has to be suppressed, and there isn't any public information
about what the platforms are being muscled,
in effect, to make less visible. As a practical matter,

(31:28):
if something is less visible, to what extent are people
who want to get information and want to participate in
the debate going to be
able to see whatever it is whose visibility is being suppressed?

Speaker 5 (31:44):
They might not be able to at all. If you look at
X in its DSA transparency reports, it lists its
so-called enforcement options. So again you have three options.
One option is global blocking, global takedown. The second option
is geo-blocking, what Berin was referring to, just

(32:06):
removing it in the relevant jurisdiction. And the third option

Speaker 10 (32:11):
is visibility filtering, and X mentions that you can stop
a post from even being shown to followers of a
given account. So it won't be shown to followers of a
given account, it won't be shown to non-followers of
the account, it will be demoted in replies.

Speaker 5 (32:32):
It will just be available as a unique URL.

Speaker 10 (32:35):
You can go to the profile and find it, but
unless you go directly to the profile of the account user,
you can't find it. So they can actually completely squash
They can make something invisible simply.
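To make the three enforcement options Dr. Rosenthal has just described a bit more concrete, here is a minimal, purely hypothetical sketch in Python of how jurisdiction-scoped enforcement might be modeled. The class, function, and country codes are illustrative assumptions, not X's or any platform's actual system, and not the DSA's own mechanics.

```python
# Hypothetical model of the three enforcement options described above:
# global removal, geo-blocking, and visibility filtering. Illustrative only;
# not any platform's real API.
from dataclasses import dataclass, field

@dataclass
class EnforcementAction:
    kind: str                                                  # "remove", "geo_block", or "visibility_filter"
    blocked_countries: set[str] = field(default_factory=set)   # used only for geo_block

def can_see(action: EnforcementAction | None, viewer_country: str,
            visits_profile_directly: bool) -> bool:
    """Return whether a viewer can still see the post under the given action."""
    if action is None:
        return True
    if action.kind == "remove":
        return False                                           # taken down everywhere
    if action.kind == "geo_block":
        return viewer_country not in action.blocked_countries  # hidden only in the listed states
    if action.kind == "visibility_filter":
        # Not surfaced in feeds or to followers; reachable only via the direct URL/profile.
        return visits_profile_directly
    return True

# A post geo-blocked in Germany remains visible to a US viewer, while a
# visibility-filtered post is effectively invisible unless sought out directly.
print(can_see(EnforcementAction("geo_block", {"DE"}), "US", False))   # True
print(can_see(EnforcementAction("visibility_filter"), "US", False))   # False
```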

Speaker 4 (32:47):
But suppose I'm a digital company and I'm told, well,
you have to block or suppress something with respect to
one country. So if Germany prohibits criticizing or saying bad

(33:08):
things about German politicians, I take it some German politicians,
not all of them, not the AfD politicians, I
take it you can talk about them. But if Germany
prohibits that, and I'm a tech company, what
are the incentives for me?

Speaker 2 (33:26):
Is there?

Speaker 4 (33:27):
Do I have to worry about a risk that if
something is visible anywhere in the EU, people in
Germany might see it, and that might be illegal,
or that might subject me to penalties
of various kinds, sanctions of various kinds, from the EU?
Is my incentive to play it safe and take it
down everywhere?

Speaker 10 (33:49):
I think maybe others might want to address this, but
I think evidently your incentive is to take it down everywhere.
I can give another example from German law.

Speaker 5 (34:00):
There is a prohibition in German law.

Speaker 10 (34:03):
One could say maybe understandably so, on using so-called
anti-constitutional symbols. Anti-constitutional symbols are namely National
Socialist symbols, so using the swastika or using the so-called
Hitler salute. A German retiree by the name

(34:25):
of Stefan Niehoff, the same guy who actually
had his home raided for having insulted Robert Habeck, who's
a Green Party politician. That charge was dropped, but he
was eventually convicted of, quote unquote, using anti-constitutional
symbols in tweets.

Speaker 5 (34:46):
But he hadn't used those symbols.

Speaker 10 (34:48):
He had just shown historical photos of, for instance, clergymen
under the Third Reich using the Hitler salute.

Speaker 5 (34:58):
Okay, he wasn't. He wasn't using them himself.

Speaker 10 (35:01):
He was critically using these photographs from the period of
the Third Reich.

Speaker 5 (35:07):
I think it's.

Speaker 10 (35:08):
evident that anywhere in the world, Twitter or Facebook,
knowing that in Germany it's a crime if those images
are shown, is going to be sensitive
to showing even historical photos of people using

Speaker 5 (35:28):
The Hitler salute.

Speaker 10 (35:29):
And I had direct experience of this, or at
least of the kind of chilling effect.

Speaker 5 (35:34):
Let's put it that way.

Speaker 10 (35:36):
When I tried to publish an article, which eventually did
get published by The European Conservative, on the
Stefan Niehoff case, what I wanted to do
was show his tweets, to show people how unproblematic
they would be outside of Germany.

Speaker 5 (35:51):
And I had a lot of trouble publishing that article.

Speaker 10 (35:53):
Why? Because I was showing those same historical pictures that
Niehoff showed, of historical figures from the period of the
Third Reich using the Hitler salute.

Speaker 4 (36:06):
Mister Szóka, is it unfair to suggest that the most
restrictive policy is going to be the one that bleeds
out into the way the Internet is regulated in the EU,
across the EU, and even beyond?

Speaker 9 (36:25):
Well, let me say, just to start, as
a German-American dual national, I'm not here to defend
exactly where Germany has drawn the line. And I think
John and I would probably agree about the examples that
he just cited being examples of Germany going too far
in enforcing its own laws. But that's not what we're
talking about today. We're talking about the DSA and how
the DSA works. I think we have to be really

(36:46):
precise about this. So the DSA, Article nine,
governs how orders for restriction of access work, and in
particular Article nine B, I think, is the critical one,
where it says that the territorial scope of those orders
has to be limited to what is strictly necessary to
reach their objectives. So, you know, I don't want to

(37:09):
say categorically that those could never have extraterritorial effect
outside of a particular member state, but as a general matter,
they don't. So now let's go to the other thing
we've been talking about, which is risk mitigation. That's Article
thirty-five. So Article thirty-five refers to four broad
categories of risks. One of those is unlawful content, so

(37:30):
you have to analyze the unlawful content part.

Speaker 2 (37:33):
Under both things.

Speaker 9 (37:35):
But then there's a bunch of other stuff in there,
like civic discourse and electoral processes. And that's where
this conversation starts to go off the rails, because in
part what John and others are responding to is that people
in Europe, some people in positions of power, not just
Thierry Breton but other people who are around the Commission,
want and wanted the DSA to be an anti-disinformation law.

(38:00):
They talk about it that way, and in particular, I think
it's worth just quoting this one sentence from Thierry Breton
in July, when he went after Donald Trump, where he
asserted that Article thirty-five imposed duties regarding, quote, the
amplification of harmful content, a term that you used earlier, in
connection with relevant events, including live streaming, that might increase

(38:21):
the risk profile of X and generate detrimental effects on
civic discourse and public security. Okay, so Thierry Breton claimed
that the DSA could be used to censor lawful content
by reducing its visibility or access, including Donald Trump talking
about whatever.

Speaker 2 (38:40):
That's nonsense.

Speaker 9 (38:42):
The DSA doesn't even include the term harmful content, except
as a passing reference to the very narrow category that
is recognized in Europe and in the United States for
content that is lawful in general but harmful to children. Okay,
so that's the first way that Thierry Breton went wrong.
You know, the DSA, really, these provisions

Speaker 2 (39:04):
We've been talking about.

Speaker 9 (39:05):
Content is either unlawful or it's not; harmfulness
is just not a thing there. So we have discussed
what happens with content that is unlawful,
and again, I think those orders are limited in their
effect, certainly to Europe and generally
to each member state.

Speaker 2 (39:26):
And then the question is, you know, it was asserted
a few moments.

Speaker 9 (39:28):
ago that companies have an obligation to mitigate systemic
risks by reducing the visibility of disinformation, in particular claims

Speaker 2 (39:37):
About an election.

Speaker 9 (39:39):
And again I think that's not correct because of the
DSA's red line, because that would be a content specific restriction,
and if that's what had been authorized here explicitly, we
might be having that conversation.

Speaker 2 (39:54):
It's not explicit. And because it's not explicit.

Speaker 9 (39:57):
there is no European court that will uphold a
content-specific restriction. So the restrictions might be things like,
you know, in general, you should design your sign-up differently;
in general, maybe, you know, for a particular kind of
content, or for all of the content on
your site, you should consider certain features. But it's up

(40:18):
to the services to decide what those risk mitigation features are.
And the Commission cannot impose liability because the site didn't
restrict access to particular kinds of disfavored content.

Speaker 2 (40:30):
Period. That's the red line.

Speaker 4 (40:33):
Doctor Portaru, are you reassured?

Speaker 8 (40:36):
Yes?

Speaker 7 (40:36):
I actually was listening very carefully, and I wanted
to interject a couple of times.

Speaker 8 (40:42):
I think what was mentioned before was.

Speaker 7 (40:43):
Certainly clear on what is limited to what is strictly necessary.

Speaker 8 (40:49):
That's true, that's what the DSA says.

Speaker 7 (40:51):
However, it does also speak about extraterritorial impact and effect,

Speaker 8 (40:55):
coordination between the digital services coordinators. And the internet,

Speaker 7 (40:59):
by its nature, of course, is global, so we have
to look at the DSA

Speaker 8 (41:04):
in its whole ensemble.

Speaker 7 (41:05):
So in both the articles and the recitals there are various
references to both what is strictly necessary and what
is extraterritorial. Also, I think it is fair
to look at the objectives of the DSA, which are
to mitigate systemic risks and to remove illegal content, using
very loose interpretations.

Speaker 8 (41:24):
So these objectives are so.

Speaker 7 (41:26):
sweeping that the limitation to what is strictly necessary is
actually not reassuring. The DSA opens the door to regulatory
measures whose scope is limited only by the
imagination of the political actors in play. And I do agree
the Commission does not directly say what is legal or illegal. However,

(41:47):
it has created a structure, and the Commission is at
the top of the structure. So I think it is
a little bit simplistic to say that the Commission has
nothing to do with, potentially, the removal of legal content,
content that is deemed to be harmful, under the mitigation
of risks obligation, just because the Commission itself
is not directly removing that content. So if we

(42:10):
look at the DSA as a whole, the DSA puts
in place a censorship industrial complex.

Speaker 8 (42:17):
We have the DSA.

Speaker 7 (42:18):
This year, we have had a number of codes of
conduct that have been passed, on disinformation, on misinformation,
the Electoral Toolkit, the Guidelines on the Protection of Minors, so this
is piling up, and it creates a megastructure of more than one hundred
pages all in all, with the European Commission once again at
its head. In Brussels, where I am based, I take part

(42:38):
quite a lot in the meetings, in the hearings that
take place in the European Parliament where the DSA's progress is
being discussed. What I hear from the companies is that
they do receive a lot of content that is flagged
extraterritorially. So it can be one piece of content that is
visible, of course, in twenty-seven countries across

(42:59):
the European Union. They can get ten, fifteen flags on that content
from different countries, and of course they have to decide
according to each specific law. The practice is of course
geo-blocking in principle, which can go to global blocking, because
no company is going to employ that many people to
study that many requests under that many laws according

(43:20):
to different interpretations and standards. Yes, the Commission does not
directly say what is legal or illegal.

Speaker 8 (43:26):
It is, in the end, the companies, through their own
terms of service.

Speaker 7 (43:30):
But it is done through the DSA, which is in
force, and enforced and stressed by the European Commission, and
in the end it is

Speaker 8 (43:37):
A business model.

Speaker 7 (43:38):
I mean, we also have to mention this: it
depends on finding content to censor, and it is inconsistent with
the standards of the rule of law. When you look,
for example, and this was very briefly mentioned, and I am
trying to squeeze all of this in to
leave some time for questions and answers, at the crisis moments:
the DSA mentions that the Commission can actually take extra

(43:59):
powers regarding the very large online platforms and search engines,
and those extra powers, I would encourage the academics
here to look at this, would go so far as to override the judiciary
in member states' countries. So I would encourage you to see
that the Commission is not a judicial power; however, it does
become so in crisis management, and this is also

(44:20):
deeply concerning.

Speaker 8 (44:21):
And maybe lastly, to the point about Breton.

Speaker 7 (44:25):
Yes, everyone in this space, I think, recognizes and criticizes
the overreach that took place there, but that is not
a single incident.

Speaker 8 (44:34):
Actually, in twenty twenty-five, after the

Speaker 7 (44:36):
elections in the US, there was the inauguration speech delay, a
famous moment happening in my country of residence, Belgium:
the Belgian state broadcaster imposed a two-minute delay on
President Trump's inaugural address, not for translation, but
to pre-screen anything that the team might deem racist
or xenophobic. So: mitigation of risks, illegal content.

Speaker 8 (44:59):
Let's put this

Speaker 7 (45:00):
into the wider picture, because this is not necessarily only
a matter of free speech in Europe or in the US.

Speaker 8 (45:06):
It is really a global blueprint for censorship.

Speaker 4 (45:10):
What would you say, all of you, before we turn
to questions, to Europeans who might say: look, we have
a different history and a different balance of freedom and
constraint than the US has. In Germany and elsewhere, for example,
our history means that anything smacking of Nazism is illegal, songs, symbols, speeches;

(45:31):
these might be protected by the First Amendment in the
US, but you mustn't impose this on us Europeans.
Is that a fair point on the part of the Europeans?
And if so, in an increasingly global media environment, what
do you do about that?

Speaker 9 (45:54):
Let me take this first, because I have to sign
off a little early, I apologize. As a transatlantic

Speaker 2 (46:02):
Person, I live in both worlds.

Speaker 9 (46:04):
I spend most of my time on free speech law
in the US. They're different systems, and you know, I'm
critical of abuse in both. In general, I think that
the American system of allowing more speech is the right one.
But you know, my grandparents grew up in Nazi Germany.
I understand why Germany has written the laws that it has.

(46:25):
I understand why Belgium, the Belgian broadcaster might have done
what it did. But to use that example, they did
what they did in that example that was just provided
to you, not because of the DSA. They did it
because of Belgian law or because of their own editorial judgments.
We need to be really clear about what we're talking
about here that Europe, you know, the European Union does

(46:47):
not have the capacity to make these decisions. And despite
these efforts to portray the DSA as a sort of
harmonization of how these things happen in Europe, in fact
these decisions are still made at the national level. I
can disagree with some of them. We can have a
separate discussion about some countries going too far and whether to.

Speaker 5 (47:12):
Sorry.

Speaker 10 (47:12):
Just for the illegal content part that's correct, but again, for
the risk mitigation part, no.

Speaker 9 (47:17):
Well, John, I'll close with this on the risk
mitigation part. The red line, in the view of
myself and thirty-one other scholars of the Digital Services
Act, is very clear and rests on very clear principles

Speaker 2 (47:32):
Of EU law.

Speaker 9 (47:33):
Even though it was not put into the DSA explicitly, which
we asked them to do, we want this to be
articulated clearly, we want the Commission to embrace this. But
it is in fact clear in European law that the Commission,
whatever it can do with the DSA, cannot impose
liability because someone failed to mitigate a particular kind of content.

(47:54):
That is the red line. So as to what was mentioned earlier
about the disinformation Codes of Conduct and so on and
so forth: if companies are signing on to that voluntarily,
it's not because they're afraid of being held to have
a risk mitigation obligation to remove that content.

Speaker 2 (48:13):
Period.

Speaker 9 (48:14):
You cannot blame the DSA for that.

Speaker 10 (48:19):
Well, sorry, I mean, Berin, it's too bad to have
a debate when you need to leave. But why then,
when the strengthened Code of Practice on Disinformation is
published, and I

Speaker 5 (48:35):
believe it was

Speaker 10 (48:38):
June or July, in any case mid twenty twenty-two,
why does the Commission itself then, maybe you're saying the
Commission is misunderstanding the implications of the law, and in some ways
I think that's correct, the Commission did misunderstand the implications
of some aspects of the law, but the Commission at
that point says: okay, we have this strengthened Code of
Practice on Disinformation and now we have an enforcement mechanism.

(49:01):
It explicitly says this in a tweet: we have an
enforcement mechanism. If the companies that are signatories of the
Code of Practice do not comply with their commitments, and
there are forty-four commitments, as you know, under the
Code of Practice, then they are potentially going to get slapped
with these six percent fines. And again, that's not me,
that's a Commission tweet. So why do they

(49:24):
say that? What does that mean? Okay, that each
company individually is going to decide what it thinks is
information and what it thinks is disinformation? Of course they
have no competence for doing that, I would say, and
I think you would agree the European Commission doesn't have
competence for doing that either. But it can't possibly mean
that. There must be some sort of content-based

(49:48):
aspect. If, for instance, Twitter came
back to the Commission and said, look, we took down
fifty tweets in such and such a month, and they
were tweets by the Commission about the efficacy of COVID-19
vaccines, because we think that's misinformation, do you think
that would satisfy the Commission's expectations?

Speaker 9 (50:12):
Okay, so just to be clear, we just switched to
a different part of the DSA. So we're no longer
talking about, I want everyone to understand here how these
pieces fit together. So first we talked about
orders to take down unlawful content.

Speaker 5 (50:25):
No I'm not talking about I.

Speaker 2 (50:27):
Don't I know.

Speaker 9 (50:27):
I'm just making sure everyone understands how the pieces fit together, right.
And then we talked about the risk mitigation obligations, which
are the core of the DSA. And now we're talking
about companies committing to do something and not doing it right.

Speaker 2 (50:44):
So that's a different that's a different.

Speaker 10 (50:46):
And again, I'm saying, and I agree with
you in a way, the Commission misunderstood what they were doing.
But the Commission says, well, now we have an enforcement
mechanism and those commitments are no longer voluntary, they're mandatory.

Speaker 2 (50:58):
I agree.

Speaker 5 (50:59):
If you say no, they're not mandatory, I would agree
with you.

Speaker 10 (51:01):
But nonetheless, the Commission is saying that, which means it
thinks that it's putting pressure on the platforms to limit
the effects, as systemic risks, of certain misinformation. And I
think the whole problem is: what exactly is misinformation and disinformation?
You can't deny that a focal point of the first

(51:23):
von der Leyen Commission is misinformation and disinformation.

Speaker 5 (51:27):
They were talking about that the whole time. You might say,
I mean, yeah.

Speaker 8 (51:32):
Just to take advantage also of Berin being here.

Speaker 7 (51:35):
If I hear this correctly, what you're asking, John, is
actually about the role of the Commission. It's not about illegal
content or lawful content being there, or mitigation of risks,
or algorithms, or, you know, companies being signed on. I
think the point that John is asking, or is making,
is: what is the role of the Commission? Is the
Commission simply a neutral actor that has proposed its legislation,
that has gone to member states and now sits back

(51:57):
and just watches and intervenes in egregious situations for enforcement,
or is it actually an active player that comes,
through its statements, and encourages platforms? The Commissioner comes and says you
can run but you can't hide, even if you just
unsubscribe from the Code of Conduct on Disinformation. So
I think, from my understanding, John, of what you were asking,

(52:20):
the overarching question is: what is the role of
the Commission in all of this?

Speaker 2 (52:27):
Yeah, so I do have to go.

Speaker 9 (52:28):
But just to answer your question, the DSA is very
clear there. First of all, there are no direct fines for
failing to live up to a voluntary code. And more specifically,
what the Commission is supposed to do is take into
account non-compliance with the code in assessing the Article
thirty-five risk mitigation obligations. Once again you return to

(52:48):
the DSA's red line. The Commission either has the power
to make content specific judgments about content that you should
have restricted access to, or it doesn't.

Speaker 2 (52:58):
And again, the scholarly

Speaker 9 (53:00):
consensus on this is that the Commission has no such power.
I really encourage everyone to read Martin's article. You can
read our letter, which goes into this in some detail.
And I would just note in closing, since this was
all started by the House Judiciary Committee staff report: the
staff report rests almost entirely on this claim that the DSA
can be used in content-specific ways, and it fails

(53:23):
to engage completely with the scholarship on this topic explaining
why that is not the case, and in that sense
it is not a serious contribution to the discussion. But
on that note, I apologize, I have to go for
a previously scheduled meeting, so, a pleasure to chat with all
of you, and I hope you can refer
everyone to the letter that I've put forward.

Speaker 2 (53:43):
We filed it on the House Judiciary

Speaker 9 (53:44):
Committee site and maybe it'll be in the links for
this episode.

Speaker 4 (53:50):
Thanks. Thank you very much, Berin, for joining us. And
let me turn to the audience and to those of
you online who might have questions, and you might, through
your questions, let us know whether you are reassured or

(54:10):
otherwise about the potential effects not just of the DSA, but of
this entire architecture of European policy and law and regulation,
not just on Europeans, but on people wanting
to participate in public debate across the world and in
particular in the US. If we have questions, Caroline, can

(54:33):
we turn to those at this point? Absolutely. So I
think we have one question.

Speaker 3 (54:46):
Are flagging agencies in various MSs, and I apologize, I
don't know what the abbreviation stands for there, diverse
in ideology? France leans towards progressivist-only agencies: pro-immigration,
pro-abortion, LGBT activists, et cetera. Other question: where do
MSs stand in terms of enforcement? It is my understanding

(55:08):
some are really reluctant to abide.

Speaker 4 (55:13):
Any thoughts? Doctor Rosenthal?

Speaker 10 (55:16):
About the first question, in terms of the flagging agencies:
they're certainly not ideologically diverse, but they in fact haven't
yet had much of an impact. And I think the
crucial fact about the flagging agencies

Speaker 4 (55:35):
Can you explain what a flagging agency is?

Speaker 2 (55:37):
Yeah?

Speaker 10 (55:39):
Yeah, so one of the, I guess, central pillars, we
could say, of the DSA is that you're supposed to have this
notification mechanism where individual
users can notify Twitter or Facebook or whatever the platform
happens to be of what they take to be, say,

(56:03):
hate speech or disinformation or whatever, and then the platform
is supposed to make some sort of judgment itself and
potentially take enforcement action.

Speaker 5 (56:16):
In addition to individual users being able to do this,
governments can do this

Speaker 10 (56:23):
under, I believe it's Article nine, Adina, I
can't remember if it's Article nine or Article eight, and
they can actually give something like, they're not exactly orders,
but takedown requests.

Speaker 5 (56:36):
And you have the creation of this new entity.

Speaker 10 (56:39):
which is called trusted flaggers. The trusted flaggers are organizations
which are supposed to have special expertise in certain areas. So,
for instance, one of the German organizations that has been
given trusted flagger status is called, I believe, HateAid,
so it's supposed to be specialized in hate speech, and

(57:01):
any flagged content that's been submitted by a trusted flagger
has to be given priority treatment by the platforms. But
those trusted Flaggers don't just sort of emerge out of
the blue.

Speaker 5 (57:13):
They're appointed.

Speaker 10 (57:14):
As an association or an organization, you can
apply for that status, but it's the respective government that
actually names those trusted flaggers. And if you look, recently
I've been looking at a lot of the data you
can glean from these so-called transparency reports, and in fact
the trusted flaggers, up to now in any case, have
had very little impact. But you know they're not going to

(57:36):
be any more ideologically diverse than the governments that
are appointing them.

Speaker 6 (57:41):
So to say, yeah, yes, I will add something. It's
something very important, trusted flaggers, even if for now they
do not have a very big impact. But it also means
that an association with political views could be designated
as a trusted flagger and try to influence the decisions or

(58:01):
the behavior of providers or hosting companies, so it
could be used in the future.

Speaker 5 (58:12):
Absolutely, so I totally agree.

Speaker 10 (58:15):
Yeah. And in fact, in the German case, HateAid,
of course they want to go after, for instance,
parties like the AfD. So who truly defines what
is actually hate speech? But again, it's not
those private associations that have maybe those political agendas; it
could be the governments that are appointing them. And the
(58:37):
second question I didn't hear. I'll leave it to somebody
else to answer.

Speaker 3 (58:42):
I know there's another question as well that maybe we
want to get through; we only have a little bit
of time. But I think someone's wondering about the
importance of civil discourse and its role in the United
States and American freedom, and also in French freedom,
and maybe potentially in the UK. He says

(59:07):
the language used in those examples could be considered
aggressive and unlawful.

Speaker 8 (59:13):
Maybe, but didn't that

Speaker 3 (59:15):
move things forward in the United States debate? Any
thoughts on that, maybe for the US, France, and
the UK?

Speaker 6 (59:24):
Well, I mean, yes, just a point. The DSA refers to the
European Convention on Human Rights and the Charter of Fundamental
Rights of the European Union, so normally it should
protect freedom of expression for most speech. And in

(59:46):
the idea of the DSA, I don't say
that it will be used in the proper way,
but in the idea, we always have to
make a balance of interests between freedom of speech
and hate speech or harmful speech. So this is the
idea behind this regulation, and the European Convention

(01:00:11):
on Human Rights is explicitly quoted in the regulation.

Speaker 4 (01:00:20):
It's an increasingly global media environment, and balancing
the very different national cultures and national legal environments in
a non-national and increasingly global world is clearly going

(01:00:41):
to be both very difficult and carries the risk that
everything cascades down to the most restrictive country or the
most restrictive regime. How that will be managed in
a country like the US, where First Amendment values have
on the whole up to now prevailed for open debate

(01:01:04):
and for a broad freedom of expression, is, I think,
going to be a continuing challenge. And the DSA,
the Digital Services Act, is one element in a European structure
which emerges from a very different political and historical culture.

(01:01:24):
How that will all be managed, I think, is going
to be of continuing, ongoing interest in this country, both
in Congress and in the country more at large.
So I think this has been extremely enlightening
and extremely valuable. I'm grateful to all the participants in
this conversation, but I think the conversation

(01:01:48):
will continue in the Federalist Society and far beyond as
time goes on. So many thanks to all of you,
and Caroline, let me give it back to you.

Speaker 8 (01:02:00):
Great. Well, I just want to echo Professor Maimon's thanks.

Speaker 3 (01:02:03):
Thank you also for bearing with me as I had
some technical challenges. But on behalf of the Federalist Society,
I want to thank our speakers for the benefit of
their time and expertise today.

Speaker 8 (01:02:13):
Thank you also to the audience members who joined us.

Speaker 4 (01:02:15):
We greatly appreciate your participation.

Speaker 3 (01:02:17):
Check out our website, fedsoc.org, or follow us
on all major social media platforms @fedsoc to stay up to date

Speaker 8 (01:02:24):
with announcements and upcoming webinars.

Speaker 3 (01:02:26):
Thank you once more for tuning in, and with
that, you are adjourned.

Speaker 1 (01:02:30):
Thank you for listening to this episode of FedSoc Forums,
a podcast of the Federalist Society's Practice Groups. For more
information about the Federalist Society, the Practice Groups, and to
become a Federalist Society member, please visit our website at
fedsoc.org.