
July 8, 2020 65 mins

What happens to our civil liberties when an algorithm is used by law enforcement to make an arrest? Even more concerning, what happens when that facial recognition technology is racially biased? As we enter an age of ubiquitous surveillance, it's minorities - especially people of color - who are disproportionately affected. The ACLU has recently filed a complaint on behalf of a Black man who was wrongfully arrested due to faulty police facial recognition tech. It's the first case in the US, but it's unlikely to be the last because, according to the ACLU, the tech often can't tell Black people apart. The organization that has been fighting for civil rights protections for over 100 years is now calling on lawmakers nationwide to stop law enforcement use of facial recognition technology. For Susan Herman, it's an extraordinary time to be president of the ACLU. Over the years, the American Civil Liberties Union has fought for free speech, reproductive rights, and privacy. But as technology continues to muddy the waters, the tradeoffs become more complicated. Where do we draw the line between security and privacy? Herman says we must act now.

Show Notes

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot
Dot Dot Media and iHeartRadio. There's a great quote on
the ACLU website. The fact that technology now allows an
individual to carry such information in his hand does not
make the information any less worthy of the protection for
which the founders fought.

Speaker 2 (00:21):
Exactly. I like to talk about, you know, one of
the whole points of the Constitution adding the fourth Amendment,
which is the protection of privacy, is they wanted to
protect what was in Benjamin Franklin's desk. Nobody should know
if he was writing some things that were anti government,
and we now have that on our cell phone, so
of course, but that's where I think that a lot
of the protection of civil liberties is applying our fundamental

(00:43):
principles in different circumstances.

Speaker 1 (00:56):
We are in a moment of reckoning as we enter
an age of ubiquitous surveillance, questionable data collection practices, even
algorithms that discriminate. It's minorities, especially black and brown communities,
that are disproportionately affected. Over the last month, the
nation has grappled with the conversation around police brutality. We've

(01:17):
seen Predator drones used for aerial surveillance at protests, facial
recognition technology that wrongfully accused a black man of a
crime he didn't commit, and it wasn't a coincidence. Reports
say the tech is one hundred times more likely to
misidentify African American and Asian people, and as COVID nineteen
continues to spread, there are serious questions being raised about

(01:41):
contact tracing apps and how that data collected could be misused.
These issues raise ethical questions about technology and its impact
on our civil liberties, equality, and the future of our country.
For Susan Herman, it is an extraordinary time to be
sitting in her seat as president of the ACLU. Over

(02:02):
the years, the American Civil Liberties Union has filed lawsuits
fighting for free speech, reproductive rights, and privacy. But as
technology continues to muddy the waters, the trade offs become
more complicated. Where do we draw the line between security
and privacy and how do we prevent technological innovation from
outpacing the law. I'm Laurie Segall and this is First Contact. Susan.

(02:29):
Thank you for being virtually with me today.

Speaker 2 (02:32):
Well, thank you for inviting me, Laurie.

Speaker 1 (02:34):
Yeah, you know, I always start out these interviews with
our first contact. I talk to guests about how we met,
and we don't really have a first contact. We've never
met in person, but we met on an email chain
because we were going to do an interview together for
something else and it fell through. So I said, you
got to come on the podcast because you are just
sitting in such an extraordinary seat at such an extraordinary

(02:58):
moment in time. It's our first contact.

Speaker 2 (03:01):
Well thanks. It just seems to me like our first
contact is total serendipity.

Speaker 1 (03:05):
Yeah, exactly, So you know, to get started. You've been
the president of the ACLU since two thousand and eight,
and I said this before, but you know what, in
extraordinary time to be sitting in your seat, you know,
how are you feeling?

Speaker 2 (03:19):
Oh my, it's just sort of overwhelming. You know, as president,
I'm essentially chair of the board, so you know, I'm
not the one doing the day to day work as
all of the members of our staff are. But to
be a member of the ACLU staff right now is
just it's mind boggling. Because we had a lot of
work that we were already doing before twenty sixteen, with
all of the states making worse and worse laws about

(03:41):
reproductive freedom and voting rights and immigrants' rights, and you know,
all sorts of other things. Then came the election, and
since then we have brought one hundred and seventy three
legal actions against the Trump administration for things like family
separations and the travel ban and prohibiting trans people from serving in the military.
Then in March, COVID hit, and at that point, since then,

(04:02):
we've also brought over a hundred lawsuits, including
lawsuits just about people who are incarcerated in jails
and prisons and ICE detention and who are just in
a hotspot. You know, they have no control over whether
they can social distance, and so we've been working very
hard to get vulnerable people out of those terrible situations,

(04:23):
basically death traps. Plus the COVID also led to a
number of states opportunistically restricting things like freedom of abortion,
declaring abortion to be a non-essential procedure so people
could just wait until the pandemic is over to get
an abortion, right. And voting rights has also just been
a really fraught area right now because all the restrictions

(04:45):
on voting and the ways in which the vote was
becoming distorted have just been magnified by all the difficulties.
So there's a lot to talk about.

Speaker 1 (04:54):
So I was about to say, what I'm hearing from
you is you're sleeping really well at night, you know,
nothing to do?

Speaker 2 (05:01):
Yeah, almost nothing to do. The staff, they're just
sitting around polishing their nails.

Speaker 1 (05:05):
Yeah, So I mean, like take me to March, like
coronavirus hits. You have been involved in some of these
monumental cases that have just shaped society in our civil liberties,
like coronavirus hits, and now you know, we have a
little bit I don't even think we have the luxury
of perspective at this point, but we have a little
bit more perspective. But like take me to March, Like

(05:26):
in your role at this extraordinary moment, like what was
going through your head? What were you concerned about at
the time.

Speaker 2 (05:34):
Well, you know, one of the first concerns is just
you have to close the office. So the first concern
is how can people do all this from home? It increases
the work and makes it more difficult to do the work.
So we just had to, you know, really make sure
that our technology was up to doing things. So one
thing that the ACLU did was to buy new
laptops for some staff people who were going to be
working from home, and you have to worry about how the

(05:56):
technology is working, which has been a question for us
every time something really big hits. When the travel
ban hit, there were so many people wanting to donate
to the ACLU that our website crashed. So even things
like that, you know, that's you know, like number one
of how do you handle this? We have been fortunate
so far that the ACLU is so well managed and

(06:18):
we had not spent every penny that all of our
donors had given us up until that point, so we
have not had to lay people off, which is very
fortunate because, as you're saying, there's more than enough work
to do. But yeah, that's the first concern of just
how do you keep the organization up to speed and
ready to do, you know, what staff members now need
to be doing: an incredible amount more work, but for

(06:38):
some of them it's while they're juggling a toddler and
a dog.

Speaker 1 (06:42):
Yeah, can you give me a quick run through of
some of the cases that you've been involved in? And
correct me if I'm wrong. You started out as an intern,
right and really just worked your way up. I mean,
I can imagine you've been involved, and I know you've
been involved in some pretty extraordinary cases. To give listeners
some context, can you explain some of the cases that kind
of stick out to you?

Speaker 2 (07:03):
Well, I was an intern for the ACLU back, you know,
in the nineteen seventies, you know, around the time when
I was in law school. And just to make sure
that everybody understands, I don't actually work at the ACLU.
My day job is I'm a law professor, and I
don't generally work on the cases. What I'm generally doing
is we run the organization. But I'll tell you I think,

(07:24):
you know, it would be interesting to start with the
first ACLU case that I actually did work on, which
was while I was a law student, and this was
the case. One of my connections with the ACLU originally
was that one of my law professors in the first
year was connected with the New York Civil Liberties Union
and he had some clients who came to him who
were graduate students at Stony Brook on Long Island, and they

(07:45):
had just discovered they were not allowed to live together.
They had rented a house together. There were six of
them and they had just discovered they weren't allowed to
live together because there was an ordinance in their village,
a village called Belle Terre, that prohibited more than two persons unrelated
by blood, marriage or adoption from living together. So, you know,
they were pretty shocked. And it turned out that under

(08:06):
the law as it was at the time, by the time
they were talking about this, they were liable for all
sorts of criminal fines and punishment. It was really
very heavy stuff. So I started working on that case
with my law professor, and we went to a federal
judge to ask for a temporary restraining order, which means
to just get an order that, until we had litigated whether or not

(08:29):
that was a constitutional thing to do, to tell people
who they could and couldn't live with, the village
should not be allowed to either kick them out of
their house or to, you know, start locking them
up because they owed too many fines for having been
illegal residents. So the judge ended up signing the order. And then

(08:50):
one of the ways, actually the original way,
in which our clients had discovered that they were illegal
residents was that they applied for a residents only beach
permit and they were told they couldn't have one because
they were illegal residents. So the judge who we had,
the district judge, who was a very nice man, looked
at the order we had written out and he said, well,
you know, it's the summer. Don't your clients want to
go to the beach while the litigation is pending? Do

(09:12):
you mind if I write that in that they have
to be allowed to park in the parking lot of
the beach. So we said, sure, that's very nice,
so he wrote that in. Then, as the junior member
of the team, I was sent out to explain to
our clients, to show them the order and explain to
them what was going on. And they gave me a
tour of what the village looked like, and the
residents only beach. And the minute the wheels of their

(09:33):
car hit the parking lot, this very large, fierce
looking man comes striding across and says, what are you
doing here? You're not allowed to be in this parking lot.
And they all look at me and I'm thinking, what
am I? I'm like, you know, twenty something, I'm not
very tall, but what am I supposed to do about
this large man who doesn't want us in his parking lot?
And then I remember that I had a federal court
order right on my person, so I drew myself up,

(09:55):
and I showed him my federal court order and I said, well,
I'm with the New York Civil Liberties Union, kind of.
I have a federal court order saying that these people
are allowed to be in this parking lot and go
to the beach. And he melted. And that was I
think one of the points at which I thought, Wow,
you know, this is really powerful stuff. Yeah, so

(10:15):
that was my first ACLU case. Yeah, exactly, that's great.

Speaker 1 (10:19):
And I saw I read that some of your earliest
memories of speaking up to authority involved I think a
dispute over a book at your school library.

Speaker 2 (10:28):
Yeah, that's right. Even before the Belle Terre case. My
first civil liberties hero was my mother. So when I
was in third grade, we were doing a school play
about a story called Johnny Tremain, about a boy in
the American Revolution, and I thought the play was interesting.
Plays don't have that many words, and we were told
that this was based on the book. So I went
to my school library, my public school library, and I

(10:50):
asked to take out the book, and the librarian said, oh,
you can't take out that book, dear, that's in the
boys' section. And I was surprised to find this out.
I'd been reading books in the girls' section, which were
all collections of fairy tales and biographies of presidents' wives,
but it had never occurred to me that I wasn't
allowed to take out a book from the boys' section.
So I went home and I told my mother about this.

(11:12):
I was just thinking, you know, that's the way things are,
and she just exploded and she called the librarian the
next day and said, how dare you tell my daughter,
you know, what she's not allowed to read. So the
librarian told me that from then on, I could take
out any book I wanted. And you know, not long
after that, they changed the policy for everyone. So, you know,
there was another example of how you know you can

(11:33):
kind of speak up to authority when they kind of
tell you who to be and prevent you from making
your own choices.

Speaker 1 (11:38):
Were you always like that?

Speaker 2 (11:40):
Well, you know, that's third grade, and I feel like, yes,
I think for most of us, our values form when
we're pretty young. Yeah, so you know, seeing my mother
do that I'm sure would have had an impact on me.

Speaker 1 (11:52):
Yeah, that's such a good story. And did you I mean,
did you always know you wanted to go into law? Uh?

Speaker 2 (11:57):
No, I actually really didn't, because, you know, having grown up
as a woman during that era. My father was a lawyer,
and he always used to talk about the fact that
law was really not a good profession for women.
Why would you want to do that if you could
be an English teacher and have the summer off to
take care of your children? So it took me a while.
I graduated from college and then spent a few years
doing other things, and then decided to go to law school.

Speaker 1 (12:19):
Well, I mean, it's so interesting and now kind
of seeing where you're at and seeing this moment, it
does feel like a moment. And I was looking at
something you said about you know, this feels like a moment.
We can be optimistic because so many Americans are beginning
to really understand the scope and the depth of structural racism.
It certainly feels you know, I'm based in New York City.

(12:41):
You can just feel it right on the streets with
the protests, and you hear the sirens and the helicopters.
You know, as we sit here and we hear, you know,
your rich history in covering and caring about these issues,
what is the challenge for you guys ahead.

Speaker 2 (12:59):
Well, you know, the challenge on that particular subject is
that this is work that we'd already been doing. One
of our top priorities for the past several years has
been trying to break our addiction to mass incarceration, which,
as everybody is now really coming to terms with,
is a system that has disproportionately affected people
on the basis of race and income and disability. A

(13:22):
quarter of the people who are arrested are people who
are mentally ill, and our feeling is that the system
has been fundamentally broken and misguided for a long time.
So part of what we're trying to do with this
moment is to capitalize on the fact that people want
to look at what the police do. We're trying to
encourage people to look beyond the police. It's not just
who are the police arresting and how are they treating

(13:43):
the people they arrest. I think behind that is the
question of what do we really want to treat as
a crime. So when you treat all sorts of very
minor misconduct as a crime, you're really setting up a
situation where there are going to be more contacts and
therefore potentially more arbitrary and discriminatory contacts. So, if
you think about it, Eric Garner ended up dying because

(14:06):
he was selling single cigarettes on which the tax had
not been paid. George Floyd. The basis for that encounter
was that they thought he might be passing a counterfeit
twenty dollars bill. So I think that if you look
at why are we criminalizing some of the things we criminalize,
especially if you're talking about people who are mentally ill
and are having problems. Do we really want the police

(14:28):
to be the people who are the first responders to
people who are having a mental health crisis? Or is
there some more effective way to deal with that that
would avoid putting those people into the criminal justice system,
which isn't really good for anyone, and to maybe recommit,
reallocate some of the resources we're using on arresting people
and locking them up to actually dealing with the mental

(14:51):
health crisis, to have mental health treatment. So instead of
viewing all dysfunction as a matter of policing,
why don't we spend more time reinvesting to
try to prevent more dysfunction? It's sort of like the
old saying, you know, if you're a hammer, everything looks
like a nail. Well, you know, not every problem in
our society is a problem for the criminal justice system,

(15:11):
and an occasion to arrest people and lock them up,
a lot of them, really should be an occasion for,
you know, thinking about public health treatment, or thinking about
how we want to approach homelessness, and to have a
lot of much deeper thought about how you prevent dysfunction
rather than answering everything with we're going to send
in the police.

Speaker 1 (15:28):
It certainly seems also like this moment, even coming out
of the pandemic, I can only imagine the mental health
crisis is going to be even worse.

Speaker 2 (15:36):
Yeah, that could well be. And I think the pandemic
is also showing us. Somebody asked me the other day
whether the protests over policing and police brutality are related
to the pandemic, and I was in a webinar, and
one of the smart people in the room said, oh no, no,
they're two entirely different things. And I said, what do
you mean? It's the same people who are being disproportionately

(15:57):
affected by policing and police brutality who are the people
being disproportionately affected by COVID. The statistics show that
people of color are much more likely to die, and
there are a lot of reasons for that, you having
to do with underlying health and having to do with
the fact that minorities and people who are not affluent
don't get to work from home, they don't get to

(16:18):
work through Zoom. They're the people who are out
there on the streets, being the first responders, being the
people who are picking up the garbage, being the people
who are stocking the supermarket shelves. And I feel like
the virus is really amplifying so many of the inequities
we've had in our society, and I think especially you know,
I don't know what it is like for everyone else,
but I live in Brooklyn and in New York City.

(16:40):
It really felt like a lot of the people who
were out on the street they were out on the
street because they were upset about George Floyd. But I
think it was more that they recognized that George Floyd
was the tip of the iceberg and that there was
just a lot going on that they really could not
tolerate any longer.

Speaker 1 (17:00):
More from Susan after the break, and make sure to
subscribe to First Contact on Apple Podcasts or wherever you
listen so you don't miss an episode. Putting on the

(17:25):
tech hat. You know, I think most people probably don't
think of tech when they think of the ACLU, but
there's quite a bit of litigation in regards to security
and privacy issues around contact tracing, surveillance, algorithmic bias, and
obviously the ACLU has a hand in checks and balances
and a lot of the issues that are emerging from
the pandemic. You know, what are some of the tech

(17:47):
developments that you guys are most concerned about.

Speaker 2 (17:51):
Well, since you were mentioning COVID and the contact
tracking and tracing, I'll start with that. So the upshot
is that we are neither for nor against contact tracing.
If contact tracing is something that really will contribute to
public health, our concern is not to say no, you
can't do it, or yes, go right ahead and do
whatever you want. What we're concerned about is to minimize

(18:12):
the damage to privacy, the damage to equity. Again, there
are a lot of concerns that we have. The other
thing that we're concerned about is discrimination, again, because there
are ways in which the technology could also increase pre
existing social inequities. We think that people should not be
coerced into participating in testing. We think it should be voluntary,

(18:35):
and we also think that it should be nonpunitive, because
if you start having the criminal justice system enforcing whether
or not people are willing to use their phone to
take a test or whatever it is, you are just
creating more opportunities for police interactions that will at some
point be arbitrary or discriminatory. So we don't want to

(18:58):
see rules and regulations, even if they really are good public health rules, we
don't want to see those become occasions for filling up
the jails with the people who aren't complying, because we've
already seen there were some statistics in New York that
when you ask the police to start enforcing who's wearing
a mask and who's not wearing a mask, that right

(19:20):
away it was racially disproportionate in terms of who they
were questioning and who they weren't questioning. So I think
there are just a lot of issues there, which is
very much up your alley because they're very much ethical issues.

Speaker 1 (19:32):
Yeah, you know, one of the one of the cases
that I'm fascinated by, and I you know, I honestly
I felt like it was just it was only a
matter of time until we saw this headline. And then
we saw the headline. You know, a man was arrested
after an algorithm wrongfully identified him. You know, I've been
covering for so many years. AI is biased. AI is

(19:52):
trained on you know, on data online, which can be
very racist, you know, and I think for so many
years we've been having this conversation, but the question of okay, well,
what happens when it gets into the hands of the police,
what happens, you know, if it's used for policing.
And so I think it's such a fascinating case. And
you guys at the ACLU filed an administrative complaint with Detroit's

(20:16):
police department over what you guys are calling the country's
first known wrongful arrest involving facial recognition technology. I mean,
for context, a man was arrested because he was wrongfully
identified by an algorithm. The police department thought he had
robbed, I believe, stolen watches, and he was arrested.

(20:37):
I mean, can you talk to me about the significance
of this case? I can't help but put on my tech
hat and scream, you guys, this is a really big deal.

Speaker 2 (20:46):
Yeah, it is a really big deal. And as you're saying, Laurie,
we were aware of this problem for a long time
and we've been complaining. So going back for a minute
before getting to the case you're talking about, Robert Williams:
the National Institute of Standards and Technology says that African American and
Asian people are up to one hundred times as likely
to be misidentified by facial recognition. So that's the background problem.

(21:09):
And so we knew that, right, you know, we knew
that before this case came up in Michigan. And it's
not the algorithm's fault. Obviously, there's something that's being put
into the algorithm that, you know, has
a bias. And I think people tend to think that
algorithms are you know, are so neutral and that we
can rely on algorithms. That's what I was saying about
the contact tracking and tracing, that you start relying

(21:32):
on algorithms or apps that you think are neutral and
you really have to be very wary of that. So again,
before getting to the Robert Williams case, an ACLU staffer
at the ACLU of Northern California had the really interesting
idea of trying out Amazon's facial recognition program, Rekognition with
a k, because, you know, they were just offering this

(21:52):
to the police or whatever. This is great, it will
help you identify and see if you have somebody who
matches a mug shot. Well, what they tried to do,
which I thought was very clever, was they tried to
match mugshots against the members of Congress. They got, you know,
the facial pictures of all the members of Congress. This
was in July of twenty eighteen, and there were twenty

(22:13):
eight members of Congress who were misidentified as matching the
mug shots. Twenty eight. There were twenty eight mistakes out
of that, and not only that, but the false
matches were disproportionately people of color. And one of the
people who was identified as matching a mug shot and
therefore, you know, probably a criminal, was civil
rights legend John Lewis, the guy who was beaten up

(22:33):
on the bridge in Selma to get us all voting rights.
So we know that almost forty percent of the false
matches there were of people of color, even though people
of color made up only twenty percent of the members
of Congress. So in some ways, you know, the Robert
Williams case is completely predictable. We knew that we allowed

(22:54):
for that to happen. It might have already happened elsewhere,
but, you know, subterraneanly, in a way that we
didn't see. But what's amazing about
the Robert Williams case is that it happened right there,
you know, visible to everybody where you can just see it.
So what happened was that they told him that he
was being arrested because
the algorithm had said that he was a match for

(23:17):
this mugshot. And they showed him the mugshot and he
said to them, do you guys think all black people
look alike? That looks nothing like me? So you know,
it was pretty clear. And if you used your eyes
and looked at the picture yourself, if you didn't trust
the algorithm, and if you looked at the picture and
this man's face, they didn't look alike, but nevertheless he
spent thirty hours in jail under some pretty miserable conditions

(23:40):
because the algorithm said it was a match. So I
think that's really important. In some ways, the fact that
you know a problem exists is not as inspiring to
make people want to do something about it as when
you see it. So that's what happened with all the
protests about George Floyd. People could watch that horrible video.
They could see it. It was recorded on the video,

(24:02):
And here we have an actual person, not just hypothetical
statistics showing, but an actual person who did get
arrested and did have a miserable time. He was arrested
in front of his family, and it was really traumatizing
and based, again, on the officers involved trusting the
science more than they were trusting their own eyes. When

(24:23):
anybody could have seen he didn't look like the
picture, right?

Speaker 1 (24:27):
And you know, he wrote an op ed in the
Washington Post, and he asked the question, He said, why
is law enforcement even allowed to use this technology when
it obviously doesn't work? So I guess asking a legal
scholar the question. You know, police departments all around the
country are using different variations of facial recognition software. So
you know, what regulations should we see as we enter

(24:49):
this era of algorithmic discrimination.

Speaker 2 (24:53):
Yeah, that's a great question. And again we've been urging,
you know, long before Robert Williams turned up, we've been
urging police departments not to rely on the facial recognition
technology. It was just not reliable enough
to hold people's fates in the hands of, well, algorithms don't
have hands, but for people's fates to be dependent on
this facial recognition technology which was being touted. And again,

(25:15):
it's great if a company is doing something to make money,
but if wanting to make money is your only consideration,
and if you're not considering whether you are unleashing something
that is really going to be disruptive of people's lives unfairly,
either because it's just going to be wrong or because
it's going to be wrong in a racially skewed way,
I think that's just really a problem. So we've been

(25:37):
urging police departments not to buy and use the technology.
And I'm sure you know, Amazon has withdrawn the facial
recognition technology temporarily and they're not sure whether or not
they'll bring it back. So the probability of wrongful arrest
is one thing, but when you draw the camera back
and look at the technology in the bigger picture. In
addition to facial recognition, one thing that police departments have

(26:01):
been doing with facial recognition and different law enforcement agencies
is to try to see who attends a demonstration or
see who's in a crowd. So it ties not into
are you, like, is somebody likely to be wrongly arrested
like Robert Williams because there was a false match.
But it starts becoming big surveillance too, that an agency

(26:23):
has the cameras on and then they have the facial
recognition and they're purporting to identify all the people in
that crowd so that then they can track those people.
They now know that you were at the George Floyd
demonstration and that person was in the anti war demonstration.
And at that point, the government starts having more and

(26:43):
more information about all of us, to the point where
it feels like instead of we're controlling the government, it's
like the government controls us. So I think the facial
recognition is only one part of the whole tendency of
technology to amplify government power to be kind of watching

(27:05):
what we do.

Speaker 1 (27:08):
Yeah, I mean, it's interesting to hear you say that.
You know, that type of technology is just a part
of it, especially when it comes to this moment where
people are out protesting police brutality, when people are out
fighting for their civil liberties. You know, there's all sorts
of technology that's being built. There are cameras being
built that can recognize people in real time that

(27:31):
police are wearing. There's all sorts of technology. This is
just the beginning of it. I know you mentioned Amazon
put a hold on their sales of recognition software. Microsoft
said it's not going to sell face recognition software to
police departments until there are federal regulations. I know IBM said
that it was going to announce a ban on general
purpose facial recognition. Is that enough? Like,
what is, I guess, you know, what is the government's
role here? Like, what do you think should happen? Especially
since this is, just, as you say, one small part
of a larger issue that we're facing as a society.

Speaker 2 (28:05):
I think that's right, and I think that there could
be government regulation, but that's not going to happen unless
the public wants to urge their representatives to start controlling this.
And what we've seen is that an enlightened public can
make something happen even without regulation, right, So it was
that the public was becoming concerned and that's the reason
why Amazon acted to withdraw this. They started being concerned

(28:28):
that their customers were not going to be happy with them.
And I think at this point that's almost more effective
than government regulation. And once you have that wake up call,
then you can start having serious debates, and I think
those debates have to take place in many places. They
should be taking place in legislatures where people can talk

(28:48):
about the trade off between privacy and mass surveillance and
whatever the government is trying to accomplish. Why do they
need this technology? Is it really worth it? Are there
crimes that they wouldn't be solving without it? And are
they crimes that we're concerned about solving or do they
fall into the category of is that something that we
don't think should be a crime at all. People are

(29:10):
generally unaware in terms of what the police do that
only four to five percent of all arrests involve crimes
of violence. So when people think about we want to
enable law enforcement to be catching criminals, or we're concerned
about divesting or defunding the police, because who's going to
protect us from physical harm? Almost none of what the
police and law enforcement do is about physical harm. It's

(29:32):
a tiny percentage. Everything else that they're doing is about
this whole array of all sorts of other things that
we criminalize. And I think that in addition to having
better conversations about is there a potential for some of
these technologies that the government is using to create arbitrary
or discriminatory enforcement, I think we need to dig deeper

(29:53):
behind that question, in the same way that you need
to dig deeper beyond the George Floyd murder and to
ask if there's something systemically wrong here? Do you need
to rethink the whole question. So when people say, well,
you know, oh, but we need the facial recognition technology
because it helps the police solve crimes, well okay, but
you know what crimes and what are the costs? So
I think once people are educated enough and once they

(30:15):
realize what the nature of the problem is kind of
what's being unleashed, they can start really being ready to
have that broader conversation. And I think it should take
place in legislatures. But I think it also should take
place and evidently is taking place in boardrooms at Amazon
and Facebook and Google and Microsoft. They should be talking,

(30:35):
and they do sometimes if the people demand it. And
it also has to take part just among people, you know,
among you know, tech communities and people just beginning to
talk about what are our responsibilities here? Is it okay
for us to create products, you know, to make money
if we know that there are dangers, that the products
are going to be misused or maybe aren't reliable enough,

(30:57):
or that they just feed into this enormous surveillance state.
So let me compare this to an earlier moment. After
nine to eleven, we had a kind of similar phenomenon
that in order to deal with catching terrorists, we changed
a lot of laws that ended up really sacrificing a
lot of privacy and allowing a lot more government surveillance.
And for a number of years that went unchallenged, and

(31:19):
people kept saying, oh, well, you know, if that's
what we need in order to be safe, we're willing
to give up a little privacy. So, first of all,
I think people didn't think about the fact that they
weren't giving up their own privacy. They were giving up
somebody else's. And second of all, people didn't realize how
extensive the surveillance really was until Edward Snowden. So then
after Edward Snowden came along and people realized how the

(31:42):
government was just scooping up tons of information about people
and just keeping it in government databases and started realizing
the horrifying potential of all that. What happened was that
Congress made a couple of little changes to the law.
But more important, Microsoft and Google and other places started
to realize that their customers were concerned, and they started

(32:03):
being a little less cooperative. At the beginning, right after
nine to eleven, all of the telecoms, all these companies
were just saying to the government, you want information, here,
take it all. You had Verizon saying, sure, you know,
here are all the records of all our customers. Take
it all. You're keeping us safe. And I think that
to me, the most important thing is an informed public.
That if people can examine for themselves whether they really

(32:25):
think that we're being kept safe by all of this,
and really examine both the costs and the benefits
in an educated way, I think we get much better discussions.
And I think not only do you have the possibility
of getting better legislation or regulation, you also have the
possibility that private companies and the tech companies are not
going to want to do it anymore because their customers

(32:46):
don't want them to. Yeah.

Speaker 1 (32:48):
I mean it's hard to have an informed public and
to have these discussions, even in this current environment to
some degree. I mean, people I think are struggling with
the idea of truth. People are, you know. And I remember,
by the way, I remember the Snowden leaks, like
I remember being in the newsroom covering technology and thinking
to myself because I rode the tech bubble all the

(33:08):
way up right, and thinking, this is an extraordinary moment
because we saw that we've been sharing all our data,
but we saw for the first time that, you know,
the government had a lot of access to things that
we had no idea they had access to. And I
think it was a fundamental shift, and the lens on
tech companies changed at that moment, and tech companies' behavior

(33:30):
has changed quite a bit after that. You know, I
wonder this moment we're sitting in where we're having these
debates about surveillance and privacy and whatnot. These are sticky
debates and they're very politicized, as we're heading into an election,
as we have misinformation spreading online, as a lot of
people don't know what to believe and what not to believe.
As the media landscape has changed, it certainly seems like

(33:53):
a harder environment to even have some of these conversations.

Speaker 2 (33:57):
Well, I think in some ways it's harder. In some ways,
I think the other thing that is a catalyst for
the discussions is realizing that there is a dimension of
race to all of this. I think, in talking about
artificial intelligence and facial recognition, not many people saw that
as an issue of structural racism. You know that there's
something wrong with how we're putting together the algorithms, and
it ends up that John Lewis is going to be

(34:18):
misidentified as somebody who matches a mugshot and that Robert
Williams is going to be arrested. So I think that
the fact that we now know that that is an
additional concern enables us to have richer conversations. So we're
not only talking about is there a trade off between
security and privacy? Plus, I think the other thing that
people are feeling much more open to is to have

(34:40):
that deeper conversation about what are our goals here and
if we're enabling all this government surveillance in order to
help the government to catch criminals, well, you know, what
do we mean by criminals? What crimes are they solving,
and how are they using, you know, how is
this actually being used, in service of what? So I feel
like in some ways, you know, with the election coming up,

(35:01):
I think that gives people more impetus to want to
talk about these issues, because elections aren't only about the president.
They're also about local prosecutors and sheriffs and the people
who make the decisions about whether to buy surveillance equipment
and what they're going to do with their authority over
the criminal justice system. So one thing the ACLU has

(35:21):
been doing, in addition to everything else, is we've been
very involved in elections of prosecutors because that's a place
where people almost never used to pay attention to who
were these people running and maybe they would vote for
somebody without really knowing what they voted for. So what
we're urging, and I think this is very much what
we're talking about, about having an educated public. We're urging

(35:42):
people to go to elections or to go to debates,
to go to campaign events, attend, I guess on Zoom
these days, to attend campaign events and ask the candidates questions,
what would be your policy about whether or not you're
going to accept military equipment from the federal government in
your police department? Are you going to buy tanks? Are
you going to buy these horrible weapons that are used?

(36:05):
Is that something you would do? Are you going to
buy facial recognition software? Is that how you would use
your power if we elect you? Ask the prosecutors whether they
would support a reduction in cash bail and
increased alternatives to incarceration. So that's a place where, without
waiting for the government to do something, we can ourselves

(36:26):
affect what's happening in our communities by encouraging candidates to
think about what positions they're taking on these different issues
and letting them know that they're going to lose votes.
The more people are educated, the
more they can tell candidates that they'll lose votes, and
this is something that's worked in some
places to encourage candidates to take a better position. Yeah,

(36:49):
they might never have thought of that, but once
they commit themselves, you know, that's going to be better. So
there are all sorts of ways that we can affect things.

Speaker 1 (37:00):
More from Susan after the break, and make sure you
sign up for our newsletter at dot dot dot media
dot com slash newsletter. We'll be launching this summer. Before

(37:23):
I move on from specifically some of the tech issues,
I have to bring up Predator drones.

Speaker 2 (37:29):
Yeah, right right.

Speaker 1 (37:31):
You know, the US Customs and Border Protection flew a
large Predator drone over the Minneapolis protests. You know, people were
protesting police brutality and the killing of George Floyd, and
for many reasons, it almost felt symbolic. You know, it
was raising all these questions about aerial surveillance, about what
data was being collected, where was this going. What is

(37:52):
your take on this?

Speaker 2 (37:54):
Well, you know, as you're saying, Laurie, it
really magnifies the opportunity to gather more
information because you don't even have to have the helicopters
or whatever. So, you know, that of course is
a concern just to know how much information is the
government gathering, what are they going to do with it,
who's going to have access to it, will it ever
be deleted? Or will it just stay there
in the government databases forever? But I think the other

(38:16):
thing that the Predator drone brings to mind is a
question that people were also asking, which is about the
militarization of law enforcement. We have had for years in
this country, a Posse Comitatus Act as it's called, which says
you don't want the military doing everyday law enforcement, because
that's not our country. You know, we don't want

(38:37):
the military to be quote dominating the streets, and we
don't want the people who are out protesting to
be considered the enemy of the United States. They're people
who are expressing their opinion. And so the whole idea
of, you know, it's one thing, it's enough, if the
police helicopters are flying overhead and trying to keep
track of, you know, who's in the crowd and what

(38:58):
the crowd is doing. But once you start adding an
element of the military helicopters or the military drones or
things that feel like we are being treated as the
enemy of the government instead of the people who are
the government, who are supposed to be controlling the government,
I think that's just it's a very bad paradigm.

Speaker 1 (39:19):
You think it's a slippery slope?

Speaker 2 (39:21):
Well, it's a slippery slope unless we stop the slipping. And as
we saw with Amazon and the facial recognition,
if people say, wait a minute, yeah, I think we
can stop that stuff. But I think if people don't
pay attention, I think we have a very slippery slope.
And that's what I've been saying about most of the
issues we've talked about, starting with the contact tracing
and the surveillance and everything else. It seems to me

(39:42):
that what's really important is transparency. We should know what
the government is doing, and accountability. Back on the issue
of contact tracing, one thing that the ACLU has done,
together with the ACLU of Massachusetts, is we have filed
a lawsuit, actually a records request, demanding that the government, including
the CDC, release information about the possible uses of all

(40:04):
the location data that they would be collecting in connection
with contact tracing, because you know, once if you don't
know what they're doing, then you can't have a discussion
about what they should be doing. And one reason why
I was bringing up all the post nine to eleven
changes of law is that I think that the whole
idea that we can't know what the government is doing.
The government has to act in secret in order to

(40:25):
keep us safe, or else the enemy will be able
to know what they're doing and you know, and work
around it. But the government can know everything that we're doing.
I think that just has democracy backwards. We have to
be able to know what's happening inside the government. And
that applies to why are they sending the Predator drone?
What are they going to do with the information? What
does this mean? Are they going to do it again?

(40:46):
And it also has to do with the contact tracking
and tracing. Once they get that data, what happens to it?
You know, are they going to erase it ever? You know,
who do they share it with? What are they going
to do with it? And I feel, you know, those
are really important issues in a democracy that we just
have the right to know what the government is doing
so that we can talk about it. And I feel
like to sort of say, well, this is what the

(41:07):
government is doing and that's really bad, and that upsets me.
I think that kind of misses the point. If the
government is doing something bad, then it is the duty
of every American to find out what they're doing. And
to push back. And so at the ACLU, we have
a program that we call people Power. We first invented
that and used it to explain to cities and localities

(41:29):
all over the country about how they could fight back
against draconian immigration rules by becoming quote sanctuary cities, and what
their rights actually were. We then used it for voting rights.
We're about to use it some more for voting rights.
But what we urge, and I hope that,
you know, some of your listeners will go to the
ACLU website and see, is what People Power is doing
in addition to what the ACLU is doing. Because what

(41:52):
the ACLU is doing, that's all the staffers at
home trying to, you know, work on their new laptops
while they're trying to, you know, keep their toddlers quiet.
But People Power is about what every single person can, and I
think should, be doing. You know, if people really educate
themselves and think about the ethical issues, the costs and
benefits of all this technology in addition to a lot

(42:12):
of other things going on, I think we get a
lot better results if people pay attention.

Speaker 1 (42:16):
Yeah, I mean it's interesting to watch the ACLU
take on issues like surveillance and facial recognition. I know the
ACLU filed a lawsuit against Clearview AI, which was
this very controversial company that was using biometric data. I
think facial recognition technology helped them collect something like three
billion face prints and they were giving access to private companies,
wealthy individuals, federal, state, and local law enforcement agencies, and

(42:40):
you know, coming from the tech space, it certainly feels
like sometimes these stories, you just don't know what these
companies are doing until you start peeling back the layers
and seeing, well, the data went to here and here,
and why did it go there? And why wasn't this disclosed?
And oftentimes it takes the watchdog to really understand and

(43:00):
where some of this can go wrong and how it's
being used in ways that can be dangerous.

Speaker 2 (43:08):
Yeah, I think that's exactly right. And that's why I
was saying before about our concern. Before everybody jumps on
the bandwagon about let's have more contact tracing, and then,
you know, like everybody should just be sharing all this information,
I think we have to get a watchdog. Yeah, you're
not going to have the watchdog telling you things unless
you build a watchdog into the system. And if everything
is just you know, a company who's invented this and

(43:29):
is selling it to the police, or a company who's
invented this and now we're all going to buy it.
If you just leave out any sort of oversight, then
you really have a tremendous potential problem.

Speaker 1 (43:38):
Are there any other examples of tech that we're not
thinking about the unintended consequences for our rights or privacy yet?

Speaker 2 (43:45):
Oh? Well, you know, AI is really big altogether, as
you're saying, across many different kinds of issues. I was
just actually, this is not tangential to your question, but
you were asking me before about cases that I had
worked on, and there was another case that I worked
on that was about tech, where I wrote the ACLU's
brief in the Supreme Court. It was an amicus brief.

(44:05):
It wasn't about our client, but it was a case
called Riley versus California. And what the police were saying
there most law enforcement places, the federal government as well
as the state of California and many other jurisdictions, was
that when you arrest somebody, the police get to do
what is called a search incident to arrest, so they
get to see what you have in your pocket. Makes
some sense, right? You know, if you have a gun

(44:26):
in your pocket, that's a problem or you know whatever.
So they get to do a search incident to arrest.
And the law had been that if they find something
in your pocket, it's like, that's a container. They
can search inside the container to see if there's anything
in it that could be harmful. And in fact, there
was one situation where they opened up a cigarette package
that somebody had and they you know, they could find

(44:47):
a razor blade, they could find, you know, a marijuana cigarette, whatever.
So that was law where the Supreme Court said, yes,
you're allowed to search people and search the containers that
are on them. Well, what law enforcement said was, your
cell phone is a container. When we arrest you, we
can search your cell phone. It's a container. We have
the right to search incident to arrest. And so
we wrote a brief saying no, it's not you know,

(45:08):
it's a container, but it's a container that essentially is
your home, it's your library, it's your desk. So allowing
the police to look in your cell phone when they
only had really very feeble and very unlikely scenarios, things
that just wouldn't happen too often for what the need was.
You know, maybe you had some remote thing that would
go off and would blow something up. You know, oh

(45:28):
come on. But yeah, there were other ways to deal
with a lot of that, and so the Supreme Court
actually agreed with that. They said, yeah, you know, this
is really it's just a technological way of finding out
what's in all your papers and books and records. It
used to be they were in your desk, and now
they're in your cell phone. So that, to me, it's
sort of a whole thread of what we've been talking about.

(45:49):
But the challenges to civil liberties are different and in
some ways greater when the technology builds up.

Speaker 1 (45:57):
Yeah, there's a great quote on the ACLU website.
The fact that technology now allows an individual to carry
such information in his hand does not make the information
any less worthy of the protection for which the founders
fought, from US Supreme Court Chief Justice John Roberts.

Speaker 2 (46:12):
Exactly, I like to talk about, you know, one of
the whole points of the Constitution, adding the Fourth Amendment,
which is the protection of privacy, is they wanted to
protect what was in Benjamin Franklin's desk. Nobody should know
if he was writing things that were anti government, and
we now have that on our cell phone, so of course,
but that's where I think that a lot of the
protection of civil liberties is applying our fundamental principles in

(46:35):
different circumstances.

Speaker 1 (46:37):
Taking a gigantic step back, what do you think is
the biggest threat to civil liberties in the new World Order?

Speaker 2 (46:44):
In the New World Order? Well, you know, it's hard
to just select one. It's sort of like Sophie's choice, you know,
which is your favorite child? Right now, I think one
of our very top priorities, and I did mention, mass
incarceration is a big one because so many people's lives
are just being totally disrupted, their families often. The question
really has to be, for what? One thing that we're
hoping is that the work we've been doing around trying

(47:06):
to get vulnerable people released from prison so that they
won't get the virus and get seriously ill or possibly die
is we're hoping that once jurisdictions see that they were
able to release thousands of people from prisons and jails
and that it's not going to cause a spike in
the crime rate, it really is a pretty safe thing
to do. We're hoping that that's going to stick and

(47:28):
that in the long run we'll be able to rethink, well, did
we really need to put all these people in prison
and jail to start with? What are we doing
with the criminal justice system? So that's really big. But
the other thing that I think is really big right
now is voting rights. I alluded to this at the
beginning of our conversation, but the premise of democracy is
that the people get to decide on who should be

(47:49):
running the government and who should be making the policy
about all these things we're talking about here. What are
the regulations about technology, what are the regulations about your
reproductive freedom? Everything else? LGBT rights? And if the people's
vote is distorted, that's the real problem, that people can't vote.
So we have litigation going on right now in I

(48:11):
think it's like thirty different states trying to get people
the opportunity to vote. So one of the things that
has happened, in addition to all the ways that incumbents had
been using to try to protect their own seats, is
that the virus has really made it dangerous for people
to vote in public places. So we saw the election

(48:33):
in Wisconsin where people were just lined up for, you know,
tremendous distances, waiting for a really long time to
vote because Wisconsin would not allow them to submit absentee ballots.
And in fact, a study showed afterwards that at least
seventeen people got the virus from voting. Many many polling
places were closed because, first of all, the

(48:53):
poll workers are generally elderly people, and the poll workers
were not able and willing to man the polling places.
There are a number of states that don't allow absentee
ballots at all unless you have a particular situation, like
if you're disabled, and the states are saying, oh, well,
you know, the fear of the virus or getting ill,
that's not a disability. Or before you get an absentee ballot,
you have to have it notarized, you have to have witnesses. Now,

(49:15):
how is all this going to happen? So it's very
concerning that people are going to have to choose between
their health and their right to vote, and we don't
think that that should happen. And that's something that has
to be attended to right now, because if states don't
come up with plans for trying to enable everyone who
wants to vote to be able to vote, and for

(49:36):
counting absentee ballots and for administering this program. If you
don't come up right now with the plan and the resources,
a lot of people are going to be left out,
and they're going to find that either, you know, they
can't vote because they're afraid to go out to the polls,
or their vote is not going to be adequately counted.
So I think that right now making democracy work is
really one of our top projects.
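
To make the patchwork of rules described here concrete, here is a minimal sketch of state absentee-ballot requirements as a small data model. It is purely illustrative; the AbsenteeRules and can_request_absentee names, and the example state, are hypothetical assumptions, not drawn from any real statute or dataset.

```python
# Hypothetical sketch of the state-by-state absentee rules described above.
from dataclasses import dataclass

@dataclass
class AbsenteeRules:
    excuse_required: bool          # must cite a qualifying reason, e.g. disability
    fear_of_virus_qualifies: bool  # whether fear of infection counts as an excuse
    notarization_required: bool    # ballot must be notarized before submission
    witnesses_required: int        # number of witness signatures needed

def can_request_absentee(rules: AbsenteeRules, has_excuse: bool, fears_virus: bool) -> bool:
    # A voter qualifies if no excuse is needed, if they have a qualifying
    # excuse, or if the state accepts fear of the virus as one.
    if not rules.excuse_required:
        return True
    return has_excuse or (fears_virus and rules.fear_of_virus_qualifies)

# A strict state of the kind described in the interview (invented values).
strict_state = AbsenteeRules(
    excuse_required=True,
    fear_of_virus_qualifies=False,
    notarization_required=True,
    witnesses_required=2,
)

print(can_request_absentee(strict_state, has_excuse=False, fears_virus=True))  # False
```

Even this toy model shows the point being made: in a state configured this way, a healthy voter who fears the virus has no absentee option and must choose between health and the ballot.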

Speaker 1 (49:56):
What is the solution to some of these problems? What
are your tangible solutions?

Speaker 2 (50:00):
Well, one tangible solution is that more states have to
make absentee balloting available to people without having all these
conditions and, you know, obstacles. The other solution: you
were talking before about truth. A lot of the reason
that's given, the very thin veneer of justification that's given
for we don't want absentee ballots, or we need voter

(50:21):
ID, people to carry government-approved voter ID, which
means you have to go down to a governmental office
in person and get your voter ID and show it at
the polls. The excuse for a lot of this is
that there could be fraud. Well, studies have shown that
there's virtually no voter fraud; it's a real unicorn.
And again, I think if people understood that: it might

(50:43):
sound good, but it's not true. I think truth is
another thing that we're really fighting for these days. Can
you listen to the evidence? Can you listen to the
public health officials? Can you listen to what's real?

Speaker 1 (50:54):
I know for a fact that tech companies are very
concerned about voter suppression, you know, and misinformation spread online,
this idea of countering truth around a lot of these
very important initiatives, whether it's absentee ballots, whether it's showing
up to the polls, all that kind of thing. You know,
I'd be curious to know your take. There's a current
battle happening right now. You have seven hundred and fifty

(51:14):
advertisers boycotting Facebook asking for better policing of hateful content.
Are social media companies doing enough to police harmful content?
Especially as we head into an election where voter suppression
and the spread of misinformation will most certainly be a
tactic used to manipulate voters.

Speaker 2 (51:31):
Well, let me actually break your question down into two
different parts, because you started with the concern about
voter suppression. I think one thing that everybody should be
doing is to increase awareness of what is a fair
way to improve access to the ballot for everybody. And
some of those things are tech solutions. We've had tech
solutions for years that are available and not widely enough

(51:52):
used for how to enable differently abled people to vote.
Can blind people vote? Do they have the technology?
So there are a lot of areas where we need
the tech community, and we need everybody, to find out
how you vote, to find out how voting can be
made easier, and to let people know what the rules
for voting are where they live. So one thing the
ACLU is doing is we have on our website

(52:13):
Know Your Rights: know what your voting regulations are. And
that's something that I think people really have to start
thinking a lot about and to let all their communities,
all their friends and family know about the importance of
voting and what they have to do to vote, and
to urge them to just get out and vote in
whatever form that's going to take. So I think that's
really important. In terms of disinformation on social media, people

(52:38):
talk about the First Amendment and whether there's a First
Amendment problem with Facebook telling you what you can't say. Well,
there isn't, because the First Amendment only applies to the government,
so you don't have a First Amendment right to say
whatever you want on Facebook. However, I have to say
that we don't regard it as an altogether simple
issue, whether Facebook should be telling everybody what they

(53:00):
can't say, because even though the First Amendment does not
apply to private companies, there's still a tremendous value to
free speech. And there are a number of examples, which,
you know, we've come across, of people who
have had speech suppressed for bad reasons. I'll give you one example.
There was a woman, an African American woman, who

(53:20):
posted something on Twitter and she got all these horrible
racist responses, and she posted a screenshot of the responses
that she got to show people what she was up against.
And Twitter took it down because it included racist words,
which, you know, okay, kind of misses the point.
There was another example. An ACLU lawyer wrote about a statue in

(53:42):
Kansas. It was a topless statue, a woman who
was bare-breasted, and so whatever the locality was
in Kansas decided to take it down because, you know,
they considered that to be inappropriate. So the ACLU
lawyer who was challenging whether or not the, I think
it was the city, could take it down, posted a
picture of the statue, and it wasn't Twitter,

(54:03):
it was, I think, Facebook, and that was taken down on
the ground that it was obscene, so she couldn't post
the picture of what she wanted to show. So we
think that social media control is really a two-edged sword.
What I liked is, at one point Facebook had a
protocol about, you know, what's true and what isn't true. And
what they did was they gave you a flag. So
if they were concerned that something that was said wasn't true,

(54:25):
they would have a neutral fact-checker check it, and
then, if it didn't check out, they would put
a little flag over it and say this has been questioned,
and you could click on the flag and you could
see why it was questioned. But they didn't just take
it down. So, you know, I agree that, you know,
disinformation is a tremendous problem, but the
idea that the solution is to ask the tech companies to

(54:46):
decide what we should and shouldn't see? Yeah, I don't
think that's so great either, and certainly they should not
be doing it without a lot of transparency and accountability.
If they're going to be taking things down, they should
tell us what their protocols are, and you know, there
should be more public discussion about where the balance is there.
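
The flag-instead-of-remove protocol described here can be sketched in a few lines of code. This is a minimal illustration under stated assumptions, not any platform's actual system; Post, fact_check, and moderate are hypothetical names, and the fact-check table is invented.

```python
# Minimal sketch of the flag-instead-of-remove moderation flow described above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    text: str
    flag: Optional[str] = None  # set when a fact-checker questions the post

def fact_check(text: str) -> Optional[str]:
    # Stand-in for the neutral fact-checker; returns an explanation
    # when a claim is questioned, otherwise None.
    questioned = {
        "there is widespread voter fraud":
            "Questioned: studies find voter fraud to be vanishingly rare.",
    }
    return questioned.get(text.strip().lower())

def moderate(post: Post) -> Post:
    # Annotate questioned speech with context; never delete it.
    post.flag = fact_check(post.text)
    return post

post = moderate(Post("There is widespread voter fraud"))
print(post.text)  # the speech itself stays up
print(post.flag)  # readers can click through to see why it was questioned
```

The design choice the sketch tries to capture is the one endorsed in the interview: the post is annotated rather than taken down, so the "more speech" answer stays available and the platform's reasoning is visible.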

Speaker 1 (55:04):
Yeah, it certainly seems like the protocols change quite a bit,
especially having covered tech for this many years. It certainly
seems like Facebook changes them, Twitter changes them, and oftentimes
it depends on public pressure. I'm curious to see what
happens with all these advertisers boycotting. I think personally, I
have a feeling it won't impact the bottom line much
and they'll go back to business as normal. But who knows.
You know, I do know that Zuckerberg cares deeply about

(55:27):
his employees, but they've been kind of up against,
you know, public scrutiny for a very long time. But
it certainly is interesting, especially when the stakes get higher
and disinformation can go further, and especially as we get
closer to an election, it certainly feels like everyone feels
more triggered around it.

Speaker 2 (55:46):
Yeah. Well, you know, one of the classic statements
about the First Amendment is that in the marketplace of
ideas, the best antidote to bad speech is more speech, right? So,
you know, with suppression, I think we always have to worry,
every time somebody is censoring and suppressing: who are
we giving that power to?

Speaker 1 (56:03):
You know, we're nearing a close, Susan. We don't have you
for too much longer. I saw that you gave a
talk, A Democrat and a Republican Walk Into a Bar,
and you're saying that it seems like these days Democrats
and Republicans can't really agree on anything, but we all
need to agree on fundamental American principles like due process,
equality, and freedom of conscience. So is that possible? Do

(56:26):
you believe, are you an optimist? Do you believe
that in this current environment, that's possible?

Speaker 2 (56:32):
Well, I think that's a great wrap-up question. So,
that speech, I gave it at the Central Arkansas Library,
and my chief point, as you're saying, is I think
that people have to be able to agree on neutral principles.
The Constitution was designed not to say what we're going
to do about everything. It was designed to have everybody

(56:53):
have a fair opportunity to be part of the process
of deciding what we're going to do. So it sets
up all these democratic structures where we get to vote
for the people who are the policymakers and we all
get to decide. But the principle there, the underlying principle,
is that everybody should have a fair shot and, you know,
the principles should be neutral. Everyone should get to vote.
It's not like, you know, if you're a Democrat, your

(57:14):
vote doesn't count in this area, and if you're a
Republican, your vote doesn't count in that area; that's
not fair. And the basic ideas of the freedom of speech,
freedom of religion, they're all basically manifestations
of the Golden Rule: that if I want the
ability to just choose my own religion and decide what
religion I'm going to practice, I have to respect your

(57:35):
right to make a different choice and have your own religion,
because that's the Golden Rule. If I want to say
something that's unpopular, I have to respect your right to
say something that's unpopular. And if I want to be
treated fairly and not locked away for doing something minor
and never given a fair trial, I have to respect
your right to be treated the same way.
All those fundamental principles are things that

(57:56):
we really all should agree on. I think people get
into arguing and assuming that they can never agree on
the principles because they're differing on what they think the
results should be. And I think part of
the point of civil liberties is that it's all about process,
it's not about results. The ACLU is nonpartisan. We don't
try to get Republicans elected, we don't try to get

(58:16):
Democrats elected. We don't favor or disfavor individual politicians or
individual parties, but we do favor the idea that there should be
neutral principles that everybody can agree to, to say, okay,
here's what's fair. And the analogy I used in that
talk at the Central Arkansas Library, it was one of
the nights during the World Series, but fortunately not a

(58:38):
night where there was a game, so people were able
to come, and I said, okay, so what happens before
a baseball game is that everybody has agreed on the
underlying rules, and everyone agrees that your umpires or referees,
in any sport, should be neutral, and you don't want
somebody who's partisan. If they were favoring one team, you'd
get rid of them, and all sports fans could agree

(58:58):
to that. Maybe there would be a few who would be
just, you know, so Machiavellian that they would rather have
a biased umpire who always rules for their side. But
I think sports fans can agree: what you really want
is a fair game, and you want everyone
to agree on the principles beforehand. And
I think that if we could sit down in small
groups around the country and really talk about what the

(59:19):
fundamental principles are, I am enough of a patriot to
think we actually could agree about a lot. And let me
give you an example of why I think there's
some basis for hope. Maybe not optimism, but certainly hope.
We were talking about voting rights. So one of the
major problems is gerrymandering, the way when a party is
in power they try to distort all the districts and

(59:42):
they try to stack the deck so that their party
will remain in power. Or if the party in power
in a particular state thinks it's to their advantage to
not have that many people vote, they try to make
it harder for new voters to register to vote, et cetera.
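
A toy worked example makes the gerrymandering mechanic just described concrete: the same voters, drawn into different district lines, produce opposite outcomes. All numbers here are invented for illustration.

```python
# Toy illustration of "packing and cracking": 100 voters, 5 districts of 20.
# Party A has 40 supporters, party B has 60, in both maps below.

fair_map = [(8, 12)] * 5  # (A, B) voters per district: B's majority wins everywhere

rigged_map = [
    (2, 18), (2, 18),            # B voters "packed" into two lopsided districts
    (12, 8), (12, 8), (12, 8),   # remaining B voters "cracked"; A wins narrowly
]

def seats_for_a(districts):
    # A district goes to whichever party has more voters in it.
    return sum(1 for a, b in districts if a > b)

print(seats_for_a(fair_map))    # 0 of 5 seats for the 40% party
print(seats_for_a(rigged_map))  # 3 of 5 seats for the 40% party: a majority
```

Same electorate, different lines: the minority party controls the legislature, which is exactly the distortion the litigation mentioned here is aimed at.
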
The ACLU, and a number of
other organizations working in coalition with us, have had a

(01:00:03):
fair amount of success doing ballot initiatives, going to the
people of a state, in states like Michigan and Nevada
and Missouri and Florida, where we were part of getting
Amendment Four passed, which gave the vote back to
people who had been convicted of a felony at some
point. And the people of the state, when you ask
the people of the state, you can get a majority,

(01:00:25):
sometimes a supermajority, of people who say, no, we
want the rules to be fair. Who doesn't want the
rules to be fair are legislators who are
incumbents and who want to keep their seats even if
it takes unfair procedures to do it. So that's a
real problem we have right now that the incumbents, the
people who are trying to maintain power and not allow
any sort of regime change, are pulling all the levers.

(01:00:48):
But I think the chief grounds for
optimism is that when you go to the American people
themselves and say, well, do you want a fair system,
or do you want a system where you think your
side is more likely to win? Ask them about that, and
I think that you're going to get them to say
they would really like to see a fair system, and
that is the promise of America.

Speaker 1 (01:01:07):
Last question, you have taught at Brooklyn Law School since
nineteen eighty. What is the lesson your students will take
from this moment in history?

Speaker 2 (01:01:15):
Well?

Speaker 1 (01:01:16):
I know there are lots of lessons, but if you
could extract it, what is the lesson your students will
take from this moment in history?

Speaker 2 (01:01:24):
Well, you know, in an individual setting, one thing I'm
doing for the fall is I am preparing a course
that I'm calling COVID nineteen and the Constitution. So what
we're going to do in the seminar is we're going
to be looking at the way in which the Constitution
has been challenged and to see, you know, how well
it holds up. What does the Constitution have to say
about whether you can quarantine people, and whether you can

(01:01:44):
allow people to be at a religious assembly but not
go to a protest, et cetera, et cetera. So
I think there's a lot of interesting things there which
are very much of this particular moment. But big picture,
what I would like the students to take away, the
constitutional law students especially, is essentially what I just said
to you: that the Constitution is about process. It's not
about results. It's not about, you know, you're a Republican

(01:02:07):
and you're a Democrat, and we have two different countries
depending on what your party is. I think that we
have one country and it's all about a neutral process
for very good reasons, and I would like people to
think more about that. After my speech at the Central
Arkansas Library, I had two examples of people who talked
to me. One guy came up to me, he said,
I'm the Republican who walked into that bar, and he said,

(01:02:31):
you know, you're making a lot of sense to me.
And then there was another guy who talked to me
who was a Democrat. He said, you know, I never
really thought about that, that maybe it's not right if
we're only trying to win. I never thought about, you know,
that's not what we do in sports. And that's what
I'd like people to think about. You know, do you
really want to do things that are only about how
you think it's going to come out and cheat and

(01:02:51):
destroy the system and, you know, put a thumb on
the scale and, you know, stack the deck in order
to make things come out to what your preferred result
is in the short run? Or, long term, is that
just a really bad idea, because it's just totally inconsistent,
you know, we've just come from the Fourth of July,
totally inconsistent with the premises on which we would like

(01:03:12):
to believe our country was founded.

Speaker 1 (01:03:15):
Does technology throw a wrench in the system? I mean
it does. It does create lots of things you can't control,
and it always does.

Speaker 2 (01:03:22):
It's always, you know, it's always a new environment. So, you know,
a different kind of example: we were talking about technology and surveillance,
where of course technology has enabled a whole lot of
surveillance that we then have to deal with, but technology
also enabled a whole lot of new marketplaces of ideas.
So the ACLU did a lot of litigation a
few decades ago on applying First Amendment principles to the Internet, right?

(01:03:44):
you know, could the government censor what was on the
Internet because, you know, a child might see it? Yeah.
And so, you know, with every new generation of technology, there
are new challenges about how you apply our principles, like
privacy and free speech, et cetera, to the Internet, but
the principles remain the same.

Speaker 1 (01:04:16):
I hope everyone is doing well in these strange and
surreal times and adjusting to the new normal. Most important,
I hope you're staying healthy and somewhat sane. Follow along
on our social media. I'm at Lori Siegel on Twitter
and Instagram, and the show is at First Contact Podcast
on Instagram and on Twitter. We're at First Contact Pod.

(01:04:36):
And for even more from Dot Dot Dot, sign up
for our newsletter at dot dot dot media dot com slash newsletter.
And if you like what you heard, leave us a
review on Apple Podcasts or wherever you listen. We really
appreciate it. First Contact is a production of dot dot
dot Media, executive produced by Lori Siegel and Derek Dodge.

(01:04:58):
This episode was produced and edited by Sabine Jansen and
Jack Reagan. The original theme music is by Xander Singh.
First Contact with Lori Siegel is a production of dot
dot dot Media and iHeartRadio.