
July 8, 2020 · 65 mins

What happens to our civil liberties when an algorithm is used by law enforcement to make an arrest? Even more concerning, what happens when that facial recognition technology is racially biased? As we enter an age of ubiquitous surveillance, it's minorities - especially people of color - who are disproportionately affected. The ACLU has recently filed a complaint on behalf of a Black man who was wrongfully arrested due to faulty police facial recognition tech. It's the first case in the US, but it's unlikely to be the last because, according to the ACLU, the tech often can't tell Black people apart. The organization, which has been fighting for civil rights protections for over 100 years, is now calling on lawmakers nationwide to stop law enforcement use of facial recognition technology. For Susan Herman, it's an extraordinary time to be president of the ACLU. Over the years, the American Civil Liberties Union has fought for free speech, reproductive rights, and privacy. But as technology continues to muddy the waters, the tradeoffs become more complicated. Where do we draw the line between security and privacy? Herman says we must act now.

Show Notes

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot
Dot Dot Media and iHeartRadio. There's a great
quote on the ACLU website: the fact
that technology now allows an individual to carry such information
in his hand does not make the information any less
worthy of the protection for which the founders fought. Exactly.

(00:22):
I like to talk about, you know, one of the
whole points of the Constitution adding the Fourth Amendment, which
is the protection of privacy, is they wanted to protect
what was in Benjamin Franklin's desk. Nobody should know if
he was writing some things that were anti-government, and
we now have that on our cell phone. So of course.
But that's where I think that a lot of the
protection of civil liberties is applying our fundamental principles in

(00:44):
different circumstances. We are in a moment of reckoning as
we enter an age of ubiquitous surveillance, questionable data collection practices,
even algorithms that discriminate. It's minorities, especially Black and brown communities,

(01:10):
that are disproportionately affected. Over the last months, as the
nation has grappled with a conversation around police brutality, we've
seen Predator drones used for aerial surveillance at protests, facial
recognition technology that wrongfully accused a Black man of a
crime he didn't commit. And it wasn't a coincidence. Reports
say the tech is a hundred times more likely to

(01:32):
misidentify African American and Asian people. And as COVID-19
continues to spread, there are serious questions being raised about
contact tracing apps and how that data collected could be misused.
These issues raise ethical questions about technology and its impact
on our civil liberties, equality, and the future of our country.

(01:54):
For Susan Herman, it is an extraordinary time to be
sitting in her seat as president of the ACLU.
Over the years, the American Civil Liberties Union
has filed lawsuits fighting for free speech, reproductive rights, and privacy.
But as technology continues to muddy the waters, the trade
offs become more complicated. Where do we draw the line

(02:15):
between security and privacy and how do we prevent technological
innovation from outpacing the law. I'm Laurie Segall, and this
is First Contact. Susan, thank you for being virtually with
me today. Well, thank you for inviting me, Laurie. Yeah,
you know, I I always start out these interviews with

(02:37):
our first contact. I talk to guests about how we met,
and we don't really have a first contact. We've never
met in person, but we met on an email chain
because we were going to do an interview together for
something else, and it fell through. So I said,
you've got to come on the podcast because you are
just sitting in such an extraordinary seat at such an
extraordinary moment in time. So that's our first contact. Well, thanks,

(03:02):
It just seems to me like our first contact is
total serendipity. Yeah, exactly, so you know, to get started.
You've been the president of the ACLU
since two thousand and eight, and I said this before,
but you know, what an extraordinary time to be sitting
in your seat. You know, how are you feeling? Oh my,
it's just sort of overwhelming. You know, as president, I'm
essentially chair of the board, so you know, I'm not

(03:24):
the one doing the day to day work as all
of the members of our staff are. But to be
a member of the ACLU staff right
now is just mind-boggling, because we had, you know,
a lot of work that we were already doing before
two thousand and sixteen, with all of the states making
worse and worse laws about reproductive freedom and voting rights
and immigrants rights, and you know, all sorts of other things.

(03:45):
Then came the election, and since then we have brought
a hundred and seventy-three legal actions against the Trump
administration for things like family separations and the travel ban
and prohibiting trans people from serving in the military. Then in March, COVID
hit and at that point, you know, since then, we've
also brought over a hundred lawsuits, including with a hundred

(04:06):
lawsuits just about people who are incarcerated in jails and
prisons and ICE detention, and who are just in a
hot spot. You know, they have no control over whether
they can social distance, and so we've been working very
hard to get vulnerable people out of you know, those
terrible situations, basically death traps. Plus, COVID also led
to a number of states opportunistically restricting things like freedom

(04:30):
of abortion, declaring abortion to be a non-essential procedure
so people could just wait until the pandemic is over
to get an abortion, right? And voting rights have also
just been a really fraught area right now because all
the restrictions on voting and the ways in which the
vote was becoming distorted have just been magnified by all

(04:52):
the difficulties, and so there's a lot to talk about.
So I was about to say, what I'm
hearing from you is you're sleeping really well at night.
You know, there's no work to do, almost nothing to
do, the staff are just sitting around polishing their nails. Yeah,
So I mean, like, take me to March, like coronavirus hits.
You have been involved in some of these monumental cases

(05:12):
that have just shaped society and our civil liberties, like
coronavirus hits. And now, you know, we have a little
bit I don't even think we have the luxury of
perspective at this point, but we have a little bit
more perspective. But like, take me to March, Like in
your role at this extraordinary moment, Like what was going
through your head? What were you concerned about at the time? Well,

(05:34):
you know, one of the first concerns is just, you
have to close the office. So the first concern is
how can people do all this remotely? It increases
the work and makes it more difficult to do the work.
So we just had to really make sure that our
technology was up to doing things. So one thing
that the ACLU did was to buy
new laptops for some staff people who were going to
be working, and you have to worry about, you know,

(05:56):
how the technology is working. Um, which has been a
question for us every time something really big hits.
When the travel ban hit, there were so many people
wanting to donate to the ACLU, our
website crashed. So even things like that, you know,
that's, you know, number one: how do you
handle this? We have been fortunate so far that the
ACLU is so well managed and we

(06:18):
had not spent every penny that all of our donors
had given us up until that point, so we have
not had to lay people off, which is very fortunate because,
as you're saying, there's more than enough work to do.
But yeah, that's the first concern of just you know,
how do you keep the organization up to speed and
ready to do, you know, what staff members now need
to be doing, an incredible amount more work. But for

(06:38):
some of them it's, well, they're juggling a toddler and
a dog. Yeah, can you give me a run-through
of some of the cases that you've been involved in?
And correct me if I'm wrong, you started out as
an intern, right? And really just worked your way up.
I mean, I can imagine you've been involved, and I
know you've been involved in some pretty extraordinary cases. To

(06:59):
give listeners some text, can you explain some of the
cases that kind of stick out to you? Well, I
wasn't intern for the a c l U back, you know,
in the nineties seventies, you know, around the time when
I was in law school. And just to make sure
that everybody understands, I don't actually work at the a
c l un. My day job is I'm a law professor,
and I don't generally work on the cases. What I'm

(07:19):
generally doing is we run the organization. But I'll tell
you, I think, you know, an interesting start:
the first ACLU case that I
actually did work on was while I was a
law student. One of my
connections with the ACLU originally was that
one of my law professors in the first year was
connected with the New York Civil Liberties Union, and he

(07:40):
had some clients who came to him who were graduate
students at Stony Brook on Long Island, and they had just
discovered they were not allowed to live together. They had
rented a house together. There were six of them, and
they had just discovered they weren't allowed to live together
because there was an ordinance in their village, a village called
Belle Terre, that prohibited more than two persons unrelated by blood,
marriage, or adoption from living together. So, you know, they

(08:03):
were pretty shocked. And it turned out that under the
law as it was at the time, by the time
they were talking about this, they were liable for all
sorts of criminal fines and punishment. It was really
very heavy stuff. So I started working on that
case with my law professor and um we went to
a federal judge to ask for a temporary restraining order,

(08:26):
which means that just until we had litigated whether or
not that was a constitutional thing to do, to tell
people who they could and couldn't live with, the village
should not be allowed to either kick them out of
their house or to, you know, start locking them
up because, you know, they owed too many fines for
having been illegal residents. So the judge ended up signing

(08:46):
the order. And then, one of the ways, which actually
was the original way in which our clients had discovered that they
were illegal residents, was that they had applied for a
residents-only beach permit and they were told they couldn't have
one because they were illegal residents. So the judge who
we had, the district judge, who was a very nice man,
looked at the order we had written out and he said, well,

(09:07):
you know, it's the summer. Don't your clients want to
go to the beach while the litigation is pending? Do
you mind if I write in that they have
to be allowed to park in the parking lot of
the beach? So we said, sure, that's very nice,
so he wrote that in. Then, as the junior member
of the team, I was sent out to explain to
our clients, to show them the order and explain to
them what was going on. And they gave me a

(09:27):
tour of you what the village looked like in the
residents only beach and the minute the wheels of their
car hit the parking lot is very large. Your fierce
looking man comes striding across and says, what are you
doing here? You're not allowed to be in this parking lot,
and they all look at me, and I'm thinking, what
am I. I'm like, you know, twenty something, I'm not
very tall, and what am I supposed to do with
this large man who doesn't want us in there in

(09:49):
his parking lot? And then I remembered that I had
a federal court order right on my person, so I
kind of drew myself up, and I showed him my
federal court order and I said, well, I'm with the
New York Civil Liberties Union, kind of, and I have a
federal court order saying that these people are allowed to
be in this parking lot and go to the beach.
And he melted. That was, I think, one

(10:10):
of the points at which I thought, wow, you know,
this is really powerful stuff. Yeah, you saw that,
that was my first day. Yeah, exactly, that's great. And
I read that some of your earliest memories
of speaking up to authority involved, I think, a dispute
over a book at your school library. Yeah, that's right.
Even before the Belle Terre case, my first civil liberties hero

(10:32):
was my mother. So when I was in third grade,
we were doing a school play about a story called
Johnny Tremaine about a boy in the American Revolution, and I, like,
I thought the play was interesting. Players don't have that
many words. And we were told that this was based
on the book. So I went to my school library,
my public school library, and I asked to take out
the book, and the librarian said, oh, you can't take

(10:54):
out that book, dear, that's in the boys' section. And
I was surprised to find this out. I'd
been reading books in the girls' fiction section, which were all
collections of fairy tales and biographies of presidents' wives, but
it had never occurred to me that I wasn't allowed
to take out a book from the boys' section. So
I went home and I told my mother about this.
You're just thinking, you know, that's the way things are,

(11:14):
and she just exploded, and she called the librarian the
next day and said, how dare you tell my daughter,
you know, what she's not allowed to read. So the
librarian told me that from then on, I could take
out any book I wanted, and, you know, not long
after that, they changed the policy for everyone. So, you know,
there was another example of how, you know, you can
kind of speak up to authority when they kind of
tell you who to be and prevent you from making
your own choices. Were you always like that? Well, you know,
that was third grade, and I feel like, yes, I think
for most of us our values form when we're pretty young. Yeah,
so, you know, seeing my mother do that, I'm sure
it would have had an impact on me. Yeah, that's
such a good story. And did you I mean, did
you always know you wanted to go into law? No,

(11:58):
I actually really didn't because having grown up as a
woman during that era, my father was a lawyer and
he always used to talk about the fact that law
was really not a good profession for women. Why would
you want to do that if you could be an
English teacher and have the summer off to take care
of your children? So it took me a while.
I graduated from college and then spent a few years
doing other things and then decided to go to law school. Well,

(12:19):
I mean, it's so interesting now, kind of
seeing where you're at, um, and seeing this moment. It
does feel like a moment. And I was looking at
something you said about you know, this feels like a moment.
We can be optimistic because so many Americans are beginning
to really understand the scope and the depth of structural racism.
It certainly feels, you know, I'm based in New York City.

(12:41):
You can just feel it right on the streets with
the protests, and you hear the sirens and the helicopters,
you know, as we sit here. Um, and we hear,
you know, your rich history in covering and caring about
these issues. What is the challenge for you guys ahead? Well,
you know, the challenge on that particular subject is that

(13:02):
this is work that we had already been doing. One
of our top priorities for the past several years has
been trying to break our addiction to mass incarceration, which,
as everybody is now really coming to terms with, has
been, really, a system that has disproportionately affected people
on the basis of race and income and disability. A
quarter of the people who are arrested are people who are

(13:24):
mentally ill, and our feeling is that the system has
been fundamentally broken and misguided for a long time. So
part of what we're trying to do with this moment
is to capitalize on the fact that people want to
look at what the police do. We're trying to encourage
people to look beyond the police. It's not just, you know,
who are the police arresting, and how are they treating
the people they arrest. I think behind that is the

(13:46):
question of what do we really want to treat as
a crime. So when you treat all sorts of very
minor misconduct as a crime, you're really setting up a
situation where there are going to be more contacts and therefore
potentially more arbitrary and discriminatory contacts. So, if you think
about it, Eric Garner ended up dying because he was

(14:06):
selling single cigarettes on which the tax had not been paid.
George Floyd. The basis for that encounter was that they
thought he might be passing a counterfeit twenty-dollar bill. So
I think that if you look at why are we
criminalizing some of the things we criminalize, especially if you're
talking about people who are mentally ill and are having problems.

(14:27):
Do we really want the police to be the people
who are the first responders to people who are having
a mental health crisis or is there some more effective
way to deal with that that would avoid putting those
people into the criminal justice system, which isn't really good
for anyone, And to maybe recommit, reallocate some of the
resources we're using on arresting people and locking them up

(14:49):
to actually dealing with the mental health crises, to have
mental health treatment. So instead of viewing everything, all dysfunction,
as a matter of policing, why don't we spend more
on reinvesting to try to prevent more dysfunction? It's
sort of like the old thing. You know, if you're
a hammer, everything looks like a nail. Well, you know,
not every problem in our society is a problem for

(15:10):
the criminal justice system, and an occasion to arrest people
and lock them up. A lot of them really should
be an occasion for, you know, thinking about public health treatments,
thinking about how we want to approach homelessness, and
having a lot of much deeper thoughts about how
you prevent dysfunction, rather than answering everything with, you know, we're
going to send in the police. It certainly seems also
like this moment, even coming out of the pandemic, I

(15:32):
can only imagine the mental health crisis is going to
be even worse. Yeah, that could well be. Um, and
I think the pandemic is also showing us. Somebody asked
me the other day whether the protests over policing and
police brutality are related to the pandemic. And I was
in a webinar and one of the smart people in
the room said, oh, no, no, they're two entirely different things,

(15:53):
And I said, what do you mean? The same people
who are being disproportionately affected by policing and police brutality
are the people who are being disproportionately affected by COVID.
The statistics show that people of color are much more
likely to die, and there are a lot of reasons
for that, having to do with underlying health and
having to do with the fact that minorities and people

(16:14):
who are not affluent don't get to work from home,
they don't get to work through Zoom. They are the
people who are out there on the streets, being the
first responders, being the people who are picking up the garbage,
being the people who are stocking the supermarket shelves. And
I feel like the virus is really amplifying so many
of the inequities we've had in our society. And I

(16:35):
think especially you know, I don't know what it's like
for everyone else, but I live in Brooklyn, in
New York City, and it really felt like a lot of
the people who were out on the street, they were
out on the street because they were upset about George Floyd,
but I think it was more that they recognized that
George Floyd was the tip of the iceberg and that
there was just a lot going on that they really
could not tolerate any longer. More from Susan after

(17:01):
the break, and make sure to subscribe to First Contact
on Apple Podcasts or wherever you listen so you don't
miss an episode. Putting on the tech hat, you know,

(17:25):
I think most people probably don't think of tech when
they think of the ACLU, but there's
quite a bit of litigation with regard to security and
privacy issues around contact tracing, surveillance, algorithmic bias. And obviously
the ACLU has a hand in checks
and balances in a lot of the issues that are
emerging from the pandemic. You know, what are some of

(17:46):
the tech developments that you guys are most concerned about? Well,
since you were mentioning COVID and the contact tracking
and tracing, I'll start with that. So the upshot is
that we are neither for nor against contact tracing, if
contact tracing is something that really will contribute to public health.
Our concern is not to say, no, you can't do it,

(18:07):
or yes, go right ahead and do whatever you want.
What we're concerned about is to minimize the damage to privacy,
the damage to equity again. Uh, there are a lot
of concerns that we have. The other thing that we're
concerned about is discrimination again, because there are ways in
which the technology could also increase pre existing social inequities.

(18:30):
We think that people should not be coerced into participating
in testing. We think it should be voluntary, and we
also think that it should be nonpunitive, because if you
start having the criminal justice system enforcing whether or not
people are willing to use their phone to take a
test or whatever it is, you're just creating more opportunities

(18:50):
for police interactions that will at some point be arbitrary
or discriminatory. So we don't want to see rules and
regulations that are good public health rules. Even if they
really are good public health rules, we don't want to
see those become occasions for filling up the jails with
the people who aren't complying, because we've already seen there

(19:12):
were some statistics in New York that when you asked
the police to start enforcing who's wearing a mask and
who's not wearing a mask, right away it was
racially disproportionate in terms of who they were questioning
and who they weren't questioning. So I think there are just
a lot of issues there which are very much
your territory, because they're very much ethical issues.
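To make the privacy stakes here concrete, below is a minimal, illustrative Python sketch of the kind of decentralized, data-minimizing design that contact tracing apps were debating at the time, loosely in the spirit of the Google/Apple exposure notification approach. The names and parameters are hypothetical, not any real app's API. The point is that a phone can learn "you were near someone who later tested positive" without any central record of who met whom or where.

```python
import secrets

TOKEN_BYTES = 16  # size of each random proximity token (illustrative)

def new_token() -> bytes:
    """A fresh random token; it reveals nothing about its owner."""
    return secrets.token_bytes(TOKEN_BYTES)

class Phone:
    """Toy model of one handset participating in exposure notification."""

    def __init__(self):
        self.my_tokens = []        # tokens this phone has broadcast
        self.heard_tokens = set()  # tokens heard nearby, kept only on-device

    def rotate(self):
        # Tokens rotate frequently so a device can't be tracked over time.
        self.my_tokens.append(new_token())

    def broadcast(self) -> bytes:
        return self.my_tokens[-1]

    def hear(self, token: bytes):
        self.heard_tokens.add(token)

    def check_exposure(self, published_positive_tokens) -> bool:
        # Matching happens locally; no server learns who met whom or where.
        return any(t in self.heard_tokens for t in published_positive_tokens)

# Alice and Bob pass on the street; later, Alice tests positive and
# consents to publish (only) her own broadcast tokens.
alice, bob = Phone(), Phone()
alice.rotate()
bob.rotate()
bob.hear(alice.broadcast())
print(bob.check_exposure(alice.my_tokens))  # True: Bob learns of the exposure, nothing more
```

Whether a real system is voluntary, punitive, or coercive, as Herman warns, is a policy choice layered on top of designs like this.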

(19:33):
Yeah, you know, um, one of the cases that I'm
fascinated by. Um, and, you know, I honestly
felt like it was only a matter
of time until we saw this headline. And then we
saw the headline, you know: a man was arrested after
an algorithm wrongfully identified him. You know, I've been covering
this for so many years: AI is biased, AI is trained,
you know, on data online, which can be very racist,

(19:57):
you know. And I think for so many years we've
been having this conversation. But the question of, okay, well,
what happens when it gets into the hands of the police,
what happens, you know, if it could be used for policing?
And so I think it's such a fascinating case.
And you guys, the ACLU, filed an
administrative complaint with Detroit's police department over what you guys

(20:18):
are calling the country's first known wrongful arrest involving facial
recognition technology. I mean, for context, a man was arrested
because he was wrongfully identified by an algorithm. The police
department thought he had robbed, I believe, like, stolen watches,
and he was arrested. I mean, can you talk to

(20:38):
me about the significance of this case? I can't help
but put on my tech hat and scream, you guys,
this is a really big deal. Yeah, it is a
really big deal. And as you're saying, Laurie, we were
aware of this problem for a long time and we've
been complaining. So going back for a minute before getting
to the case you're talking about, Robert Williams: uh, the

(20:58):
National Institute of Standards and Technology says that African American
and Asian people are up to a hundred times as
likely to be misidentified by facial recognition. So that's the
background problem. And so we knew that, right? You know,
we knew that before the case came up in Michigan. Um,
and it's not the algorithm's fault. Obviously, there's something that's
being put into the algorithm that, you know,

(21:20):
that has a bias. And I think people tend to
think that algorithms are, you know, so neutral and
that we can rely on algorithms. That's what I was
saying about the contact tracking and tracing: you
start relying on algorithms or apps that you think are neutral,
and you really have to be very wary of that.
So again, before getting to the Robert Williams case: uh,

(21:40):
an ACLU staffer at the ACLU
of Northern California had the really interesting idea of
trying out Amazon's facial recognition program, Rekognition with a K,
because, yeah, they were just offering this to the police
or whatever: this is great, it will help you identify
and see if you have somebody who matches a mug shot. Well,
what they tried to do, which I thought was very clever,

(22:01):
was they tried to match mug shots against the members
of Congress. They got, you know, the official pictures of
all the members of Congress. This was in July of 2018, and
there were twenty-eight members of Congress who were misidentified
as matching the mug shots, twenty-eight mistakes out
of that. And not only that, but the false

(22:21):
matches were disproportionately people of color. And one of the
people who was identified as matching a mug shot, and therefore,
you know, probably, you know, this criminal, was civil rights
legend John Lewis, the guy who was beaten up on
the bridge in Selma, you know, to get us all
voting rights. So, yeah, we know that almost forty percent of the
false matches were of people of color, even though

(22:45):
people of color made up only twenty percent of the members of Congress.
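The arithmetic behind that disparity is easy to check. Here is a rough back-of-the-envelope version in Python, using the figures as they were reported from the ACLU's 2018 test; the counts are illustrative, not a re-analysis:

```python
# Back-of-the-envelope disparity check, using figures as reported from
# the ACLU's 2018 Rekognition test (counts here are illustrative).
members = 535                 # members of Congress scanned
false_matches = 28            # members falsely matched to mugshots
false_matches_poc = 11        # of those, members of color ("nearly 40%")
share_of_congress_poc = 0.20  # people of color were about 20% of Congress

share_of_false_matches = false_matches_poc / false_matches
print(f"False match rate overall: {false_matches / members:.1%}")          # ~5.2%
print(f"People of color among false matches: {share_of_false_matches:.0%}")  # ~39%
print(f"Overrepresentation: {share_of_false_matches / share_of_congress_poc:.1f}x")  # ~2.0x
```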
So in some ways, you know, the Robert Williams case
is completely predictable. We knew that, we allowed
for that to happen. It might have already happened elsewhere,
but, you know, subterraneanly, in a way that we didn't see;
we didn't see the case. But what's amazing about the
Robert Williams case is that it happened right there, you know,

(23:05):
visible to everybody, where you can just see it. So
what happened was that they told him that he was
being arrested because the algorithm had
said that he was a match for
this mug shot, and they showed him the mug
shot, and he said to them, do you guys think
all Black people look alike? That looks nothing like me.

(23:25):
So, you know, it was pretty clear that if you
used your eyes and looked at the picture yourself, if
you didn't trust the algorithm, and if you looked at
the picture and this man's face, they didn't look alike.
But nevertheless, he spent thirty hours in jail under some
pretty miserable conditions because the algorithm said it was a match.
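It helps to see how thin "the algorithm said it was a match" really is. In most face recognition systems, a "match" is just an embedding similarity score crossing a tunable threshold. The sketch below uses random synthetic embeddings, purely illustrative and not any vendor's model, to show that a permissive threshold will manufacture matches out of noise, while a stricter one returns few or none.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def matches(probe, gallery, threshold):
    """Every gallery identity whose similarity to the probe clears the threshold.

    A "match" is only a score crossing a tunable line; it is not an
    identification, and nothing about it is probable cause by itself.
    """
    return [(name, score) for name, emb in gallery.items()
            if (score := cosine_similarity(probe, emb)) >= threshold]

rng = np.random.default_rng(0)
# 1,000 synthetic 128-dimensional "mugshot embeddings" (pure noise).
gallery = {f"mugshot_{i}": rng.normal(size=128) for i in range(1000)}
probe = rng.normal(size=128)  # the probe photo's embedding, also noise

# A permissive threshold manufactures "matches" out of pure chance;
# a stricter one returns few or none from the same data.
for threshold in (0.15, 0.25):
    hits = matches(probe, gallery, threshold)
    print(f"threshold {threshold}: {len(hits)} 'matches' from noise alone")
```

For what it's worth, reporting at the time noted that the ACLU's Congress test used Rekognition's default confidence threshold of 80 percent, while Amazon recommended 99 percent for law enforcement: the same scores, with a different line drawn through them, produce a very different number of "matches."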
So I think that's really important. In some ways, the
fact that you know a problem exists is not as

(23:49):
inspiring to make people want to do something about it
as when you see it. So that's what happened with
all the protests about George Floyd. People could watch that
horrible video. They could see it. It was recorded on
the video. And here we have an actual person, not
just hypothetical statistics showing it, but an actual person who
did get arrested and did have a miserable time. He

(24:11):
was arrested in front of his family. It was really traumatizing.
And again, the officers involved were trusting
the science more than they were trusting their own eyes,
when anybody could see he didn't look like the picture, right?
And you know, he wrote an op-ed in the Washington Post,
and he asked the question. He said, why is
law enforcement even allowed to use this technology when it

(24:33):
obviously doesn't work? So I guess asking a legal scholar
the question. You know, police departments all around the country
are using different variations of facial recognition software. So you know,
what regulations should we see as we enter this era
of algorithmic discrimination? Yeah, that's a great question. And again

(24:54):
we've been urging, you know, long before Robert Williams turned up,
we've been urging police departments not to rely on the
facial recognition technology, that it was just not
reliable enough to put people's fates in the
hands of the algorithm. Algorithms don't have hands, but for people's
fates to be dependent on the facial recognition technology which
was being touted. And again, you know, it's great if a

(25:16):
company is doing something to make money, but if wanting
to make money is your only consideration, and if you're
not considering whether you are unleashing something that is really
going to be disruptive of people's lives unfairly, either because
it's just going to be wrong, or because it's going
to be wrong in a racially skewed way. I think
that's just really a problem. So um, we've been urging

(25:38):
police departments not to buy and use the technology, and
I'm sure you know Amazon has withdrawn the facial recognition
technology temporarily and they're not sure whether or not they'll
bring it back. So the probability of wrongful arrest is
one thing, but when you draw the camera back and
look at all the technology in the bigger picture. In
addition to facial recognition, one thing that police departments have

(26:01):
been doing with facial recognition, and different law enforcement agencies,
is to try to see who attends the demonstration, or
see who's in the crowd. So it ties not into,
are you, like, is somebody likely to be wrongly arrested
like Robert Williams because there was a false
match. But it starts becoming big surveillance too, that an agency

(26:23):
has the cameras on and then they have the facial
recognition and they're purporting to identify all the people in
that crowd so that then they can track those people.
They now know that you were at the George Floyd
demonstration and that person was at the anti-war demonstration,
and at that point the government starts having more and

(26:43):
more information about all of us, to the point where
it feels like, instead of we're controlling the government, it's
like the government controls us. So I think the facial
recognition is only one part of the whole tendency of
technology to amplify government power to be kind of watching,

(27:05):
watching what we do. Yeah, I mean, it's interesting
to hear you say that. Um, you know, that type
of technology is just a part of it, especially when
it comes to this moment where people are out protesting
police brutality, where people are out fighting for their civil liberties.
You know, there's all sorts of technology that's being built.
There are cameras being built that can recognize

(27:29):
people in real time, that police are wearing.
There's all sorts of technology. This is just the beginning
of it. Um, I know you mentioned Amazon put a
hold on their sales of Rekognition software. Microsoft said it's
not going to sell face recognition software to police departments
until there are federal regulations. I know IBM said that it
was going to announce a ban on general-purpose facial recognition.

(27:51):
Is that enough? Like, I guess, you know,
what is the government's role here? Like, what do you
think should happen, especially since this is just, as you say,
one small part of a larger issue that we're facing
as a society? I think that's right, and I think
that there could be, you know, government regulation, but that's
not going to happen unless the public wants to urge

(28:11):
their representatives to start controlling this. And what we've seen
is that an enlightened public can make something happen even
without regulation, right? So, you know, it was that the
public was becoming concerned and that's the reason why Amazon
acted to withdraw this. They started being concerned that their
customers were not going to be happy with them. And

(28:32):
I think at this point that's almost more effective than
government regulation. And once you have that wake up call,
then you can start having serious debates. And I think
those debates have to take place in many places. They
should be taking place in legislatures where people can talk
about the trade off between privacy and mass surveillance and
whatever the government is trying to accomplish. Why do they

(28:55):
need this technology? Is it really worth it? Are there
crimes that they wouldn't be solving without it? And are
they crimes that we're concerned about solving, or do they
fall into the category of, you know, something
that we don't think should be a crime at all?
People are generally unaware in terms of what the police do,
that only four to five percent of all arrests involve

(29:15):
crimes of violence. So when people think about, we want
to enable law enforcement to be out catching criminals, or
we're concerned about divesting or defunding the police because who's
going to protect us from physical harm? Almost none of
what the police and law enforcement do is about physical harm.
It's a tiny percentage. Everything else that they're doing is
about this whole array of all sorts of other things

(29:37):
that we criminalize. And I think that in addition to
having better conversations about is there a potential for some
of these technologies that the government is using to create
arbitrary or discriminatory enforcement, I think we need to dig
deeper behind that question, in the same way that you
need to dig deeper beyond the George Floyd murder and

(29:59):
to ask if there's something systemically wrong here, do you
need to rethink the whole question. So when people say, well,
you know, but we need the facial recognition technology because
it helps the police solve crimes. Well, okay, but you
know what crimes and what are the costs? So I
think once people are educated enough, and once they realize
what the nature of the problem is, kind of what's

(30:19):
being unleashed, they can start really being ready to have
that broader conversation. And I think it should take place
in legislatures, but I think it also should take place
and evidently is taking place in boardrooms at Amazon, Facebook,
and Google and Microsoft. They should be talking and they
do sometimes if the people demand it. And it also
has to take place just among people, you know, among

(30:42):
you know, tech communities and people just beginning to talk
about what are our responsibilities here. Is it okay for
us to create products to make money
if we know that there are dangers that the products are
going to be misused, or maybe aren't reliable enough, or
that they just feed into this enormous surveillance state?
So let me compare this to an earlier moment. After

(31:03):
nine eleven, we had a kind of a similar phenomenon
that in order to deal with catching terrorists, we changed
a lot of laws that ended up really sacrificing a
lot of privacy and allowing a lot more government surveillance,
and for a number of years that went unchallenged, and
people kept saying, oh, well, you know, if that's
what we need in order to be safe, we're willing

(31:24):
to give up a little privacy. So, first of all,
I think people didn't think about the fact that they
weren't giving up their own privacy, they were giving up
somebody else's. And second of all, people didn't realize how
extensive the surveillance really was until Edward Snowden. So then
after Edward Snowden came along and people realized how the
government was just scooping up tons of information about people

(31:45):
and just keeping it in government databases and started realizing
the horrifying potential of all that. What happened was that
Congress made a couple of little changes to the law.
But more important, Microsoft and Google and other places started
to realize that their customers were concerned, and they started
being a little less cooperative. At the beginning, right after

(32:05):
nine eleven, all of the telecoms, all these companies, were
just saying to the government, you want information? Here, take
it all. Your Verizons were, sure, you know, here are all
the records of all our customers, take it all, you're
keeping us safe. And I think that, to me, the
most important thing is an informed public. That if people
can examine for themselves whether they really think that we're
being kept safe by all of this, and really examine

(32:28):
both the costs and the benefits in an educated way,
I think we get much better discussions. And I think
not only do you have the possibility of getting better
legislation or regulation, you also have the possibility that private
companies and you know, the tech the tech companies are
not going to want to do it anymore because their
customers don't want them to. Yeah, I mean it's hard

(32:49):
to have an informed public and to have these discussions,
even in this current environment to some degree. I mean,
people I think are struggling with the idea of truth.
People are um, you know. And I remember, by the way,
I remember this note in leaks, like I remember being
in the news room covering technology and thinking to myself
because I wrote the tech bubble all the way up right,
and thinking, this is an extraordinary moment because we saw

(33:14):
that we've been sharing all our data, but we saw
for the first time that you know, the government had
a lot of access to things that we had no
idea they had access to. And I think it was
a fundamental shift, and the lens on tech companies changed
at that moment, and tech companies' behaviors changed quite a
bit after that. You know, I wonder this moment we're

(33:35):
sitting in where we're having these debates about surveillance and
privacy and whatnot. These are sticky debates, and they're very
politicized as we're heading into an election, as we have
misinformation spreading online, as a lot of people don't know
what to believe and what not to believe, as
the media landscape has changed. It certainly seems like
a harder environment to even have some of
(33:56):
these conversations. Well, I think in some ways it's harder
in some ways. I think the other thing that is
a catalyst for the discussions is realizing that there is
a dimension of race to all of us. I think
in talking about artificial intelligence and facial recognition, not many
people saw that as an issue of structural racism. You know,
that there's something wrong with how we're putting together the algorithms,

(34:16):
and it ends up that John Lewis is going to
be misidentified as somebody who matches a mug shot and
that Robert Williams is going to be arrested. So I
think that the fact that we now know that that
is an additional concern enables us to have richer conversations.
So we're not only talking about is there a trade
off between security and privacy? Plus, I think the other

(34:37):
thing that people are feeling much more open to is
to have that deeper conversation about what are our goals
here and if we're enabling all this government surveillance in
order to help the government to catch criminals, well, you know,
what do we mean by criminals? What crimes are they solving?
And how are they using it, you know, how is
this actually being used, and in service of what?

(34:58):
So I feel like in some ways, you know, with
the election coming up, I think that gives people more
impetus to want to talk about these issues, because the
elections aren't only about the president. They're also about local
prosecutors and sheriffs and the people who make the decisions
about whether to buy surveillance equipment and what they're gonna
do with their authority over the criminal justice system. So

(35:20):
one thing the ACLU has been doing,
in addition to everything else, is we've been very involved
in elections of prosecutors, because that's a place where
people almost never used to pay attention, you know,
who were these people running, and maybe they would vote
for somebody without really knowing what they voted for. So
what we're urging, and I think this is very much
what we're talking about, about having an educated public: we're

(35:42):
urging people to go to elections or to go to debates,
to go to campaign events (attending, I guess, on Zoom
these days), to attend campaign events and ask the candidates questions:
what would be your policy about whether or not you're
going to accept military equipment from the federal government in
your police department? Are you going to buy tanks? Are
you going to buy you know, these horrible weapons that

(36:04):
are used? Is that something you would do? Are you
going to buy, you know, facial recognition software? Is that
how you would use your power if we elect
you? Um, say, to the prosecutors: would you support a
reduction in cash bail, with increased alternatives to incarceration?
So that's a place where without waiting for the government
to do something, we can ourselves effect what's happening in

(36:27):
our communities, by encouraging candidates to think about what positions
they're taking on these different issues and letting them know
that they're gonna lose votes. The more people
are educated, the more they can tell candidates
that they'll lose votes, and to try, this is
something that's worked in some places, to encourage candidates to
take a better position. Yeah, they might have never

(36:50):
thought of that, but you know, once they commit themselves,
you know that's going to be better. So there are
all sorts of ways that we can affect things. More
from Susan after the break. And make sure you sign
up for our newsletter at dot dot dot media dot
com slash newsletter; we'll be launching this summer. Before I

(37:23):
move on from specifically some of the tech issues, I
have to bring up Predator drones. Uh, right. You know,
the U.S. Customs and Border Protection flew a
large Predator drone over the Minneapolis protests. You know, people
were protesting police brutality and the killing of George Floyd,
and for many reasons, it almost felt symbolic. You know,
it was raising all these questions about aerial surveillance, about

(37:48):
what data was being collected, where was this going? What
is your take on this? Well, you know, as you're saying, Laurie,
it really magnifies the opportunity
to gather more information, because you don't even have
to have the helicopters or whatever. So, you know,
that of course is a concern: just how much
information is the government gathering, what are they going to

(38:09):
do with it, who's going to have access to it?
Will it ever be deleted, or will it just
stay there in the government databases forever? But I
think the other thing that the Predator drone brings to
mind is a question that people were also asking, which
is about the militarization of law enforcement. We have had
for years in this country the Posse Comitatus
Act, as it's called, which says you don't want the

(38:30):
military doing everyday law enforcement, because that's not our country.
We don't want the military to be, quote, dominating the streets,
and we don't want the people who are out protesting
to be considered the enemy of the United States. They
are people who are expressing their opinions. And so the
whole idea of, you know, it's one thing, it's enough

(38:52):
if the police helicopters are flying overhead and trying
to keep track of, you know, who's in the crowd
and what the crowd is doing. But once you start
adding an element of something like the military helicopters or the
military drones, or things that feel like we are being
treated as the enemy of the government instead of the
people who are the government, who are supposed to be

(39:13):
controlling the government, I think that's just a
very bad paradigm. You think it's a slippery slope? Well,
it's a slippery slope unless we stop the slipping. And
as we saw with Amazon and the facial recognition,
if people say, wait a minute, yeah, I think we
can stop that stuff. But I think if people don't
pay attention, I think we have a very slippery slope.

(39:34):
And that's what I've been saying about most of the
issues we've talked about, starting with the contact tracing
and the surveillance and everything else. It seems to me
that what's really important is transparency (we should know what
the government is doing) and accountability. Back on the issue
of contact tracing, one thing that the ACLU has done,
together with the ACLU of Massachusetts, is we

(39:55):
have filed a lawsuit, well, actually a records request, demanding
that the government, including the CDC, release information about the possible
uses of all the location data that they would be
collecting in connection with contact tracing. Because, you know,
if you don't know what they're doing, then you can't
have a discussion about what they should be doing. And
one reason why I was bringing up all the post

(40:17):
nine eleven changes of law is that I think that
the whole idea that we can't know what the government
is doing, that the government has to act in secret in
order to keep us safe or else the enemy will
be able to know what they're doing and, you know,
work around it, but the government can know everything
that we're doing: I think that just has democracy backwards.
You know, we have to be able to know what's

(40:38):
happening inside the government. And that applies to why are
they sending the Predator drone? What are they going to
do with the information? What does this mean? Are they
going to do it again? And it also has to
do with the contact tracking and tracing. Once they
get that data, what happens to it? Are they going
to ever erase it? You know, who do they share it with,
what are they going to do with it? And I feel,
you know, those are really important issues in a democracy

(41:00):
that we just have the right to know what the
government is doing so that we can talk about it.
And I feel like to sort of say, well, this
is what the government is doing and that's really bad,
and that upsets me. I think that kind of misses
the point. If the government is doing something bad, then
it is the duty of every American to find out
what they're doing and to push back. And so at
the a c l U we have a program that

(41:22):
we call People Power. We first invented that and used
it to explain to cities and localities all over the
country how they could fight back against draconian immigration
rules by becoming, quote, sanctuary cities, what their rights
actually were. We then used it for voting rights. We're
about to use it some more for voting rights. But

(41:43):
what we have really urged, and I hope that, you know,
some of your listeners will go to the ACLU
website and see what People Power is
doing in addition to what the ACLU
is doing. Because what is the ACLU doing?
That's all the staffers at home trying to, you know,
work on their new laptops while they're trying to, you know,
keep their toddlers quiet. But People Power is about what
every single person can, and I think should, be doing.

(42:03):
You know, if people really educate themselves and think about
the ethical issues, the costs and benefits of all this
technology in addition to a lot of other things going on,
I think we get a lot better results if people
pay attention. Yeah, I mean, it's interesting to watch the
ACLU take on issues like surveillance, facial recognition.
I know the ACLU filed a lawsuit
against Clearview AI, which was this very controversial company

(42:27):
that was using biometric data. I think facial recognition technology
helped them collect something like three billion face prints and
they were giving access to private companies, wealthy individuals, federal, state,
and local law enforcement agencies, and you know, coming from
the tech space, it certainly feels like sometimes these stories,
you just don't know what these companies are doing until

(42:47):
you start, you know, peeling back the layers and seeing,
all the data went to here and here, and why
did it go there, and why wasn't this disclosed?
And oftentimes it takes a watchdog to really understand
where some of this can go wrong, and how
it's being used in ways that can be
dangerous. Yeah, I think that's exactly right.

(43:09):
And that's why I was saying before, about our concern,
before everybody jumps on the bandwagon about let's have more
contact tracing and, you know, everybody should just be
sharing all this information: I think we have to have
a watchdog. Yeah, you're not gonna have the watchdog
telling you things unless you build a watchdog into the system.
And if everything is just you know, a company has
invented this and is selling it to the police, or

(43:30):
a company who has invented this and now we're all
going to buy it. If you just leave out any
sort of oversight, then you really have a tremendous potential problem.
Are there any other examples of tech that we're not
thinking about the unintended consequences for our rights or privacy yet? Well,
you know, AI is really big altogether, as
you're saying, across many different kinds of issues. I was

(43:52):
just, actually, this is a little tangential to your
question, but you were asking me before about cases that
I had worked on, and there was another case that
I worked on that was about tech, where I wrote
the ACLU's brief in the Supreme Court.
It was an amicus brief (it wasn't about our client),
but it was a case called Riley versus California. And
what the police were saying, this was most law enforcement places,

(44:13):
the federal government as well as the state of California
and many other jurisdictions, was that when you arrest somebody,
the police get to do what is called a search
incident to arrest, so they get to see what you
have in your pocket. Makes some sense, right? You know,
if you have a gun in your pocket, that's a
problem, or, you know, whatever. So they get to do
a search incident to arrest. And the law had
been that if they find something in your pocket like

(44:35):
that's a container, they can search inside the container
to see if there's anything in it that could
be harmful. And in fact, there was one situation where
they opened up a cigarette package that somebody had and,
you know, they could find a razor blade, they
could find a marijuana cigarette, whatever. So that was law
where the Supreme Court said, yes, you're allowed to search
people and search the containers that are on them. Well,

(44:56):
what law enforcement said was, your cell phone is a container.
When we arrest you, we can search your cell phone;
it's a container, and we have the right to search it incident
to arrest. And so we wrote a brief saying, no,
it's not, you know, it is a container, but it's a
container that essentially is your home, it's your library, it's
your desk. So allowing the police to look in your

(45:16):
cell phone when they only had really very feeble and
very unlikely scenarios, things that just wouldn't happen too often
for what the need was. You know, maybe you had
some remote thing that would go off and would blow
something up. You know, oh, come on. But yeah, there
were other ways to deal with a lot of that,
and so the Supreme Court actually agreed with that. They said, yeah,
this really is just a technological way of finding

(45:39):
out what's in all your papers and books and records.
It used to be they were in your desk, and
now they're in your cell phone. So that, to me,
is sort of a whole thread of what we've
been talking about. But the challenges to civil liberties are
different, and in some ways greater, when the technology builds up. Yeah,
there's a great quote on the ACLU

(45:59):
website: the fact that technology now allows an
individual to carry such information in his hand does not
make the information any less worthy of the protection for
which the founders fought. U.S. Supreme Court Chief
Justice John Roberts. Exactly. I like to talk about, you know,
one of the whole points of the Constitution adding the
Fourth Amendment, which is the protection of privacy, is they

(46:19):
wanted to protect what was in Benjamin Franklin's desk. Nobody
should know if he was writing some things that were
anti-government, and we now have that on our cell phone.
So of course. But that's where I think that a
lot of the protection of civil liberties is applying our
fundamental principles in different circumstances. Taking a gigantic step back,
what do you think is the biggest threat to civil

(46:41):
liberties in the new world order? In the new world order? Well,
you know, it's hard to just select one. It's sort
of like Sophie's choice, you know, which is your
favorite child? But right now, I think, among our
very top priorities, mass incarceration is a big
one, because so many people's lives are just being totally disrupted,
their families' too, and the question really has to be, for what?

(47:02):
One thing that we're hoping is that the work we've
been doing around trying to get vulnerable people released from
prison, so that they won't get the virus and get
seriously ill and possibly die, is, we're hoping that once jurisdictions
see that they were able to release thousands of people
from prisons and jails and that it's not going to
cause a spike in the crime rate, that it really

(47:24):
is a pretty safe thing to do. We're hoping that that's
going to stick and that in the long run we'll be able
to rethink, well, did we really need to put all
those people in prison and jail to start with? What
are we doing with the criminal justice system? So that's
really big. But the other thing that I think is
really big right now is voting rights. I had alluded
to this at the beginning of our conversation, but the

(47:45):
premise of democracy is that the people get to decide
on who should be running the government and who should
be making the policy about all these things we're talking
about here. You know, what are the regulations about technology?
What are the regulations about reproductive freedom, or everything else,
LGBT rights? And if the people's vote is distorted,

(48:06):
that's a real problem, if people can't vote. So we
have litigation going on right now in I think it's
like thirty different states trying to get people the opportunity
to vote. So one of the things that has happened,
in addition to all the ways that incumbents have been using
to try to protect their own seats, is that the

(48:27):
virus has really made it dangerous for people to vote
in public places. So we saw the election in Wisconsin
where people were just lined up for, you know, tremendous
distances, waiting for a really long time to vote,
because Wisconsin would not allow them to submit absentee ballots.
And in fact, a study showed afterwards that at least
seventeen people got the virus from voting. Many, many

(48:49):
polling places were closed because, first of all, the
poll workers are generally elderly people, and the poll
workers were not able and willing to man the
polling places. There are a number of states that don't allow
absentee ballots at all unless you have a particular situation,
like if you're disabled, and the states are saying, oh, well,
you know, fear of the virus and getting ill,
that's not a disability. Or before you get an absentee ballot,

(49:12):
you have to have it notarized, you have to have witnesses. Now,
how is all this going to happen? So it's very
concerning that people are going to have to choose between
their health and their right to vote. And we don't
think that that should happen. And that's something that has
to be attended to right now, because if states don't
come up with plans for trying to enable everyone who

(49:33):
wants to vote to be able to vote, and for
counting absentee ballots and for administering this program. If you
don't come up right now with the plan and the resources,
a lot of people are going to be left out
and they're going to find that either you know, they
can't vote because they're afraid to go out to the polls,
or the vote is not going to be adequately counted.
So I think that right now making democracy work is

(49:54):
really one of our top projects. What is the solution to some of these problems? What are your tangible solutions? Well, one tangible solution is that more states have to make absentee balloting available to people without having all these conditions and, you know, obstacles. The other solution: you were talking before about truth. A lot of the

(50:14):
reason that's given, the very thin veneer of justification that's given for "we don't want absentee ballots" or "we need voter ID," requiring people to carry government-approved voter ID, which means you have to go down to a governmental office in person and get your voter ID and show it at the polls, the excuse for a lot of this is that there could be fraud. Well, studies

(50:36):
have shown that there's virtually no voter fraud; it's really a unicorn. And again, I think if people understood that, they'd see it might sound good, but it's not true. I think truth is another thing that we're really fighting for these days. Can you listen to the evidence? Can you listen to the public health officials? Can you listen to what's real? I know for a fact that tech companies are very concerned about voter suppression,

(50:58):
you know, and misinformation spreading online, this idea of countering truth around a lot of these very important initiatives, whether it's absentee ballots, whether it's showing up to the polls, all that kind of thing. You know, I'd be curious to know your take. There's a current battle happening right now. You have seven hundred fifty advertisers boycotting Facebook, asking for better policing of hateful content. Are social media companies doing enough

(51:21):
to police harmful content, especially as we head into an election where voter suppression and the spread of misinformation will most certainly be a tactic used to manipulate voters? Well, let me actually break your question down into two different parts, because you started by talking about the concern about voter suppression. I think one thing that everybody should be doing is to increase awareness of what is a fair

(51:42):
way to improve access to the ballot for everybody. And some of those things are tech solutions. We've had tech solutions for years that are available and not widely enough used. How do you enable differently abled people to vote? Can blind people vote; do they have the technology? So there are a lot of areas where we need the tech community, and we need everybody, to find out how

(52:04):
voting can be made easier, and to let people know what the rules for voting are where they live. So one thing the ACLU is doing is we have on our website Know Your Rights, where you can find out what your voting regulations are. And that's something that I think people really have to start thinking a lot about: to let all their communities, all their friends and family know about the importance of voting and what they have to

(52:27):
do to vote, and to urge them to just get out and vote in whatever form that's going to take. So I think that's really important. In terms of disinformation on social media, people talk about the First Amendment and whether, you know, there's a First Amendment problem with Facebook telling you what you can't do. Well, there isn't, because the First Amendment only applies to the government, so you

(52:49):
don't have a First Amendment right to say whatever you want on Facebook. However, I have to say that we don't regard it as altogether a simple issue, whether Facebook should be telling everybody what they can't say, because even though the First Amendment does not apply to private companies, there's still a tremendous value to free speech. And there are a number of examples

(53:09):
that we've come up with, about people who have had speech suppressed for bad reasons. I'll give you one example. There was an African American woman who posted something on Twitter, and she got all these horrible racist responses, and she posted a screenshot of the responses that she got to show people what she was up against, and Twitter took it down because it

(53:31):
included racist words, which, you know, kind of misses the point. Another example: an ACLU lawyer wrote about a statue in Kansas, a topless statue of a woman who was bare-breasted, and whatever the locality was in Kansas decided to take it down because they considered it to

(53:52):
be inappropriate. So the ACLU lawyer, who was challenging whether or not the city, I think it was, could take it down, posted a picture of the statue, on Twitter or I think Facebook, and that was taken down on the ground that it was obscene, so she couldn't post the picture of the very thing she wanted to talk about. So we think that social media control is really a two-edged sword.

(54:14):
What I liked is, at one point Facebook had a protocol about what's true and what isn't true, and what they did was they gave you a flag. If they were concerned that something that was said wasn't true, they would have a neutral fact-checker check it, and then if it didn't check out, they would put a little flag over it and say, this has been questioned, and you could click on the flag and

(54:34):
you could see why it was questioned. But they didn't just take it down. So, you know, I agree that disinformation is a tremendous problem, but the idea that the solution is to ask the tech companies to decide what we should and shouldn't see? I don't think that's so great either. And certainly they should not be doing it without a lot of transparency and accountability. If they're going to be taking things down,

(54:57):
they should tell us what their protocols are, and, you know, there should be more public discussion about where the balance is there. Yeah, it certainly seems like the protocols change quite a bit. Especially having covered tech for this many years, it certainly seems like Facebook changes them, Twitter changes them, and oftentimes it depends on public pressure. I'm curious to see what happens with all these advertisers boycotting.

(55:17):
Personally, I have a feeling it won't impact the bottom line much and they'll go back to business as normal. But who knows. I do know that Zuckerberg cares deeply about his employees, but they've been kind of up against, you know, public scrutiny for a very long time. But it certainly is interesting, especially when the stakes get higher and disinformation

(55:38):
can go further, and especially as we get closer to an election, it certainly feels like everyone feels more triggered around it. Yeah. Well, you know, one of the classic statements about the First Amendment is that in the marketplace of ideas, the best antidote to bad speech is more speech, right? So I think we always have to worry, every time somebody is censoring and suppressing,

(56:01):
who are we giving that power to? You know, nearing a close, because we don't have you for too much longer: I saw that you gave a talk, A Democrat and a Republican Walk into a Bar, and you were saying that it seems like these days Democrats and Republicans can't really agree on anything, but we all need to agree on fundamental American principles like due process, equality, and freedom of conscience.

(56:23):
So is that possible? Are you an optimist? Do you believe that, in this current environment, that's possible? Well, I think that's a great wrap-up question. So that speech, I gave it at the Central Arkansas Library, and my chief point, as you're saying, is I think that people have to be able to

(56:44):
agree on neutral principles. The Constitution was designed not to say what we're going to do about everything. It was designed to have everybody have a fair opportunity to be part of the process of deciding what we're going to do. So it sets up all these democratic structures where we get to vote for the people who are the policymakers, and we all get to decide. But the principles there,

(57:06):
the underlying principle, is that everybody should have a fair shot. The principles should be neutral: everyone should get to vote. It's not like, you know, if you're a Democrat your vote doesn't count in this area, and if you're a Republican your vote doesn't count in that area; that's not fair. And the basic ideas of freedom of speech, freedom of religion, they're all, to me,

(57:26):
manifestations of the golden rule: if I want the ability to choose my own religion and decide what religion I'm going to practice, I have to respect your right to make a different choice and have your own religion, because that's the golden rule. If I want to say something that's unpopular, I have to respect your right to say something that's unpopular. And if I want to be treated fairly, and not locked away for

(57:47):
doing something minor and never given a fair trial, I have to respect your right to the same treatment. All those fundamental principles are things that we really all should agree on. I think people get into arguing, and assuming that they can never agree on the principles, because they're differing on what they think the results should be. And I think

(58:07):
part of the point of civil liberties is that it's all about process; it's not about results. The ACLU is nonpartisan. We don't try to get Republicans elected. We don't try to get Democrats elected. We don't favor or disfavor individual politicians or individual parties, but we do favor that there should be neutral principles that everybody can agree to, to say, okay, here's what's fair. And

(58:29):
the analogy I used in that talk at the Central Arkansas Library, it was one of the nights during the World Series, but fortunately not a night when there was a game, so people were able to come, and I said, okay, what happens before a baseball game is that everybody has agreed on the underlying rules, and everyone agrees that your umpires, your referees, in any sport, should be neutral.

(58:52):
And you don't want somebody who's partisan. If they were favoring one team, you'd get rid of them. All sports fans could agree to that. You know, maybe there would be a few who would be just so Machiavellian that they would rather have the biased umpire who always rules for their side. But I think sports fans can agree on what you really want for a fair game. Because you want a fair game, you want everyone to agree

(59:13):
on the principles beforehand. And I think that if we could sit down in small groups around the country and really talk about what the fundamental principles are, I am enough of a patriot to think we actually could agree about a lot. And let me give you an example of why I think there's some basis for hope. Maybe not optimism, but certainly hope. We were talking about voting rights.

(59:34):
So one of the major problems is gerrymandering: the way, when a party is in power, they try to distort all the districts and stack the deck so that their party will remain in power. Or if the party in power in a particular state thinks it's to their advantage to not have that many people vote, they try to make it harder to register to vote

(59:54):
for new voters, etcetera. The ACLU, and a number of other organizations working in coalition with us, have had a fair amount of success doing ballot initiatives, going to the people of a state, in states like Michigan and Nevada and Missouri and Florida, where we were part of getting Amendment

(01:00:15):
Four passed, which gave the vote back to people who had been convicted of a felony at some point. When you ask the people of the state, you can get a majority, sometimes a supermajority, of people who say, no, we want the rules to be fair. Who doesn't want the rules to be fair? Legislators who are incumbents and who want to keep their seats even if it

(01:00:35):
takes unfair procedures to do it. So that's a real problem we have right now: the incumbents, the people who are trying to maintain power and not allow any sort of regime change, are pulling all the levers. But I think the chief grounds for optimism is that when you go to the American people themselves and say, well, do you want a fair system, or do you want a system where you think your

(01:00:57):
side is more likely to win? You talk to them about that, and I think that you're going to get them to say they would really like to see a fair system, and that is the promise of America. Last question: you have taught at Brooklyn Law School for many years. What is the lesson your students will take from this moment in history? Well, I know there are lots of lessons,

(01:01:18):
but if you could extract it, what is the lesson your students will take from this moment in history? Well, you know, on an individual level, one thing I'm doing for the fall is I am preparing a course that I'm calling COVID Nineteen and the Constitution. What we're going to do in this seminar is look at the ways in which the Constitution has been challenged, and see, you know, how well it holds up.

(01:01:40):
What does the Constitution have to say about whether you can quarantine people, and whether you can allow people to be at a religious assembly but not go to a protest, etcetera, etcetera? So I think there are a lot of interesting things there which are very much of this particular moment. But big picture, what I would like the students to take away, the constitutional law students especially, is

(01:02:00):
essentially what I just said to you: that the Constitution is about process. It's not about results; it's not about, you know, you're a Republican and you're a Democrat and we have two different countries depending on what your party is. I think that we have one country, and it's all about a neutral process, for very good reasons, and I would like people to think more about that. After my speech at the Central Arkansas Library, I had two examples

(01:02:24):
of people who talked to me. One guy came up to me and said, I'm the Republican who walked into that bar, and, you know, you're making a lot of sense to me. And then there was another guy who talked to me who was a Democrat. He said, you know, I never really thought about that, that maybe it's not right if we're only trying to win. I never thought about it; that's not what we do in sports. And that's what I'd like people to

(01:02:46):
think about. You know, do you really want to do things that are only about how you think it's going to come out, and cheat and destroy the system, and, you know, put a thumb on the scale and stack the deck in order to make things come out to your preferred result in the short run? Or, in the long term, is that just a really bad idea, because it's just totally inconsistent? You know, we've just come

(01:03:07):
from the Fourth of July. It's totally inconsistent with the premises on which we would like to believe our country was founded. Does technology throw a wrench in the system? I mean, it does. It does create lots of things you can't control, and it always does; it's always a new environment. So, you know, a different kind of example: we were talking about technology and surveillance, where of course technology

(01:03:29):
has enabled a whole lot of surveillance that we then have to deal with. But technology also enabled a whole lot of new marketplaces of ideas. So the ACLU did a lot of litigation a few decades ago on applying First Amendment principles to the Internet, right? Could the government censor what was on the Internet because, you know, a child might see it? Yeah,

(01:03:51):
And so, you know, with every new generation of technology there are new challenges about how you apply our principles, like privacy and free speech, et cetera, to the Internet, but the principles remain the same. I hope everyone is doing well

(01:04:17):
in these strange and surreal times and adjusting to the new normal. Most importantly, I hope you're staying healthy and somewhat sane. Follow along on our social media: I'm at Lorie Siegel on Twitter and Instagram, the show is at First Contact Podcasts on Instagram, and on Twitter we're at First Contact Pod. And for even more from Dot

(01:04:37):
Dot Dot, sign up for our newsletter at dot dot dot media dot com slash newsletter. And if you like what you heard, leave us a review on Apple Podcasts or wherever you listen. We really appreciate it. First Contact is a production of Dot Dot Dot Media, executive produced by Laurie Siegel and Derek Dodge. This episode was produced

(01:04:59):
and edited by Sabine Jansen and Jack Regan. The original theme music is by Xander Sang. First Contact with Lorie Siegel is a production of Dot Dot Dot Media and I Heart Radio.