Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey there, everybody, this is Jonathan Strickland, host of Tech Stuff. Today,
we have something a little different for you. We have
a series called First Contact with Laurie Segall, uh, former
correspondent with CNN, and if you have not listened to
that show, you definitely need to check it out. She's
(00:22):
doing amazing work having incredible conversations with thought leaders in
the tech space and the social space, and this particular
episode I think is really important and one that really
needs to be heard by as many folks as possible.
She sits down with ACLU president Susan Herman, and
(00:42):
together they talk about the issue of racially biased face
recognition technology. And you've heard me speak on episodes of
Tech Stuff in the past about biased artificial intelligence, facial recognition,
you know, image recognition, that sort of stuff, and how
that is a real problem in tech and it's something
(01:04):
that we absolutely have to get our heads wrapped around
and address. So, without further ado, I'm going to have
that episode play in lieu of a Tech Stuff episode
for today, and I highly recommend you check out First Contact.
It's a great show and one that I think really
(01:25):
dovetails nicely with Tech Stuff. So if you enjoy
Tech Stuff, I think you're really gonna like First Contact
as well, and we will catch you next week with
new episodes of Tech Stuff. Thanks. First Contact with Laurie
Segall is a production of Dot Dot Dot Media and
iHeartRadio. There's a great quote on the
(01:49):
ACLU website. The fact that technology now allows
an individual to carry such information in his hand does
not make the information any less worthy of the protection
for which the founders fought. Exactly. I like to talk about,
you know, one of the whole points of the Constitution
adding the fourth Amendment, which is the protection of privacy,
is they wanted to protect what was in Benjamin Franklin's desk.
Nobody should know if he was writing some things that
(02:11):
were anti-government, and we now have that on our
cell phones. So of course, that's where I think
that a lot of the protection of civil liberties is
applying our fundamental principles in different circumstances. We are in
(02:34):
a moment of reckoning as we enter an age of
ubiquitous surveillance, questionable data collection practices, even algorithms that discriminate.
It's minorities, especially Black and brown communities, that are disproportionately affected.
Over the last months, as the nation has grappled with
a conversation around police brutality, we've seen predator drones used
(02:56):
for aerial surveillance at protests, facial recognition technology that wrongfully
accused a black man of a crime he didn't commit,
and it wasn't a coincidence. Reports say the tech is
a hundred times more likely to misidentify African American and
Asian people, and as COVID nineteen continues to spread, there
(03:16):
are serious questions being raised about contact tracing apps and
how that data collected could be misused. These issues raise
ethical questions about technology and its impact on our civil liberties, equality,
and the future of our country. For Susan Herman, it
is an extraordinary time to be sitting in her seat
(03:36):
as president of the ACLU. Over the years,
the American Civil Liberties Union has filed lawsuits fighting for
free speech, reproductive rights, and privacy. But as technology continues
to muddy the waters, the tradeoffs become more complicated. Where
do we draw the line between security and privacy and
how do we prevent technological innovation from outpacing the law.
(04:00):
I'm Laurie Segall, and this is First Contact. Susan, thank
you for being virtually with me today. Thank you for
inviting me, Laurie. Yeah, you know, I always start
out these interviews with our first contact. I talked to
guests about how we met, and we don't really have
a first contact. We've never met in person, but we
(04:22):
met on an email chain because we were going to
do an interview together for something else and
it fell through. So I said, you've got to come
on the podcast because you are just sitting in such
an extraordinary seat at such an extraordinary moment in time.
So that's our first contact. Well, thanks, It just seems
to me like our first contact was total serendipity. Yeah, exactly,
(04:44):
So, you know, to get started: you've been the president
of the ACLU since two thousand and eight,
and I said this before, but you know, what an
extraordinary time to be sitting in your seat. You know,
how are you feeling? Oh my, it's just sort of overwhelming.
You know, as president, I'm the chair of the board,
so you know, I'm not the one doing the day
to day work as all of the members of our
(05:05):
staff are. But to be a member of the
ACLU staff right now is just, it's mind
boggling because we had, you know, a lot of work
that we were already doing before two thousand and sixteen,
with all of the states making worse and worse laws
about reproductive freedom and voting rights and immigrants rights, and
you know, all sorts of other things. Then came the election,
and since then we have brought a hundred and seventy
(05:27):
three legal actions against the Trump administration for things like
family separations and the travel ban and prohibiting trans people in
the military. Then in March, COVID hit, and at that point,
you know, since then, we've also brought over a hundred lawsuits,
including a hundred lawsuits just about people who are
incarcerated in jails and prisons and ICE detention, and who
(05:49):
are just in a hotspot. You know, they have no
control over whether they can social distance, and so we've
been working very hard to get vulnerable people out of
those terrible situations, out of basically death traps. Plus,
COVID also led to a number of states opportunistically restricting
things like freedom of abortion, declaring abortion to be a
(06:10):
non-essential procedure, so people could just wait until the
pandemic is over to get an abortion, right. And voting
rights has also just been a really fraught area right
now, because all the restrictions on voting and the ways
in which the vote was becoming distorted have just
been magnified by all the difficulties of the pandemic. So there's a
(06:31):
lot to talk about. So I was gonna say,
what I'm hearing is you're sleeping really
well at night. You know, there's no work to do.
Almost nothing to do. The staff are just sitting
around polishing their nails. Yeah, I mean, like, take me
to March, like coronavirus hits. You have been involved in
some of these monumental cases that have just shaped society
and our civil liberties. Like, coronavirus hits. And now, you know,
(06:56):
we have a little bit I don't even think we
have the luxury of perspective at this point, but we
have a little bit more perspective. But take me
to March. Like, in your role at this extraordinary moment,
like what was going through your head? What were you
concerned about at the time. Well, you know, one of
the first concerns is just you have to close the office.
So the first concern is how can people do all
(07:17):
this remotely? It increases the work and makes it more
difficult to do the work. So we just had to
really make sure that our technology was up to doing things.
So one thing that the ACLU did
was to buy new laptops for some staff people who
were going to be working, and you have to worry
about how the technology is working, which has been
a question for us every time something really big hits.
(07:40):
When the travel ban hit, there were so many people
wanting to donate to the ACLU that our
website crashed. So even things like that, you know,
that's, you know, number one: how do you handle this?
handle this? We have been fortunate so far that the
ACLU is so well managed and we
had not spent every penny that all of our donors
had given us up until that point, so we have not
had to lay people off, which is very fortunate because,
(08:03):
as you were saying, there's more than enough work to do.
But yeah, that's the first concern of just you know,
how do you keep the organization up to speed and
ready to do what staff members now need
to be doing: an incredible amount more work. But for
some of them, it's, well, they're juggling a toddler and
a dog. Yeah. Can you give me a run through
of some of the cases that you've been involved in
(08:24):
that correct me if I'm wrong. You started out as
an intern, right, and really just worked your way up.
I mean, I can imagine you've been involved, and I
know you've been involved in some pretty extraordinary cases. To
give listeners some context, can you explain some of the
cases that kind of stick out to you. Well, I
was an intern for the ACLU back, you know,
in the nineteen seventies, you know, around the time when
(08:46):
I was in law school. And just to make sure
that everybody understands, I don't actually work at the
ACLU. My day job is I'm a law professor,
and I don't generally work on the cases. What I'm
generally doing is helping run the organization. But I'll tell
you, I think it would be interesting to start with
the first ACLU case that I actually
did work on, which was while I was a law student,
(09:08):
and this was the case. One of my connections with
the ACLU originally was that one of
my law professors in the first year was connected with
the New York Civil Liberties Union, and he had some
clients who came to him who were graduate students at
Stony Brook on Long Island, and they had just discovered they
were not allowed to live together. They had rented a
house together, there were six of them, and they had
just discovered they weren't allowed to live together because there
(09:30):
was an ordinance in their village, a village called Belle Terre,
that prohibited more than two persons unrelated by blood, marriage
or adoption from living together. So, you know, they were
pretty shocked. And it turned out that under the law
as it was at the time, by the time they were
talking about this, they were liable for all sorts of
criminal fines and punishment. It was really very heavy stuff.
(09:54):
So I started working on that case with my law
professor, and we went to a federal judge to
ask for a temporary restraining order, which means that, just
until we had litigated whether or not it was a
constitutional thing to do to tell people who they could and
couldn't live with, the village should not be allowed
to either kick them out of their house or
(10:15):
to, you know, start locking them up because, you know,
they owed so many fines for having been illegal residents.
So the judge ended up signing the order,
and then one
of the ways, which was actually the original way in
which our clients had discovered that they were illegal residents,
was that they had applied for a residents-only beach permit
(10:35):
and they were told they couldn't have one because they
were illegal residents. So the judge we had, the
district judge, who was a very nice man, looked at
the order we had written out and he said, well,
you know, it's the summer. Don't your clients want to
go to the beach while the litigation is pending? Do
you mind if I write that in that they have
to be allowed to park in the parking lot of
the beach. So we said sure, you know, that's very nice,
(10:55):
so he wrote that in. Then, as the junior member
of the team, I was sent out to explain to
our clients, to show them the order and explained to
them what was going on, and they gave me a
tour of what the village looked like and the
residents-only beach, and the minute the wheels of their
car hit the parking lot, this very large, fierce
looking man comes striding across and says, what are you
doing here? You're not allowed to be in this parking lot.
(11:18):
And they all look at me, and I'm thinking, what
am I? I'm like, you know, twenty something, I'm not
very tall, but what am I supposed to do with
this large man who doesn't want us in there in
his parking lot? And then I remembered that I had
a federal court order right on my person, so I
kind of drew myself up, and I showed him my
federal court order and I said, well, I'm with the
New York Civil Liberties Union kind of, and I have
(11:39):
a federal court order saying that these people are allowed
to be in this parking lot and go to the beach.
And he melted. And that was, I think,
one of the points at which I thought, wow, you
know, this is really powerful stuff. So that
was my first ACLU case. Yeah, exactly, that's great.
And I read that maybe your
earliest memories of speaking up to authority involved, I
(12:02):
think a dispute over a book at your school library. Yeah,
that's right, even before the Belle Terre case. My first civil
liberties hero was my mother. So when I was in
third grade, we were doing a school play about a
story called Johnny Tremain, about a boy in the American Revolution,
and I thought the play was interesting, but plays
don't have that many words, and we were told that
(12:23):
this was based on the book. So I went to
my school library, my public school library, and I asked
to take out the book, and the librarian said, oh,
you can't take out that book, dear, that's in the
boys' section. And I was surprised to find
this out. I'd been reading books in the girls' section,
which were all collections of fairy tales and biographies of
presidents' wives, but it had never occurred to me that
(12:45):
I wasn't allowed to take out a book from the
boys' section. So I went home and I told my
mother about this, just thinking, you know, that's
the way things are, and she just exploded and she called
the librarian the next day and said, how dare you
tell my daughter what she's not allowed to read.
So the librarian told me that from then on, I
could take out any book I wanted, and you know,
not long after that, they changed the policy for everyone.
(13:08):
So you know, there was another example of how you
know you can kind of speak up to authority when
they kind of tell you who to be and prevent
you from making your own choices. Were you always like that? Well,
you know, that was third grade, and I feel like, yes,
I think for most of us our values are formed
when we're pretty young. Yeah, so, you know, seeing my
mother do that, I'm sure it would have had an
(13:28):
impact on me. Yeah, that's such a good story. And
did you I mean, did you always know you wanted
to go into law? No? I actually really didn't, because
having grown up as a woman during that era, my
father was a lawyer and he always used to talk
about the fact that law was really not a good
profession for women. Why would you want to do that
if you could be an English teacher and have the
summers off and take care of your children? So it
(13:50):
took me a while. I graduated from college and
then spent a few years doing other things and then
decided to go to law school. Well, I mean, it's
so interesting now to see where you're at
and to see this moment. It does feel like a moment.
And I was looking at something you said about you know,
this feels like a moment we can be optimistic because
(14:10):
so many Americans are beginning to really understand the scope
and the depth of structural racism. It certainly feels, you know,
I'm based in New York City. You can just feel
it right on the streets with the protests, and you
hear the sirens and the helicopters, you know, as we
sit here. And given, you know, your rich
history covering and caring about these issues, what is
(14:32):
the challenge for you guys ahead, Well, you know, the
challenge on that particular subject is that this is work
that we had already been doing. One of our top
priorities for the past several years has been trying to
break our addiction to mass incarceration, which, as everybody is
now really coming to terms with, is really
(14:53):
a system that has disproportionately affected people on the basis
of race and income and disability, with so many of the
people who are arrested being people who are mentally ill.
And our feeling is that the system has been fundamentally
broken and misguided for a long time. So part of
what we're trying to do with this moment is to
capitalize on the fact that people want to look at
what the police do. We're trying to encourage people to
(15:16):
look beyond the police. It's not just, who are
the police arresting and how are they treating the people
they arrest? I think behind that is the question of
what do we really want to treat as a crime.
So when you treat all sorts of very minor misconduct
as a crime, you're really setting up a situation where
there are going to be more contacts and therefore potentially more
(15:37):
arbitrary and discriminatory contacts. So, if you think about it,
Eric Garner ended up dying because he was selling single
cigarettes on which the tax had not been paid. George Floyd.
The basis for that encounter was that they thought he
might be passing a counterfeit twenty-dollar bill. So I
think that if you look at why are we criminalizing
(15:58):
some of the things we criminalize, especially if you're talking
about people who are mentally ill and are having problems.
Do we really want the police to be the people
who are the first responders to people who are having
a mental health crisis or is there some more effective
way to deal with that that would avoid putting those
people into the criminal justice system, which isn't really good
(16:18):
for anyone, and to maybe recommit, reallocate some of the
resources we're using on arresting people and locking them up
to actually dealing with the mental health crises with
mental health treatment. So instead of viewing every
dysfunction as a matter of policing, why don't we spend
more time reinvesting to try to prevent more dysfunction?
(16:40):
It's sort of like the old saying, you know, if you're
a hammer, everything looks like a nail. Well, you know,
not every problem in our society is a problem for
the criminal justice system, and an occasion to arrest people
and lock them up. A lot of them really should
be an occasion for thinking about public health treatments,
thinking about how we want to approach homelessness, and
having a lot of much deeper thoughts about how you
(17:01):
prevent dysfunction, rather than answering everything with, we're going
to send in the police. It certainly seems also like
this moment, even coming out of the pandemic, I can
only imagine the mental health crisis is going to be
even worse. Yeah, that could well be. And I
think the pandemic is also showing us something. Somebody asked me
the other day whether the protests over policing and police
(17:23):
brutality are related to the pandemic, and I was in
a webinar and one of the smart people in the
room said, oh, no, no, they're two entirely different things.
But I said, what do you mean. The same people
who are being disproportionately affected by policing and police brutality
are the people who are being disproportionately affected by COVID.
The statistics show that people of color are much more
(17:43):
likely to die, and there are a lot of reasons
for that, having to do with underlying health
and having to do with the fact that minorities and
people who are not affluent don't get to work from home,
they don't get to work through Zoom. They are the
people who are out there on the streets being the
frontline workers, being the people who are picking up
the garbage, being the people who are stocking the supermarket shelves.
(18:05):
And I feel like the virus is really amplifying so
many of the inequities we've had in our society. And
I think especially you know, I don't know what it's
like for everyone else. But I live in Brooklyn and
in New York City. It really felt like a lot
of the people who were out on the street, they
were out on the street because they were upset about
George Floyd. But I think it was more that they
recognized that George Floyd was the tip of the iceberg
(18:28):
and that there was just a lot going on that
they really just could not tolerate any longer. More
from Susan after the break, and make sure to subscribe
to First Contact on Apple Podcasts or wherever you listen
so you don't miss an episode. Putting on the tech hat,
(19:02):
You know, I think most people probably don't think of
tech when they think of the ACLU,
But there's quite a bit of litigation in regards to
security and privacy issues around contact tracing, surveillance, algorithmic bias,
and obviously the ACLU has a hand
in checks and balances on a lot of the issues
that are emerging from the pandemic. You know, what are
(19:23):
some of the tech developments that you guys are most
concerned about. Well, since you were mentioning COVID and
the contact tracking and tracing, I'll start with that.
So the upshot is that we are neither for nor
against contact tracing. Contact tracing is something that really will
contribute to public health. Our concern is not to say no,
(19:43):
you can't do it, or you just go right ahead
and do whatever you want. What we're concerned about is
to minimize the damage to privacy, the damage to equity.
There are a lot of concerns that we have. The
other thing that we're concerned about is discrimination, again, because
there are ways in which the technology could also increase
(20:05):
pre-existing social inequities. We think that people should not
be coerced into participating in testing. We think it
should be voluntary, and we also think that it should
be nonpunitive, because if you start having the criminal justice
system enforcing whether or not people are willing to use
their phone to take a test or whatever it is,
(20:25):
you're just creating more opportunities for police interactions that will
at some point be arbitrary or discriminatory. So we don't
want to see rules and regulations, even if they
really are good public health rules, become occasions
for filling up the jails with the people who aren't complying,
(20:48):
because we've already seen there were some statistics in New
York that when you ask the police to start enforcing
who's wearing a mask and who's not wearing a mask,
that right away it was racially disproportionate in terms
of who they were questioning and who they weren't questioning.
So I think there are just a lot of issues
there that are very much part of your world, because they're
very much ethical issues. Yeah, you know, one of
(21:11):
the cases that I'm fascinated by. And,
you know, I honestly felt like it
was only a matter of time until
we saw this headline. And then we saw the headline,
you know, a man was arrested after an algorithm wrongfully
identified him. You know, I've been covering this for so many years:
AI is biased. AI is trained on, you know, on
(21:31):
data online, which can be very racist, you know, And
I think for so many years we've been having this conversation.
But the question of okay, well, what happens when it
gets into the hands of the police, what happens, you know,
if it gets used for policing? And so I
think it's such a fascinating case. And you guys
at the ACLU filed an administrative complaint with Detroit's
(21:53):
police department over what you guys are calling the country's
first known wrongful arrest involving facial recognition technology. I mean,
for context, a man was arrested because he was wrongfully
identified by an algorithm. The police department thought he had
robbed, I believe, like stolen watches, and he was arrested.
(22:14):
I mean, can you talk to me about the significance
of this case. I can't help but put on my
tech hat and scream. You guys, this is a really
big deal. Yeah, it is a really big deal. And
as you're saying, Laurie, we were aware of this problem
for a long time and we've been complaining. So going
back for a minute before getting to the case you're
talking about, Robert Williams: the National Institute of Standards
(22:37):
and Technology says that African American and Asian people are
up to a hundred times as likely to be misidentified
by facial recognition. So yeah, that's the background problem. And
so we knew that, right, you know, we knew that
before the case came up in Michigan. And it's
not the algorithm's fault, obviously; there's something that's being put
into the algorithm that, you know, has
(22:58):
a bias. And people tend to think that algorithms
are, you know, so neutral and that we can
rely on algorithms. That's what I was saying about the
contact tracking and tracing, that you start relying on
algorithms or apps that you think are neutral, and you
really have to be very wary of that. So again,
before getting to the Robert Williams case, an
ACLU staffer at the ACLU of Northern California
(23:21):
had the really interesting idea of trying out Amazon's facial
recognition program, Rekognition with a K, because, yeah, they were
just offering this to the police or whatever: this is great,
it will help you identify and see if you have
somebody who matches a mug shot. Well, what they tried
to do, which I thought was very clever, was they
tried to match mug shots against the members of Congress.
(23:43):
They got the facial pictures of all the members of Congress.
This was in July and there were twenty eight members
of Congress who were misidentified as matching the mug shots.
There were twenty-eight mistakes out of that, and not
only that, but the false matches were disproportionately
people of color. And one of the people who was
(24:03):
identified as matching a mug shot, and therefore, you know,
probably this criminal, was civil rights legend John Lewis,
the guy who was beaten up on the bridge
in Selma, you know, to get us all voting rights.
So yeah, we know that almost forty percent of the false matches
were of people of color, even though people of
color made up only twenty percent of the members of Congress.
(24:26):
So in some ways, you know, the Robert Williams case
is completely predictable. We knew that we allowed for that
to happen. It might have already happened elsewhere, but you know,
subterraneanly, in a way where we
didn't see the case. But what's amazing about the Robert
Williams case is that it happened right there, you know, visible
to everybody where you can just see it. So what
happened was that they told him that he was being
(24:48):
arrested because the algorithm
had said that he was a match for
this mug shot, and they showed him the mug shot,
and he said to them, do you guys think all
black people look alike? That looks nothing like me. So,
you know, it was pretty clear that if you used
your eyes and looked at the picture yourself, if you
didn't trust the algorithm, and if you looked at the
(25:10):
picture and this man's face, they didn't look alike. But nevertheless,
he spent thirty hours in jail under some pretty miserable
conditions because the algorithm said it was a match. So
I think that's really important. In some ways, the fact
that you know a problem exists is not as inspiring
to make people want to do something about it as
(25:30):
when you see it. So that's what happened with all
the protests about George Floyd. People could watch that
horrible video. They could see it. It was recorded on
the video. And here we have an actual person, not
just hypothetical statistics showing it, but an actual person who
did get arrested and did have a miserable time. He
was arrested in front of his family. It was really
(25:51):
traumatizing. And again, the officers involved were trusting
the science more than they were trusting their own eyes,
when anybody could look and see he didn't look like the picture, right?
And you know, he wrote an op-ed in the Washington
Post and he asked the question. He said, why
is law enforcement even allowed to use this technology when
it obviously doesn't work? So, I guess I'm asking a legal
(26:15):
scholar the question: you know, police departments all around the
country are using different variations of facial recognition software. So
you know, what regulations should we see as we enter
this era of algorithmic discrimination? Yeah, that's a great question.
And again we've been urging, you know, long before Robert
Williams turned up, we've been urging police departments not to
(26:37):
rely on the facial recognition technology, that it was just
not reliable enough to, you know, hold
people's fate in the hands of the algorithms. Algorithms don't have hands,
but for people's fate to be dependent on the facial
recognition technology which was being touted. And again, it's great
if a company is doing something to make money, but
if wanting to make money is your only consideration, and
(26:58):
if you're not considering whether you are unleashing something that
is really going to be disruptive of people's lives unfairly,
either because it's just going to be wrong or because
it's going to be wrong in a racially skewed way.
I think that's just really a problem. So we've
been urging police departments not to buy and use the technology.
(27:19):
And I'm sure you know Amazon has withdrawn the facial
recognition technology temporarily and they're not sure whether or not
they'll bring it back. So the probability of a wrongful
arrest is one thing, but when you draw the camera
back and look at all the technology in the bigger picture.
In addition to facial recognition, one thing that police departments
have been doing with facial recognition and different law enforcement
(27:42):
agencies is to try to see who attends a demonstration
or see who's in the crowd. So it ties not
into are you, like, is somebody likely to be wrongly
arrested like Robert Williams because they just there was a
false match. But it starts becoming big surveillance too that
an agency has the cameras on and then they have
(28:03):
the facial recognition and they're purporting to identify all the
people in that crowd, so that then they can track those people.
They now know that you were at the George Floyd demonstration,
and that person was at the anti-war demonstration.
And at that point, the government starts having more and
more information about all of us, to the point where
(28:24):
it feels like, instead of us controlling the government, it's
like the government controls us. So I think the facial
recognition is only one part of the whole tendency of technology.
Two amplify government power to be kind of watching, watching
what we do. Yeah, I mean, it's it's interesting to
(28:47):
hear you say that. You know, that type of
technology is just a part of it, especially when it
comes to this moment where people are out protesting police brutality,
when people are out fighting for their civil liberties. You know,
there's all sorts of technology that's being built. There are cameras
being built that can recognize people in
real time that police are wearing. There's all
(29:09):
sorts of technology. This is just the beginning of it.
I know you mentioned Amazon put a hold on
their sales of Rekognition software. Microsoft said it's not going
to sell face recognition software to police departments until there are
federal regulations. And IBM said that it was going
to announce a ban on general-purpose facial recognition. Is
that enough? Like, what is, I guess, you know, what
(29:32):
is the government's role here? Like, what do you think
should happen? Especially since this is just, as you say,
one small part of a larger issue that we're facing
as a society. I think that's right, and I think
that there could be, you know, government regulation, but that's
not going to happen unless the public wants to urge
their representatives to start controlling this. And what we've seen
(29:52):
is that an enlightened public can make something happen even without regulation, right? So, you know, it was that the public is becoming concerned, and that's the reason why Amazon acted to withdraw this. They started being concerned that their customers were not going to be happy with them. And I think at this point that's almost more effective than
(30:12):
government regulation. And once you have that wake up call,
then you can start having serious debates. And I think
those debates have to take place in many places. They
should be taking place in legislatures where people can talk
about the trade-off between privacy and mass surveillance and whatever the government is trying to accomplish. Why do they need this technology? Is it really worth it? Are
(28:34):
there crimes that they wouldn't be solving without it, and are they crimes that we're concerned about solving? Or do they fall into the category of, you know, is that something that we don't think should be a crime at all?
People are generally unaware, in terms of what the police do, that only four to five percent of all arrests involve crimes of violence. So when people think about, we
(30:55):
want to enable law enforcement to be out catching criminals, or we're concerned about divesting or defunding the police because who's going to protect us from physical harm? Almost none of what the police and law enforcement do is about physical harm. It's a tiny percentage. Everything else that they're
doing is about this whole array of all sorts of
other things that we criminalize. And I think that in
(31:17):
addition to having better conversations about whether there's a potential
for some of these technologies that the government is using
to create arbitrary or discriminatory enforcement, I think we need
to dig deeper behind that question, in the same way
that you need to dig deeper beyond the George Floyd
murder and to ask if there's something systemically wrong here.
(31:39):
Do you need to rethink the whole question? So when
people say, well, you know, but we need the facial
recognition technology because it helps the police solve crimes, well, okay,
but you know what crimes and what are the costs?
So I think once people are educated enough, and once they realize what the nature of the problem is, kind of what's being unleashed, they can start really being ready
(32:00):
to have that broader conversation. And I think it should
take place in legislatures, but I think it also should
take place, and evidently is taking place, in boardrooms at Amazon, Facebook, and Google and Microsoft. They should be talking, and they do sometimes, if the people demand it. And it also has to take place just among people, you know, among, you know, tech communities, and people just beginning to talk
(32:22):
about what are our responsibilities here. Is it okay for us to create products just to make money if we know that there are dangers, that the products are going to be misused, or maybe aren't reliable enough, or that they just feed into this enormous surveillance state? So let
me compare this to an earlier moment. After nine eleven, we had a kind of similar phenomenon: in order
(32:44):
to deal with catching terrorists, we changed a lot of laws that ended up really sacrificing a lot of privacy and allowing a lot more government surveillance. And for a number of years that went unchallenged, and people kept saying, oh, well, you know, if that's what we need in order to be safe, we're willing to give up a little privacy. So,
first of all, I think people didn't think about the
(33:06):
fact that they weren't giving up their own privacy, they
were giving up somebody else's. And second of all, people
didn't realize how extensive the surveillance really was until Edward Snowden.
So then Edward Snowden came along, and people realized how the government was just scooping up tons of information about people and just keeping it in government databases, and people started realizing the horrifying potential of all that. What happened
(33:30):
was that Congress made a couple of little changes to
the law. But more important, Microsoft and Google and other
places started to realize that their customers were concerned, and
they started being a little less cooperative. At the beginning,
right after nine eleven, all of the telecoms, all these
companies were just saying to the government, you want information? Here, take it all. Your Verizons were saying, sure, you know, here
(33:50):
are all the records of all our customers. Take it all. You're keeping us safe. And I think that to me,
the most important thing is an informed public. That if people can examine for themselves whether they really think that we're being kept safe by all of this, and really examine both the costs and the benefits in an educated way, I think we get much better discussions.
(34:11):
And I think not only do you have the possibility
of getting better legislation or regulation, you also have the
possibility that private companies, and, you know, the tech companies, are not going to want to do it
anymore because their customers don't want them to. Yeah. I
mean it's hard to have an informed public and to
have these discussions, even in this current environment. To some degree.
(34:32):
I mean, people I think are struggling with the idea of truth. People are, um, you know. And I remember, by the way, I remember the Snowden leaks. Like, I remember being in the newsroom covering technology, and thinking to myself, because I rode the tech bubble all the way up, right, and thinking, this is an extraordinary moment, because we saw that we've been sharing all our data.
(34:53):
But we saw for the first time that, you know,
the government had a lot of access to things that
we had no idea they had access to. And I
think it was a fundamental shift, and the lens on
tech companies changed at that moment, and tech companies' behavior has changed quite a bit after that. You know, I
wonder this moment we're sitting in where we're having these
(35:13):
debates about surveillance and privacy and whatnot. These are sticky
debates and they're very politicized as we're heading into an election,
as we have misinformation spreading online, as a lot of
people don't know what to believe and what not to believe.
As the media landscape has changed, it's it certainly seems
like a harder environment to even to even have some
(35:33):
of these conversations. Well, I think in some ways it's harder. In some ways, I think the other thing that is a catalyst for the discussions is realizing that there's a dimension of race to all of this. I think, in talking about artificial intelligence and facial recognition, not many people saw that as an issue of structural racism. You know, that there's something wrong with how we're putting together the algorithms, and it ends up that John Lewis is going
(35:55):
to be misidentified as somebody who matches a mug shot
and that Robert Williams is going to be arrested. So
I think that the fact that we now know that
that is an additional concern enables us to have richer conversations.
So we're not only talking about whether there's a trade-off between security and privacy. Plus, I think the other
thing that people are feeling much more open to is
(36:17):
to have that deeper conversation about what are our goals
here and if we're enabling all this government surveillance in
order to help the government to catch criminals, well, you know, what do we mean by criminals? What crimes are they solving? And how are they using, you know, how is this actually being used, in service of what? So I feel like in some ways, you know,
(36:37):
with the election coming up, I think that gives people
more impetus to want to talk about these issues, because
the elections aren't only about the president. They're also about
local prosecutors and sheriffs and the people who make the
decisions about whether to buy surveillance equipment and what they're
gonna do with their authority over the criminal justice system.
(36:57):
So one thing the ACLU has been doing, in addition to everything else, is we've been very involved in elections of prosecutors, because that's a place where people almost never used to pay attention to, you know, who were these people running, and maybe they would vote for somebody without really knowing what they voted for. So
what we're urging, and I think this is very much
what we're talking about, having an educated public. We're
(37:19):
urging people to go to elections, or to go to debates, to attend campaign events, I guess on Zoom these days, and ask the candidates questions:
what would be your policy about whether or not you're
going to accept military equipment from the federal government in
your police department? Are you going to buy tanks? Are
you going to buy you know, these horrible weapons that
(37:41):
are used. Is that something you would do? Are you going to buy, you know, facial recognition software? Is that how you would use your power if we elected you? Um, say to the prosecutors, would you support a reduction in cash bail and increased alternatives to incarceration? So that's a place where, without waiting for the government to do something, we can ourselves affect what's happening in
(38:04):
our communities, by encouraging candidates to think about what positions they're taking on these different issues and letting them know that they're gonna lose votes. The more people are educated, the more they can tell candidates that they'll lose votes, and try that. This is something that's worked in some places to encourage candidates to take a better position. Yeah. Yeah, they might have never
(38:28):
thought of that, but you know, once they commit themselves, you know that that's going to be better. So there are all sorts of ways that we can affect things.
More from Susan after the break. And make sure you sign up for our newsletter at dot dot dot media dot com slash newsletter. It will be launching this summer. Before
(39:00):
I move on from specifically some of the tech issues,
I have to bring up Predator drones. Right, right. You know, the U.S. Customs and Border Protection flew a
large predator drone over the Minneapolis protests. You know, people
were protesting police brutality and the killing of George Floyd,
and for many reasons, it almost felt symbolic. You know,
it was raising all these questions about aerial surveillance, about
(39:25):
what data was being collected, where was this going? What
is your take on this? Well, you know, as you're saying, Laurie, it really magnifies the opportunity to gather more information, because you don't even have to have the helicopters or whatever. So, you know, that of course is a concern: just how much information is the government gathering, what are they going to do
(39:46):
with it, who's going to have access to it? Will it ever be deleted, or will it just stay there in the government databases forever? But I think
the other thing that the Predator drone brings to mind is a question that people were also asking, which is about the militarization of law enforcement. We've had for years in this country the Posse Comitatus Act, as it's called, which says you don't want
(40:07):
the military doing everyday law enforcement, because that's not our country. We don't want the military to be, quote, dominating the streets, and we don't want the people who are out protesting to be considered the enemy of the United States. They are people who are expressing their opinions. And so the whole idea, you know, it's one thing,
(40:29):
it's bad enough, if the police helicopters are flying overhead and trying to keep track of, you know, who's in the crowd and what the crowd is doing. But once you start adding an element of the military, the military helicopters or the military drones, or things that feel like we are being treated as the enemy of the government, instead of as the people who are the government, who are supposed
(40:50):
to be controlling the government, I think that that's just,
It's a very bad paradigm. You think it's a slippery slope, Well,
it's a slippery slope unless we stopped the slipping. And
as we saw with you with Amazon and the facial recognition,
if people say, wait a minute, yeah, I think we
can make that stuff. But I think if people don't
pay attention, I think we have a very slippery slope.
(41:11):
And that's what I've been saying about most of the
issues we've talked about, starting with the contact tracing
and the surveillance and everything else. It seems to me
that what's really important is transparency. We should know what
the government is doing, and accountability. Back on the issue of contact tracing, one thing that the ACLU did, together with the ACLU of Massachusetts, is we
(41:32):
have filed a lawsuit, well, actually a records request, demanding that the government, including the CDC, release information about the
possible uses of all the location data that they would
be collecting in connection with contact tracing. Because, you know, if you don't know what they're doing, then you can't have a discussion about what they should be doing.
And one reason why I was bringing up all the
(41:54):
post-nine-eleven changes of law is that I think the whole idea that we can't know what the government is doing, that the government has to act in secret in order to keep us safe, or else the enemy will be able to know what they're doing and, you know, work around it, but the government can know everything that we're doing, I think that just has democracy backwards.
You know, we have to be able to know what's
(42:15):
happening inside the government. And that applies to why are
they sending the Predator drone? What are they going to
do with the information? What does this mean? Are they
going to do it again? And it also has to do with the contact tracking and tracing. Once they get that data, what happens to it? Are they going to ever erase it? You know, who do they share it with? What are they going to do with it? And I feel,
you know, those are really important issues in a democracy,
(42:37):
that we just have the right to know what the
government is doing so that we can talk about it.
And I feel like to sort of say, well, this
is what the government is doing and that's really bad,
and that upsets me. I think that kind of misses
the point. If the government is doing something bad, then
it is the duty of every American to find out
what they're doing and to push back. And so at the ACLU we have a program that
(42:59):
we call People Power. We first invented that and used it to explain to cities and localities all over the country how they could fight back against draconian immigration rules by becoming, quote, sanctuary cities, what their rights actually were. We then used it for voting rights. We're
about to use it some more for voting rights. But
(43:20):
what we have really urged, and I hope that, you know, some of your listeners will go to the ACLU website and see what People Power is doing in addition to what the ACLU is doing. Because what is the ACLU doing? That's all the staffers at home trying to, you know, work on their new laptops while they're trying to, you know, keep their toddlers quiet. But People Power is about what every single person can, and I think should, be doing.
(43:41):
You know, if people really educate themselves and think about
the ethical issues, the costs and benefits of all this
technology in addition to a lot of other things going on,
I think we get a lot better results if people
pay attention. Yeah, I mean, it's interesting to watch the ACLU take on issues like surveillance, facial recognition. I know the ACLU filed a lawsuit against Clearview AI, which was this very controversial company
(44:04):
that was using biometric data. I think facial recognition technology helped them collect something like three billion faceprints, and they were giving access to private companies, wealthy individuals, federal, state and local law enforcement agencies. And you know, coming
from the tech space, it certainly feels like sometimes these stories,
you just don't know what these companies are doing until
(44:25):
you start, you know, peeling back the layers and seeing
all the data went to here and here, and why
did it go there? And why wasn't this disclosed? And
and oftentimes it takes a watchdog to really understand where some of this can go wrong and how it's being used in ways that can be dangerous. Yeah, I think that's exactly right. And
(44:46):
that's why I was saying before that I'm concerned, before everybody jumps on the bandwagon about let's have more contact tracing and, you know, everybody should just be sharing all this information. I think we have to get a watchdog. Yeah, you're not gonna have the watchdog telling you things unless you build a watchdog into the system.
And if everything is just you know, a company has
invented this and is selling it to the police, or
(45:07):
a company has invented this and now we're all going to buy it. If you just leave out any sort of oversight, then you really have a tremendous potential problem.
Are there any other examples of tech where we're not thinking about the unintended consequences for our rights or privacy yet? Well, you know, AI is really big altogether, across, as you're saying, many different kinds of issues. I was
(45:30):
just, actually, this is not tangential to your question, but you were asking me before about cases that I had worked on, and there was another case that I worked on that was about tech, where I wrote the ACLU's brief in the Supreme Court. It was an amicus brief, it wasn't about our client, but it was a case called Riley versus California. And what
the police were saying, and most law enforcement places, the
(45:51):
federal government as well as the state of California and
many other jurisdictions, was that when you arrest somebody, the
police get to do what is called a search incident
to arrest, so they get to see what you have
in your pocket. Makes some sense, right, you know, if
you have a gun in your pocket, that's a problem,
or, you know, whatever. So they get to do a search incident to arrest. And the law had been
that if they find something in your pocket that's
(46:12):
a container, they can search inside the container to see if there's anything in it that could be harmful. And in fact, there was one situation where they opened up a cigarette package that somebody had, and, you know, they could find a razor blade, they could find a marijuana cigarette, whatever. So that was law where the Supreme Court said, yes,
you're allowed to search people and search the containers that
are on them. Well, what law enforcement said was, your
(46:35):
cell phone is a container. When we arrest you, we can search your cell phone. It's a container. We have the right to search incident to arrest. And so we wrote a brief saying, no, it's not, you know, it's a container, but it's a container that essentially is your home, it's your library, it's your desk. So allowing the police to look in your cell phone, when they only had
(46:56):
really very feeble and very unlikely scenarios, things that just wouldn't happen too often, for what the need was. You know,
maybe you had some remote thing that would go off
and would blow something up, you know, oh come on, yeah,
but yeah, there were other ways to deal with a
lot of that, and so the Supreme Court actually agreed
with that. They said, yeah, you know, this really is just a technological way of finding out what's in
(47:17):
your papers and books and records. It used to
be they were in your desk, and now they're in
your cell phone. So that, to me, is sort of a whole thread of what we've been talking about. But the challenges to civil liberties are different, and in some ways greater, when the technology builds up. Yeah, there's a great quote on the ACLU website.
(47:37):
The fact that technology now allows an individual to carry
such information in his hand does not make the information
any less worthy of the protection for which the founders fought.
U.S. Supreme Court Chief Justice John Roberts. Exactly.
I like to talk about, you know, one of the
whole points of the Constitution adding the Fourth Amendment, which
is the protection of privacy, is they wanted to protect
(47:57):
what was in Benjamin Franklin's desk. You know, nobody should know if he was writing some things that were anti-government, and we now have that on our cell phones, of course. But that's where I think that a lot
of the protection of civil liberties is applying our fundamental
principles in different circumstances. Taking a gigantic step back, what
do you think is the biggest threat to civil liberties
(48:19):
in the new world order? In the new world order? Well, you know, it's hard to just select one. It's sort of like Sophie's choice, you know, which is your favorite child? Right now, I think one of our very top priorities, mass incarceration, is a big one, because so many people's lives are just being totally disrupted, and their families, and often the question really has to be, for what? One thing that we're hoping is that the
(48:41):
work we've been doing around trying to get vulnerable people released from prison, so that they won't get the virus and get seriously ill, possibly die, is, we're hoping that once jurisdictions see that they were able to release thousands of people from prisons and jails, and that it's not going to cause a spike in the crime rate, that it really is a pretty safe thing to do, we're hoping that
(49:04):
that's going to stick, and that in the long run we'll be able to rethink, well, did we really need to put
all those people in prison and jail to start with?
What are we doing with the criminal justice system? So
that's really big. But the other thing that I think
is really big right now is voting rights. I have
alluded to this at the beginning of our conversation, but
the premise of democracy is that the people get to
(49:25):
decide on who should be running the government and who
should be making the policy about all these things we're
talking about here. You know, what are the regulations about technology? What are the regulations about your reproductive freedom? Everything else, LGBT rights. And if the people's vote is distorted, that's a real problem, that people can't vote.
(49:46):
So we have litigation going on right now in, I think it's like thirty different states, trying to get people the opportunity to vote. So one of the things that has happened, in addition to all the ways that incumbents had been using to try to protect their own seats,
is that the virus has really made it dangerous for
(50:06):
people to vote in public places. So we saw the
election in Wisconsin, where people were just lined up for, you know, tremendous distances, waiting for a really long time to vote, because Wisconsin would not allow them to submit absentee ballots. And in fact, a study showed afterwards that at least seventeen people got the virus from voting. Many, many polling places were closed because, first of all,
(50:30):
the poll workers are generally elderly people, and the poll workers were not able and willing to man the polling places. There are a number of states that
don't allow absentee ballots at all unless you have a
particular situation, like if you're disabled, and the states are saying, oh, well, you know, the fear of the virus, or getting ill, that's not a disability. Or, before you get an absentee ballot,
(50:50):
you have to have it notarized, you have to have witnesses. Now, how is all this going to happen? So it's very concerning that people are going to have to choose between their health and their right to vote, and we don't
think that that should happen. And that's something that has
to be attended to right now, because if states don't
come up with plans for trying to enable everyone who
(51:10):
wants to vote to be able to vote, and for
counting absentee ballots and for administering this program, if you
don't come up right now with the plan and the resources,
a lot of people are going to be left out
and they're going to find that either you know, they
can't vote because they're afraid to go out to the polls,
or the vote is not going to be adequately counted.
So I think that right now, making democracy work is
(51:31):
really one of our top projects. What is the solution
to some of these problems? What are your tangible solutions?
Well, one tangible solution is that more states have to make absentee balloting available to people without having all these conditions and, you know, obstacles. The other solution is, you were talking before about truth. A lot of the
(51:52):
reason that's given, the very thin veneer of justification that's given, for we don't want absentee ballots, or we need voter ID, people to carry government-approved voter ID, which means you have to go down to a governmental office live and get your voter ID and show it at the polls, the excuse for a lot of this is that there could be fraud. Well,
(52:12):
studies have shown that there's virtually no voter fraud; it's really a unicorn. And again, I think if people understood that, that might sound good, but it's not true. I think truth is another thing that we're really fighting for these days. Can you listen to the evidence? Can you listen to the public health officials? Can you listen to what's real? I know
for a fact that tech companies are very concerned about
(52:34):
voter suppression, you know, and misinformation spreading online. This idea
of countering truth around a lot of these very important initiatives,
whether it's absentee ballots, whether it's showing up to the polls,
all that kind of thing. You know. I'd be curious
to know your take. There's a current battle happening right now.
You have seven hundred fifty advertisers boycotting Facebook, asking for better
(52:54):
policing of hateful content. Are social media companies doing enough to police harmful content, especially as we head into an election where voter suppression and the spread of misinformation will most certainly be a tactic used to manipulate voters? Well,
let me actually break your question down into two different parts,
because you started by mentioning the concern about voter suppression. I think one thing that everybody should be
(53:16):
doing is to increase awareness of what is a fair
way to improve access to the ballot for everybody. And
some of those things are tech solutions. We've had tech
solutions for years that are available and not widely enough used.
How to enable differently abled people to vote. Blind people, do they have the technology? So there are
(53:36):
a lot of places where we need the tech community, and we need everybody, to find out how you vote, to find out how voting can be made easier, and
to let people know what the rules for voting are
where they live. So one thing the ACLU is doing is, we have on our website, know your rights, know what your voting regulations are, and that's something that I think people really have to start thinking a lot about, and to let all
(53:59):
their communities, all their friends and family, know about the importance of voting and what they have to do to vote, and to urge them to just
get out and vote in whatever form that's going to take.
So I think that's really important. In terms of disinformation
on social media, people talk about the First Amendment and
whether you know there's a First Amendment problem with Facebook
(54:20):
telling you what you can't do. Well, there isn't, because
the First Amendment only applies to the government, so you
don't have a First Amendment right to say whatever you
want on Facebook. However, I have to say that, you know, we don't regard that issue as altogether a simplistic issue, that Facebook should be telling everybody what they can't say, because even though the First Amendment does not apply to private companies, there's still a tremendous value to
(54:44):
free speech. And there are a number of examples which, you know, we've come up with, about people who have speech suppressed for bad reasons. I'll give you
one example. There was a woman, an African American woman, who posted something on Twitter, and she got all these horrible racist responses, and she posted a screenshot
(55:04):
of the responses that she got, to show people what she was up against, and Twitter took it down because it included racist words, when, you know, okay, that kind of is the point. There was another, uh, an ACLU lawyer wrote about a statue in Kansas, a topless statue of a woman who was bare-breasted, and so whatever the locality was in Kansas decided to
(55:27):
take it down, because, yeah, they considered that to be indecent. So the ACLU lawyer who was challenging whether or not, I think it was the city, could take it down, posted a picture of the statue, and that was, it wasn't Twitter, I think it was Facebook, and that was taken down on the ground that it was obscene, so she couldn't post the picture of what she wanted to discuss. So we
(55:47):
think that social media control is really a two-edged sword. What I liked is, at one point Facebook had a protocol about what's true and what isn't true. And what they did was, they gave you a flag. So if they were concerned that something that was said wasn't true, they would have a neutral fact checker check it, and then, if it didn't check out, they would put
(56:08):
a little flag over it and say this has been questioned,
and you could click on the flag and you could
see why it was questioned. But they didn't just take
it down. So, you know, I agree that, you know, disinformation is a tremendous problem, but I think the idea that the solution is to ask the tech companies to decide what we should and shouldn't see, yeah, I don't think that's so great either. And certainly they should not
(56:29):
be doing it without a lot of transparency and accountability.
If they're going to be taking things down, they should
tell us what their protocols are, and you know, there
should be more public discussion about where the balance is there. Yeah,
It certainly seems like the protocols have changed quite a bit, especially having covered tech for this many years. It certainly seems like Facebook changes it, Twitter changes it, and
(56:50):
oftentimes it depends on public pressure. I'm curious to see what happens with all these advertisers boycotting. Personally, I have a feeling it won't impact the bottom line much and they'll go back to business as normal. But who knows, you know. I do know that Zuckerberg cares, uh, deeply about his employees, and, but they've been kind of up against, you know, public scrutiny for a very long time. But it certainly is interesting,
(57:12):
especially when the stakes get higher and disinformation can go further,
and especially as we get closer to an election, it
certainly feels like everyone feels more triggered around it. Yeah, yeah, well,
you know, one of the classic statements about the First Amendment is that in the marketplace of ideas, the best antidote to bad speech is more speech, right? So, you know, suppression,
(57:33):
I think we always have to worry every time somebody's
censoring and suppressing. Yeah, who are we giving that power to?
You know, nearing a close, because we don't have you
for too much longer. I saw that you gave a
talk, um, A Democrat and a Republican Walk into a Bar,
and you're saying that it seems like these days Democrats
and Republicans can't really agree on anything, but we all
(57:54):
need to agree on fundamental American principles like due process,
equality and freedom of conscience. So is that possible? Do
you believe, are you an optimist? Do you
believe that in this current environment? Is that possible? Well,
I think that's a great wrap-up question.
So that speech, I gave it at the Central Arkansas Library,
(58:16):
And my chief point, as you're saying, is I think
that people have to be able to agree on neutral principles.
The Constitution was designed not to say what we're going
to do about everything. It was designed to have everybody
have a fair opportunity to be part of the process
of deciding what we're going to do. So it sets
up all these democratic structures where we get to vote
(58:39):
for the people who are the policy makers and we
all get to decide. But the principle there, the underlying
principle, is that everybody should have a fair chance and, you know,
the principles should be neutral. Everyone should get to vote.
It's not like, you know, if you're a Democrat, your
vote doesn't count in this area, and if you're a Republican
your vote doesn't count in that area, that's
not fair. And the basic ideas of the freedom of speech,
(59:01):
freedom of religion, they're all manifestations of the
Golden Rule: that if I want the
ability to just choose my own religion and decide what
religion I'm going to practice, I have to respect your
right to make a different choice and have your own religion,
because that's the golden rule. If I want to say
something that's unpopular, I have to respect your right to
say something that's unpopular. And if I want to be
(59:22):
treated fairly and not locked away for you know, doing
something minor and never given a fair trial, I have
to respect your right to have the same fair treatment.
All those fundamental principles are
things that we really all should agree on. I think
people get into arguing and assuming that they can never
agree on the principles because they're differing on what they
(59:42):
think the results should be. And I think
part of the point of civil liberties is that it's all
about process, it's not about results. The ACLU
is nonpartisan. We don't try to get Republicans elected,
we don't try to get Democrats elected. We don't favor
or disfavor individual politicians or individual parties, but we
favor the idea that there should be neutral principles that everybody
(01:00:04):
can agree to, to say, okay, here's what's fair. And
the analogy I used in that talk at the Central
Arkansas Library. It was one of the nights during the
World Series, but fortunately not a night where there was
a game, so people were able to come. And I said, okay,
so what happens before a baseball game is that everybody
has agreed on the underlying rules, and everyone agrees that
(01:00:25):
your umpires, your referees, in any sport should be neutral.
And you don't want somebody who's partisan. If they were
favoring one team, you'd get rid of them. All
sports fans could agree to that. You know, maybe there
would be a few who would be just so, you know, Machiavellian,
that they would rather have the biased umpire to always
rule for their side. But I think sports fans can
(01:00:45):
agree on what you really want for a fair game:
you want a fair game, you want everyone to agree
on the principles beforehand. And I think that if we
could sit down in small groups around the country and
really talk about what the fundamental principles are, I am
enough of a patriot to think we actually could
agree about a lot. And let me give you an
example of why. I think there's some basis for hope.
(01:01:07):
Maybe not optimism, but certainly hope. We were talking about
voting rights. So one of the major problems is gerrymandering,
the way when a party is in power they try
to distort all the districts and they try to stack
the deck so that their party will remain in power.
Or if the party in power in a particular state
thinks it's to their advantage to not have that many
(01:01:28):
people vote, they try to make it harder to register
to vote for new voters, etcetera. Uh, the
ACLU and a number of other
organizations working in coalition with us have had a fair
amount of success doing ballot initiatives going to the people
of the state in states like Michigan and Nevada and
(01:01:48):
Missouri and Florida, where we were part of getting
Amendment Four passed, which gave the vote back to
people who have been convicted of a felony at some
point. And the people of the state, when you ask
the people of the state, you can get a majority,
sometimes a supermajority, of people who say no, we
want the rules to be fair. Who doesn't want the
rules to be fair? Legislators who are
(01:02:11):
incumbents and who want to keep their seats even if
it takes unfair procedures to do it. So that's a
real problem we have right now, that the incumbents,
the people who are trying to maintain power and not
allow any sort of regime change, are pulling all the levers.
But what I think, I think the chief grounds for
optimism is that when you go to the American people
themselves and say, well, do you want a fair system
(01:02:33):
or do you want a system where you think your
side is more likely to win? You talk to them
about that, and I think that you're going to get
them to say they would really like to see a
fair system, and that is the promise of America. Um,
last question. You have taught at Brooklyn Law School since,
what is the lesson your students will take from this
moment in history. Well, I know there are lots of lessons,
(01:02:55):
but if you could extract it, what is the lesson
your students will take from this moment in history? Well,
you know, in an individual setting, one thing I'm doing
for the fall is I am preparing a course that
I'm calling COVID-19 and the Constitution. So what we're
gonna do in this seminar is we're going to be
looking at the way in which the Constitution has been
challenged and to see, you know, how well it holds up.
(01:03:17):
What does the Constitution have to say about whether you
can quarantine people and whether you can allow people to
be at a religious assembly but not to go to
a protest, and etcetera, etcetera. So I think there's a
lot of interesting things there which I think are very
much of this particular moment. But big picture, what I would
like the students to take away, the constitutional law students
especially, is essentially what I just said to you, that
(01:03:39):
the Constitution is about process. It's not about results. It's
not about you know, you're a Republican and you're a Democrat,
and we have two different countries depending on what your
party is. I think that we have one country and
it's all about a neutral process for very good reasons,
and I would like people to think more about that.
After my speech at the Central Arkansas Library, I had
(01:04:00):
two examples of people who talked to me. One guy
came up to me, he said, I'm the Republican who
walked into that bar, and he said, you know, you're
making a lot of sense to me. And then there
was another guy who talked to me who was a Democrat.
He said, you know, I never really thought about that,
that maybe it's not right if we're only trying to win.
I never thought about, you know, that's not what
(01:04:20):
we do in sports. And that's what I'd like people
to think about. You know, do you really want to
do things that are only about how you think it's
going to come out and cheat and destroy the system
and, you know, put a thumb on the scale and,
you know, stack the deck in order to make things
come out to what your preferred result is in the
short run? Or, long term, is that just a
really bad idea because it's just totally inconsistent? You know,
(01:04:44):
we've just come from the Fourth of July, and totally inconsistent
with the premises on which we would like to believe
our country was founded. Does technology throw a wrench in
the system? I mean it does. It does create lots
of things you can't control, and it's always,
you know, a new environment. So, you know,
different kind of example, we were talking about technology and surveillance,
(01:05:06):
where of course technology has enabled a whole lot of
surveillance that we then have to deal with. But technology
also enabled a whole lot of new marketplaces of ideas.
So the ACLU did a lot of
litigation, you know, a few decades ago, on applying First
Amendment principles to the Internet, right, you know, because the
government wanted to censor what was on the Internet because, you know,
(01:05:26):
a child might see it. Yeah, And so you know,
every new generation of technology, there are new challenges about
how you apply our principles like privacy and free speech,
et cetera to the Internet, but the principles remained the same.
(01:05:53):
I hope everyone is doing well in these strange and
surreal times and adjusting to the new normal. Most important,
I hope you're staying healthy and somewhat sane. Follow along
on our social media. I'm at Lorie Siegel on Twitter
and Instagram, and the show is at First Contact Podcasts
on Instagram and on Twitter. We're at First Contact pod
(01:06:13):
and for even more from dot dot dot sign up
for our newsletter at dot dot dot media dot com
slash newsletter. And if you like what you heard, leave
us a review on Apple Podcasts or wherever you listen.
We really appreciate it. First Contact is a production of
dot dot dot Media, Executive produced by Laurie Siegel and
Derek Dodge. This episode was produced and edited by Sabine
(01:06:38):
Jansen and Jack Reagan. The original theme music is by
Zander Sing. First Contact with Lori Siegel is a production
of dot dot dot Media and I Heart Radio