Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hello, everyone, and welcome to the It Could Happen Here podcast,
which I'm recording at eight in the morning and thus
without any of my colleagues, and I'm joined today to
discuss the technological aspects of the border regime by Austin
Kocher of Syracuse University and by Jake Wiener of the
Electronic Privacy Information Center.
Speaker 2 (00:26):
Hi guys, morning. How are you doing, James? Good.
Speaker 1 (00:29):
I'm very excited to talk more border stuff. I like
covering this, even though it's sometimes terrible. So what I
wanted to start off with is, I think our listeners
will be familiar with CBP One, right, the most cursed
cell phone app of all time, and both of you
have written a lot and very insightfully about CBP One.
So I thought we could kind of do a little
(00:52):
bit of a breakdown of (a) the issues with it
as an app, and (b) the fact that us using an app
is a problem inherently. So perhaps we could start with, I know, Jake,
you mentioned you wanted to talk a little bit about
the design of the app, So in the process of
sort of commissioning it and making it, should we start there?
Speaker 3 (01:12):
Yeah, And I think this story is pretty interesting and
unique because CBP One was built in house by a
small team at the Office of Field Operations at CBP. Yeah,
which is it's unique, Like there's one other app that
they built, and I don't really know of other mobile
(01:35):
apps that have been rolled out with anything close to
the size of CBP One that have been designed by
a government agency.
Speaker 1 (01:43):
Yeah, that's kind of an odd choice, you know.
Speaker 3 (01:46):
Conceptually, it's not something I'm critical of, Like I think,
if we're going to have a government that's providing services,
it's good for them to do things in house.
Speaker 1 (01:55):
Yep.
Speaker 3 (01:56):
It means you're not relying on third parties who are
able to use information from the app and benefit off
of it. But it does mean you need the institutional
competency to be able to design an app and so
to just like provide a quick history. Basically, the CBP
(02:18):
One app was built off of the framework of an
older app called CBP ROAM. That app was used just
for people boating on the Great Lakes because technically, if
you go like boating on Lake Michigan, you will leave
the United States if you chase a fish over the
boundary to Canada. Yeah. And CBP felt that it was
very important that people who did that reported leaving and
(02:40):
coming back into the United States. Yeah, questionable, But they
built an app to let people do that, and the
framework for that app used GPS pinging to verify
when you were back in the US.
Speaker 2 (02:56):
Okay, this is a small app, you know.
Speaker 3 (02:58):
I don't think they encountered too many problems with it
because there are maybe a couple hundred users a day. And
on that framework, they built out CBP One to do
a couple of things. It's used by customs folks, for one.
So if you're importing goods into the country, you can
do some of that reporting through CBP one and also
(03:20):
use it to apply for and obtain the I
ninety four travel form, which is the form that, like,
most folks coming to the United States are going to need.
And then critically for our uses, is that if you
are applying for asylum, you can use it to schedule
an appointment.
Speaker 1 (03:36):
Yeah, that's been the bulk of my reporting on it,
and that's the bulk of its use, I think. So, yeah, okay,
and I'm still blown away by the fact that they
designed it in house. It's crazy. Did you ever
find the job posting for the people who designed it,
or did they just, like, get some people
who were good at it to kind of take a
swing at it.
Speaker 3 (03:56):
So, as far as I know, you know,
I've talked to one of the people involved in the creation.
I think Austin has as well. My understanding is that
it was like an in house team that already existed. Okay,
but Austin, you may be able to clarify that.
Speaker 2 (04:10):
Yeah, that's my understanding too.
Speaker 4 (04:11):
I think they have a technology team within the agency that.
Speaker 2 (04:16):
Is using technology in various ways.
Speaker 4 (04:19):
I don't think we have a full understanding of the
scope of their responsibilities and the work that they've done.
I think, to Jake's point, it is quite interesting that
they produce something for the public. It's not unusual, of course,
for large agencies to have teams in house that deal
with all of the general technological challenges that every agency
(04:40):
in twenty twenty three faces, you know, databases, you know,
keeping government cell phones working and secure and all of that,
all of that kind of thing. But a lot of
the things that are public facing from federal agencies tend
to be contracted out to a private vendor in some way.
Speaker 2 (04:57):
So this is quite unique.
Speaker 4 (05:00):
But I don't think we have a full picture of
what they are and aren't producing in house.
Speaker 1 (05:05):
Yeah, that's interesting because they heavily rely on outside
contractors for so much of it. Like there's a whole
industry that you know, starts here in San Diego and
goes over to Tucson and are probably further into New
Mexico of people providing surveillance technology to border patrols, and
then you know, it goes over to the West Bank too.
Lots of it can be seen. Having talked
(05:37):
about the sort of unique approach to design, it's
probably a good idea to then talk about the implementation
of the app, and it's kind of lackluster, and that's an understatement.
Speaker 2 (05:45):
It just fucking sucks. It's terrible.
Speaker 1 (05:48):
So, like, in what ways has it been
unfit for the purpose that it's supposed to serve? So
I guess first we can talk about technological inadequacies,
and then more broadly about why this isn't a problem
you can really solve with an app on a telephone
that needs broadband or Wi-Fi.
Speaker 3 (06:06):
Yeah, so I'll start by saying that I think a
lot of what's happening with the problems with the CBP One
app is institutional blindness. So the people who designed the app,
I genuinely think want it to work well, and I
think they're simply not asking the questions that you need
to be asking when you design an app like this,
(06:27):
which are: who's really going to be using it? What
are their needs? What wireless services do they have?
What phones are they using? Basically, like, if you're someone
on the southern border with very little money and probably
an outdated phone without a great camera, are you going
to be able to use this app? And so
(06:49):
I think the first place to start with that is
simply the fact that the app requires a strong Wi-Fi
or cell signal to use, which is not always present.
And I think Austin has some good insight into the
problems with insufficient Wi Fi.
Speaker 4 (07:07):
Yeah, definitely, you know, I think some of what's interesting
here is not only the way that the app relies
on Wi-Fi, but the kind of real-world
social consequences for how people then try to cope
with these problems. I want to take one step back,
just really quickly and discuss the world that CBP One was
dropped into, because there's some important context here. So as
(07:31):
I know you've already covered, James, you know, over the
past three years, the dominant border control policy was Title
forty two, a COVID era policy that was purportedly motivated
by concerns about public health. This is where Title forty
two comes from. Title forty two of the US Code
(07:53):
pertains to questions of public health. It's not
an immigration policy. It was a public health policy, although
detailed reporting has I think pretty well established that it
was more a moment of political opportunism rather
than a legitimate public health concern. But regardless, that policy
(08:13):
allowed Customs and Border Protection to effectively turn back anyone
who arrived at the border, whether they attempted.
Speaker 2 (08:20):
To cross unlawfully or not.
Speaker 4 (08:22):
And the primary human rights concern here was people who
were seeking asylum, which is their right to do. One
of the aspects of Title forty two was that there
was a rare exemption clause built in that allowed people
who were particularly vulnerable or of particular humanitarian concern to attempt
(08:43):
to effectively apply for this kind of exemption. And until
January of this year, that process was run by nonprofit organizations.
CBP had this sort of informal outsourced system where NGOs
on the Mexican side of the border would effectively conduct
massive amounts of intake and prioritization and triaging of these
(09:04):
cases and then submit, you know, names to CBP to
allow people to come to ports of entry.
CBP One effectively replaced that system in January, which meant
that instead of migrants going through the NGOs, they would
have to download this app, fill out the information and
send it in. This is really important to mention because
(09:27):
the groundwork was actually laid by a tremendous amount of
effectively unpaid labor on the backs of NGOs on the
southern side of the border. And you know, it is
fair and accurate to say that this was an
extremely imperfect system and that there were absolutely, you know,
significant issues with this. But one of the interesting things
(09:48):
is that the role that NGOs played meant that people
coming and seeking asylum would then in some ways be
potentially connected with a broader network of NGOs, support services,
advocacy and so forth. So the introduction of CBP One
purportedly bypassed the work of NGOs in screening people for
(10:11):
the exemption process. However, NGOs still ended up performing all
this kind of invisible labor because they're the ones who
effectively were working with migrants to make Wi Fi available.
And it's not just Wi Fi, it's actually charging your phone.
When I visited shelters and camps on the southern side
of the border at the end of twenty twenty two,
(10:32):
a big part of what their camps and shelters did was
actually providing electricity. You know, when I was there,
I know others have reported on this, James, I'm sure
you've seen this too. You know, people would be huddled
around the outlets because they needed to charge their phone.
If their phone didn't work, if their phone wasn't charged,
they didn't have access to CBP One. This was
(10:56):
already a challenge because the primary form of communication before
CBP One was phone calls. Individuals would get phone calls.
In fact, I interviewed a Russian family on the Mexican
side of the border in Matamoros last November, and
that family,
Speaker 5 (11:14):
And many of the other.
Speaker 4 (11:15):
Migrants I spoke with, and this was also true for
many migrants, by the way. The families, typically the wife
and children if there was a family unit, would stay
either in a hotel or a shelter or someplace that
was more safe, and then the men would effectively spend
nights on the street where they could actually get cell
phone coverage and things like that. So CBP One introduced
(11:37):
all of these kinds of technological demands. It's not that
they weren't there before, but I think it's a different
matter when you go from interacting with a network of
NGOs to saying now you're actually interacting with the US
government and this is the only way that you're going
to be able to enter the country.
Speaker 2 (11:55):
I think those.
Speaker 4 (11:56):
Demands were quite high, and they've they've clearly had some
tremendously negative impacts from migrants trying to come through that way.
Speaker 1 (12:03):
Yeah, definitely. I know I have one here. But we've bought
so many of these, like, solar-powered charging brick things
and distributed those. But I have so many photos of
people's hands reaching through the wall, of people trying to
charge their phones on the other side of the wall,
you know. And it's been a big demand for a while.
(12:23):
But certainly when CBP were detaining people in places
where they didn't have power and then expecting them to
also communicate using their telephones, that became a particularly sort
of ridiculous issue, very upsetting to see it done
like that. So yeah, this app really isn't a
solution for the problem we're facing, which is, as you said,
(12:45):
like a three-year backlog of people who have legitimate
asylum claims being able to make those asylum claims. And
I guess can we talk about who it favors in,
you know, implementing this system as a catch all, right,
not an option, but the option. Who does that favor
and who does it not?
Speaker 3 (13:06):
Yeah, before we get there, I think it might be
helpful to just run through, like, what it is like
to use CBP One.
Speaker 2 (13:11):
Oh yeah, you have to go through it, because it is a lot.
Speaker 3 (13:17):
When you think about that, remember that every step is
a potential failure point, right? Every step you could have
a glitch. And anytime you have a glitch happen, it's
going to kick you out of the app and you
have to restart. So, you're on the southern border, you
need to apply for asylum. You've been walking for months
(13:37):
from Venezuela, Guatemala, et cetera.
Speaker 2 (13:39):
You got your phone.
Speaker 3 (13:41):
The first thing you have to do is log into
the app through login dot gov. That's the single sign
on service that many government agencies use. It works fairly well.
So you register yourself a profile.
Then you're going to navigate over. Hopefully you speak one
of the languages that CBP One is available in. As of now,
(14:01):
I believe that's English, Spanish, and Haitian Creole, although they
may have added a new language recently. You find the
right place on the app, not always super clear, to
submit your asylum application and try and schedule an appointment.
(14:22):
And then you're going to have to fill out a
ton of information. You're giving CBP your name, addresses, people
you know in the US. It's a big form to fill out,
including often information on like how vulnerable you are, so
like are you pregnant? Are you disabled? Have you been
threatened in Mexico? Information that they want to use to
(14:43):
prioritize you, hopefully, And then you're going to need to
take a facial photograph that's going to go into CBP's
and the Department of Homeland Security's databases. It will be run against
facial recognition searches that they populate with, like, this
massive facial recognition system, the Traveler Verification Service, which can
(15:04):
flag people who are on CBP's target list or TSA's target list.
Speaker 2 (15:12):
You could be wrongfully flagged.
Speaker 3 (15:13):
By that because facial recognition is not a perfect technology.
You're also going to take a facial liveness scan. It's
related to facial recognition, but it is different. It's a
different technology and it is untested. There's been no government
agency that has evaluated facial liveness for bias, and that
(15:34):
basically is trying to figure out: are you a real
person, or are you, like, a picture of James that
you're holding up because you're trying to get James an
appointment and then sell it to him later or something?
So you do the facial liveness scan. That's been the sticking point
where folks with darker skin and indigenous folks have not
been able to get through it. We can talk about
(15:57):
that a little later. You're also going to do a
GPS ping. So your phone, pulling from both cell towers
and GPS data, is going to try to establish your
location and send it to CBP. That can create problems:
if you're pinging off a US cell tower, suddenly it's
less reliable; it might look like you're in the US.
once you get through all these steps, then you're able
(16:17):
to submit your information, and you're in a lottery for
whether or not you get an appointment. Great.
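Jake's point that every step is a potential failure point compounds multiplicatively, because a glitch at any step sends you back to the start. As a purely illustrative sketch, the step list and the per-step success rate below are assumptions for illustration, not figures CBP has published, the odds of clearing the whole flow in one pass look like this:

```python
# Hypothetical model of the CBP One appointment flow Jake describes.
# Each step can glitch; a glitch restarts the whole flow, so the
# probability of finishing in a single pass is the product of the
# per-step success probabilities.

STEPS = [
    "login.gov sign-in",
    "find the asylum scheduling screen",
    "fill out the information form",
    "facial photograph",
    "facial liveness scan",
    "GPS ping",
    "submit to the appointment lottery",
]

def chance_of_finishing(per_step_success: float, steps=STEPS) -> float:
    """Probability of clearing every step without a single glitch."""
    p = 1.0
    for _ in steps:
        p *= per_step_success  # each extra step multiplies the risk
    return p
```

Even with a (purely assumed) ninety percent success rate per step, seven steps give 0.9 ** 7, roughly 0.48, so more than half of attempts would hit a glitch somewhere and have to start over.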
Speaker 1 (16:25):
Yeah, let's talk about the photo thing. I think it's been covered,
maybe I've covered it extensively, because this
is what I do. But I think maybe some people
aren't aware of the complete inadequacy of those facial liveness scans.
And I know some nonprofits have since set up light booths,
which can help with that, but it's not, you know,
it's, again, like, that money could be doing
(16:45):
something more useful, right, than making, like, an
Instagram booth for people who just want to
exercise their legal right to claim asylum. So let's talk
about that technology and how it's not working.
Speaker 4 (16:58):
Yeah, I think one really important factor here, and the
reason I wanted to paint some of the context, was
partly selfish, because as a geographer, I'm always very,
you know, eager to evangelize about the importance of understanding
social geography for thinking about questions of, you know, human
rights and asylum and immigration.
Speaker 5 (17:17):
So the facial liveness test is a great example
of that.
Speaker 4 (17:20):
So you know, it's hard to see this unless you've
been on the ground in some of these places. But
you know, again, just a historical thing that I think
will be pretty non controversial. Anti black racism is something
that's existed for a very long time. It's not just
in the United States. It's around the world obviously, yea,
(17:40):
not everywhere, but, you know, through colonialism, through
settler colonialism, and so forth. And it's not just that these
histories shaped anti-Black racism; anti-Black racism itself has produced
many of the geographies that we have, from redlining, segregation, educational
Speaker 5 (17:58):
Acts, all kinds of things. The way that the social
world looks today is already shaped by these issues of racism.
What that then means is questions like who has access
to cell phone towers and fast Wi Fi, and who
can afford up to date smartphones that can meet all
(18:19):
of the technological requirements to use
this software, is already distributed and
fractured by questions of race and identity. What that means is,
even if the facial liveness test worked perfectly and there
(18:41):
were no issues with the software, which is not true,
but let's even just assume it were, it is still true
that access to that technology and software is already structured
by race. So one of the things I noticed, you know,
having spent time along the border, was just how much,
even in some of the shelters, where Black and
African migrants had access to shelter tended
(19:05):
to be pushed more to the outskirts, where
you're less likely to get good cell phone coverage,
less likely to have electricity, and much more likely that the
roads, even where I visited, were not paved.
Speaker 4 (19:17):
And I was there when it was raining in Reynosa
one day, and, you know, some of the places
where African migrants and African families were staying (and Black migrants
from Latin America, by the way; let's just remind everyone
that there are Black Latinos living in Latin America, right?)
were also pushed, you know, more to the outskirts. And
(19:40):
as a result, those factors contributed to access issues.
So it wasn't just issues with the software itself, which
may be there; it's hard for me to evaluate, you
know, because it's not like we've done our own
evaluation of that. It's also all of those contextual factors.
and I just want to make a fine point on this,
we know this already. CBP should understand that already and
(20:04):
understand the various social factors that impact access. So simply saying,
for instance, if one wanted to take a defensive position
and say, well, look, we ran the test, the software
works as intended. There's no racial bias in the software.
That doesn't get CBP out of its responsibility, because yes,
you absolutely had all the information, and a reasonable
(20:24):
person should have known that access to this app
had these kinds of technological requirements, and that access
was not evenly distributed.
Speaker 1 (20:34):
Yeah, I think it's really important you said that, actually,
because, while it does get reported on and there are
people doing great work, like, sometimes it
gets missed because African migrants, Black
African migrants, might not speak Spanish, and a lot of reporters
don't have the language skills to talk to people. I worked with
(20:57):
a fixer who spoke Oromo, Tigrinya, and, like,
five or six other languages, and helped
to get me an insight into the very
difficult situation that lots of African people face. And, you
know, their isolation, the relative lack of resources, even
in what's a pretty resource-sparse setting for everyone. And
Haitian people; I've spoken to a lot of Haitian people.
Plus then you add that, like if I think about
(21:19):
last month, the languages in which I was able, through friends,
through translation, to speak to people: you know, Vietnamese,
Kurmanji, which is a dialect of Kurdish, French, Swahili, Spanish,
evidently Dutch. Aside from Spanish, those are not covered. Maybe
if you speak French you can, but I think it would be
(21:41):
still hard to work in Haitian Creole if
you spoke sort of more mainland French. Those are not
covered by that app, right? So you have to find
a way to access that via translation. And then
the information makes you incredibly vulnerable to whomever
you're asking to share it with, right? It's imperfect.
(22:01):
"Imperfect" is not a sufficient way to describe it; it
is an extremely flawed system.
Speaker 4 (22:17):
To Jake's point, like, I'm also kind of open-minded
about, you know, using an app like this.
I mean, Jake's right: if
you're going to have a government in twenty twenty three,
like, having some reasonably up-to-date ways to do
things is not an unreasonable expectation.
Speaker 5 (22:37):
But there's just so many blatantly.
Speaker 4 (22:39):
Obvious sort of shortcomings that are not difficult to identify
in preparing this app and understanding what people are likely
to need, so to have those gaps and then also
to roll out the app at a time when the
same policy announcement that rolls out this app is also
(23:01):
a policy announcement that says this is the only way
to do it. I mean, imagine if, like, your new
policy for some particular healthcare
thing was: we have to go through
this route, and we know that eighty percent of people
aren't going to be able to use this, but now
this is the only treatment option you have.
Speaker 5 (23:20):
I mean, that would just be strange.
Speaker 4 (23:22):
I think one thing to just think about
creatively here is, I can imagine a phased rollout of
this where they did improve it over time, but there
were adequate, you know, outlets for people who didn't fit
into the categories that they had built into the app.
And I think that would be a more
complex and more nuanced and maybe a more interesting
(23:44):
way to do it. I just don't
think it was rolled out responsibly in that way.
Speaker 3 (23:49):
Yeah, yeah. I think we should be honest that beta
testing an app on hundreds of thousands of the most
vulnerable people in the world is incredibly irresponsible.
Speaker 2 (23:59):
Yeah. Yeah, it's just cruel.
Speaker 1 (24:01):
It's not in any way appropriate. So I guess we've
talked a lot about this app. Let's say you're
fortunate enough to get an asylum appointment
to enter the US. You would then, in
most cases, enter something which is called CBP's Alternatives
to Detention system. ICE's.
Speaker 2 (24:21):
Sorry, Yeah, you're right.
Speaker 1 (24:22):
Let's explain a little bit like why it's an alternative
to detention? Why would one be detained if you haven't,
in theory, done anything wrong? Or, from many people's perspective,
haven't done anything wrong at all, I guess. And then what does
ATD mean? And then we can get into some of
the privacy issues and the way that it affects not
just migrants but also everyone.
Speaker 3 (24:44):
Yeah, one thing before we go there that I think would
be great is just closing the loop on the racial bias discussion.
This is an element of my advocacy that I
talk about all the time in different areas, like
how facial recognition is used in the criminal
justice system: there absolutely is bias in most
(25:05):
facial recognition systems. They work really well for white men
and increasingly less well basically as you run down
Speaker 2 (25:13):
The privileged spectrum.
Speaker 3 (25:15):
That's an element of how these systems are designed, right?
They get fed a lot of images of white
men and fewer images of other folks. That's fixable, right?
Like, you can provide a training database that has, you
know, a good spread of people. It seems to
(25:36):
not necessarily have been done with the facial liveness for
CBP One, in part because the British company that designed
it probably did not have access to a lot of
images of the type of people who would be on
the southern border, talking about like indigenous Mexican folks, Shield folks,
just a very large number of different ethnicities. But any
(26:01):
bias like that is, as Austin said, sitting on top
of a series of other biases, right of structural biases.
And so the result we see with a lot of
facial recognition systems, and this facial liveness system CBP
is using is no different, is that even
a little bit of bias in how the
(26:21):
facial recognition works gets amplified, and it's amplified by social biases.
It's amplified by the biases of people who run the
system and people who interact with it every day, and
then it's amplified by institutional blindness as well, a failure to
recognize a problem. We've had facial recognition systems rolled out,
(26:42):
on some level, since like the early to mid
two thousands, and we didn't even know that
bias was a problem in any facial recognition system until
twenty eighteen. So when you're hearing
about, like, bias testing and the fact that something has been
bias tested, those tests are never incredibly reliable, because they're
not done in the real world, they're not done with
(27:04):
the people actually using the technology, they're done in a
controlled setting, and they're not done by people who have
a nuanced understanding of how the technology impacts people.
Speaker 1 (27:17):
Yeah, I think it's very important to remember that. Yeah,
there's layers of bias, and they stack
to make it harder and harder for certain people coming
to the United States to get, again, what's their right,
often to just be safe, right? Like, for some people, especially
the less advantaged you are, sort of on a global scale,
the less safe you are likely to be waiting in Mexico
(27:40):
to make an appointment for your asylum, right Like, if
you can't get into a shelter, or you're from a
group where you don't have community to look out for you,
you're just that bit more likely to be taken advantage
of or have something bad happen to you or your family. So, yeah,
it all stacks up, I guess, to make for a very
unfortunate situation for people.
Speaker 3 (27:58):
Yeah, which means the consequences of having a
glitch happen are way higher.
Speaker 1 (28:04):
Yes, I've personally known people who have had terrible consequences
from what should have been a very very straightforward asylum
application, and very easy to process very rapidly. Yeah, it's
a whole mess. And I
know I'm trying to speak more to some of the
folks who work with African migrants because I think that often, yeah,
(28:25):
their stories just don't get told, especially at our southern border,
where, like, obviously there's a lot
of people who like to report on the border but not
leave New York or DC or wherever they have their
studio or newspaper or what have you. And I think
it's easy to miss that if you haven't, like
you said, been around a lot and seen all these
(28:47):
things stack up on top of one another. But yeah,
it's an important topic that, especially, like, I
know, doesn't get reported on, because everyone likes
to report on Ukraine and only Ukraine. But, like, there are
more wars in Africa, or wars affecting, you know,
people from Myanmar. It's very hard for them to
get to the southern border, actually. I'm hearing from maybe thousands
(29:10):
of different cases where people can't leave Thailand. But again,
the system, you know, when you have a whole other
alphabet that you're trying to access the system in and
it doesn't work for you, that makes it
incredibly difficult for those people. And that, ladies and gentlemen,
is what we call a cliffhanger in the podcasting industry.
Because we will be back tomorrow with more on how
ICE tracks migrants and how that tracking of migrants can
(29:32):
impact other people, people who live within their communities.
I hope you'll join us then. Thanks. Bye.
Speaker 2 (29:42):
It Could Happen Here is a production of Cool Zone Media.
Speaker 1 (29:44):
For more podcasts from Cool Zone Media, visit our website,
coolzonemedia dot com, or check us out on the
iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
You can find sources for It Could Happen Here, updated
monthly, at coolzonemedia dot com slash sources.
Speaker 2 (29:59):
Thanks for listening.